2026-03-08T23:55:00.072 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-08T23:55:00.080 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-08T23:55:00.105 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-08_22:22:45-orch:cephadm-squid-none-default-vps/308
branch: squid
description: orch:cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/yes kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{v18.2.1} 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/no 3-inline/yes 4-verify} 2-client/kclient 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}}
email: null
first_in_suite: false
flavor: default
job_id: '308'
last_in_suite: false
machine_type: vps
meta:
- desc: 'setup ceph/v18.2.1 '
name: kyr-2026-03-08_22:22:45-orch:cephadm-squid-none-default-vps
no_nested_subset: false
os_type: centos
os_version: 9.stream
overrides:
  admin_socket:
    branch: squid
  ansible.cephlab:
    branch: main
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      timezone: UTC
  ceph:
    cluster-conf:
      mgr:
        client mount timeout: 30
        debug client: 20
        debug mgr: 20
        debug ms: 1
        mon warn on pool no app: false
    conf:
      client:
        client mount timeout: 600
        debug client: 20
        debug ms: 1
        rados mon op timeout: 900
        rados osd op timeout: 900
      global:
        mon pg warn min per osd: 0
      mds:
        debug mds: 20
        debug mds balancer: 20
        debug ms: 1
        mds debug frag: true
        mds debug scatterstat: true
        mds op complaint time: 180
        mds verify scatter: true
        osd op complaint time: 180
        rados mon op timeout: 900
        rados osd op timeout: 900
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
        mon down mkfs grace: 300
        mon op complaint time: 120
      osd:
        bdev async discard: true
        bdev enable discard: true
        bluestore allocator: bitmap
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug ms: 1
        debug osd: 20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd mclock iops capacity threshold hdd: 49000
        osd objectstore: bluestore
        osd op complaint time: 180
    flavor: default
    fs: xfs
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    - FS_DEGRADED
    - filesystem is degraded
    - FS_INLINE_DATA_DEPRECATED
    - FS_WITH_FAILED_MDS
    - MDS_ALL_DOWN
    - filesystem is offline
    - is offline because no MDS
    - MDS_DAMAGE
    - MDS_DEGRADED
    - MDS_FAILED
    - MDS_INSUFFICIENT_STANDBY
    - MDS_UP_LESS_THAN_MAX
    - online, but wants
    - filesystem is online with fewer MDS than max_mds
    - POOL_APP_NOT_ENABLED
    - do not have an application enabled
    - overall HEALTH_
    - Replacing daemon
    - deprecated feature inline_data
    - MGR_MODULE_ERROR
    - OSD_DOWN
    - osds down
    - overall HEALTH_
    - \(OSD_DOWN\)
    - \(OSD_
    - but it is still running
    - is not responding
    - MON_DOWN
    - PG_AVAILABILITY
    - PG_DEGRADED
    - Reduced data availability
    - Degraded data redundancy
    - pg .* is stuck inactive
    - pg .* is .*degraded
    - pg .* is stuck peering
    sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
  ceph-deploy:
    bluestore: true
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
      osd:
        bdev async discard: true
        bdev enable discard: true
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd objectstore: bluestore
    fs: xfs
  install:
    ceph:
      flavor: default
      sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
    extra_system_packages:
      deb:
      - python3-xmltodict
      - python3-jmespath
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-xmltodict
      - python3-jmespath
  kclient:
    syntax: v1
  selinux:
    allowlist:
    - scontext=system_u:system_r:logrotate_t:s0
    - scontext=system_u:system_r:getty_t:s0
  thrashosds:
    bdev_inject_crash: 2
    bdev_inject_crash_probability: 0.5
  workunit:
    branch: tt-squid
    sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - host.a
  - client.0
  - osd.0
  - osd.1
  - osd.2
- - host.b
  - client.1
  - osd.3
  - osd.4
  - osd.5
seed: 8017
sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
sleep_before_teardown: 0
subset: 1/64
suite: orch:cephadm
suite_branch: tt-squid
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
targets:
  vm03.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP6WVBChXHnyB+QgR58YYHmV3PwTbk0KEOPQ79H7sv/3lsYHV7exxAatXY3ULVlaN0nkhXVTF50CRVdFODKYe0k=
  vm06.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJvdLqpcU9vsscO1Tu+kaMvX7RY6ymFjBlr6IzmzI6vdCcq3eGDVFVqzBb95kcid7p2oQShvW+YqtVVVHV76Ew0=
tasks:
- install:
    exclude_packages:
    - ceph-volume
    tag: v18.2.1
- print: '**** done install task...'
- cephadm:
    compiled_cephadm_branch: reef
    conf:
      osd:
        osd_class_default_list: '*'
        osd_class_load_list: '*'
    image: quay.io/ceph/ceph:v18.2.1
    roleless: true
- print: '**** done end installing v18.2.1 cephadm ...'
- cephadm.shell:
    host.a:
    - ceph config set mgr mgr/cephadm/use_repo_digest true --force
- print: '**** done cephadm.shell ceph config set mgr...'
- cephadm.shell:
    host.a:
    - ceph orch status
    - ceph orch ps
    - ceph orch ls
    - ceph orch host ls
    - ceph orch device ls
- cephadm.shell:
    host.a:
    - ceph fs volume create cephfs --placement=4
    - ceph fs dump
- cephadm.shell:
    host.a:
    - ceph fs set cephfs max_mds 1
- cephadm.shell:
    host.a:
    - ceph fs set cephfs allow_standby_replay false
- cephadm.shell:
    host.a:
    - ceph fs set cephfs inline_data true --yes-i-really-really-mean-it
- cephadm.shell:
    host.a:
    - ceph fs dump
    - ceph --format=json fs dump | jq -e ".filesystems | length == 1"
    - while ! ceph --format=json mds versions | jq -e ". | add == 4"; do sleep 1; done
- fs.pre_upgrade_save: null
- kclient: null
- print: '**** done client'
- parallel:
  - upgrade-tasks
  - workload-tasks
- cephadm.shell:
    host.a:
    - ceph fs dump
- fs.post_upgrade_checks: null
teuthology:
  fragments_dropped: []
  meta: {}
  postmerge:
  - "local kernel = py_attrgetter(yaml).get('kernel')\nif kernel ~= nil then\n local branch = py_attrgetter(kernel).get('branch')\n if branch and not kernel.branch:find \"-all$\" then\n log.debug(\"removing default kernel specification: %s\", kernel)\n py_attrgetter(kernel).pop('branch', nil)\n py_attrgetter(kernel).pop('deb', nil)\n py_attrgetter(kernel).pop('flavor', nil)\n py_attrgetter(kernel).pop('kdb', nil)\n py_attrgetter(kernel).pop('koji', nil)\n py_attrgetter(kernel).pop('koji_task', nil)\n py_attrgetter(kernel).pop('rpm', nil)\n py_attrgetter(kernel).pop('sha1', nil)\n py_attrgetter(kernel).pop('tag', nil)\n end\nend\n"
  variables:
    fail_fs: true
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-08_22:22:45
tube: vps
upgrade-tasks:
  sequential:
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force
      - ceph config set global log_to_journald false --force
      - ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1 --daemon-types mgr
      - while ceph orch upgrade status | jq '.in_progress' | grep true && ! ceph orch upgrade status | jq '.message' | grep Error ; do ceph orch ps ; ceph versions ; ceph orch upgrade status ; sleep 30 ; done
      - ceph versions | jq -e '.mgr | length == 1'
      - ceph versions | jq -e '.mgr | keys' | grep $sha1
      - ceph versions | jq -e '.overall | length == 2'
      - ceph orch upgrade check quay.ceph.io/ceph-ci/ceph:$sha1 | jq -e '.up_to_date | length == 2'
      - ceph orch ps
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mgr mgr/orchestrator/fail_fs true
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force
      - ceph config set global log_to_journald false --force
      - ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - while ceph orch upgrade status | jq '.in_progress' | grep true && ! ceph orch upgrade status | jq '.message' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done
      - ceph orch ps
      - ceph orch upgrade status
      - ceph health detail
      - ceph versions
      - echo "wait for servicemap items w/ changing names to refresh"
      - sleep 60
      - ceph orch ps
      - ceph versions
      - ceph versions | jq -e '.overall | length == 1'
      - ceph versions | jq -e '.overall | keys' | grep $sha1
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.611473
workload-tasks:
  sequential:
  - workunit:
      clients:
        all:
        - suites/fsstress.sh
2026-03-08T23:55:00.105 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa; will attempt to use it
2026-03-08T23:55:00.106 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks
2026-03-08T23:55:00.106 INFO:teuthology.run_tasks:Running task internal.check_packages...
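A note on the jq-driven wait loops in the job description above: both the MDS readiness check and the upgrade polling loop gate on jq's exit status. `ceph --format=json mds versions` prints a map of version string to daemon count (the 4 matches this job's `--placement=4`), and `ceph orch upgrade status` reports JSON whose `in_progress` and `message` fields the loop greps. A minimal standalone sketch of the two patterns, assuming those output shapes:

    # Wait until all 4 MDS daemons report a version; `. | add` sums the
    # per-version counts, and jq -e exits non-zero while the sum != 4.
    while ! ceph --format=json mds versions | jq -e '. | add == 4'; do
        sleep 1
    done

    # Poll a cephadm upgrade until it finishes or reports an error.
    while ceph orch upgrade status | jq '.in_progress' | grep -q true &&
          ! ceph orch upgrade status | jq '.message' | grep -q Error; do
        ceph orch ps; ceph versions; sleep 30
    done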
2026-03-08T23:55:00.106 INFO:teuthology.task.internal:Checking packages...
2026-03-08T23:55:00.106 INFO:teuthology.task.internal:Checking packages for os_type 'centos', flavor 'default' and ceph hash 'e911bdebe5c8faa3800735d1568fcdca65db60df'
2026-03-08T23:55:00.106 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-08T23:55:00.106 INFO:teuthology.packaging:ref: None
2026-03-08T23:55:00.106 INFO:teuthology.packaging:tag: None
2026-03-08T23:55:00.106 INFO:teuthology.packaging:branch: squid
2026-03-08T23:55:00.106 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T23:55:00.107 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=squid
2026-03-08T23:55:00.843 INFO:teuthology.task.internal:Found packages for ceph version 19.2.3-678.ge911bdeb
2026-03-08T23:55:00.844 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-08T23:55:00.845 INFO:teuthology.task.internal:no buildpackages task found
2026-03-08T23:55:00.845 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-08T23:55:00.845 INFO:teuthology.task.internal:Saving configuration
2026-03-08T23:55:00.854 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-08T23:55:00.855 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-08T23:55:00.862 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm03.local', 'description': '/archive/kyr-2026-03-08_22:22:45-orch:cephadm-squid-none-default-vps/308', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-08 23:53:46.289689', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:03', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP6WVBChXHnyB+QgR58YYHmV3PwTbk0KEOPQ79H7sv/3lsYHV7exxAatXY3ULVlaN0nkhXVTF50CRVdFODKYe0k='}
2026-03-08T23:55:00.869 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm06.local', 'description': '/archive/kyr-2026-03-08_22:22:45-orch:cephadm-squid-none-default-vps/308', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-08 23:53:46.290317', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:06', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJvdLqpcU9vsscO1Tu+kaMvX7RY6ymFjBlr6IzmzI6vdCcq3eGDVFVqzBb95kcid7p2oQShvW+YqtVVVHV76Ew0='}
2026-03-08T23:55:00.869 INFO:teuthology.run_tasks:Running task internal.add_remotes...
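The check_packages step above resolves "are there ready builds for this ref?" with a single shaman query, logged verbatim. The same lookup can be reproduced from the shell; a sketch assuming the response is a JSON array of build records carrying a `sha1` field (field name illustrative, not confirmed by this log):

    # Ask shaman for ready centos/9 x86_64 default-flavor builds of the
    # squid branch and print the sha1 of the first match.
    curl -s 'https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=squid' \
      | jq -r '.[0].sha1'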
2026-03-08T23:55:00.870 INFO:teuthology.task.internal:roles: ubuntu@vm03.local - ['host.a', 'client.0', 'osd.0', 'osd.1', 'osd.2']
2026-03-08T23:55:00.870 INFO:teuthology.task.internal:roles: ubuntu@vm06.local - ['host.b', 'client.1', 'osd.3', 'osd.4', 'osd.5']
2026-03-08T23:55:00.870 INFO:teuthology.run_tasks:Running task console_log...
2026-03-08T23:55:00.877 DEBUG:teuthology.task.console_log:vm03 does not support IPMI; excluding
2026-03-08T23:55:00.883 DEBUG:teuthology.task.console_log:vm06 does not support IPMI; excluding
2026-03-08T23:55:00.883 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7f058b8f0dc0>, signals=[15])
2026-03-08T23:55:00.883 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-08T23:55:00.883 INFO:teuthology.task.internal:Opening connections...
2026-03-08T23:55:00.883 DEBUG:teuthology.task.internal:connecting to ubuntu@vm03.local
2026-03-08T23:55:00.884 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm03.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-08T23:55:00.946 DEBUG:teuthology.task.internal:connecting to ubuntu@vm06.local
2026-03-08T23:55:00.946 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm06.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-08T23:55:01.006 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-08T23:55:01.007 DEBUG:teuthology.orchestra.run.vm03:> uname -m
2026-03-08T23:55:01.035 INFO:teuthology.orchestra.run.vm03.stdout:x86_64
2026-03-08T23:55:01.036 DEBUG:teuthology.orchestra.run.vm03:> cat /etc/os-release
2026-03-08T23:55:01.090 INFO:teuthology.orchestra.run.vm03.stdout:NAME="CentOS Stream"
2026-03-08T23:55:01.090 INFO:teuthology.orchestra.run.vm03.stdout:VERSION="9"
2026-03-08T23:55:01.090 INFO:teuthology.orchestra.run.vm03.stdout:ID="centos"
2026-03-08T23:55:01.090 INFO:teuthology.orchestra.run.vm03.stdout:ID_LIKE="rhel fedora"
2026-03-08T23:55:01.090 INFO:teuthology.orchestra.run.vm03.stdout:VERSION_ID="9"
2026-03-08T23:55:01.090 INFO:teuthology.orchestra.run.vm03.stdout:PLATFORM_ID="platform:el9"
2026-03-08T23:55:01.090 INFO:teuthology.orchestra.run.vm03.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-08T23:55:01.090 INFO:teuthology.orchestra.run.vm03.stdout:ANSI_COLOR="0;31"
2026-03-08T23:55:01.090 INFO:teuthology.orchestra.run.vm03.stdout:LOGO="fedora-logo-icon"
2026-03-08T23:55:01.090 INFO:teuthology.orchestra.run.vm03.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-08T23:55:01.090 INFO:teuthology.orchestra.run.vm03.stdout:HOME_URL="https://centos.org/"
2026-03-08T23:55:01.090 INFO:teuthology.orchestra.run.vm03.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-08T23:55:01.090 INFO:teuthology.orchestra.run.vm03.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-08T23:55:01.090 INFO:teuthology.orchestra.run.vm03.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-08T23:55:01.091 INFO:teuthology.lock.ops:Updating vm03.local on lock server
2026-03-08T23:55:01.095 DEBUG:teuthology.orchestra.run.vm06:> uname -m
2026-03-08T23:55:01.111 INFO:teuthology.orchestra.run.vm06.stdout:x86_64
2026-03-08T23:55:01.111 DEBUG:teuthology.orchestra.run.vm06:> cat /etc/os-release
2026-03-08T23:55:01.167 INFO:teuthology.orchestra.run.vm06.stdout:NAME="CentOS Stream"
2026-03-08T23:55:01.167 INFO:teuthology.orchestra.run.vm06.stdout:VERSION="9"
2026-03-08T23:55:01.167 INFO:teuthology.orchestra.run.vm06.stdout:ID="centos"
2026-03-08T23:55:01.167 INFO:teuthology.orchestra.run.vm06.stdout:ID_LIKE="rhel fedora"
2026-03-08T23:55:01.167 INFO:teuthology.orchestra.run.vm06.stdout:VERSION_ID="9"
2026-03-08T23:55:01.167 INFO:teuthology.orchestra.run.vm06.stdout:PLATFORM_ID="platform:el9"
2026-03-08T23:55:01.167 INFO:teuthology.orchestra.run.vm06.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-08T23:55:01.167 INFO:teuthology.orchestra.run.vm06.stdout:ANSI_COLOR="0;31"
2026-03-08T23:55:01.167 INFO:teuthology.orchestra.run.vm06.stdout:LOGO="fedora-logo-icon"
2026-03-08T23:55:01.167 INFO:teuthology.orchestra.run.vm06.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-08T23:55:01.167 INFO:teuthology.orchestra.run.vm06.stdout:HOME_URL="https://centos.org/"
2026-03-08T23:55:01.167 INFO:teuthology.orchestra.run.vm06.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-08T23:55:01.167 INFO:teuthology.orchestra.run.vm06.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-08T23:55:01.167 INFO:teuthology.orchestra.run.vm06.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-08T23:55:01.168 INFO:teuthology.lock.ops:Updating vm06.local on lock server
2026-03-08T23:55:01.172 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-08T23:55:01.175 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-08T23:55:01.176 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-08T23:55:01.176 DEBUG:teuthology.orchestra.run.vm03:> test '!' -e /home/ubuntu/cephtest
2026-03-08T23:55:01.178 DEBUG:teuthology.orchestra.run.vm06:> test '!' -e /home/ubuntu/cephtest
2026-03-08T23:55:01.225 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-08T23:55:01.226 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-08T23:55:01.226 DEBUG:teuthology.orchestra.run.vm03:> test -z $(ls -A /var/lib/ceph)
2026-03-08T23:55:01.233 DEBUG:teuthology.orchestra.run.vm06:> test -z $(ls -A /var/lib/ceph)
2026-03-08T23:55:01.246 INFO:teuthology.orchestra.run.vm03.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-08T23:55:01.283 INFO:teuthology.orchestra.run.vm06.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-08T23:55:01.284 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-08T23:55:01.296 DEBUG:teuthology.orchestra.run.vm03:> test -e /ceph-qa-ready
2026-03-08T23:55:01.309 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T23:55:01.509 DEBUG:teuthology.orchestra.run.vm06:> test -e /ceph-qa-ready
2026-03-08T23:55:01.528 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T23:55:01.718 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-08T23:55:01.719 INFO:teuthology.task.internal:Creating test directory...
2026-03-08T23:55:01.720 DEBUG:teuthology.orchestra.run.vm03:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-08T23:55:01.722 DEBUG:teuthology.orchestra.run.vm06:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-08T23:55:01.741 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-08T23:55:01.742 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-08T23:55:01.743 INFO:teuthology.task.internal:Creating archive directory...
2026-03-08T23:55:01.743 DEBUG:teuthology.orchestra.run.vm03:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-08T23:55:01.781 DEBUG:teuthology.orchestra.run.vm06:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-08T23:55:01.802 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-08T23:55:01.803 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-08T23:55:01.803 DEBUG:teuthology.orchestra.run.vm03:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-08T23:55:01.856 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T23:55:01.856 DEBUG:teuthology.orchestra.run.vm06:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-08T23:55:01.875 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T23:55:01.875 DEBUG:teuthology.orchestra.run.vm03:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-08T23:55:01.898 DEBUG:teuthology.orchestra.run.vm06:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-08T23:55:01.923 INFO:teuthology.orchestra.run.vm03.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:55:01.934 INFO:teuthology.orchestra.run.vm03.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:55:01.942 INFO:teuthology.orchestra.run.vm06.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:55:01.951 INFO:teuthology.orchestra.run.vm06.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:55:01.952 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-08T23:55:01.954 INFO:teuthology.task.internal:Configuring sudo...
2026-03-08T23:55:01.954 DEBUG:teuthology.orchestra.run.vm03:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-08T23:55:01.977 DEBUG:teuthology.orchestra.run.vm06:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-08T23:55:02.019 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-08T23:55:02.021 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
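For reference, the kernel.core_pattern written above uses two kernel format specifiers: %t expands to the time of the dump (seconds since the epoch) and %p to the PID of the crashing process, so each core lands in the archive as e.g. 1772649300.4242.core. The same setting applied by hand:

    # sysctl -w applies it immediately; tee -a makes it survive a reboot.
    sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
    echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf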
2026-03-08T23:55:02.021 DEBUG:teuthology.orchestra.run.vm03:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-08T23:55:02.047 DEBUG:teuthology.orchestra.run.vm06:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-08T23:55:02.075 DEBUG:teuthology.orchestra.run.vm03:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-08T23:55:02.132 DEBUG:teuthology.orchestra.run.vm03:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-08T23:55:02.189 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-08T23:55:02.189 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-08T23:55:02.251 DEBUG:teuthology.orchestra.run.vm06:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-08T23:55:02.274 DEBUG:teuthology.orchestra.run.vm06:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-08T23:55:02.330 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-08T23:55:02.331 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-08T23:55:02.392 DEBUG:teuthology.orchestra.run.vm03:> sudo service rsyslog restart
2026-03-08T23:55:02.394 DEBUG:teuthology.orchestra.run.vm06:> sudo service rsyslog restart
2026-03-08T23:55:02.424 INFO:teuthology.orchestra.run.vm03.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-08T23:55:02.461 INFO:teuthology.orchestra.run.vm06.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-08T23:55:02.701 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-08T23:55:02.703 INFO:teuthology.task.internal:Starting timer...
2026-03-08T23:55:02.703 INFO:teuthology.run_tasks:Running task pcp...
2026-03-08T23:55:02.706 INFO:teuthology.run_tasks:Running task selinux...
2026-03-08T23:55:02.708 DEBUG:teuthology.task:Applying overrides for task selinux: {'allowlist': ['scontext=system_u:system_r:logrotate_t:s0', 'scontext=system_u:system_r:getty_t:s0']}
2026-03-08T23:55:02.708 INFO:teuthology.task.selinux:Excluding vm03: VMs are not yet supported
2026-03-08T23:55:02.708 INFO:teuthology.task.selinux:Excluding vm06: VMs are not yet supported
2026-03-08T23:55:02.708 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-08T23:55:02.708 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-08T23:55:02.708 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-08T23:55:02.708 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-08T23:55:02.709 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'timezone': 'UTC'}}
2026-03-08T23:55:02.710 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/ceph/ceph-cm-ansible.git
2026-03-08T23:55:02.711 INFO:teuthology.repo_utils:Fetching github.com_ceph_ceph-cm-ansible_main from origin
2026-03-08T23:55:03.272 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main to origin/main
2026-03-08T23:55:03.278 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-08T23:55:03.278 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "timezone": "UTC"}' -i /tmp/teuth_ansible_inventoryxl_ur3xn --limit vm03.local,vm06.local /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-08T23:57:21.262 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm03.local'), Remote(name='ubuntu@vm06.local')]
2026-03-08T23:57:21.262 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm03.local'
2026-03-08T23:57:21.263 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm03.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-08T23:57:21.326 DEBUG:teuthology.orchestra.run.vm03:> true
2026-03-08T23:57:21.411 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm03.local'
2026-03-08T23:57:21.411 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm06.local'
2026-03-08T23:57:21.411 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm06.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-08T23:57:21.483 DEBUG:teuthology.orchestra.run.vm06:> true
2026-03-08T23:57:21.564 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm06.local'
2026-03-08T23:57:21.564 INFO:teuthology.run_tasks:Running task clock...
2026-03-08T23:57:21.577 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-08T23:57:21.577 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-08T23:57:21.577 DEBUG:teuthology.orchestra.run.vm03:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-08T23:57:21.581 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-08T23:57:21.591 DEBUG:teuthology.orchestra.run.vm06:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-08T23:57:21.622 INFO:teuthology.orchestra.run.vm03.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-08T23:57:21.639 INFO:teuthology.orchestra.run.vm03.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-08T23:57:21.663 INFO:teuthology.orchestra.run.vm06.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-08T23:57:21.663 INFO:teuthology.orchestra.run.vm06.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-08T23:57:21.672 INFO:teuthology.orchestra.run.vm03.stderr:sudo: ntpd: command not found
2026-03-08T23:57:21.718 INFO:teuthology.orchestra.run.vm06.stderr:sudo: ntpd: command not found
2026-03-08T23:57:21.718 INFO:teuthology.orchestra.run.vm03.stdout:506 Cannot talk to daemon
2026-03-08T23:57:21.718 INFO:teuthology.orchestra.run.vm06.stdout:506 Cannot talk to daemon
2026-03-08T23:57:21.718 INFO:teuthology.orchestra.run.vm03.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-08T23:57:21.718 INFO:teuthology.orchestra.run.vm06.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-08T23:57:21.723 INFO:teuthology.orchestra.run.vm03.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-08T23:57:21.729 INFO:teuthology.orchestra.run.vm06.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-08T23:57:21.777 INFO:teuthology.orchestra.run.vm06.stderr:bash: line 1: ntpq: command not found
2026-03-08T23:57:21.779 INFO:teuthology.orchestra.run.vm03.stderr:bash: line 1: ntpq: command not found
2026-03-08T23:57:21.780 INFO:teuthology.orchestra.run.vm06.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-08T23:57:21.780 INFO:teuthology.orchestra.run.vm06.stdout:===============================================================================
2026-03-08T23:57:21.781 INFO:teuthology.orchestra.run.vm03.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-08T23:57:21.781 INFO:teuthology.orchestra.run.vm03.stdout:===============================================================================
2026-03-08T23:57:21.782 INFO:teuthology.run_tasks:Running task install...
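The clock task's command chain explains the noise above: the || fallbacks walk through ntp, ntpd, and chronyd in turn, and on these CentOS 9 nodes only chronyd exists. The "506 Cannot talk to daemon" comes from chronyc makestep running right after chronyd was stopped; the subsequent start plus chronyc sources produced the source tables. The effective sequence on these hosts reduces to a sketch like:

    sudo systemctl stop chronyd.service   # the ntp/ntpd stops fail: units not loaded
    sudo chronyc makestep                 # meant to step the clock; here it got 506, daemon was down
    sudo systemctl start chronyd.service
    chronyc sources                       # prints the "MS Name/IP address ..." table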
2026-03-08T23:57:21.784 DEBUG:teuthology.task.install:project ceph
2026-03-08T23:57:21.784 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}, 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-08T23:57:21.784 DEBUG:teuthology.task.install:config {'exclude_packages': ['ceph-volume'], 'tag': 'v18.2.1', 'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-08T23:57:21.784 INFO:teuthology.task.install:Using flavor: default
2026-03-08T23:57:21.787 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-08T23:57:21.787 INFO:teuthology.task.install:extra packages: []
2026-03-08T23:57:21.788 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': 'v18.2.1', 'wait_for_package': False}
2026-03-08T23:57:21.788 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-08T23:57:21.788 INFO:teuthology.packaging:ref: None
2026-03-08T23:57:21.788 INFO:teuthology.packaging:tag: v18.2.1
2026-03-08T23:57:21.788 INFO:teuthology.packaging:branch: None
2026-03-08T23:57:21.788 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T23:57:22.359 DEBUG:teuthology.repo_utils:git ls-remote https://github.com/ceph/ceph v18.2.1^{} -> 7fe91d5d5842e04be3b4f514d6dd990c54b29c76
2026-03-08T23:57:22.359 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=7fe91d5d5842e04be3b4f514d6dd990c54b29c76
2026-03-08T23:57:22.360 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': 'v18.2.1', 'wait_for_package': False}
2026-03-08T23:57:22.360 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-08T23:57:22.360 INFO:teuthology.packaging:ref: None
2026-03-08T23:57:22.360 INFO:teuthology.packaging:tag: v18.2.1
2026-03-08T23:57:22.360 INFO:teuthology.packaging:branch: None
2026-03-08T23:57:22.360 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T23:57:22.360 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=7fe91d5d5842e04be3b4f514d6dd990c54b29c76
2026-03-08T23:57:22.996 INFO:teuthology.task.install.rpm:Pulling from https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/
2026-03-08T23:57:22.996 INFO:teuthology.task.install.rpm:Package version is 18.2.1-0
2026-03-08T23:57:23.029 INFO:teuthology.task.install.rpm:Pulling from https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/
2026-03-08T23:57:23.029 INFO:teuthology.task.install.rpm:Package version is 18.2.1-0
2026-03-08T23:57:23.343 INFO:teuthology.packaging:Writing yum repo:
[ceph]
name=ceph packages for $basearch
baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/$basearch
enabled=1
gpgcheck=0
type=rpm-md
[ceph-noarch]
name=ceph noarch packages
baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/noarch
enabled=1
gpgcheck=0
type=rpm-md
[ceph-source]
name=ceph source packages
baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/SRPMS
enabled=1
gpgcheck=0
type=rpm-md
2026-03-08T23:57:23.386 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-08T23:57:23.386 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/etc/yum.repos.d/ceph.repo
2026-03-08T23:57:23.388 INFO:teuthology.packaging:Writing yum repo:
[ceph]
name=ceph packages for $basearch
baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/$basearch
enabled=1
gpgcheck=0
type=rpm-md
[ceph-noarch]
name=ceph noarch packages
baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/noarch
enabled=1
gpgcheck=0
type=rpm-md
[ceph-source]
name=ceph source packages
baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/SRPMS
enabled=1
gpgcheck=0
type=rpm-md
2026-03-08T23:57:23.388 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-08T23:57:23.388 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/yum.repos.d/ceph.repo
2026-03-08T23:57:23.420 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64
2026-03-08T23:57:23.423 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-08T23:57:23.423 INFO:teuthology.packaging:ref: None
2026-03-08T23:57:23.423 INFO:teuthology.packaging:tag: v18.2.1
2026-03-08T23:57:23.423 INFO:teuthology.packaging:branch: None
2026-03-08T23:57:23.423 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T23:57:23.423 DEBUG:teuthology.orchestra.run.vm06:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/v18.2.1/;g' /etc/yum.repos.d/ceph.repo ; fi
2026-03-08T23:57:23.424 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64
2026-03-08T23:57:23.424 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-08T23:57:23.424 INFO:teuthology.packaging:ref: None
2026-03-08T23:57:23.424 INFO:teuthology.packaging:tag: v18.2.1
2026-03-08T23:57:23.424 INFO:teuthology.packaging:branch: None
2026-03-08T23:57:23.424 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T23:57:23.424 DEBUG:teuthology.orchestra.run.vm03:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/v18.2.1/;g' /etc/yum.repos.d/ceph.repo ; fi
2026-03-08T23:57:23.494 DEBUG:teuthology.orchestra.run.vm06:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig
2026-03-08T23:57:23.504 DEBUG:teuthology.orchestra.run.vm03:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig
2026-03-08T23:57:23.581 DEBUG:teuthology.orchestra.run.vm06:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf
2026-03-08T23:57:23.605 DEBUG:teuthology.orchestra.run.vm03:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf
2026-03-08T23:57:23.637 INFO:teuthology.orchestra.run.vm03.stdout:check_obsoletes = 1
2026-03-08T23:57:23.639 DEBUG:teuthology.orchestra.run.vm03:> sudo yum clean all
2026-03-08T23:57:23.642 INFO:teuthology.orchestra.run.vm06.stdout:check_obsoletes = 1
2026-03-08T23:57:23.643 DEBUG:teuthology.orchestra.run.vm06:> sudo yum clean all
2026-03-08T23:57:23.837 INFO:teuthology.orchestra.run.vm06.stdout:41 files removed
2026-03-08T23:57:23.857 INFO:teuthology.orchestra.run.vm03.stdout:41 files removed
2026-03-08T23:57:23.867 DEBUG:teuthology.orchestra.run.vm06:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath
2026-03-08T23:57:23.890 DEBUG:teuthology.orchestra.run.vm03:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath
2026-03-08T23:57:24.866 INFO:teuthology.orchestra.run.vm06.stdout:ceph packages for x86_64 93 kB/s | 76 kB 00:00
2026-03-08T23:57:24.984 INFO:teuthology.orchestra.run.vm03.stdout:ceph packages for x86_64 89 kB/s | 76 kB 00:00
2026-03-08T23:57:25.503 INFO:teuthology.orchestra.run.vm06.stdout:ceph noarch packages 15 kB/s | 9.4 kB 00:00
2026-03-08T23:57:25.632 INFO:teuthology.orchestra.run.vm03.stdout:ceph noarch packages 15 kB/s | 9.4 kB 00:00
2026-03-08T23:57:26.133 INFO:teuthology.orchestra.run.vm06.stdout:ceph source packages 3.5 kB/s | 2.2 kB 00:00
2026-03-08T23:57:26.263 INFO:teuthology.orchestra.run.vm03.stdout:ceph source packages 3.5 kB/s | 2.2 kB 00:00
2026-03-08T23:57:26.779 INFO:teuthology.orchestra.run.vm06.stdout:CentOS Stream 9 - BaseOS 14 MB/s | 8.9 MB 00:00
2026-03-08T23:57:27.404 INFO:teuthology.orchestra.run.vm03.stdout:CentOS Stream 9 - BaseOS 7.9 MB/s | 8.9 MB 00:01
2026-03-08T23:57:28.663 INFO:teuthology.orchestra.run.vm06.stdout:CentOS Stream 9 - AppStream 22 MB/s | 27 MB 00:01
2026-03-08T23:57:28.974 INFO:teuthology.orchestra.run.vm03.stdout:CentOS Stream 9 - AppStream 35 MB/s | 27 MB 00:00
2026-03-08T23:57:32.335 INFO:teuthology.orchestra.run.vm06.stdout:CentOS Stream 9 - CRB 8.1 MB/s | 8.0 MB 00:00
2026-03-08T23:57:33.853 INFO:teuthology.orchestra.run.vm06.stdout:CentOS Stream 9 - Extras packages 31 kB/s | 20 kB 00:00
2026-03-08T23:57:34.337 INFO:teuthology.orchestra.run.vm06.stdout:Extra Packages for Enterprise Linux 51 MB/s | 20 MB 00:00
2026-03-08T23:57:34.844 INFO:teuthology.orchestra.run.vm03.stdout:CentOS Stream 9 - CRB 3.1 MB/s | 8.0 MB 00:02
2026-03-08T23:57:36.205 INFO:teuthology.orchestra.run.vm03.stdout:CentOS Stream 9 - Extras packages 55 kB/s | 20 kB 00:00
2026-03-08T23:57:36.676 INFO:teuthology.orchestra.run.vm03.stdout:Extra Packages for Enterprise Linux 54 MB/s | 20 MB 00:00
2026-03-08T23:57:39.061 INFO:teuthology.orchestra.run.vm06.stdout:lab-extras 65 kB/s | 50 kB 00:00
2026-03-08T23:57:40.549 INFO:teuthology.orchestra.run.vm06.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-08T23:57:40.549 INFO:teuthology.orchestra.run.vm06.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-08T23:57:40.553 INFO:teuthology.orchestra.run.vm06.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed.
2026-03-08T23:57:40.554 INFO:teuthology.orchestra.run.vm06.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed.
2026-03-08T23:57:40.582 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
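Two small shell idioms from the repo setup above are worth noting. The `sed -e ':a;N;$!ba;...'` loop slurps the whole repo file into pattern space so the substitution can match across the enabled=1/gpgcheck line boundary when injecting priority=1, and the grep/sed/tee chain is an idempotent "ensure this key is set" edit; a sketch of the latter:

    # Ensure check_obsoletes = 1 in priorities.conf: rewrite the key if a
    # disabled form is present, append it otherwise.
    grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf \
      && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf \
      || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf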
2026-03-08T23:57:40.586 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-08T23:57:40.586 INFO:teuthology.orchestra.run.vm06.stdout: Package Arch Version Repository Size
2026-03-08T23:57:40.586 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-08T23:57:40.586 INFO:teuthology.orchestra.run.vm06.stdout:Installing:
2026-03-08T23:57:40.586 INFO:teuthology.orchestra.run.vm06.stdout: ceph x86_64 2:18.2.1-0.el9 ceph 6.4 k
2026-03-08T23:57:40.586 INFO:teuthology.orchestra.run.vm06.stdout: ceph-base x86_64 2:18.2.1-0.el9 ceph 5.2 M
2026-03-08T23:57:40.586 INFO:teuthology.orchestra.run.vm06.stdout: ceph-fuse x86_64 2:18.2.1-0.el9 ceph 839 k
2026-03-08T23:57:40.586 INFO:teuthology.orchestra.run.vm06.stdout: ceph-immutable-object-cache x86_64 2:18.2.1-0.el9 ceph 142 k
2026-03-08T23:57:40.586 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr x86_64 2:18.2.1-0.el9 ceph 1.4 M
2026-03-08T23:57:40.586 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-cephadm noarch 2:18.2.1-0.el9 ceph-noarch 132 k
2026-03-08T23:57:40.586 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-dashboard noarch 2:18.2.1-0.el9 ceph-noarch 1.8 M
2026-03-08T23:57:40.586 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.1-0.el9 ceph-noarch 7.4 M
2026-03-08T23:57:40.586 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-rook noarch 2:18.2.1-0.el9 ceph-noarch 50 k
2026-03-08T23:57:40.586 INFO:teuthology.orchestra.run.vm06.stdout: ceph-radosgw x86_64 2:18.2.1-0.el9 ceph 7.7 M
2026-03-08T23:57:40.586 INFO:teuthology.orchestra.run.vm06.stdout: ceph-test x86_64 2:18.2.1-0.el9 ceph 40 M
2026-03-08T23:57:40.586 INFO:teuthology.orchestra.run.vm06.stdout: cephadm noarch 2:18.2.1-0.el9 ceph-noarch 221 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs-devel x86_64 2:18.2.1-0.el9 ceph 31 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs2 x86_64 2:18.2.1-0.el9 ceph 658 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: librados-devel x86_64 2:18.2.1-0.el9 ceph 127 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: python3-cephfs x86_64 2:18.2.1-0.el9 ceph 161 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: python3-rados x86_64 2:18.2.1-0.el9 ceph 321 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: python3-rbd x86_64 2:18.2.1-0.el9 ceph 297 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: python3-rgw x86_64 2:18.2.1-0.el9 ceph 99 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: rbd-fuse x86_64 2:18.2.1-0.el9 ceph 86 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: rbd-mirror x86_64 2:18.2.1-0.el9 ceph 3.0 M
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: rbd-nbd x86_64 2:18.2.1-0.el9 ceph 171 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout:Upgrading:
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: librados2 x86_64 2:18.2.1-0.el9 ceph 3.3 M
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: librbd1 x86_64 2:18.2.1-0.el9 ceph 3.0 M
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout:Installing dependencies:
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: ceph-common x86_64 2:18.2.1-0.el9 ceph 18 M
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: ceph-grafana-dashboards noarch 2:18.2.1-0.el9 ceph-noarch 23 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mds x86_64 2:18.2.1-0.el9 ceph 2.1 M
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core noarch 2:18.2.1-0.el9 ceph-noarch 242 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mon x86_64 2:18.2.1-0.el9 ceph 4.4 M
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: ceph-osd x86_64 2:18.2.1-0.el9 ceph 18 M
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: ceph-prometheus-alerts noarch 2:18.2.1-0.el9 ceph-noarch 15 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: ceph-selinux x86_64 2:18.2.1-0.el9 ceph 24 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: libcephsqlite x86_64 2:18.2.1-0.el9 ceph 165 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k
2026-03-08T23:57:40.587 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1 x86_64 2:18.2.1-0.el9 ceph 474 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: librgw2 x86_64 2:18.2.1-0.el9 ceph 4.5 M
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-argparse x86_64 2:18.2.1-0.el9 ceph 45 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-common x86_64 2:18.2.1-0.el9 ceph 124 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt noarch 2.4.0-1.el9 epel 41 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k
2026-03-08T23:57:40.588 INFO:teuthology.orchestra.run.vm06.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout:Installing weak dependencies:
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 epel 9.0 k
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout:Install 117 Packages
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout:Upgrade 2 Packages
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout:Total download size: 182 M
2026-03-08T23:57:40.589 INFO:teuthology.orchestra.run.vm06.stdout:Downloading Packages:
2026-03-08T23:57:42.150 INFO:teuthology.orchestra.run.vm06.stdout:(1/119): ceph-18.2.1-0.el9.x86_64.rpm 21 kB/s | 6.4 kB 00:00
2026-03-08T23:57:42.198 INFO:teuthology.orchestra.run.vm03.stdout:lab-extras 65 kB/s | 50 kB 00:00
2026-03-08T23:57:42.740 INFO:teuthology.orchestra.run.vm06.stdout:(2/119): ceph-fuse-18.2.1-0.el9.x86_64.rpm 1.4 MB/s | 839 kB 00:00
2026-03-08T23:57:42.840 INFO:teuthology.orchestra.run.vm06.stdout:(3/119): ceph-immutable-object-cache-18.2.1-0.e 1.4 MB/s | 142 kB 00:00
2026-03-08T23:57:43.142 INFO:teuthology.orchestra.run.vm06.stdout:(4/119): ceph-mds-18.2.1-0.el9.x86_64.rpm 7.0 MB/s | 2.1 MB 00:00
2026-03-08T23:57:43.442 INFO:teuthology.orchestra.run.vm06.stdout:(5/119): ceph-mgr-18.2.1-0.el9.x86_64.rpm 4.8 MB/s | 1.4 MB 00:00
2026-03-08T23:57:43.561 INFO:teuthology.orchestra.run.vm06.stdout:(6/119): ceph-base-18.2.1-0.el9.x86_64.rpm 3.0 MB/s | 5.2 MB 00:01
2026-03-08T23:57:43.926 INFO:teuthology.orchestra.run.vm03.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-08T23:57:43.926 INFO:teuthology.orchestra.run.vm03.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-08T23:57:43.935 INFO:teuthology.orchestra.run.vm03.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed.
2026-03-08T23:57:43.935 INFO:teuthology.orchestra.run.vm03.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed.
2026-03-08T23:57:43.972 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-08T23:57:43.978 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-08T23:57:43.978 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size
2026-03-08T23:57:43.978 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-08T23:57:43.978 INFO:teuthology.orchestra.run.vm03.stdout:Installing:
2026-03-08T23:57:43.978 INFO:teuthology.orchestra.run.vm03.stdout: ceph x86_64 2:18.2.1-0.el9 ceph 6.4 k
2026-03-08T23:57:43.978 INFO:teuthology.orchestra.run.vm03.stdout: ceph-base x86_64 2:18.2.1-0.el9 ceph 5.2 M
2026-03-08T23:57:43.978 INFO:teuthology.orchestra.run.vm03.stdout: ceph-fuse x86_64 2:18.2.1-0.el9 ceph 839 k
2026-03-08T23:57:43.978 INFO:teuthology.orchestra.run.vm03.stdout: ceph-immutable-object-cache x86_64 2:18.2.1-0.el9 ceph 142 k
2026-03-08T23:57:43.978 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr x86_64 2:18.2.1-0.el9 ceph 1.4 M
2026-03-08T23:57:43.978 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-cephadm noarch 2:18.2.1-0.el9 ceph-noarch 132 k
2026-03-08T23:57:43.978 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-dashboard noarch 2:18.2.1-0.el9 ceph-noarch 1.8 M
2026-03-08T23:57:43.978 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.1-0.el9 ceph-noarch 7.4 M
2026-03-08T23:57:43.978 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-rook noarch 2:18.2.1-0.el9 ceph-noarch 50 k
2026-03-08T23:57:43.978 INFO:teuthology.orchestra.run.vm03.stdout: ceph-radosgw x86_64 2:18.2.1-0.el9 ceph 7.7 M
2026-03-08T23:57:43.978 INFO:teuthology.orchestra.run.vm03.stdout: ceph-test x86_64 2:18.2.1-0.el9 ceph 40 M
2026-03-08T23:57:43.978 INFO:teuthology.orchestra.run.vm03.stdout: cephadm noarch 2:18.2.1-0.el9 ceph-noarch 221 k
2026-03-08T23:57:43.978 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs-devel x86_64 2:18.2.1-0.el9 ceph 31 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs2 x86_64 2:18.2.1-0.el9 ceph 658 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: librados-devel x86_64 2:18.2.1-0.el9 ceph 127 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-cephfs x86_64 2:18.2.1-0.el9 ceph 161 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-rados x86_64 2:18.2.1-0.el9 ceph 321 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-rbd x86_64 2:18.2.1-0.el9 ceph 297 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-rgw x86_64 2:18.2.1-0.el9 ceph 99 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: rbd-fuse x86_64 2:18.2.1-0.el9 ceph 86 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: rbd-mirror x86_64 2:18.2.1-0.el9 ceph 3.0 M
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: rbd-nbd x86_64 2:18.2.1-0.el9 ceph 171 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout:Upgrading:
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: librados2 x86_64 2:18.2.1-0.el9 ceph 3.3 M
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: librbd1 x86_64 2:18.2.1-0.el9 ceph 3.0 M
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout:Installing dependencies:
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: ceph-common x86_64 2:18.2.1-0.el9 ceph 18 M
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: ceph-grafana-dashboards noarch 2:18.2.1-0.el9 ceph-noarch 23 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mds x86_64 2:18.2.1-0.el9 ceph 2.1 M
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core noarch 2:18.2.1-0.el9 ceph-noarch 242 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon x86_64 2:18.2.1-0.el9 ceph 4.4 M
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: ceph-osd x86_64 2:18.2.1-0.el9 ceph 18 M
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: ceph-prometheus-alerts noarch 2:18.2.1-0.el9 ceph-noarch 15 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: ceph-selinux x86_64 2:18.2.1-0.el9 ceph 24 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: libcephsqlite x86_64 2:18.2.1-0.el9 ceph 165 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1 x86_64 2:18.2.1-0.el9 ceph 474 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: librgw2 x86_64 2:18.2.1-0.el9 ceph 4.5 M
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse x86_64 2:18.2.1-0.el9 ceph 45 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-common x86_64 2:18.2.1-0.el9 ceph 124 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k
2026-03-08T23:57:43.979 INFO:teuthology.orchestra.run.vm03.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-jwt noarch 2.4.0-1.el9 epel 41 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout:Installing weak dependencies:
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 epel 9.0 k
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout:Install 117 Packages
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout:Upgrade 2 Packages
2026-03-08T23:57:43.980 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-08T23:57:43.981 INFO:teuthology.orchestra.run.vm03.stdout:Total download size: 182 M
2026-03-08T23:57:43.981 INFO:teuthology.orchestra.run.vm03.stdout:Downloading Packages:
2026-03-08T23:57:44.141 INFO:teuthology.orchestra.run.vm06.stdout:(7/119): ceph-mon-18.2.1-0.el9.x86_64.rpm 6.3 MB/s | 4.4 MB 00:00
2026-03-08T23:57:45.009 INFO:teuthology.orchestra.run.vm06.stdout:(8/119): ceph-common-18.2.1-0.el9.x86_64.rpm 5.8 MB/s | 18 MB 00:03
2026-03-08T23:57:45.109 INFO:teuthology.orchestra.run.vm06.stdout:(9/119): ceph-selinux-18.2.1-0.el9.x86_64.rpm 241 kB/s | 24 kB 00:00
2026-03-08T23:57:45.146 INFO:teuthology.orchestra.run.vm06.stdout:(10/119): ceph-radosgw-18.2.1-0.el9.x86_64.rpm 7.7 MB/s | 7.7 MB 00:01
2026-03-08T23:57:45.246 INFO:teuthology.orchestra.run.vm06.stdout:(11/119): libcephfs-devel-18.2.1-0.el9.x86_64.r 312 kB/s | 31 kB 00:00
2026-03-08T23:57:45.352 INFO:teuthology.orchestra.run.vm06.stdout:(12/119): libcephfs2-18.2.1-0.el9.x86_64.rpm 6.1 MB/s | 658 kB 00:00
2026-03-08T23:57:45.423 INFO:teuthology.orchestra.run.vm03.stdout:(1/119): ceph-18.2.1-0.el9.x86_64.rpm 21 kB/s | 6.4 kB 00:00
2026-03-08T23:57:45.454 INFO:teuthology.orchestra.run.vm06.stdout:(13/119): libcephsqlite-18.2.1-0.el9.x86_64.rpm 1.6 MB/s | 165 kB 00:00
2026-03-08T23:57:45.553 INFO:teuthology.orchestra.run.vm06.stdout:(14/119): librados-devel-18.2.1-0.el9.x86_64.rp 1.2 MB/s | 127 kB 00:00
2026-03-08T23:57:45.679 INFO:teuthology.orchestra.run.vm06.stdout:(15/119): libradosstriper1-18.2.1-0.el9.x86_64. 3.7 MB/s | 474 kB 00:00
2026-03-08T23:57:46.020 INFO:teuthology.orchestra.run.vm03.stdout:(2/119): ceph-fuse-18.2.1-0.el9.x86_64.rpm 1.4 MB/s | 839 kB 00:00
2026-03-08T23:57:46.121 INFO:teuthology.orchestra.run.vm03.stdout:(3/119): ceph-immutable-object-cache-18.2.1-0.e 1.4 MB/s | 142 kB 00:00
2026-03-08T23:57:46.282 INFO:teuthology.orchestra.run.vm06.stdout:(16/119): librgw2-18.2.1-0.el9.x86_64.rpm 7.4 MB/s | 4.5 MB 00:00
2026-03-08T23:57:46.381 INFO:teuthology.orchestra.run.vm06.stdout:(17/119): python3-ceph-argparse-18.2.1-0.el9.x8 453 kB/s | 45 kB 00:00
2026-03-08T23:57:46.481 INFO:teuthology.orchestra.run.vm06.stdout:(18/119): python3-ceph-common-18.2.1-0.el9.x86_ 1.2 MB/s | 124 kB 00:00
2026-03-08T23:57:46.581 INFO:teuthology.orchestra.run.vm06.stdout:(19/119): python3-cephfs-18.2.1-0.el9.x86_64.rp 1.6 MB/s | 161 kB 00:00
2026-03-08T23:57:46.627 INFO:teuthology.orchestra.run.vm03.stdout:(4/119): ceph-mds-18.2.1-0.el9.x86_64.rpm 4.2 MB/s | 2.1 MB 00:00
2026-03-08T23:57:46.683 INFO:teuthology.orchestra.run.vm06.stdout:(20/119): python3-rados-18.2.1-0.el9.x86_64.rpm 3.1 MB/s | 321 kB 00:00
2026-03-08T23:57:46.785 INFO:teuthology.orchestra.run.vm06.stdout:(21/119): python3-rbd-18.2.1-0.el9.x86_64.rpm 2.9 MB/s | 297 kB 00:00
2026-03-08T23:57:46.885 INFO:teuthology.orchestra.run.vm06.stdout:(22/119): python3-rgw-18.2.1-0.el9.x86_64.rpm 998 kB/s | 99 kB 00:00
2026-03-08T23:57:46.926 INFO:teuthology.orchestra.run.vm03.stdout:(5/119): ceph-base-18.2.1-0.el9.x86_64.rpm 2.9 MB/s | 5.2 MB 00:01
2026-03-08T23:57:46.932 INFO:teuthology.orchestra.run.vm03.stdout:(6/119): ceph-mgr-18.2.1-0.el9.x86_64.rpm 4.8 MB/s | 1.4 MB 00:00
2026-03-08T23:57:46.985 INFO:teuthology.orchestra.run.vm06.stdout:(23/119): rbd-fuse-18.2.1-0.el9.x86_64.rpm 870 kB/s | 86 kB 00:00
2026-03-08T23:57:47.061 INFO:teuthology.orchestra.run.vm03.stdout:(7/119): ceph-common-18.2.1-0.el9.x86_64.rpm 9.4 MB/s | 18 MB 00:01
2026-03-08T23:57:47.388 INFO:teuthology.orchestra.run.vm06.stdout:(24/119): rbd-mirror-18.2.1-0.el9.x86_64.rpm 7.4 MB/s | 3.0 MB 00:00
2026-03-08T23:57:47.488 INFO:teuthology.orchestra.run.vm06.stdout:(25/119): rbd-nbd-18.2.1-0.el9.x86_64.rpm 1.7 MB/s | 171 kB 00:00
2026-03-08T23:57:47.586 INFO:teuthology.orchestra.run.vm06.stdout:(26/119): ceph-grafana-dashboards-18.2.1-0.el9. 235 kB/s | 23 kB 00:00
2026-03-08T23:57:47.663 INFO:teuthology.orchestra.run.vm03.stdout:(8/119): ceph-mon-18.2.1-0.el9.x86_64.rpm 6.0 MB/s | 4.4 MB 00:00
2026-03-08T23:57:47.686 INFO:teuthology.orchestra.run.vm06.stdout:(27/119): ceph-mgr-cephadm-18.2.1-0.el9.noarch. 1.3 MB/s | 132 kB 00:00
2026-03-08T23:57:47.770 INFO:teuthology.orchestra.run.vm03.stdout:(9/119): ceph-selinux-18.2.1-0.el9.x86_64.rpm 225 kB/s | 24 kB 00:00
2026-03-08T23:57:47.986 INFO:teuthology.orchestra.run.vm06.stdout:(28/119): ceph-mgr-dashboard-18.2.1-0.el9.noarc 5.9 MB/s | 1.8 MB 00:00
2026-03-08T23:57:48.122 INFO:teuthology.orchestra.run.vm03.stdout:(10/119): ceph-radosgw-18.2.1-0.el9.x86_64.rpm 7.3 MB/s | 7.7 MB 00:01
2026-03-08T23:57:48.221 INFO:teuthology.orchestra.run.vm03.stdout:(11/119): libcephfs-devel-18.2.1-0.el9.x86_64.r 312 kB/s | 31 kB 00:00
2026-03-08T23:57:48.327 INFO:teuthology.orchestra.run.vm03.stdout:(12/119): libcephfs2-18.2.1-0.el9.x86_64.rpm 6.1 MB/s | 658 kB 00:00
2026-03-08T23:57:48.428 INFO:teuthology.orchestra.run.vm03.stdout:(13/119): libcephsqlite-18.2.1-0.el9.x86_64.rpm 1.6 MB/s | 165 kB 00:00
2026-03-08T23:57:48.528 INFO:teuthology.orchestra.run.vm03.stdout:(14/119): librados-devel-18.2.1-0.el9.x86_64.rp 1.2 MB/s | 127 kB 00:00
2026-03-08T23:57:48.633 INFO:teuthology.orchestra.run.vm03.stdout:(15/119): libradosstriper1-18.2.1-0.el9.x86_64. 4.4 MB/s | 474 kB 00:00
2026-03-08T23:57:48.845 INFO:teuthology.orchestra.run.vm06.stdout:(29/119): ceph-osd-18.2.1-0.el9.x86_64.rpm 3.3 MB/s | 18 MB 00:05
2026-03-08T23:57:48.947 INFO:teuthology.orchestra.run.vm06.stdout:(30/119): ceph-mgr-modules-core-18.2.1-0.el9.no 2.3 MB/s | 242 kB 00:00
2026-03-08T23:57:49.047 INFO:teuthology.orchestra.run.vm06.stdout:(31/119): ceph-mgr-rook-18.2.1-0.el9.noarch.rpm 505 kB/s | 50 kB 00:00
2026-03-08T23:57:49.146 INFO:teuthology.orchestra.run.vm06.stdout:(32/119): ceph-prometheus-alerts-18.2.1-0.el9.n 147 kB/s | 15 kB 00:00
2026-03-08T23:57:49.181 INFO:teuthology.orchestra.run.vm06.stdout:(33/119): ceph-mgr-diskprediction-local-18.2.1- 6.2 MB/s | 7.4 MB 00:01
2026-03-08T23:57:49.238 INFO:teuthology.orchestra.run.vm03.stdout:(16/119): librgw2-18.2.1-0.el9.x86_64.rpm 7.4 MB/s | 4.5 MB 00:00
2026-03-08T23:57:49.249 INFO:teuthology.orchestra.run.vm06.stdout:(34/119): cephadm-18.2.1-0.el9.noarch.rpm 2.1 MB/s | 221 kB 00:00
2026-03-08T23:57:49.338 INFO:teuthology.orchestra.run.vm03.stdout:(17/119): python3-ceph-argparse-18.2.1-0.el9.x8 452 kB/s | 45 kB 00:00
2026-03-08T23:57:49.390 INFO:teuthology.orchestra.run.vm06.stdout:(35/119): ledmon-libs-1.1.0-3.el9.x86_64.rpm 194 kB/s | 40 kB 00:00
2026-03-08T23:57:49.437 INFO:teuthology.orchestra.run.vm03.stdout:(18/119): python3-ceph-common-18.2.1-0.el9.x86_ 1.2 MB/s | 124 kB 00:00
2026-03-08T23:57:49.513 INFO:teuthology.orchestra.run.vm06.stdout:(36/119): libconfig-1.7.2-9.el9.x86_64.rpm 273 kB/s | 72 kB 00:00
2026-03-08T23:57:49.538 INFO:teuthology.orchestra.run.vm03.stdout:(19/119): python3-cephfs-18.2.1-0.el9.x86_64.rp 1.6 MB/s | 161 kB 00:00
2026-03-08T23:57:49.640 INFO:teuthology.orchestra.run.vm03.stdout:(20/119): python3-rados-18.2.1-0.el9.x86_64.rpm 3.1 MB/s | 321 kB 00:00
2026-03-08T23:57:49.660 INFO:teuthology.orchestra.run.vm06.stdout:(37/119): libgfortran-11.5.0-14.el9.x86_64.rpm 2.9 MB/s | 794 kB 00:00
2026-03-08T23:57:49.714 INFO:teuthology.orchestra.run.vm06.stdout:(38/119): mailcap-2.1.49-5.el9.noarch.rpm 623 kB/s | 33 kB 00:00
2026-03-08T23:57:49.744 INFO:teuthology.orchestra.run.vm03.stdout:(21/119): python3-rbd-18.2.1-0.el9.x86_64.rpm 2.8 MB/s | 297 kB 00:00
2026-03-08T23:57:49.802 INFO:teuthology.orchestra.run.vm06.stdout:(39/119): python3-cffi-1.14.5-5.el9.x86_64.rpm 2.8 MB/s | 253 kB 00:00
2026-03-08T23:57:49.845 INFO:teuthology.orchestra.run.vm03.stdout:(22/119): python3-rgw-18.2.1-0.el9.x86_64.rpm 986 kB/s | 99 kB 00:00
2026-03-08T23:57:49.945 INFO:teuthology.orchestra.run.vm03.stdout:(23/119): rbd-fuse-18.2.1-0.el9.x86_64.rpm 862 kB/s | 86 kB 00:00
2026-03-08T23:57:49.955 INFO:teuthology.orchestra.run.vm06.stdout:(40/119): python3-cryptography-36.0.1-5.el9.x86 8.2 MB/s | 1.2 MB 00:00
2026-03-08T23:57:50.035 INFO:teuthology.orchestra.run.vm06.stdout:(41/119): python3-ply-3.11-14.el9.noarch.rpm 1.3 MB/s | 106 kB 00:00
2026-03-08T23:57:50.157 INFO:teuthology.orchestra.run.vm06.stdout:(42/119): python3-pycparser-2.20-6.el9.noarch.r 1.1 MB/s | 135 kB 00:00
2026-03-08T23:57:50.231 INFO:teuthology.orchestra.run.vm06.stdout:(43/119): python3-requests-2.25.1-10.el9.noarch 1.7 MB/s | 126 kB 00:00
2026-03-08T23:57:50.242 INFO:teuthology.orchestra.run.vm06.stdout:(44/119): libquadmath-11.5.0-14.el9.x86_64.rpm 253 kB/s | 184 kB 00:00
2026-03-08T23:57:50.313 INFO:teuthology.orchestra.run.vm06.stdout:(45/119): python3-urllib3-1.26.5-7.el9.noarch.r 2.6 MB/s | 218 kB 00:00
2026-03-08T23:57:50.318 INFO:teuthology.orchestra.run.vm06.stdout:(46/119): boost-program-options-1.75.0-13.el9.x 1.3 MB/s | 104 kB 00:00
2026-03-08T23:57:50.353 INFO:teuthology.orchestra.run.vm03.stdout:(24/119): ceph-osd-18.2.1-0.el9.x86_64.rpm 5.1 MB/s | 18 MB 00:03
2026-03-08T23:57:50.362 INFO:teuthology.orchestra.run.vm03.stdout:(25/119): rbd-mirror-18.2.1-0.el9.x86_64.rpm 7.2 MB/s | 3.0 MB 00:00
2026-03-08T23:57:50.362 INFO:teuthology.orchestra.run.vm06.stdout:(47/119): flexiblas-3.0.4-9.el9.x86_64.rpm 613 kB/s | 30 kB 00:00
2026-03-08T23:57:50.381 INFO:teuthology.orchestra.run.vm06.stdout:(48/119): flexiblas-openblas-openmp-3.0.4-9.el9 784 kB/s | 15 kB 00:00
2026-03-08T23:57:50.414 INFO:teuthology.orchestra.run.vm06.stdout:(49/119): flexiblas-netlib-3.0.4-9.el9.x86_64.r 31 MB/s | 3.0 MB 00:00
2026-03-08T23:57:50.429 INFO:teuthology.orchestra.run.vm06.stdout:(50/119): librabbitmq-0.11.0-7.el9.x86_64.rpm 3.0 MB/s | 45 kB 00:00
2026-03-08T23:57:50.435 INFO:teuthology.orchestra.run.vm06.stdout:(51/119): libpmemobj-1.12.1-1.el9.x86_64.rpm 2.9 MB/s | 160 kB 00:00
2026-03-08T23:57:50.450 INFO:teuthology.orchestra.run.vm06.stdout:(52/119): librdkafka-1.6.1-102.el9.x86_64.rpm 31 MB/s | 662 kB 00:00
2026-03-08T23:57:50.454 INFO:teuthology.orchestra.run.vm03.stdout:(26/119): rbd-nbd-18.2.1-0.el9.x86_64.rpm 1.7 MB/s | 171 kB 00:00
2026-03-08T23:57:50.461 INFO:teuthology.orchestra.run.vm03.stdout:(27/119): ceph-grafana-dashboards-18.2.1-0.el9. 234 kB/s | 23 kB 00:00
2026-03-08T23:57:50.469 INFO:teuthology.orchestra.run.vm06.stdout:(53/119): libxslt-1.1.34-12.el9.x86_64.rpm 13 MB/s | 233 kB 00:00
2026-03-08T23:57:50.470 INFO:teuthology.orchestra.run.vm06.stdout:(54/119): libstoragemgmt-1.10.1-1.el9.x86_64.rp 7.1 MB/s | 246 kB 00:00
2026-03-08T23:57:50.486 INFO:teuthology.orchestra.run.vm06.stdout:(55/119): lttng-ust-2.12.0-6.el9.x86_64.rpm 16 MB/s | 292 kB 00:00
2026-03-08T23:57:50.487 INFO:teuthology.orchestra.run.vm06.stdout:(56/119): openblas-0.3.29-1.el9.x86_64.rpm 2.4 MB/s | 42 kB 00:00
2026-03-08T23:57:50.559 INFO:teuthology.orchestra.run.vm03.stdout:(28/119): ceph-mgr-cephadm-18.2.1-0.el9.noarch. 1.2 MB/s | 132 kB 00:00
2026-03-08T23:57:50.605 INFO:teuthology.orchestra.run.vm06.stdout:(57/119): openblas-openmp-0.3.29-1.el9.x86_64.r 45 MB/s | 5.3 MB 00:00
2026-03-08T23:57:50.624 INFO:teuthology.orchestra.run.vm06.stdout:(58/119): python3-devel-3.9.25-3.el9.x86_64.rpm 12 MB/s | 244 kB 00:00
2026-03-08T23:57:50.641 INFO:teuthology.orchestra.run.vm06.stdout:(59/119): python3-jinja2-2.11.3-8.el9.noarch.rp 14 MB/s | 249 kB 00:00
2026-03-08T23:57:50.657 INFO:teuthology.orchestra.run.vm06.stdout:(60/119): python3-jmespath-1.0.1-1.el9.noarch.r 3.0 MB/s | 48 kB 00:00
2026-03-08T23:57:50.675 INFO:teuthology.orchestra.run.vm06.stdout:(61/119): python3-babel-2.9.1-2.el9.noarch.rpm 32 MB/s | 6.0 MB 00:00
2026-03-08T23:57:50.676 INFO:teuthology.orchestra.run.vm06.stdout:(62/119): python3-libstoragemgmt-1.10.1-1.el9.x 9.0 MB/s | 177 kB 00:00
2026-03-08T23:57:50.692 INFO:teuthology.orchestra.run.vm06.stdout:(63/119): python3-markupsafe-1.1.1-12.el9.x86_6 2.2 MB/s | 35 kB 00:00
2026-03-08T23:57:50.693 INFO:teuthology.orchestra.run.vm06.stdout:(64/119): python3-mako-1.1.4-6.el9.noarch.rpm 9.3 MB/s | 172 kB 00:00
2026-03-08T23:57:50.756 INFO:teuthology.orchestra.run.vm06.stdout:(65/119): python3-numpy-f2py-1.23.5-2.el9.x86_6 6.9 MB/s | 442 kB 00:00
2026-03-08T23:57:50.800 INFO:teuthology.orchestra.run.vm06.stdout:(66/119): python3-pyasn1-0.4.8-7.el9.noarch.rpm 3.5 MB/s | 157 kB 00:00
2026-03-08T23:57:50.842 INFO:teuthology.orchestra.run.vm06.stdout:(67/119): python3-numpy-1.23.5-2.el9.x86_64.rpm 41 MB/s | 6.1 MB 00:00
2026-03-08T23:57:50.859 INFO:teuthology.orchestra.run.vm06.stdout:(68/119): python3-pyasn1-modules-0.4.8-7.el9.no 4.7 MB/s | 277 kB 00:00
2026-03-08T23:57:50.860 INFO:teuthology.orchestra.run.vm06.stdout:(69/119): python3-requests-oauthlib-1.3.0-12.el 3.1 MB/s | 54 kB 00:00
2026-03-08T23:57:50.876 INFO:teuthology.orchestra.run.vm06.stdout:(70/119): python3-toml-0.10.2-6.el9.noarch.rpm 2.6 MB/s | 42 kB 00:00
2026-03-08T23:57:50.896 INFO:teuthology.orchestra.run.vm06.stdout:(71/119): socat-1.7.4.1-8.el9.x86_64.rpm 15 MB/s | 303 kB 00:00
2026-03-08T23:57:50.912 INFO:teuthology.orchestra.run.vm06.stdout:(72/119): xmlstarlet-1.6.1-20.el9.x86_64.rpm 4.0 MB/s | 64 kB 00:00
2026-03-08T23:57:50.922 INFO:teuthology.orchestra.run.vm06.stdout:(73/119): fmt-8.1.1-5.el9.x86_64.rpm 11 MB/s | 111 kB 00:00
2026-03-08T23:57:50.940 INFO:teuthology.orchestra.run.vm06.stdout:(74/119): gperftools-libs-2.9.1-3.el9.x86_64.rp 18 MB/s | 308 kB 00:00
2026-03-08T23:57:51.161 INFO:teuthology.orchestra.run.vm06.stdout:(75/119): python3-scipy-1.9.3-2.el9.x86_64.rpm 64 MB/s | 19 MB 00:00
2026-03-08T23:57:51.166 INFO:teuthology.orchestra.run.vm06.stdout:(76/119): libarrow-doc-9.0.0-15.el9.noarch.rpm 4.8 MB/s | 25 kB 00:00
2026-03-08T23:57:51.169 INFO:teuthology.orchestra.run.vm06.stdout:(77/119): liboath-2.6.12-1.el9.x86_64.rpm 15 MB/s | 49 kB 00:00
2026-03-08T23:57:51.173 INFO:teuthology.orchestra.run.vm06.stdout:(78/119): libunwind-1.6.2-1.el9.x86_64.rpm 20 MB/s | 67 kB 00:00
2026-03-08T23:57:51.187 INFO:teuthology.orchestra.run.vm06.stdout:(79/119): parquet-libs-9.0.0-15.el9.x86_64.rpm 57 MB/s | 838 kB 00:00
2026-03-08T23:57:51.196 INFO:teuthology.orchestra.run.vm06.stdout:(80/119): python3-asyncssh-2.13.2-5.el9.noarch. 63 MB/s | 548 kB 00:00
2026-03-08T23:57:51.199 INFO:teuthology.orchestra.run.vm06.stdout:(81/119): python3-autocommand-2.2.2-8.el9.noarc 13 MB/s | 29 kB 00:00
2026-03-08T23:57:51.202 INFO:teuthology.orchestra.run.vm06.stdout:(82/119): python3-backports-tarfile-1.2.0-1.el9 22 MB/s | 60 kB 00:00
2026-03-08T23:57:51.205 INFO:teuthology.orchestra.run.vm06.stdout:(83/119): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 15 MB/s | 43 kB 00:00
2026-03-08T23:57:51.207 INFO:teuthology.orchestra.run.vm06.stdout:(84/119): python3-cachetools-4.2.4-1.el9.noarch 14 MB/s | 32 kB 00:00
2026-03-08T23:57:51.210 INFO:teuthology.orchestra.run.vm06.stdout:(85/119): python3-certifi-2023.05.07-4.el9.noar 5.0 MB/s | 14 kB 00:00
2026-03-08T23:57:51.214 INFO:teuthology.orchestra.run.vm06.stdout:(86/119): python3-cheroot-10.0.1-4.el9.noarch.r 45 MB/s | 173 kB 00:00
2026-03-08T23:57:51.221 INFO:teuthology.orchestra.run.vm06.stdout:(87/119): python3-cherrypy-18.6.1-2.el9.noarch. 52 MB/s | 358 kB 00:00
2026-03-08T23:57:51.228 INFO:teuthology.orchestra.run.vm06.stdout:(88/119): python3-google-auth-2.45.0-1.el9.noar 41 MB/s | 254 kB 00:00
2026-03-08T23:57:51.230 INFO:teuthology.orchestra.run.vm06.stdout:(89/119): python3-jaraco-8.2.1-3.el9.noarch.rpm 4.4 MB/s | 11 kB 00:00
2026-03-08T23:57:51.233 INFO:teuthology.orchestra.run.vm06.stdout:(90/119): python3-jaraco-classes-3.2.1-5.el9.no 6.9 MB/s | 18 kB 00:00
2026-03-08T23:57:51.235 INFO:teuthology.orchestra.run.vm06.stdout:(91/119): python3-jaraco-collections-3.0.0-8.el 10 MB/s | 23 kB 00:00
2026-03-08T23:57:51.238 INFO:teuthology.orchestra.run.vm06.stdout:(92/119): python3-jaraco-context-6.0.1-3.el9.no 9.4 MB/s | 20 kB 00:00
2026-03-08T23:57:51.241 INFO:teuthology.orchestra.run.vm06.stdout:(93/119): python3-jaraco-functools-3.5.0-2.el9. 6.9 MB/s | 19 kB 00:00
2026-03-08T23:57:51.243 INFO:teuthology.orchestra.run.vm06.stdout:(94/119): python3-jaraco-text-4.0.0-2.el9.noarc 12 MB/s | 26 kB 00:00
2026-03-08T23:57:51.246 INFO:teuthology.orchestra.run.vm06.stdout:(95/119): python3-jwt+crypto-2.4.0-1.el9.noarch 3.6 MB/s | 9.0 kB 00:00
2026-03-08T23:57:51.260 INFO:teuthology.orchestra.run.vm06.stdout:(96/119): libarrow-9.0.0-15.el9.x86_64.rpm 14 MB/s | 4.4 MB 00:00
2026-03-08T23:57:51.261 INFO:teuthology.orchestra.run.vm06.stdout:(97/119): python3-jwt-2.4.0-1.el9.noarch.rpm 2.7 MB/s | 41 kB 00:00
2026-03-08T23:57:51.285 INFO:teuthology.orchestra.run.vm06.stdout:(98/119): python3-logutils-0.3.5-21.el9.noarch. 1.9 MB/s | 46 kB 00:00
2026-03-08T23:57:51.329 INFO:teuthology.orchestra.run.vm06.stdout:(99/119): python3-more-itertools-8.12.0-2.el9.n 1.8 MB/s | 79 kB 00:00
2026-03-08T23:57:51.331 INFO:teuthology.orchestra.run.vm03.stdout:(29/119): ceph-test-18.2.1-0.el9.x86_64.rpm 11 MB/s | 40 MB 00:03
2026-03-08T23:57:51.334 INFO:teuthology.orchestra.run.vm06.stdout:(100/119): python3-natsort-7.1.1-5.el9.noarch.r 17 MB/s | 58 kB 00:00
2026-03-08T23:57:51.339 INFO:teuthology.orchestra.run.vm06.stdout:(101/119): python3-pecan-1.4.2-3.el9.noarch.rpm 48 MB/s | 272 kB 00:00
2026-03-08T23:57:51.342 INFO:teuthology.orchestra.run.vm06.stdout:(102/119): python3-portend-3.1.0-2.el9.noarch.r 5.8 MB/s | 16 kB 00:00
2026-03-08T23:57:51.347 INFO:teuthology.orchestra.run.vm06.stdout:(103/119): python3-kubernetes-26.1.0-3.el9.noar 12 MB/s | 1.0 MB 00:00
2026-03-08T23:57:51.348 INFO:teuthology.orchestra.run.vm06.stdout:(104/119): python3-pyOpenSSL-21.0.0-1.el9.noarc 15 MB/s | 90 kB 00:00
2026-03-08T23:57:51.351 INFO:teuthology.orchestra.run.vm06.stdout:(105/119): python3-repoze-lru-0.7-16.el9.noarch 13 MB/s | 31 kB 00:00
2026-03-08T23:57:51.353 INFO:teuthology.orchestra.run.vm06.stdout:(106/119): python3-routes-2.5.1-5.el9.noarch.rp 41 MB/s | 188 kB 00:00
2026-03-08T23:57:51.354 INFO:teuthology.orchestra.run.vm06.stdout:(107/119): python3-rsa-4.9-2.el9.noarch.rpm 20 MB/s | 59 kB 00:00
2026-03-08T23:57:51.357 INFO:teuthology.orchestra.run.vm06.stdout:(108/119): python3-tempora-5.0.0-2.el9.noarch.r 12 MB/s | 36 kB 00:00
2026-03-08T23:57:51.358 INFO:teuthology.orchestra.run.vm06.stdout:(109/119): python3-typing-extensions-4.15.0-1.e 22 MB/s | 86 kB 00:00
2026-03-08T23:57:51.362 INFO:teuthology.orchestra.run.vm06.stdout:(110/119): python3-webob-1.8.8-2.el9.noarch.rpm 46 MB/s | 230 kB 00:00
2026-03-08T23:57:51.363 INFO:teuthology.orchestra.run.vm06.stdout:(111/119): python3-websocket-client-1.2.3-2.el9 19 MB/s | 90 kB 00:00
2026-03-08T23:57:51.367 INFO:teuthology.orchestra.run.vm06.stdout:(112/119): python3-xmltodict-0.12.0-15.el9.noar 6.0 MB/s | 22 kB 00:00
2026-03-08T23:57:51.368 INFO:teuthology.orchestra.run.vm06.stdout:(113/119): python3-werkzeug-2.0.3-3.el9.1.noarc 62 MB/s | 427 kB 00:00
2026-03-08T23:57:51.369 INFO:teuthology.orchestra.run.vm06.stdout:(114/119): python3-zc-lockfile-2.0-10.el9.noarc 8.1 MB/s | 20 kB 00:00
2026-03-08T23:57:51.380 INFO:teuthology.orchestra.run.vm06.stdout:(115/119): re2-20211101-20.el9.x86_64.rpm 17 MB/s | 191 kB 00:00
2026-03-08T23:57:51.401 INFO:teuthology.orchestra.run.vm06.stdout:(116/119): thrift-0.15.0-4.el9.x86_64.rpm 51 MB/s | 1.6 MB 00:00
2026-03-08T23:57:51.418 INFO:teuthology.orchestra.run.vm03.stdout:(30/119): ceph-mgr-dashboard-18.2.1-0.el9.noarc 1.8 MB/s | 1.8 MB 00:00
2026-03-08T23:57:51.434 INFO:teuthology.orchestra.run.vm03.stdout:(31/119): ceph-mgr-modules-core-18.2.1-0.el9.no 2.3 MB/s | 242 kB 00:00
2026-03-08T23:57:51.517 INFO:teuthology.orchestra.run.vm03.stdout:(32/119): ceph-mgr-rook-18.2.1-0.el9.noarch.rpm 506 kB/s | 50 kB 00:00
2026-03-08T23:57:51.534 INFO:teuthology.orchestra.run.vm03.stdout:(33/119): ceph-prometheus-alerts-18.2.1-0.el9.n 146 kB/s | 15 kB 00:00
2026-03-08T23:57:51.618 INFO:teuthology.orchestra.run.vm03.stdout:(34/119): cephadm-18.2.1-0.el9.noarch.rpm 2.1 MB/s | 221 kB 00:00
2026-03-08T23:57:51.623 INFO:teuthology.orchestra.run.vm03.stdout:(35/119): ledmon-libs-1.1.0-3.el9.x86_64.rpm 459 kB/s | 40 kB 00:00
2026-03-08T23:57:51.744 INFO:teuthology.orchestra.run.vm03.stdout:(36/119): libgfortran-11.5.0-14.el9.x86_64.rpm 6.4 MB/s | 794 kB 00:00
2026-03-08T23:57:51.759 INFO:teuthology.orchestra.run.vm03.stdout:(37/119): libconfig-1.7.2-9.el9.x86_64.rpm 512 kB/s | 72 kB 00:00
2026-03-08T23:57:51.776 INFO:teuthology.orchestra.run.vm03.stdout:(38/119): libquadmath-11.5.0-14.el9.x86_64.rpm 5.7 MB/s | 184 kB 00:00
2026-03-08T23:57:51.789 INFO:teuthology.orchestra.run.vm03.stdout:(39/119): mailcap-2.1.49-5.el9.noarch.rpm 1.1 MB/s | 33 kB 00:00
2026-03-08T23:57:51.809 INFO:teuthology.orchestra.run.vm03.stdout:(40/119): python3-cffi-1.14.5-5.el9.x86_64.rpm 7.5 MB/s | 253 kB 00:00
2026-03-08T23:57:51.840 INFO:teuthology.orchestra.run.vm03.stdout:(41/119): python3-ply-3.11-14.el9.noarch.rpm 3.4 MB/s | 106 kB 00:00
2026-03-08T23:57:51.871 INFO:teuthology.orchestra.run.vm03.stdout:(42/119): python3-pycparser-2.20-6.el9.noarch.r 4.2 MB/s | 135 kB 00:00
2026-03-08T23:57:51.903 INFO:teuthology.orchestra.run.vm03.stdout:(43/119): python3-requests-2.25.1-10.el9.noarch 4.0 MB/s | 126 kB 00:00
2026-03-08T23:57:51.934 INFO:teuthology.orchestra.run.vm03.stdout:(44/119): python3-urllib3-1.26.5-7.el9.noarch.r 6.7 MB/s | 218 kB 00:00
2026-03-08T23:57:52.047 INFO:teuthology.orchestra.run.vm03.stdout:(45/119): python3-cryptography-36.0.1-5.el9.x86 4.8 MB/s | 1.2 MB 00:00
2026-03-08T23:57:52.048 INFO:teuthology.orchestra.run.vm06.stdout:(117/119): ceph-test-18.2.1-0.el9.x86_64.rpm 5.7 MB/s | 40 MB 00:06
2026-03-08T23:57:52.191 INFO:teuthology.orchestra.run.vm03.stdout:(46/119): flexiblas-3.0.4-9.el9.x86_64.rpm 206 kB/s | 30 kB 00:00
2026-03-08T23:57:52.259 INFO:teuthology.orchestra.run.vm03.stdout:(47/119): boost-program-options-1.75.0-13.el9.x 321 kB/s | 104 kB 00:00
2026-03-08T23:57:52.311 INFO:teuthology.orchestra.run.vm03.stdout:(48/119): flexiblas-openblas-openmp-3.0.4-9.el9 285 kB/s | 15 kB 00:00
2026-03-08T23:57:52.409 INFO:teuthology.orchestra.run.vm06.stdout:(118/119): librados2-18.2.1-0.el9.x86_64.rpm 3.2 MB/s | 3.3 MB 00:01
2026-03-08T23:57:52.412 INFO:teuthology.orchestra.run.vm03.stdout:(49/119): libpmemobj-1.12.1-1.el9.x86_64.rpm 1.6 MB/s | 160 kB 00:00
2026-03-08T23:57:52.462 INFO:teuthology.orchestra.run.vm03.stdout:(50/119): librabbitmq-0.11.0-7.el9.x86_64.rpm 898 kB/s | 45 kB 00:00
2026-03-08T23:57:52.573 INFO:teuthology.orchestra.run.vm03.stdout:(51/119): ceph-mgr-diskprediction-local-18.2.1- 3.7 MB/s | 7.4 MB 00:02
2026-03-08T23:57:52.624 INFO:teuthology.orchestra.run.vm03.stdout:(52/119): flexiblas-netlib-3.0.4-9.el9.x86_64.r 6.9 MB/s | 3.0 MB 00:00
2026-03-08T23:57:52.625 INFO:teuthology.orchestra.run.vm03.stdout:(53/119): librdkafka-1.6.1-102.el9.x86_64.rpm 4.0 MB/s | 662 kB 00:00
2026-03-08T23:57:52.695 INFO:teuthology.orchestra.run.vm03.stdout:(54/119): libxslt-1.1.34-12.el9.x86_64.rpm 3.2 MB/s | 233 kB 00:00
2026-03-08T23:57:52.735 INFO:teuthology.orchestra.run.vm03.stdout:(55/119): lttng-ust-2.12.0-6.el9.x86_64.rpm 2.6 MB/s | 292 kB 00:00
2026-03-08T23:57:52.735 INFO:teuthology.orchestra.run.vm06.stdout:(119/119): librbd1-18.2.1-0.el9.x86_64.rpm 2.3 MB/s | 3.0 MB 00:01
2026-03-08T23:57:52.737 INFO:teuthology.orchestra.run.vm06.stdout:--------------------------------------------------------------------------------
2026-03-08T23:57:52.737 INFO:teuthology.orchestra.run.vm06.stdout:Total 15 MB/s | 182 MB 00:12
2026-03-08T23:57:52.745 INFO:teuthology.orchestra.run.vm03.stdout:(56/119): openblas-0.3.29-1.el9.x86_64.rpm 835 kB/s | 42 kB 00:00
2026-03-08T23:57:52.862 INFO:teuthology.orchestra.run.vm03.stdout:(57/119): libstoragemgmt-1.10.1-1.el9.x86_64.rp 854 kB/s | 246 kB 00:00
2026-03-08T23:57:52.939 INFO:teuthology.orchestra.run.vm03.stdout:(58/119): python3-devel-3.9.25-3.el9.x86_64.rpm 3.1 MB/s | 244 kB 00:00
2026-03-08T23:57:53.007 INFO:teuthology.orchestra.run.vm03.stdout:(59/119): python3-jinja2-2.11.3-8.el9.noarch.rp 3.6 MB/s | 249 kB 00:00
2026-03-08T23:57:53.072 INFO:teuthology.orchestra.run.vm03.stdout:(60/119): python3-jmespath-1.0.1-1.el9.noarch.r 737 kB/s | 48 kB 00:00
2026-03-08T23:57:53.116 INFO:teuthology.orchestra.run.vm03.stdout:(61/119): python3-babel-2.9.1-2.el9.noarch.rpm 16 MB/s | 6.0 MB 00:00
2026-03-08T23:57:53.149 INFO:teuthology.orchestra.run.vm03.stdout:(62/119): python3-libstoragemgmt-1.10.1-1.el9.x 2.3 MB/s | 177 kB 00:00
2026-03-08T23:57:53.184 INFO:teuthology.orchestra.run.vm03.stdout:(63/119): python3-mako-1.1.4-6.el9.noarch.rpm 2.5 MB/s | 172 kB 00:00
2026-03-08T23:57:53.202 INFO:teuthology.orchestra.run.vm03.stdout:(64/119): python3-markupsafe-1.1.1-12.el9.x86_6 666 kB/s | 35 kB 00:00
2026-03-08T23:57:53.234 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check
2026-03-08T23:57:53.270 INFO:teuthology.orchestra.run.vm03.stdout:(65/119): python3-numpy-f2py-1.23.5-2.el9.x86_6 6.3 MB/s | 442 kB 00:00
2026-03-08T23:57:53.281 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded.
2026-03-08T23:57:53.281 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test
2026-03-08T23:57:53.342 INFO:teuthology.orchestra.run.vm03.stdout:(66/119): python3-pyasn1-0.4.8-7.el9.noarch.rpm 2.1 MB/s | 157 kB 00:00
2026-03-08T23:57:53.426 INFO:teuthology.orchestra.run.vm03.stdout:(67/119): python3-pyasn1-modules-0.4.8-7.el9.no 3.3 MB/s | 277 kB 00:00
2026-03-08T23:57:53.483 INFO:teuthology.orchestra.run.vm03.stdout:(68/119): python3-requests-oauthlib-1.3.0-12.el 942 kB/s | 54 kB 00:00
2026-03-08T23:57:53.554 INFO:teuthology.orchestra.run.vm03.stdout:(69/119): python3-numpy-1.23.5-2.el9.x86_64.rpm 17 MB/s | 6.1 MB 00:00
2026-03-08T23:57:53.617 INFO:teuthology.orchestra.run.vm03.stdout:(70/119): python3-toml-0.10.2-6.el9.noarch.rpm 659 kB/s | 42 kB 00:00
2026-03-08T23:57:53.739 INFO:teuthology.orchestra.run.vm03.stdout:(71/119): socat-1.7.4.1-8.el9.x86_64.rpm 2.4 MB/s | 303 kB 00:00
2026-03-08T23:57:53.871 INFO:teuthology.orchestra.run.vm03.stdout:(72/119): openblas-openmp-0.3.29-1.el9.x86_64.r 4.7 MB/s | 5.3 MB 00:01
2026-03-08T23:57:53.951 INFO:teuthology.orchestra.run.vm03.stdout:(73/119): xmlstarlet-1.6.1-20.el9.x86_64.rpm 301 kB/s | 64 kB 00:00
2026-03-08T23:57:54.029 INFO:teuthology.orchestra.run.vm03.stdout:(74/119): fmt-8.1.1-5.el9.x86_64.rpm 701 kB/s | 111 kB 00:00
2026-03-08T23:57:54.050 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded.
2026-03-08T23:57:54.050 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction
2026-03-08T23:57:54.097 INFO:teuthology.orchestra.run.vm03.stdout:(75/119): gperftools-libs-2.9.1-3.el9.x86_64.rp 2.1 MB/s | 308 kB 00:00
2026-03-08T23:57:54.221 INFO:teuthology.orchestra.run.vm03.stdout:(76/119): libarrow-doc-9.0.0-15.el9.noarch.rpm 200 kB/s | 25 kB 00:00
2026-03-08T23:57:54.261 INFO:teuthology.orchestra.run.vm03.stdout:(77/119): liboath-2.6.12-1.el9.x86_64.rpm 1.2 MB/s | 49 kB 00:00
2026-03-08T23:57:54.265 INFO:teuthology.orchestra.run.vm03.stdout:(78/119): libunwind-1.6.2-1.el9.x86_64.rpm 19 MB/s | 67 kB 00:00
2026-03-08T23:57:54.285 INFO:teuthology.orchestra.run.vm03.stdout:(79/119): parquet-libs-9.0.0-15.el9.x86_64.rpm 41 MB/s | 838 kB 00:00
2026-03-08T23:57:54.297 INFO:teuthology.orchestra.run.vm03.stdout:(80/119): python3-asyncssh-2.13.2-5.el9.noarch. 46 MB/s | 548 kB 00:00
2026-03-08T23:57:54.301 INFO:teuthology.orchestra.run.vm03.stdout:(81/119): python3-autocommand-2.2.2-8.el9.noarc 8.8 MB/s | 29 kB 00:00
2026-03-08T23:57:54.304 INFO:teuthology.orchestra.run.vm03.stdout:(82/119): python3-backports-tarfile-1.2.0-1.el9 19 MB/s | 60 kB 00:00
2026-03-08T23:57:54.309 INFO:teuthology.orchestra.run.vm03.stdout:(83/119): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 10 MB/s | 43 kB 00:00
2026-03-08T23:57:54.334 INFO:teuthology.orchestra.run.vm03.stdout:(84/119): libarrow-9.0.0-15.el9.x86_64.rpm 14 MB/s | 4.4 MB 00:00
2026-03-08T23:57:54.335 INFO:teuthology.orchestra.run.vm03.stdout:(85/119): python3-cachetools-4.2.4-1.el9.noarch 1.2 MB/s | 32 kB 00:00
2026-03-08T23:57:54.337 INFO:teuthology.orchestra.run.vm03.stdout:(86/119): python3-certifi-2023.05.07-4.el9.noar 6.0 MB/s | 14 kB 00:00
2026-03-08T23:57:54.340 INFO:teuthology.orchestra.run.vm03.stdout:(87/119): python3-cheroot-10.0.1-4.el9.noarch.r 41 MB/s | 173 kB 00:00
2026-03-08T23:57:54.345 INFO:teuthology.orchestra.run.vm03.stdout:(88/119): python3-cherrypy-18.6.1-2.el9.noarch. 46 MB/s | 358 kB 00:00
2026-03-08T23:57:54.346 INFO:teuthology.orchestra.run.vm03.stdout:(89/119): python3-google-auth-2.45.0-1.el9.noar 39 MB/s | 254 kB 00:00
2026-03-08T23:57:54.348 INFO:teuthology.orchestra.run.vm03.stdout:(90/119): python3-jaraco-8.2.1-3.el9.noarch.rpm 4.7 MB/s | 11 kB 00:00
2026-03-08T23:57:54.349 INFO:teuthology.orchestra.run.vm03.stdout:(91/119): python3-jaraco-classes-3.2.1-5.el9.no 8.7 MB/s | 18 kB 00:00
2026-03-08T23:57:54.350 INFO:teuthology.orchestra.run.vm03.stdout:(92/119): python3-jaraco-collections-3.0.0-8.el 12 MB/s | 23 kB 00:00
2026-03-08T23:57:54.352 INFO:teuthology.orchestra.run.vm03.stdout:(93/119): python3-jaraco-context-6.0.1-3.el9.no 5.6 MB/s | 20 kB 00:00
2026-03-08T23:57:54.353 INFO:teuthology.orchestra.run.vm03.stdout:(94/119): python3-jaraco-functools-3.5.0-2.el9. 5.8 MB/s | 19 kB 00:00
2026-03-08T23:57:54.356 INFO:teuthology.orchestra.run.vm03.stdout:(95/119): python3-jaraco-text-4.0.0-2.el9.noarc 8.0 MB/s | 26 kB 00:00
2026-03-08T23:57:54.357 INFO:teuthology.orchestra.run.vm03.stdout:(96/119): python3-jwt+crypto-2.4.0-1.el9.noarch 2.9 MB/s | 9.0 kB 00:00
2026-03-08T23:57:54.359 INFO:teuthology.orchestra.run.vm03.stdout:(97/119): python3-jwt-2.4.0-1.el9.noarch.rpm 16 MB/s | 41 kB 00:00
2026-03-08T23:57:54.364 INFO:teuthology.orchestra.run.vm03.stdout:(98/119): python3-logutils-0.3.5-21.el9.noarch. 8.7 MB/s | 46 kB 00:00
2026-03-08T23:57:54.368 INFO:teuthology.orchestra.run.vm03.stdout:(99/119): python3-more-itertools-8.12.0-2.el9.n 23 MB/s | 79 kB 00:00
2026-03-08T23:57:54.374 INFO:teuthology.orchestra.run.vm03.stdout:(100/119): python3-natsort-7.1.1-5.el9.noarch.r 9.6 MB/s | 58 kB 00:00
2026-03-08T23:57:54.379 INFO:teuthology.orchestra.run.vm03.stdout:(101/119): python3-kubernetes-26.1.0-3.el9.noar 47 MB/s | 1.0 MB 00:00
2026-03-08T23:57:54.381 INFO:teuthology.orchestra.run.vm03.stdout:(102/119): python3-pecan-1.4.2-3.el9.noarch.rpm 42 MB/s | 272 kB 00:00
2026-03-08T23:57:54.382 INFO:teuthology.orchestra.run.vm03.stdout:(103/119): python3-portend-3.1.0-2.el9.noarch.r 4.7 MB/s | 16 kB 00:00
2026-03-08T23:57:54.385 INFO:teuthology.orchestra.run.vm03.stdout:(104/119): python3-pyOpenSSL-21.0.0-1.el9.noarc 28 MB/s | 90 kB 00:00
2026-03-08T23:57:54.386 INFO:teuthology.orchestra.run.vm03.stdout:(105/119): python3-repoze-lru-0.7-16.el9.noarch 9.6 MB/s | 31 kB 00:00
2026-03-08T23:57:54.389 INFO:teuthology.orchestra.run.vm03.stdout:(106/119): python3-rsa-4.9-2.el9.noarch.rpm 17 MB/s | 59 kB 00:00
2026-03-08T23:57:54.390 INFO:teuthology.orchestra.run.vm03.stdout:(107/119): python3-routes-2.5.1-5.el9.noarch.rp 38 MB/s | 188 kB 00:00
2026-03-08T23:57:54.392 INFO:teuthology.orchestra.run.vm03.stdout:(108/119): python3-tempora-5.0.0-2.el9.noarch.r 14 MB/s | 36 kB 00:00
2026-03-08T23:57:54.394 INFO:teuthology.orchestra.run.vm03.stdout:(109/119): python3-typing-extensions-4.15.0-1.e 24 MB/s | 86 kB 00:00
2026-03-08T23:57:54.397 INFO:teuthology.orchestra.run.vm03.stdout:(110/119): python3-webob-1.8.8-2.el9.noarch.rpm 50 MB/s | 230 kB 00:00
2026-03-08T23:57:54.398 INFO:teuthology.orchestra.run.vm03.stdout:(111/119): python3-websocket-client-1.2.3-2.el9 24 MB/s | 90 kB 00:00
2026-03-08T23:57:54.401 INFO:teuthology.orchestra.run.vm03.stdout:(112/119): python3-xmltodict-0.12.0-15.el9.noar 7.2 MB/s | 22 kB 00:00
2026-03-08T23:57:54.404 INFO:teuthology.orchestra.run.vm03.stdout:(113/119): python3-werkzeug-2.0.3-3.el9.1.noarc 60 MB/s | 427 kB 00:00
2026-03-08T23:57:54.405 INFO:teuthology.orchestra.run.vm03.stdout:(114/119): python3-zc-lockfile-2.0-10.el9.noarc 5.7 MB/s | 20 kB 00:00
2026-03-08T23:57:54.410 INFO:teuthology.orchestra.run.vm03.stdout:(115/119): re2-20211101-20.el9.x86_64.rpm 32 MB/s | 191 kB 00:00
2026-03-08T23:57:54.432 INFO:teuthology.orchestra.run.vm03.stdout:(116/119): thrift-0.15.0-4.el9.x86_64.rpm 58 MB/s | 1.6 MB 00:00
2026-03-08T23:57:54.938 INFO:teuthology.orchestra.run.vm03.stdout:(117/119): python3-scipy-1.9.3-2.el9.x86_64.rpm 13 MB/s | 19 MB 00:01
2026-03-08T23:57:55.119 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1
2026-03-08T23:57:55.127 INFO:teuthology.orchestra.run.vm06.stdout: Installing : thrift-0.15.0-4.el9.x86_64 1/121
2026-03-08T23:57:55.140 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 2/121
2026-03-08T23:57:55.309 INFO:teuthology.orchestra.run.vm06.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/121
2026-03-08T23:57:55.315 INFO:teuthology.orchestra.run.vm06.stdout: Upgrading : librados2-2:18.2.1-0.el9.x86_64 4/121
2026-03-08T23:57:55.365 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librados2-2:18.2.1-0.el9.x86_64 4/121
2026-03-08T23:57:55.366 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libcephfs2-2:18.2.1-0.el9.x86_64 5/121
2026-03-08T23:57:55.397 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libcephfs2-2:18.2.1-0.el9.x86_64 5/121
2026-03-08T23:57:55.408 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-rados-2:18.2.1-0.el9.x86_64 6/121
2026-03-08T23:57:55.411 INFO:teuthology.orchestra.run.vm06.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/121
2026-03-08T23:57:55.413 INFO:teuthology.orchestra.run.vm06.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/121
2026-03-08T23:57:55.423 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/121
2026-03-08T23:57:55.424 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libcephsqlite-2:18.2.1-0.el9.x86_64 10/121
2026-03-08T23:57:55.444 INFO:teuthology.orchestra.run.vm03.stdout:(118/119): librbd1-18.2.1-0.el9.x86_64.rpm 3.0 MB/s | 3.0 MB 00:01
2026-03-08T23:57:55.461 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libcephsqlite-2:18.2.1-0.el9.x86_64 10/121
2026-03-08T23:57:55.463 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libradosstriper1-2:18.2.1-0.el9.x86_64 11/121
2026-03-08T23:57:55.514 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libradosstriper1-2:18.2.1-0.el9.x86_64 11/121
2026-03-08T23:57:55.520 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 12/121
2026-03-08T23:57:55.547 INFO:teuthology.orchestra.run.vm06.stdout: Installing : liboath-2.6.12-1.el9.x86_64 13/121
2026-03-08T23:57:55.556 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 14/121
2026-03-08T23:57:55.560 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 15/121
2026-03-08T23:57:55.589 INFO:teuthology.orchestra.run.vm06.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 16/121
2026-03-08T23:57:55.611 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 17/121
2026-03-08T23:57:55.616 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 18/121
2026-03-08T23:57:55.623 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 19/121
2026-03-08T23:57:55.626 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 20/121
2026-03-08T23:57:55.632 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 21/121
2026-03-08T23:57:55.642 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 22/121
2026-03-08T23:57:55.656 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cephfs-2:18.2.1-0.el9.x86_64 23/121
2026-03-08T23:57:55.694 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/121
2026-03-08T23:57:55.749 INFO:teuthology.orchestra.run.vm03.stdout:(119/119): librados2-18.2.1-0.el9.x86_64.rpm 2.4 MB/s | 3.3 MB 00:01
2026-03-08T23:57:55.752 INFO:teuthology.orchestra.run.vm03.stdout:--------------------------------------------------------------------------------
2026-03-08T23:57:55.753 INFO:teuthology.orchestra.run.vm03.stdout:Total 15 MB/s | 182 MB 00:11
2026-03-08T23:57:55.763 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 25/121
2026-03-08T23:57:55.781 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/121
2026-03-08T23:57:55.789 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-rsa-4.9-2.el9.noarch 27/121
2026-03-08T23:57:55.801 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 28/121
2026-03-08T23:57:55.807 INFO:teuthology.orchestra.run.vm06.stdout: Installing : librados-devel-2:18.2.1-0.el9.x86_64 29/121
2026-03-08T23:57:55.846 INFO:teuthology.orchestra.run.vm06.stdout: Installing : re2-1:20211101-20.el9.x86_64 30/121
2026-03-08T23:57:55.853 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 31/121
2026-03-08T23:57:55.876 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 32/121
2026-03-08T23:57:55.905 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 33/121
2026-03-08T23:57:55.912 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 34/121
2026-03-08T23:57:55.920 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 35/121
2026-03-08T23:57:55.936 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 36/121
2026-03-08T23:57:55.949 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 37/121
2026-03-08T23:57:55.969 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 38/121
2026-03-08T23:57:56.052 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 39/121
2026-03-08T23:57:56.061 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 40/121
2026-03-08T23:57:56.074 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 41/121
2026-03-08T23:57:56.140 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 42/121
2026-03-08T23:57:56.343 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check
2026-03-08T23:57:56.393 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded.
2026-03-08T23:57:56.402 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test
2026-03-08T23:57:56.605 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 43/121
2026-03-08T23:57:56.644 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 44/121
2026-03-08T23:57:56.651 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 45/121
2026-03-08T23:57:56.660 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 46/121
2026-03-08T23:57:56.684 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 47/121
2026-03-08T23:57:56.694 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 48/121
2026-03-08T23:57:56.700 INFO:teuthology.orchestra.run.vm06.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 49/121
2026-03-08T23:57:56.704 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 50/121
2026-03-08T23:57:56.718 INFO:teuthology.orchestra.run.vm06.stdout: Installing : fmt-8.1.1-5.el9.x86_64 51/121
2026-03-08T23:57:56.727 INFO:teuthology.orchestra.run.vm06.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 52/121
2026-03-08T23:57:56.734 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 53/121
2026-03-08T23:57:56.744 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 54/121
2026-03-08T23:57:56.750 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 55/121
2026-03-08T23:57:56.761 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 56/121
2026-03-08T23:57:56.767 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 57/121
2026-03-08T23:57:56.813 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 58/121
2026-03-08T23:57:57.115 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 59/121
2026-03-08T23:57:57.145 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 60/121
2026-03-08T23:57:57.152 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 61/121
2026-03-08T23:57:57.182 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded.
2026-03-08T23:57:57.183 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction
2026-03-08T23:57:57.222 INFO:teuthology.orchestra.run.vm06.stdout: Installing : openblas-0.3.29-1.el9.x86_64 62/121
2026-03-08T23:57:57.225 INFO:teuthology.orchestra.run.vm06.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 63/121
2026-03-08T23:57:57.255 INFO:teuthology.orchestra.run.vm06.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 64/121
2026-03-08T23:57:57.686 INFO:teuthology.orchestra.run.vm06.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 65/121
2026-03-08T23:57:57.779 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 66/121
2026-03-08T23:57:58.072 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1
2026-03-08T23:57:58.081 INFO:teuthology.orchestra.run.vm03.stdout: Installing : thrift-0.15.0-4.el9.x86_64 1/121
2026-03-08T23:57:58.094 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-more-itertools-8.12.0-2.el9.noarch 2/121
2026-03-08T23:57:58.279 INFO:teuthology.orchestra.run.vm03.stdout: Installing : lttng-ust-2.12.0-6.el9.x86_64 3/121
2026-03-08T23:57:58.281 INFO:teuthology.orchestra.run.vm03.stdout: Upgrading : librados2-2:18.2.1-0.el9.x86_64 4/121
2026-03-08T23:57:58.331 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librados2-2:18.2.1-0.el9.x86_64 4/121
2026-03-08T23:57:58.332 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libcephfs2-2:18.2.1-0.el9.x86_64 5/121
2026-03-08T23:57:58.365 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libcephfs2-2:18.2.1-0.el9.x86_64 5/121
2026-03-08T23:57:58.376 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-rados-2:18.2.1-0.el9.x86_64 6/121
2026-03-08T23:57:58.380 INFO:teuthology.orchestra.run.vm03.stdout: Installing : librdkafka-1.6.1-102.el9.x86_64 7/121
2026-03-08T23:57:58.383 INFO:teuthology.orchestra.run.vm03.stdout: Installing : librabbitmq-0.11.0-7.el9.x86_64 8/121
2026-03-08T23:57:58.392 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-8.2.1-3.el9.noarch 9/121
2026-03-08T23:57:58.393 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libcephsqlite-2:18.2.1-0.el9.x86_64 10/121
2026-03-08T23:57:58.430 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libcephsqlite-2:18.2.1-0.el9.x86_64 10/121
2026-03-08T23:57:58.432 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libradosstriper1-2:18.2.1-0.el9.x86_64 11/121
2026-03-08T23:57:58.485 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libradosstriper1-2:18.2.1-0.el9.x86_64 11/121
2026-03-08T23:57:58.490 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-werkzeug-2.0.3-3.el9.1.noarch 12/121
2026-03-08T23:57:58.518 INFO:teuthology.orchestra.run.vm03.stdout: Installing : liboath-2.6.12-1.el9.x86_64 13/121
2026-03-08T23:57:58.527 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-pyasn1-0.4.8-7.el9.noarch 14/121
2026-03-08T23:57:58.530 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-markupsafe-1.1.1-12.el9.x86_64 15/121
2026-03-08T23:57:58.562 INFO:teuthology.orchestra.run.vm03.stdout: Installing : flexiblas-3.0.4-9.el9.x86_64 16/121
2026-03-08T23:57:58.581 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-urllib3-1.26.5-7.el9.noarch 17/121
2026-03-08T23:57:58.585 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-requests-2.25.1-10.el9.noarch 18/121
2026-03-08T23:57:58.594 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libquadmath-11.5.0-14.el9.x86_64 19/121
2026-03-08T23:57:58.597 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libgfortran-11.5.0-14.el9.x86_64 20/121
2026-03-08T23:57:58.603 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ledmon-libs-1.1.0-3.el9.x86_64 21/121
2026-03-08T23:57:58.605 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121
2026-03-08T23:57:58.615 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 22/121
2026-03-08T23:57:58.631 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-cephfs-2:18.2.1-0.el9.x86_64 23/121
2026-03-08T23:57:58.631 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 68/121
2026-03-08T23:57:58.638 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 69/121
2026-03-08T23:57:58.642 INFO:teuthology.orchestra.run.vm06.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 70/121
2026-03-08T23:57:58.670 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/121
2026-03-08T23:57:58.736 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-mako-1.1.4-6.el9.noarch 25/121
2026-03-08T23:57:58.753 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/121
2026-03-08T23:57:58.762 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-rsa-4.9-2.el9.noarch 27/121
2026-03-08T23:57:58.773 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-classes-3.2.1-5.el9.noarch 28/121
2026-03-08T23:57:58.778 INFO:teuthology.orchestra.run.vm03.stdout: Installing : librados-devel-2:18.2.1-0.el9.x86_64 29/121
2026-03-08T23:57:58.796 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 71/121
2026-03-08T23:57:58.798 INFO:teuthology.orchestra.run.vm06.stdout: Upgrading : librbd1-2:18.2.1-0.el9.x86_64 72/121
2026-03-08T23:57:58.818 INFO:teuthology.orchestra.run.vm03.stdout: Installing : re2-1:20211101-20.el9.x86_64 30/121
2026-03-08T23:57:58.827 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libarrow-9.0.0-15.el9.x86_64 31/121
2026-03-08T23:57:58.830 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librbd1-2:18.2.1-0.el9.x86_64 72/121
2026-03-08T23:57:58.834 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-rbd-2:18.2.1-0.el9.x86_64 73/121
2026-03-08T23:57:58.841 INFO:teuthology.orchestra.run.vm06.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 74/121
2026-03-08T23:57:58.848 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-zc-lockfile-2.0-10.el9.noarch 32/121
2026-03-08T23:57:58.878 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-websocket-client-1.2.3-2.el9.noarch 33/121
2026-03-08T23:57:58.887 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-webob-1.8.8-2.el9.noarch 34/121
2026-03-08T23:57:58.895 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-typing-extensions-4.15.0-1.el9.noarch 35/121
2026-03-08T23:57:58.913 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-repoze-lru-0.7-16.el9.noarch 36/121
2026-03-08T23:57:58.927 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-routes-2.5.1-5.el9.noarch 37/121
2026-03-08T23:57:58.939 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-natsort-7.1.1-5.el9.noarch 38/121
2026-03-08T23:57:59.015 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-logutils-0.3.5-21.el9.noarch 39/121
2026-03-08T23:57:59.031 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-pecan-1.4.2-3.el9.noarch 40/121
2026-03-08T23:57:59.043 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-certifi-2023.05.07-4.el9.noarch 41/121
2026-03-08T23:57:59.061 INFO:teuthology.orchestra.run.vm06.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 75/121
2026-03-08T23:57:59.064 INFO:teuthology.orchestra.run.vm06.stdout: Installing : librgw2-2:18.2.1-0.el9.x86_64 76/121
2026-03-08T23:57:59.086 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librgw2-2:18.2.1-0.el9.x86_64 76/121
2026-03-08T23:57:59.096 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-rgw-2:18.2.1-0.el9.x86_64 77/121
2026-03-08T23:57:59.096 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-cachetools-4.2.4-1.el9.noarch 42/121
2026-03-08T23:57:59.113 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-ply-3.11-14.el9.noarch 78/121
2026-03-08T23:57:59.134 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 79/121
2026-03-08T23:57:59.234 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 80/121
2026-03-08T23:57:59.247 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 81/121
2026-03-08T23:57:59.282 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 82/121
2026-03-08T23:57:59.325 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 83/121
2026-03-08T23:57:59.390 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 84/121
2026-03-08T23:57:59.403 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 85/121
2026-03-08T23:57:59.405 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jwt-2.4.0-1.el9.noarch 86/121
2026-03-08T23:57:59.411 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jwt+crypto-2.4.0-1.el9.noarch 87/121
2026-03-08T23:57:59.416 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 88/121
2026-03-08T23:57:59.420 INFO:teuthology.orchestra.run.vm06.stdout: Installing : mailcap-2.1.49-5.el9.noarch 89/121
2026-03-08T23:57:59.423 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 90/121
2026-03-08T23:57:59.442 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-08T23:57:59.442 INFO:teuthology.orchestra.run.vm06.stdout:Creating group 'libstoragemgmt' with GID 994.
2026-03-08T23:57:59.442 INFO:teuthology.orchestra.run.vm06.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994.
2026-03-08T23:57:59.442 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:57:59.457 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-08T23:57:59.489 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-08T23:57:59.489 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service.
2026-03-08T23:57:59.489 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:57:59.508 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 92/121
2026-03-08T23:57:59.534 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-google-auth-1:2.45.0-1.el9.noarch 43/121
2026-03-08T23:57:59.551 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-kubernetes-1:26.1.0-3.el9.noarch 44/121
2026-03-08T23:57:59.557 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-backports-tarfile-1.2.0-1.el9.noarch 45/121
2026-03-08T23:57:59.566 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-context-6.0.1-3.el9.noarch 46/121
2026-03-08T23:57:59.571 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-autocommand-2.2.2-8.el9.noarch 47/121
2026-03-08T23:57:59.572 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: cephadm-2:18.2.1-0.el9.noarch 93/121
2026-03-08T23:57:59.575 INFO:teuthology.orchestra.run.vm06.stdout: Installing : cephadm-2:18.2.1-0.el9.noarch 93/121
2026-03-08T23:57:59.580 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libunwind-1.6.2-1.el9.x86_64 48/121
2026-03-08T23:57:59.580 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 94/121
2026-03-08T23:57:59.583 INFO:teuthology.orchestra.run.vm03.stdout: Installing : gperftools-libs-2.9.1-3.el9.x86_64 49/121
2026-03-08T23:57:59.586 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libarrow-doc-9.0.0-15.el9.noarch 50/121
2026-03-08T23:57:59.598 INFO:teuthology.orchestra.run.vm03.stdout: Installing : fmt-8.1.1-5.el9.x86_64 51/121
2026-03-08T23:57:59.606 INFO:teuthology.orchestra.run.vm03.stdout: Installing : socat-1.7.4.1-8.el9.x86_64 52/121
2026-03-08T23:57:59.611 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 95/121
2026-03-08T23:57:59.612 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-toml-0.10.2-6.el9.noarch 53/121
2026-03-08T23:57:59.616 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-ceph-common-2:18.2.1-0.el9.x86_64 96/121
2026-03-08T23:57:59.625 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-functools-3.5.0-2.el9.noarch 54/121
2026-03-08T23:57:59.633 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-text-4.0.0-2.el9.noarch 55/121
2026-03-08T23:57:59.646 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jaraco-collections-3.0.0-8.el9.noarch 56/121
2026-03-08T23:57:59.652 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-tempora-5.0.0-2.el9.noarch 57/121
2026-03-08T23:57:59.699 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-portend-3.1.0-2.el9.noarch 58/121
2026-03-08T23:58:00.017 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-devel-3.9.25-3.el9.x86_64 59/121
2026-03-08T23:58:00.052 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-babel-2.9.1-2.el9.noarch 60/121
2026-03-08T23:58:00.059 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jinja2-2.11.3-8.el9.noarch 61/121
2026-03-08T23:58:00.130 INFO:teuthology.orchestra.run.vm03.stdout: Installing : openblas-0.3.29-1.el9.x86_64 62/121
2026-03-08T23:58:00.133 INFO:teuthology.orchestra.run.vm03.stdout: Installing : openblas-openmp-0.3.29-1.el9.x86_64 63/121
2026-03-08T23:58:00.163 INFO:teuthology.orchestra.run.vm03.stdout: Installing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 64/121
2026-03-08T23:58:00.604 INFO:teuthology.orchestra.run.vm03.stdout: Installing : flexiblas-netlib-3.0.4-9.el9.x86_64 65/121
2026-03-08T23:58:00.701 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-numpy-1:1.23.5-2.el9.x86_64 66/121
2026-03-08T23:58:00.719 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-common-2:18.2.1-0.el9.x86_64 97/121
2026-03-08T23:58:00.746 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-common-2:18.2.1-0.el9.x86_64 97/121
2026-03-08T23:58:01.098 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-common-2:18.2.1-0.el9.x86_64 97/121
2026-03-08T23:58:01.105 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-base-2:18.2.1-0.el9.x86_64 98/121
2026-03-08T23:58:01.156 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 98/121
2026-03-08T23:58:01.156 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target.
2026-03-08T23:58:01.157 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service.
2026-03-08T23:58:01.157 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:58:01.164 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-selinux-2:18.2.1-0.el9.x86_64 99/121
2026-03-08T23:58:01.603 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121
2026-03-08T23:58:01.635 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-scipy-1.9.3-2.el9.x86_64 68/121
2026-03-08T23:58:01.641 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libxslt-1.1.34-12.el9.x86_64 69/121
2026-03-08T23:58:01.647 INFO:teuthology.orchestra.run.vm03.stdout: Installing : xmlstarlet-1.6.1-20.el9.x86_64 70/121
2026-03-08T23:58:01.818 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libpmemobj-1.12.1-1.el9.x86_64 71/121
2026-03-08T23:58:01.820 INFO:teuthology.orchestra.run.vm03.stdout: Upgrading : librbd1-2:18.2.1-0.el9.x86_64 72/121
2026-03-08T23:58:01.854 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librbd1-2:18.2.1-0.el9.x86_64 72/121
2026-03-08T23:58:01.857 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-rbd-2:18.2.1-0.el9.x86_64 73/121
2026-03-08T23:58:01.864 INFO:teuthology.orchestra.run.vm03.stdout: Installing : boost-program-options-1.75.0-13.el9.x86_64 74/121
2026-03-08T23:58:02.091 INFO:teuthology.orchestra.run.vm03.stdout: Installing : parquet-libs-9.0.0-15.el9.x86_64 75/121
2026-03-08T23:58:02.093 INFO:teuthology.orchestra.run.vm03.stdout: Installing : librgw2-2:18.2.1-0.el9.x86_64 76/121
2026-03-08T23:58:02.111 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librgw2-2:18.2.1-0.el9.x86_64 76/121
2026-03-08T23:58:02.121 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-rgw-2:18.2.1-0.el9.x86_64 77/121
2026-03-08T23:58:02.142 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-ply-3.11-14.el9.noarch 78/121
2026-03-08T23:58:02.163 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-pycparser-2.20-6.el9.noarch 79/121
2026-03-08T23:58:02.264 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 80/121
2026-03-08T23:58:02.278 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 81/121
2026-03-08T23:58:02.309 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 82/121
2026-03-08T23:58:02.349 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 83/121
2026-03-08T23:58:02.417 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 84/121
2026-03-08T23:58:02.429 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 85/121
2026-03-08T23:58:02.432 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jwt-2.4.0-1.el9.noarch 86/121
2026-03-08T23:58:02.439 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jwt+crypto-2.4.0-1.el9.noarch 87/121
2026-03-08T23:58:02.443 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 88/121
2026-03-08T23:58:02.447 INFO:teuthology.orchestra.run.vm03.stdout: Installing : mailcap-2.1.49-5.el9.noarch 89/121
2026-03-08T23:58:02.450 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 90/121
2026-03-08T23:58:02.469 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-08T23:58:02.470 INFO:teuthology.orchestra.run.vm03.stdout:Creating group 'libstoragemgmt' with GID 994.
2026-03-08T23:58:02.470 INFO:teuthology.orchestra.run.vm03.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994.
2026-03-08T23:58:02.470 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-08T23:58:02.480 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-08T23:58:02.511 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-08T23:58:02.511 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service.
2026-03-08T23:58:02.511 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-08T23:58:02.528 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 92/121
2026-03-08T23:58:02.597 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: cephadm-2:18.2.1-0.el9.noarch 93/121
2026-03-08T23:58:02.601 INFO:teuthology.orchestra.run.vm03.stdout: Installing : cephadm-2:18.2.1-0.el9.noarch 93/121
2026-03-08T23:58:02.607 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 94/121
2026-03-08T23:58:02.642 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 95/121
2026-03-08T23:58:02.645 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-ceph-common-2:18.2.1-0.el9.x86_64 96/121
2026-03-08T23:58:03.641 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-common-2:18.2.1-0.el9.x86_64 97/121
2026-03-08T23:58:03.651 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-common-2:18.2.1-0.el9.x86_64 97/121
2026-03-08T23:58:03.993 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-common-2:18.2.1-0.el9.x86_64 97/121
2026-03-08T23:58:04.056 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-base-2:18.2.1-0.el9.x86_64 98/121
2026-03-08T23:58:04.112 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 98/121
2026-03-08T23:58:04.112 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target.
2026-03-08T23:58:04.112 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service.
2026-03-08T23:58:04.112 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-08T23:58:04.118 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-selinux-2:18.2.1-0.el9.x86_64 99/121
2026-03-08T23:58:08.331 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-selinux-2:18.2.1-0.el9.x86_64 99/121
2026-03-08T23:58:08.331 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /sys
2026-03-08T23:58:08.331 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /proc
2026-03-08T23:58:08.331 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /mnt
2026-03-08T23:58:08.331 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /var/tmp
2026-03-08T23:58:08.331 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /home
2026-03-08T23:58:08.331 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /root
2026-03-08T23:58:08.331 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /tmp
2026-03-08T23:58:08.331 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:58:08.364 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 100/121
2026-03-08T23:58:08.497 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 100/121
2026-03-08T23:58:08.501 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 101/121
2026-03-08T23:58:09.025 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 101/121
2026-03-08T23:58:09.028 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 102/121
2026-03-08T23:58:09.091 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 102/121
2026-03-08T23:58:09.170 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 103/121
2026-03-08T23:58:09.172 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-2:18.2.1-0.el9.x86_64 104/121
2026-03-08T23:58:09.197 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 104/121
2026-03-08T23:58:09.197 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:58:09.197 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-08T23:58:09.197 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-08T23:58:09.197 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-08T23:58:09.197 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:58:09.213 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mgr-rook-2:18.2.1-0.el9.noarch 105/121
2026-03-08T23:58:09.325 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.1-0.el9.noarch 105/121
2026-03-08T23:58:09.327 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mds-2:18.2.1-0.el9.x86_64 106/121
2026-03-08T23:58:09.353 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 106/121
2026-03-08T23:58:09.353 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:58:09.353 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-08T23:58:09.353 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-08T23:58:09.353 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-08T23:58:09.353 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:58:09.747 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-mon-2:18.2.1-0.el9.x86_64 107/121
2026-03-08T23:58:09.770 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 107/121
2026-03-08T23:58:09.770 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:58:09.770 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-08T23:58:09.770 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-08T23:58:09.770 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-08T23:58:09.770 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:58:10.563 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-osd-2:18.2.1-0.el9.x86_64 108/121
2026-03-08T23:58:10.590 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 108/121
2026-03-08T23:58:10.590 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:58:10.590 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-08T23:58:10.590 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target.
2026-03-08T23:58:10.590 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target.
2026-03-08T23:58:10.590 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:58:10.975 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-selinux-2:18.2.1-0.el9.x86_64 99/121
2026-03-08T23:58:10.975 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /sys
2026-03-08T23:58:10.975 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /proc
2026-03-08T23:58:10.975 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /mnt
2026-03-08T23:58:10.975 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /var/tmp
2026-03-08T23:58:10.975 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /home
2026-03-08T23:58:10.975 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /root
2026-03-08T23:58:10.975 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /tmp
2026-03-08T23:58:10.975 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-08T23:58:10.976 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-2:18.2.1-0.el9.x86_64 109/121
2026-03-08T23:58:10.980 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-radosgw-2:18.2.1-0.el9.x86_64 110/121
2026-03-08T23:58:11.003 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 110/121
2026-03-08T23:58:11.003 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:58:11.003 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service".
2026-03-08T23:58:11.003 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target.
2026-03-08T23:58:11.003 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target.
2026-03-08T23:58:11.003 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:58:11.005 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 100/121
2026-03-08T23:58:11.014 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 111/121
2026-03-08T23:58:11.039 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 111/121
2026-03-08T23:58:11.039 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:58:11.039 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-08T23:58:11.039 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:58:11.132 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 100/121
2026-03-08T23:58:11.139 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 101/121
2026-03-08T23:58:11.193 INFO:teuthology.orchestra.run.vm06.stdout: Installing : rbd-mirror-2:18.2.1-0.el9.x86_64 112/121
2026-03-08T23:58:11.219 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 112/121
2026-03-08T23:58:11.219 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:58:11.219 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-08T23:58:11.219 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target.
2026-03-08T23:58:11.219 INFO:teuthology.orchestra.run.vm06.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target.
2026-03-08T23:58:11.219 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:58:11.680 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 101/121
2026-03-08T23:58:11.682 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 102/121
2026-03-08T23:58:11.742 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 102/121
2026-03-08T23:58:11.817 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 103/121
2026-03-08T23:58:11.820 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-2:18.2.1-0.el9.x86_64 104/121
2026-03-08T23:58:11.843 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 104/121
2026-03-08T23:58:11.843 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:58:11.843 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-08T23:58:11.843 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-08T23:58:11.843 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target.
2026-03-08T23:58:11.843 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-08T23:58:11.857 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mgr-rook-2:18.2.1-0.el9.noarch 105/121
2026-03-08T23:58:11.971 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.1-0.el9.noarch 105/121
2026-03-08T23:58:11.973 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mds-2:18.2.1-0.el9.x86_64 106/121
2026-03-08T23:58:11.996 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 106/121
2026-03-08T23:58:11.996 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:58:11.996 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-08T23:58:11.996 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-08T23:58:11.996 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target.
2026-03-08T23:58:11.996 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-08T23:58:12.235 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-mon-2:18.2.1-0.el9.x86_64 107/121
2026-03-08T23:58:12.259 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 107/121
2026-03-08T23:58:12.259 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:58:12.259 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-08T23:58:12.259 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-08T23:58:12.259 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target.
2026-03-08T23:58:12.259 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-08T23:58:13.127 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-osd-2:18.2.1-0.el9.x86_64 108/121
2026-03-08T23:58:13.157 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 108/121
2026-03-08T23:58:13.157 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:58:13.157 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-08T23:58:13.157 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target.
2026-03-08T23:58:13.157 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target.
2026-03-08T23:58:13.157 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-08T23:58:13.301 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-test-2:18.2.1-0.el9.x86_64 113/121
2026-03-08T23:58:13.313 INFO:teuthology.orchestra.run.vm06.stdout: Installing : rbd-fuse-2:18.2.1-0.el9.x86_64 114/121
2026-03-08T23:58:13.317 INFO:teuthology.orchestra.run.vm06.stdout: Installing : rbd-nbd-2:18.2.1-0.el9.x86_64 115/121
2026-03-08T23:58:13.359 INFO:teuthology.orchestra.run.vm06.stdout: Installing : libcephfs-devel-2:18.2.1-0.el9.x86_64 116/121
2026-03-08T23:58:13.367 INFO:teuthology.orchestra.run.vm06.stdout: Installing : ceph-fuse-2:18.2.1-0.el9.x86_64 117/121
2026-03-08T23:58:13.376 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 118/121
2026-03-08T23:58:13.380 INFO:teuthology.orchestra.run.vm06.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 119/121
2026-03-08T23:58:13.380 INFO:teuthology.orchestra.run.vm06.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 120/121
2026-03-08T23:58:13.398 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 120/121
2026-03-08T23:58:13.398 INFO:teuthology.orchestra.run.vm06.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 121/121
2026-03-08T23:58:13.532 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-2:18.2.1-0.el9.x86_64 109/121
2026-03-08T23:58:13.535 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-radosgw-2:18.2.1-0.el9.x86_64 110/121
2026-03-08T23:58:13.559 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 110/121
2026-03-08T23:58:13.559 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:58:13.559 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service".
2026-03-08T23:58:13.559 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target.
2026-03-08T23:58:13.559 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target.
2026-03-08T23:58:13.559 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-08T23:58:13.570 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 111/121
2026-03-08T23:58:13.591 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 111/121
2026-03-08T23:58:13.591 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:58:13.591 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-08T23:58:13.591 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-08T23:58:13.739 INFO:teuthology.orchestra.run.vm03.stdout: Installing : rbd-mirror-2:18.2.1-0.el9.x86_64 112/121
2026-03-08T23:58:13.762 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 112/121
2026-03-08T23:58:13.762 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-08T23:58:13.762 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-08T23:58:13.762 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target.
2026-03-08T23:58:13.762 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target.
2026-03-08T23:58:13.762 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-08T23:58:14.557 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 121/121
2026-03-08T23:58:14.557 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-2:18.2.1-0.el9.x86_64 1/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-base-2:18.2.1-0.el9.x86_64 2/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-common-2:18.2.1-0.el9.x86_64 3/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-fuse-2:18.2.1-0.el9.x86_64 4/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 5/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mds-2:18.2.1-0.el9.x86_64 6/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-2:18.2.1-0.el9.x86_64 7/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mon-2:18.2.1-0.el9.x86_64 8/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-osd-2:18.2.1-0.el9.x86_64 9/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-radosgw-2:18.2.1-0.el9.x86_64 10/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-selinux-2:18.2.1-0.el9.x86_64 11/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-test-2:18.2.1-0.el9.x86_64 12/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephfs-devel-2:18.2.1-0.el9.x86_64 13/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephfs2-2:18.2.1-0.el9.x86_64 14/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephsqlite-2:18.2.1-0.el9.x86_64 15/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librados-devel-2:18.2.1-0.el9.x86_64 16/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libradosstriper1-2:18.2.1-0.el9.x86_64 17/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librgw2-2:18.2.1-0.el9.x86_64 18/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 19/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ceph-common-2:18.2.1-0.el9.x86_64 20/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cephfs-2:18.2.1-0.el9.x86_64 21/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rados-2:18.2.1-0.el9.x86_64 22/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rbd-2:18.2.1-0.el9.x86_64 23/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rgw-2:18.2.1-0.el9.x86_64 24/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-fuse-2:18.2.1-0.el9.x86_64 25/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-mirror-2:18.2.1-0.el9.x86_64 26/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-nbd-2:18.2.1-0.el9.x86_64 27/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 28/121
2026-03-08T23:58:14.558 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 29/121
2026-03-08T23:58:14.559 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 30/121
2026-03-08T23:58:14.559 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 31/121
2026-03-08T23:58:14.559 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 32/121
2026-03-08T23:58:14.559 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-rook-2:18.2.1-0.el9.noarch 33/121
2026-03-08T23:58:14.559 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 34/121
2026-03-08T23:58:14.559 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : cephadm-2:18.2.1-0.el9.noarch 35/121
2026-03-08T23:58:14.559 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 36/121
2026-03-08T23:58:14.559 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 37/121
2026-03-08T23:58:14.559 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 38/121
2026-03-08T23:58:14.559 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 39/121
2026-03-08T23:58:14.559 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 40/121
2026-03-08T23:58:14.559 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 41/121
2026-03-08T23:58:14.559 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 42/121
2026-03-08T23:58:14.559 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ply-3.11-14.el9.noarch 43/121
2026-03-08T23:58:14.559 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 44/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 45/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 46/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 47/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 48/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 49/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 50/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 51/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 52/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 53/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 54/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 55/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 56/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 57/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 58/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 59/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 60/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 61/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 62/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 63/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 64/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 66/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 68/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 69/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 70/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 71/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 72/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 73/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 74/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 75/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 76/121
2026-03-08T23:58:14.560 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 77/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 78/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 79/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 80/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 81/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 82/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 83/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 84/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 85/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 86/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 87/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 88/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 89/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 90/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 91/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 92/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 93/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 94/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 95/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 96/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 97/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 98/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 99/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 100/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 101/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 102/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 103/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 104/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 105/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 106/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 107/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 108/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 109/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 110/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 111/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 112/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 113/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 114/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 115/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : re2-1:20211101-20.el9.x86_64 116/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 117/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librados2-2:18.2.1-0.el9.x86_64 118/121
2026-03-08T23:58:14.561 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 119/121
2026-03-08T23:58:14.562 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librbd1-2:18.2.1-0.el9.x86_64 120/121
2026-03-08T23:58:14.663 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 121/121
2026-03-08T23:58:14.663 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:58:14.663 INFO:teuthology.orchestra.run.vm06.stdout:Upgraded:
2026-03-08T23:58:14.663 INFO:teuthology.orchestra.run.vm06.stdout: librados2-2:18.2.1-0.el9.x86_64 librbd1-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.663 INFO:teuthology.orchestra.run.vm06.stdout:Installed:
2026-03-08T23:58:14.663 INFO:teuthology.orchestra.run.vm06.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-08T23:58:14.663 INFO:teuthology.orchestra.run.vm06.stdout: ceph-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.663 INFO:teuthology.orchestra.run.vm06.stdout: ceph-base-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.663 INFO:teuthology.orchestra.run.vm06.stdout: ceph-common-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.663 INFO:teuthology.orchestra.run.vm06.stdout: ceph-fuse-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.663 INFO:teuthology.orchestra.run.vm06.stdout: ceph-grafana-dashboards-2:18.2.1-0.el9.noarch
2026-03-08T23:58:14.663 INFO:teuthology.orchestra.run.vm06.stdout: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.663 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mds-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.663 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.663 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core-2:18.2.1-0.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-rook-2:18.2.1-0.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mon-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: ceph-osd-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: ceph-prometheus-alerts-2:18.2.1-0.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: ceph-radosgw-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: ceph-selinux-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: ceph-test-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: cephadm-2:18.2.1-0.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: fmt-8.1.1-5.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs-devel-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs2-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: libcephsqlite-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: librados-devel-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: librgw2-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: libxslt-1.1.34-12.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: mailcap-2.1.49-5.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-argparse-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-common-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-cephfs-2:18.2.1-0.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-cheroot-10.0.1-4.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy-18.6.1-2.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-google-auth-1:2.45.0-1.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-8.2.1-3.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-context-6.0.1-3.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch
2026-03-08T23:58:14.664 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-text-4.0.0-2.el9.noarch
2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-jinja2-2.11.3-8.el9.noarch
2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-jmespath-1.0.1-1.el9.noarch
2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt-2.4.0-1.el9.noarch
2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch
2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch
2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-logutils-0.3.5-21.el9.noarch
2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-mako-1.1.4-6.el9.noarch
2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-markupsafe-1.1.1-12.el9.x86_64
2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-more-itertools-8.12.0-2.el9.noarch
2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort-7.1.1-5.el9.noarch
2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-1:1.23.5-2.el9.x86_64
2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64
2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan-1.4.2-3.el9.noarch
2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-ply-3.11-14.el9.noarch
2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-rados-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-rbd-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-rgw-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-xmltodict-0.12.0-15.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: rbd-fuse-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: rbd-mirror-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: rbd-nbd-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: re2-1:20211101-20.el9.x86_64 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: socat-1.7.4.1-8.el9.x86_64 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: xmlstarlet-1.6.1-20.el9.x86_64 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T23:58:14.665 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 
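Both hosts end the dnf transaction the same way: librados2 and librbd1 are upgraded from 16.2.4 to 18.2.1 and the full v18.2.1 package set is installed. Teuthology then confirms the result with rpm -q ceph --qf '%{VERSION}-%{RELEASE}' (visible further down in this log). A minimal sketch of that version check, assuming only stock Python and rpm; the helper names are ours, not teuthology's:

    # Illustrative sketch (not teuthology's actual code): confirm the installed
    # ceph package matches the expected version, the same check the log performs
    # with `rpm -q ceph --qf '%{VERSION}-%{RELEASE}'` below.
    import subprocess

    def installed_ceph_version() -> str:
        # rpm prints e.g. "18.2.1-0.el9" with this query format
        out = subprocess.run(
            ["rpm", "-q", "ceph", "--qf", "%{VERSION}-%{RELEASE}"],
            check=True, capture_output=True, text=True,
        )
        return out.stdout.strip()

    def assert_version(expected: str = "18.2.1-0") -> None:
        got = installed_ceph_version()
        # the log compares after dropping the distro suffix (".el9")
        if not got.startswith(expected):
            raise RuntimeError(f"wrong ceph version: {got!r}, expected {expected}")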
2026-03-08T23:58:14.767 DEBUG:teuthology.parallel:result is None 2026-03-08T23:58:15.808 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-test-2:18.2.1-0.el9.x86_64 113/121 2026-03-08T23:58:15.820 INFO:teuthology.orchestra.run.vm03.stdout: Installing : rbd-fuse-2:18.2.1-0.el9.x86_64 114/121 2026-03-08T23:58:15.826 INFO:teuthology.orchestra.run.vm03.stdout: Installing : rbd-nbd-2:18.2.1-0.el9.x86_64 115/121 2026-03-08T23:58:15.880 INFO:teuthology.orchestra.run.vm03.stdout: Installing : libcephfs-devel-2:18.2.1-0.el9.x86_64 116/121 2026-03-08T23:58:15.885 INFO:teuthology.orchestra.run.vm03.stdout: Installing : ceph-fuse-2:18.2.1-0.el9.x86_64 117/121 2026-03-08T23:58:15.893 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 118/121 2026-03-08T23:58:15.897 INFO:teuthology.orchestra.run.vm03.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 119/121 2026-03-08T23:58:15.897 INFO:teuthology.orchestra.run.vm03.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 120/121 2026-03-08T23:58:15.911 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 120/121 2026-03-08T23:58:15.911 INFO:teuthology.orchestra.run.vm03.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 121/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 121/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-2:18.2.1-0.el9.x86_64 1/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-base-2:18.2.1-0.el9.x86_64 2/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-common-2:18.2.1-0.el9.x86_64 3/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-fuse-2:18.2.1-0.el9.x86_64 4/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 5/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mds-2:18.2.1-0.el9.x86_64 6/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-2:18.2.1-0.el9.x86_64 7/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mon-2:18.2.1-0.el9.x86_64 8/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-osd-2:18.2.1-0.el9.x86_64 9/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-radosgw-2:18.2.1-0.el9.x86_64 10/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-selinux-2:18.2.1-0.el9.x86_64 11/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-test-2:18.2.1-0.el9.x86_64 12/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephfs-devel-2:18.2.1-0.el9.x86_64 13/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephfs2-2:18.2.1-0.el9.x86_64 14/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephsqlite-2:18.2.1-0.el9.x86_64 15/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librados-devel-2:18.2.1-0.el9.x86_64 16/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libradosstriper1-2:18.2.1-0.el9.x86_64 17/121 2026-03-08T23:58:17.200 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librgw2-2:18.2.1-0.el9.x86_64 18/121 
2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 19/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-ceph-common-2:18.2.1-0.el9.x86_64 20/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cephfs-2:18.2.1-0.el9.x86_64 21/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rados-2:18.2.1-0.el9.x86_64 22/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rbd-2:18.2.1-0.el9.x86_64 23/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rgw-2:18.2.1-0.el9.x86_64 24/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : rbd-fuse-2:18.2.1-0.el9.x86_64 25/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : rbd-mirror-2:18.2.1-0.el9.x86_64 26/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : rbd-nbd-2:18.2.1-0.el9.x86_64 27/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 28/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 29/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 30/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 31/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 32/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-rook-2:18.2.1-0.el9.noarch 33/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 34/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : cephadm-2:18.2.1-0.el9.noarch 35/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 36/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 37/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 38/121 2026-03-08T23:58:17.201 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 39/121 2026-03-08T23:58:17.204 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 40/121 2026-03-08T23:58:17.204 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 41/121 2026-03-08T23:58:17.204 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 42/121 2026-03-08T23:58:17.204 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-ply-3.11-14.el9.noarch 43/121 2026-03-08T23:58:17.204 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 44/121 2026-03-08T23:58:17.204 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 45/121 2026-03-08T23:58:17.204 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 46/121 2026-03-08T23:58:17.204 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 
47/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 48/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 49/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 50/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 51/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 52/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 53/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 54/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 55/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 56/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 57/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 58/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 59/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 60/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 61/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 62/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 63/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 64/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 66/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 68/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 69/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 70/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 71/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 72/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 73/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 74/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 75/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 76/121 2026-03-08T23:58:17.205 
INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 77/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 78/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 79/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 80/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 81/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 82/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 83/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 84/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 85/121 2026-03-08T23:58:17.205 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 86/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 87/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 88/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 89/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 90/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 91/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 92/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 93/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 94/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 95/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 96/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 97/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 98/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 99/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 100/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 101/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 102/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 103/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 104/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 105/121 
2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 106/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 107/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 108/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 109/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 110/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 111/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 112/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 113/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 114/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 115/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : re2-1:20211101-20.el9.x86_64 116/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 117/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librados2-2:18.2.1-0.el9.x86_64 118/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 119/121 2026-03-08T23:58:17.206 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librbd1-2:18.2.1-0.el9.x86_64 120/121 2026-03-08T23:58:17.383 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 121/121 2026-03-08T23:58:17.383 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T23:58:17.383 INFO:teuthology.orchestra.run.vm03.stdout:Upgraded: 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: librados2-2:18.2.1-0.el9.x86_64 librbd1-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout:Installed: 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-base-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-common-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-fuse-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mds-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: 
ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarch 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-rook-2:18.2.1-0.el9.noarch 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-osd-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-radosgw-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-selinux-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ceph-test-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: cephadm-2:18.2.1-0.el9.noarch 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: ledmon-libs-1.1.0-3.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs-devel-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs2-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: libcephsqlite-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: libconfig-1.7.2-9.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: librados-devel-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: librgw2-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: libxslt-1.1.34-12.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: 
mailcap-2.1.49-5.el9.noarch 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-08T23:58:17.384 INFO:teuthology.orchestra.run.vm03.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-common-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-cephfs-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-jmespath-1.0.1-1.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-08T23:58:17.385 
INFO:teuthology.orchestra.run.vm03.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-ply-3.11-14.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-rados-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-rbd-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-rgw-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-xmltodict-0.12.0-15.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: rbd-fuse-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: rbd-mirror-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: rbd-nbd-2:18.2.1-0.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: re2-1:20211101-20.el9.x86_64 2026-03-08T23:58:17.385 INFO:teuthology.orchestra.run.vm03.stdout: 
socat-1.7.4.1-8.el9.x86_64
2026-03-08T23:58:17.386 INFO:teuthology.orchestra.run.vm03.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-08T23:58:17.386 INFO:teuthology.orchestra.run.vm03.stdout: xmlstarlet-1.6.1-20.el9.x86_64
2026-03-08T23:58:17.386 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-08T23:58:17.386 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-08T23:58:17.542 DEBUG:teuthology.parallel:result is None
2026-03-08T23:58:17.542 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-08T23:58:17.542 INFO:teuthology.packaging:ref: None
2026-03-08T23:58:17.542 INFO:teuthology.packaging:tag: v18.2.1
2026-03-08T23:58:17.542 INFO:teuthology.packaging:branch: None
2026-03-08T23:58:17.542 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T23:58:17.542 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=7fe91d5d5842e04be3b4f514d6dd990c54b29c76
2026-03-08T23:58:18.143 DEBUG:teuthology.orchestra.run.vm03:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}'
2026-03-08T23:58:18.168 INFO:teuthology.orchestra.run.vm03.stdout:18.2.1-0.el9
2026-03-08T23:58:18.168 INFO:teuthology.packaging:The installed version of ceph is 18.2.1-0.el9
2026-03-08T23:58:18.168 INFO:teuthology.task.install:The correct ceph version 18.2.1-0 is installed.
2026-03-08T23:58:18.169 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-08T23:58:18.169 INFO:teuthology.packaging:ref: None
2026-03-08T23:58:18.169 INFO:teuthology.packaging:tag: v18.2.1
2026-03-08T23:58:18.169 INFO:teuthology.packaging:branch: None
2026-03-08T23:58:18.169 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T23:58:18.169 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=7fe91d5d5842e04be3b4f514d6dd990c54b29c76
2026-03-08T23:58:18.779 DEBUG:teuthology.orchestra.run.vm06:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}'
2026-03-08T23:58:18.801 INFO:teuthology.orchestra.run.vm06.stdout:18.2.1-0.el9
2026-03-08T23:58:18.801 INFO:teuthology.packaging:The installed version of ceph is 18.2.1-0.el9
2026-03-08T23:58:18.801 INFO:teuthology.task.install:The correct ceph version 18.2.1-0 is installed.
2026-03-08T23:58:18.802 INFO:teuthology.task.install.util:Shipping valgrind.supp...
2026-03-08T23:58:18.802 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-08T23:58:18.803 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-08T23:58:18.840 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-08T23:58:18.840 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-08T23:58:18.871 INFO:teuthology.task.install.util:Shipping 'daemon-helper'...
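The Querying lines above show how teuthology resolves a sha1 to ready builds on shaman. A rough sketch of that lookup with requests, using the endpoint and query parameters taken from the URL in the log; the function name is ours:

    # Sketch of the shaman build-search query the log issues; requests
    # URL-encodes the distro slash exactly as seen above (centos%2F9%2Fx86_64).
    import requests

    def find_ready_builds(sha1: str, distro: str = "centos/9/x86_64"):
        resp = requests.get(
            "https://shaman.ceph.com/api/search",
            params={
                "status": "ready",
                "project": "ceph",
                "flavor": "default",
                "distros": distro,
                "sha1": sha1,
            },
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()  # a list of build records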
2026-03-08T23:58:18.871 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-08T23:58:18.871 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/usr/bin/daemon-helper
2026-03-08T23:58:18.916 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-08T23:58:18.994 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-08T23:58:18.994 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/usr/bin/daemon-helper
2026-03-08T23:58:19.019 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-08T23:58:19.083 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'...
2026-03-08T23:58:19.083 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-08T23:58:19.083 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-08T23:58:19.114 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-08T23:58:19.191 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-08T23:58:19.191 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-08T23:58:19.217 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-08T23:58:19.285 INFO:teuthology.task.install.util:Shipping 'stdin-killer'...
2026-03-08T23:58:19.285 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-08T23:58:19.285 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/usr/bin/stdin-killer
2026-03-08T23:58:19.316 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-08T23:58:19.385 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-08T23:58:19.385 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/usr/bin/stdin-killer
2026-03-08T23:58:19.411 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-08T23:58:19.478 INFO:teuthology.run_tasks:Running task print...
2026-03-08T23:58:19.480 INFO:teuthology.task.print:**** done install task...
2026-03-08T23:58:19.480 INFO:teuthology.run_tasks:Running task cephadm...
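The repeated set -ex / sudo dd of=... / sudo chmod a=rx -- ... triples above are teuthology's file-shipping pattern: the file body is streamed over the SSH channel into dd, then the destination is marked world-readable and executable. A minimal sketch of the same pattern, assuming a plain ssh client rather than teuthology's orchestra layer:

    # Sketch of the ship-a-helper pattern: stream a local file into
    # `sudo dd of=<path>` on the remote host, then chmod it, exactly as the
    # log does for daemon-helper, adjust-ulimits and stdin-killer.
    import subprocess

    def ship_executable(host: str, local: str, remote: str) -> None:
        with open(local, "rb") as f:
            # the file arrives on the remote dd's stdin
            subprocess.run(["ssh", host, f"sudo dd of={remote}"],
                           stdin=f, check=True)
        subprocess.run(["ssh", host, f"sudo chmod a=rx -- {remote}"], check=True)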
2026-03-08T23:58:19.533 INFO:tasks.cephadm:Config: {'compiled_cephadm_branch': 'reef', 'conf': {'osd': {'osd_class_default_list': '*', 'osd_class_load_list': '*', 'bdev async discard': True, 'bdev enable discard': True, 'bluestore allocator': 'bitmap', 'bluestore block size': 96636764160, 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd op complaint time': 180}, 'client': {'client mount timeout': 600, 'debug client': 20, 'debug ms': 1, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'global': {'mon pg warn min per osd': 0}, 'mds': {'debug mds': 20, 'debug mds balancer': 20, 'debug ms': 1, 'mds debug frag': True, 'mds debug scatterstat': True, 'mds op complaint time': 180, 'mds verify scatter': True, 'osd op complaint time': 180, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20, 'mon down mkfs grace': 300, 'mon op complaint time': 120}}, 'image': 'quay.io/ceph/ceph:v18.2.1', 'roleless': True, 'cluster-conf': {'mgr': {'client mount timeout': 30, 'debug client': 20, 'debug mgr': 20, 'debug ms': 1, 'mon warn on pool no app': False}}, 'flavor': 'default', 'fs': 'xfs', 'log-ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)', 'FS_DEGRADED', 'filesystem is degraded', 'FS_INLINE_DATA_DEPRECATED', 'FS_WITH_FAILED_MDS', 'MDS_ALL_DOWN', 'filesystem is offline', 'is offline because no MDS', 'MDS_DAMAGE', 'MDS_DEGRADED', 'MDS_FAILED', 'MDS_INSUFFICIENT_STANDBY', 'MDS_UP_LESS_THAN_MAX', 'online, but wants', 'filesystem is online with fewer MDS than max_mds', 'POOL_APP_NOT_ENABLED', 'do not have an application enabled', 'overall HEALTH_', 'Replacing daemon', 'deprecated feature inline_data', 'MGR_MODULE_ERROR', 'OSD_DOWN', 'osds down', 'overall HEALTH_', '\\(OSD_DOWN\\)', '\\(OSD_', 'but it is still running', 'is not responding', 'MON_DOWN', 'PG_AVAILABILITY', 'PG_DEGRADED', 'Reduced data availability', 'Degraded data redundancy', 'pg .* is stuck inactive', 'pg .* is .*degraded', 'pg .* is stuck peering'], 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'} 2026-03-08T23:58:19.533 INFO:tasks.cephadm:Cluster image is quay.io/ceph/ceph:v18.2.1 2026-03-08T23:58:19.533 INFO:tasks.cephadm:Cluster fsid is ae8f0172-1b4a-11f1-916a-712b2ac006b7 2026-03-08T23:58:19.533 INFO:tasks.cephadm:Choosing monitor IPs and ports... 2026-03-08T23:58:19.533 INFO:tasks.cephadm:No mon roles; fabricating mons 2026-03-08T23:58:19.533 INFO:tasks.cephadm:Monitor IPs: {'mon.vm03': '192.168.123.103', 'mon.vm06': '192.168.123.106'} 2026-03-08T23:58:19.533 INFO:tasks.cephadm:Normalizing hostnames... 
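Note the log-ignorelist in the task config above: each entry is a regular expression (hence the escaped parentheses in \(MDS_ALL_DOWN\) and the wildcards in pg .* is stuck inactive), and cluster log lines matching any entry are tolerated during the upgrade rather than failing the run. A sketch of how such a list can be applied; illustrative only, not the task's exact matcher:

    # Each ignorelist entry is searched as a regex against cluster log lines;
    # a hit means the warning is expected for this upgrade scenario.
    import re

    IGNORELIST = [r"\(MDS_ALL_DOWN\)", r"pg .* is stuck inactive", r"overall HEALTH_"]

    def is_ignored(line: str) -> bool:
        return any(re.search(pat, line) for pat in IGNORELIST)

    assert is_ignored("cluster [WRN] overall HEALTH_WARN 2 osds down")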
2026-03-08T23:58:19.533 DEBUG:teuthology.orchestra.run.vm03:> sudo hostname $(hostname -s)
2026-03-08T23:58:19.565 DEBUG:teuthology.orchestra.run.vm06:> sudo hostname $(hostname -s)
2026-03-08T23:58:19.593 INFO:tasks.cephadm:Downloading "compiled" cephadm from chacra for reef
2026-03-08T23:58:19.594 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T23:58:20.211 INFO:tasks.cephadm:builder_project result: [{'url': 'https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'chacra_url': 'https://3.chacra.ceph.com/repos/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'ref': 'squid', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'distro': 'centos', 'distro_version': '9', 'distro_codename': None, 'modified': '2026-02-25 18:55:15.146628', 'status': 'ready', 'flavor': 'default', 'project': 'ceph', 'archs': ['source', 'x86_64'], 'extra': {'version': '19.2.3-678-ge911bdeb', 'package_manager_version': '19.2.3-678.ge911bdeb', 'build_url': 'https://jenkins.ceph.com/job/ceph-dev-pipeline/3275/', 'root_build_cause': '', 'node_name': '10.20.192.26+soko16', 'job_name': 'ceph-dev-pipeline'}}]
2026-03-08T23:58:20.966 INFO:tasks.util.chacra:got chacra host 3.chacra.ceph.com, ref reef, sha1 ab47f43c099b2cbae6e21342fe673ce251da54d6 from https://shaman.ceph.com/api/search/?project=ceph&distros=centos%2F9%2Fx86_64&flavor=default&ref=reef
2026-03-08T23:58:20.967 INFO:tasks.cephadm:Discovered chacra url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm
2026-03-08T23:58:20.967 INFO:tasks.cephadm:Downloading cephadm from url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm
2026-03-08T23:58:20.967 DEBUG:teuthology.orchestra.run.vm03:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm
2026-03-08T23:58:22.226 INFO:teuthology.orchestra.run.vm03.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 8 23:58 /home/ubuntu/cephtest/cephadm
2026-03-08T23:58:22.227 DEBUG:teuthology.orchestra.run.vm06:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm
2026-03-08T23:58:23.548 INFO:teuthology.orchestra.run.vm06.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 8 23:58 /home/ubuntu/cephtest/cephadm
2026-03-08T23:58:23.549 DEBUG:teuthology.orchestra.run.vm03:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm
2026-03-08T23:58:23.572 DEBUG:teuthology.orchestra.run.vm06:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm
2026-03-08T23:58:23.595 INFO:tasks.cephadm:Pulling image quay.io/ceph/ceph:v18.2.1 on all hosts...
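The download is followed by a size sanity check, test -s plus stat -c%s ... -gt 1000, so an HTML error page cannot masquerade as the cephadm binary before chmod +x. The same sequence sketched in Python, with the path and the 1000-byte threshold copied from the commands above:

    # Fetch the cephadm binary (curl --silent -L URL > dest), refuse anything
    # implausibly small, then make it executable.
    import os, stat, urllib.request

    def fetch_cephadm(url: str, dest: str = "/home/ubuntu/cephtest/cephadm") -> None:
        urllib.request.urlretrieve(url, dest)
        if os.path.getsize(dest) <= 1000:      # guard against error pages
            raise RuntimeError(f"{dest} is suspiciously small")
        os.chmod(dest, os.stat(dest).st_mode
                 | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)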
2026-03-08T23:58:23.596 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 pull
2026-03-08T23:58:23.616 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 pull
2026-03-08T23:58:23.766 INFO:teuthology.orchestra.run.vm06.stderr:Pulling container image quay.io/ceph/ceph:v18.2.1...
2026-03-08T23:58:23.775 INFO:teuthology.orchestra.run.vm03.stderr:Pulling container image quay.io/ceph/ceph:v18.2.1...
2026-03-08T23:58:52.826 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-08T23:58:52.826 INFO:teuthology.orchestra.run.vm03.stdout: "ceph_version": "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)",
2026-03-08T23:58:52.826 INFO:teuthology.orchestra.run.vm03.stdout: "image_id": "5be31c24972a920012b90a9769e8313e2490c82aee752aefd49af9cf4c0f3fcf",
2026-03-08T23:58:52.826 INFO:teuthology.orchestra.run.vm03.stdout: "repo_digests": [
2026-03-08T23:58:52.826 INFO:teuthology.orchestra.run.vm03.stdout: "quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3",
2026-03-08T23:58:52.826 INFO:teuthology.orchestra.run.vm03.stdout: "quay.io/ceph/ceph@sha256:e8e55db8b4fd270dbec25bc764437a2a3abb707971c4dba5f559fb83018049dc"
2026-03-08T23:58:52.826 INFO:teuthology.orchestra.run.vm03.stdout: ]
2026-03-08T23:58:52.826 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-08T23:58:53.003 INFO:teuthology.orchestra.run.vm06.stdout:{
2026-03-08T23:58:53.003 INFO:teuthology.orchestra.run.vm06.stdout: "ceph_version": "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)",
2026-03-08T23:58:53.003 INFO:teuthology.orchestra.run.vm06.stdout: "image_id": "5be31c24972a920012b90a9769e8313e2490c82aee752aefd49af9cf4c0f3fcf",
2026-03-08T23:58:53.003 INFO:teuthology.orchestra.run.vm06.stdout: "repo_digests": [
2026-03-08T23:58:53.003 INFO:teuthology.orchestra.run.vm06.stdout: "quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3",
2026-03-08T23:58:53.003 INFO:teuthology.orchestra.run.vm06.stdout: "quay.io/ceph/ceph@sha256:e8e55db8b4fd270dbec25bc764437a2a3abb707971c4dba5f559fb83018049dc"
2026-03-08T23:58:53.003 INFO:teuthology.orchestra.run.vm06.stdout: ]
2026-03-08T23:58:53.003 INFO:teuthology.orchestra.run.vm06.stdout:}
2026-03-08T23:58:53.017 DEBUG:teuthology.orchestra.run.vm03:> sudo mkdir -p /etc/ceph
2026-03-08T23:58:53.046 DEBUG:teuthology.orchestra.run.vm06:> sudo mkdir -p /etc/ceph
2026-03-08T23:58:53.075 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod 777 /etc/ceph
2026-03-08T23:58:53.111 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod 777 /etc/ceph
2026-03-08T23:58:53.141 INFO:tasks.cephadm:Writing seed config...
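The seed config written next is teuthology's base template with the per-section overrides listed below layered on top; the merged result appears as the "Final config" dump further down. A minimal sketch of that kind of section-wise merge, assuming configparser semantics; teuthology has its own config assembly code:

    # Layer [section] key = value overrides onto a base ceph.conf-style file.
    import configparser, io

    def merge_overrides(base_text: str, overrides: dict) -> str:
        conf = configparser.ConfigParser()
        conf.optionxform = str              # keep 'debug ms' etc. verbatim
        conf.read_string(base_text)
        for section, kv in overrides.items():
            if not conf.has_section(section):
                conf.add_section(section)
            for key, value in kv.items():
                conf.set(section, key, str(value))
        buf = io.StringIO()
        conf.write(buf)
        return buf.getvalue()

    base = "[global]\nlog_to_file = true\n[osd]\nosd sloppy crc = true\n"
    print(merge_overrides(base, {"osd": {"osd objectstore": "bluestore"}}))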
2026-03-08T23:58:53.141 INFO:tasks.cephadm: override: [osd] osd_class_default_list = *
2026-03-08T23:58:53.141 INFO:tasks.cephadm: override: [osd] osd_class_load_list = *
2026-03-08T23:58:53.141 INFO:tasks.cephadm: override: [osd] bdev async discard = True
2026-03-08T23:58:53.141 INFO:tasks.cephadm: override: [osd] bdev enable discard = True
2026-03-08T23:58:53.141 INFO:tasks.cephadm: override: [osd] bluestore allocator = bitmap
2026-03-08T23:58:53.141 INFO:tasks.cephadm: override: [osd] bluestore block size = 96636764160
2026-03-08T23:58:53.141 INFO:tasks.cephadm: override: [osd] bluestore fsck on mount = True
2026-03-08T23:58:53.141 INFO:tasks.cephadm: override: [osd] debug bluefs = 1/20
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [osd] debug bluestore = 1/20
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [osd] debug ms = 1
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [osd] debug osd = 20
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [osd] debug rocksdb = 4/10
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [osd] mon osd backfillfull_ratio = 0.85
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [osd] mon osd full ratio = 0.9
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [osd] mon osd nearfull ratio = 0.8
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [osd] osd failsafe full ratio = 0.95
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [osd] osd mclock iops capacity threshold hdd = 49000
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [osd] osd objectstore = bluestore
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [osd] osd op complaint time = 180
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [client] client mount timeout = 600
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [client] debug client = 20
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [client] debug ms = 1
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [client] rados mon op timeout = 900
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [client] rados osd op timeout = 900
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [global] mon pg warn min per osd = 0
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [mds] debug mds = 20
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [mds] debug mds balancer = 20
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [mds] debug ms = 1
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [mds] mds debug frag = True
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [mds] mds debug scatterstat = True
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [mds] mds op complaint time = 180
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [mds] mds verify scatter = True
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [mds] osd op complaint time = 180
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [mds] rados mon op timeout = 900
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [mds] rados osd op timeout = 900
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [mgr] debug mgr = 20
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [mgr] debug ms = 1
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [mon] debug mon = 20
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [mon] debug ms = 1
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [mon] debug paxos = 20
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [mon] mon down mkfs grace = 300
2026-03-08T23:58:53.142 INFO:tasks.cephadm: override: [mon] mon op complaint time = 120
2026-03-08T23:58:53.142 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-08T23:58:53.142 DEBUG:teuthology.orchestra.run.vm03:> dd of=/home/ubuntu/cephtest/seed.ceph.conf
2026-03-08T23:58:53.166 DEBUG:tasks.cephadm:Final config:
[global]
# make logging friendly to teuthology
log_to_file = true
log_to_stderr = false
log to journald = false
mon cluster log to file = true
mon cluster log file level = debug
mon clock drift allowed = 1.000
# replicate across OSDs, not hosts
osd crush chooseleaf type = 0
#osd pool default size = 2
osd pool default erasure code profile = plugin=jerasure technique=reed_sol_van k=2 m=1 crush-failure-domain=osd
# enable some debugging
auth debug = true
ms die on old message = true
ms die on bug = true
debug asserts on shutdown = true
# adjust warnings
mon max pg per osd = 10000  # >= luminous
mon pg warn max object skew = 0
mon osd allow primary affinity = true
mon osd allow pg remap = true
mon warn on legacy crush tunables = false
mon warn on crush straw calc version zero = false
mon warn on no sortbitwise = false
mon warn on osd down out interval zero = false
mon warn on too few osds = false
mon_warn_on_pool_pg_num_not_power_of_two = false
# disable pg_autoscaler by default for new pools
osd_pool_default_pg_autoscale_mode = off
# tests delete pools
mon allow pool delete = true
fsid = ae8f0172-1b4a-11f1-916a-712b2ac006b7
mon pg warn min per osd = 0
[osd]
osd scrub load threshold = 5.0
osd scrub max interval = 600
osd mclock profile = high_recovery_ops
osd recover clone overlap = true
osd recovery max chunk = 1048576
osd deep scrub update digest min age = 30
osd map max advance = 10
osd memory target autotune = true
# debugging
osd debug shutdown = true
osd debug op order = true
osd debug verify stray on activate = true
osd debug pg log writeout = true
osd debug verify cached snaps = true
osd debug verify missing on start = true
osd debug misdirected ops = true
osd op queue = debug_random
osd op queue cut off = debug_random
osd shutdown pgref assert = true
bdev debug aio = true
osd sloppy crc = true
osd_class_default_list = *
osd_class_load_list = *
bdev async discard = True
bdev enable discard = True
bluestore allocator = bitmap
bluestore block size = 96636764160
bluestore fsck on mount = True
debug bluefs = 1/20
debug bluestore = 1/20
debug ms = 1
debug osd = 20
debug rocksdb = 4/10
mon osd backfillfull_ratio = 0.85
mon osd full ratio = 0.9
mon osd nearfull ratio = 0.8
osd failsafe full ratio = 0.95
osd mclock iops capacity threshold hdd = 49000
osd objectstore = bluestore
osd op complaint time = 180
[mgr]
mon reweight min pgs per osd = 4
mon reweight min bytes per osd = 10
mgr/telemetry/nag = false
debug mgr = 20
debug ms = 1
[mon]
mon data avail warn = 5
mon mgr mkfs grace = 240
mon reweight min pgs per osd = 4
mon osd reporter subtree level = osd
mon osd prime pg temp = true
mon reweight min bytes per osd = 10
# rotate auth tickets quickly to exercise renewal paths
auth mon ticket ttl = 660  # 11m
auth service ticket ttl = 240  # 4m
# don't complain about global id reclaim
mon_warn_on_insecure_global_id_reclaim = false
mon_warn_on_insecure_global_id_reclaim_allowed = false
debug mon = 20
debug ms = 1
debug paxos = 20
mon down mkfs grace = 300
mon op complaint time = 120
[client.rgw]
rgw cache enabled = true
rgw enable ops log = true
rgw enable usage log = true
[client]
client mount timeout = 600
debug client = 20
debug ms = 1
rados mon op timeout = 900
rados osd op timeout = 900
[mds]
debug mds = 20
debug mds balancer = 20
debug ms = 1
mds debug frag = True
mds debug scatterstat = True
mds op complaint time = 180
mds verify scatter = True
osd op complaint time = 180
rados mon op timeout = 900
rados osd op timeout = 900
2026-03-08T23:58:53.166 DEBUG:teuthology.orchestra.run.vm03:mon.vm03> sudo journalctl -f -n 0 -u ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mon.vm03.service
2026-03-08T23:58:53.208 INFO:tasks.cephadm:Bootstrapping...
2026-03-08T23:58:53.209 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 -v bootstrap --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 --config /home/ubuntu/cephtest/seed.ceph.conf --output-config /etc/ceph/ceph.conf --output-keyring /etc/ceph/ceph.client.admin.keyring --output-pub-ssh-key /home/ubuntu/cephtest/ceph.pub --mon-ip 192.168.123.103 --skip-admin-label && sudo chmod +r /etc/ceph/ceph.client.admin.keyring
2026-03-08T23:58:53.329 INFO:teuthology.orchestra.run.vm03.stdout:--------------------------------------------------------------------------------
2026-03-08T23:58:53.329 INFO:teuthology.orchestra.run.vm03.stdout:cephadm ['--image', 'quay.io/ceph/ceph:v18.2.1', '-v', 'bootstrap', '--fsid', 'ae8f0172-1b4a-11f1-916a-712b2ac006b7', '--config', '/home/ubuntu/cephtest/seed.ceph.conf', '--output-config', '/etc/ceph/ceph.conf', '--output-keyring', '/etc/ceph/ceph.client.admin.keyring', '--output-pub-ssh-key', '/home/ubuntu/cephtest/ceph.pub', '--mon-ip', '192.168.123.103', '--skip-admin-label']
2026-03-08T23:58:53.349 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stdout 5.8.0
2026-03-08T23:58:53.350 INFO:teuthology.orchestra.run.vm03.stderr:Specifying an fsid for your cluster offers no advantages and may increase the likelihood of fsid conflicts.
2026-03-08T23:58:53.350 INFO:teuthology.orchestra.run.vm03.stdout:Verifying podman|docker is present...
2026-03-08T23:58:53.369 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stdout 5.8.0
2026-03-08T23:58:53.369 INFO:teuthology.orchestra.run.vm03.stdout:Verifying lvm2 is present...
2026-03-08T23:58:53.369 INFO:teuthology.orchestra.run.vm03.stdout:Verifying time synchronization is in place...
2026-03-08T23:58:53.376 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service
2026-03-08T23:58:53.376 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory
2026-03-08T23:58:53.381 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 3 from systemctl is-active chrony.service
2026-03-08T23:58:53.381 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout inactive
2026-03-08T23:58:53.387 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout enabled
2026-03-08T23:58:53.392 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout active
2026-03-08T23:58:53.392 INFO:teuthology.orchestra.run.vm03.stdout:Unit chronyd.service is enabled and running
2026-03-08T23:58:53.392 INFO:teuthology.orchestra.run.vm03.stdout:Repeating the final host check...
2026-03-08T23:58:53.411 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stdout 5.8.0
2026-03-08T23:58:53.411 INFO:teuthology.orchestra.run.vm03.stdout:podman (/bin/podman) version 5.8.0 is present
2026-03-08T23:58:53.411 INFO:teuthology.orchestra.run.vm03.stdout:systemctl is present
2026-03-08T23:58:53.411 INFO:teuthology.orchestra.run.vm03.stdout:lvcreate is present
2026-03-08T23:58:53.417 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service
2026-03-08T23:58:53.417 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory
2026-03-08T23:58:53.423 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 3 from systemctl is-active chrony.service
2026-03-08T23:58:53.423 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout inactive
2026-03-08T23:58:53.428 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout enabled
2026-03-08T23:58:53.435 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stdout active
2026-03-08T23:58:53.435 INFO:teuthology.orchestra.run.vm03.stdout:Unit chronyd.service is enabled and running
2026-03-08T23:58:53.435 INFO:teuthology.orchestra.run.vm03.stdout:Host looks OK
2026-03-08T23:58:53.435 INFO:teuthology.orchestra.run.vm03.stdout:Cluster fsid: ae8f0172-1b4a-11f1-916a-712b2ac006b7
2026-03-08T23:58:53.435 INFO:teuthology.orchestra.run.vm03.stdout:Acquiring lock 140088505328160 on /run/cephadm/ae8f0172-1b4a-11f1-916a-712b2ac006b7.lock
2026-03-08T23:58:53.435 INFO:teuthology.orchestra.run.vm03.stdout:Lock 140088505328160 acquired on /run/cephadm/ae8f0172-1b4a-11f1-916a-712b2ac006b7.lock
2026-03-08T23:58:53.435 INFO:teuthology.orchestra.run.vm03.stdout:Verifying IP 192.168.123.103 port 3300 ...
2026-03-08T23:58:53.435 INFO:teuthology.orchestra.run.vm03.stdout:Verifying IP 192.168.123.103 port 6789 ...
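The time-synchronization probe above works by querying systemd for a sequence of candidate units: `chrony.service` fails both `is-enabled` and `is-active` (no unit file under that name on this host) before `chronyd.service` reports enabled and active. A minimal sketch of that probe in Python; the candidate list here is illustrative, not the exact list cephadm consults:

    import subprocess

    # Illustrative candidate time-sync units; the real list lives inside
    # cephadm and is not shown in this log.
    CANDIDATE_UNITS = ("chrony.service", "chronyd.service", "ntpd.service")

    def unit_enabled_and_active(unit: str) -> bool:
        enabled = subprocess.run(["systemctl", "is-enabled", unit],
                                 capture_output=True, text=True)
        active = subprocess.run(["systemctl", "is-active", unit],
                                capture_output=True, text=True)
        return enabled.returncode == 0 and active.stdout.strip() == "active"

    for unit in CANDIDATE_UNITS:
        if unit_enabled_and_active(unit):
            print(f"Unit {unit} is enabled and running")
            break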
2026-03-08T23:58:53.435 INFO:teuthology.orchestra.run.vm03.stdout:Base mon IP(s) is [192.168.123.103:3300, 192.168.123.103:6789], mon addrv is [v2:192.168.123.103:3300,v1:192.168.123.103:6789]
2026-03-08T23:58:53.439 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.103 metric 100
2026-03-08T23:58:53.439 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout 192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.103 metric 100
2026-03-08T23:58:53.441 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout ::1 dev lo proto kernel metric 256 pref medium
2026-03-08T23:58:53.442 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout fe80::/64 dev eth0 proto kernel metric 1024 pref medium
2026-03-08T23:58:53.444 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout 1: lo: mtu 65536 state UNKNOWN qlen 1000
2026-03-08T23:58:53.444 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout inet6 ::1/128 scope host
2026-03-08T23:58:53.444 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever
2026-03-08T23:58:53.444 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout 2: eth0: mtu 1500 state UP qlen 1000
2026-03-08T23:58:53.444 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout inet6 fe80::5055:ff:fe00:3/64 scope link noprefixroute
2026-03-08T23:58:53.444 INFO:teuthology.orchestra.run.vm03.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever
2026-03-08T23:58:53.444 INFO:teuthology.orchestra.run.vm03.stdout:Mon IP `192.168.123.103` is in CIDR network `192.168.123.0/24`
2026-03-08T23:58:53.444 INFO:teuthology.orchestra.run.vm03.stdout:Mon IP `192.168.123.103` is in CIDR network `192.168.123.0/24`
2026-03-08T23:58:53.445 INFO:teuthology.orchestra.run.vm03.stdout:Inferred mon public CIDR from local network configuration ['192.168.123.0/24', '192.168.123.0/24']
2026-03-08T23:58:53.445 INFO:teuthology.orchestra.run.vm03.stdout:Internal network (--cluster-network) has not been provided, OSD replication will default to the public_network
2026-03-08T23:58:53.445 INFO:teuthology.orchestra.run.vm03.stdout:Pulling container image quay.io/ceph/ceph:v18.2.1...
2026-03-08T23:58:54.616 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stdout 5be31c24972a920012b90a9769e8313e2490c82aee752aefd49af9cf4c0f3fcf
2026-03-08T23:58:54.616 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Trying to pull quay.io/ceph/ceph:v18.2.1...
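The repeated "Mon IP ... is in CIDR network ..." message comes from cephadm testing the mon IP against each locally routed subnet reported by `/sbin/ip`; 192.168.123.0/24 matches twice (once per route entry that names it), which is also why the inferred CIDR list contains the duplicate. The membership test itself is a one-liner with Python's stdlib, shown here as a sketch using values from this run:

    import ipaddress

    # Mon IP and the routed subnets reported by /sbin/ip above.
    mon_ip = ipaddress.ip_address("192.168.123.103")
    for cidr in ("192.168.123.0/24", "fe80::/64"):
        net = ipaddress.ip_network(cidr)
        if mon_ip in net:  # False for the v6 net: address families differ
            print(f"Mon IP `{mon_ip}` is in CIDR network `{net}`")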
2026-03-08T23:58:54.616 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Getting image source signatures
2026-03-08T23:58:54.616 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Copying blob sha256:7feca07754707458c3945cf0062cf4dabc512f6d90fe1a9a1370b362b6011124
2026-03-08T23:58:54.616 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Copying blob sha256:a733d3c618b71f19c168ebecd1953429dce2c1631835ca182e9551c36dce5989
2026-03-08T23:58:54.616 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Copying config sha256:5be31c24972a920012b90a9769e8313e2490c82aee752aefd49af9cf4c0f3fcf
2026-03-08T23:58:54.616 INFO:teuthology.orchestra.run.vm03.stdout:/bin/podman: stderr Writing manifest to image destination
2026-03-08T23:58:54.764 INFO:teuthology.orchestra.run.vm03.stdout:ceph: stdout ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)
2026-03-08T23:58:54.764 INFO:teuthology.orchestra.run.vm03.stdout:Ceph version: ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)
2026-03-08T23:58:54.764 INFO:teuthology.orchestra.run.vm03.stdout:Extracting ceph user uid/gid from container image...
2026-03-08T23:58:54.838 INFO:teuthology.orchestra.run.vm03.stdout:stat: stdout 167 167
2026-03-08T23:58:54.838 INFO:teuthology.orchestra.run.vm03.stdout:Creating initial keys...
2026-03-08T23:58:54.952 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph-authtool: stdout AQA+Da5pQ2fVNhAAHsntrob53GhwaIFNBNd/Jw==
2026-03-08T23:58:55.068 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph-authtool: stdout AQA/Da5p2YVeARAA6RxQrLba0PukB+LOl0LxQA==
2026-03-08T23:58:55.154 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph-authtool: stdout AQA/Da5pvgAvCBAAFXyX4MLLJiSLcuWwWpK3Vg==
2026-03-08T23:58:55.155 INFO:teuthology.orchestra.run.vm03.stdout:Creating initial monmap...
2026-03-08T23:58:55.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: monmap file /tmp/monmap
2026-03-08T23:58:55.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: stdout setting min_mon_release = pacific
2026-03-08T23:58:55.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: set fsid to ae8f0172-1b4a-11f1-916a-712b2ac006b7
2026-03-08T23:58:55.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors)
2026-03-08T23:58:55.250 INFO:teuthology.orchestra.run.vm03.stdout:monmaptool for vm03 [v2:192.168.123.103:3300,v1:192.168.123.103:6789] on /usr/bin/monmaptool: monmap file /tmp/monmap
2026-03-08T23:58:55.250 INFO:teuthology.orchestra.run.vm03.stdout:setting min_mon_release = pacific
2026-03-08T23:58:55.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: set fsid to ae8f0172-1b4a-11f1-916a-712b2ac006b7
2026-03-08T23:58:55.250 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors)
2026-03-08T23:58:55.250 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-08T23:58:55.250 INFO:teuthology.orchestra.run.vm03.stdout:Creating mon...
2026-03-08T23:58:55.400 INFO:teuthology.orchestra.run.vm03.stdout:create mon.vm03 on
2026-03-08T23:58:55.538 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Removed "/etc/systemd/system/multi-user.target.wants/ceph.target".
2026-03-08T23:58:55.660 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /etc/systemd/system/ceph.target.
2026-03-08T23:58:55.792 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7.target → /etc/systemd/system/ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7.target.
2026-03-08T23:58:55.792 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph.target.wants/ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7.target → /etc/systemd/system/ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7.target.
2026-03-08T23:58:55.935 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mon.vm03
2026-03-08T23:58:55.935 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Failed to reset failed state of unit ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mon.vm03.service: Unit ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mon.vm03.service not loaded.
2026-03-08T23:58:56.077 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7.target.wants/ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mon.vm03.service → /etc/systemd/system/ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@.service.
2026-03-08T23:58:56.242 INFO:teuthology.orchestra.run.vm03.stdout:firewalld does not appear to be present
2026-03-08T23:58:56.242 INFO:teuthology.orchestra.run.vm03.stdout:Not possible to enable service . firewalld.service is not available
2026-03-08T23:58:56.242 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mon to start...
2026-03-08T23:58:56.242 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mon...
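"Waiting for mon..." is a poll loop: cephadm repeatedly runs `ceph status` against the new monitor until it answers, and the status dump that follows is the first successful reply. A rough sketch of such a loop; the timeout and interval here are assumptions, not cephadm's actual values:

    import subprocess
    import time

    def wait_for_mon(timeout: float = 60.0, interval: float = 1.0) -> str:
        """Poll `ceph status` until the monitor responds or we time out."""
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            result = subprocess.run(["ceph", "status"],
                                    capture_output=True, text=True)
            if result.returncode == 0:
                return result.stdout  # mon is available
            time.sleep(interval)
        raise TimeoutError(f"mon did not come up within {timeout:.0f}s")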
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout cluster:
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout id: ae8f0172-1b4a-11f1-916a-712b2ac006b7
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout health: HEALTH_OK
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout services:
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon: 1 daemons, quorum vm03 (age 0.160979s)
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mgr: no daemons active
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd: 0 osds: 0 up, 0 in
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout data:
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout pools: 0 pools, 0 pgs
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout objects: 0 objects, 0 B
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout usage: 0 B used, 0 B / 0 B avail
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout pgs:
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.384+0000 7f73db0d3700 1 Processor -- start
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.385+0000 7f73db0d3700 1 -- start start
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.385+0000 7f73db0d3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73d4108980 0x7f73d4108da0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.385+0000 7f73db0d3700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f73d4109370 con 0x7f73d4108980
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.386+0000 7f73d8e6f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73d4108980 0x7f73d4108da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.386+0000 7f73d8e6f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73d4108980 0x7f73d4108da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60142/0 (socket says 192.168.123.103:60142)
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.386+0000 7f73d8e6f700 1 -- 192.168.123.103:0/1172717864 learned_addr learned my addr 192.168.123.103:0/1172717864 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-08T23:58:56.481
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.386+0000 7f73d8e6f700 1 -- 192.168.123.103:0/1172717864 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f73d4109b80 con 0x7f73d4108980 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.387+0000 7f73d8e6f700 1 --2- 192.168.123.103:0/1172717864 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73d4108980 0x7f73d4108da0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f73c0009cf0 tx=0x7f73c000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=b7bc700c09d15120 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.387+0000 7f73d37fe700 1 -- 192.168.123.103:0/1172717864 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f73c0004030 con 0x7f73d4108980 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.387+0000 7f73d37fe700 1 -- 192.168.123.103:0/1172717864 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f73c0004190 con 0x7f73d4108980 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.387+0000 7f73d37fe700 1 -- 192.168.123.103:0/1172717864 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f73c0004320 con 0x7f73d4108980 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.387+0000 7f73db0d3700 1 -- 192.168.123.103:0/1172717864 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73d4108980 msgr2=0x7f73d4108da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.387+0000 7f73db0d3700 1 --2- 192.168.123.103:0/1172717864 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73d4108980 0x7f73d4108da0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f73c0009cf0 tx=0x7f73c000b0e0 comp rx=0 tx=0).stop 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.387+0000 7f73db0d3700 1 -- 192.168.123.103:0/1172717864 shutdown_connections 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.387+0000 7f73db0d3700 1 --2- 192.168.123.103:0/1172717864 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73d4108980 0x7f73d4108da0 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.387+0000 7f73db0d3700 1 -- 192.168.123.103:0/1172717864 >> 192.168.123.103:0/1172717864 conn(0x7f73d41044d0 msgr2=0x7f73d41068c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.388+0000 7f73db0d3700 1 -- 192.168.123.103:0/1172717864 shutdown_connections 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.388+0000 7f73db0d3700 1 -- 192.168.123.103:0/1172717864 wait complete. 
2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.388+0000 7f73db0d3700 1 Processor -- start 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.388+0000 7f73db0d3700 1 -- start start 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.389+0000 7f73db0d3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73d4108980 0x7f73d419bd20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.389+0000 7f73db0d3700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f73d419c260 con 0x7f73d4108980 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.389+0000 7f73d8e6f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73d4108980 0x7f73d419bd20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.389+0000 7f73d8e6f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73d4108980 0x7f73d419bd20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60152/0 (socket says 192.168.123.103:60152) 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.389+0000 7f73d8e6f700 1 -- 192.168.123.103:0/1591727541 learned_addr learned my addr 192.168.123.103:0/1591727541 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.389+0000 7f73d8e6f700 1 -- 192.168.123.103:0/1591727541 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f73c0009740 con 0x7f73d4108980 2026-03-08T23:58:56.481 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.390+0000 7f73d8e6f700 1 --2- 192.168.123.103:0/1591727541 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73d4108980 0x7f73d419bd20 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f73c000b120 tx=0x7f73c0004750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.390+0000 7f73d1ffb700 1 -- 192.168.123.103:0/1591727541 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f73c00036a0 con 0x7f73d4108980 2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.390+0000 7f73d1ffb700 1 -- 192.168.123.103:0/1591727541 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f73c0003800 con 0x7f73d4108980 2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.390+0000 7f73d1ffb700 1 -- 192.168.123.103:0/1591727541 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f73c00039b0 con 0x7f73d4108980 2026-03-08T23:58:56.482 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.390+0000 7f73db0d3700 1 -- 192.168.123.103:0/1591727541 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f73d419c460 con 0x7f73d4108980 2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.390+0000 7f73db0d3700 1 -- 192.168.123.103:0/1591727541 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f73d419c880 con 0x7f73d4108980 2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.390+0000 7f73d1ffb700 1 -- 192.168.123.103:0/1591727541 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f73c0022020 con 0x7f73d4108980 2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.390+0000 7f73d1ffb700 1 -- 192.168.123.103:0/1591727541 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f73c001bc50 con 0x7f73d4108980 2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.391+0000 7f73db0d3700 1 -- 192.168.123.103:0/1591727541 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f73d4195570 con 0x7f73d4108980 2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.392+0000 7f73d1ffb700 1 -- 192.168.123.103:0/1591727541 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7f73c0044b10 con 0x7f73d4108980 2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.430+0000 7f73db0d3700 1 -- 192.168.123.103:0/1591727541 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "status"} v 0) v1 -- 0x7f73d402d0c0 con 0x7f73d4108980 2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.431+0000 7f73d1ffb700 1 -- 192.168.123.103:0/1591727541 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "status"}]=0 v0) v1 ==== 54+0+320 (secure 0 0 0) 0x7f73c0033070 con 0x7f73d4108980 2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.432+0000 7f73db0d3700 1 -- 192.168.123.103:0/1591727541 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73d4108980 msgr2=0x7f73d419bd20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.432+0000 7f73db0d3700 1 --2- 192.168.123.103:0/1591727541 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73d4108980 0x7f73d419bd20 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f73c000b120 tx=0x7f73c0004750 comp rx=0 tx=0).stop 2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.432+0000 7f73db0d3700 1 -- 192.168.123.103:0/1591727541 shutdown_connections 2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.432+0000 7f73db0d3700 1 --2- 192.168.123.103:0/1591727541 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73d4108980 0x7f73d419bd20 unknown 
:-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.432+0000 7f73db0d3700 1 -- 192.168.123.103:0/1591727541 >> 192.168.123.103:0/1591727541 conn(0x7f73d41044d0 msgr2=0x7f73d4190340 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.432+0000 7f73db0d3700 1 -- 192.168.123.103:0/1591727541 shutdown_connections
2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.433+0000 7f73db0d3700 1 -- 192.168.123.103:0/1591727541 wait complete.
2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:mon is available
2026-03-08T23:58:56.482 INFO:teuthology.orchestra.run.vm03.stdout:Assimilating anything we can from ceph.conf...
2026-03-08T23:58:56.721 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout
2026-03-08T23:58:56.721 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [global]
2026-03-08T23:58:56.721 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout fsid = ae8f0172-1b4a-11f1-916a-712b2ac006b7
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_host = [v2:192.168.123.103:3300,v1:192.168.123.103:6789]
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [mgr]
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mgr/telemetry/nag = false
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [osd]
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_map_max_advance = 10
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.611+0000 7fd8bd476700 1 Processor -- start
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.611+0000 7fd8bd476700 1 -- start start
2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.611+0000 7fd8bd476700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8b8106790 0x7fd8b8106bb0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:58:56.722
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.611+0000 7fd8bd476700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd8b8107180 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.611+0000 7fd8b6ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8b8106790 0x7fd8b8106bb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.611+0000 7fd8b6ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8b8106790 0x7fd8b8106bb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60166/0 (socket says 192.168.123.103:60166) 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.611+0000 7fd8b6ffd700 1 -- 192.168.123.103:0/1690069830 learned_addr learned my addr 192.168.123.103:0/1690069830 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.613+0000 7fd8b6ffd700 1 -- 192.168.123.103:0/1690069830 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd8b8107990 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.614+0000 7fd8b6ffd700 1 --2- 192.168.123.103:0/1690069830 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8b8106790 0x7fd8b8106bb0 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7fd8a001ab30 tx=0x7fd8a001ae40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=3f5c9761323b6afa server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.614+0000 7fd8b5ffb700 1 -- 192.168.123.103:0/1690069830 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd8a0004030 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.614+0000 7fd8b5ffb700 1 -- 192.168.123.103:0/1690069830 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7fd8a0004190 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.615+0000 7fd8bd476700 1 -- 192.168.123.103:0/1690069830 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8b8106790 msgr2=0x7fd8b8106bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.615+0000 7fd8bd476700 1 --2- 192.168.123.103:0/1690069830 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8b8106790 0x7fd8b8106bb0 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7fd8a001ab30 tx=0x7fd8a001ae40 comp rx=0 tx=0).stop 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.615+0000 7fd8bd476700 1 -- 192.168.123.103:0/1690069830 shutdown_connections 2026-03-08T23:58:56.722 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.615+0000 7fd8bd476700 1 --2- 192.168.123.103:0/1690069830 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8b8106790 0x7fd8b8106bb0 unknown :-1 s=CLOSED pgs=3 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.615+0000 7fd8bd476700 1 -- 192.168.123.103:0/1690069830 >> 192.168.123.103:0/1690069830 conn(0x7fd8b8101d30 msgr2=0x7fd8b8104170 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.615+0000 7fd8bd476700 1 -- 192.168.123.103:0/1690069830 shutdown_connections 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.615+0000 7fd8bd476700 1 -- 192.168.123.103:0/1690069830 wait complete. 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.615+0000 7fd8bd476700 1 Processor -- start 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.615+0000 7fd8bd476700 1 -- start start 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.616+0000 7fd8bd476700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8b8106790 0x7fd8b8193540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.616+0000 7fd8bd476700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd8b8107180 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.616+0000 7fd8b6ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8b8106790 0x7fd8b8193540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.616+0000 7fd8b6ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8b8106790 0x7fd8b8193540 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60172/0 (socket says 192.168.123.103:60172) 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.616+0000 7fd8b6ffd700 1 -- 192.168.123.103:0/4022670272 learned_addr learned my addr 192.168.123.103:0/4022670272 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.616+0000 7fd8b6ffd700 1 -- 192.168.123.103:0/4022670272 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd8a001a7e0 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.616+0000 7fd8b6ffd700 1 --2- 192.168.123.103:0/4022670272 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8b8106790 0x7fd8b8193540 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7fd8a0004000 tx=0x7fd8a0004750 comp rx=0 tx=0).ready 
entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.616+0000 7fd8affff700 1 -- 192.168.123.103:0/4022670272 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd8a0003850 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.616+0000 7fd8bd476700 1 -- 192.168.123.103:0/4022670272 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd8b8193a80 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.617+0000 7fd8bd476700 1 -- 192.168.123.103:0/4022670272 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd8b8193ea0 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.617+0000 7fd8affff700 1 -- 192.168.123.103:0/4022670272 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7fd8a0031070 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.617+0000 7fd8affff700 1 -- 192.168.123.103:0/4022670272 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd8a00039b0 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.617+0000 7fd8affff700 1 -- 192.168.123.103:0/4022670272 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7fd8a0033030 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.617+0000 7fd8affff700 1 -- 192.168.123.103:0/4022670272 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fd8a002d450 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.618+0000 7fd8bd476700 1 -- 192.168.123.103:0/4022670272 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd898005320 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.620+0000 7fd8affff700 1 -- 192.168.123.103:0/4022670272 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7fd8a002c450 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.662+0000 7fd8bd476700 1 -- 192.168.123.103:0/4022670272 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7fd898005cc0 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.666+0000 7fd8affff700 1 -- 192.168.123.103:0/4022670272 <== mon.0 v2:192.168.123.103:3300/0 7 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7fd8a002c5f0 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.667+0000 7fd8affff700 1 -- 192.168.123.103:0/4022670272 <== mon.0 
v2:192.168.123.103:3300/0 8 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v2) v1 ==== 70+0+435 (secure 0 0 0) 0x7fd8a003f600 con 0x7fd8b8106790 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.668+0000 7fd8bd476700 1 -- 192.168.123.103:0/4022670272 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8b8106790 msgr2=0x7fd8b8193540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.669+0000 7fd8bd476700 1 --2- 192.168.123.103:0/4022670272 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8b8106790 0x7fd8b8193540 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7fd8a0004000 tx=0x7fd8a0004750 comp rx=0 tx=0).stop 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.669+0000 7fd8bd476700 1 -- 192.168.123.103:0/4022670272 shutdown_connections 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.669+0000 7fd8bd476700 1 --2- 192.168.123.103:0/4022670272 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8b8106790 0x7fd8b8193540 unknown :-1 s=CLOSED pgs=4 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.669+0000 7fd8bd476700 1 -- 192.168.123.103:0/4022670272 >> 192.168.123.103:0/4022670272 conn(0x7fd8b8101d30 msgr2=0x7fd8b807a0f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.669+0000 7fd8bd476700 1 -- 192.168.123.103:0/4022670272 shutdown_connections 2026-03-08T23:58:56.722 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.669+0000 7fd8bd476700 1 -- 192.168.123.103:0/4022670272 wait complete. 2026-03-08T23:58:56.723 INFO:teuthology.orchestra.run.vm03.stdout:Generating new minimal ceph.conf... 
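The two mon_command calls visible in this stretch, `config assimilate-conf` followed by `config generate-minimal-conf`, are how bootstrap folds the seed ceph.conf into the monitor's configuration database and then writes back only the minimal stanza a client needs (fsid plus mon_host). The same pair can be driven from the CLI; a sketch, assuming the admin keyring this bootstrap just wrote is readable:

    import subprocess

    # Fold options from a conf file into the mon config database; whatever
    # the mon cannot store is written back out (the [global]/[mgr]/[osd]
    # dump above appears to be that residue).
    subprocess.run(["ceph", "config", "assimilate-conf",
                    "-i", "/etc/ceph/ceph.conf"], check=True)

    # Emit the minimal conf a client needs to reach this cluster.
    minimal = subprocess.run(["ceph", "config", "generate-minimal-conf"],
                             check=True, capture_output=True,
                             text=True).stdout
    print(minimal)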
2026-03-08T23:58:56.921 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.847+0000 7efe21792700 1 Processor -- start 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.848+0000 7efe21792700 1 -- start start 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.848+0000 7efe21792700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe1c07b760 0x7efe1c07bb80 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.848+0000 7efe21792700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efe1c07c150 con 0x7efe1c07b760 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.848+0000 7efe1affd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe1c07b760 0x7efe1c07bb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.848+0000 7efe1affd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe1c07b760 0x7efe1c07bb80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60180/0 (socket says 192.168.123.103:60180) 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.848+0000 7efe1affd700 1 -- 192.168.123.103:0/3452181472 learned_addr learned my addr 192.168.123.103:0/3452181472 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.848+0000 7efe1affd700 1 -- 192.168.123.103:0/3452181472 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efe1c07c9b0 con 0x7efe1c07b760 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.849+0000 7efe1affd700 1 --2- 192.168.123.103:0/3452181472 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe1c07b760 0x7efe1c07bb80 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7efe0c01ab30 tx=0x7efe0c01ae40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=cea891c3ff54d320 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.849+0000 7efe1a7fc700 1 -- 192.168.123.103:0/3452181472 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efe0c004030 con 0x7efe1c07b760 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.849+0000 7efe1a7fc700 1 -- 192.168.123.103:0/3452181472 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7efe0c01c8b0 con 0x7efe1c07b760 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.849+0000 7efe21792700 1 -- 192.168.123.103:0/3452181472 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe1c07b760 msgr2=0x7efe1c07bb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.849+0000 7efe21792700 1 --2- 192.168.123.103:0/3452181472 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe1c07b760 0x7efe1c07bb80 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7efe0c01ab30 tx=0x7efe0c01ae40 comp rx=0 tx=0).stop 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.850+0000 7efe21792700 1 -- 192.168.123.103:0/3452181472 shutdown_connections 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.850+0000 7efe21792700 1 --2- 192.168.123.103:0/3452181472 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe1c07b760 0x7efe1c07bb80 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.850+0000 7efe21792700 1 -- 192.168.123.103:0/3452181472 >> 192.168.123.103:0/3452181472 conn(0x7efe1c103f50 msgr2=0x7efe1c106370 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.850+0000 7efe21792700 1 -- 192.168.123.103:0/3452181472 shutdown_connections 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.850+0000 7efe21792700 1 -- 192.168.123.103:0/3452181472 wait complete. 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.850+0000 7efe21792700 1 Processor -- start 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.850+0000 7efe21792700 1 -- start start 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.850+0000 7efe21792700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe1c07b760 0x7efe1c1a09a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.850+0000 7efe21792700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efe1c07c150 con 0x7efe1c07b760 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.850+0000 7efe1affd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe1c07b760 0x7efe1c1a09a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.850+0000 7efe1affd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe1c07b760 0x7efe1c1a09a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60196/0 (socket says 192.168.123.103:60196) 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.850+0000 7efe1affd700 1 -- 192.168.123.103:0/1965105240 learned_addr learned my addr 192.168.123.103:0/1965105240 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: 
stderr 2026-03-08T23:58:56.851+0000 7efe1affd700 1 -- 192.168.123.103:0/1965105240 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efe0c01a7e0 con 0x7efe1c07b760 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.851+0000 7efe1affd700 1 --2- 192.168.123.103:0/1965105240 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe1c07b760 0x7efe1c1a09a0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7efe0c006b20 tx=0x7efe0c0042e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.851+0000 7efe18ff9700 1 -- 192.168.123.103:0/1965105240 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efe0c0036a0 con 0x7efe1c07b760 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.851+0000 7efe21792700 1 -- 192.168.123.103:0/1965105240 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efe1c1a0ee0 con 0x7efe1c07b760 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.851+0000 7efe21792700 1 -- 192.168.123.103:0/1965105240 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efe1c1a1300 con 0x7efe1c07b760 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.852+0000 7efe18ff9700 1 -- 192.168.123.103:0/1965105240 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7efe0c02c8b0 con 0x7efe1c07b760 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.852+0000 7efe18ff9700 1 -- 192.168.123.103:0/1965105240 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efe0c0227b0 con 0x7efe1c07b760 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.852+0000 7efe18ff9700 1 -- 192.168.123.103:0/1965105240 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7efe0c022cd0 con 0x7efe1c07b760 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.852+0000 7efe18ff9700 1 -- 192.168.123.103:0/1965105240 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7efe0c02a3d0 con 0x7efe1c07b760 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.852+0000 7efe21792700 1 -- 192.168.123.103:0/1965105240 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efe00005320 con 0x7efe1c07b760 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.854+0000 7efe18ff9700 1 -- 192.168.123.103:0/1965105240 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7efe0c053b10 con 0x7efe1c07b760 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.890+0000 7efe21792700 1 -- 192.168.123.103:0/1965105240 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "config generate-minimal-conf"} v 0) v1 -- 0x7efe00005190 con 0x7efe1c07b760 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.890+0000 7efe18ff9700 1 -- 192.168.123.103:0/1965105240 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v2) v1 ==== 76+0+181 (secure 0 0 0) 0x7efe0c042030 con 0x7efe1c07b760 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.891+0000 7efe21792700 1 -- 192.168.123.103:0/1965105240 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe1c07b760 msgr2=0x7efe1c1a09a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.891+0000 7efe21792700 1 --2- 192.168.123.103:0/1965105240 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe1c07b760 0x7efe1c1a09a0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7efe0c006b20 tx=0x7efe0c0042e0 comp rx=0 tx=0).stop 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.891+0000 7efe21792700 1 -- 192.168.123.103:0/1965105240 shutdown_connections 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.891+0000 7efe21792700 1 --2- 192.168.123.103:0/1965105240 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efe1c07b760 0x7efe1c1a09a0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.891+0000 7efe21792700 1 -- 192.168.123.103:0/1965105240 >> 192.168.123.103:0/1965105240 conn(0x7efe1c103f50 msgr2=0x7efe1c106120 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.892+0000 7efe21792700 1 -- 192.168.123.103:0/1965105240 shutdown_connections 2026-03-08T23:58:56.922 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:56.892+0000 7efe21792700 1 -- 192.168.123.103:0/1965105240 wait complete. 2026-03-08T23:58:56.923 INFO:teuthology.orchestra.run.vm03.stdout:Restarting the monitor... 
2026-03-08T23:58:57.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:57 vm03 podman[52264]: 2026-03-08 23:58:57.06670302 +0000 UTC m=+0.075741122 container died 46526bea036e9ea54a581bf84c79b0f62d47426e7a24a7180f33805893b57a59 (image=quay.io/ceph/ceph:v18.2.1, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03, org.label-schema.license=GPLv2, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, GIT_BRANCH=HEAD, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20240222, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.1) 2026-03-08T23:58:57.755 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:57 vm03 podman[52264]: 2026-03-08 23:58:57.744264177 +0000 UTC m=+0.753302279 container remove 46526bea036e9ea54a581bf84c79b0f62d47426e7a24a7180f33805893b57a59 (image=quay.io/ceph/ceph:v18.2.1, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.label-schema.schema-version=1.0, RELEASE=HEAD, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.build-date=20240222, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.1, GIT_CLEAN=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd) 2026-03-08T23:58:57.756 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:57 vm03 bash[52264]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03 2026-03-08T23:58:58.064 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:57 vm03 systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mon.vm03.service: Deactivated successfully. 2026-03-08T23:58:58.064 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:57 vm03 systemd[1]: Stopped Ceph mon.vm03 for ae8f0172-1b4a-11f1-916a-712b2ac006b7. 2026-03-08T23:58:58.064 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:57 vm03 systemd[1]: Starting Ceph mon.vm03 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 
2026-03-08T23:58:58.064 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:57 vm03 podman[52332]: 2026-03-08 23:58:57.992245111 +0000 UTC m=+0.090764830 container create f9863944dcfb0d091ecc36cb189641e022fde809f5586637fc314c55837f5195 (image=quay.io/ceph/ceph:v18.2.1, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03, org.label-schema.vendor=CentOS, io.buildah.version=1.29.1, org.label-schema.name=CentOS Stream 8 Base Image, GIT_BRANCH=HEAD, org.label-schema.license=GPLv2, org.label-schema.build-date=20240222, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, RELEASE=HEAD, GIT_CLEAN=True, org.label-schema.schema-version=1.0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, CEPH_POINT_RELEASE=-18.2.1, maintainer=Guillaume Abrioux ) 2026-03-08T23:58:58.064 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 podman[52332]: 2026-03-08 23:58:57.911462564 +0000 UTC m=+0.009982283 image pull 5be31c24972a920012b90a9769e8313e2490c82aee752aefd49af9cf4c0f3fcf quay.io/ceph/ceph:v18.2.1 2026-03-08T23:58:58.093 INFO:teuthology.orchestra.run.vm03.stdout:Setting public_network to 192.168.123.0/24 in global config section 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 podman[52332]: 2026-03-08 23:58:58.082382195 +0000 UTC m=+0.180901914 container init f9863944dcfb0d091ecc36cb189641e022fde809f5586637fc314c55837f5195 (image=quay.io/ceph/ceph:v18.2.1, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03, GIT_CLEAN=True, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20240222, ceph=True, maintainer=Guillaume Abrioux , io.buildah.version=1.29.1, org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=-18.2.1, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, RELEASE=HEAD, org.label-schema.vendor=CentOS) 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 podman[52332]: 2026-03-08 23:58:58.08515793 +0000 UTC m=+0.183677649 container start f9863944dcfb0d091ecc36cb189641e022fde809f5586637fc314c55837f5195 (image=quay.io/ceph/ceph:v18.2.1, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03, org.label-schema.license=GPLv2, org.label-schema.build-date=20240222, ceph=True, CEPH_POINT_RELEASE=-18.2.1, maintainer=Guillaume Abrioux , org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, RELEASE=HEAD, GIT_CLEAN=True, io.buildah.version=1.29.1, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0) 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 bash[52332]: f9863944dcfb0d091ecc36cb189641e022fde809f5586637fc314c55837f5195 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 systemd[1]: Started Ceph mon.vm03 for ae8f0172-1b4a-11f1-916a-712b2ac006b7. 
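Annotation: the restart completes under systemd. cephadm daemons run as template units named ceph-<fsid>@<daemon>.<id>.service — here ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mon.vm03.service, per the "Started Ceph mon.vm03" line above. A hedged helper for probing such a unit's state; the function and its use are illustrative, only the fsid and naming scheme come from the log:

    import subprocess

    FSID = "ae8f0172-1b4a-11f1-916a-712b2ac006b7"

    def unit_active(daemon: str, daemon_id: str, fsid: str = FSID) -> bool:
        unit = f"ceph-{fsid}@{daemon}.{daemon_id}.service"
        # "systemctl is-active --quiet" exits 0 only while the unit is active.
        rc = subprocess.run(
            ["systemctl", "is-active", "--quiet", unit]
        ).returncode
        return rc == 0

    # e.g. unit_active("mon", "vm03") should be True once the
    # "Started Ceph mon.vm03 ..." message above has fired.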
2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: set uid:gid to 167:167 (ceph:ceph) 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable), process ceph-mon, pid 2 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: pidfile_write: ignore empty --pid-file 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: load: jerasure load: lrc 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: RocksDB version: 7.9.2 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Git sha 0 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Compile date 2023-12-11 22:07:34 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: DB SUMMARY 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: DB Session ID: U2IR9JU9WYXSP24YSXJG 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: CURRENT file: CURRENT 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: IDENTITY file: IDENTITY 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: MANIFEST file: MANIFEST-000010 size: 179 Bytes 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm03/store.db dir, Total Num: 1, files: 000008.sst 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm03/store.db: 000009.log size: 89048 ; 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.error_if_exists: 0 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.create_if_missing: 0 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.paranoid_checks: 1 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.env: 0x56387e728720 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.fs: PosixFileSystem 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.info_log: 0x563880f15360 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: 
rocksdb: Options.max_file_opening_threads: 16 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.statistics: (nil) 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.use_fsync: 0 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_log_file_size: 0 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.keep_log_file_num: 1000 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.recycle_log_file_num: 0 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.allow_fallocate: 1 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.allow_mmap_reads: 0 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.allow_mmap_writes: 0 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.use_direct_reads: 0 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-08T23:58:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.create_missing_column_families: 0 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.db_log_dir: 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.wal_dir: 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.advise_random_on_open: 1 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.db_write_buffer_size: 0 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.write_buffer_manager: 0x5638801a4320 
2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.rate_limiter: (nil) 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.wal_recovery_mode: 2 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.enable_thread_tracking: 0 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.enable_pipelined_write: 0 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.unordered_write: 0 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.row_cache: None 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.wal_filter: None 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.avoid_flush_during_recovery: 0 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.allow_ingest_behind: 0 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.two_write_queues: 0 2026-03-08T23:58:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.manual_wal_flush: 0 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.wal_compression: 0 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.atomic_flush: 0 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.write_dbid_to_manifest: 0 
2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.log_readahead_size: 0 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.best_efforts_recovery: 0 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.allow_data_in_errors: 0 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.db_host_id: __hostname__ 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_background_jobs: 2 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_background_compactions: -1 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_subcompactions: 1 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_total_wal_size: 0 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_open_files: -1 2026-03-08T23:58:58.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.bytes_per_sync: 0 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: 
rocksdb: Options.compaction_readahead_size: 0 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_background_flushes: -1 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Compression algorithms supported: 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: kZSTD supported: 0 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: kXpressCompression supported: 0 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: kLZ4HCCompression supported: 1 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: kZlibCompression supported: 1 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: kSnappyCompression supported: 1 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: kLZ4Compression supported: 1 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: kBZip2Compression supported: 0 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm03/store.db/MANIFEST-000010 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.merge_operator: 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compaction_filter: None 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compaction_filter_factory: None 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.sst_partitioner_factory: None 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x563880f15480) 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 
cache_index_and_filter_blocks: 1 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: pin_top_level_index_and_filter: 1 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_type: 0 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: data_block_index_type: 0 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_shortening: 1 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: checksum: 4 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: no_block_cache: 0 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache: 0x5638802271f0 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_name: BinnedLRUCache 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_options: 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: capacity : 536870912 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: num_shard_bits : 4 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: strict_capacity_limit : 0 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: high_pri_pool_ratio: 0.000 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_compressed: (nil) 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: persistent_cache: (nil) 2026-03-08T23:58:58.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_size: 4096 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_size_deviation: 10 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_restart_interval: 16 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_block_restart_interval: 1 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout: metadata_block_size: 4096 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout: partition_filters: 0 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout: use_delta_encoding: 1 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout: filter_policy: bloomfilter 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout: whole_key_filtering: 1 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout: verify_compression: 0 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout: read_amp_bytes_per_bit: 0 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout: format_version: 5 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout: enable_index_compression: 1 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_align: 0 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout: max_auto_readahead_size: 262144 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout: prepopulate_block_cache: 0 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout: initial_auto_readahead_size: 8192 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout: num_file_reads_for_auto_readahead: 2 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.write_buffer_size: 33554432 
2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_write_buffer_number: 2 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compression: NoCompression 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.bottommost_compression: Disabled 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.prefix_extractor: nullptr 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.num_levels: 7 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.bottommost_compression_opts.level: 32767 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compression_opts.level: 32767 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compression_opts.strategy: 0 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-08T23:58:58.341 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compression_opts.enabled: false 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-08T23:58:58.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.target_file_size_base: 67108864 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-08T23:58:58.342 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.arena_block_size: 1048576 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.disable_auto_compactions: 0 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.table_properties_collectors: 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.inplace_update_support: 0 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: 
Options.bloom_locality: 0 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.max_successive_merges: 0 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.paranoid_file_checks: 0 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.force_consistency_checks: 1 2026-03-08T23:58:58.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.report_bg_io_stats: 0 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.ttl: 2592000 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.enable_blob_files: false 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.min_blob_size: 0 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.blob_file_size: 268435456 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.blob_file_starting_level: 0 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm03/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 12, last_sequence is 5, log_number is 5,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 5 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 
08 23:58:58 vm03 ceph-mon[52346]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: fd8aa2b9-ca97-4f1b-9ec0-1d302426f007 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773014338110241, "job": 1, "event": "recovery_started", "wal_files": [9]} 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #9 mode 2 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773014338134402, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 13, "file_size": 84711, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8, "largest_seqno": 287, "table_properties": {"data_size": 82789, "index_size": 209, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 13288, "raw_average_key_size": 51, "raw_value_size": 75614, "raw_average_value_size": 293, "num_data_blocks": 9, "num_entries": 258, "num_filter_entries": 258, "num_deletions": 3, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773014338, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fd8aa2b9-ca97-4f1b-9ec0-1d302426f007", "db_session_id": "U2IR9JU9WYXSP24YSXJG", "orig_file_number": 13, "seqno_to_time_mapping": "N/A"}} 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773014338134453, "job": 1, "event": "recovery_finished"} 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: [db/version_set.cc:5047] Creating manifest 15 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm03/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5638802c4000 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: rocksdb: DB pointer 0x5638802b0000 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: starting mon.vm03 rank 0 at public addrs [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] at bind addrs [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon_data /var/lib/ceph/mon/ceph-vm03 fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: mon.vm03@-1(???) 
e1 preinit fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: mon.vm03@-1(???).mds e1 new map 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: mon.vm03@-1(???).mds e1 print_map 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout: e1 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout: legacy client fscid: -1 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout: 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout: No filesystems configured 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: mon.vm03@-1(???).osd e1 crush map has features 3314932999778484224, adjusting msgr requires 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: mon.vm03@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: mon.vm03@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: mon.vm03@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: mon.vm03@-1(???).paxosservice(auth 1..2) refresh upgraded, format 0 -> 3 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: mon.vm03 is new leader, mons vm03 in quorum (ranks 0) 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: monmap e1: 1 mons at {vm03=[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0]} removed_ranks: {} disallowed_leaders: {} 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: fsmap 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: osdmap e1: 0 total, 0 up, 0 in 2026-03-08T23:58:58.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:58 vm03 ceph-mon[52346]: mgrmap e1: no daemons active 2026-03-08T23:58:58.473 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.278+0000 7f290aa12700 1 Processor -- start 2026-03-08T23:58:58.473 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.279+0000 7f290aa12700 1 -- start start 2026-03-08T23:58:58.473 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.279+0000 7f290aa12700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2904104fb0 0x7f29041073e0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:58:58.473 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.279+0000 7f290aa12700 1 -- --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2904074720 con 0x7f2904104fb0 2026-03-08T23:58:58.473 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.279+0000 7f2903fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2904104fb0 0x7f29041073e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:58:58.473 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.279+0000 7f2903fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2904104fb0 0x7f29041073e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60198/0 (socket says 192.168.123.103:60198) 2026-03-08T23:58:58.473 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.279+0000 7f2903fff700 1 -- 192.168.123.103:0/2931592182 learned_addr learned my addr 192.168.123.103:0/2931592182 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:58:58.473 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.279+0000 7f2903fff700 1 -- 192.168.123.103:0/2931592182 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2904107920 con 0x7f2904104fb0 2026-03-08T23:58:58.473 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.279+0000 7f2903fff700 1 --2- 192.168.123.103:0/2931592182 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2904104fb0 0x7f29041073e0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f28f8009a90 tx=0x7f28f8009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=5b9f2072e99b9ed2 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:58:58.473 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.291+0000 7f2902ffd700 1 -- 192.168.123.103:0/2931592182 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f28f8004030 con 0x7f2904104fb0 2026-03-08T23:58:58.473 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.291+0000 7f2902ffd700 1 -- 192.168.123.103:0/2931592182 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f28f800b7e0 con 0x7f2904104fb0 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.291+0000 7f2902ffd700 1 -- 192.168.123.103:0/2931592182 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f28f8003920 con 0x7f2904104fb0 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.293+0000 7f290aa12700 1 -- 192.168.123.103:0/2931592182 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2904104fb0 msgr2=0x7f29041073e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.293+0000 7f290aa12700 1 --2- 192.168.123.103:0/2931592182 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2904104fb0 0x7f29041073e0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f28f8009a90 tx=0x7f28f8009da0 comp rx=0 tx=0).stop 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: 
stderr 2026-03-08T23:58:58.293+0000 7f290aa12700 1 -- 192.168.123.103:0/2931592182 shutdown_connections 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.293+0000 7f290aa12700 1 --2- 192.168.123.103:0/2931592182 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2904104fb0 0x7f29041073e0 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.293+0000 7f290aa12700 1 -- 192.168.123.103:0/2931592182 >> 192.168.123.103:0/2931592182 conn(0x7f2904100bd0 msgr2=0x7f2904103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.293+0000 7f290aa12700 1 -- 192.168.123.103:0/2931592182 shutdown_connections 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.293+0000 7f290aa12700 1 -- 192.168.123.103:0/2931592182 wait complete. 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.294+0000 7f290aa12700 1 Processor -- start 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.294+0000 7f290aa12700 1 -- start start 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.294+0000 7f290aa12700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2904104fb0 0x7f29041a0840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.294+0000 7f290aa12700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f29041a0d80 con 0x7f2904104fb0 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.295+0000 7f2903fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2904104fb0 0x7f29041a0840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.295+0000 7f2903fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2904104fb0 0x7f29041a0840 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60210/0 (socket says 192.168.123.103:60210) 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.295+0000 7f2903fff700 1 -- 192.168.123.103:0/912576378 learned_addr learned my addr 192.168.123.103:0/912576378 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.295+0000 7f2903fff700 1 -- 192.168.123.103:0/912576378 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f28f8009740 con 0x7f2904104fb0 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.296+0000 7f2903fff700 1 --2- 192.168.123.103:0/912576378 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2904104fb0 
0x7f29041a0840 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f28f8003890 tx=0x7f28f8003970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.296+0000 7f29017fa700 1 -- 192.168.123.103:0/912576378 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f28f8003b40 con 0x7f2904104fb0 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.296+0000 7f29017fa700 1 -- 192.168.123.103:0/912576378 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f28f801a430 con 0x7f2904104fb0 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.296+0000 7f29017fa700 1 -- 192.168.123.103:0/912576378 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f28f8011420 con 0x7f2904104fb0 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.296+0000 7f290aa12700 1 -- 192.168.123.103:0/912576378 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f29041a0f80 con 0x7f2904104fb0 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.296+0000 7f290aa12700 1 -- 192.168.123.103:0/912576378 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f29041a13a0 con 0x7f2904104fb0 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.297+0000 7f290aa12700 1 -- 192.168.123.103:0/912576378 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f290419a290 con 0x7f2904104fb0 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.298+0000 7f29017fa700 1 -- 192.168.123.103:0/912576378 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f28f8028020 con 0x7f2904104fb0 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.298+0000 7f29017fa700 1 -- 192.168.123.103:0/912576378 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f28f8011900 con 0x7f2904104fb0 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.299+0000 7f29017fa700 1 -- 192.168.123.103:0/912576378 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7f28f8011c80 con 0x7f2904104fb0 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.335+0000 7f290aa12700 1 -- 192.168.123.103:0/912576378 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=public_network}] v 0) v1 -- 0x7f290402cef0 con 0x7f2904104fb0 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.344+0000 7f29017fa700 1 -- 192.168.123.103:0/912576378 <== mon.0 v2:192.168.123.103:3300/0 7 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f28f8011e20 con 0x7f2904104fb0 2026-03-08T23:58:58.474 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.344+0000 7f29017fa700 1 -- 192.168.123.103:0/912576378 <== mon.0 v2:192.168.123.103:3300/0 8 ==== mon_command_ack([{prefix=config set, name=public_network}]=0 v3) v1 ==== 130+0+0 (secure 0 0 0) 0x7f28f802bd30 con 0x7f2904104fb0 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.346+0000 7f290aa12700 1 -- 192.168.123.103:0/912576378 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2904104fb0 msgr2=0x7f29041a0840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.346+0000 7f290aa12700 1 --2- 192.168.123.103:0/912576378 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2904104fb0 0x7f29041a0840 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f28f8003890 tx=0x7f28f8003970 comp rx=0 tx=0).stop 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.347+0000 7f290aa12700 1 -- 192.168.123.103:0/912576378 shutdown_connections 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.347+0000 7f290aa12700 1 --2- 192.168.123.103:0/912576378 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2904104fb0 0x7f29041a0840 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.347+0000 7f290aa12700 1 -- 192.168.123.103:0/912576378 >> 192.168.123.103:0/912576378 conn(0x7f2904100bd0 msgr2=0x7f2904103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.347+0000 7f290aa12700 1 -- 192.168.123.103:0/912576378 shutdown_connections 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:58.347+0000 7f290aa12700 1 -- 192.168.123.103:0/912576378 wait complete. 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:Wrote config to /etc/ceph/ceph.conf 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:Wrote keyring to /etc/ceph/ceph.client.admin.keyring 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:Creating mgr... 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:Verifying port 0.0.0.0:9283 ... 2026-03-08T23:58:58.474 INFO:teuthology.orchestra.run.vm03.stdout:Verifying port 0.0.0.0:8765 ... 2026-03-08T23:58:58.475 INFO:teuthology.orchestra.run.vm03.stdout:Verifying port 0.0.0.0:8443 ... 2026-03-08T23:58:58.694 INFO:teuthology.orchestra.run.vm03.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mgr.vm03.yvcons 2026-03-08T23:58:58.694 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Failed to reset failed state of unit ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mgr.vm03.yvcons.service: Unit ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mgr.vm03.yvcons.service not loaded. 2026-03-08T23:58:58.823 INFO:teuthology.orchestra.run.vm03.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7.target.wants/ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mgr.vm03.yvcons.service → /etc/systemd/system/ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@.service.
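Annotation: the mon_command_ack at the top of this stretch closes the loop on the earlier "Setting public_network to 192.168.123.0/24 in global config section" message — bootstrap pushes the CIDR into the cluster config before writing /etc/ceph/ceph.conf and deploying the mgr. The CLI equivalent, as a minimal sketch under the same CLI/keyring assumptions as the earlier snippets:

    import subprocess

    def set_public_network(cidr: str = "192.168.123.0/24") -> None:
        # Equivalent of the mon_command({"prefix": "config set", ...})
        # acknowledged above; "global" matches the section named in the log.
        subprocess.run(
            ["ceph", "config", "set", "global", "public_network", cidr],
            check=True,
        )

The non-zero exit from systemctl reset-failed that follows is benign here: per the stderr line, the mgr unit had never been loaded, so there was no failed state to clear.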
2026-03-08T23:58:59.005 INFO:teuthology.orchestra.run.vm03.stdout:firewalld does not appear to be present 2026-03-08T23:58:59.005 INFO:teuthology.orchestra.run.vm03.stdout:Not possible to enable service . firewalld.service is not available 2026-03-08T23:58:59.005 INFO:teuthology.orchestra.run.vm03.stdout:firewalld does not appear to be present 2026-03-08T23:58:59.005 INFO:teuthology.orchestra.run.vm03.stdout:Not possible to open ports <[9283, 8765, 8443]>. firewalld.service is not available 2026-03-08T23:58:59.006 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mgr to start... 2026-03-08T23:58:59.006 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mgr... 2026-03-08T23:58:59.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-08T23:58:59.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-08T23:58:59.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsid": "ae8f0172-1b4a-11f1-916a-712b2ac006b7", 2026-03-08T23:58:59.291 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "health": { 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 0 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "vm03" 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_age": 1, 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-08T23:58:59.292 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-08T23:58:59.292 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "restful" 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modified": "2026-03-08T23:58:56.272906+0000", 2026-03-08T23:58:59.293 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.153+0000 
7f0b7df3f700 1 Processor -- start 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.154+0000 7f0b7df3f700 1 -- start start 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.154+0000 7f0b7df3f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b780721d0 0x7f0b780725f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.154+0000 7f0b7df3f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0b78072bc0 con 0x7f0b780721d0 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.154+0000 7f0b7cf3d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b780721d0 0x7f0b780725f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.154+0000 7f0b7cf3d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b780721d0 0x7f0b780725f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60228/0 (socket says 192.168.123.103:60228) 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.154+0000 7f0b7cf3d700 1 -- 192.168.123.103:0/101731723 learned_addr learned my addr 192.168.123.103:0/101731723 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.155+0000 7f0b7cf3d700 1 -- 192.168.123.103:0/101731723 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0b7810e1c0 con 0x7f0b780721d0 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.155+0000 7f0b7cf3d700 1 --2- 192.168.123.103:0/101731723 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b780721d0 0x7f0b780725f0 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7f0b6800ab30 tx=0x7f0b68010730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=b08e4056540d9ecc server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.156+0000 7f0b777fe700 1 -- 192.168.123.103:0/101731723 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0b68010e00 con 0x7f0b780721d0 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.156+0000 7f0b777fe700 1 -- 192.168.123.103:0/101731723 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f0b680044d0 con 0x7f0b780721d0 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.156+0000 7f0b7df3f700 1 -- 192.168.123.103:0/101731723 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b780721d0 msgr2=0x7f0b780725f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-08T23:58:59.156+0000 7f0b7df3f700 1 --2- 192.168.123.103:0/101731723 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b780721d0 0x7f0b780725f0 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7f0b6800ab30 tx=0x7f0b68010730 comp rx=0 tx=0).stop 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.157+0000 7f0b7df3f700 1 -- 192.168.123.103:0/101731723 shutdown_connections 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.157+0000 7f0b7df3f700 1 --2- 192.168.123.103:0/101731723 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b780721d0 0x7f0b780725f0 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.157+0000 7f0b7df3f700 1 -- 192.168.123.103:0/101731723 >> 192.168.123.103:0/101731723 conn(0x7f0b7806d320 msgr2=0x7f0b7806f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.157+0000 7f0b7df3f700 1 -- 192.168.123.103:0/101731723 shutdown_connections 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.157+0000 7f0b7df3f700 1 -- 192.168.123.103:0/101731723 wait complete. 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.157+0000 7f0b7df3f700 1 Processor -- start 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.157+0000 7f0b7df3f700 1 -- start start 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.158+0000 7f0b7df3f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b781b18c0 0x7f0b781b1ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.158+0000 7f0b7df3f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0b6801a410 con 0x7f0b781b18c0 2026-03-08T23:58:59.294 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.158+0000 7f0b7cf3d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b781b18c0 0x7f0b781b1ce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.158+0000 7f0b7cf3d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b781b18c0 0x7f0b781b1ce0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60240/0 (socket says 192.168.123.103:60240) 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.158+0000 7f0b7cf3d700 1 -- 192.168.123.103:0/1195813959 learned_addr learned my addr 192.168.123.103:0/1195813959 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.158+0000 7f0b7cf3d700 1 -- 192.168.123.103:0/1195813959 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0b6800a7e0 con 0x7f0b781b18c0 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.159+0000 7f0b7cf3d700 1 --2- 192.168.123.103:0/1195813959 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b781b18c0 0x7f0b781b1ce0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f0b6800bbd0 tx=0x7f0b68003980 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.159+0000 7f0b75ffb700 1 -- 192.168.123.103:0/1195813959 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0b68003bd0 con 0x7f0b781b18c0 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.159+0000 7f0b7df3f700 1 -- 192.168.123.103:0/1195813959 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0b781b2220 con 0x7f0b781b18c0 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.159+0000 7f0b7df3f700 1 -- 192.168.123.103:0/1195813959 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0b781b4ea0 con 0x7f0b781b18c0 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.160+0000 7f0b75ffb700 1 -- 192.168.123.103:0/1195813959 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f0b6800f070 con 0x7f0b781b18c0 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.160+0000 7f0b75ffb700 1 -- 192.168.123.103:0/1195813959 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0b680229a0 con 0x7f0b781b18c0 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.160+0000 7f0b75ffb700 1 -- 192.168.123.103:0/1195813959 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f0b68018070 con 0x7f0b781b18c0 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.160+0000 7f0b75ffb700 1 -- 192.168.123.103:0/1195813959 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f0b6802c8e0 con 0x7f0b781b18c0 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.161+0000 7f0b7df3f700 1 -- 192.168.123.103:0/1195813959 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0b7804f000 con 0x7f0b781b18c0 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.162+0000 7f0b75ffb700 1 -- 192.168.123.103:0/1195813959 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7f0b68027070 con 0x7f0b781b18c0 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.206+0000 7f0b7df3f700 1 -- 192.168.123.103:0/1195813959 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 
0x7f0b781b50f0 con 0x7f0b781b18c0 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.209+0000 7f0b75ffb700 1 -- 192.168.123.103:0/1195813959 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7f0b68003d30 con 0x7f0b781b18c0 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.213+0000 7f0b7df3f700 1 -- 192.168.123.103:0/1195813959 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b781b18c0 msgr2=0x7f0b781b1ce0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.213+0000 7f0b7df3f700 1 --2- 192.168.123.103:0/1195813959 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b781b18c0 0x7f0b781b1ce0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f0b6800bbd0 tx=0x7f0b68003980 comp rx=0 tx=0).stop 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.213+0000 7f0b7df3f700 1 -- 192.168.123.103:0/1195813959 shutdown_connections 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.213+0000 7f0b7df3f700 1 --2- 192.168.123.103:0/1195813959 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b781b18c0 0x7f0b781b1ce0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.213+0000 7f0b7df3f700 1 -- 192.168.123.103:0/1195813959 >> 192.168.123.103:0/1195813959 conn(0x7f0b7806d320 msgr2=0x7f0b7806dd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.214+0000 7f0b7df3f700 1 -- 192.168.123.103:0/1195813959 shutdown_connections 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:58:59.214+0000 7f0b7df3f700 1 -- 192.168.123.103:0/1195813959 wait complete. 2026-03-08T23:58:59.295 INFO:teuthology.orchestra.run.vm03.stdout:mgr not available, waiting (1/15)... 2026-03-08T23:58:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:59 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/912576378' entity='client.admin' 2026-03-08T23:58:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:58:59 vm03 ceph-mon[52346]: from='client.? 
192.168.123.103:0/1195813959' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsid": "ae8f0172-1b4a-11f1-916a-712b2ac006b7", 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "health": { 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 0 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "vm03" 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_age": 3, 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-08T23:59:01.537 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: 
stdout "num_pools": 0, 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "restful" 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modified": "2026-03-08T23:58:56.272906+0000", 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.449+0000 7f9217fff700 1 Processor -- start 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.450+0000 7f9217fff700 1 -- start start 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.450+0000 7f9217fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9218071e00 0x7f9218072220 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: 
stderr 2026-03-08T23:59:01.450+0000 7f9217fff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f92180727f0 con 0x7f9218071e00 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.450+0000 7f9216ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9218071e00 0x7f9218072220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.450+0000 7f9216ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9218071e00 0x7f9218072220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60254/0 (socket says 192.168.123.103:60254) 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.450+0000 7f9216ffd700 1 -- 192.168.123.103:0/473356241 learned_addr learned my addr 192.168.123.103:0/473356241 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.451+0000 7f9216ffd700 1 -- 192.168.123.103:0/473356241 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f921810ddb0 con 0x7f9218071e00 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.451+0000 7f9216ffd700 1 --2- 192.168.123.103:0/473356241 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9218071e00 0x7f9218072220 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f9208009a90 tx=0x7f9208009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=2568212a9912ff74 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.451+0000 7f9215ffb700 1 -- 192.168.123.103:0/473356241 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9208004030 con 0x7f9218071e00 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.451+0000 7f9215ffb700 1 -- 192.168.123.103:0/473356241 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f920800b7e0 con 0x7f9218071e00 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.452+0000 7f9217fff700 1 -- 192.168.123.103:0/473356241 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9218071e00 msgr2=0x7f9218072220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.452+0000 7f9217fff700 1 --2- 192.168.123.103:0/473356241 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9218071e00 0x7f9218072220 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f9208009a90 tx=0x7f9208009da0 comp rx=0 tx=0).stop 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.452+0000 7f9217fff700 1 -- 192.168.123.103:0/473356241 shutdown_connections 2026-03-08T23:59:01.539 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.452+0000 7f9217fff700 1 
--2- 192.168.123.103:0/473356241 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9218071e00 0x7f9218072220 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.452+0000 7f9217fff700 1 -- 192.168.123.103:0/473356241 >> 192.168.123.103:0/473356241 conn(0x7f921806d320 msgr2=0x7f921806f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.452+0000 7f9217fff700 1 -- 192.168.123.103:0/473356241 shutdown_connections 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.452+0000 7f9217fff700 1 -- 192.168.123.103:0/473356241 wait complete. 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.453+0000 7f9217fff700 1 Processor -- start 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.453+0000 7f9217fff700 1 -- start start 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.453+0000 7f9217fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92181a91f0 0x7f92181a9610 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.453+0000 7f9217fff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f92180727f0 con 0x7f92181a91f0 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.453+0000 7f9216ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92181a91f0 0x7f92181a9610 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.453+0000 7f9216ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92181a91f0 0x7f92181a9610 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60266/0 (socket says 192.168.123.103:60266) 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.453+0000 7f9216ffd700 1 -- 192.168.123.103:0/3787827258 learned_addr learned my addr 192.168.123.103:0/3787827258 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.454+0000 7f9216ffd700 1 -- 192.168.123.103:0/3787827258 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9208009740 con 0x7f92181a91f0 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.454+0000 7f9216ffd700 1 --2- 192.168.123.103:0/3787827258 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92181a91f0 0x7f92181a9610 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f920800bd00 tx=0x7f920800bde0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:01.540 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.459+0000 7f91fffff700 1 -- 192.168.123.103:0/3787827258 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9208003ec0 con 0x7f92181a91f0 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.459+0000 7f9217fff700 1 -- 192.168.123.103:0/3787827258 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f92181a9b50 con 0x7f92181a91f0 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.459+0000 7f9217fff700 1 -- 192.168.123.103:0/3787827258 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f921807aff0 con 0x7f92181a91f0 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.460+0000 7f91fffff700 1 -- 192.168.123.103:0/3787827258 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f92080044c0 con 0x7f92181a91f0 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.460+0000 7f9217fff700 1 -- 192.168.123.103:0/3787827258 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f921804f000 con 0x7f92181a91f0 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.460+0000 7f91fffff700 1 -- 192.168.123.103:0/3787827258 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f920801ac80 con 0x7f92181a91f0 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.460+0000 7f91fffff700 1 -- 192.168.123.103:0/3787827258 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f920802c760 con 0x7f92181a91f0 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.460+0000 7f91fffff700 1 -- 192.168.123.103:0/3787827258 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f9208011990 con 0x7f92181a91f0 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.462+0000 7f91fffff700 1 -- 192.168.123.103:0/3787827258 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7f9208041b10 con 0x7f92181a91f0 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.500+0000 7f9217fff700 1 -- 192.168.123.103:0/3787827258 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7f92180623c0 con 0x7f92181a91f0 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.502+0000 7f91fffff700 1 -- 192.168.123.103:0/3787827258 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7f9208030070 con 0x7f92181a91f0 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.503+0000 7f91fdffb700 1 -- 192.168.123.103:0/3787827258 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92181a91f0 msgr2=0x7f92181a9610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:01.540 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.503+0000 7f91fdffb700 1 --2- 192.168.123.103:0/3787827258 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92181a91f0 0x7f92181a9610 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f920800bd00 tx=0x7f920800bde0 comp rx=0 tx=0).stop 2026-03-08T23:59:01.541 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.505+0000 7f91fdffb700 1 -- 192.168.123.103:0/3787827258 shutdown_connections 2026-03-08T23:59:01.541 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.505+0000 7f91fdffb700 1 --2- 192.168.123.103:0/3787827258 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f92181a91f0 0x7f92181a9610 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:01.541 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.505+0000 7f91fdffb700 1 -- 192.168.123.103:0/3787827258 >> 192.168.123.103:0/3787827258 conn(0x7f921806d320 msgr2=0x7f921806dd00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:01.541 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.505+0000 7f91fdffb700 1 -- 192.168.123.103:0/3787827258 shutdown_connections 2026-03-08T23:59:01.541 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:01.505+0000 7f91fdffb700 1 -- 192.168.123.103:0/3787827258 wait complete. 2026-03-08T23:59:01.541 INFO:teuthology.orchestra.run.vm03.stdout:mgr not available, waiting (2/15)... 2026-03-08T23:59:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:01 vm03 ceph-mon[52346]: from='client.? 
192.168.123.103:0/3787827258' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-08T23:59:03.795 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 2026-03-08T23:59:03.795 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-08T23:59:03.795 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsid": "ae8f0172-1b4a-11f1-916a-712b2ac006b7", 2026-03-08T23:59:03.795 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "health": { 2026-03-08T23:59:03.795 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-08T23:59:03.795 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-08T23:59:03.795 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-08T23:59:03.795 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:59:03.795 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-08T23:59:03.795 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-08T23:59:03.795 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 0 2026-03-08T23:59:03.795 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "vm03" 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_age": 5, 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: 
stdout "num_pools": 0, 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-08T23:59:03.796 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "restful" 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ], 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modified": "2026-03-08T23:58:56.272906+0000", 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }, 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.677+0000 7f0e63fff700 1 Processor -- start 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.677+0000 7f0e63fff700 1 -- start start 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.677+0000 7f0e63fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e64071e00 0x7f0e64072220 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: 
stderr 2026-03-08T23:59:03.677+0000 7f0e63fff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0e640727f0 con 0x7f0e64071e00 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.678+0000 7f0e62ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e64071e00 0x7f0e64072220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.678+0000 7f0e62ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e64071e00 0x7f0e64072220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60282/0 (socket says 192.168.123.103:60282) 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.678+0000 7f0e62ffd700 1 -- 192.168.123.103:0/788364281 learned_addr learned my addr 192.168.123.103:0/788364281 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.678+0000 7f0e62ffd700 1 -- 192.168.123.103:0/788364281 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0e6410ddb0 con 0x7f0e64071e00 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.678+0000 7f0e62ffd700 1 --2- 192.168.123.103:0/788364281 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e64071e00 0x7f0e64072220 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f0e54009cf0 tx=0x7f0e5400b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=764ba3dce6b93678 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.681+0000 7f0e61ffb700 1 -- 192.168.123.103:0/788364281 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0e54004030 con 0x7f0e64071e00 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.681+0000 7f0e61ffb700 1 -- 192.168.123.103:0/788364281 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f0e5400b810 con 0x7f0e64071e00 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.681+0000 7f0e61ffb700 1 -- 192.168.123.103:0/788364281 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0e54003ae0 con 0x7f0e64071e00 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.683+0000 7f0e63fff700 1 -- 192.168.123.103:0/788364281 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e64071e00 msgr2=0x7f0e64072220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.683+0000 7f0e63fff700 1 --2- 192.168.123.103:0/788364281 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e64071e00 0x7f0e64072220 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f0e54009cf0 tx=0x7f0e5400b0e0 comp rx=0 tx=0).stop 2026-03-08T23:59:03.797 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.683+0000 7f0e63fff700 1 -- 192.168.123.103:0/788364281 shutdown_connections 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.683+0000 7f0e63fff700 1 --2- 192.168.123.103:0/788364281 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e64071e00 0x7f0e64072220 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:03.797 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.683+0000 7f0e63fff700 1 -- 192.168.123.103:0/788364281 >> 192.168.123.103:0/788364281 conn(0x7f0e6406d320 msgr2=0x7f0e6406f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.683+0000 7f0e63fff700 1 -- 192.168.123.103:0/788364281 shutdown_connections 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.683+0000 7f0e63fff700 1 -- 192.168.123.103:0/788364281 wait complete. 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.684+0000 7f0e63fff700 1 Processor -- start 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.684+0000 7f0e63fff700 1 -- start start 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.684+0000 7f0e63fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e64071e00 0x7f0e6411d360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.684+0000 7f0e63fff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0e6411d8a0 con 0x7f0e64071e00 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.684+0000 7f0e62ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e64071e00 0x7f0e6411d360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.684+0000 7f0e62ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e64071e00 0x7f0e6411d360 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60294/0 (socket says 192.168.123.103:60294) 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.684+0000 7f0e62ffd700 1 -- 192.168.123.103:0/2760042079 learned_addr learned my addr 192.168.123.103:0/2760042079 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.684+0000 7f0e62ffd700 1 -- 192.168.123.103:0/2760042079 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0e54009740 con 0x7f0e64071e00 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.684+0000 7f0e62ffd700 1 --2- 192.168.123.103:0/2760042079 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e64071e00 0x7f0e6411d360 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f0e54003770 tx=0x7f0e5400bfd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.685+0000 7f0e68817700 1 -- 192.168.123.103:0/2760042079 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0e54004060 con 0x7f0e64071e00 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.685+0000 7f0e63fff700 1 -- 192.168.123.103:0/2760042079 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0e6411daa0 con 0x7f0e64071e00 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.685+0000 7f0e68817700 1 -- 192.168.123.103:0/2760042079 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f0e5401a460 con 0x7f0e64071e00 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.685+0000 7f0e63fff700 1 -- 192.168.123.103:0/2760042079 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0e6411bb80 con 0x7f0e64071e00 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.686+0000 7f0e68817700 1 -- 192.168.123.103:0/2760042079 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0e540041c0 con 0x7f0e64071e00 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.686+0000 7f0e68817700 1 -- 192.168.123.103:0/2760042079 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f0e54004320 con 0x7f0e64071e00 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.686+0000 7f0e68817700 1 -- 192.168.123.103:0/2760042079 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f0e54011df0 con 0x7f0e64071e00 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.687+0000 7f0e63fff700 1 -- 192.168.123.103:0/2760042079 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0e6404f070 con 0x7f0e64071e00 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.688+0000 7f0e68817700 1 -- 192.168.123.103:0/2760042079 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7f0e54028050 con 0x7f0e64071e00 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.726+0000 7f0e63fff700 1 -- 192.168.123.103:0/2760042079 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7f0e640623c0 con 0x7f0e64071e00 2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.727+0000 7f0e68817700 1 -- 192.168.123.103:0/2760042079 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 
v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7f0e5401e030 con 0x7f0e64071e00
2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.730+0000 7f0e63fff700 1 -- 192.168.123.103:0/2760042079 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e64071e00 msgr2=0x7f0e6411d360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:03.798 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.730+0000 7f0e63fff700 1 --2- 192.168.123.103:0/2760042079 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e64071e00 0x7f0e6411d360 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f0e54003770 tx=0x7f0e5400bfd0 comp rx=0 tx=0).stop
2026-03-08T23:59:03.799 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.731+0000 7f0e63fff700 1 -- 192.168.123.103:0/2760042079 shutdown_connections
2026-03-08T23:59:03.799 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.731+0000 7f0e63fff700 1 --2- 192.168.123.103:0/2760042079 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e64071e00 0x7f0e6411d360 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:03.799 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.731+0000 7f0e63fff700 1 -- 192.168.123.103:0/2760042079 >> 192.168.123.103:0/2760042079 conn(0x7f0e6406d320 msgr2=0x7f0e6406deb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:03.799 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.731+0000 7f0e63fff700 1 -- 192.168.123.103:0/2760042079 shutdown_connections
2026-03-08T23:59:03.799 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:03.731+0000 7f0e63fff700 1 -- 192.168.123.103:0/2760042079 wait complete.
2026-03-08T23:59:03.799 INFO:teuthology.orchestra.run.vm03.stdout:mgr not available, waiting (3/15)...
2026-03-08T23:59:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:04 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/2760042079' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
2026-03-08T23:59:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:04 vm03 ceph-mon[52346]: Activating manager daemon vm03.yvcons
2026-03-08T23:59:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:04 vm03 ceph-mon[52346]: mgrmap e2: vm03.yvcons(active, starting, since 0.00293721s)
2026-03-08T23:59:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:04 vm03 ceph-mon[52346]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-08T23:59:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:04 vm03 ceph-mon[52346]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-08T23:59:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:04 vm03 ceph-mon[52346]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-08T23:59:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:04 vm03 ceph-mon[52346]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch
2026-03-08T23:59:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:04 vm03 ceph-mon[52346]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr metadata", "who": "vm03.yvcons", "id": "vm03.yvcons"}]: dispatch
2026-03-08T23:59:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:04 vm03 ceph-mon[52346]: Manager daemon vm03.yvcons is now available
2026-03-08T23:59:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:04 vm03 ceph-mon[52346]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.yvcons/mirror_snapshot_schedule"}]: dispatch
2026-03-08T23:59:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:04 vm03 ceph-mon[52346]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:04 vm03 ceph-mon[52346]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:04 vm03 ceph-mon[52346]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.yvcons/trash_purge_schedule"}]: dispatch
2026-03-08T23:59:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:04 vm03 ceph-mon[52346]: from='mgr.14100 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:06.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:05 vm03 ceph-mon[52346]: mgrmap e3: vm03.yvcons(active, since 1.00755s)
2026-03-08T23:59:06.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout
2026-03-08T23:59:06.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout {
2026-03-08T23:59:06.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsid": "ae8f0172-1b4a-11f1-916a-712b2ac006b7",
2026-03-08T23:59:06.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "health": {
2026-03-08T23:59:06.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK",
2026-03-08T23:59:06.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "checks": {},
2026-03-08T23:59:06.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mutes": []
2026-03-08T23:59:06.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout },
2026-03-08T23:59:06.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "election_epoch": 5,
2026-03-08T23:59:06.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum": [
2026-03-08T23:59:06.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 0
2026-03-08T23:59:06.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ],
2026-03-08T23:59:06.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_names": [
2026-03-08T23:59:06.127 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "vm03"
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ],
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "quorum_age": 7,
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "monmap": {
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef",
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_mons": 1
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout },
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osdmap": {
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_osds": 0,
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_up_osds": 0,
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_up_since": 0,
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_in_osds": 0,
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "osd_in_since": 0,
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout },
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgmap": {
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "pgs_by_state": [],
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pgs": 0,
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_pools": 0,
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_objects": 0,
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "data_bytes": 0,
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_used": 0,
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_avail": 0,
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "bytes_total": 0
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout },
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "fsmap": {
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "by_rank": [],
2026-03-08T23:59:06.128 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "up:standby": 0
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout },
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mgrmap": {
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "available": true,
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_standbys": 0,
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modules": [
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "iostat",
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "nfs",
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "restful"
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ],
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {}
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout },
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "servicemap": {
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "modified": "2026-03-08T23:58:56.272906+0000",
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "services": {}
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout },
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "progress_events": {}
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.928+0000 7f14e58e6700 1 Processor -- start
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.929+0000 7f14e58e6700 1 -- start start
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.930+0000 7f14e58e6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14e0108990 0x7f14e0108db0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.930+0000 7f14e58e6700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f14e0109380 con 0x7f14e0108990
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.930+0000 7f14deffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14e0108990 0x7f14e0108db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.930+0000 7f14deffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14e0108990
0x7f14e0108db0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:54910/0 (socket says 192.168.123.103:54910) 2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.930+0000 7f14deffd700 1 -- 192.168.123.103:0/4001671755 learned_addr learned my addr 192.168.123.103:0/4001671755 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.931+0000 7f14deffd700 1 -- 192.168.123.103:0/4001671755 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f14e0109b90 con 0x7f14e0108990 2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.931+0000 7f14deffd700 1 --2- 192.168.123.103:0/4001671755 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14e0108990 0x7f14e0108db0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f14c8009a90 tx=0x7f14c8009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=3396b9c40aa1b6df server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.931+0000 7f14ddffb700 1 -- 192.168.123.103:0/4001671755 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f14c8004030 con 0x7f14e0108990 2026-03-08T23:59:06.129 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.931+0000 7f14ddffb700 1 -- 192.168.123.103:0/4001671755 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f14c800b7e0 con 0x7f14e0108990 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.931+0000 7f14ddffb700 1 -- 192.168.123.103:0/4001671755 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f14c8003a40 con 0x7f14e0108990 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.932+0000 7f14e58e6700 1 -- 192.168.123.103:0/4001671755 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14e0108990 msgr2=0x7f14e0108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.932+0000 7f14e58e6700 1 --2- 192.168.123.103:0/4001671755 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14e0108990 0x7f14e0108db0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f14c8009a90 tx=0x7f14c8009da0 comp rx=0 tx=0).stop 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.932+0000 7f14e58e6700 1 -- 192.168.123.103:0/4001671755 shutdown_connections 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.932+0000 7f14e58e6700 1 --2- 192.168.123.103:0/4001671755 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14e0108990 0x7f14e0108db0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.932+0000 7f14e58e6700 1 -- 192.168.123.103:0/4001671755 >> 192.168.123.103:0/4001671755 conn(0x7f14e0103f50 msgr2=0x7f14e0106370 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.933+0000 7f14e58e6700 1 -- 192.168.123.103:0/4001671755 shutdown_connections 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.933+0000 7f14e58e6700 1 -- 192.168.123.103:0/4001671755 wait complete. 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.933+0000 7f14e58e6700 1 Processor -- start 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.933+0000 7f14e58e6700 1 -- start start 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.933+0000 7f14e58e6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14e0108990 0x7f14e019c650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.933+0000 7f14e58e6700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f14e019cb90 con 0x7f14e0108990 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.934+0000 7f14deffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14e0108990 0x7f14e019c650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.934+0000 7f14deffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14e0108990 0x7f14e019c650 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:54920/0 (socket says 192.168.123.103:54920) 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.934+0000 7f14deffd700 1 -- 192.168.123.103:0/3052560462 learned_addr learned my addr 192.168.123.103:0/3052560462 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.934+0000 7f14deffd700 1 -- 192.168.123.103:0/3052560462 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f14c8009740 con 0x7f14e0108990 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.934+0000 7f14deffd700 1 --2- 192.168.123.103:0/3052560462 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14e0108990 0x7f14e019c650 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f14c8003710 tx=0x7f14c8003b60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.934+0000 7f14e48e4700 1 -- 192.168.123.103:0/3052560462 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f14c8004130 con 0x7f14e0108990 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.934+0000 7f14e48e4700 1 -- 192.168.123.103:0/3052560462 <== mon.0 v2:192.168.123.103:3300/0 2 ==== 
config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f14c8004290 con 0x7f14e0108990 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.934+0000 7f14e48e4700 1 -- 192.168.123.103:0/3052560462 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f14c8011420 con 0x7f14e0108990 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.934+0000 7f14e58e6700 1 -- 192.168.123.103:0/3052560462 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f14e019cd90 con 0x7f14e0108990 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.935+0000 7f14e58e6700 1 -- 192.168.123.103:0/3052560462 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f14e019d1b0 con 0x7f14e0108990 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.935+0000 7f14e48e4700 1 -- 192.168.123.103:0/3052560462 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 4) v1 ==== 45267+0+0 (secure 0 0 0) 0x7f14c8011580 con 0x7f14e0108990 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.936+0000 7f14e48e4700 1 --2- 192.168.123.103:0/3052560462 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f14cc038330 0x7f14cc03a7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.936+0000 7f14de7fc700 1 --2- 192.168.123.103:0/3052560462 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f14cc038330 0x7f14cc03a7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.936+0000 7f14e48e4700 1 -- 192.168.123.103:0/3052560462 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f14c804c9e0 con 0x7f14e0108990 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.936+0000 7f14de7fc700 1 --2- 192.168.123.103:0/3052560462 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f14cc038330 0x7f14cc03a7f0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f14d0006fd0 tx=0x7f14d0006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.937+0000 7f14e58e6700 1 -- 192.168.123.103:0/3052560462 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f14e00623c0 con 0x7f14e0108990 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:05.940+0000 7f14e48e4700 1 -- 192.168.123.103:0/3052560462 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f14c801e070 con 0x7f14e0108990 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.093+0000 7f14e58e6700 1 -- 192.168.123.103:0/3052560462 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7f14e019fff0 con 0x7f14e0108990 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.094+0000 7f14e48e4700 1 -- 192.168.123.103:0/3052560462 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1240 (secure 0 0 0) 0x7f14c804b090 con 0x7f14e0108990 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.096+0000 7f14e58e6700 1 -- 192.168.123.103:0/3052560462 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f14cc038330 msgr2=0x7f14cc03a7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.096+0000 7f14e58e6700 1 --2- 192.168.123.103:0/3052560462 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f14cc038330 0x7f14cc03a7f0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f14d0006fd0 tx=0x7f14d0006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.096+0000 7f14e58e6700 1 -- 192.168.123.103:0/3052560462 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14e0108990 msgr2=0x7f14e019c650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.097+0000 7f14e58e6700 1 --2- 192.168.123.103:0/3052560462 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14e0108990 0x7f14e019c650 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f14c8003710 tx=0x7f14c8003b60 comp rx=0 tx=0).stop 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.097+0000 7f14e58e6700 1 -- 192.168.123.103:0/3052560462 shutdown_connections 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.097+0000 7f14e58e6700 1 --2- 192.168.123.103:0/3052560462 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f14cc038330 0x7f14cc03a7f0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.097+0000 7f14e58e6700 1 --2- 192.168.123.103:0/3052560462 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14e0108990 0x7f14e019c650 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.097+0000 7f14e58e6700 1 -- 192.168.123.103:0/3052560462 >> 192.168.123.103:0/3052560462 conn(0x7f14e0103f50 msgr2=0x7f14e0104b80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.097+0000 7f14e58e6700 1 -- 192.168.123.103:0/3052560462 shutdown_connections 2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.097+0000 7f14e58e6700 1 -- 192.168.123.103:0/3052560462 wait complete. 
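The block above is cephadm bootstrap's wait-for-mgr loop: each attempt shells out to `ceph status --format json-pretty`, inspects the mgrmap, and retries ("mgr not available, waiting (3/15)..." earlier in the log) until an active mgr reports in. A minimal sketch of that loop, assuming only the stock `ceph` CLI and the Python stdlib (the helper name wait_for_mgr is made up):

    import json
    import subprocess
    import time

    def wait_for_mgr(attempts=15, delay=1.0):
        # Poll `ceph status` until the mgrmap reports an available active mgr,
        # mirroring the "mgr not available, waiting (n/15)..." loop above.
        for i in range(1, attempts + 1):
            out = subprocess.run(
                ["ceph", "status", "--format", "json"],
                check=True, capture_output=True, text=True,
            ).stdout
            if json.loads(out).get("mgrmap", {}).get("available"):
                return True
            print(f"mgr not available, waiting ({i}/{attempts})...")
            time.sleep(delay)
        return False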
2026-03-08T23:59:06.130 INFO:teuthology.orchestra.run.vm03.stdout:mgr is available
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [global]
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout fsid = ae8f0172-1b4a-11f1-916a-712b2ac006b7
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [mgr]
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mgr/telemetry/nag = false
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout [osd]
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_map_max_advance = 10
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.260+0000 7fc0e87a9700 1 Processor -- start
2026-03-08T23:59:06.417 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.261+0000 7fc0e87a9700 1 -- start start
2026-03-08T23:59:06.418 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.261+0000 7fc0e87a9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0e0105650 0x7fc0e0105a70 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:06.418 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.261+0000 7fc0e87a9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc0e0106040 con 0x7fc0e0105650
2026-03-08T23:59:06.418 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.262+0000 7fc0e6545700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0e0105650 0x7fc0e0105a70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:06.418 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.262+0000 7fc0e6545700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0e0105650 0x7fc0e0105a70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am
v2:192.168.123.103:54932/0 (socket says 192.168.123.103:54932) 2026-03-08T23:59:06.418 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.262+0000 7fc0e6545700 1 -- 192.168.123.103:0/117245995 learned_addr learned my addr 192.168.123.103:0/117245995 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:06.418 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.262+0000 7fc0e6545700 1 -- 192.168.123.103:0/117245995 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc0e0106850 con 0x7fc0e0105650 2026-03-08T23:59:06.418 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.262+0000 7fc0e6545700 1 --2- 192.168.123.103:0/117245995 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0e0105650 0x7fc0e0105a70 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fc0d0009a90 tx=0x7fc0d0009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=4cb2ed623b331999 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:06.418 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.263+0000 7fc0e5543700 1 -- 192.168.123.103:0/117245995 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc0d0004030 con 0x7fc0e0105650 2026-03-08T23:59:06.418 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.263+0000 7fc0e5543700 1 -- 192.168.123.103:0/117245995 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fc0d000b7e0 con 0x7fc0e0105650 2026-03-08T23:59:06.418 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.263+0000 7fc0e5543700 1 -- 192.168.123.103:0/117245995 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc0d0003a40 con 0x7fc0e0105650 2026-03-08T23:59:06.418 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.263+0000 7fc0e87a9700 1 -- 192.168.123.103:0/117245995 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0e0105650 msgr2=0x7fc0e0105a70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:06.418 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.263+0000 7fc0e87a9700 1 --2- 192.168.123.103:0/117245995 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0e0105650 0x7fc0e0105a70 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fc0d0009a90 tx=0x7fc0d0009da0 comp rx=0 tx=0).stop 2026-03-08T23:59:06.418 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.264+0000 7fc0e87a9700 1 -- 192.168.123.103:0/117245995 shutdown_connections 2026-03-08T23:59:06.418 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.264+0000 7fc0e87a9700 1 --2- 192.168.123.103:0/117245995 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0e0105650 0x7fc0e0105a70 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:06.418 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.264+0000 7fc0e87a9700 1 -- 192.168.123.103:0/117245995 >> 192.168.123.103:0/117245995 conn(0x7fc0e0100bd0 msgr2=0x7fc0e0103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:06.418 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.264+0000 7fc0e87a9700 1 -- 
192.168.123.103:0/117245995 shutdown_connections 2026-03-08T23:59:06.418 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.264+0000 7fc0e87a9700 1 -- 192.168.123.103:0/117245995 wait complete. 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.265+0000 7fc0e87a9700 1 Processor -- start 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.265+0000 7fc0e87a9700 1 -- start start 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.265+0000 7fc0e87a9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0e019a620 0x7fc0e019aa40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.265+0000 7fc0e87a9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc0e019af80 con 0x7fc0e019a620 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.266+0000 7fc0e6545700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0e019a620 0x7fc0e019aa40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.266+0000 7fc0e6545700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0e019a620 0x7fc0e019aa40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:54936/0 (socket says 192.168.123.103:54936) 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.266+0000 7fc0e6545700 1 -- 192.168.123.103:0/2206763554 learned_addr learned my addr 192.168.123.103:0/2206763554 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.266+0000 7fc0e6545700 1 -- 192.168.123.103:0/2206763554 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc0d0009740 con 0x7fc0e019a620 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.266+0000 7fc0e6545700 1 --2- 192.168.123.103:0/2206763554 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0e019a620 0x7fc0e019aa40 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fc0d0000c00 tx=0x7fc0d0003f90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.267+0000 7fc0d77fe700 1 -- 192.168.123.103:0/2206763554 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc0d000be00 con 0x7fc0e019a620 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.267+0000 7fc0d77fe700 1 -- 192.168.123.103:0/2206763554 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fc0d001b440 con 0x7fc0e019a620 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-08T23:59:06.267+0000 7fc0e87a9700 1 -- 192.168.123.103:0/2206763554 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc0e019b180 con 0x7fc0e019a620 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.267+0000 7fc0e87a9700 1 -- 192.168.123.103:0/2206763554 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc0e019ddd0 con 0x7fc0e019a620 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.268+0000 7fc0d77fe700 1 -- 192.168.123.103:0/2206763554 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc0d000be00 con 0x7fc0e019a620 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.268+0000 7fc0e87a9700 1 -- 192.168.123.103:0/2206763554 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc0e00623c0 con 0x7fc0e019a620 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.271+0000 7fc0d77fe700 1 -- 192.168.123.103:0/2206763554 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 4) v1 ==== 45267+0+0 (secure 0 0 0) 0x7fc0d001b8b0 con 0x7fc0e019a620 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.271+0000 7fc0d77fe700 1 --2- 192.168.123.103:0/2206763554 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc0cc038290 0x7fc0cc03a750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.271+0000 7fc0e5d44700 1 --2- 192.168.123.103:0/2206763554 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc0cc038290 0x7fc0cc03a750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:06.419 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.272+0000 7fc0e5d44700 1 --2- 192.168.123.103:0/2206763554 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc0cc038290 0x7fc0cc03a750 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fc0dc006fd0 tx=0x7fc0dc006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:06.420 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.272+0000 7fc0d77fe700 1 -- 192.168.123.103:0/2206763554 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fc0d0011420 con 0x7fc0e019a620 2026-03-08T23:59:06.420 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.272+0000 7fc0d77fe700 1 -- 192.168.123.103:0/2206763554 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fc0d0011780 con 0x7fc0e019a620 2026-03-08T23:59:06.420 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.377+0000 7fc0e87a9700 1 -- 192.168.123.103:0/2206763554 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7fc0e019df60 con 0x7fc0e019a620 2026-03-08T23:59:06.420 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.381+0000 7fc0d77fe700 1 -- 192.168.123.103:0/2206763554 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v3) v1 ==== 70+0+373 (secure 0 0 0) 0x7fc0d001f020 con 0x7fc0e019a620 2026-03-08T23:59:06.420 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.384+0000 7fc0e87a9700 1 -- 192.168.123.103:0/2206763554 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc0cc038290 msgr2=0x7fc0cc03a750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:06.420 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.384+0000 7fc0e87a9700 1 --2- 192.168.123.103:0/2206763554 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc0cc038290 0x7fc0cc03a750 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fc0dc006fd0 tx=0x7fc0dc006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:06.420 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.384+0000 7fc0e87a9700 1 -- 192.168.123.103:0/2206763554 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0e019a620 msgr2=0x7fc0e019aa40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:06.420 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.384+0000 7fc0e87a9700 1 --2- 192.168.123.103:0/2206763554 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0e019a620 0x7fc0e019aa40 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fc0d0000c00 tx=0x7fc0d0003f90 comp rx=0 tx=0).stop 2026-03-08T23:59:06.420 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.384+0000 7fc0e87a9700 1 -- 192.168.123.103:0/2206763554 shutdown_connections 2026-03-08T23:59:06.420 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.384+0000 7fc0e87a9700 1 --2- 192.168.123.103:0/2206763554 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc0cc038290 0x7fc0cc03a750 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:06.420 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.384+0000 7fc0e87a9700 1 --2- 192.168.123.103:0/2206763554 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc0e019a620 0x7fc0e019aa40 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:06.420 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.384+0000 7fc0e87a9700 1 -- 192.168.123.103:0/2206763554 >> 192.168.123.103:0/2206763554 conn(0x7fc0e0100bd0 msgr2=0x7fc0e0101950 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:06.420 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.385+0000 7fc0e87a9700 1 -- 192.168.123.103:0/2206763554 shutdown_connections 2026-03-08T23:59:06.420 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.385+0000 7fc0e87a9700 1 -- 192.168.123.103:0/2206763554 wait complete. 2026-03-08T23:59:06.420 INFO:teuthology.orchestra.run.vm03.stdout:Enabling cephadm module... 2026-03-08T23:59:06.956 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:06 vm03 ceph-mon[52346]: mgrmap e4: vm03.yvcons(active, since 2s) 2026-03-08T23:59:06.956 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:06 vm03 ceph-mon[52346]: from='client.? 
192.168.123.103:0/3052560462' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-08T23:59:06.956 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:06 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/2206763554' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch 2026-03-08T23:59:06.956 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:06 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/3396798043' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "cephadm"}]: dispatch 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.559+0000 7fa17bade700 1 Processor -- start 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.559+0000 7fa17bade700 1 -- start start 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.559+0000 7fa17bade700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa17407b760 0x7fa17407bb80 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.559+0000 7fa17bade700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa17407c150 con 0x7fa17407b760 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.559+0000 7fa17987a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa17407b760 0x7fa17407bb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.559+0000 7fa17987a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa17407b760 0x7fa17407bb80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:54952/0 (socket says 192.168.123.103:54952) 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.559+0000 7fa17987a700 1 -- 192.168.123.103:0/2735932140 learned_addr learned my addr 192.168.123.103:0/2735932140 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.559+0000 7fa17987a700 1 -- 192.168.123.103:0/2735932140 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa17407c9b0 con 0x7fa17407b760 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.560+0000 7fa17987a700 1 --2- 192.168.123.103:0/2735932140 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa17407b760 0x7fa17407bb80 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fa164009a90 tx=0x7fa164009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=53670189d1916c0a server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.560+0000 7fa178878700 1 -- 192.168.123.103:0/2735932140 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa164004030 con 0x7fa17407b760 
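For context, the `config assimilate-conf` dispatch above is how bootstrap seeds the mon's central config store: the mon ingests every option it recognizes from an INI-style conf and hands back a minimal conf of whatever remains, which is presumably the [global]/[mgr]/[osd] snippet printed earlier after "mgr is available". A rough sketch using the real `ceph config assimilate-conf -i <file>` (the sample option values are just ones visible in this log):

    import subprocess
    import tempfile

    # A couple of options destined for the cluster's central config database.
    conf = "[osd]\nosd_sloppy_crc = true\nosd_map_max_advance = 10\n"

    with tempfile.NamedTemporaryFile("w", suffix=".conf") as f:
        f.write(conf)
        f.flush()
        # Options the mon does not store centrally come back on stdout as a
        # minimal leftover ceph.conf.
        leftover = subprocess.run(
            ["ceph", "config", "assimilate-conf", "-i", f.name],
            check=True, capture_output=True, text=True,
        ).stdout
        print(leftover)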
2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.560+0000 7fa178878700 1 -- 192.168.123.103:0/2735932140 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fa16400b7e0 con 0x7fa17407b760 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.560+0000 7fa178878700 1 -- 192.168.123.103:0/2735932140 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa1640039f0 con 0x7fa17407b760 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.560+0000 7fa17bade700 1 -- 192.168.123.103:0/2735932140 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa17407b760 msgr2=0x7fa17407bb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.560+0000 7fa17bade700 1 --2- 192.168.123.103:0/2735932140 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa17407b760 0x7fa17407bb80 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fa164009a90 tx=0x7fa164009da0 comp rx=0 tx=0).stop 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.560+0000 7fa17bade700 1 -- 192.168.123.103:0/2735932140 shutdown_connections 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.560+0000 7fa17bade700 1 --2- 192.168.123.103:0/2735932140 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa17407b760 0x7fa17407bb80 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.560+0000 7fa17bade700 1 -- 192.168.123.103:0/2735932140 >> 192.168.123.103:0/2735932140 conn(0x7fa174103f50 msgr2=0x7fa174106370 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.561+0000 7fa17bade700 1 -- 192.168.123.103:0/2735932140 shutdown_connections 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.561+0000 7fa17bade700 1 -- 192.168.123.103:0/2735932140 wait complete. 
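Nearly all of the stderr volume in this section is the msgr2 client lifecycle repeating once per `ceph` invocation: .connect, BANNER_CONNECTING, HELLO_CONNECTING (handle_hello / learned_addr), mon_subscribe, READY, then mark_down / .stop / shutdown_connections / "wait complete." at exit. When scanning for a handshake that went wrong, it can help to collapse the noise into per-connection state sequences; a small sketch tuned to exactly this log format, nothing more:

    import re

    # Matches "conn(0xADDR ... s=STATE" fragments like the ones above.
    STATE_RE = re.compile(r"conn\((0x[0-9a-f]+) .*?s=([A-Z_]+)")

    def connection_states(log_lines):
        # Map each conn pointer to the ordered states it was logged in.
        # (Pointers get reused across CLI runs, so treat this as a rough view.)
        states = {}
        for line in log_lines:
            for conn, state in STATE_RE.findall(line):
                states.setdefault(conn, []).append(state)
        return states

A healthy run shows NONE -> BANNER_CONNECTING -> HELLO_CONNECTING -> READY followed by CLOSED; a connection stuck before READY is the one to dig into.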
2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.561+0000 7fa17bade700 1 Processor -- start 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.561+0000 7fa17bade700 1 -- start start 2026-03-08T23:59:06.959 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.561+0000 7fa17bade700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa17407b760 0x7fa1741a0a10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.562+0000 7fa17bade700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa1741a0f50 con 0x7fa17407b760 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.562+0000 7fa17987a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa17407b760 0x7fa1741a0a10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.562+0000 7fa17987a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa17407b760 0x7fa1741a0a10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:54962/0 (socket says 192.168.123.103:54962) 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.562+0000 7fa17987a700 1 -- 192.168.123.103:0/3396798043 learned_addr learned my addr 192.168.123.103:0/3396798043 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.562+0000 7fa17987a700 1 -- 192.168.123.103:0/3396798043 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa164009740 con 0x7fa17407b760 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.562+0000 7fa17987a700 1 --2- 192.168.123.103:0/3396798043 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa17407b760 0x7fa1741a0a10 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fa16400be80 tx=0x7fa16400bf60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.562+0000 7fa16affd700 1 -- 192.168.123.103:0/3396798043 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa164004010 con 0x7fa17407b760 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.562+0000 7fa17bade700 1 -- 192.168.123.103:0/3396798043 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa1741a1150 con 0x7fa17407b760 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.563+0000 7fa17bade700 1 -- 192.168.123.103:0/3396798043 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa1741a1570 con 0x7fa17407b760 2026-03-08T23:59:06.960 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.563+0000 7fa17bade700 1 -- 192.168.123.103:0/3396798043 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa17419a2c0 con 0x7fa17407b760 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.563+0000 7fa16affd700 1 -- 192.168.123.103:0/3396798043 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fa16401a430 con 0x7fa17407b760 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.564+0000 7fa16affd700 1 -- 192.168.123.103:0/3396798043 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa164011420 con 0x7fa17407b760 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.567+0000 7fa16affd700 1 -- 192.168.123.103:0/3396798043 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 4) v1 ==== 45267+0+0 (secure 0 0 0) 0x7fa16401a5a0 con 0x7fa17407b760 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.567+0000 7fa16affd700 1 --2- 192.168.123.103:0/3396798043 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa1600382f0 0x7fa16003a7b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.567+0000 7fa16affd700 1 -- 192.168.123.103:0/3396798043 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fa164030080 con 0x7fa17407b760 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.567+0000 7fa16affd700 1 -- 192.168.123.103:0/3396798043 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fa16404c620 con 0x7fa17407b760 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.567+0000 7fa179079700 1 --2- 192.168.123.103:0/3396798043 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa1600382f0 0x7fa16003a7b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.568+0000 7fa179079700 1 --2- 192.168.123.103:0/3396798043 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa1600382f0 0x7fa16003a7b0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fa170006fd0 tx=0x7fa170006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.698+0000 7fa17bade700 1 -- 192.168.123.103:0/3396798043 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "cephadm"} v 0) v1 -- 0x7fa17402cc70 con 0x7fa17407b760 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.907+0000 7fa16affd700 1 -- 192.168.123.103:0/3396798043 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 5) v1 ==== 45278+0+0 (secure 0 0 0) 
0x7fa1640116e0 con 0x7fa17407b760 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.907+0000 7fa16affd700 1 -- 192.168.123.103:0/3396798043 <== mon.0 v2:192.168.123.103:3300/0 8 ==== mon_command_ack([{"prefix": "mgr module enable", "module": "cephadm"}]=0 v5) v1 ==== 86+0+0 (secure 0 0 0) 0x7fa16404e670 con 0x7fa17407b760 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.913+0000 7fa17bade700 1 -- 192.168.123.103:0/3396798043 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa1600382f0 msgr2=0x7fa16003a7b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.913+0000 7fa17bade700 1 --2- 192.168.123.103:0/3396798043 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa1600382f0 0x7fa16003a7b0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fa170006fd0 tx=0x7fa170006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.913+0000 7fa17bade700 1 -- 192.168.123.103:0/3396798043 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa17407b760 msgr2=0x7fa1741a0a10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.913+0000 7fa17bade700 1 --2- 192.168.123.103:0/3396798043 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa17407b760 0x7fa1741a0a10 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fa16400be80 tx=0x7fa16400bf60 comp rx=0 tx=0).stop 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.913+0000 7fa17bade700 1 -- 192.168.123.103:0/3396798043 shutdown_connections 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.913+0000 7fa17bade700 1 --2- 192.168.123.103:0/3396798043 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa1600382f0 0x7fa16003a7b0 secure :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fa170006fd0 tx=0x7fa170006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.913+0000 7fa17bade700 1 --2- 192.168.123.103:0/3396798043 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa17407b760 0x7fa1741a0a10 secure :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fa16400be80 tx=0x7fa16400bf60 comp rx=0 tx=0).stop 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.913+0000 7fa17bade700 1 -- 192.168.123.103:0/3396798043 >> 192.168.123.103:0/3396798043 conn(0x7fa174103f50 msgr2=0x7fa174106030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.913+0000 7fa17bade700 1 -- 192.168.123.103:0/3396798043 shutdown_connections 2026-03-08T23:59:06.960 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:06.913+0000 7fa17bade700 1 -- 192.168.123.103:0/3396798043 wait complete. 
2026-03-08T23:59:07.317 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-08T23:59:07.317 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 5, 2026-03-08T23:59:07.317 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-08T23:59:07.317 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "active_name": "vm03.yvcons", 2026-03-08T23:59:07.317 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_standby": 0 2026-03-08T23:59:07.317 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-08T23:59:07.317 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.113+0000 7f9d53a63700 1 Processor -- start 2026-03-08T23:59:07.317 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.113+0000 7f9d53a63700 1 -- start start 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.113+0000 7f9d53a63700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d4c071da0 0x7f9d4c0721c0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.113+0000 7f9d53a63700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d4c072700 con 0x7f9d4c071da0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.116+0000 7f9d517ff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d4c071da0 0x7f9d4c0721c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.116+0000 7f9d517ff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d4c071da0 0x7f9d4c0721c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:54988/0 (socket says 192.168.123.103:54988) 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.116+0000 7f9d517ff700 1 -- 192.168.123.103:0/337865600 learned_addr learned my addr 192.168.123.103:0/337865600 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.116+0000 7f9d517ff700 1 -- 192.168.123.103:0/337865600 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9d4c072840 con 0x7f9d4c071da0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.116+0000 7f9d517ff700 1 --2- 192.168.123.103:0/337865600 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d4c071da0 0x7f9d4c0721c0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f9d48009a90 tx=0x7f9d48009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=c67ca81faa1413b1 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.117+0000 7f9d43fff700 1 -- 192.168.123.103:0/337865600 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9d48004030 con 
0x7f9d4c071da0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.117+0000 7f9d43fff700 1 -- 192.168.123.103:0/337865600 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f9d4800b7e0 con 0x7f9d4c071da0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.117+0000 7f9d53a63700 1 -- 192.168.123.103:0/337865600 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d4c071da0 msgr2=0x7f9d4c0721c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.117+0000 7f9d53a63700 1 --2- 192.168.123.103:0/337865600 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d4c071da0 0x7f9d4c0721c0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f9d48009a90 tx=0x7f9d48009da0 comp rx=0 tx=0).stop 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.119+0000 7f9d53a63700 1 -- 192.168.123.103:0/337865600 shutdown_connections 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.119+0000 7f9d53a63700 1 --2- 192.168.123.103:0/337865600 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d4c071da0 0x7f9d4c0721c0 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.119+0000 7f9d53a63700 1 -- 192.168.123.103:0/337865600 >> 192.168.123.103:0/337865600 conn(0x7f9d4c06d400 msgr2=0x7f9d4c06f840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.122+0000 7f9d53a63700 1 -- 192.168.123.103:0/337865600 shutdown_connections 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.122+0000 7f9d53a63700 1 -- 192.168.123.103:0/337865600 wait complete. 
2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.122+0000 7f9d53a63700 1 Processor -- start 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.122+0000 7f9d53a63700 1 -- start start 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.122+0000 7f9d53a63700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d4c071da0 0x7f9d4c1a9100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.122+0000 7f9d53a63700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9d4c072700 con 0x7f9d4c071da0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.123+0000 7f9d517ff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d4c071da0 0x7f9d4c1a9100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.123+0000 7f9d517ff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d4c071da0 0x7f9d4c1a9100 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:54990/0 (socket says 192.168.123.103:54990) 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.123+0000 7f9d517ff700 1 -- 192.168.123.103:0/1650537917 learned_addr learned my addr 192.168.123.103:0/1650537917 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.123+0000 7f9d517ff700 1 -- 192.168.123.103:0/1650537917 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9d48009740 con 0x7f9d4c071da0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.123+0000 7f9d517ff700 1 --2- 192.168.123.103:0/1650537917 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d4c071da0 0x7f9d4c1a9100 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f9d48003ec0 tx=0x7f9d48003fa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.124+0000 7f9d427fc700 1 -- 192.168.123.103:0/1650537917 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9d48004400 con 0x7f9d4c071da0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.124+0000 7f9d53a63700 1 -- 192.168.123.103:0/1650537917 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9d4c1a96a0 con 0x7f9d4c071da0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.124+0000 7f9d53a63700 1 -- 192.168.123.103:0/1650537917 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9d4c1a9ac0 con 0x7f9d4c071da0 2026-03-08T23:59:07.318 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.125+0000 7f9d427fc700 1 -- 192.168.123.103:0/1650537917 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f9d48004560 con 0x7f9d4c071da0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.125+0000 7f9d53a63700 1 -- 192.168.123.103:0/1650537917 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9d4c1a9d70 con 0x7f9d4c071da0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.125+0000 7f9d427fc700 1 -- 192.168.123.103:0/1650537917 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9d48011620 con 0x7f9d4c071da0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.128+0000 7f9d427fc700 1 -- 192.168.123.103:0/1650537917 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 5) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f9d48011840 con 0x7f9d4c071da0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.128+0000 7f9d427fc700 1 --2- 192.168.123.103:0/1650537917 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9d38038430 0x7f9d3803a8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.128+0000 7f9d50ffe700 1 -- 192.168.123.103:0/1650537917 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9d38038430 msgr2=0x7f9d3803a8f0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.128+0000 7f9d50ffe700 1 --2- 192.168.123.103:0/1650537917 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9d38038430 0x7f9d3803a8f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.129+0000 7f9d427fc700 1 -- 192.168.123.103:0/1650537917 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f9d4804d100 con 0x7f9d4c071da0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.129+0000 7f9d427fc700 1 -- 192.168.123.103:0/1650537917 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f9d4801aa60 con 0x7f9d4c071da0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.259+0000 7f9d53a63700 1 -- 192.168.123.103:0/1650537917 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7f9d4c0623c0 con 0x7f9d4c071da0 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.259+0000 7f9d427fc700 1 -- 192.168.123.103:0/1650537917 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mgr stat"}]=0 v5) v1 ==== 56+0+98 (secure 0 0 0) 0x7f9d48029330 con 0x7f9d4c071da0 2026-03-08T23:59:07.318 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.262+0000 7f9d37fff700 1 -- 192.168.123.103:0/1650537917 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9d38038430 msgr2=0x7f9d3803a8f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.262+0000 7f9d37fff700 1 --2- 192.168.123.103:0/1650537917 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9d38038430 0x7f9d3803a8f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.262+0000 7f9d37fff700 1 -- 192.168.123.103:0/1650537917 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d4c071da0 msgr2=0x7f9d4c1a9100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.262+0000 7f9d37fff700 1 --2- 192.168.123.103:0/1650537917 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d4c071da0 0x7f9d4c1a9100 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f9d48003ec0 tx=0x7f9d48003fa0 comp rx=0 tx=0).stop 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.263+0000 7f9d37fff700 1 -- 192.168.123.103:0/1650537917 shutdown_connections 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.263+0000 7f9d37fff700 1 --2- 192.168.123.103:0/1650537917 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9d38038430 0x7f9d3803a8f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.263+0000 7f9d37fff700 1 --2- 192.168.123.103:0/1650537917 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9d4c071da0 0x7f9d4c1a9100 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.263+0000 7f9d37fff700 1 -- 192.168.123.103:0/1650537917 >> 192.168.123.103:0/1650537917 conn(0x7f9d4c06d400 msgr2=0x7f9d4c06e080 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.263+0000 7f9d37fff700 1 -- 192.168.123.103:0/1650537917 shutdown_connections 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.263+0000 7f9d37fff700 1 -- 192.168.123.103:0/1650537917 wait complete. 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for the mgr to restart... 2026-03-08T23:59:07.318 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mgr epoch 5... 2026-03-08T23:59:08.312 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:07 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/3396798043' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "cephadm"}]': finished 2026-03-08T23:59:08.312 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:07 vm03 ceph-mon[52346]: mgrmap e5: vm03.yvcons(active, since 3s) 2026-03-08T23:59:08.312 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:07 vm03 ceph-mon[52346]: from='client.? 
192.168.123.103:0/1650537917' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch 2026-03-08T23:59:12.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:11 vm03 ceph-mon[52346]: Active manager daemon vm03.yvcons restarted 2026-03-08T23:59:12.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:11 vm03 ceph-mon[52346]: Activating manager daemon vm03.yvcons 2026-03-08T23:59:12.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:11 vm03 ceph-mon[52346]: osdmap e2: 0 total, 0 up, 0 in 2026-03-08T23:59:12.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:11 vm03 ceph-mon[52346]: mgrmap e6: vm03.yvcons(active, starting, since 0.00513079s) 2026-03-08T23:59:12.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:11 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-08T23:59:12.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:11 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr metadata", "who": "vm03.yvcons", "id": "vm03.yvcons"}]: dispatch 2026-03-08T23:59:12.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:11 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-08T23:59:12.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:11 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-08T23:59:12.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:11 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-08T23:59:12.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:11 vm03 ceph-mon[52346]: Manager daemon vm03.yvcons is now available 2026-03-08T23:59:12.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:11 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:12.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:11 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:12.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:11 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-08T23:59:12.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:11 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-08T23:59:12.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:11 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout { 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 7, 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "initialized": true 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout } 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.453+0000 7f0a4ffff700 1 Processor -- start 2026-03-08T23:59:12.716 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.453+0000 7f0a4ffff700 1 -- start start 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.453+0000 7f0a4ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0a50071e00 0x7f0a50072220 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.453+0000 7f0a4ffff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0a500727f0 con 0x7f0a50071e00 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.454+0000 7f0a4effd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0a50071e00 0x7f0a50072220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.454+0000 7f0a4effd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0a50071e00 0x7f0a50072220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:54996/0 (socket says 192.168.123.103:54996) 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.454+0000 7f0a4effd700 1 -- 192.168.123.103:0/2001263351 learned_addr learned my addr 192.168.123.103:0/2001263351 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.454+0000 7f0a4effd700 1 -- 192.168.123.103:0/2001263351 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0a5010ddb0 con 0x7f0a50071e00 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.454+0000 7f0a4effd700 1 --2- 192.168.123.103:0/2001263351 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0a50071e00 0x7f0a50072220 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f0a40009a90 tx=0x7f0a40009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=755af105f5a17e5b server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.455+0000 7f0a4dffb700 1 -- 192.168.123.103:0/2001263351 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0a40004030 con 0x7f0a50071e00 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.455+0000 7f0a4dffb700 1 -- 192.168.123.103:0/2001263351 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f0a4000b7e0 con 0x7f0a50071e00 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.455+0000 7f0a4dffb700 1 -- 192.168.123.103:0/2001263351 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0a400039f0 con 0x7f0a50071e00 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.455+0000 7f0a4ffff700 1 -- 192.168.123.103:0/2001263351 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0a50071e00 msgr2=0x7f0a50072220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.455+0000 7f0a4ffff700 1 --2- 192.168.123.103:0/2001263351 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0a50071e00 0x7f0a50072220 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f0a40009a90 tx=0x7f0a40009da0 comp rx=0 tx=0).stop 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.455+0000 7f0a4ffff700 1 -- 192.168.123.103:0/2001263351 shutdown_connections 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.455+0000 7f0a4ffff700 1 --2- 192.168.123.103:0/2001263351 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0a50071e00 0x7f0a50072220 secure :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f0a40009a90 tx=0x7f0a40009da0 comp rx=0 tx=0).stop 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.455+0000 7f0a4ffff700 1 -- 192.168.123.103:0/2001263351 >> 192.168.123.103:0/2001263351 conn(0x7f0a5006d320 msgr2=0x7f0a5006f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.456+0000 7f0a4ffff700 1 -- 192.168.123.103:0/2001263351 shutdown_connections 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.456+0000 7f0a4ffff700 1 -- 192.168.123.103:0/2001263351 wait complete. 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.456+0000 7f0a4ffff700 1 Processor -- start 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.456+0000 7f0a4ffff700 1 -- start start 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.456+0000 7f0a4ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0a501a9240 0x7f0a501a9660 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.456+0000 7f0a4ffff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0a501a9ba0 con 0x7f0a501a9240 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.457+0000 7f0a4effd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0a501a9240 0x7f0a501a9660 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:12.716 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.457+0000 7f0a4effd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0a501a9240 0x7f0a501a9660 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55004/0 (socket says 192.168.123.103:55004) 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.457+0000 7f0a4effd700 1 -- 192.168.123.103:0/3378880091 learned_addr 
learned my addr 192.168.123.103:0/3378880091 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.457+0000 7f0a4effd700 1 -- 192.168.123.103:0/3378880091 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0a40009740 con 0x7f0a501a9240 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.457+0000 7f0a4effd700 1 --2- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0a501a9240 0x7f0a501a9660 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f0a50072aa0 tx=0x7f0a4000bf30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.459+0000 7f0a54a04700 1 -- 192.168.123.103:0/3378880091 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0a40003fa0 con 0x7f0a501a9240 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.459+0000 7f0a4ffff700 1 -- 192.168.123.103:0/3378880091 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0a501a9da0 con 0x7f0a501a9240 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.459+0000 7f0a4ffff700 1 -- 192.168.123.103:0/3378880091 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0a5007b1c0 con 0x7f0a501a9240 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.459+0000 7f0a54a04700 1 -- 192.168.123.103:0/3378880091 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f0a400045a0 con 0x7f0a501a9240 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.460+0000 7f0a54a04700 1 -- 192.168.123.103:0/3378880091 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0a4001b440 con 0x7f0a501a9240 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.460+0000 7f0a54a04700 1 -- 192.168.123.103:0/3378880091 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 5) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f0a4001b5a0 con 0x7f0a501a9240 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.460+0000 7f0a54a04700 1 --2- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 0x7f0a3803a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.460+0000 7f0a54a04700 1 -- 192.168.123.103:0/3378880091 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f0a3803afc0 con 0x7f0a380383f0 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.460+0000 7f0a54a04700 1 -- 192.168.123.103:0/3378880091 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f0a4004c150 con 0x7f0a501a9240 2026-03-08T23:59:12.717 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.460+0000 7f0a4e7fc700 1 -- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 msgr2=0x7f0a3803a8b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.460+0000 7f0a4e7fc700 1 --2- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 0x7f0a3803a8b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.660+0000 7f0a4e7fc700 1 -- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 msgr2=0x7f0a3803a8b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:07.660+0000 7f0a4e7fc700 1 --2- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 0x7f0a3803a8b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:08.060+0000 7f0a4e7fc700 1 -- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 msgr2=0x7f0a3803a8b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:08.060+0000 7f0a4e7fc700 1 --2- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 0x7f0a3803a8b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.800000 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:08.861+0000 7f0a4e7fc700 1 -- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 msgr2=0x7f0a3803a8b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:08.861+0000 7f0a4e7fc700 1 --2- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 0x7f0a3803a8b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:10.463+0000 7f0a4e7fc700 1 -- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 msgr2=0x7f0a3803a8b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:10.463+0000 7f0a4e7fc700 1 --2- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 0x7f0a3803a8b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._fault waiting 3.200000 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:11.667+0000 7f0a54a04700 1 -- 192.168.123.103:0/3378880091 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mgrmap(e 6) v1 ==== 45045+0+0 (secure 0 0 0) 0x7f0a4001b850 con 0x7f0a501a9240 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:11.667+0000 7f0a54a04700 1 -- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 msgr2=0x7f0a3803a8b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:11.667+0000 7f0a54a04700 1 --2- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 0x7f0a3803a8b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.669+0000 7f0a54a04700 1 -- 192.168.123.103:0/3378880091 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 7) v1 ==== 45172+0+0 (secure 0 0 0) 0x7f0a40016cd0 con 0x7f0a501a9240 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.669+0000 7f0a54a04700 1 --2- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 0x7f0a3803a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.669+0000 7f0a54a04700 1 -- 192.168.123.103:0/3378880091 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f0a3803afc0 con 0x7f0a380383f0 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.673+0000 7f0a4e7fc700 1 --2- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 0x7f0a3803a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.673+0000 7f0a4e7fc700 1 --2- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 0x7f0a3803a8b0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f0a44003a10 tx=0x7f0a440092b0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.674+0000 7f0a54a04700 1 -- 192.168.123.103:0/3378880091 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== command_reply(tid 0: 0 ) v1 ==== 8+0+6910 (secure 0 0 0) 0x7f0a3803afc0 con 0x7f0a380383f0 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.677+0000 7f0a4ffff700 1 -- 192.168.123.103:0/3378880091 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 0x7f0a501aa5c0 con 0x7f0a380383f0 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.677+0000 7f0a54a04700 1 -- 192.168.123.103:0/3378880091 <== mgr.14120 
v2:192.168.123.103:6800/2 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+51 (secure 0 0 0) 0x7f0a501aa5c0 con 0x7f0a380383f0 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.678+0000 7f0a4ffff700 1 -- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 msgr2=0x7f0a3803a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.678+0000 7f0a4ffff700 1 --2- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 0x7f0a3803a8b0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7f0a44003a10 tx=0x7f0a440092b0 comp rx=0 tx=0).stop 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.678+0000 7f0a4ffff700 1 -- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0a501a9240 msgr2=0x7f0a501a9660 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.678+0000 7f0a4ffff700 1 --2- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0a501a9240 0x7f0a501a9660 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f0a50072aa0 tx=0x7f0a4000bf30 comp rx=0 tx=0).stop 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.678+0000 7f0a4ffff700 1 -- 192.168.123.103:0/3378880091 shutdown_connections 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.678+0000 7f0a4ffff700 1 --2- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0a380383f0 0x7f0a3803a8b0 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.678+0000 7f0a4ffff700 1 --2- 192.168.123.103:0/3378880091 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0a501a9240 0x7f0a501a9660 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.678+0000 7f0a4ffff700 1 -- 192.168.123.103:0/3378880091 >> 192.168.123.103:0/3378880091 conn(0x7f0a5006d320 msgr2=0x7f0a5006dee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.678+0000 7f0a4ffff700 1 -- 192.168.123.103:0/3378880091 shutdown_connections 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.678+0000 7f0a4ffff700 1 -- 192.168.123.103:0/3378880091 wait complete. 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:mgr epoch 5 is available 2026-03-08T23:59:12.717 INFO:teuthology.orchestra.run.vm03.stdout:Setting orchestrator backend to cephadm... 2026-03-08T23:59:12.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:12 vm03 ceph-mon[52346]: Found migration_current of "None". Setting to last migration. 
2026-03-08T23:59:12.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:12 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.yvcons/mirror_snapshot_schedule"}]: dispatch 2026-03-08T23:59:12.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:12 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.yvcons/trash_purge_schedule"}]: dispatch 2026-03-08T23:59:12.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:12 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:12.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:12 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:12.963 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:12 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-08T23:59:12.964 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:12 vm03 ceph-mon[52346]: mgrmap e7: vm03.yvcons(active, since 1.00785s) 2026-03-08T23:59:13.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.841+0000 7f8d67461700 1 Processor -- start 2026-03-08T23:59:13.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.842+0000 7f8d67461700 1 -- start start 2026-03-08T23:59:13.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.842+0000 7f8d67461700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d60108970 0x7f8d60108d90 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:13.032 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.842+0000 7f8d67461700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8d60109360 con 0x7f8d60108970 2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.842+0000 7f8d651fd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d60108970 0x7f8d60108d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.842+0000 7f8d651fd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d60108970 0x7f8d60108d90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55052/0 (socket says 192.168.123.103:55052) 2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.842+0000 7f8d651fd700 1 -- 192.168.123.103:0/4100362947 learned_addr learned my addr 192.168.123.103:0/4100362947 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.842+0000 7f8d651fd700 1 -- 192.168.123.103:0/4100362947 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8d60109b70 con 0x7f8d60108970 2026-03-08T23:59:13.033 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.843+0000 7f8d651fd700 1 --2- 192.168.123.103:0/4100362947 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d60108970 0x7f8d60108d90 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f8d50009a90 tx=0x7f8d50009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=584f3fc7148ca5b4 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.843+0000 7f8d57fff700 1 -- 192.168.123.103:0/4100362947 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8d50004030 con 0x7f8d60108970 2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.843+0000 7f8d57fff700 1 -- 192.168.123.103:0/4100362947 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f8d5000b7e0 con 0x7f8d60108970 2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.843+0000 7f8d67461700 1 -- 192.168.123.103:0/4100362947 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d60108970 msgr2=0x7f8d60108d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.843+0000 7f8d67461700 1 --2- 192.168.123.103:0/4100362947 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d60108970 0x7f8d60108d90 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f8d50009a90 tx=0x7f8d50009da0 comp rx=0 tx=0).stop 2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.844+0000 7f8d67461700 1 -- 192.168.123.103:0/4100362947 shutdown_connections 2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.844+0000 7f8d67461700 1 --2- 192.168.123.103:0/4100362947 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d60108970 0x7f8d60108d90 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.844+0000 7f8d67461700 1 -- 192.168.123.103:0/4100362947 >> 192.168.123.103:0/4100362947 conn(0x7f8d6007be30 msgr2=0x7f8d601064e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.844+0000 7f8d67461700 1 -- 192.168.123.103:0/4100362947 shutdown_connections 2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.844+0000 7f8d67461700 1 -- 192.168.123.103:0/4100362947 wait complete. 
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.844+0000 7f8d67461700 1 Processor -- start
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.844+0000 7f8d67461700 1 -- start start
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.845+0000 7f8d67461700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d60108970 0x7f8d6019c6a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.845+0000 7f8d651fd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d60108970 0x7f8d6019c6a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.845+0000 7f8d651fd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d60108970 0x7f8d6019c6a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55054/0 (socket says 192.168.123.103:55054)
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.845+0000 7f8d651fd700 1 -- 192.168.123.103:0/3521161025 learned_addr learned my addr 192.168.123.103:0/3521161025 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.845+0000 7f8d67461700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8d60109360 con 0x7f8d60108970
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.845+0000 7f8d651fd700 1 -- 192.168.123.103:0/3521161025 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8d50009740 con 0x7f8d60108970
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.845+0000 7f8d651fd700 1 --2- 192.168.123.103:0/3521161025 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d60108970 0x7f8d6019c6a0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f8d50003ec0 tx=0x7f8d50003fa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.845+0000 7f8d567fc700 1 -- 192.168.123.103:0/3521161025 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8d50004400 con 0x7f8d60108970
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.845+0000 7f8d567fc700 1 -- 192.168.123.103:0/3521161025 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f8d50004560 con 0x7f8d60108970
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.845+0000 7f8d67461700 1 -- 192.168.123.103:0/3521161025 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8d6019cbe0 con 0x7f8d60108970
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.845+0000 7f8d567fc700 1 -- 192.168.123.103:0/3521161025 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8d50011620 con 0x7f8d60108970
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.846+0000 7f8d67461700 1 -- 192.168.123.103:0/3521161025 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8d6019d000 con 0x7f8d60108970
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.846+0000 7f8d567fc700 1 -- 192.168.123.103:0/3521161025 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 45172+0+0 (secure 0 0 0) 0x7f8d50011890 con 0x7f8d60108970
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.846+0000 7f8d567fc700 1 --2- 192.168.123.103:0/3521161025 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8d4c038270 0x7f8d4c03a730 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.846+0000 7f8d567fc700 1 -- 192.168.123.103:0/3521161025 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f8d50052580 con 0x7f8d60108970
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.847+0000 7f8d649fc700 1 --2- 192.168.123.103:0/3521161025 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8d4c038270 0x7f8d4c03a730 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.847+0000 7f8d649fc700 1 --2- 192.168.123.103:0/3521161025 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8d4c038270 0x7f8d4c03a730 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f8d5c006fd0 tx=0x7f8d5c006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.847+0000 7f8d67461700 1 -- 192.168.123.103:0/3521161025 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8d6004fa20 con 0x7f8d60108970
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.850+0000 7f8d567fc700 1 -- 192.168.123.103:0/3521161025 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f8d50053050 con 0x7f8d60108970
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.962+0000 7f8d67461700 1 -- 192.168.123.103:0/3521161025 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}) v1 -- 0x7f8d6019fc40 con 0x7f8d4c038270
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.974+0000 7f8d567fc700 1 -- 192.168.123.103:0/3521161025 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f8d6019fc40 con 0x7f8d4c038270
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.977+0000 7f8d67461700 1 -- 192.168.123.103:0/3521161025 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8d4c038270 msgr2=0x7f8d4c03a730 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.977+0000 7f8d67461700 1 --2- 192.168.123.103:0/3521161025 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8d4c038270 0x7f8d4c03a730 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f8d5c006fd0 tx=0x7f8d5c006e40 comp rx=0 tx=0).stop
2026-03-08T23:59:13.033 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.977+0000 7f8d67461700 1 -- 192.168.123.103:0/3521161025 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d60108970 msgr2=0x7f8d6019c6a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:13.034 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.977+0000 7f8d67461700 1 --2- 192.168.123.103:0/3521161025 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d60108970 0x7f8d6019c6a0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f8d50003ec0 tx=0x7f8d50003fa0 comp rx=0 tx=0).stop
2026-03-08T23:59:13.034 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.978+0000 7f8d67461700 1 -- 192.168.123.103:0/3521161025 shutdown_connections
2026-03-08T23:59:13.034 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.978+0000 7f8d67461700 1 --2- 192.168.123.103:0/3521161025 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8d4c038270 0x7f8d4c03a730 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:13.034 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.978+0000 7f8d67461700 1 --2- 192.168.123.103:0/3521161025 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d60108970 0x7f8d6019c6a0 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:13.034 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.978+0000 7f8d67461700 1 -- 192.168.123.103:0/3521161025 >> 192.168.123.103:0/3521161025 conn(0x7f8d6007be30 msgr2=0x7f8d60105de0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:13.034 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.978+0000 7f8d67461700 1 -- 192.168.123.103:0/3521161025 shutdown_connections
2026-03-08T23:59:13.034 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:12.978+0000 7f8d67461700 1 -- 192.168.123.103:0/3521161025 wait complete.
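NOTE: Each "/usr/bin/ceph: stderr" block in this part of the log is one short-lived ceph CLI invocation made by the cephadm task, with client-side messenger debugging enabled: the client connects to mon.0 at v2:192.168.123.103:3300, subscribes to config/monmap/mgrmap/osdmap, fetches get_command_descriptions, forwards a single command to the active mgr (mgr.14120) as a mgr_command, and tears every connection down ("wait complete."). The mgr_command payload identifies the bootstrap step; the one above selects the orchestrator backend. A shell sketch of the equivalent call (issued here by teuthology, not typed by hand):

  # Select the cephadm mgr module as the orchestrator backend
  # ({"prefix": "orch set backend", "module_name": "cephadm"} in the payload above).
  ceph orch set backend cephadm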
2026-03-08T23:59:13.338 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout value unchanged
2026-03-08T23:59:13.338 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.162+0000 7fb8190ee700 1 Processor -- start
2026-03-08T23:59:13.339 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.162+0000 7fb8190ee700 1 -- start start
2026-03-08T23:59:13.339 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.162+0000 7fb8190ee700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb814108990 0x7fb814108db0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:13.339 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.162+0000 7fb8190ee700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb814109380 con 0x7fb814108990
2026-03-08T23:59:13.339 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.163+0000 7fb812d9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb814108990 0x7fb814108db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:13.339 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.163+0000 7fb812d9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb814108990 0x7fb814108db0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55062/0 (socket says 192.168.123.103:55062)
2026-03-08T23:59:13.339 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.163+0000 7fb812d9d700 1 -- 192.168.123.103:0/2208545191 learned_addr learned my addr 192.168.123.103:0/2208545191 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-08T23:59:13.339 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.163+0000 7fb812d9d700 1 -- 192.168.123.103:0/2208545191 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb814109b90 con 0x7fb814108990
2026-03-08T23:59:13.339 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.163+0000 7fb812d9d700 1 --2- 192.168.123.103:0/2208545191 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb814108990 0x7fb814108db0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fb7fc009a90 tx=0x7fb7fc009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=736a96bb67f34355 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:13.339 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.163+0000 7fb811d9b700 1 -- 192.168.123.103:0/2208545191 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb7fc004030 con 0x7fb814108990
2026-03-08T23:59:13.339 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.163+0000 7fb811d9b700 1 -- 192.168.123.103:0/2208545191 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fb7fc00b7e0 con 0x7fb814108990
2026-03-08T23:59:13.339 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.163+0000 7fb811d9b700 1 -- 192.168.123.103:0/2208545191 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb7fc003a40 con 0x7fb814108990
2026-03-08T23:59:13.339 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.164+0000 7fb8190ee700 1 -- 192.168.123.103:0/2208545191 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb814108990 msgr2=0x7fb814108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:13.339 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.164+0000 7fb8190ee700 1 --2- 192.168.123.103:0/2208545191 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb814108990 0x7fb814108db0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7fb7fc009a90 tx=0x7fb7fc009da0 comp rx=0 tx=0).stop
2026-03-08T23:59:13.339 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.164+0000 7fb8190ee700 1 -- 192.168.123.103:0/2208545191 shutdown_connections
2026-03-08T23:59:13.339 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.164+0000 7fb8190ee700 1 --2- 192.168.123.103:0/2208545191 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb814108990 0x7fb814108db0 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:13.339 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.164+0000 7fb8190ee700 1 -- 192.168.123.103:0/2208545191 >> 192.168.123.103:0/2208545191 conn(0x7fb814103f50 msgr2=0x7fb814106370 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.164+0000 7fb8190ee700 1 -- 192.168.123.103:0/2208545191 shutdown_connections
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.164+0000 7fb8190ee700 1 -- 192.168.123.103:0/2208545191 wait complete.
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.165+0000 7fb8190ee700 1 Processor -- start
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.165+0000 7fb8190ee700 1 -- start start
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.165+0000 7fb8190ee700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb814108990 0x7fb81419c650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.165+0000 7fb8190ee700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb81419cb90 con 0x7fb814108990
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.165+0000 7fb812d9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb814108990 0x7fb81419c650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.165+0000 7fb812d9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb814108990 0x7fb81419c650 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55068/0 (socket says 192.168.123.103:55068)
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.165+0000 7fb812d9d700 1 -- 192.168.123.103:0/2962734930 learned_addr learned my addr 192.168.123.103:0/2962734930 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.166+0000 7fb812d9d700 1 -- 192.168.123.103:0/2962734930 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb7fc009740 con 0x7fb814108990
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.166+0000 7fb812d9d700 1 --2- 192.168.123.103:0/2962734930 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb814108990 0x7fb81419c650 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fb7fc003710 tx=0x7fb7fc003b60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.167+0000 7fb80bfff700 1 -- 192.168.123.103:0/2962734930 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb7fc004130 con 0x7fb814108990
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.167+0000 7fb80bfff700 1 -- 192.168.123.103:0/2962734930 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fb7fc004290 con 0x7fb814108990
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.167+0000 7fb80bfff700 1 -- 192.168.123.103:0/2962734930 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb7fc011420 con 0x7fb814108990
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.167+0000 7fb8190ee700 1 -- 192.168.123.103:0/2962734930 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb81419cd90 con 0x7fb814108990
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.167+0000 7fb8190ee700 1 -- 192.168.123.103:0/2962734930 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb81419d1b0 con 0x7fb814108990
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.169+0000 7fb80bfff700 1 -- 192.168.123.103:0/2962734930 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 45172+0+0 (secure 0 0 0) 0x7fb7fc011580 con 0x7fb814108990
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.170+0000 7fb80bfff700 1 --2- 192.168.123.103:0/2962734930 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb8000382d0 0x7fb80003a790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.170+0000 7fb8190ee700 1 -- 192.168.123.103:0/2962734930 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb8140623c0 con 0x7fb814108990
2026-03-08T23:59:13.340 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.170+0000 7fb80bfff700 1 -- 192.168.123.103:0/2962734930 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fb7fc04c870 con 0x7fb814108990
2026-03-08T23:59:13.341 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.173+0000 7fb81259c700 1 --2- 192.168.123.103:0/2962734930 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb8000382d0 0x7fb80003a790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:13.341 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.173+0000 7fb81259c700 1 --2- 192.168.123.103:0/2962734930 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb8000382d0 0x7fb80003a790 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fb804006fd0 tx=0x7fb804006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:13.341 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.173+0000 7fb80bfff700 1 -- 192.168.123.103:0/2962734930 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb7fc010970 con 0x7fb814108990
2026-03-08T23:59:13.341 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.278+0000 7fb8190ee700 1 -- 192.168.123.103:0/2962734930 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}) v1 -- 0x7fb814105290 con 0x7fb8000382d0
2026-03-08T23:59:13.341 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.279+0000 7fb80bfff700 1 -- 192.168.123.103:0/2962734930 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+16 (secure 0 0 0) 0x7fb814105290 con 0x7fb8000382d0
2026-03-08T23:59:13.341 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.282+0000 7fb8190ee700 1 -- 192.168.123.103:0/2962734930 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb8000382d0 msgr2=0x7fb80003a790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:13.341 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.283+0000 7fb8190ee700 1 --2- 192.168.123.103:0/2962734930 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb8000382d0 0x7fb80003a790 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fb804006fd0 tx=0x7fb804006e40 comp rx=0 tx=0).stop
2026-03-08T23:59:13.341 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.283+0000 7fb8190ee700 1 -- 192.168.123.103:0/2962734930 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb814108990 msgr2=0x7fb81419c650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:13.341 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.283+0000 7fb8190ee700 1 --2- 192.168.123.103:0/2962734930 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb814108990 0x7fb81419c650 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fb7fc003710 tx=0x7fb7fc003b60 comp rx=0 tx=0).stop
2026-03-08T23:59:13.341 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.283+0000 7fb8190ee700 1 -- 192.168.123.103:0/2962734930 shutdown_connections
2026-03-08T23:59:13.341 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.284+0000 7fb8190ee700 1 --2- 192.168.123.103:0/2962734930 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb8000382d0 0x7fb80003a790 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:13.341 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.284+0000 7fb8190ee700 1 --2- 192.168.123.103:0/2962734930 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb814108990 0x7fb81419c650 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:13.341 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.284+0000 7fb8190ee700 1 -- 192.168.123.103:0/2962734930 >> 192.168.123.103:0/2962734930 conn(0x7fb814103f50 msgr2=0x7fb814104b80 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:13.341 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.284+0000 7fb8190ee700 1 -- 192.168.123.103:0/2962734930 shutdown_connections
2026-03-08T23:59:13.341 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.284+0000 7fb8190ee700 1 -- 192.168.123.103:0/2962734930 wait complete.
2026-03-08T23:59:13.341 INFO:teuthology.orchestra.run.vm03.stdout:Generating ssh key...
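NOTE: Same handshake, next bootstrap step: this invocation carries {"prefix": "cephadm set-user", "user": "root"}. The "value unchanged" stdout at the top of this block matches the 16-byte mgr_command_reply payload (8+0+16), i.e. the SSH user was presumably already root. A sketch of the equivalent CLI call:

  # Set the user cephadm will SSH in as when managing cluster hosts.
  ceph cephadm set-user root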
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.466+0000 7f3c5fe3c700 1 Processor -- start
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.466+0000 7f3c5fe3c700 1 -- start start
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.467+0000 7f3c5fe3c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3c58105650 0x7f3c58105a70 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.467+0000 7f3c5fe3c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3c58106040 con 0x7f3c58105650
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.467+0000 7f3c5dbd8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3c58105650 0x7f3c58105a70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.467+0000 7f3c5dbd8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3c58105650 0x7f3c58105a70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55082/0 (socket says 192.168.123.103:55082)
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.467+0000 7f3c5dbd8700 1 -- 192.168.123.103:0/1580947621 learned_addr learned my addr 192.168.123.103:0/1580947621 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.468+0000 7f3c5dbd8700 1 -- 192.168.123.103:0/1580947621 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3c58106850 con 0x7f3c58105650
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.468+0000 7f3c5dbd8700 1 --2- 192.168.123.103:0/1580947621 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3c58105650 0x7f3c58105a70 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f3c4c009a90 tx=0x7f3c4c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=70755e0d110f925b server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.468+0000 7f3c5cbd6700 1 -- 192.168.123.103:0/1580947621 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3c4c004030 con 0x7f3c58105650
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.468+0000 7f3c5cbd6700 1 -- 192.168.123.103:0/1580947621 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f3c4c00b7e0 con 0x7f3c58105650
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.469+0000 7f3c5cbd6700 1 -- 192.168.123.103:0/1580947621 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3c4c003a40 con 0x7f3c58105650
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.469+0000 7f3c5fe3c700 1 -- 192.168.123.103:0/1580947621 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3c58105650 msgr2=0x7f3c58105a70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.469+0000 7f3c5fe3c700 1 --2- 192.168.123.103:0/1580947621 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3c58105650 0x7f3c58105a70 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f3c4c009a90 tx=0x7f3c4c009da0 comp rx=0 tx=0).stop
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.469+0000 7f3c5fe3c700 1 -- 192.168.123.103:0/1580947621 shutdown_connections
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.469+0000 7f3c5fe3c700 1 --2- 192.168.123.103:0/1580947621 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3c58105650 0x7f3c58105a70 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.469+0000 7f3c5fe3c700 1 -- 192.168.123.103:0/1580947621 >> 192.168.123.103:0/1580947621 conn(0x7f3c58100bd0 msgr2=0x7f3c58103030 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.470+0000 7f3c5fe3c700 1 -- 192.168.123.103:0/1580947621 shutdown_connections
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.470+0000 7f3c5fe3c700 1 -- 192.168.123.103:0/1580947621 wait complete.
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.470+0000 7f3c5fe3c700 1 Processor -- start
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.470+0000 7f3c5fe3c700 1 -- start start
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.470+0000 7f3c5fe3c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3c58105650 0x7f3c5819c790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.470+0000 7f3c5fe3c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3c5819ccd0 con 0x7f3c58105650
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.471+0000 7f3c5dbd8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3c58105650 0x7f3c5819c790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.471+0000 7f3c5dbd8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3c58105650 0x7f3c5819c790 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55092/0 (socket says 192.168.123.103:55092)
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.471+0000 7f3c5dbd8700 1 -- 192.168.123.103:0/60055755 learned_addr learned my addr 192.168.123.103:0/60055755 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.471+0000 7f3c5dbd8700 1 -- 192.168.123.103:0/60055755 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3c4c009740 con 0x7f3c58105650
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.471+0000 7f3c5dbd8700 1 --2- 192.168.123.103:0/60055755 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3c58105650 0x7f3c5819c790 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f3c4c00bef0 tx=0x7f3c4c003b40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.471+0000 7f3c4affd700 1 -- 192.168.123.103:0/60055755 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3c4c004140 con 0x7f3c58105650
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.472+0000 7f3c4affd700 1 -- 192.168.123.103:0/60055755 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f3c4c0042a0 con 0x7f3c58105650
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.472+0000 7f3c5fe3c700 1 -- 192.168.123.103:0/60055755 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3c581076d0 con 0x7f3c58105650
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.472+0000 7f3c4affd700 1 -- 192.168.123.103:0/60055755 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3c4c004140 con 0x7f3c58105650
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.472+0000 7f3c5fe3c700 1 -- 192.168.123.103:0/60055755 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3c5819d160 con 0x7f3c58105650
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.473+0000 7f3c5fe3c700 1 -- 192.168.123.103:0/60055755 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3c58196550 con 0x7f3c58105650
2026-03-08T23:59:13.679 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.473+0000 7f3c4affd700 1 -- 192.168.123.103:0/60055755 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 45172+0+0 (secure 0 0 0) 0x7f3c4c029030 con 0x7f3c58105650
2026-03-08T23:59:13.680 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.473+0000 7f3c4affd700 1 --2- 192.168.123.103:0/60055755 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3c44038210 0x7f3c4403a6d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:13.680 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.473+0000 7f3c5d3d7700 1 --2- 192.168.123.103:0/60055755 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3c44038210 0x7f3c4403a6d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:13.680 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.474+0000 7f3c5d3d7700 1 --2- 192.168.123.103:0/60055755 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3c44038210 0x7f3c4403a6d0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f3c54006fd0 tx=0x7f3c54006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:13.680 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.474+0000 7f3c4affd700 1 -- 192.168.123.103:0/60055755 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f3c4c028d60 con 0x7f3c58105650
2026-03-08T23:59:13.680 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.477+0000 7f3c4affd700 1 -- 192.168.123.103:0/60055755 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3c4c011850 con 0x7f3c58105650
2026-03-08T23:59:13.680 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.581+0000 7f3c5fe3c700 1 -- 192.168.123.103:0/60055755 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}) v1 -- 0x7f3c5802cf50 con 0x7f3c44038210
2026-03-08T23:59:13.680 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.648+0000 7f3c4affd700 1 -- 192.168.123.103:0/60055755 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f3c5802cf50 con 0x7f3c44038210
2026-03-08T23:59:13.680 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.650+0000 7f3c5fe3c700 1 -- 192.168.123.103:0/60055755 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3c44038210 msgr2=0x7f3c4403a6d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:13.680 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.650+0000 7f3c5fe3c700 1 --2- 192.168.123.103:0/60055755 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3c44038210 0x7f3c4403a6d0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f3c54006fd0 tx=0x7f3c54006e40 comp rx=0 tx=0).stop
2026-03-08T23:59:13.680 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.650+0000 7f3c5fe3c700 1 -- 192.168.123.103:0/60055755 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3c58105650 msgr2=0x7f3c5819c790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:13.680 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.650+0000 7f3c5fe3c700 1 --2- 192.168.123.103:0/60055755 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3c58105650 0x7f3c5819c790 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f3c4c00bef0 tx=0x7f3c4c003b40 comp rx=0 tx=0).stop
2026-03-08T23:59:13.680 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.650+0000 7f3c5fe3c700 1 -- 192.168.123.103:0/60055755 shutdown_connections
2026-03-08T23:59:13.680 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.650+0000 7f3c5fe3c700 1 --2- 192.168.123.103:0/60055755 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3c44038210 0x7f3c4403a6d0 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:13.680 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.650+0000 7f3c5fe3c700 1 --2- 192.168.123.103:0/60055755 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3c58105650 0x7f3c5819c790 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:13.680 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.650+0000 7f3c5fe3c700 1 -- 192.168.123.103:0/60055755 >> 192.168.123.103:0/60055755 conn(0x7f3c58100bd0 msgr2=0x7f3c581018b0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:13.680 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.650+0000 7f3c5fe3c700 1 -- 192.168.123.103:0/60055755 shutdown_connections
2026-03-08T23:59:13.680 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.650+0000 7f3c5fe3c700 1 -- 192.168.123.103:0/60055755 wait complete.
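NOTE: After the task prints "Generating ssh key...", this invocation asks the mgr to create the cluster SSH identity; the ~70 ms between the mgr_command at 13.581 and the reply at 13.648 is plausibly the keygen itself. CLI sketch:

  # Have the cephadm module generate the cluster SSH key pair.
  ceph cephadm generate-key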
2026-03-08T23:59:13.976 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCseP8TJOsuxzWCbi/GtSh4sYcQoMttAx1MhEMLSc14Nydc0FYopBRf0tLWGHk7uryiK+KgRGc/F38yTXBVyXnRKXBNzpZLrgv75RC378qs8u2LftRPgtJVPXoEGeVLFj4MuFXcwmyWvFG6Ol81Mnd2qSP+59CQpY8HaESywjZqCmmmlhbcHaaxI5mjNOTnSPApQBqSKhRuq3wNS2WijDY6FtjM0ellzR7fUqIapI9QMKEE9PhlAE2V0Gt/sKpClCRKzfzbnv0vxw6tThHM83gT9BghsKKkPEKXQrtic3Wa6fAjRzaO1oG54kxahVGI0RIXfxYIvOtOigANo2MCwvQYM4bJ/ZT8CTDNS9bFv//wO77HhKVn+0Bq6JjhgTmdEaF2Xmn8Lwaj0LXsaX50ZouZ/mNQNS9qsx1uXC6vqFK71v+8uVIR5sEKTtUA2hGrEwGDPcQylsKHHL8gVEdXFh1FiEJAprE9OOVUlyANfwVwNkzxENVgf8wEaGCyfZYOyb0= ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7
2026-03-08T23:59:13.976 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.805+0000 7f7f7f7e1700 1 Processor -- start
2026-03-08T23:59:13.976 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.805+0000 7f7f7f7e1700 1 -- start start
2026-03-08T23:59:13.976 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.805+0000 7f7f7f7e1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f78105650 0x7f7f78105a70 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:13.977 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.805+0000 7f7f7f7e1700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7f78106040 con 0x7f7f78105650
2026-03-08T23:59:13.977 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.805+0000 7f7f7d57d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f78105650 0x7f7f78105a70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:13.977 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.805+0000 7f7f7d57d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f78105650 0x7f7f78105a70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55100/0 (socket says 192.168.123.103:55100)
2026-03-08T23:59:13.977 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.805+0000 7f7f7d57d700 1 -- 192.168.123.103:0/3848237861 learned_addr learned my addr 192.168.123.103:0/3848237861 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-08T23:59:13.977 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.806+0000 7f7f7d57d700 1 -- 192.168.123.103:0/3848237861 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7f78106850 con 0x7f7f78105650
2026-03-08T23:59:13.977 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.806+0000 7f7f7d57d700 1 --2- 192.168.123.103:0/3848237861 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f78105650 0x7f7f78105a70 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f7f68009cf0 tx=0x7f7f6800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=e080383581d75862 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:13.977 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.806+0000 7f7f6ffff700 1 -- 192.168.123.103:0/3848237861 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7f68004030 con 0x7f7f78105650
2026-03-08T23:59:13.977 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.806+0000 7f7f6ffff700 1 -- 192.168.123.103:0/3848237861 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f7f6800b810 con 0x7f7f78105650
2026-03-08T23:59:13.977 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.807+0000 7f7f7f7e1700 1 -- 192.168.123.103:0/3848237861 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f78105650 msgr2=0x7f7f78105a70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:13.977 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.807+0000 7f7f7f7e1700 1 --2- 192.168.123.103:0/3848237861 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f78105650 0x7f7f78105a70 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f7f68009cf0 tx=0x7f7f6800b0e0 comp rx=0 tx=0).stop
2026-03-08T23:59:13.977 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.807+0000 7f7f7f7e1700 1 -- 192.168.123.103:0/3848237861 shutdown_connections
2026-03-08T23:59:13.977 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.807+0000 7f7f7f7e1700 1 --2- 192.168.123.103:0/3848237861 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f78105650 0x7f7f78105a70 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:13.977 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.807+0000 7f7f7f7e1700 1 -- 192.168.123.103:0/3848237861 >> 192.168.123.103:0/3848237861 conn(0x7f7f78100bd0 msgr2=0x7f7f78103030 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:13.977 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.807+0000 7f7f7f7e1700 1 -- 192.168.123.103:0/3848237861 shutdown_connections
2026-03-08T23:59:13.977 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.807+0000 7f7f7f7e1700 1 -- 192.168.123.103:0/3848237861 wait complete.
2026-03-08T23:59:13.977 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.807+0000 7f7f7f7e1700 1 Processor -- start
2026-03-08T23:59:13.977 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.808+0000 7f7f7f7e1700 1 -- start start
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.808+0000 7f7f7f7e1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f78190960 0x7f7f7818eff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.808+0000 7f7f7f7e1700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7f78106040 con 0x7f7f78190960
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.808+0000 7f7f7d57d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f78190960 0x7f7f7818eff0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.808+0000 7f7f7d57d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f78190960 0x7f7f7818eff0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55110/0 (socket says 192.168.123.103:55110)
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.808+0000 7f7f7d57d700 1 -- 192.168.123.103:0/4249475663 learned_addr learned my addr 192.168.123.103:0/4249475663 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.808+0000 7f7f7d57d700 1 -- 192.168.123.103:0/4249475663 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7f68009740 con 0x7f7f78190960
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.808+0000 7f7f7d57d700 1 --2- 192.168.123.103:0/4249475663 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f78190960 0x7f7f7818eff0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f7f68009cc0 tx=0x7f7f68003cb0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.809+0000 7f7f6e7fc700 1 -- 192.168.123.103:0/4249475663 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7f68003ed0 con 0x7f7f78190960
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.809+0000 7f7f6e7fc700 1 -- 192.168.123.103:0/4249475663 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f7f680044d0 con 0x7f7f78190960
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.809+0000 7f7f6e7fc700 1 -- 192.168.123.103:0/4249475663 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7f6801ac60 con 0x7f7f78190960
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.809+0000 7f7f7f7e1700 1 -- 192.168.123.103:0/4249475663 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7f78190d80 con 0x7f7f78190960
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.809+0000 7f7f7f7e1700 1 -- 192.168.123.103:0/4249475663 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7f7818f700 con 0x7f7f78190960
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.810+0000 7f7f6e7fc700 1 -- 192.168.123.103:0/4249475663 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 45172+0+0 (secure 0 0 0) 0x7f7f68004030 con 0x7f7f78190960
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.810+0000 7f7f7f7e1700 1 -- 192.168.123.103:0/4249475663 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7f7804fa20 con 0x7f7f78190960
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.810+0000 7f7f6e7fc700 1 --2- 192.168.123.103:0/4249475663 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f64038220 0x7f7f6403a6e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.810+0000 7f7f6e7fc700 1 -- 192.168.123.103:0/4249475663 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f7f6804b470 con 0x7f7f78190960
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.812+0000 7f7f7cd7c700 1 --2- 192.168.123.103:0/4249475663 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f64038220 0x7f7f6403a6e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.813+0000 7f7f7cd7c700 1 --2- 192.168.123.103:0/4249475663 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f64038220 0x7f7f6403a6e0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f7f74006fd0 tx=0x7f7f74006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.813+0000 7f7f6e7fc700 1 -- 192.168.123.103:0/4249475663 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f7f680042e0 con 0x7f7f78190960
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.921+0000 7f7f7f7e1700 1 -- 192.168.123.103:0/4249475663 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}) v1 -- 0x7f7f78102b80 con 0x7f7f64038220
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.922+0000 7f7f6e7fc700 1 -- 192.168.123.103:0/4249475663 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+595 (secure 0 0 0) 0x7f7f78102b80 con 0x7f7f64038220
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.924+0000 7f7f7f7e1700 1 -- 192.168.123.103:0/4249475663 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f64038220 msgr2=0x7f7f6403a6e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.924+0000 7f7f7f7e1700 1 --2- 192.168.123.103:0/4249475663 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f64038220 0x7f7f6403a6e0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f7f74006fd0 tx=0x7f7f74006e40 comp rx=0 tx=0).stop
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.924+0000 7f7f7f7e1700 1 -- 192.168.123.103:0/4249475663 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f78190960 msgr2=0x7f7f7818eff0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.924+0000 7f7f7f7e1700 1 --2- 192.168.123.103:0/4249475663 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f78190960 0x7f7f7818eff0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f7f68009cc0 tx=0x7f7f68003cb0 comp rx=0 tx=0).stop
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.925+0000 7f7f7f7e1700 1 -- 192.168.123.103:0/4249475663 shutdown_connections
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.925+0000 7f7f7f7e1700 1 --2- 192.168.123.103:0/4249475663 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f64038220 0x7f7f6403a6e0 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.925+0000 7f7f7f7e1700 1 --2- 192.168.123.103:0/4249475663 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f78190960 0x7f7f7818eff0 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.925+0000 7f7f7f7e1700 1 -- 192.168.123.103:0/4249475663 >> 192.168.123.103:0/4249475663 conn(0x7f7f78100bd0 msgr2=0x7f7f78074110 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.925+0000 7f7f7f7e1700 1 -- 192.168.123.103:0/4249475663 shutdown_connections
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:13.925+0000 7f7f7f7e1700 1 -- 192.168.123.103:0/4249475663 wait complete.
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:Wrote public SSH key to /home/ubuntu/cephtest/ceph.pub
2026-03-08T23:59:13.978 INFO:teuthology.orchestra.run.vm03.stdout:Adding key to root@localhost authorized_keys...
2026-03-08T23:59:13.979 INFO:teuthology.orchestra.run.vm03.stdout:Adding host vm03...
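NOTE: The ssh-rsa line at the top of this block is the stdout of {"prefix": "cephadm get-pub-key"} (matching the 595-byte mgr_command_reply payload); the key comment carries the cluster fsid (ceph-ae8f0172-...). The task then writes the key to /home/ubuntu/cephtest/ceph.pub and appends it to root's authorized_keys before registering the host. A sketch of the retrieval step:

  # Fetch the public half of the cluster SSH key for distribution to hosts.
  ceph cephadm get-pub-key > ceph.pub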
2026-03-08T23:59:14.222 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:13 vm03 ceph-mon[52346]: [08/Mar/2026:23:59:12] ENGINE Bus STARTING
2026-03-08T23:59:14.222 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:13 vm03 ceph-mon[52346]: [08/Mar/2026:23:59:12] ENGINE Serving on http://192.168.123.103:8765
2026-03-08T23:59:14.222 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:13 vm03 ceph-mon[52346]: [08/Mar/2026:23:59:12] ENGINE Serving on https://192.168.123.103:7150
2026-03-08T23:59:14.222 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:13 vm03 ceph-mon[52346]: [08/Mar/2026:23:59:12] ENGINE Bus STARTED
2026-03-08T23:59:14.222 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:13 vm03 ceph-mon[52346]: from='client.14124 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch
2026-03-08T23:59:14.222 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:13 vm03 ceph-mon[52346]: from='client.14124 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch
2026-03-08T23:59:14.222 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:13 vm03 ceph-mon[52346]: from='client.14132 -' entity='client.admin' cmd=[{"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}]: dispatch
2026-03-08T23:59:14.222 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:13 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:14.222 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:13 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-08T23:59:14.222 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:13 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:14.222 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:13 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:14.995 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:14 vm03 ceph-mon[52346]: from='client.14134 -' entity='client.admin' cmd=[{"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}]: dispatch
2026-03-08T23:59:14.995 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:14 vm03 ceph-mon[52346]: from='client.14136 -' entity='client.admin' cmd=[{"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}]: dispatch
2026-03-08T23:59:14.995 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:14 vm03 ceph-mon[52346]: Generating ssh key...
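NOTE: The journalctl@ceph.mon.vm03 block is the mon's audit view of the same sequence: each client command arrives as a cmd=[{"prefix": ...}]: dispatch line, interleaved with the mgr's own queries. On the node itself the same stream could be followed with journalctl; the unit name below is a sketch (cephadm systemd units are namespaced by fsid, here the ae8f0172-... id visible in the SSH key comment):

  journalctl -fu ceph-<fsid>@mon.vm03.service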
2026-03-08T23:59:14.995 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:14 vm03 ceph-mon[52346]: from='client.14138 -' entity='client.admin' cmd=[{"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}]: dispatch
2026-03-08T23:59:14.995 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:14 vm03 ceph-mon[52346]: mgrmap e8: vm03.yvcons(active, since 2s)
2026-03-08T23:59:16.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Added host 'vm03' with addr '192.168.123.103'
2026-03-08T23:59:16.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.106+0000 7fb10a1ca700 1 Processor -- start
2026-03-08T23:59:16.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.107+0000 7fb10a1ca700 1 -- start start
2026-03-08T23:59:16.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.107+0000 7fb10a1ca700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb104108970 0x7fb104108d90 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:16.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.107+0000 7fb10a1ca700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb104109360 con 0x7fb104108970
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.107+0000 7fb1037fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb104108970 0x7fb104108d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.107+0000 7fb1037fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb104108970 0x7fb104108d90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55118/0 (socket says 192.168.123.103:55118)
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.107+0000 7fb1037fe700 1 -- 192.168.123.103:0/3745316356 learned_addr learned my addr 192.168.123.103:0/3745316356 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.107+0000 7fb1037fe700 1 -- 192.168.123.103:0/3745316356 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb104109b70 con 0x7fb104108970
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.107+0000 7fb1037fe700 1 --2- 192.168.123.103:0/3745316356 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb104108970 0x7fb104108d90 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fb0ec009cf0 tx=0x7fb0ec00b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=92a56e490b439e11 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.108+0000 7fb1027fc700 1 -- 192.168.123.103:0/3745316356 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb0ec004030 con 0x7fb104108970
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.108+0000 7fb1027fc700 1 -- 192.168.123.103:0/3745316356 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fb0ec00b810 con 0x7fb104108970
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.108+0000 7fb1027fc700 1 -- 192.168.123.103:0/3745316356 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb0ec003a90 con 0x7fb104108970
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.108+0000 7fb10a1ca700 1 -- 192.168.123.103:0/3745316356 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb104108970 msgr2=0x7fb104108d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.108+0000 7fb10a1ca700 1 --2- 192.168.123.103:0/3745316356 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb104108970 0x7fb104108d90 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fb0ec009cf0 tx=0x7fb0ec00b0e0 comp rx=0 tx=0).stop
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.108+0000 7fb10a1ca700 1 -- 192.168.123.103:0/3745316356 shutdown_connections
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.108+0000 7fb10a1ca700 1 --2- 192.168.123.103:0/3745316356 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb104108970 0x7fb104108d90 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.108+0000 7fb10a1ca700 1 -- 192.168.123.103:0/3745316356 >> 192.168.123.103:0/3745316356 conn(0x7fb10407be30 msgr2=0x7fb1041064e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.109+0000 7fb10a1ca700 1 -- 192.168.123.103:0/3745316356 shutdown_connections
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.109+0000 7fb10a1ca700 1 -- 192.168.123.103:0/3745316356 wait complete.
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.109+0000 7fb10a1ca700 1 Processor -- start
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.109+0000 7fb10a1ca700 1 -- start start
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.109+0000 7fb10a1ca700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb104108970 0x7fb104198250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.109+0000 7fb10a1ca700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb104198790 con 0x7fb104108970
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.109+0000 7fb1037fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb104108970 0x7fb104198250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.109+0000 7fb1037fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb104108970 0x7fb104198250 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55130/0 (socket says 192.168.123.103:55130)
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.109+0000 7fb1037fe700 1 -- 192.168.123.103:0/1735215354 learned_addr learned my addr 192.168.123.103:0/1735215354 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.109+0000 7fb1037fe700 1 -- 192.168.123.103:0/1735215354 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb0ec009740 con 0x7fb104108970
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.110+0000 7fb1037fe700 1 --2- 192.168.123.103:0/1735215354 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb104108970 0x7fb104198250 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fb0ec009710 tx=0x7fb0ec003e00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.110+0000 7fb100ff9700 1 -- 192.168.123.103:0/1735215354 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb0ec004120 con 0x7fb104108970
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.110+0000 7fb100ff9700 1 -- 192.168.123.103:0/1735215354 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fb0ec004280 con 0x7fb104108970
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.110+0000 7fb100ff9700 1 -- 192.168.123.103:0/1735215354 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb0ec011510 con 0x7fb104108970
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.110+0000 7fb10a1ca700 1 -- 192.168.123.103:0/1735215354 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb104198990 con 0x7fb104108970
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.110+0000 7fb10a1ca700 1 -- 192.168.123.103:0/1735215354 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb104198db0 con 0x7fb104108970
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.111+0000 7fb100ff9700 1 -- 192.168.123.103:0/1735215354 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 7) v1 ==== 45172+0+0 (secure 0 0 0) 0x7fb0ec0043f0 con 0x7fb104108970
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.111+0000 7fb10a1ca700 1 -- 192.168.123.103:0/1735215354 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb1040623c0 con 0x7fb104108970
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.112+0000 7fb100ff9700 1 --2- 192.168.123.103:0/1735215354 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb0f0037f10 0x7fb0f003a3d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.112+0000 7fb100ff9700 1 -- 192.168.123.103:0/1735215354 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fb0ec04b4d0 con 0x7fb104108970
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.114+0000 7fb102ffd700 1 --2- 192.168.123.103:0/1735215354 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb0f0037f10 0x7fb0f003a3d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.114+0000 7fb100ff9700 1 -- 192.168.123.103:0/1735215354 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb0ec018bd0 con 0x7fb104108970
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.115+0000 7fb102ffd700 1 --2- 192.168.123.103:0/1735215354 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb0f0037f10 0x7fb0f003a3d0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7fb0f4006fd0 tx=0x7fb0f4006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.221+0000 7fb10a1ca700 1 -- 192.168.123.103:0/1735215354 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm03", "addr": "192.168.123.103", "target": ["mon-mgr", ""]}) v1 -- 0x7fb104106420 con 0x7fb0f0037f10
2026-03-08T23:59:16.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:14.649+0000 7fb100ff9700 1 -- 192.168.123.103:0/1735215354 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7fb0ec011670 con 0x7fb104108970
2026-03-08T23:59:16.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:15.962+0000 7fb100ff9700 1 -- 192.168.123.103:0/1735215354 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7fb104106420 con 0x7fb0f0037f10
2026-03-08T23:59:16.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:15.964+0000 7fb10a1ca700 1 -- 192.168.123.103:0/1735215354 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb0f0037f10 msgr2=0x7fb0f003a3d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:16.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:15.965+0000 7fb10a1ca700 1 --2- 192.168.123.103:0/1735215354 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb0f0037f10 0x7fb0f003a3d0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7fb0f4006fd0 tx=0x7fb0f4006e40 comp rx=0 tx=0).stop
2026-03-08T23:59:16.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:15.965+0000 7fb10a1ca700 1 -- 192.168.123.103:0/1735215354 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb104108970 msgr2=0x7fb104198250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:16.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:15.965+0000 7fb10a1ca700 1 --2- 192.168.123.103:0/1735215354 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb104108970 0x7fb104198250 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fb0ec009710 tx=0x7fb0ec003e00 comp rx=0 tx=0).stop
2026-03-08T23:59:16.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:15.965+0000 7fb10a1ca700 1 -- 192.168.123.103:0/1735215354 shutdown_connections
2026-03-08T23:59:16.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:15.965+0000 7fb10a1ca700 1 --2- 192.168.123.103:0/1735215354 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb0f0037f10 0x7fb0f003a3d0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:16.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:15.965+0000 7fb10a1ca700 1 --2- 192.168.123.103:0/1735215354 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb104108970 0x7fb104198250 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:16.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:15.965+0000 7fb10a1ca700 1 -- 192.168.123.103:0/1735215354 >> 192.168.123.103:0/1735215354 conn(0x7fb10407be30 msgr2=0x7fb104105d10 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:16.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:15.965+0000 7fb10a1ca700 1 -- 192.168.123.103:0/1735215354 shutdown_connections
2026-03-08T23:59:16.004 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:15.965+0000 7fb10a1ca700 1 -- 192.168.123.103:0/1735215354 wait complete.
2026-03-08T23:59:16.004 INFO:teuthology.orchestra.run.vm03.stdout:Deploying mon service with default placement...
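The mgr_command_reply(tid 0: 0) above is the success acknowledgement for the "orch host add" sent at 23:59:14.221, matching the earlier stdout line Added host 'vm03'; bootstrap then moves on to scheduling the mon service. A sketch of the same registration done by hand, assuming a bootstrapped cephadm cluster with root SSH reachability to the target (host name and address taken from the log):

    # Authorize the cluster's SSH key on the new host, then register it;
    # this is what the bootstrap sequence above performs internally.
    ceph cephadm get-pub-key > /tmp/ceph.pub
    ssh-copy-id -f -i /tmp/ceph.pub root@vm03
    ceph orch host add vm03 192.168.123.103
    ceph orch host ls    # vm03 should now be listed
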
2026-03-08T23:59:16.266 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:15 vm03 ceph-mon[52346]: from='client.14140 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm03", "addr": "192.168.123.103", "target": ["mon-mgr", ""]}]: dispatch
2026-03-08T23:59:16.267 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:15 vm03 ceph-mon[52346]: Deploying cephadm binary to vm03
2026-03-08T23:59:16.267 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:15 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:16.320 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled mon update...
2026-03-08T23:59:16.320 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.145+0000 7f6021576700 1 Processor -- start
2026-03-08T23:59:16.320 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.145+0000 7f6021576700 1 -- start start
2026-03-08T23:59:16.320 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.145+0000 7f6021576700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f601c071d70 0x7f601c072190 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.145+0000 7f6021576700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f601c072760 con 0x7f601c071d70
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.145+0000 7f601bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f601c071d70 0x7f601c072190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.145+0000 7f601bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f601c071d70 0x7f601c072190 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60046/0 (socket says 192.168.123.103:60046)
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.145+0000 7f601bfff700 1 -- 192.168.123.103:0/3862883857 learned_addr learned my addr 192.168.123.103:0/3862883857 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.146+0000 7f601bfff700 1 -- 192.168.123.103:0/3862883857 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f601c0728a0 con 0x7f601c071d70
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.146+0000 7f601bfff700 1 --2- 192.168.123.103:0/3862883857 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f601c071d70 0x7f601c072190 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f6004009a90 tx=0x7f6004009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=e47bde026e6d30ba server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.146+0000 7f601affd700 1 -- 192.168.123.103:0/3862883857 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6004004030 con 0x7f601c071d70
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.146+0000 7f601affd700 1 -- 192.168.123.103:0/3862883857 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f600400b7e0 con 0x7f601c071d70
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.146+0000 7f601affd700 1 -- 192.168.123.103:0/3862883857 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6004003a40 con 0x7f601c071d70
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.146+0000 7f6021576700 1 -- 192.168.123.103:0/3862883857 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f601c071d70 msgr2=0x7f601c072190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.146+0000 7f6021576700 1 --2- 192.168.123.103:0/3862883857 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f601c071d70 0x7f601c072190 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f6004009a90 tx=0x7f6004009da0 comp rx=0 tx=0).stop
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.147+0000 7f6021576700 1 -- 192.168.123.103:0/3862883857 shutdown_connections
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.147+0000 7f6021576700 1 --2- 192.168.123.103:0/3862883857 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f601c071d70 0x7f601c072190 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.147+0000 7f6021576700 1 -- 192.168.123.103:0/3862883857 >> 192.168.123.103:0/3862883857 conn(0x7f601c06d320 msgr2=0x7f601c06f760 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.147+0000 7f6021576700 1 -- 192.168.123.103:0/3862883857 shutdown_connections
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.147+0000 7f6021576700 1 -- 192.168.123.103:0/3862883857 wait complete.
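Note that every ceph CLI call during bootstrap opens a throwaway session first (mon_subscribe({config=0+,monmap=0+}), two mon_maps and a config(28 keys) blob, then an immediate mark_down) before reconnecting to run the actual command, which is why each "Scheduled ... update" line is bracketed by two full connect/teardown traces. The stderr volume comes from the messenger debug level; if it is unwanted outside a QA run, a sketch, assuming admin access:

    # Quiet the messenger chatter either persistently, via the mon config
    # store, or per invocation on the command line.
    ceph config set client debug_ms 0/0
    ceph --debug-ms 0 orch ls
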
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.147+0000 7f6021576700 1 Processor -- start
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.147+0000 7f6021576700 1 -- start start
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.147+0000 7f6021576700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f601c071d70 0x7f601c1a8ff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.147+0000 7f6021576700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f601c1a9530 con 0x7f601c071d70
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.147+0000 7f601bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f601c071d70 0x7f601c1a8ff0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.148+0000 7f601bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f601c071d70 0x7f601c1a8ff0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60052/0 (socket says 192.168.123.103:60052)
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.148+0000 7f601bfff700 1 -- 192.168.123.103:0/174462452 learned_addr learned my addr 192.168.123.103:0/174462452 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.148+0000 7f601bfff700 1 -- 192.168.123.103:0/174462452 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6004009740 con 0x7f601c071d70
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.148+0000 7f601bfff700 1 --2- 192.168.123.103:0/174462452 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f601c071d70 0x7f601c1a8ff0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f600400bef0 tx=0x7f6004003b40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.148+0000 7f60197fa700 1 -- 192.168.123.103:0/174462452 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6004004140 con 0x7f601c071d70
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.148+0000 7f60197fa700 1 -- 192.168.123.103:0/174462452 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f60040042a0 con 0x7f601c071d70
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.148+0000 7f60197fa700 1 -- 192.168.123.103:0/174462452 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f60040114c0 con 0x7f601c071d70
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.148+0000 7f6021576700 1 -- 192.168.123.103:0/174462452 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f601c1a9730 con 0x7f601c071d70
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.148+0000 7f6021576700 1 -- 192.168.123.103:0/174462452 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f601c1a9b50 con 0x7f601c071d70
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.149+0000 7f6021576700 1 -- 192.168.123.103:0/174462452 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f601c04f000 con 0x7f601c071d70
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.152+0000 7f60197fa700 1 -- 192.168.123.103:0/174462452 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f6004029030 con 0x7f601c071d70
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.152+0000 7f60197fa700 1 --2- 192.168.123.103:0/174462452 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6008038330 0x7f600803a7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.152+0000 7f60197fa700 1 -- 192.168.123.103:0/174462452 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f600404c360 con 0x7f601c071d70
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.152+0000 7f60197fa700 1 -- 192.168.123.103:0/174462452 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f600404c720 con 0x7f601c071d70
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.152+0000 7f601b7fe700 1 --2- 192.168.123.103:0/174462452 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6008038330 0x7f600803a7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.153+0000 7f601b7fe700 1 --2- 192.168.123.103:0/174462452 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6008038330 0x7f600803a7f0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f600c006fd0 tx=0x7f600c006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.266+0000 7f6021576700 1 -- 192.168.123.103:0/174462452 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}) v1 -- 0x7f601c06e5e0 con 0x7f6008038330
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.272+0000 7f60197fa700 1 -- 192.168.123.103:0/174462452 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7f601c06e5e0 con 0x7f6008038330
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.275+0000 7f6021576700 1 -- 192.168.123.103:0/174462452 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6008038330 msgr2=0x7f600803a7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.275+0000 7f6021576700 1 --2- 192.168.123.103:0/174462452 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6008038330 0x7f600803a7f0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f600c006fd0 tx=0x7f600c006e40 comp rx=0 tx=0).stop
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.275+0000 7f6021576700 1 -- 192.168.123.103:0/174462452 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f601c071d70 msgr2=0x7f601c1a8ff0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.275+0000 7f6021576700 1 --2- 192.168.123.103:0/174462452 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f601c071d70 0x7f601c1a8ff0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f600400bef0 tx=0x7f6004003b40 comp rx=0 tx=0).stop
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.275+0000 7f6021576700 1 -- 192.168.123.103:0/174462452 shutdown_connections
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.275+0000 7f6021576700 1 --2- 192.168.123.103:0/174462452 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6008038330 0x7f600803a7f0 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.275+0000 7f6021576700 1 --2- 192.168.123.103:0/174462452 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f601c071d70 0x7f601c1a8ff0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.275+0000 7f6021576700 1 -- 192.168.123.103:0/174462452 >> 192.168.123.103:0/174462452 conn(0x7f601c06d320 msgr2=0x7f601c06ded0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.275+0000 7f6021576700 1 -- 192.168.123.103:0/174462452 shutdown_connections
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.275+0000 7f6021576700 1 -- 192.168.123.103:0/174462452 wait complete.
2026-03-08T23:59:16.321 INFO:teuthology.orchestra.run.vm03.stdout:Deploying mgr service with default placement...
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled mgr update...
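At this point "orch apply mon" has been acknowledged (mgr_command_reply tid 0: 0) and the mgr service is being scheduled the same way; this roleless bootstrap leaves both on cephadm's default placement. A sketch of the equivalent explicit calls, with the example placement being an assumption rather than anything the log shows:

    # What 'Deploying mon/mgr service with default placement...' schedules;
    # with no placement given, cephadm falls back to its defaults
    # (count:5 for mon and count:2 for mgr in recent releases).
    ceph orch apply mon
    ceph orch apply mgr
    ceph orch apply mon --placement='vm03 vm06'   # example: pin mons to hosts
    ceph orch ls                                  # inspect the resulting specs
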
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.473+0000 7f867c503700 1 Processor -- start
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.473+0000 7f867c503700 1 -- start start
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.473+0000 7f867c503700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8674071d70 0x7f8674072190 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.473+0000 7f867c503700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8674072760 con 0x7f8674071d70
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.473+0000 7f867a29f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8674071d70 0x7f8674072190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.473+0000 7f867a29f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8674071d70 0x7f8674072190 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60064/0 (socket says 192.168.123.103:60064)
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.473+0000 7f867a29f700 1 -- 192.168.123.103:0/1565433899 learned_addr learned my addr 192.168.123.103:0/1565433899 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.473+0000 7f867a29f700 1 -- 192.168.123.103:0/1565433899 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8674078fe0 con 0x7f8674071d70
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.473+0000 7f867a29f700 1 --2- 192.168.123.103:0/1565433899 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8674071d70 0x7f8674072190 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f86700098d0 tx=0x7f8670009be0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=2b9595cfa4e72c7c server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.474+0000 7f867929d700 1 -- 192.168.123.103:0/1565433899 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8670004030 con 0x7f8674071d70
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.474+0000 7f867929d700 1 -- 192.168.123.103:0/1565433899 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f867000c8f0 con 0x7f8674071d70
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.474+0000 7f867c503700 1 -- 192.168.123.103:0/1565433899 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8674071d70 msgr2=0x7f8674072190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.474+0000 7f867c503700 1 --2- 192.168.123.103:0/1565433899 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8674071d70 0x7f8674072190 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f86700098d0 tx=0x7f8670009be0 comp rx=0 tx=0).stop
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.474+0000 7f867c503700 1 -- 192.168.123.103:0/1565433899 shutdown_connections
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.474+0000 7f867c503700 1 --2- 192.168.123.103:0/1565433899 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8674071d70 0x7f8674072190 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.474+0000 7f867c503700 1 -- 192.168.123.103:0/1565433899 >> 192.168.123.103:0/1565433899 conn(0x7f867406d400 msgr2=0x7f867406f840 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.474+0000 7f867c503700 1 -- 192.168.123.103:0/1565433899 shutdown_connections
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.474+0000 7f867c503700 1 -- 192.168.123.103:0/1565433899 wait complete.
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.474+0000 7f867c503700 1 Processor -- start
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.475+0000 7f867c503700 1 -- start start
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.475+0000 7f867c503700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f867407e5f0 0x7f867407ea10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.475+0000 7f867c503700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8670013070 con 0x7f867407e5f0
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.475+0000 7f867a29f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f867407e5f0 0x7f867407ea10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.475+0000 7f867a29f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f867407e5f0 0x7f867407ea10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60068/0 (socket says 192.168.123.103:60068)
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.475+0000 7f867a29f700 1 -- 192.168.123.103:0/3561149206 learned_addr learned my addr 192.168.123.103:0/3561149206 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.475+0000 7f867a29f700 1 -- 192.168.123.103:0/3561149206 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8670009580 con 0x7f867407e5f0
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.475+0000 7f867a29f700 1 --2- 192.168.123.103:0/3561149206 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f867407e5f0 0x7f867407ea10 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f867000cff0 tx=0x7f8670003e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.475+0000 7f866b7fe700 1 -- 192.168.123.103:0/3561149206 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8670004070 con 0x7f867407e5f0
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.475+0000 7f867c503700 1 -- 192.168.123.103:0/3561149206 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f867407ef50 con 0x7f867407e5f0
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.475+0000 7f867c503700 1 -- 192.168.123.103:0/3561149206 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8674081bd0 con 0x7f867407e5f0
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.476+0000 7f866b7fe700 1 -- 192.168.123.103:0/3561149206 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f867001c070 con 0x7f867407e5f0
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.476+0000 7f866b7fe700 1 -- 192.168.123.103:0/3561149206 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8670010e80 con 0x7f867407e5f0
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.476+0000 7f866b7fe700 1 -- 192.168.123.103:0/3561149206 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f867001e030 con 0x7f867407e5f0
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.476+0000 7f866b7fe700 1 --2- 192.168.123.103:0/3561149206 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f86600383c0 0x7f866003a880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:16.671 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.477+0000 7f866b7fe700 1 -- 192.168.123.103:0/3561149206 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f867004ca00 con 0x7f867407e5f0
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.477+0000 7f8679a9e700 1 --2- 192.168.123.103:0/3561149206 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f86600383c0 0x7f866003a880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.477+0000 7f8679a9e700 1 --2- 192.168.123.103:0/3561149206 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f86600383c0 0x7f866003a880 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f866c00ad30 tx=0x7f866c0093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.477+0000 7f867c503700 1 -- 192.168.123.103:0/3561149206 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8658005320 con 0x7f867407e5f0
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.481+0000 7f866b7fe700 1 -- 192.168.123.103:0/3561149206 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f867002a430 con 0x7f867407e5f0
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.506+0000 7f866b7fe700 1 -- 192.168.123.103:0/3561149206 <== mon.0 v2:192.168.123.103:3300/0 7 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f867002a430 con 0x7f867407e5f0
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.607+0000 7f867c503700 1 -- 192.168.123.103:0/3561149206 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}) v1 -- 0x7f8658000bf0 con 0x7f86600383c0
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.611+0000 7f866b7fe700 1 -- 192.168.123.103:0/3561149206 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7f8658000bf0 con 0x7f86600383c0
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.614+0000 7f867c503700 1 -- 192.168.123.103:0/3561149206 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f86600383c0 msgr2=0x7f866003a880 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.614+0000 7f867c503700 1 --2- 192.168.123.103:0/3561149206 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f86600383c0 0x7f866003a880 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f866c00ad30 tx=0x7f866c0093f0 comp rx=0 tx=0).stop
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.614+0000 7f867c503700 1 -- 192.168.123.103:0/3561149206 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f867407e5f0 msgr2=0x7f867407ea10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.614+0000 7f867c503700 1 --2- 192.168.123.103:0/3561149206 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f867407e5f0 0x7f867407ea10 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f867000cff0 tx=0x7f8670003e20 comp rx=0 tx=0).stop
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.614+0000 7f867c503700 1 -- 192.168.123.103:0/3561149206 shutdown_connections
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.614+0000 7f867c503700 1 --2- 192.168.123.103:0/3561149206 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f86600383c0 0x7f866003a880 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.615+0000 7f867c503700 1 --2- 192.168.123.103:0/3561149206 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f867407e5f0 0x7f867407ea10 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.615+0000 7f867c503700 1 -- 192.168.123.103:0/3561149206 >> 192.168.123.103:0/3561149206 conn(0x7f867406d400 msgr2=0x7f867406dc30 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.615+0000 7f867c503700 1 -- 192.168.123.103:0/3561149206 shutdown_connections
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.615+0000 7f867c503700 1 -- 192.168.123.103:0/3561149206 wait complete.
2026-03-08T23:59:16.672 INFO:teuthology.orchestra.run.vm03.stdout:Deploying crash service with default placement...
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled crash update...
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.807+0000 7f4530541700 1 Processor -- start
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.808+0000 7f4530541700 1 -- start start
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.808+0000 7f4530541700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4528072210 0x7f4528072630 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.808+0000 7f4530541700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4528072c00 con 0x7f4528072210
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.809+0000 7f452e2dd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4528072210 0x7f4528072630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.809+0000 7f452e2dd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4528072210 0x7f4528072630 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60080/0 (socket says 192.168.123.103:60080)
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.809+0000 7f452e2dd700 1 -- 192.168.123.103:0/3139553632 learned_addr learned my addr 192.168.123.103:0/3139553632 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.811+0000 7f452e2dd700 1 -- 192.168.123.103:0/3139553632 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f452810e1d0 con 0x7f4528072210
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.811+0000 7f452e2dd700 1 --2- 192.168.123.103:0/3139553632 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4528072210 0x7f4528072630 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f451800d180 tx=0x7f451800d490 comp rx=0 tx=0).ready entity=mon.0 client_cookie=3a1c311bb9484500 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.811+0000 7f452d2db700 1 -- 192.168.123.103:0/3139553632 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4518010070 con 0x7f4528072210
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.811+0000 7f452d2db700 1 -- 192.168.123.103:0/3139553632 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4518004510 con 0x7f4528072210
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.812+0000 7f4530541700 1 -- 192.168.123.103:0/3139553632 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4528072210 msgr2=0x7f4528072630 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.812+0000 7f4530541700 1 --2- 192.168.123.103:0/3139553632 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4528072210 0x7f4528072630 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f451800d180 tx=0x7f451800d490 comp rx=0 tx=0).stop
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.813+0000 7f4530541700 1 -- 192.168.123.103:0/3139553632 shutdown_connections
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.813+0000 7f4530541700 1 --2- 192.168.123.103:0/3139553632 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4528072210 0x7f4528072630 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.813+0000 7f4530541700 1 -- 192.168.123.103:0/3139553632 >> 192.168.123.103:0/3139553632 conn(0x7f452806d400 msgr2=0x7f452806f840 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.815+0000 7f4530541700 1 -- 192.168.123.103:0/3139553632 shutdown_connections
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.816+0000 7f4530541700 1 -- 192.168.123.103:0/3139553632 wait complete.
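The crash service scheduled above deploys a per-host agent that reports daemon crash and coredump metadata back to the cluster. A sketch of applying and checking it by hand, assuming an admin keyring:

    # Equivalent of 'Deploying crash service with default placement...'
    ceph orch apply crash
    ceph crash ls     # crash reports collected by the agents, if any
    ceph crash stat   # summary count
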
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.816+0000 7f4530541700 1 Processor -- start
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.816+0000 7f4530541700 1 -- start start
2026-03-08T23:59:17.015 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.816+0000 7f4530541700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f45281a09b0 0x7f45281a0dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.816+0000 7f4530541700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4518003c20 con 0x7f45281a09b0
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.817+0000 7f452e2dd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f45281a09b0 0x7f45281a0dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.817+0000 7f452e2dd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f45281a09b0 0x7f45281a0dd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60082/0 (socket says 192.168.123.103:60082)
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.817+0000 7f452e2dd700 1 -- 192.168.123.103:0/166370438 learned_addr learned my addr 192.168.123.103:0/166370438 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.817+0000 7f452e2dd700 1 -- 192.168.123.103:0/166370438 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f45180087c0 con 0x7f45281a09b0
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.817+0000 7f452e2dd700 1 --2- 192.168.123.103:0/166370438 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f45281a09b0 0x7f45281a0dd0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f4518008c10 tx=0x7f4518008cf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.817+0000 7f45277fe700 1 -- 192.168.123.103:0/166370438 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4518010050 con 0x7f45281a09b0
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.817+0000 7f45277fe700 1 -- 192.168.123.103:0/166370438 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f451800deb0 con 0x7f45281a09b0
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.817+0000 7f45277fe700 1 -- 192.168.123.103:0/166370438 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4518016440 con 0x7f45281a09b0
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.817+0000 7f4530541700 1 -- 192.168.123.103:0/166370438 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f45281a1310 con 0x7f45281a09b0
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.817+0000 7f4530541700 1 -- 192.168.123.103:0/166370438 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f45281a3f90 con 0x7f45281a09b0
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.818+0000 7f4530541700 1 -- 192.168.123.103:0/166370438 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f452804f000 con 0x7f45281a09b0
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.821+0000 7f45277fe700 1 -- 192.168.123.103:0/166370438 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f45180041f0 con 0x7f45281a09b0
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.821+0000 7f45277fe700 1 --2- 192.168.123.103:0/166370438 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f45100383e0 0x7f451003a8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.821+0000 7f45277fe700 1 -- 192.168.123.103:0/166370438 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f451804ba10 con 0x7f45281a09b0
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.821+0000 7f452dadc700 1 --2- 192.168.123.103:0/166370438 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f45100383e0 0x7f451003a8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.822+0000 7f452dadc700 1 --2- 192.168.123.103:0/166370438 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f45100383e0 0x7f451003a8a0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f451c006fd0 tx=0x7f451c006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.822+0000 7f45277fe700 1 -- 192.168.123.103:0/166370438 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f451800b370 con 0x7f45281a09b0
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.939+0000 7f4530541700 1 -- 192.168.123.103:0/166370438 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}) v1 -- 0x7f452806e5e0 con 0x7f45100383e0
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.951+0000 7f45277fe700 1 -- 192.168.123.103:0/166370438 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+26 (secure 0 0 0) 0x7f452806e5e0 con 0x7f45100383e0
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.954+0000 7f4530541700 1 -- 192.168.123.103:0/166370438 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f45100383e0 msgr2=0x7f451003a8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.954+0000 7f4530541700 1 --2- 192.168.123.103:0/166370438 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f45100383e0 0x7f451003a8a0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f451c006fd0 tx=0x7f451c006e40 comp rx=0 tx=0).stop
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.955+0000 7f4530541700 1 -- 192.168.123.103:0/166370438 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f45281a09b0 msgr2=0x7f45281a0dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.955+0000 7f4530541700 1 --2- 192.168.123.103:0/166370438 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f45281a09b0 0x7f45281a0dd0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f4518008c10 tx=0x7f4518008cf0 comp rx=0 tx=0).stop
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.955+0000 7f4530541700 1 -- 192.168.123.103:0/166370438 shutdown_connections
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.955+0000 7f4530541700 1 --2- 192.168.123.103:0/166370438 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f45100383e0 0x7f451003a8a0 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.955+0000 7f4530541700 1 --2- 192.168.123.103:0/166370438 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f45281a09b0 0x7f45281a0dd0 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.955+0000 7f4530541700 1 -- 192.168.123.103:0/166370438 >> 192.168.123.103:0/166370438 conn(0x7f452806d400 msgr2=0x7f452806ded0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.955+0000 7f4530541700 1 -- 192.168.123.103:0/166370438 shutdown_connections
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:16.955+0000 7f4530541700 1 -- 192.168.123.103:0/166370438 wait complete.
2026-03-08T23:59:17.016 INFO:teuthology.orchestra.run.vm03.stdout:Deploying ceph-exporter service with default placement...
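ceph-exporter, scheduled next, runs on every host and exposes the local daemons' perf counters for Prometheus scraping. A sketch of the equivalent explicit call plus a spot check, where the port is an assumption (9926 is the usual ceph-exporter default, not something this log confirms):

    # Equivalent of 'Deploying ceph-exporter service with default placement...'
    ceph orch apply ceph-exporter
    curl -s http://vm03:9926/metrics | head   # assumes the default exporter port
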
2026-03-08T23:59:17.189 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:16 vm03 ceph-mon[52346]: Added host vm03 2026-03-08T23:59:17.189 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:16 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-08T23:59:17.189 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:16 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:17.189 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:16 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:17.189 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:16 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:17.189 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:16 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:17.189 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:16 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled ceph-exporter update... 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.185+0000 7efce759e700 1 Processor -- start 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.185+0000 7efce759e700 1 -- start start 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.186+0000 7efce759e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efce00943b0 0x7efce00967e0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.186+0000 7efce759e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efce0006cf0 con 0x7efce00943b0 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.189+0000 7efce659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efce00943b0 0x7efce00967e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.190+0000 7efce659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efce00943b0 0x7efce00967e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60094/0 (socket says 192.168.123.103:60094) 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.190+0000 7efce659c700 1 -- 192.168.123.103:0/1235057133 learned_addr learned my addr 192.168.123.103:0/1235057133 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.190+0000 7efce659c700 1 -- 192.168.123.103:0/1235057133 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efce0096d20 con 0x7efce00943b0 2026-03-08T23:59:17.382 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.190+0000 7efce659c700 1 --2- 192.168.123.103:0/1235057133 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efce00943b0 0x7efce00967e0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7efcd8009cf0 tx=0x7efcd800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=b9e5396108a204e9 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.191+0000 7efce559a700 1 -- 192.168.123.103:0/1235057133 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efcd8004030 con 0x7efce00943b0 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.191+0000 7efce559a700 1 -- 192.168.123.103:0/1235057133 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7efcd800b810 con 0x7efce00943b0 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.191+0000 7efce759e700 1 -- 192.168.123.103:0/1235057133 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efce00943b0 msgr2=0x7efce00967e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.191+0000 7efce759e700 1 --2- 192.168.123.103:0/1235057133 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efce00943b0 0x7efce00967e0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7efcd8009cf0 tx=0x7efcd800b0e0 comp rx=0 tx=0).stop 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.191+0000 7efce759e700 1 -- 192.168.123.103:0/1235057133 shutdown_connections 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.191+0000 7efce759e700 1 --2- 192.168.123.103:0/1235057133 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efce00943b0 0x7efce00967e0 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.191+0000 7efce759e700 1 -- 192.168.123.103:0/1235057133 >> 192.168.123.103:0/1235057133 conn(0x7efce008ffd0 msgr2=0x7efce0092430 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.192+0000 7efce759e700 1 -- 192.168.123.103:0/1235057133 shutdown_connections 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.192+0000 7efce759e700 1 -- 192.168.123.103:0/1235057133 wait complete. 
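Note the pattern repeated for every service: each `ceph` invocation first creates a throwaway client (here nonce 1235057133) that only subscribes to `config` and `monmap`, marks the connection down as soon as the config arrives, and then a second client (nonce 2382184515, below) runs the actual command. A small sketch, assuming you have this log as text, that pulls out those per-run client nonces:

```python
import re

# Matches e.g. "learned_addr learned my addr 192.168.123.103:0/1235057133".
LEARNED = re.compile(r"learned_addr learned my addr \S+:0/(\d+)")

def client_nonces(log_text: str) -> list[str]:
    # Expect two nonces per CLI run: the config/monmap fetch session and
    # the command session (1235057133 and 2382184515 for ceph-exporter).
    return LEARNED.findall(log_text)
```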
2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.192+0000 7efce759e700 1 Processor -- start 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.192+0000 7efce759e700 1 -- start start 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.192+0000 7efce759e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efce0130400 0x7efce0130820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.192+0000 7efce759e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efce0006cf0 con 0x7efce0130400 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.193+0000 7efce659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efce0130400 0x7efce0130820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.193+0000 7efce659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efce0130400 0x7efce0130820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60106/0 (socket says 192.168.123.103:60106) 2026-03-08T23:59:17.382 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.193+0000 7efce659c700 1 -- 192.168.123.103:0/2382184515 learned_addr learned my addr 192.168.123.103:0/2382184515 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.193+0000 7efce659c700 1 -- 192.168.123.103:0/2382184515 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efcd8009740 con 0x7efce0130400 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.193+0000 7efce659c700 1 --2- 192.168.123.103:0/2382184515 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efce0130400 0x7efce0130820 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7efcd800b560 tx=0x7efcd800be60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.194+0000 7efcd77fe700 1 -- 192.168.123.103:0/2382184515 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efcd8003f40 con 0x7efce0130400 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.194+0000 7efce759e700 1 -- 192.168.123.103:0/2382184515 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efce0130d60 con 0x7efce0130400 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.194+0000 7efce759e700 1 -- 192.168.123.103:0/2382184515 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efce01339e0 con 0x7efce0130400 2026-03-08T23:59:17.383 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.194+0000 7efcd77fe700 1 -- 192.168.123.103:0/2382184515 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7efcd8004580 con 0x7efce0130400 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.194+0000 7efcd77fe700 1 -- 192.168.123.103:0/2382184515 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efcd8003b10 con 0x7efce0130400 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.197+0000 7efcd77fe700 1 -- 192.168.123.103:0/2382184515 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7efcd801ad00 con 0x7efce0130400 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.197+0000 7efcd77fe700 1 --2- 192.168.123.103:0/2382184515 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efcd00382e0 0x7efcd003a7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.197+0000 7efcd77fe700 1 -- 192.168.123.103:0/2382184515 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7efcd804bb00 con 0x7efce0130400 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.197+0000 7efce759e700 1 -- 192.168.123.103:0/2382184515 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efcc8005320 con 0x7efce0130400 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.198+0000 7efce5d9b700 1 --2- 192.168.123.103:0/2382184515 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efcd00382e0 0x7efcd003a7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.198+0000 7efce5d9b700 1 --2- 192.168.123.103:0/2382184515 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efcd00382e0 0x7efcd003a7a0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7efcdc006fd0 tx=0x7efcdc006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.201+0000 7efcd77fe700 1 -- 192.168.123.103:0/2382184515 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7efcd802d3e0 con 0x7efce0130400 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.317+0000 7efce759e700 1 -- 192.168.123.103:0/2382184515 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7efcc8000bf0 con 0x7efcd00382e0 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.323+0000 7efcd77fe700 1 -- 192.168.123.103:0/2382184515 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== 
mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7efcc8000bf0 con 0x7efcd00382e0 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.326+0000 7efcd57fa700 1 -- 192.168.123.103:0/2382184515 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efcd00382e0 msgr2=0x7efcd003a7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.326+0000 7efcd57fa700 1 --2- 192.168.123.103:0/2382184515 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efcd00382e0 0x7efcd003a7a0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7efcdc006fd0 tx=0x7efcdc006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.326+0000 7efcd57fa700 1 -- 192.168.123.103:0/2382184515 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efce0130400 msgr2=0x7efce0130820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.326+0000 7efcd57fa700 1 --2- 192.168.123.103:0/2382184515 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efce0130400 0x7efce0130820 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7efcd800b560 tx=0x7efcd800be60 comp rx=0 tx=0).stop 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.326+0000 7efcd57fa700 1 -- 192.168.123.103:0/2382184515 shutdown_connections 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.326+0000 7efcd57fa700 1 --2- 192.168.123.103:0/2382184515 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7efcd00382e0 0x7efcd003a7a0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.326+0000 7efcd57fa700 1 --2- 192.168.123.103:0/2382184515 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efce0130400 0x7efce0130820 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.326+0000 7efcd57fa700 1 -- 192.168.123.103:0/2382184515 >> 192.168.123.103:0/2382184515 conn(0x7efce008ffd0 msgr2=0x7efce00923f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.327+0000 7efcd57fa700 1 -- 192.168.123.103:0/2382184515 shutdown_connections 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.328+0000 7efcd57fa700 1 -- 192.168.123.103:0/2382184515 wait complete. 2026-03-08T23:59:17.383 INFO:teuthology.orchestra.run.vm03.stdout:Deploying prometheus service with default placement... 2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled prometheus update... 
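At this point crash and ceph-exporter are scheduled and prometheus has just been announced and acknowledged. Since every service emits a "Deploying <svc> service with default placement..." line followed by a "Scheduled <svc> update..." ack, a quick cross-check that nothing was silently dropped only needs two regexes — a sketch over this log text, not part of the suite:

```python
import re

DEPLOY = re.compile(r"Deploying (\S+) service with default placement")
SCHED = re.compile(r"Scheduled (\S+) update")

def unscheduled(log_text: str) -> set[str]:
    # Services announced but never acknowledged by the mgr.
    return set(DEPLOY.findall(log_text)) - set(SCHED.findall(log_text))
```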
2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.522+0000 7f009359e700 1 Processor -- start 2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.522+0000 7f009359e700 1 -- start start 2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.522+0000 7f009359e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00940721d0 0x7f00940725f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.522+0000 7f009359e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0094072bc0 con 0x7f00940721d0 2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.523+0000 7f009259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00940721d0 0x7f00940725f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.523+0000 7f009259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00940721d0 0x7f00940725f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60110/0 (socket says 192.168.123.103:60110) 2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.523+0000 7f009259c700 1 -- 192.168.123.103:0/848845012 learned_addr learned my addr 192.168.123.103:0/848845012 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.523+0000 7f009259c700 1 -- 192.168.123.103:0/848845012 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f009410e1c0 con 0x7f00940721d0 2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.523+0000 7f009259c700 1 --2- 192.168.123.103:0/848845012 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00940721d0 0x7f00940725f0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f0084009480 tx=0x7f0084009790 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a4282fee84279c04 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.524+0000 7f009159a700 1 -- 192.168.123.103:0/848845012 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0084004030 con 0x7f00940721d0 2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.524+0000 7f009159a700 1 -- 192.168.123.103:0/848845012 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f008400c8f0 con 0x7f00940721d0 2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.524+0000 7f009159a700 1 -- 192.168.123.103:0/848845012 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0084003bf0 con 0x7f00940721d0 
2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.525+0000 7f009359e700 1 -- 192.168.123.103:0/848845012 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00940721d0 msgr2=0x7f00940725f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.525+0000 7f009359e700 1 --2- 192.168.123.103:0/848845012 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00940721d0 0x7f00940725f0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f0084009480 tx=0x7f0084009790 comp rx=0 tx=0).stop 2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.525+0000 7f009359e700 1 -- 192.168.123.103:0/848845012 shutdown_connections 2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.525+0000 7f009359e700 1 --2- 192.168.123.103:0/848845012 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00940721d0 0x7f00940725f0 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.525+0000 7f009359e700 1 -- 192.168.123.103:0/848845012 >> 192.168.123.103:0/848845012 conn(0x7f009406d320 msgr2=0x7f009406f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:17.717 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.525+0000 7f009359e700 1 -- 192.168.123.103:0/848845012 shutdown_connections 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.525+0000 7f009359e700 1 -- 192.168.123.103:0/848845012 wait complete. 
2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.526+0000 7f009359e700 1 Processor -- start 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.526+0000 7f009359e700 1 -- start start 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.527+0000 7f009359e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00941a0b30 0x7f00941a0f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.527+0000 7f009359e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f00941a1490 con 0x7f00941a0b30 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.527+0000 7f009259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00941a0b30 0x7f00941a0f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.527+0000 7f009259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00941a0b30 0x7f00941a0f50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60116/0 (socket says 192.168.123.103:60116) 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.527+0000 7f009259c700 1 -- 192.168.123.103:0/4215591275 learned_addr learned my addr 192.168.123.103:0/4215591275 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.527+0000 7f009259c700 1 -- 192.168.123.103:0/4215591275 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0084009160 con 0x7f00941a0b30 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.527+0000 7f009259c700 1 --2- 192.168.123.103:0/4215591275 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00941a0b30 0x7f00941a0f50 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f0084000c00 tx=0x7f0084004160 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.527+0000 7f00837fe700 1 -- 192.168.123.103:0/4215591275 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0084004510 con 0x7f00941a0b30 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.527+0000 7f009359e700 1 -- 192.168.123.103:0/4215591275 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f00941a1690 con 0x7f00941a0b30 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.528+0000 7f009359e700 1 -- 192.168.123.103:0/4215591275 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f00941a42e0 con 0x7f00941a0b30 2026-03-08T23:59:17.718 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.528+0000 7f00837fe700 1 -- 192.168.123.103:0/4215591275 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f00840108d0 con 0x7f00941a0b30 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.528+0000 7f00837fe700 1 -- 192.168.123.103:0/4215591275 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0084020750 con 0x7f00941a0b30 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.528+0000 7f009359e700 1 -- 192.168.123.103:0/4215591275 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f009404f000 con 0x7f00941a0b30 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.530+0000 7f00837fe700 1 -- 192.168.123.103:0/4215591275 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f0084010a40 con 0x7f00941a0b30 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.530+0000 7f00837fe700 1 --2- 192.168.123.103:0/4215591275 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f007c0383a0 0x7f007c03a860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.530+0000 7f00837fe700 1 -- 192.168.123.103:0/4215591275 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f008404c580 con 0x7f00941a0b30 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.530+0000 7f0091d9b700 1 --2- 192.168.123.103:0/4215591275 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f007c0383a0 0x7f007c03a860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.530+0000 7f0091d9b700 1 --2- 192.168.123.103:0/4215591275 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f007c0383a0 0x7f007c03a860 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f0088006fd0 tx=0x7f0088006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.532+0000 7f00837fe700 1 -- 192.168.123.103:0/4215591275 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f008400f550 con 0x7f00941a0b30 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.640+0000 7f009359e700 1 -- 192.168.123.103:0/4215591275 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}) v1 -- 0x7f00941a1890 con 0x7f007c0383a0 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.647+0000 7f00837fe700 1 -- 192.168.123.103:0/4215591275 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== 
mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+31 (secure 0 0 0) 0x7f00941a1890 con 0x7f007c0383a0 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.650+0000 7f00817fa700 1 -- 192.168.123.103:0/4215591275 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f007c0383a0 msgr2=0x7f007c03a860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.650+0000 7f00817fa700 1 --2- 192.168.123.103:0/4215591275 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f007c0383a0 0x7f007c03a860 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f0088006fd0 tx=0x7f0088006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.650+0000 7f00817fa700 1 -- 192.168.123.103:0/4215591275 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00941a0b30 msgr2=0x7f00941a0f50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.650+0000 7f00817fa700 1 --2- 192.168.123.103:0/4215591275 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00941a0b30 0x7f00941a0f50 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f0084000c00 tx=0x7f0084004160 comp rx=0 tx=0).stop 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.650+0000 7f00817fa700 1 -- 192.168.123.103:0/4215591275 shutdown_connections 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.650+0000 7f00817fa700 1 --2- 192.168.123.103:0/4215591275 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f007c0383a0 0x7f007c03a860 secure :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f0088006fd0 tx=0x7f0088006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.650+0000 7f00817fa700 1 --2- 192.168.123.103:0/4215591275 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00941a0b30 0x7f00941a0f50 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.650+0000 7f00817fa700 1 -- 192.168.123.103:0/4215591275 >> 192.168.123.103:0/4215591275 conn(0x7f009406d320 msgr2=0x7f009406e090 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.651+0000 7f00817fa700 1 -- 192.168.123.103:0/4215591275 shutdown_connections 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.651+0000 7f00817fa700 1 -- 192.168.123.103:0/4215591275 wait complete. 2026-03-08T23:59:17.718 INFO:teuthology.orchestra.run.vm03.stdout:Deploying grafana service with default placement... 
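Throughout these lines the `==== F+M+D ====` triple is the message's front, middle, and data segment sizes in bytes; the `72+0+180422` on each `get_command_descriptions` ack is the roughly 180 KB command table that every short-lived CLI run downloads again in its data segment. A throwaway extractor, if you want to total the bytes a given invocation pulled:

```python
import re

# Matches the "==== 72+0+180422 " style size triple on message lines.
SIZES = re.compile(r"==== (\d+)\+(\d+)\+(\d+) ")

def payload_bytes(line: str) -> int | None:
    # front + middle + data, e.g. 72+0+180422 -> 180494 bytes.
    m = SIZES.search(line)
    return sum(map(int, m.groups())) if m else None
```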
2026-03-08T23:59:17.975 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:17 vm03 ceph-mon[52346]: from='client.14142 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}]: dispatch 2026-03-08T23:59:17.975 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:17 vm03 ceph-mon[52346]: Saving service mon spec with placement count:5 2026-03-08T23:59:17.975 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:17 vm03 ceph-mon[52346]: from='client.14144 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}]: dispatch 2026-03-08T23:59:17.975 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:17 vm03 ceph-mon[52346]: Saving service mgr spec with placement count:2 2026-03-08T23:59:17.975 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:17 vm03 ceph-mon[52346]: from='client.14146 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}]: dispatch 2026-03-08T23:59:17.975 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:17 vm03 ceph-mon[52346]: Saving service crash spec with placement * 2026-03-08T23:59:17.975 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:17 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:17.976 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:17 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:17.976 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:17 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled grafana update... 
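The mon journal above confirms the specs were persisted: mon with placement count:5, mgr with count:2, and crash on all hosts (`*`). The suite later checks this itself via its `cephadm.shell` task (`ceph orch ls`); a sketch of the same verification done directly, assuming a live cluster with an admin keyring:

```python
import json
import subprocess

def orch_ls() -> list[dict]:
    out = subprocess.run(
        ["ceph", "orch", "ls", "--format", "json"],
        check=True, capture_output=True, text=True,
    ).stdout
    return json.loads(out)

for spec in orch_ls():
    # Expect mon -> count:5, mgr -> count:2, crash -> all hosts ('*'),
    # matching the "Saving service ... spec" journal lines above.
    print(spec.get("service_type"), spec.get("placement"))
```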
2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.857+0000 7f8ee595a700 1 Processor -- start 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.858+0000 7f8ee595a700 1 -- start start 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.858+0000 7f8ee595a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ee0072b70 0x7f8ee010fdc0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.858+0000 7f8ee595a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ee0110390 con 0x7f8ee0072b70 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.859+0000 7f8edeffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ee0072b70 0x7f8ee010fdc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.859+0000 7f8edeffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ee0072b70 0x7f8ee010fdc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60122/0 (socket says 192.168.123.103:60122) 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.859+0000 7f8edeffd700 1 -- 192.168.123.103:0/1209426661 learned_addr learned my addr 192.168.123.103:0/1209426661 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.859+0000 7f8edeffd700 1 -- 192.168.123.103:0/1209426661 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8ee01104d0 con 0x7f8ee0072b70 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.859+0000 7f8edeffd700 1 --2- 192.168.123.103:0/1209426661 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ee0072b70 0x7f8ee010fdc0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f8ed400ab30 tx=0x7f8ed4010730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=ae1d904df5666056 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.861+0000 7f8eddffb700 1 -- 192.168.123.103:0/1209426661 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8ed4010e00 con 0x7f8ee0072b70 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.861+0000 7f8eddffb700 1 -- 192.168.123.103:0/1209426661 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8ed4004510 con 0x7f8ee0072b70 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.861+0000 7f8eddffb700 1 -- 192.168.123.103:0/1209426661 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8ed401a640 con 0x7f8ee0072b70 
2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.862+0000 7f8ee595a700 1 -- 192.168.123.103:0/1209426661 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ee0072b70 msgr2=0x7f8ee010fdc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.862+0000 7f8ee595a700 1 --2- 192.168.123.103:0/1209426661 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ee0072b70 0x7f8ee010fdc0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f8ed400ab30 tx=0x7f8ed4010730 comp rx=0 tx=0).stop 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.862+0000 7f8ee595a700 1 -- 192.168.123.103:0/1209426661 shutdown_connections 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.862+0000 7f8ee595a700 1 --2- 192.168.123.103:0/1209426661 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ee0072b70 0x7f8ee010fdc0 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.862+0000 7f8ee595a700 1 -- 192.168.123.103:0/1209426661 >> 192.168.123.103:0/1209426661 conn(0x7f8ee006d660 msgr2=0x7f8ee006fac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.864+0000 7f8ee595a700 1 -- 192.168.123.103:0/1209426661 shutdown_connections 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.864+0000 7f8ee595a700 1 -- 192.168.123.103:0/1209426661 wait complete. 
2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.864+0000 7f8ee595a700 1 Processor -- start 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.864+0000 7f8ee595a700 1 -- start start 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.864+0000 7f8ee595a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ee01a9530 0x7f8ee01a9950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.864+0000 7f8ee595a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ee01a9e90 con 0x7f8ee01a9530 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.865+0000 7f8edeffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ee01a9530 0x7f8ee01a9950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.865+0000 7f8edeffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ee01a9530 0x7f8ee01a9950 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60132/0 (socket says 192.168.123.103:60132) 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.865+0000 7f8edeffd700 1 -- 192.168.123.103:0/2315291499 learned_addr learned my addr 192.168.123.103:0/2315291499 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.865+0000 7f8edeffd700 1 -- 192.168.123.103:0/2315291499 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8ed400a7e0 con 0x7f8ee01a9530 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.865+0000 7f8edeffd700 1 --2- 192.168.123.103:0/2315291499 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ee01a9530 0x7f8ee01a9950 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f8ed400bbd0 tx=0x7f8ed40042e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.866+0000 7f8ee4958700 1 -- 192.168.123.103:0/2315291499 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8ed4010e00 con 0x7f8ee01a9530 2026-03-08T23:59:18.036 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.866+0000 7f8ee595a700 1 -- 192.168.123.103:0/2315291499 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8ee01aa090 con 0x7f8ee01a9530 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.866+0000 7f8ee595a700 1 -- 192.168.123.103:0/2315291499 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8ee01acce0 con 0x7f8ee01a9530 2026-03-08T23:59:18.037 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.866+0000 7f8ee595a700 1 -- 192.168.123.103:0/2315291499 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8ee00623c0 con 0x7f8ee01a9530 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.869+0000 7f8ee4958700 1 -- 192.168.123.103:0/2315291499 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8ed400f070 con 0x7f8ee01a9530 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.870+0000 7f8ee4958700 1 -- 192.168.123.103:0/2315291499 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8ed40095d0 con 0x7f8ee01a9530 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.870+0000 7f8ee4958700 1 -- 192.168.123.103:0/2315291499 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f8ed4018070 con 0x7f8ee01a9530 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.870+0000 7f8ee4958700 1 --2- 192.168.123.103:0/2315291499 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8ec8038440 0x7f8ec803a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.870+0000 7f8ede7fc700 1 --2- 192.168.123.103:0/2315291499 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8ec8038440 0x7f8ec803a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.871+0000 7f8ede7fc700 1 --2- 192.168.123.103:0/2315291499 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8ec8038440 0x7f8ec803a900 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f8ed800ad80 tx=0x7f8ed80093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.872+0000 7f8ee4958700 1 -- 192.168.123.103:0/2315291499 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f8ed4031080 con 0x7f8ee01a9530 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.872+0000 7f8ee4958700 1 -- 192.168.123.103:0/2315291499 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f8ed404c580 con 0x7f8ee01a9530 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.984+0000 7f8ee595a700 1 -- 192.168.123.103:0/2315291499 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}) v1 -- 0x7f8ee006fba0 con 0x7f8ec8038440 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.988+0000 7f8ee4958700 1 -- 192.168.123.103:0/2315291499 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== 
mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+28 (secure 0 0 0) 0x7f8ee006fba0 con 0x7f8ec8038440 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.995+0000 7f8ee595a700 1 -- 192.168.123.103:0/2315291499 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8ec8038440 msgr2=0x7f8ec803a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.995+0000 7f8ee595a700 1 --2- 192.168.123.103:0/2315291499 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8ec8038440 0x7f8ec803a900 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f8ed800ad80 tx=0x7f8ed80093f0 comp rx=0 tx=0).stop 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.995+0000 7f8ee595a700 1 -- 192.168.123.103:0/2315291499 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ee01a9530 msgr2=0x7f8ee01a9950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.995+0000 7f8ee595a700 1 --2- 192.168.123.103:0/2315291499 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ee01a9530 0x7f8ee01a9950 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f8ed400bbd0 tx=0x7f8ed40042e0 comp rx=0 tx=0).stop 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.995+0000 7f8ee595a700 1 -- 192.168.123.103:0/2315291499 shutdown_connections 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.995+0000 7f8ee595a700 1 --2- 192.168.123.103:0/2315291499 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8ec8038440 0x7f8ec803a900 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.995+0000 7f8ee595a700 1 --2- 192.168.123.103:0/2315291499 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8ee01a9530 0x7f8ee01a9950 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.995+0000 7f8ee595a700 1 -- 192.168.123.103:0/2315291499 >> 192.168.123.103:0/2315291499 conn(0x7f8ee006d660 msgr2=0x7f8ee006f490 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.995+0000 7f8ee595a700 1 -- 192.168.123.103:0/2315291499 shutdown_connections 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:17.995+0000 7f8ee595a700 1 -- 192.168.123.103:0/2315291499 wait complete. 2026-03-08T23:59:18.037 INFO:teuthology.orchestra.run.vm03.stdout:Deploying node-exporter service with default placement... 2026-03-08T23:59:18.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled node-exporter update... 
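Every connection in this excerpt walks the same msgr2 client states, visible in the `s=` field: NONE, BANNER_CONNECTING, HELLO_CONNECTING, READY, then CLOSED on teardown (some connections, e.g. to the mgr, log fewer intermediate states at this debug level). Purely illustrative — listing only the states that appear here — this is the ordering a log scanner could enforce:

```python
from enum import Enum

class Msgr2State(Enum):
    # Only the states visible in this excerpt's "s=" fields.
    NONE = 0
    BANNER_CONNECTING = 1
    HELLO_CONNECTING = 2
    READY = 3
    CLOSED = 4

ORDER = [s.name for s in Msgr2State]

def is_monotonic(states_seen: list[str]) -> bool:
    # A healthy client connection only moves forward (skips are fine,
    # regressions are not).
    idx = [ORDER.index(s) for s in states_seen if s in ORDER]
    return idx == sorted(idx)
```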
2026-03-08T23:59:18.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.176+0000 7f72949cb700 1 Processor -- start 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.176+0000 7f72949cb700 1 -- start start 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.176+0000 7f72949cb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f728c071e80 0x7f728c0722a0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.176+0000 7f72949cb700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f728c072870 con 0x7f728c071e80 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.176+0000 7f7292767700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f728c071e80 0x7f728c0722a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.176+0000 7f7292767700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f728c071e80 0x7f728c0722a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60140/0 (socket says 192.168.123.103:60140) 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.176+0000 7f7292767700 1 -- 192.168.123.103:0/4084800227 learned_addr learned my addr 192.168.123.103:0/4084800227 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.177+0000 7f7292767700 1 -- 192.168.123.103:0/4084800227 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f728c0729b0 con 0x7f728c071e80 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.177+0000 7f7292767700 1 --2- 192.168.123.103:0/4084800227 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f728c071e80 0x7f728c0722a0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f72800098d0 tx=0x7f7280009be0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=b21cffa25334d05d server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.177+0000 7f7291765700 1 -- 192.168.123.103:0/4084800227 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7280004030 con 0x7f728c071e80 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.177+0000 7f7291765700 1 -- 192.168.123.103:0/4084800227 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f728000c8f0 con 0x7f728c071e80 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.177+0000 7f7291765700 1 -- 192.168.123.103:0/4084800227 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7280003bf0 con 0x7f728c071e80 
2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.178+0000 7f72949cb700 1 -- 192.168.123.103:0/4084800227 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f728c071e80 msgr2=0x7f728c0722a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.178+0000 7f72949cb700 1 --2- 192.168.123.103:0/4084800227 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f728c071e80 0x7f728c0722a0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f72800098d0 tx=0x7f7280009be0 comp rx=0 tx=0).stop 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.178+0000 7f72949cb700 1 -- 192.168.123.103:0/4084800227 shutdown_connections 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.178+0000 7f72949cb700 1 --2- 192.168.123.103:0/4084800227 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f728c071e80 0x7f728c0722a0 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.178+0000 7f72949cb700 1 -- 192.168.123.103:0/4084800227 >> 192.168.123.103:0/4084800227 conn(0x7f728c06d660 msgr2=0x7f728c06fac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.179+0000 7f72949cb700 1 -- 192.168.123.103:0/4084800227 shutdown_connections 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.179+0000 7f72949cb700 1 -- 192.168.123.103:0/4084800227 wait complete. 
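All four applies in this excerpt come back as `mgr_command_reply(tid 0: 0 )`, i.e. retcode 0; a rejected `orch apply` would surface as a non-zero code there before the CLI exits. A tiny grep-style sketch over the log text to catch that case:

```python
import re

# Matches "mgr_command_reply(tid 0: 0 )" -> (tid, retcode).
REPLY = re.compile(r"mgr_command_reply\(tid (\d+): (-?\d+) ")

def failed_replies(log_text: str) -> list[tuple[str, str]]:
    # Non-zero retcodes mean the mgr rejected the command.
    return [(tid, rc) for tid, rc in REPLY.findall(log_text) if rc != "0"]
```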
2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.179+0000 7f72949cb700 1 Processor -- start 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.179+0000 7f72949cb700 1 -- start start 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.179+0000 7f72949cb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f728c071e80 0x7f728c086f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.179+0000 7f72949cb700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f728c087490 con 0x7f728c071e80 2026-03-08T23:59:18.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.180+0000 7f7292767700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f728c071e80 0x7f728c086f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.180+0000 7f7292767700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f728c071e80 0x7f728c086f50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60154/0 (socket says 192.168.123.103:60154) 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.180+0000 7f7292767700 1 -- 192.168.123.103:0/3133075807 learned_addr learned my addr 192.168.123.103:0/3133075807 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.180+0000 7f7292767700 1 -- 192.168.123.103:0/3133075807 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7280009580 con 0x7f728c071e80 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.180+0000 7f7292767700 1 --2- 192.168.123.103:0/3133075807 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f728c071e80 0x7f728c086f50 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f72800038f0 tx=0x7f7280004140 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.180+0000 7f727f7fe700 1 -- 192.168.123.103:0/3133075807 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f72800044b0 con 0x7f728c071e80 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.180+0000 7f727f7fe700 1 -- 192.168.123.103:0/3133075807 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f728000c020 con 0x7f728c071e80 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.180+0000 7f727f7fe700 1 -- 192.168.123.103:0/3133075807 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f72800185f0 con 0x7f728c071e80 2026-03-08T23:59:18.414 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.180+0000 7f72949cb700 1 -- 192.168.123.103:0/3133075807 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f728c087690 con 0x7f728c071e80 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.180+0000 7f72949cb700 1 -- 192.168.123.103:0/3133075807 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f728c087ab0 con 0x7f728c071e80 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.181+0000 7f72949cb700 1 -- 192.168.123.103:0/3133075807 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f728c04fa20 con 0x7f728c071e80 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.185+0000 7f727f7fe700 1 -- 192.168.123.103:0/3133075807 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f7280010bf0 con 0x7f728c071e80 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.185+0000 7f727f7fe700 1 --2- 192.168.123.103:0/3133075807 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f727803c810 0x7f727803ecd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.185+0000 7f727f7fe700 1 -- 192.168.123.103:0/3133075807 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f728004c070 con 0x7f728c071e80 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.185+0000 7f727f7fe700 1 -- 192.168.123.103:0/3133075807 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f728004c4a0 con 0x7f728c071e80 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.185+0000 7f7291f66700 1 --2- 192.168.123.103:0/3133075807 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f727803c810 0x7f727803ecd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.187+0000 7f7291f66700 1 --2- 192.168.123.103:0/3133075807 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f727803c810 0x7f727803ecd0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f728800ad30 tx=0x7f72880093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.299+0000 7f72949cb700 1 -- 192.168.123.103:0/3133075807 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7f728c06fa20 con 0x7f727803c810 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.369+0000 7f727f7fe700 1 -- 192.168.123.103:0/3133075807 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== 
mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7f728c06fa20 con 0x7f727803c810 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.371+0000 7f727d7fa700 1 -- 192.168.123.103:0/3133075807 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f727803c810 msgr2=0x7f727803ecd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.371+0000 7f727d7fa700 1 --2- 192.168.123.103:0/3133075807 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f727803c810 0x7f727803ecd0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f728800ad30 tx=0x7f72880093f0 comp rx=0 tx=0).stop 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.371+0000 7f727d7fa700 1 -- 192.168.123.103:0/3133075807 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f728c071e80 msgr2=0x7f728c086f50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.371+0000 7f727d7fa700 1 --2- 192.168.123.103:0/3133075807 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f728c071e80 0x7f728c086f50 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f72800038f0 tx=0x7f7280004140 comp rx=0 tx=0).stop 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.371+0000 7f727d7fa700 1 -- 192.168.123.103:0/3133075807 shutdown_connections 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.371+0000 7f727d7fa700 1 --2- 192.168.123.103:0/3133075807 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f727803c810 0x7f727803ecd0 secure :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f728800ad30 tx=0x7f72880093f0 comp rx=0 tx=0).stop 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.371+0000 7f727d7fa700 1 --2- 192.168.123.103:0/3133075807 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f728c071e80 0x7f728c086f50 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.372+0000 7f727d7fa700 1 -- 192.168.123.103:0/3133075807 >> 192.168.123.103:0/3133075807 conn(0x7f728c06d660 msgr2=0x7f728c06f310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.372+0000 7f727d7fa700 1 -- 192.168.123.103:0/3133075807 shutdown_connections 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.372+0000 7f727d7fa700 1 -- 192.168.123.103:0/3133075807 wait complete. 2026-03-08T23:59:18.414 INFO:teuthology.orchestra.run.vm03.stdout:Deploying alertmanager service with default placement... 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Scheduled alertmanager update... 
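The mgr_command payload earlier in this trace ({"prefix": "orch apply", "service_type": "node-exporter", ...}) and the "Scheduled alertmanager update..." stdout line are the wire form of ordinary orchestrator CLI calls. A sketch of the equivalent manual sequence, with service names taken from the payloads above and placements from the mon log further below:

    # What the harness is effectively issuing against the active mgr:
    ceph orch apply node-exporter    # spec saved with placement '*'
    ceph orch apply alertmanager     # spec saved with placement count:1
    ceph orch ls                     # confirm the specs were saved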
2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.570+0000 7f1ac7350700 1 Processor -- start 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.571+0000 7f1ac7350700 1 -- start start 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.571+0000 7f1ac7350700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ab80a4de0 0x7f1ab80a5200 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.571+0000 7f1ac7350700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1ab80a57d0 con 0x7f1ab80a4de0 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.571+0000 7f1ac50ec700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ab80a4de0 0x7f1ab80a5200 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.572+0000 7f1ac50ec700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ab80a4de0 0x7f1ab80a5200 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60168/0 (socket says 192.168.123.103:60168) 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.572+0000 7f1ac50ec700 1 -- 192.168.123.103:0/768476321 learned_addr learned my addr 192.168.123.103:0/768476321 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.572+0000 7f1ac50ec700 1 -- 192.168.123.103:0/768476321 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1ab80a3e30 con 0x7f1ab80a4de0 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.572+0000 7f1ac50ec700 1 --2- 192.168.123.103:0/768476321 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ab80a4de0 0x7f1ab80a5200 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f1abc00bd30 tx=0x7f1abc00d6d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=37457241838b476d server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.572+0000 7f1ab3fff700 1 -- 192.168.123.103:0/768476321 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1abc00be80 con 0x7f1ab80a4de0 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.572+0000 7f1ab3fff700 1 -- 192.168.123.103:0/768476321 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1abc004510 con 0x7f1ab80a4de0 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.573+0000 7f1ac7350700 1 -- 192.168.123.103:0/768476321 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ab80a4de0 msgr2=0x7f1ab80a5200 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.573+0000 7f1ac7350700 1 --2- 192.168.123.103:0/768476321 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ab80a4de0 0x7f1ab80a5200 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f1abc00bd30 tx=0x7f1abc00d6d0 comp rx=0 tx=0).stop 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.573+0000 7f1ac7350700 1 -- 192.168.123.103:0/768476321 shutdown_connections 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.573+0000 7f1ac7350700 1 --2- 192.168.123.103:0/768476321 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ab80a4de0 0x7f1ab80a5200 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.573+0000 7f1ac7350700 1 -- 192.168.123.103:0/768476321 >> 192.168.123.103:0/768476321 conn(0x7f1ab809ff10 msgr2=0x7f1ab80a2370 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.574+0000 7f1ac7350700 1 -- 192.168.123.103:0/768476321 shutdown_connections 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.574+0000 7f1ac7350700 1 -- 192.168.123.103:0/768476321 wait complete. 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.574+0000 7f1ac7350700 1 Processor -- start 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.574+0000 7f1ac7350700 1 -- start start 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.574+0000 7f1ac7350700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ab80a4de0 0x7f1ab8138620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.574+0000 7f1ac7350700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1ab8138b60 con 0x7f1ab80a4de0 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.575+0000 7f1ac50ec700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ab80a4de0 0x7f1ab8138620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.575+0000 7f1ac50ec700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ab80a4de0 0x7f1ab8138620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60174/0 (socket says 192.168.123.103:60174) 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.575+0000 7f1ac50ec700 1 -- 192.168.123.103:0/1754065156 learned_addr learned my addr 192.168.123.103:0/1754065156 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-08T23:59:18.575+0000 7f1ac50ec700 1 -- 192.168.123.103:0/1754065156 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1abc00b9e0 con 0x7f1ab80a4de0 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.575+0000 7f1ac50ec700 1 --2- 192.168.123.103:0/1754065156 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ab80a4de0 0x7f1ab8138620 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f1abc000c00 tx=0x7f1abc017740 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:19.095 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.576+0000 7f1ab27fc700 1 -- 192.168.123.103:0/1754065156 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1abc009040 con 0x7f1ab80a4de0 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.576+0000 7f1ab27fc700 1 -- 192.168.123.103:0/1754065156 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1abc003a00 con 0x7f1ab80a4de0 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.576+0000 7f1ac7350700 1 -- 192.168.123.103:0/1754065156 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1ab8138d60 con 0x7f1ab80a4de0 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.576+0000 7f1ac7350700 1 -- 192.168.123.103:0/1754065156 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1ab8139260 con 0x7f1ab80a4de0 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.577+0000 7f1ab27fc700 1 -- 192.168.123.103:0/1754065156 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1abc017d00 con 0x7f1ab80a4de0 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.577+0000 7f1ab27fc700 1 -- 192.168.123.103:0/1754065156 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f1abc015070 con 0x7f1ab80a4de0 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.577+0000 7f1ac7350700 1 -- 192.168.123.103:0/1754065156 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1ab8004b30 con 0x7f1ab80a4de0 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.577+0000 7f1ab27fc700 1 --2- 192.168.123.103:0/1754065156 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1aac03c730 0x7f1aac03ebf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.578+0000 7f1ab27fc700 1 -- 192.168.123.103:0/1754065156 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f1abc04bba0 con 0x7f1ab80a4de0 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.585+0000 7f1ac48eb700 1 --2- 192.168.123.103:0/1754065156 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1aac03c730 0x7f1aac03ebf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.586+0000 7f1ac48eb700 1 --2- 192.168.123.103:0/1754065156 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1aac03c730 0x7f1aac03ebf0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f1ab4006fd0 tx=0x7f1ab4006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.586+0000 7f1ab27fc700 1 -- 192.168.123.103:0/1754065156 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f1ab8004b30 con 0x7f1ab80a4de0 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:18.703+0000 7f1ac7350700 1 -- 192.168.123.103:0/1754065156 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}) v1 -- 0x7f1ab8004ac0 con 0x7f1aac03c730 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.033+0000 7f1ab27fc700 1 -- 192.168.123.103:0/1754065156 <== mgr.14120 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+33 (secure 0 0 0) 0x7f1ab8004ac0 con 0x7f1aac03c730 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.037+0000 7f1ac7350700 1 -- 192.168.123.103:0/1754065156 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1aac03c730 msgr2=0x7f1aac03ebf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.037+0000 7f1ac7350700 1 --2- 192.168.123.103:0/1754065156 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1aac03c730 0x7f1aac03ebf0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f1ab4006fd0 tx=0x7f1ab4006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.037+0000 7f1ac7350700 1 -- 192.168.123.103:0/1754065156 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ab80a4de0 msgr2=0x7f1ab8138620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.037+0000 7f1ac7350700 1 --2- 192.168.123.103:0/1754065156 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ab80a4de0 0x7f1ab8138620 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f1abc000c00 tx=0x7f1abc017740 comp rx=0 tx=0).stop 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.037+0000 7f1ac7350700 1 -- 192.168.123.103:0/1754065156 shutdown_connections 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.037+0000 7f1ac7350700 1 --2- 192.168.123.103:0/1754065156 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1aac03c730 0x7f1aac03ebf0 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.037+0000 7f1ac7350700 1 --2- 192.168.123.103:0/1754065156 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ab80a4de0 0x7f1ab8138620 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.037+0000 7f1ac7350700 1 -- 192.168.123.103:0/1754065156 >> 192.168.123.103:0/1754065156 conn(0x7f1ab809ff10 msgr2=0x7f1ab80a0ba0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.037+0000 7f1ac7350700 1 -- 192.168.123.103:0/1754065156 shutdown_connections 2026-03-08T23:59:19.096 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.037+0000 7f1ac7350700 1 -- 192.168.123.103:0/1754065156 wait complete. 2026-03-08T23:59:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:19 vm03 ceph-mon[52346]: from='client.14148 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}]: dispatch 2026-03-08T23:59:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:19 vm03 ceph-mon[52346]: Saving service ceph-exporter spec with placement * 2026-03-08T23:59:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:19 vm03 ceph-mon[52346]: from='client.14150 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}]: dispatch 2026-03-08T23:59:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:19 vm03 ceph-mon[52346]: Saving service prometheus spec with placement count:1 2026-03-08T23:59:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:19 vm03 ceph-mon[52346]: from='client.14152 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}]: dispatch 2026-03-08T23:59:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:19 vm03 ceph-mon[52346]: Saving service grafana spec with placement count:1 2026-03-08T23:59:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:19 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:19 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:19 vm03 ceph-mon[52346]: from='client.14154 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}]: dispatch 2026-03-08T23:59:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:19 vm03 ceph-mon[52346]: Saving service node-exporter spec with placement * 2026-03-08T23:59:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:19 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:19 vm03 ceph-mon[52346]: from='client.14156 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}]: dispatch 2026-03-08T23:59:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:19 vm03 ceph-mon[52346]: Saving service alertmanager spec with placement count:1 2026-03-08T23:59:19.411 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.232+0000 7f7572f7c700 1 Processor -- start 2026-03-08T23:59:19.411 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.233+0000 7f7572f7c700 1 -- start start 2026-03-08T23:59:19.411 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.233+0000 7f7572f7c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f756c1071c0 0x7f756c1095b0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:19.411 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.233+0000 7f7572f7c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f756c074720 con 0x7f756c1071c0 2026-03-08T23:59:19.411 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.233+0000 7f7570d18700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f756c1071c0 0x7f756c1095b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:19.411 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.234+0000 7f7570d18700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f756c1071c0 0x7f756c1095b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60176/0 (socket says 192.168.123.103:60176) 2026-03-08T23:59:19.411 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.234+0000 7f7570d18700 1 -- 192.168.123.103:0/4080957637 learned_addr learned my addr 192.168.123.103:0/4080957637 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:19.411 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.234+0000 7f7570d18700 1 -- 192.168.123.103:0/4080957637 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f756c109af0 con 0x7f756c1071c0 2026-03-08T23:59:19.411 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.234+0000 7f7570d18700 1 --2- 192.168.123.103:0/4080957637 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f756c1071c0 0x7f756c1095b0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f755c009a90 tx=0x7f755c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=5db2200d3be5d4d server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:19.411 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.234+0000 7f756b7fe700 1 -- 192.168.123.103:0/4080957637 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f755c004030 con 0x7f756c1071c0 2026-03-08T23:59:19.411 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.234+0000 7f756b7fe700 1 -- 192.168.123.103:0/4080957637 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f755c00b7e0 con 0x7f756c1071c0 2026-03-08T23:59:19.411 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.235+0000 7f756b7fe700 1 -- 192.168.123.103:0/4080957637 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f755c003ae0 con 0x7f756c1071c0 2026-03-08T23:59:19.411 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.235+0000 7f7572f7c700 1 -- 192.168.123.103:0/4080957637 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f756c1071c0 msgr2=0x7f756c1095b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:19.411 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.235+0000 7f7572f7c700 1 --2- 192.168.123.103:0/4080957637 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f756c1071c0 0x7f756c1095b0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f755c009a90 tx=0x7f755c009da0 comp rx=0 tx=0).stop 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.235+0000 7f7572f7c700 1 -- 192.168.123.103:0/4080957637 shutdown_connections 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.235+0000 7f7572f7c700 1 --2- 192.168.123.103:0/4080957637 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f756c1071c0 0x7f756c1095b0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.235+0000 7f7572f7c700 1 -- 192.168.123.103:0/4080957637 >> 192.168.123.103:0/4080957637 conn(0x7f756c100bd0 msgr2=0x7f756c103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.236+0000 7f7572f7c700 1 -- 192.168.123.103:0/4080957637 shutdown_connections 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.236+0000 7f7572f7c700 1 -- 192.168.123.103:0/4080957637 wait complete. 
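Each /usr/bin/ceph run above performs the full client bootstrap before its single command: msgr2 banner and hello exchange, learned_addr, mon_subscribe({config=0+,monmap=0+}), then get_command_descriptions. When reproducing a long series of such calls by hand, batching them into one cephadm shell session avoids paying that handshake (and a container start) per command; a sketch, assuming the admin keyring is on the host as in this roleless deployment:

    # One shell session instead of one container per command:
    cephadm shell -- bash -c '
      ceph orch apply prometheus
      ceph orch apply grafana
    '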
2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.236+0000 7f7572f7c700 1 Processor -- start 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.237+0000 7f7572f7c700 1 -- start start 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.237+0000 7f7572f7c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f756c1071c0 0x7f756c19c650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.237+0000 7f7572f7c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f756c19cb90 con 0x7f756c1071c0 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.237+0000 7f7570d18700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f756c1071c0 0x7f756c19c650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.237+0000 7f7570d18700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f756c1071c0 0x7f756c19c650 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60186/0 (socket says 192.168.123.103:60186) 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.237+0000 7f7570d18700 1 -- 192.168.123.103:0/2463690230 learned_addr learned my addr 192.168.123.103:0/2463690230 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.237+0000 7f7570d18700 1 -- 192.168.123.103:0/2463690230 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f755c009740 con 0x7f756c1071c0 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.238+0000 7f7570d18700 1 --2- 192.168.123.103:0/2463690230 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f756c1071c0 0x7f756c19c650 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f755c000c00 tx=0x7f755c00bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.238+0000 7f7569ffb700 1 -- 192.168.123.103:0/2463690230 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f755c0041a0 con 0x7f756c1071c0 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.238+0000 7f7572f7c700 1 -- 192.168.123.103:0/2463690230 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f756c19cd90 con 0x7f756c1071c0 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.238+0000 7f7572f7c700 1 -- 192.168.123.103:0/2463690230 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f756c19d1b0 con 0x7f756c1071c0 2026-03-08T23:59:19.412 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.238+0000 7f7569ffb700 1 -- 192.168.123.103:0/2463690230 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f755c004300 con 0x7f756c1071c0 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.238+0000 7f7569ffb700 1 -- 192.168.123.103:0/2463690230 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f755c011550 con 0x7f756c1071c0 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.239+0000 7f7569ffb700 1 -- 192.168.123.103:0/2463690230 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f755c0116b0 con 0x7f756c1071c0 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.239+0000 7f7569ffb700 1 --2- 192.168.123.103:0/2463690230 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f75540383f0 0x7f755403a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.239+0000 7f7569ffb700 1 -- 192.168.123.103:0/2463690230 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f755c04d150 con 0x7f756c1071c0 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.239+0000 7f756bfff700 1 --2- 192.168.123.103:0/2463690230 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f75540383f0 0x7f755403a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.240+0000 7f7572f7c700 1 -- 192.168.123.103:0/2463690230 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7558005320 con 0x7f756c1071c0 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.241+0000 7f756bfff700 1 --2- 192.168.123.103:0/2463690230 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f75540383f0 0x7f755403a8b0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f7560006fd0 tx=0x7f7560006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.242+0000 7f7569ffb700 1 -- 192.168.123.103:0/2463690230 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f755c011960 con 0x7f756c1071c0 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.348+0000 7f7572f7c700 1 -- 192.168.123.103:0/2463690230 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/container_init}] v 0) v1 -- 0x7f7558005190 con 0x7f756c1071c0 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.354+0000 7f7569ffb700 1 -- 192.168.123.103:0/2463690230 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, 
name=mgr/cephadm/container_init}]=0 v7) v1 ==== 142+0+0 (secure 0 0 0) 0x7f755c018b40 con 0x7f756c1071c0 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.359+0000 7f7572f7c700 1 -- 192.168.123.103:0/2463690230 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f75540383f0 msgr2=0x7f755403a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.359+0000 7f7572f7c700 1 --2- 192.168.123.103:0/2463690230 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f75540383f0 0x7f755403a8b0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f7560006fd0 tx=0x7f7560006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.359+0000 7f7572f7c700 1 -- 192.168.123.103:0/2463690230 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f756c1071c0 msgr2=0x7f756c19c650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.359+0000 7f7572f7c700 1 --2- 192.168.123.103:0/2463690230 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f756c1071c0 0x7f756c19c650 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f755c000c00 tx=0x7f755c00bfa0 comp rx=0 tx=0).stop 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.359+0000 7f7572f7c700 1 -- 192.168.123.103:0/2463690230 shutdown_connections 2026-03-08T23:59:19.412 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.359+0000 7f7572f7c700 1 --2- 192.168.123.103:0/2463690230 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f75540383f0 0x7f755403a8b0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:19.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.359+0000 7f7572f7c700 1 --2- 192.168.123.103:0/2463690230 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f756c1071c0 0x7f756c19c650 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:19.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.359+0000 7f7572f7c700 1 -- 192.168.123.103:0/2463690230 >> 192.168.123.103:0/2463690230 conn(0x7f756c100bd0 msgr2=0x7f756c101820 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:19.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.359+0000 7f7572f7c700 1 -- 192.168.123.103:0/2463690230 shutdown_connections 2026-03-08T23:59:19.413 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.359+0000 7f7572f7c700 1 -- 192.168.123.103:0/2463690230 wait complete. 
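The mon_command round-trip above ([{prefix=config set, name=mgr/cephadm/container_init}], acknowledged at config version v7) is a plain "ceph config set"; the value is elided in the message dump, so the one below is only a placeholder:

    # CLI equivalent of the config-set acknowledged above
    # ("true" is a guess; the log does not show the value):
    ceph config set mgr mgr/cephadm/container_init true
    ceph config get mgr mgr/cephadm/container_init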
2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.527+0000 7fbe90e12700 1 Processor -- start 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.528+0000 7fbe90e12700 1 -- start start 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.528+0000 7fbe90e12700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe8c07ade0 0x7fbe8c079240 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.528+0000 7fbe90e12700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbe8c079810 con 0x7fbe8c07ade0 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.528+0000 7fbe8a59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe8c07ade0 0x7fbe8c079240 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.529+0000 7fbe8a59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe8c07ade0 0x7fbe8c079240 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60188/0 (socket says 192.168.123.103:60188) 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.529+0000 7fbe8a59c700 1 -- 192.168.123.103:0/724716943 learned_addr learned my addr 192.168.123.103:0/724716943 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.529+0000 7fbe8a59c700 1 -- 192.168.123.103:0/724716943 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbe8c079950 con 0x7fbe8c07ade0 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.529+0000 7fbe8a59c700 1 --2- 192.168.123.103:0/724716943 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe8c07ade0 0x7fbe8c079240 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fbe74009cf0 tx=0x7fbe7400b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=67f424c97b67719f server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.529+0000 7fbe8959a700 1 -- 192.168.123.103:0/724716943 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbe74004030 con 0x7fbe8c07ade0 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.529+0000 7fbe8959a700 1 -- 192.168.123.103:0/724716943 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbe7400b810 con 0x7fbe8c07ade0 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.530+0000 7fbe90e12700 1 -- 192.168.123.103:0/724716943 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe8c07ade0 msgr2=0x7fbe8c079240 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.530+0000 7fbe90e12700 1 --2- 192.168.123.103:0/724716943 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe8c07ade0 0x7fbe8c079240 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fbe74009cf0 tx=0x7fbe7400b0e0 comp rx=0 tx=0).stop 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.530+0000 7fbe90e12700 1 -- 192.168.123.103:0/724716943 shutdown_connections 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.530+0000 7fbe90e12700 1 --2- 192.168.123.103:0/724716943 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe8c07ade0 0x7fbe8c079240 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.530+0000 7fbe90e12700 1 -- 192.168.123.103:0/724716943 >> 192.168.123.103:0/724716943 conn(0x7fbe8c101ce0 msgr2=0x7fbe8c104140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.530+0000 7fbe90e12700 1 -- 192.168.123.103:0/724716943 shutdown_connections 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.530+0000 7fbe90e12700 1 -- 192.168.123.103:0/724716943 wait complete. 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.531+0000 7fbe90e12700 1 Processor -- start 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.531+0000 7fbe90e12700 1 -- start start 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.531+0000 7fbe90e12700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe8c1a0c70 0x7fbe8c1a1090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.531+0000 7fbe8a59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe8c1a0c70 0x7fbe8c1a1090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.532+0000 7fbe8a59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe8c1a0c70 0x7fbe8c1a1090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60202/0 (socket says 192.168.123.103:60202) 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.532+0000 7fbe8a59c700 1 -- 192.168.123.103:0/3039261959 learned_addr learned my addr 192.168.123.103:0/3039261959 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.532+0000 7fbe90e12700 1 -- 192.168.123.103:0/3039261959 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbe8c079810 con 0x7fbe8c1a0c70 2026-03-08T23:59:19.696 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.532+0000 7fbe8a59c700 1 -- 192.168.123.103:0/3039261959 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbe74009740 con 0x7fbe8c1a0c70 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.532+0000 7fbe8a59c700 1 --2- 192.168.123.103:0/3039261959 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe8c1a0c70 0x7fbe8c1a1090 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fbe74009cc0 tx=0x7fbe7400bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.532+0000 7fbe837fe700 1 -- 192.168.123.103:0/3039261959 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbe74003950 con 0x7fbe8c1a0c70 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.532+0000 7fbe837fe700 1 -- 192.168.123.103:0/3039261959 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbe740043c0 con 0x7fbe8c1a0c70 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.532+0000 7fbe837fe700 1 -- 192.168.123.103:0/3039261959 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbe7401ac80 con 0x7fbe8c1a0c70 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.532+0000 7fbe90e12700 1 -- 192.168.123.103:0/3039261959 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbe8c1a15d0 con 0x7fbe8c1a0c70 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.532+0000 7fbe90e12700 1 -- 192.168.123.103:0/3039261959 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbe8c1a41f0 con 0x7fbe8c1a0c70 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.533+0000 7fbe837fe700 1 -- 192.168.123.103:0/3039261959 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7fbe74011420 con 0x7fbe8c1a0c70 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.534+0000 7fbe837fe700 1 --2- 192.168.123.103:0/3039261959 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbe78038390 0x7fbe7803a850 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.534+0000 7fbe837fe700 1 -- 192.168.123.103:0/3039261959 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fbe7404c700 con 0x7fbe8c1a0c70 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.534+0000 7fbe89d9b700 1 --2- 192.168.123.103:0/3039261959 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbe78038390 0x7fbe7803a850 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:19.696 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.534+0000 7fbe90e12700 1 -- 192.168.123.103:0/3039261959 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbe8c0623c0 con 0x7fbe8c1a0c70 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.537+0000 7fbe837fe700 1 -- 192.168.123.103:0/3039261959 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fbe7401f020 con 0x7fbe8c1a0c70 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.537+0000 7fbe89d9b700 1 --2- 192.168.123.103:0/3039261959 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbe78038390 0x7fbe7803a850 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fbe7c006fd0 tx=0x7fbe7c006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:19.696 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.639+0000 7fbe90e12700 1 -- 192.168.123.103:0/3039261959 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=mgr/dashboard/ssl_server_port}] v 0) v1 -- 0x7fbe8c1a43e0 con 0x7fbe8c1a0c70 2026-03-08T23:59:19.697 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.645+0000 7fbe837fe700 1 -- 192.168.123.103:0/3039261959 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/dashboard/ssl_server_port}]=0 v8) v1 ==== 130+0+0 (secure 0 0 0) 0x7fbe7401f020 con 0x7fbe8c1a0c70 2026-03-08T23:59:19.697 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.649+0000 7fbe90e12700 1 -- 192.168.123.103:0/3039261959 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbe78038390 msgr2=0x7fbe7803a850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:19.697 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.649+0000 7fbe90e12700 1 --2- 192.168.123.103:0/3039261959 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbe78038390 0x7fbe7803a850 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fbe7c006fd0 tx=0x7fbe7c006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:19.697 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.650+0000 7fbe90e12700 1 -- 192.168.123.103:0/3039261959 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe8c1a0c70 msgr2=0x7fbe8c1a1090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:19.697 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.650+0000 7fbe90e12700 1 --2- 192.168.123.103:0/3039261959 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe8c1a0c70 0x7fbe8c1a1090 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fbe74009cc0 tx=0x7fbe7400bfa0 comp rx=0 tx=0).stop 2026-03-08T23:59:19.697 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.650+0000 7fbe90e12700 1 -- 192.168.123.103:0/3039261959 shutdown_connections 2026-03-08T23:59:19.697 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.650+0000 7fbe90e12700 1 --2- 192.168.123.103:0/3039261959 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] 
conn(0x7fbe78038390 0x7fbe7803a850 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:19.697 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.650+0000 7fbe90e12700 1 --2- 192.168.123.103:0/3039261959 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe8c1a0c70 0x7fbe8c1a1090 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:19.697 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.650+0000 7fbe90e12700 1 -- 192.168.123.103:0/3039261959 >> 192.168.123.103:0/3039261959 conn(0x7fbe8c101ce0 msgr2=0x7fbe8c1078c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:19.697 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.650+0000 7fbe90e12700 1 -- 192.168.123.103:0/3039261959 shutdown_connections 2026-03-08T23:59:19.697 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.650+0000 7fbe90e12700 1 -- 192.168.123.103:0/3039261959 wait complete. 2026-03-08T23:59:19.697 INFO:teuthology.orchestra.run.vm03.stdout:Enabling the dashboard module... 2026-03-08T23:59:20.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:20 vm03 ceph-mon[52346]: from='mgr.14120 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:20.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:20 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/2463690230' entity='client.admin' 2026-03-08T23:59:20.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:20 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/3039261959' entity='client.admin' 2026-03-08T23:59:20.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:20 vm03 ceph-mon[52346]: from='client.? 
192.168.123.103:0/3632159329' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch 2026-03-08T23:59:20.694 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.817+0000 7f514cd9e700 1 Processor -- start 2026-03-08T23:59:20.694 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.817+0000 7f514cd9e700 1 -- start start 2026-03-08T23:59:20.694 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.817+0000 7f514cd9e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5148108970 0x7f5148108d90 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:20.694 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.817+0000 7f514cd9e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5148109360 con 0x7f5148108970 2026-03-08T23:59:20.694 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.818+0000 7f514659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5148108970 0x7f5148108d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:20.694 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.818+0000 7f514659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5148108970 0x7f5148108d90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60212/0 (socket says 192.168.123.103:60212) 2026-03-08T23:59:20.694 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.818+0000 7f514659c700 1 -- 192.168.123.103:0/1175531353 learned_addr learned my addr 192.168.123.103:0/1175531353 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:20.694 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.818+0000 7f514659c700 1 -- 192.168.123.103:0/1175531353 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5148109b70 con 0x7f5148108970 2026-03-08T23:59:20.694 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.818+0000 7f514659c700 1 --2- 192.168.123.103:0/1175531353 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5148108970 0x7f5148108d90 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f5130009a90 tx=0x7f5130009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=7764f074aa3b2389 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.818+0000 7f514559a700 1 -- 192.168.123.103:0/1175531353 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5130004030 con 0x7f5148108970 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.819+0000 7f514559a700 1 -- 192.168.123.103:0/1175531353 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f513000b7e0 con 0x7f5148108970 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.819+0000 7f514cd9e700 1 -- 192.168.123.103:0/1175531353 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5148108970 msgr2=0x7f5148108d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.819+0000 7f514cd9e700 1 --2- 192.168.123.103:0/1175531353 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5148108970 0x7f5148108d90 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f5130009a90 tx=0x7f5130009da0 comp rx=0 tx=0).stop 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.819+0000 7f514cd9e700 1 -- 192.168.123.103:0/1175531353 shutdown_connections 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.819+0000 7f514cd9e700 1 --2- 192.168.123.103:0/1175531353 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5148108970 0x7f5148108d90 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.819+0000 7f514cd9e700 1 -- 192.168.123.103:0/1175531353 >> 192.168.123.103:0/1175531353 conn(0x7f514807be30 msgr2=0x7f51481064e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.819+0000 7f514cd9e700 1 -- 192.168.123.103:0/1175531353 shutdown_connections 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.819+0000 7f514cd9e700 1 -- 192.168.123.103:0/1175531353 wait complete. 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.820+0000 7f514cd9e700 1 Processor -- start 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.820+0000 7f514cd9e700 1 -- start start 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.820+0000 7f514cd9e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5148108970 0x7f514819c700 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.820+0000 7f514cd9e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5148109360 con 0x7f5148108970 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.820+0000 7f514659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5148108970 0x7f514819c700 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.820+0000 7f514659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5148108970 0x7f514819c700 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60218/0 (socket says 192.168.123.103:60218) 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.820+0000 7f514659c700 1 -- 192.168.123.103:0/3632159329 learned_addr learned my addr 
192.168.123.103:0/3632159329 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.821+0000 7f514659c700 1 -- 192.168.123.103:0/3632159329 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5130009740 con 0x7f5148108970 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.821+0000 7f514659c700 1 --2- 192.168.123.103:0/3632159329 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5148108970 0x7f514819c700 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f513000bf90 tx=0x7f5130003e50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.821+0000 7f513f7fe700 1 -- 192.168.123.103:0/3632159329 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5130004270 con 0x7f5148108970 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.821+0000 7f513f7fe700 1 -- 192.168.123.103:0/3632159329 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f51300043d0 con 0x7f5148108970 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.821+0000 7f513f7fe700 1 -- 192.168.123.103:0/3632159329 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5130011600 con 0x7f5148108970 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.821+0000 7f514cd9e700 1 -- 192.168.123.103:0/3632159329 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f514819cc40 con 0x7f5148108970 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.821+0000 7f514cd9e700 1 -- 192.168.123.103:0/3632159329 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f514819d060 con 0x7f5148108970 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.822+0000 7f513f7fe700 1 -- 192.168.123.103:0/3632159329 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f5130011760 con 0x7f5148108970 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.822+0000 7f513f7fe700 1 --2- 192.168.123.103:0/3632159329 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f51340383e0 0x7f513403a8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.822+0000 7f513f7fe700 1 -- 192.168.123.103:0/3632159329 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f513004d0b0 con 0x7f5148108970 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.822+0000 7f5145d9b700 1 --2- 192.168.123.103:0/3632159329 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f51340383e0 0x7f513403a8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload 
supported=3 required=0 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.823+0000 7f514cd9e700 1 -- 192.168.123.103:0/3632159329 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f514819d380 con 0x7f5148108970 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.826+0000 7f5145d9b700 1 --2- 192.168.123.103:0/3632159329 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f51340383e0 0x7f513403a8a0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f5138006fd0 tx=0x7f5138006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.826+0000 7f513f7fe700 1 -- 192.168.123.103:0/3632159329 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f5130011a10 con 0x7f5148108970 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:19.961+0000 7f514cd9e700 1 -- 192.168.123.103:0/3632159329 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "dashboard"} v 0) v1 -- 0x7f514804fa20 con 0x7f5148108970 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.648+0000 7f513f7fe700 1 -- 192.168.123.103:0/3632159329 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 9) v1 ==== 45291+0+0 (secure 0 0 0) 0x7f513001ab10 con 0x7f5148108970 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.649+0000 7f513f7fe700 1 -- 192.168.123.103:0/3632159329 <== mon.0 v2:192.168.123.103:3300/0 8 ==== mon_command_ack([{"prefix": "mgr module enable", "module": "dashboard"}]=0 v9) v1 ==== 88+0+0 (secure 0 0 0) 0x7f513002abc0 con 0x7f5148108970 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.652+0000 7f514cd9e700 1 -- 192.168.123.103:0/3632159329 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f51340383e0 msgr2=0x7f513403a8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.652+0000 7f514cd9e700 1 --2- 192.168.123.103:0/3632159329 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f51340383e0 0x7f513403a8a0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f5138006fd0 tx=0x7f5138006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.652+0000 7f514cd9e700 1 -- 192.168.123.103:0/3632159329 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5148108970 msgr2=0x7f514819c700 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.652+0000 7f514cd9e700 1 --2- 192.168.123.103:0/3632159329 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5148108970 0x7f514819c700 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f513000bf90 tx=0x7f5130003e50 comp rx=0 tx=0).stop 2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-08T23:59:20.652+0000 7f514cd9e700 1 -- 192.168.123.103:0/3632159329 shutdown_connections
2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.652+0000 7f514cd9e700 1 --2- 192.168.123.103:0/3632159329 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f51340383e0 0x7f513403a8a0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.652+0000 7f514cd9e700 1 --2- 192.168.123.103:0/3632159329 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5148108970 0x7f514819c700 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.652+0000 7f514cd9e700 1 -- 192.168.123.103:0/3632159329 >> 192.168.123.103:0/3632159329 conn(0x7f514807be30 msgr2=0x7f5148105e20 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.652+0000 7f514cd9e700 1 -- 192.168.123.103:0/3632159329 shutdown_connections
2026-03-08T23:59:20.695 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.652+0000 7f514cd9e700 1 -- 192.168.123.103:0/3632159329 wait complete.
2026-03-08T23:59:21.052 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout {
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "epoch": 9,
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "available": true,
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "active_name": "vm03.yvcons",
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "num_standby": 0
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.873+0000 7f7e5b25a700 1 Processor -- start
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.873+0000 7f7e5b25a700 1 -- start start
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.873+0000 7f7e5b25a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e54104dd0 0x7f7e541051f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.873+0000 7f7e5b25a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7e541057c0 con 0x7f7e54104dd0
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.873+0000 7f7e58ff6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e54104dd0 0x7f7e541051f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.873+0000 7f7e58ff6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e54104dd0 0x7f7e541051f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0
tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60234/0 (socket says 192.168.123.103:60234) 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.873+0000 7f7e58ff6700 1 -- 192.168.123.103:0/3047260273 learned_addr learned my addr 192.168.123.103:0/3047260273 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.873+0000 7f7e58ff6700 1 -- 192.168.123.103:0/3047260273 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7e54105fd0 con 0x7f7e54104dd0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.874+0000 7f7e58ff6700 1 --2- 192.168.123.103:0/3047260273 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e54104dd0 0x7f7e541051f0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f7e4800bd30 tx=0x7f7e4800d6d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a0cf4dad90aeeecb server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.875+0000 7f7e537fe700 1 -- 192.168.123.103:0/3047260273 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7e4800be80 con 0x7f7e54104dd0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.875+0000 7f7e537fe700 1 -- 192.168.123.103:0/3047260273 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7e48004510 con 0x7f7e54104dd0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.875+0000 7f7e5b25a700 1 -- 192.168.123.103:0/3047260273 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e54104dd0 msgr2=0x7f7e541051f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.875+0000 7f7e5b25a700 1 --2- 192.168.123.103:0/3047260273 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e54104dd0 0x7f7e541051f0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f7e4800bd30 tx=0x7f7e4800d6d0 comp rx=0 tx=0).stop 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.875+0000 7f7e5b25a700 1 -- 192.168.123.103:0/3047260273 shutdown_connections 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.875+0000 7f7e5b25a700 1 --2- 192.168.123.103:0/3047260273 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e54104dd0 0x7f7e541051f0 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.875+0000 7f7e5b25a700 1 -- 192.168.123.103:0/3047260273 >> 192.168.123.103:0/3047260273 conn(0x7f7e54100350 msgr2=0x7f7e541027b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.875+0000 7f7e5b25a700 1 -- 192.168.123.103:0/3047260273 shutdown_connections 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.875+0000 7f7e5b25a700 1 -- 
192.168.123.103:0/3047260273 wait complete. 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.875+0000 7f7e5b25a700 1 Processor -- start 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.876+0000 7f7e5b25a700 1 -- start start 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.876+0000 7f7e5b25a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e54104dd0 0x7f7e541abb40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.876+0000 7f7e5b25a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7e48003e00 con 0x7f7e54104dd0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.876+0000 7f7e58ff6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e54104dd0 0x7f7e541abb40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.876+0000 7f7e58ff6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e54104dd0 0x7f7e541abb40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60248/0 (socket says 192.168.123.103:60248) 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.876+0000 7f7e58ff6700 1 -- 192.168.123.103:0/2272488853 learned_addr learned my addr 192.168.123.103:0/2272488853 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.876+0000 7f7e58ff6700 1 -- 192.168.123.103:0/2272488853 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7e4800b9e0 con 0x7f7e54104dd0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.876+0000 7f7e58ff6700 1 --2- 192.168.123.103:0/2272488853 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e54104dd0 0x7f7e541abb40 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f7e480039a0 tx=0x7f7e480179c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.878+0000 7f7e51ffb700 1 -- 192.168.123.103:0/2272488853 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7e48009040 con 0x7f7e54104dd0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.878+0000 7f7e5b25a700 1 -- 192.168.123.103:0/2272488853 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7e54198980 con 0x7f7e54104dd0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.878+0000 7f7e5b25a700 1 -- 192.168.123.103:0/2272488853 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 
0x7f7e541ac310 con 0x7f7e54104dd0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.878+0000 7f7e51ffb700 1 -- 192.168.123.103:0/2272488853 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7e4800be80 con 0x7f7e54104dd0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.879+0000 7f7e5b25a700 1 -- 192.168.123.103:0/2272488853 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7e54191a80 con 0x7f7e54104dd0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.879+0000 7f7e51ffb700 1 -- 192.168.123.103:0/2272488853 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7e480163c0 con 0x7f7e54104dd0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.882+0000 7f7e51ffb700 1 -- 192.168.123.103:0/2272488853 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 9) v1 ==== 45291+0+0 (secure 0 0 0) 0x7f7e48015070 con 0x7f7e54104dd0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.882+0000 7f7e51ffb700 1 --2- 192.168.123.103:0/2272488853 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7e44038460 0x7f7e4403a920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.882+0000 7f7e53fff700 1 -- 192.168.123.103:0/2272488853 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7e44038460 msgr2=0x7f7e4403a920 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.882+0000 7f7e53fff700 1 --2- 192.168.123.103:0/2272488853 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7e44038460 0x7f7e4403a920 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.883+0000 7f7e51ffb700 1 -- 192.168.123.103:0/2272488853 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f7e4804d460 con 0x7f7e54104dd0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:20.883+0000 7f7e51ffb700 1 -- 192.168.123.103:0/2272488853 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f7e4804f930 con 0x7f7e54104dd0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.012+0000 7f7e5b25a700 1 -- 192.168.123.103:0/2272488853 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7f7e540623f0 con 0x7f7e54104dd0 2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.013+0000 7f7e51ffb700 1 -- 192.168.123.103:0/2272488853 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mgr stat"}]=0 v9) v1 ==== 56+0+98 (secure 0 0 0) 0x7f7e48026900 con 0x7f7e54104dd0 
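The records above trace the round trip behind "Enabling the dashboard module...": each ceph CLI invocation boots a fresh msgr2 client (BANNER_CONNECTING, HELLO_CONNECTING, READY), subscribes to config and monmap, fetches get_command_descriptions, and only then submits {"prefix": "mgr module enable", "module": "dashboard"} as a mon_command and waits for the ack. A minimal sketch of the same operation from Python, assuming a reachable cluster with the ceph CLI on PATH; ceph_json is a hypothetical helper for illustration, not teuthology or cephadm code:

    import json
    import subprocess

    def ceph_json(*args):
        # Hypothetical helper: run a ceph CLI command with JSON output.
        # Assumes `ceph` is on PATH and an admin keyring is readable,
        # as for the /usr/bin/ceph invocations in this log.
        out = subprocess.run(["ceph", "--format=json", *args],
                             check=True, capture_output=True, text=True).stdout
        return json.loads(out) if out.strip() else None

    # The same mon_command the journalctl records show being dispatched
    # and finished by the mon:
    subprocess.run(["ceph", "mgr", "module", "enable", "dashboard"], check=True)

    # The mon ack only means the mgrmap was updated; confirm the module is
    # actually listed as enabled before relying on it.
    modules = ceph_json("mgr", "module", "ls")
    assert "dashboard" in modules["enabled_modules"]

The final check mirrors what the run does next: an ack from the mon is not the same as the module being up, which is why the log keeps polling the mgr afterwards.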
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.016+0000 7f7e3f7fe700 1 -- 192.168.123.103:0/2272488853 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7e44038460 msgr2=0x7f7e4403a920 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.016+0000 7f7e3f7fe700 1 --2- 192.168.123.103:0/2272488853 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7e44038460 0x7f7e4403a920 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.016+0000 7f7e3f7fe700 1 -- 192.168.123.103:0/2272488853 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e54104dd0 msgr2=0x7f7e541abb40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.016+0000 7f7e3f7fe700 1 --2- 192.168.123.103:0/2272488853 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e54104dd0 0x7f7e541abb40 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f7e480039a0 tx=0x7f7e480179c0 comp rx=0 tx=0).stop
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.016+0000 7f7e3f7fe700 1 -- 192.168.123.103:0/2272488853 shutdown_connections
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.016+0000 7f7e3f7fe700 1 --2- 192.168.123.103:0/2272488853 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7e44038460 0x7f7e4403a920 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.016+0000 7f7e3f7fe700 1 --2- 192.168.123.103:0/2272488853 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e54104dd0 0x7f7e541abb40 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.016+0000 7f7e3f7fe700 1 -- 192.168.123.103:0/2272488853 >> 192.168.123.103:0/2272488853 conn(0x7f7e54100350 msgr2=0x7f7e5418c790 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.017+0000 7f7e3f7fe700 1 -- 192.168.123.103:0/2272488853 shutdown_connections
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.017+0000 7f7e3f7fe700 1 -- 192.168.123.103:0/2272488853 wait complete.
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for the mgr to restart...
2026-03-08T23:59:21.053 INFO:teuthology.orchestra.run.vm03.stdout:Waiting for mgr epoch 9...
2026-03-08T23:59:21.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:21 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/3632159329' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
2026-03-08T23:59:21.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:21 vm03 ceph-mon[52346]: mgrmap e9: vm03.yvcons(active, since 8s)
2026-03-08T23:59:21.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:21 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/2272488853' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
2026-03-08T23:59:25.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:25 vm03 ceph-mon[52346]: Active manager daemon vm03.yvcons restarted
2026-03-08T23:59:25.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:25 vm03 ceph-mon[52346]: Activating manager daemon vm03.yvcons
2026-03-08T23:59:25.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:25 vm03 ceph-mon[52346]: osdmap e3: 0 total, 0 up, 0 in
2026-03-08T23:59:25.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:25 vm03 ceph-mon[52346]: mgrmap e10: vm03.yvcons(active, starting, since 0.0483688s)
2026-03-08T23:59:25.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:25 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch
2026-03-08T23:59:25.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:25 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr metadata", "who": "vm03.yvcons", "id": "vm03.yvcons"}]: dispatch
2026-03-08T23:59:25.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:25 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-08T23:59:25.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:25 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-08T23:59:25.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:25 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-08T23:59:25.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:25 vm03 ceph-mon[52346]: Manager daemon vm03.yvcons is now available
2026-03-08T23:59:25.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:25 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-08T23:59:25.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:25 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-08T23:59:26.551 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout {
2026-03-08T23:59:26.551 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 11,
2026-03-08T23:59:26.551 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout "initialized": true
2026-03-08T23:59:26.551 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout }
2026-03-08T23:59:26.551 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.195+0000 7f7ef1067700 1 Processor -- start
2026-03-08T23:59:26.551 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.195+0000 7f7ef1067700 1 -- start start
2026-03-08T23:59:26.551 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.195+0000 7f7ef1067700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7eec0721d0 0x7f7eec0725f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:26.551 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.195+0000 7f7ef1067700 1 -- -->
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7eec072bc0 con 0x7f7eec0721d0 2026-03-08T23:59:26.551 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.195+0000 7f7eebfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7eec0721d0 0x7f7eec0725f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:26.551 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.195+0000 7f7eebfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7eec0721d0 0x7f7eec0725f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60256/0 (socket says 192.168.123.103:60256) 2026-03-08T23:59:26.551 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.195+0000 7f7eebfff700 1 -- 192.168.123.103:0/1792212402 learned_addr learned my addr 192.168.123.103:0/1792212402 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:26.551 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.195+0000 7f7eebfff700 1 -- 192.168.123.103:0/1792212402 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7eec10e1c0 con 0x7f7eec0721d0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.196+0000 7f7eebfff700 1 --2- 192.168.123.103:0/1792212402 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7eec0721d0 0x7f7eec0725f0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f7edc00d180 tx=0x7f7edc00d490 comp rx=0 tx=0).ready entity=mon.0 client_cookie=6712991c4fbb4810 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.196+0000 7f7eeaffd700 1 -- 192.168.123.103:0/1792212402 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7edc010070 con 0x7f7eec0721d0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.196+0000 7f7eeaffd700 1 -- 192.168.123.103:0/1792212402 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7edc004510 con 0x7f7eec0721d0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.196+0000 7f7ef1067700 1 -- 192.168.123.103:0/1792212402 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7eec0721d0 msgr2=0x7f7eec0725f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.196+0000 7f7ef1067700 1 --2- 192.168.123.103:0/1792212402 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7eec0721d0 0x7f7eec0725f0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f7edc00d180 tx=0x7f7edc00d490 comp rx=0 tx=0).stop 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.196+0000 7f7ef1067700 1 -- 192.168.123.103:0/1792212402 shutdown_connections 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.196+0000 7f7ef1067700 1 --2- 192.168.123.103:0/1792212402 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7eec0721d0 0x7f7eec0725f0 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.196+0000 7f7ef1067700 1 -- 192.168.123.103:0/1792212402 >> 192.168.123.103:0/1792212402 conn(0x7f7eec06d320 msgr2=0x7f7eec06f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.197+0000 7f7ef1067700 1 -- 192.168.123.103:0/1792212402 shutdown_connections 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.197+0000 7f7ef1067700 1 -- 192.168.123.103:0/1792212402 wait complete. 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.197+0000 7f7ef1067700 1 Processor -- start 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.197+0000 7f7ef1067700 1 -- start start 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.197+0000 7f7ef1067700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7eec1a09e0 0x7f7eec1a0e00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.197+0000 7f7ef1067700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7edc003c20 con 0x7f7eec1a09e0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.197+0000 7f7eebfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7eec1a09e0 0x7f7eec1a0e00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.197+0000 7f7eebfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7eec1a09e0 0x7f7eec1a0e00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60258/0 (socket says 192.168.123.103:60258) 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.197+0000 7f7eebfff700 1 -- 192.168.123.103:0/2503766564 learned_addr learned my addr 192.168.123.103:0/2503766564 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.198+0000 7f7eebfff700 1 -- 192.168.123.103:0/2503766564 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7edc0087c0 con 0x7f7eec1a09e0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.198+0000 7f7eebfff700 1 --2- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7eec1a09e0 0x7f7eec1a0e00 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f7edc004210 tx=0x7f7edc0042f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: 
stderr 2026-03-08T23:59:21.198+0000 7f7ee97fa700 1 -- 192.168.123.103:0/2503766564 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7edc010040 con 0x7f7eec1a09e0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.198+0000 7f7ef1067700 1 -- 192.168.123.103:0/2503766564 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7eec1a1340 con 0x7f7eec1a09e0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.198+0000 7f7ef1067700 1 -- 192.168.123.103:0/2503766564 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7eec1a1fb0 con 0x7f7eec1a09e0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.199+0000 7f7ee97fa700 1 -- 192.168.123.103:0/2503766564 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7edc00eec0 con 0x7f7eec1a09e0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.199+0000 7f7ee97fa700 1 -- 192.168.123.103:0/2503766564 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7edc00b960 con 0x7f7eec1a09e0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.199+0000 7f7ee97fa700 1 -- 192.168.123.103:0/2503766564 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 9) v1 ==== 45291+0+0 (secure 0 0 0) 0x7f7edc0164b0 con 0x7f7eec1a09e0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.200+0000 7f7ee97fa700 1 --2- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 0x7f7ed403a8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.200+0000 7f7eeb7fe700 1 -- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 msgr2=0x7f7ed403a8f0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.200+0000 7f7eeb7fe700 1 --2- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 0x7f7ed403a8f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.200+0000 7f7ee97fa700 1 -- 192.168.123.103:0/2503766564 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f7edc04c630 con 0x7f7eec1a09e0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.200+0000 7f7ef1067700 1 -- 192.168.123.103:0/2503766564 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f7ed8000d40 con 0x7f7ed4038430 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.400+0000 7f7eeb7fe700 1 -- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] 
conn(0x7f7ed4038430 msgr2=0x7f7ed403a8f0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.400+0000 7f7eeb7fe700 1 --2- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 0x7f7ed403a8f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.800+0000 7f7eeb7fe700 1 -- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 msgr2=0x7f7ed403a8f0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:21.801+0000 7f7eeb7fe700 1 --2- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 0x7f7ed403a8f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.800000 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:22.601+0000 7f7eeb7fe700 1 -- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 msgr2=0x7f7ed403a8f0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:22.601+0000 7f7eeb7fe700 1 --2- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 0x7f7ed403a8f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:24.202+0000 7f7eeb7fe700 1 -- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 msgr2=0x7f7ed403a8f0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:24.202+0000 7f7eeb7fe700 1 --2- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 0x7f7ed403a8f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 3.200000 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:25.484+0000 7f7ee97fa700 1 -- 192.168.123.103:0/2503766564 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mgrmap(e 10) v1 ==== 45058+0+0 (secure 0 0 0) 0x7f7edc029430 con 0x7f7eec1a09e0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:25.484+0000 7f7ee97fa700 1 -- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 msgr2=0x7f7ed403a8f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:25.484+0000 7f7ee97fa700 1 --2- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 0x7f7ed403a8f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.487+0000 7f7ee97fa700 1 -- 192.168.123.103:0/2503766564 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 11) v1 ==== 45185+0+0 (secure 0 0 0) 0x7f7edc04d2b0 con 0x7f7eec1a09e0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.487+0000 7f7ee97fa700 1 --2- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 0x7f7ed403a8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.487+0000 7f7ee97fa700 1 -- 192.168.123.103:0/2503766564 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f7ed8000d40 con 0x7f7ed4038430 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.489+0000 7f7eeb7fe700 1 --2- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 0x7f7ed403a8f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.489+0000 7f7eeb7fe700 1 --2- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 0x7f7ed403a8f0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f7ee4003d90 tx=0x7f7ee40073f0 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.490+0000 7f7ee97fa700 1 -- 192.168.123.103:0/2503766564 <== mgr.14164 v2:192.168.123.103:6800/2 1 ==== command_reply(tid 0: 0 ) v1 ==== 8+0+6910 (secure 0 0 0) 0x7f7ed8000d40 con 0x7f7ed4038430 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.493+0000 7f7ef1067700 1 -- 192.168.123.103:0/2503766564 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 0x7f7ed8002800 con 0x7f7ed4038430 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.494+0000 7f7ee97fa700 1 -- 192.168.123.103:0/2503766564 <== mgr.14164 v2:192.168.123.103:6800/2 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+52 (secure 0 0 0) 0x7f7ed8002800 con 0x7f7ed4038430 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.494+0000 7f7ed2ffd700 1 -- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 msgr2=0x7f7ed403a8f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.494+0000 7f7ed2ffd700 1 --2- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 0x7f7ed403a8f0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f7ee4003d90 tx=0x7f7ee40073f0 comp rx=0 tx=0).stop 2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.494+0000 7f7ed2ffd700 1 -- 
192.168.123.103:0/2503766564 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7eec1a09e0 msgr2=0x7f7eec1a0e00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.495+0000 7f7ed2ffd700 1 --2- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7eec1a09e0 0x7f7eec1a0e00 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f7edc004210 tx=0x7f7edc0042f0 comp rx=0 tx=0).stop
2026-03-08T23:59:26.552 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.495+0000 7f7ed2ffd700 1 -- 192.168.123.103:0/2503766564 shutdown_connections
2026-03-08T23:59:26.553 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.495+0000 7f7ed2ffd700 1 --2- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7ed4038430 0x7f7ed403a8f0 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:26.553 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.495+0000 7f7ed2ffd700 1 --2- 192.168.123.103:0/2503766564 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7eec1a09e0 0x7f7eec1a0e00 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:26.553 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.495+0000 7f7ed2ffd700 1 -- 192.168.123.103:0/2503766564 >> 192.168.123.103:0/2503766564 conn(0x7f7eec06d320 msgr2=0x7f7eec06ddd0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:26.553 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.495+0000 7f7ed2ffd700 1 -- 192.168.123.103:0/2503766564 shutdown_connections
2026-03-08T23:59:26.553 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.495+0000 7f7ed2ffd700 1 -- 192.168.123.103:0/2503766564 wait complete.
2026-03-08T23:59:26.553 INFO:teuthology.orchestra.run.vm03.stdout:mgr epoch 9 is available
2026-03-08T23:59:26.553 INFO:teuthology.orchestra.run.vm03.stdout:Generating a dashboard self-signed certificate...
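The stretch from "Waiting for the mgr to restart..." and "Waiting for mgr epoch 9..." up to "mgr epoch 9 is available" is a poll loop: enabling a module makes the active mgr respawn, so the client re-runs `ceph mgr stat` (the {"epoch": 9, "available": true, ...} JSON earlier) until the target epoch reports available, while the messenger retries the dropped mgr connection with doubling waits (the `_fault waiting 0.200000` through `3.200000` records above). A sketch of such a wait loop, assuming the ceph CLI on PATH; wait_for_mgr_epoch is illustrative and not the cephadm implementation:

    import json
    import subprocess
    import time

    def wait_for_mgr_epoch(target, timeout=300.0):
        # Poll `ceph mgr stat` until the mgrmap epoch reaches `target` and
        # the active mgr reports available. The backoff doubles from 0.2s
        # to a 3.2s cap, matching the retry waits visible in the log.
        deadline = time.monotonic() + timeout
        delay = 0.2
        while time.monotonic() < deadline:
            out = subprocess.run(["ceph", "mgr", "stat", "--format=json"],
                                 check=True, capture_output=True, text=True).stdout
            stat = json.loads(out)  # e.g. {"epoch": 9, "available": true, ...}
            if stat["epoch"] >= target and stat["available"]:
                return stat
            time.sleep(delay)
            delay = min(delay * 2, 3.2)
        raise TimeoutError(f"mgr epoch {target} not available after {timeout}s")

    wait_for_mgr_epoch(9)

Polling the mon for the mgrmap rather than the mgr itself is why the loop makes progress even while the mgr socket still refuses connections.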
2026-03-08T23:59:26.810 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:26 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.yvcons/mirror_snapshot_schedule"}]: dispatch
2026-03-08T23:59:26.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:26 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.yvcons/trash_purge_schedule"}]: dispatch
2026-03-08T23:59:26.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:26 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:26.811 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:26 vm03 ceph-mon[52346]: mgrmap e11: vm03.yvcons(active, since 1.04976s)
2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout Self-signed certificate created
2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.683+0000 7ff3a5aa9700 1 Processor -- start
2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.683+0000 7ff3a5aa9700 1 -- start start
2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.683+0000 7ff3a5aa9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3a00721d0 0x7ff3a00725f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.683+0000 7ff3a5aa9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff3a0072bc0 con 0x7ff3a00721d0
2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.684+0000 7ff3a4aa7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3a00721d0 0x7ff3a00725f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.684+0000 7ff3a4aa7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3a00721d0 0x7ff3a00725f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41016/0 (socket says 192.168.123.103:41016)
2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.684+0000 7ff3a4aa7700 1 -- 192.168.123.103:0/2913149480 learned_addr learned my addr 192.168.123.103:0/2913149480 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.684+0000 7ff3a4aa7700 1 -- 192.168.123.103:0/2913149480 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff3a010e1c0 con 0x7ff3a00721d0
2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.684+0000 7ff3a4aa7700 1 --2- 192.168.123.103:0/2913149480 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3a00721d0 0x7ff3a00725f0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7ff390009480
tx=0x7ff390009790 comp rx=0 tx=0).ready entity=mon.0 client_cookie=23e4ea3c04091176 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.684+0000 7ff39f7fe700 1 -- 192.168.123.103:0/2913149480 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff390004030 con 0x7ff3a00721d0 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.684+0000 7ff39f7fe700 1 -- 192.168.123.103:0/2913149480 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff39000c8f0 con 0x7ff3a00721d0 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.685+0000 7ff3a5aa9700 1 -- 192.168.123.103:0/2913149480 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3a00721d0 msgr2=0x7ff3a00725f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.685+0000 7ff3a5aa9700 1 --2- 192.168.123.103:0/2913149480 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3a00721d0 0x7ff3a00725f0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7ff390009480 tx=0x7ff390009790 comp rx=0 tx=0).stop 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.685+0000 7ff3a5aa9700 1 -- 192.168.123.103:0/2913149480 shutdown_connections 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.685+0000 7ff3a5aa9700 1 --2- 192.168.123.103:0/2913149480 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3a00721d0 0x7ff3a00725f0 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.685+0000 7ff3a5aa9700 1 -- 192.168.123.103:0/2913149480 >> 192.168.123.103:0/2913149480 conn(0x7ff3a006d320 msgr2=0x7ff3a006f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.685+0000 7ff3a5aa9700 1 -- 192.168.123.103:0/2913149480 shutdown_connections 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.685+0000 7ff3a5aa9700 1 -- 192.168.123.103:0/2913149480 wait complete. 
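"Generating a dashboard self-signed certificate..." followed by "Self-signed certificate created" corresponds to the `dashboard create-self-signed-cert` mgr_command visible further down, which the dashboard needs before it can serve HTTPS. The equivalent call from Python, assuming the ceph CLI on PATH; for anything other than a throwaway test cluster, installing a real certificate with `ceph dashboard set-ssl-certificate` is the usual choice:

    import subprocess

    # Generate throwaway TLS material for the dashboard, as this run does.
    # Production clusters should install a real certificate instead
    # (e.g. `ceph dashboard set-ssl-certificate -i cert.pem`).
    subprocess.run(["ceph", "dashboard", "create-self-signed-cert"], check=True)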
2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.685+0000 7ff3a5aa9700 1 Processor -- start 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.686+0000 7ff3a5aa9700 1 -- start start 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.686+0000 7ff3a5aa9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3a01a09b0 0x7ff3a01a0dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.686+0000 7ff3a5aa9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff390013070 con 0x7ff3a01a09b0 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.686+0000 7ff3a4aa7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3a01a09b0 0x7ff3a01a0dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.686+0000 7ff3a4aa7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3a01a09b0 0x7ff3a01a0dd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41022/0 (socket says 192.168.123.103:41022) 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.686+0000 7ff3a4aa7700 1 -- 192.168.123.103:0/1989606798 learned_addr learned my addr 192.168.123.103:0/1989606798 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.686+0000 7ff3a4aa7700 1 -- 192.168.123.103:0/1989606798 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff390009160 con 0x7ff3a01a09b0 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.686+0000 7ff3a4aa7700 1 --2- 192.168.123.103:0/1989606798 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3a01a09b0 0x7ff3a01a0dd0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7ff3900038d0 tx=0x7ff390003e60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.686+0000 7ff39dffb700 1 -- 192.168.123.103:0/1989606798 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff3900040b0 con 0x7ff3a01a09b0 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.686+0000 7ff39dffb700 1 -- 192.168.123.103:0/1989606798 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff39001c070 con 0x7ff3a01a09b0 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.686+0000 7ff39dffb700 1 -- 192.168.123.103:0/1989606798 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff3900183f0 con 0x7ff3a01a09b0 2026-03-08T23:59:26.866 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.686+0000 7ff3a5aa9700 1 -- 192.168.123.103:0/1989606798 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff3a01a1310 con 0x7ff3a01a09b0 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.686+0000 7ff3a5aa9700 1 -- 192.168.123.103:0/1989606798 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff3a01a3f90 con 0x7ff3a01a09b0 2026-03-08T23:59:26.866 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.687+0000 7ff3a5aa9700 1 -- 192.168.123.103:0/1989606798 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff3a00623c0 con 0x7ff3a01a09b0 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.690+0000 7ff39dffb700 1 -- 192.168.123.103:0/1989606798 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 11) v1 ==== 45185+0+0 (secure 0 0 0) 0x7ff39001e030 con 0x7ff3a01a09b0 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.690+0000 7ff39dffb700 1 --2- 192.168.123.103:0/1989606798 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff388038310 0x7ff38803a7d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.690+0000 7ff39dffb700 1 -- 192.168.123.103:0/1989606798 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7ff39002f080 con 0x7ff3a01a09b0 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.692+0000 7ff39ffff700 1 --2- 192.168.123.103:0/1989606798 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff388038310 0x7ff38803a7d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.692+0000 7ff39ffff700 1 --2- 192.168.123.103:0/1989606798 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff388038310 0x7ff38803a7d0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7ff39800ad30 tx=0x7ff3980093f0 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.692+0000 7ff39dffb700 1 -- 192.168.123.103:0/1989606798 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff390018550 con 0x7ff3a01a09b0 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.807+0000 7ff3a5aa9700 1 -- 192.168.123.103:0/1989606798 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "dashboard create-self-signed-cert", "target": ["mon-mgr", ""]}) v1 -- 0x7ff3a006e4a0 con 0x7ff388038310 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.833+0000 7ff39dffb700 1 -- 192.168.123.103:0/1989606798 <== mgr.14164 v2:192.168.123.103:6800/2 1 ==== 
mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7ff3a006e4a0 con 0x7ff388038310 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.835+0000 7ff3a5aa9700 1 -- 192.168.123.103:0/1989606798 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff388038310 msgr2=0x7ff38803a7d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.835+0000 7ff3a5aa9700 1 --2- 192.168.123.103:0/1989606798 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff388038310 0x7ff38803a7d0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7ff39800ad30 tx=0x7ff3980093f0 comp rx=0 tx=0).stop 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.836+0000 7ff3a5aa9700 1 -- 192.168.123.103:0/1989606798 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3a01a09b0 msgr2=0x7ff3a01a0dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.836+0000 7ff3a5aa9700 1 --2- 192.168.123.103:0/1989606798 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3a01a09b0 0x7ff3a01a0dd0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7ff3900038d0 tx=0x7ff390003e60 comp rx=0 tx=0).stop 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.836+0000 7ff3a5aa9700 1 -- 192.168.123.103:0/1989606798 shutdown_connections 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.836+0000 7ff3a5aa9700 1 --2- 192.168.123.103:0/1989606798 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff388038310 0x7ff38803a7d0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.836+0000 7ff3a5aa9700 1 --2- 192.168.123.103:0/1989606798 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3a01a09b0 0x7ff3a01a0dd0 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.836+0000 7ff3a5aa9700 1 -- 192.168.123.103:0/1989606798 >> 192.168.123.103:0/1989606798 conn(0x7ff3a006d320 msgr2=0x7ff3a006dd90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.836+0000 7ff3a5aa9700 1 -- 192.168.123.103:0/1989606798 shutdown_connections 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:26.836+0000 7ff3a5aa9700 1 -- 192.168.123.103:0/1989606798 wait complete. 2026-03-08T23:59:26.867 INFO:teuthology.orchestra.run.vm03.stdout:Creating initial admin user... 
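The exchange above is the dashboard TLS step: the CLI sends mgr_command {"prefix": "dashboard create-self-signed-cert"} to mgr.14164 and receives mgr_command_reply(tid 0: 0), matching the earlier "Self-signed certificate created" stdout. A minimal sketch of driving the same step by hand, assuming a working `ceph` CLI and admin keyring on the host (which cephadm has here):

    # Sketch of the certificate step logged above; assumes `ceph` CLI and
    # an admin keyring. check=True raises on a non-zero return code,
    # mirroring the 0 status in the mgr_command_reply.
    import subprocess

    subprocess.run(
        ["ceph", "dashboard", "create-self-signed-cert"],
        check=True,
    )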
2026-03-08T23:59:27.404 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout {"username": "admin", "password": "$2b$12$eX7F0ErS49/NYZkVJxrht.9Jfi7jrVs0umO6pexwAeHpNfko4LnSa", "roles": ["administrator"], "name": null, "email": null, "lastUpdate": 1773014367, "enabled": true, "pwdExpirationDate": null, "pwdUpdateRequired": true} 2026-03-08T23:59:27.404 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.017+0000 7f73c20f7700 1 Processor -- start 2026-03-08T23:59:27.404 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.017+0000 7f73c20f7700 1 -- start start 2026-03-08T23:59:27.404 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.017+0000 7f73c20f7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73bc0721d0 0x7f73bc0725f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:27.404 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.017+0000 7f73c20f7700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f73bc072bc0 con 0x7f73bc0721d0 2026-03-08T23:59:27.404 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.017+0000 7f73c10f5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73bc0721d0 0x7f73bc0725f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:27.404 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.018+0000 7f73c10f5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73bc0721d0 0x7f73bc0725f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41026/0 (socket says 192.168.123.103:41026) 2026-03-08T23:59:27.404 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.018+0000 7f73c10f5700 1 -- 192.168.123.103:0/2231769045 learned_addr learned my addr 192.168.123.103:0/2231769045 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:27.404 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.018+0000 7f73c10f5700 1 -- 192.168.123.103:0/2231769045 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f73bc10e1c0 con 0x7f73bc0721d0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.018+0000 7f73c10f5700 1 --2- 192.168.123.103:0/2231769045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73bc0721d0 0x7f73bc0725f0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f73b8009a90 tx=0x7f73b8009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=82a532b74cb69f73 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.019+0000 7f73b3fff700 1 -- 192.168.123.103:0/2231769045 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f73b8004030 con 0x7f73bc0721d0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.019+0000 7f73b3fff700 1 -- 192.168.123.103:0/2231769045 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 
0) 0x7f73b800b7e0 con 0x7f73bc0721d0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.019+0000 7f73c20f7700 1 -- 192.168.123.103:0/2231769045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73bc0721d0 msgr2=0x7f73bc0725f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.019+0000 7f73c20f7700 1 --2- 192.168.123.103:0/2231769045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73bc0721d0 0x7f73bc0725f0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f73b8009a90 tx=0x7f73b8009da0 comp rx=0 tx=0).stop 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.019+0000 7f73c20f7700 1 -- 192.168.123.103:0/2231769045 shutdown_connections 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.019+0000 7f73c20f7700 1 --2- 192.168.123.103:0/2231769045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73bc0721d0 0x7f73bc0725f0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.019+0000 7f73c20f7700 1 -- 192.168.123.103:0/2231769045 >> 192.168.123.103:0/2231769045 conn(0x7f73bc06d320 msgr2=0x7f73bc06f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.020+0000 7f73c20f7700 1 -- 192.168.123.103:0/2231769045 shutdown_connections 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.020+0000 7f73c20f7700 1 -- 192.168.123.103:0/2231769045 wait complete. 
2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.020+0000 7f73c20f7700 1 Processor -- start 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.020+0000 7f73c20f7700 1 -- start start 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.020+0000 7f73c20f7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73bc1a09a0 0x7f73bc1a0dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.020+0000 7f73c20f7700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f73b8003a00 con 0x7f73bc1a09a0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.021+0000 7f73c10f5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73bc1a09a0 0x7f73bc1a0dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.021+0000 7f73c10f5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73bc1a09a0 0x7f73bc1a0dc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41040/0 (socket says 192.168.123.103:41040) 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.021+0000 7f73c10f5700 1 -- 192.168.123.103:0/2148488824 learned_addr learned my addr 192.168.123.103:0/2148488824 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.021+0000 7f73c10f5700 1 -- 192.168.123.103:0/2148488824 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f73b8009740 con 0x7f73bc1a09a0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.021+0000 7f73c10f5700 1 --2- 192.168.123.103:0/2148488824 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73bc1a09a0 0x7f73bc1a0dc0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f73b8006b20 tx=0x7f73b8004060 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.023+0000 7f73b27fc700 1 -- 192.168.123.103:0/2148488824 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f73b8004480 con 0x7f73bc1a09a0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.023+0000 7f73c20f7700 1 -- 192.168.123.103:0/2148488824 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f73bc1a1300 con 0x7f73bc1a09a0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.023+0000 7f73c20f7700 1 -- 192.168.123.103:0/2148488824 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f73bc1a1f70 con 0x7f73bc1a09a0 2026-03-08T23:59:27.405 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.023+0000 7f73b27fc700 1 -- 192.168.123.103:0/2148488824 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f73b80045e0 con 0x7f73bc1a09a0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.024+0000 7f73b27fc700 1 -- 192.168.123.103:0/2148488824 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f73b8017690 con 0x7f73bc1a09a0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.025+0000 7f73b27fc700 1 -- 192.168.123.103:0/2148488824 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 11) v1 ==== 45185+0+0 (secure 0 0 0) 0x7f73b80177f0 con 0x7f73bc1a09a0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.025+0000 7f73b27fc700 1 --2- 192.168.123.103:0/2148488824 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f73a8038300 0x7f73a803a7c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.025+0000 7f73b27fc700 1 -- 192.168.123.103:0/2148488824 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f73b8013070 con 0x7f73bc1a09a0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.025+0000 7f73c08f4700 1 --2- 192.168.123.103:0/2148488824 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f73a8038300 0x7f73a803a7c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.025+0000 7f73c08f4700 1 --2- 192.168.123.103:0/2148488824 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f73a8038300 0x7f73a803a7c0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f73b400ad30 tx=0x7f73b40093f0 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.026+0000 7f73c20f7700 1 -- 192.168.123.103:0/2148488824 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f73a0005320 con 0x7f73bc1a09a0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.028+0000 7f73b27fc700 1 -- 192.168.123.103:0/2148488824 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f73b8024070 con 0x7f73bc1a09a0 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.154+0000 7f73c20f7700 1 -- 192.168.123.103:0/2148488824 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "dashboard ac-user-create", "username": "admin", "rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}) v1 -- 0x7f73a0002430 con 0x7f73a8038300 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.351+0000 7f73b27fc700 1 
-- 192.168.123.103:0/2148488824 <== mgr.14164 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+252 (secure 0 0 0) 0x7f73a0002430 con 0x7f73a8038300 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.354+0000 7f73a7fff700 1 -- 192.168.123.103:0/2148488824 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f73a8038300 msgr2=0x7f73a803a7c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.354+0000 7f73a7fff700 1 --2- 192.168.123.103:0/2148488824 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f73a8038300 0x7f73a803a7c0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f73b400ad30 tx=0x7f73b40093f0 comp rx=0 tx=0).stop 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.354+0000 7f73a7fff700 1 -- 192.168.123.103:0/2148488824 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73bc1a09a0 msgr2=0x7f73bc1a0dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.354+0000 7f73a7fff700 1 --2- 192.168.123.103:0/2148488824 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73bc1a09a0 0x7f73bc1a0dc0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f73b8006b20 tx=0x7f73b8004060 comp rx=0 tx=0).stop 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.354+0000 7f73a7fff700 1 -- 192.168.123.103:0/2148488824 shutdown_connections 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.354+0000 7f73a7fff700 1 --2- 192.168.123.103:0/2148488824 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f73a8038300 0x7f73a803a7c0 secure :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f73b400ad30 tx=0x7f73b40093f0 comp rx=0 tx=0).stop 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.354+0000 7f73a7fff700 1 --2- 192.168.123.103:0/2148488824 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f73bc1a09a0 0x7f73bc1a0dc0 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.354+0000 7f73a7fff700 1 -- 192.168.123.103:0/2148488824 >> 192.168.123.103:0/2148488824 conn(0x7f73bc06d320 msgr2=0x7f73bc06dd90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.354+0000 7f73a7fff700 1 -- 192.168.123.103:0/2148488824 shutdown_connections 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.354+0000 7f73a7fff700 1 -- 192.168.123.103:0/2148488824 wait complete. 2026-03-08T23:59:27.405 INFO:teuthology.orchestra.run.vm03.stdout:Fetching dashboard port number... 
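Here the admin account is created via mgr_command "dashboard ac-user-create" with force_password and pwd_update_required set; the stdout JSON above confirms it (bcrypt hash, pwdUpdateRequired: true). A hedged sketch of the same step from the CLI; the -i password-file form and flag spellings follow current dashboard docs and may differ between releases:

    # Sketch of the user-creation step logged above. Recent dashboards
    # only accept the password via `-i <file>`, never on the command line.
    import os, subprocess, tempfile

    def create_admin_user(password):
        with tempfile.NamedTemporaryFile("w", delete=False) as f:
            f.write(password)
            path = f.name
        try:
            subprocess.run(
                ["ceph", "dashboard", "ac-user-create", "admin",
                 "-i", path, "administrator", "--force-password"],
                check=True,
            )
        finally:
            os.unlink(path)  # do not leave the password on disk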
2026-03-08T23:59:27.648 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:27 vm03 ceph-mon[52346]: [08/Mar/2026:23:59:25] ENGINE Bus STARTING 2026-03-08T23:59:27.649 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:27 vm03 ceph-mon[52346]: [08/Mar/2026:23:59:25] ENGINE Serving on http://192.168.123.103:8765 2026-03-08T23:59:27.649 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:27 vm03 ceph-mon[52346]: [08/Mar/2026:23:59:26] ENGINE Serving on https://192.168.123.103:7150 2026-03-08T23:59:27.649 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:27 vm03 ceph-mon[52346]: [08/Mar/2026:23:59:26] ENGINE Bus STARTED 2026-03-08T23:59:27.649 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:27 vm03 ceph-mon[52346]: from='client.14168 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch 2026-03-08T23:59:27.649 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:27 vm03 ceph-mon[52346]: from='client.14168 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch 2026-03-08T23:59:27.649 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:27 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:27.649 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:27 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:27.649 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:27 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:27.649 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:27 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:27.704 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stdout 8443 2026-03-08T23:59:27.704 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.528+0000 7f7f3ab76700 1 Processor -- start 2026-03-08T23:59:27.704 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.529+0000 7f7f3ab76700 1 -- start start 2026-03-08T23:59:27.704 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.529+0000 7f7f3ab76700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f34106760 0x7f7f34106b80 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:27.704 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.529+0000 7f7f3ab76700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7f34107150 con 0x7f7f34106760 2026-03-08T23:59:27.704 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.530+0000 7f7f38912700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f34106760 0x7f7f34106b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:27.704 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.530+0000 7f7f38912700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f34106760 0x7f7f34106b80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41050/0 (socket says 192.168.123.103:41050) 2026-03-08T23:59:27.704 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-08T23:59:27.530+0000 7f7f38912700 1 -- 192.168.123.103:0/646081908 learned_addr learned my addr 192.168.123.103:0/646081908 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:27.704 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.530+0000 7f7f38912700 1 -- 192.168.123.103:0/646081908 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7f34107960 con 0x7f7f34106760 2026-03-08T23:59:27.704 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.530+0000 7f7f38912700 1 --2- 192.168.123.103:0/646081908 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f34106760 0x7f7f34106b80 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f7f20009a90 tx=0x7f7f20009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=296c704676cd35ab server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.531+0000 7f7f337fe700 1 -- 192.168.123.103:0/646081908 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7f20004030 con 0x7f7f34106760 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.531+0000 7f7f337fe700 1 -- 192.168.123.103:0/646081908 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7f2000b7e0 con 0x7f7f34106760 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.531+0000 7f7f337fe700 1 -- 192.168.123.103:0/646081908 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7f20003b30 con 0x7f7f34106760 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.532+0000 7f7f3ab76700 1 -- 192.168.123.103:0/646081908 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f34106760 msgr2=0x7f7f34106b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.532+0000 7f7f3ab76700 1 --2- 192.168.123.103:0/646081908 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f34106760 0x7f7f34106b80 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f7f20009a90 tx=0x7f7f20009da0 comp rx=0 tx=0).stop 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.532+0000 7f7f3ab76700 1 -- 192.168.123.103:0/646081908 shutdown_connections 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.532+0000 7f7f3ab76700 1 --2- 192.168.123.103:0/646081908 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f34106760 0x7f7f34106b80 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.532+0000 7f7f3ab76700 1 -- 192.168.123.103:0/646081908 >> 192.168.123.103:0/646081908 conn(0x7f7f34101ce0 msgr2=0x7f7f34104140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.532+0000 7f7f3ab76700 1 -- 192.168.123.103:0/646081908 shutdown_connections 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 
2026-03-08T23:59:27.532+0000 7f7f3ab76700 1 -- 192.168.123.103:0/646081908 wait complete. 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.533+0000 7f7f3ab76700 1 Processor -- start 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.533+0000 7f7f3ab76700 1 -- start start 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.533+0000 7f7f3ab76700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f34106760 0x7f7f341982e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.534+0000 7f7f3ab76700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7f34198820 con 0x7f7f34106760 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.534+0000 7f7f38912700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f34106760 0x7f7f341982e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.534+0000 7f7f38912700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f34106760 0x7f7f341982e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41064/0 (socket says 192.168.123.103:41064) 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.534+0000 7f7f38912700 1 -- 192.168.123.103:0/1460819970 learned_addr learned my addr 192.168.123.103:0/1460819970 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.534+0000 7f7f38912700 1 -- 192.168.123.103:0/1460819970 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7f20009740 con 0x7f7f34106760 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.537+0000 7f7f38912700 1 --2- 192.168.123.103:0/1460819970 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f34106760 0x7f7f341982e0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f7f20000c00 tx=0x7f7f2000bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.538+0000 7f7f31ffb700 1 -- 192.168.123.103:0/1460819970 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f7f20004140 con 0x7f7f34106760 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.538+0000 7f7f31ffb700 1 -- 192.168.123.103:0/1460819970 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7f200042a0 con 0x7f7f34106760 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.538+0000 7f7f31ffb700 1 -- 192.168.123.103:0/1460819970 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 
==== 214+0+0 (secure 0 0 0) 0x7f7f20011540 con 0x7f7f34106760 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.538+0000 7f7f3ab76700 1 -- 192.168.123.103:0/1460819970 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7f34198a20 con 0x7f7f34106760 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.538+0000 7f7f3ab76700 1 -- 192.168.123.103:0/1460819970 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7f34198e40 con 0x7f7f34106760 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.539+0000 7f7f31ffb700 1 -- 192.168.123.103:0/1460819970 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 11) v1 ==== 45185+0+0 (secure 0 0 0) 0x7f7f200116a0 con 0x7f7f34106760 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.539+0000 7f7f3ab76700 1 -- 192.168.123.103:0/1460819970 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7f34192050 con 0x7f7f34106760 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.539+0000 7f7f31ffb700 1 --2- 192.168.123.103:0/1460819970 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f24038310 0x7f7f2403a7d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.539+0000 7f7f31ffb700 1 -- 192.168.123.103:0/1460819970 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f7f2004ba20 con 0x7f7f34106760 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.540+0000 7f7f33fff700 1 --2- 192.168.123.103:0/1460819970 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f24038310 0x7f7f2403a7d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.540+0000 7f7f33fff700 1 --2- 192.168.123.103:0/1460819970 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f24038310 0x7f7f2403a7d0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f7f28006fd0 tx=0x7f7f28006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.542+0000 7f7f31ffb700 1 -- 192.168.123.103:0/1460819970 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f7f20010740 con 0x7f7f34106760 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.648+0000 7f7f3ab76700 1 -- 192.168.123.103:0/1460819970 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"} v 0) v1 -- 0x7f7f340623c0 con 0x7f7f34106760 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.650+0000 7f7f31ffb700 1 
-- 192.168.123.103:0/1460819970 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]=0 v8) v1 ==== 112+0+5 (secure 0 0 0) 0x7f7f2001e070 con 0x7f7f34106760 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.653+0000 7f7f3ab76700 1 -- 192.168.123.103:0/1460819970 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f24038310 msgr2=0x7f7f2403a7d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.653+0000 7f7f3ab76700 1 --2- 192.168.123.103:0/1460819970 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f24038310 0x7f7f2403a7d0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f7f28006fd0 tx=0x7f7f28006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.653+0000 7f7f3ab76700 1 -- 192.168.123.103:0/1460819970 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f34106760 msgr2=0x7f7f341982e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.653+0000 7f7f3ab76700 1 --2- 192.168.123.103:0/1460819970 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f34106760 0x7f7f341982e0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f7f20000c00 tx=0x7f7f2000bfa0 comp rx=0 tx=0).stop 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.653+0000 7f7f3ab76700 1 -- 192.168.123.103:0/1460819970 shutdown_connections 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.653+0000 7f7f3ab76700 1 --2- 192.168.123.103:0/1460819970 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f24038310 0x7f7f2403a7d0 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.653+0000 7f7f3ab76700 1 --2- 192.168.123.103:0/1460819970 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f34106760 0x7f7f341982e0 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.653+0000 7f7f3ab76700 1 -- 192.168.123.103:0/1460819970 >> 192.168.123.103:0/1460819970 conn(0x7f7f34101ce0 msgr2=0x7f7f341037a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.654+0000 7f7f3ab76700 1 -- 192.168.123.103:0/1460819970 shutdown_connections 2026-03-08T23:59:27.705 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.654+0000 7f7f3ab76700 1 -- 192.168.123.103:0/1460819970 wait complete. 2026-03-08T23:59:27.706 INFO:teuthology.orchestra.run.vm03.stdout:firewalld does not appear to be present 2026-03-08T23:59:27.706 INFO:teuthology.orchestra.run.vm03.stdout:Not possible to open ports <[8443]>. 
firewalld.service is not available 2026-03-08T23:59:27.706 INFO:teuthology.orchestra.run.vm03.stdout:Ceph Dashboard is now available at: 2026-03-08T23:59:27.706 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T23:59:27.706 INFO:teuthology.orchestra.run.vm03.stdout: URL: https://vm03.local:8443/ 2026-03-08T23:59:27.706 INFO:teuthology.orchestra.run.vm03.stdout: User: admin 2026-03-08T23:59:27.706 INFO:teuthology.orchestra.run.vm03.stdout: Password: 52hwpde2i2 2026-03-08T23:59:27.706 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T23:59:27.706 INFO:teuthology.orchestra.run.vm03.stdout:Saving cluster configuration to /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config directory 2026-03-08T23:59:27.706 INFO:teuthology.orchestra.run.vm03.stdout:Enabling autotune for osd_memory_target 2026-03-08T23:59:28.000 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.827+0000 7f57d3643700 1 Processor -- start 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.828+0000 7f57d3643700 1 -- start start 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.828+0000 7f57d3643700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57cc108970 0x7f57cc108d90 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.828+0000 7f57d3643700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f57cc109360 con 0x7f57cc108970 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.828+0000 7f57d13df700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57cc108970 0x7f57cc108d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.829+0000 7f57d13df700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57cc108970 0x7f57cc108d90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41072/0 (socket says 192.168.123.103:41072) 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.829+0000 7f57d13df700 1 -- 192.168.123.103:0/1235229047 learned_addr learned my addr 192.168.123.103:0/1235229047 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.829+0000 7f57d13df700 1 -- 192.168.123.103:0/1235229047 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f57cc109b70 con 0x7f57cc108970 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.829+0000 7f57d13df700 1 --2- 192.168.123.103:0/1235229047 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57cc108970 0x7f57cc108d90 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f57bc009a90 tx=0x7f57bc009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=7077df4e59ed73cd server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:28.001 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.830+0000 7f57c3fff700 1 -- 192.168.123.103:0/1235229047 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f57bc004030 con 0x7f57cc108970 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.830+0000 7f57c3fff700 1 -- 192.168.123.103:0/1235229047 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f57bc00b7e0 con 0x7f57cc108970 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.830+0000 7f57c3fff700 1 -- 192.168.123.103:0/1235229047 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f57bc003b30 con 0x7f57cc108970 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.830+0000 7f57d3643700 1 -- 192.168.123.103:0/1235229047 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57cc108970 msgr2=0x7f57cc108d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.830+0000 7f57d3643700 1 --2- 192.168.123.103:0/1235229047 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57cc108970 0x7f57cc108d90 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f57bc009a90 tx=0x7f57bc009da0 comp rx=0 tx=0).stop 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.830+0000 7f57d3643700 1 -- 192.168.123.103:0/1235229047 shutdown_connections 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.830+0000 7f57d3643700 1 --2- 192.168.123.103:0/1235229047 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57cc108970 0x7f57cc108d90 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.830+0000 7f57d3643700 1 -- 192.168.123.103:0/1235229047 >> 192.168.123.103:0/1235229047 conn(0x7f57cc07be30 msgr2=0x7f57cc1064e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.831+0000 7f57d3643700 1 -- 192.168.123.103:0/1235229047 shutdown_connections 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.831+0000 7f57d3643700 1 -- 192.168.123.103:0/1235229047 wait complete. 
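The session above fetched the dashboard port with mon_command {"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"} (stdout: 8443), which feeds the "Ceph Dashboard is now available at" banner. A sketch of recovering the same endpoint programmatically rather than scraping the banner, assuming admin CLI access; `ceph mgr services` reports the URL of each enabled mgr module:

    # Sketch: read the dashboard URL from the mgr service map.
    import json, subprocess

    def dashboard_url():
        out = subprocess.run(
            ["ceph", "mgr", "services", "--format", "json"],
            check=True, capture_output=True, text=True,
        ).stdout
        # e.g. {"dashboard": "https://vm03.local:8443/"}
        return json.loads(out).get("dashboard")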
2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.831+0000 7f57d3643700 1 Processor -- start 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.831+0000 7f57d3643700 1 -- start start 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.832+0000 7f57d3643700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57cc108970 0x7f57cc19c620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.832+0000 7f57d3643700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f57cc19cb60 con 0x7f57cc108970 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.832+0000 7f57d13df700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57cc108970 0x7f57cc19c620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.832+0000 7f57d13df700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57cc108970 0x7f57cc19c620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41076/0 (socket says 192.168.123.103:41076) 2026-03-08T23:59:28.001 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.832+0000 7f57d13df700 1 -- 192.168.123.103:0/871998266 learned_addr learned my addr 192.168.123.103:0/871998266 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:28.002 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.832+0000 7f57d13df700 1 -- 192.168.123.103:0/871998266 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f57bc009740 con 0x7f57cc108970 2026-03-08T23:59:28.002 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.833+0000 7f57d13df700 1 --2- 192.168.123.103:0/871998266 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57cc108970 0x7f57cc19c620 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f57bc000c00 tx=0x7f57bc00bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:28.002 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.833+0000 7f57c27fc700 1 -- 192.168.123.103:0/871998266 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f57bc0040f0 con 0x7f57cc108970 2026-03-08T23:59:28.002 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.833+0000 7f57c27fc700 1 -- 192.168.123.103:0/871998266 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f57bc004250 con 0x7f57cc108970 2026-03-08T23:59:28.002 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.833+0000 7f57d3643700 1 -- 192.168.123.103:0/871998266 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f57cc19cd60 con 0x7f57cc108970 2026-03-08T23:59:28.002 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.833+0000 7f57c27fc700 1 -- 192.168.123.103:0/871998266 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f57bc011560 con 0x7f57cc108970 2026-03-08T23:59:28.002 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.833+0000 7f57d3643700 1 -- 192.168.123.103:0/871998266 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f57cc19d180 con 0x7f57cc108970 2026-03-08T23:59:28.002 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.834+0000 7f57c27fc700 1 -- 192.168.123.103:0/871998266 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 11) v1 ==== 45185+0+0 (secure 0 0 0) 0x7f57bc0116c0 con 0x7f57cc108970 2026-03-08T23:59:28.002 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.834+0000 7f57c27fc700 1 --2- 192.168.123.103:0/871998266 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57b80382d0 0x7f57b803a790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:28.002 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.834+0000 7f57c27fc700 1 -- 192.168.123.103:0/871998266 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f57bc04ca90 con 0x7f57cc108970 2026-03-08T23:59:28.002 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.834+0000 7f57d0bde700 1 --2- 192.168.123.103:0/871998266 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57b80382d0 0x7f57b803a790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:28.002 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.835+0000 7f57d0bde700 1 --2- 192.168.123.103:0/871998266 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57b80382d0 0x7f57b803a790 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f57c8006fd0 tx=0x7f57c8006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:28.002 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.835+0000 7f57d3643700 1 -- 192.168.123.103:0/871998266 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f57cc04fa20 con 0x7f57cc108970 2026-03-08T23:59:28.002 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.838+0000 7f57c27fc700 1 -- 192.168.123.103:0/871998266 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f57bc01e070 con 0x7f57cc108970 2026-03-08T23:59:28.002 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.940+0000 7f57d3643700 1 -- 192.168.123.103:0/871998266 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=osd_memory_target_autotune}] v 0) v1 -- 0x7f57cc19fbc0 con 0x7f57cc108970 2026-03-08T23:59:28.002 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.941+0000 7f57c27fc700 1 -- 192.168.123.103:0/871998266 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, 
name=osd_memory_target_autotune}]=0 v8) v1 ==== 127+0+0 (secure 0 0 0) 0x7f57bc04b0c0 con 0x7f57cc108970 2026-03-08T23:59:28.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.945+0000 7f57d3643700 1 -- 192.168.123.103:0/871998266 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57b80382d0 msgr2=0x7f57b803a790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:28.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.945+0000 7f57d3643700 1 --2- 192.168.123.103:0/871998266 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57b80382d0 0x7f57b803a790 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f57c8006fd0 tx=0x7f57c8006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:28.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.945+0000 7f57d3643700 1 -- 192.168.123.103:0/871998266 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57cc108970 msgr2=0x7f57cc19c620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:28.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.945+0000 7f57d3643700 1 --2- 192.168.123.103:0/871998266 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57cc108970 0x7f57cc19c620 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f57bc000c00 tx=0x7f57bc00bfa0 comp rx=0 tx=0).stop 2026-03-08T23:59:28.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.946+0000 7f57d3643700 1 -- 192.168.123.103:0/871998266 shutdown_connections 2026-03-08T23:59:28.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.946+0000 7f57d3643700 1 --2- 192.168.123.103:0/871998266 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f57b80382d0 0x7f57b803a790 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:28.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.946+0000 7f57d3643700 1 --2- 192.168.123.103:0/871998266 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f57cc108970 0x7f57cc19c620 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:28.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.946+0000 7f57d3643700 1 -- 192.168.123.103:0/871998266 >> 192.168.123.103:0/871998266 conn(0x7f57cc07be30 msgr2=0x7f57cc105d00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:28.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.947+0000 7f57d3643700 1 -- 192.168.123.103:0/871998266 shutdown_connections 2026-03-08T23:59:28.003 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:27.947+0000 7f57d3643700 1 -- 192.168.123.103:0/871998266 wait complete.
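This session carries the "Enabling autotune for osd_memory_target" step: a mon_command config set for osd_memory_target_autotune, acknowledged with status 0. The printed mon_command elides the who/value fields, so the exact invocation below is an assumption based on what a cephadm bootstrap normally issues:

    # Sketch of the autotune toggle logged above; the section ("osd") and
    # value ("true") are assumed, since the log line elides them.
    import subprocess

    subprocess.run(
        ["ceph", "config", "set", "osd", "osd_memory_target_autotune", "true"],
        check=True,
    )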
2026-03-08T23:59:28.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.169+0000 7f5bfb202700 1 Processor -- start 2026-03-08T23:59:28.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.170+0000 7f5bfb202700 1 -- start start 2026-03-08T23:59:28.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.170+0000 7f5bfb202700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bf4106760 0x7f5bf4106b80 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:28.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.170+0000 7f5bfb202700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5bf4107150 con 0x7f5bf4106760 2026-03-08T23:59:28.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.171+0000 7f5bf8f9e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bf4106760 0x7f5bf4106b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:28.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.171+0000 7f5bf8f9e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bf4106760 0x7f5bf4106b80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41078/0 (socket says 192.168.123.103:41078) 2026-03-08T23:59:28.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.171+0000 7f5bf8f9e700 1 -- 192.168.123.103:0/655652415 learned_addr learned my addr 192.168.123.103:0/655652415 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:28.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.173+0000 7f5bf8f9e700 1 -- 192.168.123.103:0/655652415 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5bf4107960 con 0x7f5bf4106760 2026-03-08T23:59:28.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.173+0000 7f5bf8f9e700 1 --2- 192.168.123.103:0/655652415 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bf4106760 0x7f5bf4106b80 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f5be0009cf0 tx=0x7f5be000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=35d8428c3a91dca5 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:28.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.173+0000 7f5bf37fe700 1 -- 192.168.123.103:0/655652415 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5be0004030 con 0x7f5bf4106760 2026-03-08T23:59:28.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.173+0000 7f5bf37fe700 1 -- 192.168.123.103:0/655652415 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5be000b810 con 0x7f5bf4106760 2026-03-08T23:59:28.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.174+0000 7f5bf37fe700 1 -- 192.168.123.103:0/655652415 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5be0003b10 con 0x7f5bf4106760 
2026-03-08T23:59:28.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.174+0000 7f5bfb202700 1 -- 192.168.123.103:0/655652415 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bf4106760 msgr2=0x7f5bf4106b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:28.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.174+0000 7f5bfb202700 1 --2- 192.168.123.103:0/655652415 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bf4106760 0x7f5bf4106b80 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f5be0009cf0 tx=0x7f5be000b0e0 comp rx=0 tx=0).stop 2026-03-08T23:59:28.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.174+0000 7f5bfb202700 1 -- 192.168.123.103:0/655652415 shutdown_connections 2026-03-08T23:59:28.398 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.174+0000 7f5bfb202700 1 --2- 192.168.123.103:0/655652415 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bf4106760 0x7f5bf4106b80 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.174+0000 7f5bfb202700 1 -- 192.168.123.103:0/655652415 >> 192.168.123.103:0/655652415 conn(0x7f5bf4101ce0 msgr2=0x7f5bf4104140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.175+0000 7f5bfb202700 1 -- 192.168.123.103:0/655652415 shutdown_connections 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.175+0000 7f5bfb202700 1 -- 192.168.123.103:0/655652415 wait complete. 
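This shorter round trip fetches only what a bare client needs at startup: mon_subscribe({config=0+,monmap=0+}) pulls the monmap plus the mon's centralized config, delivered as the config(28 keys) message above. Both can be inspected directly; a sketch assuming admin credentials on the bootstrap host:

    # the monitor map the client just subscribed to
    ceph mon dump
    # the centralized config database the mon pushed as config(28 keys)
    ceph config dump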
2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.175+0000 7f5bfb202700 1 Processor -- start 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.175+0000 7f5bfb202700 1 -- start start 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.176+0000 7f5bfb202700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bf4106760 0x7f5bf419c710 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.176+0000 7f5bfb202700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5bf419cc50 con 0x7f5bf4106760 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.176+0000 7f5bf8f9e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bf4106760 0x7f5bf419c710 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.176+0000 7f5bf8f9e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bf4106760 0x7f5bf419c710 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41084/0 (socket says 192.168.123.103:41084) 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.176+0000 7f5bf8f9e700 1 -- 192.168.123.103:0/3662645314 learned_addr learned my addr 192.168.123.103:0/3662645314 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.176+0000 7f5bf8f9e700 1 -- 192.168.123.103:0/3662645314 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5be0009740 con 0x7f5bf4106760 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.177+0000 7f5bf8f9e700 1 --2- 192.168.123.103:0/3662645314 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bf4106760 0x7f5bf419c710 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f5be0000c00 tx=0x7f5be0011890 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.177+0000 7f5bf1ffb700 1 -- 192.168.123.103:0/3662645314 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5be0011bc0 con 0x7f5bf4106760 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.177+0000 7f5bf1ffb700 1 -- 192.168.123.103:0/3662645314 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5be0011d20 con 0x7f5bf4106760 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.177+0000 7f5bf1ffb700 1 -- 192.168.123.103:0/3662645314 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5be001a550 con 0x7f5bf4106760 2026-03-08T23:59:28.399 
INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.177+0000 7f5bfb202700 1 -- 192.168.123.103:0/3662645314 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5bf419ce50 con 0x7f5bf4106760 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.177+0000 7f5bfb202700 1 -- 192.168.123.103:0/3662645314 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5bf419d1b0 con 0x7f5bf4106760 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.178+0000 7f5bf1ffb700 1 -- 192.168.123.103:0/3662645314 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 11) v1 ==== 45185+0+0 (secure 0 0 0) 0x7f5be001b440 con 0x7f5bf4106760 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.178+0000 7f5bfb202700 1 -- 192.168.123.103:0/3662645314 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5bf404fa20 con 0x7f5bf4106760 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.178+0000 7f5bf1ffb700 1 --2- 192.168.123.103:0/3662645314 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5be40382d0 0x7f5be403a790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.178+0000 7f5bf1ffb700 1 -- 192.168.123.103:0/3662645314 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f5be004c060 con 0x7f5bf4106760 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.178+0000 7f5bf3fff700 1 --2- 192.168.123.103:0/3662645314 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5be40382d0 0x7f5be403a790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.181+0000 7f5bf1ffb700 1 -- 192.168.123.103:0/3662645314 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f5be007f0e0 con 0x7f5bf4106760 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.182+0000 7f5bf3fff700 1 --2- 192.168.123.103:0/3662645314 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5be40382d0 0x7f5be403a790 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f5be8006fd0 tx=0x7f5be8006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.354+0000 7f5bfb202700 1 -- 192.168.123.103:0/3662645314 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config-key set, key=mgr/dashboard/cluster/status}] v 0) v1 -- 0x7f5bf40623c0 con 0x7f5bf4106760 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.354+0000 7f5bf1ffb700 1 -- 192.168.123.103:0/3662645314 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 12) v1 ==== 45291+0+0 
(secure 0 0 0) 0x7f5be000fda0 con 0x7f5bf4106760 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.358+0000 7f5bf1ffb700 1 -- 192.168.123.103:0/3662645314 <== mon.0 v2:192.168.123.103:3300/0 8 ==== mon_command_ack([{prefix=config-key set, key=mgr/dashboard/cluster/status}]=0 set mgr/dashboard/cluster/status v34) v1 ==== 153+0+0 (secure 0 0 0) 0x7f5be0053020 con 0x7f5bf4106760 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.360+0000 7f5bfb202700 1 -- 192.168.123.103:0/3662645314 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5be40382d0 msgr2=0x7f5be403a790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.360+0000 7f5bfb202700 1 --2- 192.168.123.103:0/3662645314 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5be40382d0 0x7f5be403a790 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f5be8006fd0 tx=0x7f5be8006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.360+0000 7f5bfb202700 1 -- 192.168.123.103:0/3662645314 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bf4106760 msgr2=0x7f5bf419c710 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.360+0000 7f5bfb202700 1 --2- 192.168.123.103:0/3662645314 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bf4106760 0x7f5bf419c710 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f5be0000c00 tx=0x7f5be0011890 comp rx=0 tx=0).stop 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.360+0000 7f5bfb202700 1 -- 192.168.123.103:0/3662645314 shutdown_connections 2026-03-08T23:59:28.399 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.360+0000 7f5bfb202700 1 --2- 192.168.123.103:0/3662645314 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5be40382d0 0x7f5be403a790 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.360+0000 7f5bfb202700 1 --2- 192.168.123.103:0/3662645314 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5bf4106760 0x7f5bf419c710 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.360+0000 7f5bfb202700 1 -- 192.168.123.103:0/3662645314 >> 192.168.123.103:0/3662645314 conn(0x7f5bf4101ce0 msgr2=0x7f5bf4102a70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.361+0000 7f5bfb202700 1 -- 192.168.123.103:0/3662645314 shutdown_connections 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr 2026-03-08T23:59:28.361+0000 7f5bfb202700 1 -- 192.168.123.103:0/3662645314 wait complete.
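The ack above shows the dashboard module persisting its cluster-status marker in the mon's config-key store during bootstrap. Entries in that store can be listed and read back; a sketch using the key named in the ack:

    # list dashboard-related keys in the mon config-key store
    ceph config-key ls | grep mgr/dashboard
    # read back the status blob written above (v34 in the ack)
    ceph config-key get mgr/dashboard/cluster/status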
2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout:/usr/bin/ceph: stderr set mgr/dashboard/cluster/status 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout:You can access the Ceph CLI as following in case of multi-cluster or non-default config: 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout: sudo /home/ubuntu/cephtest/cephadm shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout:Or, if you are only running a single cluster on this host: 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout: sudo /home/ubuntu/cephtest/cephadm shell 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout:Please consider enabling telemetry to help improve Ceph: 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout: ceph telemetry on 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout:For more information see: 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout: https://docs.ceph.com/en/latest/mgr/telemetry/ 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T23:59:28.400 INFO:teuthology.orchestra.run.vm03.stdout:Bootstrap complete. 2026-03-08T23:59:28.428 INFO:tasks.cephadm:Fetching config... 2026-03-08T23:59:28.428 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-08T23:59:28.429 DEBUG:teuthology.orchestra.run.vm03:> dd if=/etc/ceph/ceph.conf of=/dev/stdout 2026-03-08T23:59:28.453 INFO:tasks.cephadm:Fetching client.admin keyring... 2026-03-08T23:59:28.453 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-08T23:59:28.453 DEBUG:teuthology.orchestra.run.vm03:> dd if=/etc/ceph/ceph.client.admin.keyring of=/dev/stdout 2026-03-08T23:59:28.524 INFO:tasks.cephadm:Fetching mon keyring... 2026-03-08T23:59:28.524 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-08T23:59:28.524 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/keyring of=/dev/stdout 2026-03-08T23:59:28.596 INFO:tasks.cephadm:Fetching pub ssh key... 2026-03-08T23:59:28.596 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-08T23:59:28.596 DEBUG:teuthology.orchestra.run.vm03:> dd if=/home/ubuntu/cephtest/ceph.pub of=/dev/stdout 2026-03-08T23:59:28.653 INFO:tasks.cephadm:Installing pub ssh key for root users... 
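With bootstrap complete, the task copies four artifacts off vm03: ceph.conf, the client.admin keyring, the mon keyring, and the cluster SSH public key (teuthology uses dd ... of=/dev/stdout rather than cat so the same remote-run helper handles both reads and writes). The interactive entry point it just printed, pinned to this run's fsid:

    # enter a cephadm shell for this specific cluster
    sudo /home/ubuntu/cephtest/cephadm shell \
        --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 \
        -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring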
2026-03-08T23:59:28.654 DEBUG:teuthology.orchestra.run.vm03:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCseP8TJOsuxzWCbi/GtSh4sYcQoMttAx1MhEMLSc14Nydc0FYopBRf0tLWGHk7uryiK+KgRGc/F38yTXBVyXnRKXBNzpZLrgv75RC378qs8u2LftRPgtJVPXoEGeVLFj4MuFXcwmyWvFG6Ol81Mnd2qSP+59CQpY8HaESywjZqCmmmlhbcHaaxI5mjNOTnSPApQBqSKhRuq3wNS2WijDY6FtjM0ellzR7fUqIapI9QMKEE9PhlAE2V0Gt/sKpClCRKzfzbnv0vxw6tThHM83gT9BghsKKkPEKXQrtic3Wa6fAjRzaO1oG54kxahVGI0RIXfxYIvOtOigANo2MCwvQYM4bJ/ZT8CTDNS9bFv//wO77HhKVn+0Bq6JjhgTmdEaF2Xmn8Lwaj0LXsaX50ZouZ/mNQNS9qsx1uXC6vqFK71v+8uVIR5sEKTtUA2hGrEwGDPcQylsKHHL8gVEdXFh1FiEJAprE9OOVUlyANfwVwNkzxENVgf8wEaGCyfZYOyb0= ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-08T23:59:28.723 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:28 vm03 ceph-mon[52346]: from='client.14176 -' entity='client.admin' cmd=[{"prefix": "dashboard create-self-signed-cert", "target": ["mon-mgr", ""]}]: dispatch 2026-03-08T23:59:28.723 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:28 vm03 ceph-mon[52346]: from='client.14178 -' entity='client.admin' cmd=[{"prefix": "dashboard ac-user-create", "username": "admin", "rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}]: dispatch 2026-03-08T23:59:28.723 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:28 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/1460819970' entity='client.admin' cmd=[{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]: dispatch 2026-03-08T23:59:28.723 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:28 vm03 ceph-mon[52346]: mgrmap e12: vm03.yvcons(active, since 2s) 2026-03-08T23:59:28.723 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:28 vm03 ceph-mon[52346]: from='client.? 
192.168.123.103:0/3662645314' entity='client.admin' 2026-03-08T23:59:28.740 INFO:teuthology.orchestra.run.vm03.stdout:ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCseP8TJOsuxzWCbi/GtSh4sYcQoMttAx1MhEMLSc14Nydc0FYopBRf0tLWGHk7uryiK+KgRGc/F38yTXBVyXnRKXBNzpZLrgv75RC378qs8u2LftRPgtJVPXoEGeVLFj4MuFXcwmyWvFG6Ol81Mnd2qSP+59CQpY8HaESywjZqCmmmlhbcHaaxI5mjNOTnSPApQBqSKhRuq3wNS2WijDY6FtjM0ellzR7fUqIapI9QMKEE9PhlAE2V0Gt/sKpClCRKzfzbnv0vxw6tThHM83gT9BghsKKkPEKXQrtic3Wa6fAjRzaO1oG54kxahVGI0RIXfxYIvOtOigANo2MCwvQYM4bJ/ZT8CTDNS9bFv//wO77HhKVn+0Bq6JjhgTmdEaF2Xmn8Lwaj0LXsaX50ZouZ/mNQNS9qsx1uXC6vqFK71v+8uVIR5sEKTtUA2hGrEwGDPcQylsKHHL8gVEdXFh1FiEJAprE9OOVUlyANfwVwNkzxENVgf8wEaGCyfZYOyb0= ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7 2026-03-08T23:59:28.750 DEBUG:teuthology.orchestra.run.vm06:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCseP8TJOsuxzWCbi/GtSh4sYcQoMttAx1MhEMLSc14Nydc0FYopBRf0tLWGHk7uryiK+KgRGc/F38yTXBVyXnRKXBNzpZLrgv75RC378qs8u2LftRPgtJVPXoEGeVLFj4MuFXcwmyWvFG6Ol81Mnd2qSP+59CQpY8HaESywjZqCmmmlhbcHaaxI5mjNOTnSPApQBqSKhRuq3wNS2WijDY6FtjM0ellzR7fUqIapI9QMKEE9PhlAE2V0Gt/sKpClCRKzfzbnv0vxw6tThHM83gT9BghsKKkPEKXQrtic3Wa6fAjRzaO1oG54kxahVGI0RIXfxYIvOtOigANo2MCwvQYM4bJ/ZT8CTDNS9bFv//wO77HhKVn+0Bq6JjhgTmdEaF2Xmn8Lwaj0LXsaX50ZouZ/mNQNS9qsx1uXC6vqFK71v+8uVIR5sEKTtUA2hGrEwGDPcQylsKHHL8gVEdXFh1FiEJAprE9OOVUlyANfwVwNkzxENVgf8wEaGCyfZYOyb0= ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-08T23:59:28.787 INFO:teuthology.orchestra.run.vm06.stdout:ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCseP8TJOsuxzWCbi/GtSh4sYcQoMttAx1MhEMLSc14Nydc0FYopBRf0tLWGHk7uryiK+KgRGc/F38yTXBVyXnRKXBNzpZLrgv75RC378qs8u2LftRPgtJVPXoEGeVLFj4MuFXcwmyWvFG6Ol81Mnd2qSP+59CQpY8HaESywjZqCmmmlhbcHaaxI5mjNOTnSPApQBqSKhRuq3wNS2WijDY6FtjM0ellzR7fUqIapI9QMKEE9PhlAE2V0Gt/sKpClCRKzfzbnv0vxw6tThHM83gT9BghsKKkPEKXQrtic3Wa6fAjRzaO1oG54kxahVGI0RIXfxYIvOtOigANo2MCwvQYM4bJ/ZT8CTDNS9bFv//wO77HhKVn+0Bq6JjhgTmdEaF2Xmn8Lwaj0LXsaX50ZouZ/mNQNS9qsx1uXC6vqFK71v+8uVIR5sEKTtUA2hGrEwGDPcQylsKHHL8gVEdXFh1FiEJAprE9OOVUlyANfwVwNkzxENVgf8wEaGCyfZYOyb0= ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7 2026-03-08T23:59:28.798 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph config set mgr mgr/cephadm/allow_ptrace true 2026-03-08T23:59:28.963 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-08T23:59:29.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.284+0000 7f3dd880a700 1 -- 192.168.123.103:0/2343725919 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dd0102af0 msgr2=0x7f3dd0102f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:29.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.284+0000 7f3dd880a700 1 --2- 192.168.123.103:0/2343725919 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dd0102af0 0x7f3dd0102f10 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f3dc4009b50 tx=0x7f3dc4009e60 comp rx=0 tx=0).stop 2026-03-08T23:59:29.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.285+0000 7f3dd880a700 1 -- 192.168.123.103:0/2343725919 shutdown_connections 2026-03-08T23:59:29.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.285+0000 7f3dd880a700 1 --2- 192.168.123.103:0/2343725919 
>> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dd0102af0 0x7f3dd0102f10 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:29.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.285+0000 7f3dd880a700 1 -- 192.168.123.103:0/2343725919 >> 192.168.123.103:0/2343725919 conn(0x7f3dd00fe070 msgr2=0x7f3dd01004d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:29.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.285+0000 7f3dd880a700 1 -- 192.168.123.103:0/2343725919 shutdown_connections 2026-03-08T23:59:29.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.285+0000 7f3dd880a700 1 -- 192.168.123.103:0/2343725919 wait complete. 2026-03-08T23:59:29.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.286+0000 7f3dd880a700 1 Processor -- start 2026-03-08T23:59:29.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.286+0000 7f3dd880a700 1 -- start start 2026-03-08T23:59:29.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.286+0000 7f3dd880a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dd0102af0 0x7f3dd0197c60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:29.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.286+0000 7f3dd880a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3dd01981a0 con 0x7f3dd0102af0 2026-03-08T23:59:29.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.286+0000 7f3dd65a6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dd0102af0 0x7f3dd0197c60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:29.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.286+0000 7f3dd65a6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dd0102af0 0x7f3dd0197c60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41106/0 (socket says 192.168.123.103:41106) 2026-03-08T23:59:29.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.286+0000 7f3dd65a6700 1 -- 192.168.123.103:0/2698287179 learned_addr learned my addr 192.168.123.103:0/2698287179 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:29.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.286+0000 7f3dd65a6700 1 -- 192.168.123.103:0/2698287179 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3dc40097e0 con 0x7f3dd0102af0 2026-03-08T23:59:29.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.287+0000 7f3dd65a6700 1 --2- 192.168.123.103:0/2698287179 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dd0102af0 0x7f3dd0197c60 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f3dc4005950 tx=0x7f3dc40050d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:29.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.287+0000 7f3dc37fe700 1 -- 192.168.123.103:0/2698287179 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3dc401c070 con 0x7f3dd0102af0 2026-03-08T23:59:29.288 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.287+0000 7f3dd880a700 1 -- 192.168.123.103:0/2698287179 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3dd01983a0 con 0x7f3dd0102af0 2026-03-08T23:59:29.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.287+0000 7f3dd880a700 1 -- 192.168.123.103:0/2698287179 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3dd0198840 con 0x7f3dd0102af0 2026-03-08T23:59:29.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.287+0000 7f3dc37fe700 1 -- 192.168.123.103:0/2698287179 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3dc40056d0 con 0x7f3dd0102af0 2026-03-08T23:59:29.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.287+0000 7f3dc37fe700 1 -- 192.168.123.103:0/2698287179 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3dc400f460 con 0x7f3dd0102af0 2026-03-08T23:59:29.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.288+0000 7f3dd880a700 1 -- 192.168.123.103:0/2698287179 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3db4005320 con 0x7f3dd0102af0 2026-03-08T23:59:29.292 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.292+0000 7f3dc37fe700 1 -- 192.168.123.103:0/2698287179 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 45291+0+0 (secure 0 0 0) 0x7f3dc4021870 con 0x7f3dd0102af0 2026-03-08T23:59:29.292 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.292+0000 7f3dc37fe700 1 --2- 192.168.123.103:0/2698287179 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3dbc040bf0 0x7f3dbc0430b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:29.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.292+0000 7f3dc37fe700 1 -- 192.168.123.103:0/2698287179 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f3dc404cc10 con 0x7f3dd0102af0 2026-03-08T23:59:29.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.292+0000 7f3dc37fe700 1 -- 192.168.123.103:0/2698287179 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3dc4079180 con 0x7f3dd0102af0 2026-03-08T23:59:29.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.292+0000 7f3dd5da5700 1 --2- 192.168.123.103:0/2698287179 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3dbc040bf0 0x7f3dbc0430b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:29.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.293+0000 7f3dd5da5700 1 --2- 192.168.123.103:0/2698287179 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3dbc040bf0 0x7f3dbc0430b0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f3dc8006fd0 tx=0x7f3dc8006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:29.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.445+0000 7f3dd880a700 1 -- 192.168.123.103:0/2698287179 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, 
name=mgr/cephadm/allow_ptrace}] v 0) v1 -- 0x7f3db4005cc0 con 0x7f3dd0102af0 2026-03-08T23:59:29.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.462+0000 7f3dc37fe700 1 -- 192.168.123.103:0/2698287179 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/allow_ptrace}]=0 v9) v1 ==== 125+0+0 (secure 0 0 0) 0x7f3dc4026020 con 0x7f3dd0102af0 2026-03-08T23:59:29.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.467+0000 7f3dc17fa700 1 -- 192.168.123.103:0/2698287179 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3dbc040bf0 msgr2=0x7f3dbc0430b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:29.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.467+0000 7f3dc17fa700 1 --2- 192.168.123.103:0/2698287179 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3dbc040bf0 0x7f3dbc0430b0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f3dc8006fd0 tx=0x7f3dc8006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:29.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.467+0000 7f3dc17fa700 1 -- 192.168.123.103:0/2698287179 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dd0102af0 msgr2=0x7f3dd0197c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:29.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.467+0000 7f3dc17fa700 1 --2- 192.168.123.103:0/2698287179 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dd0102af0 0x7f3dd0197c60 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f3dc4005950 tx=0x7f3dc40050d0 comp rx=0 tx=0).stop 2026-03-08T23:59:29.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.468+0000 7f3dc17fa700 1 -- 192.168.123.103:0/2698287179 shutdown_connections 2026-03-08T23:59:29.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.468+0000 7f3dc17fa700 1 --2- 192.168.123.103:0/2698287179 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f3dbc040bf0 0x7f3dbc0430b0 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:29.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.468+0000 7f3dc17fa700 1 --2- 192.168.123.103:0/2698287179 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3dd0102af0 0x7f3dd0197c60 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:29.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.468+0000 7f3dc17fa700 1 -- 192.168.123.103:0/2698287179 >> 192.168.123.103:0/2698287179 conn(0x7f3dd00fe070 msgr2=0x7f3dd00fed50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:29.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.468+0000 7f3dc17fa700 1 -- 192.168.123.103:0/2698287179 shutdown_connections 2026-03-08T23:59:29.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:29.468+0000 7f3dc17fa700 1 -- 192.168.123.103:0/2698287179 wait complete.
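The ack (config version v9) confirms mgr/cephadm/allow_ptrace is stored; cephadm will now launch daemon containers with SYS_PTRACE so debuggers can attach inside them. A one-line check, run through the same cephadm shell wrapper as the command above:

    # verify the mgr-scoped option took effect
    ceph config get mgr mgr/cephadm/allow_ptrace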
2026-03-08T23:59:29.603 INFO:tasks.cephadm:Distributing conf and client.admin keyring to all hosts + 0755 2026-03-08T23:59:29.603 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph orch client-keyring set client.admin '*' --mode 0755 2026-03-08T23:59:29.799 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-08T23:59:30.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.132+0000 7f264d796700 1 -- 192.168.123.103:0/1619121754 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f26480ff960 msgr2=0x7f26480ffd80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:30.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.132+0000 7f264d796700 1 --2- 192.168.123.103:0/1619121754 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f26480ff960 0x7f26480ffd80 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f2630009b00 tx=0x7f2630009e10 comp rx=0 tx=0).stop 2026-03-08T23:59:30.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.133+0000 7f264d796700 1 -- 192.168.123.103:0/1619121754 shutdown_connections 2026-03-08T23:59:30.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.133+0000 7f264d796700 1 --2- 192.168.123.103:0/1619121754 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f26480ff960 0x7f26480ffd80 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:30.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.133+0000 7f264d796700 1 -- 192.168.123.103:0/1619121754 >> 192.168.123.103:0/1619121754 conn(0x7f26480faf00 msgr2=0x7f26480fd340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:30.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.134+0000 7f264d796700 1 -- 192.168.123.103:0/1619121754 shutdown_connections 2026-03-08T23:59:30.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.134+0000 7f264d796700 1 -- 192.168.123.103:0/1619121754 wait complete. 
2026-03-08T23:59:30.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.135+0000 7f264d796700 1 Processor -- start 2026-03-08T23:59:30.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.135+0000 7f264d796700 1 -- start start 2026-03-08T23:59:30.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.135+0000 7f264d796700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f26480ff960 0x7f2648196ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:30.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.135+0000 7f264d796700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2648197430 con 0x7f26480ff960 2026-03-08T23:59:30.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.135+0000 7f2646ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f26480ff960 0x7f2648196ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:30.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.135+0000 7f2646ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f26480ff960 0x7f2648196ef0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41128/0 (socket says 192.168.123.103:41128) 2026-03-08T23:59:30.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.135+0000 7f2646ffd700 1 -- 192.168.123.103:0/151452148 learned_addr learned my addr 192.168.123.103:0/151452148 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:30.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.136+0000 7f2646ffd700 1 -- 192.168.123.103:0/151452148 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f26300097e0 con 0x7f26480ff960 2026-03-08T23:59:30.143 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.136+0000 7f2646ffd700 1 --2- 192.168.123.103:0/151452148 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f26480ff960 0x7f2648196ef0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f2630004750 tx=0x7f2630005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:30.143 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.136+0000 7f263ffff700 1 -- 192.168.123.103:0/151452148 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f263002d070 con 0x7f26480ff960 2026-03-08T23:59:30.143 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.136+0000 7f264d796700 1 -- 192.168.123.103:0/151452148 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2648197630 con 0x7f26480ff960 2026-03-08T23:59:30.143 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.136+0000 7f264d796700 1 -- 192.168.123.103:0/151452148 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2648193950 con 0x7f26480ff960 2026-03-08T23:59:30.143 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.136+0000 7f263ffff700 1 -- 192.168.123.103:0/151452148 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2630032470 con 0x7f26480ff960 2026-03-08T23:59:30.143 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.138+0000 7f263ffff700 1 -- 192.168.123.103:0/151452148 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f263000f460 con 0x7f26480ff960 2026-03-08T23:59:30.144 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.138+0000 7f263ffff700 1 -- 192.168.123.103:0/151452148 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 45291+0+0 (secure 0 0 0) 0x7f263000f620 con 0x7f26480ff960 2026-03-08T23:59:30.144 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.138+0000 7f263ffff700 1 --2- 192.168.123.103:0/151452148 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2634038070 0x7f263403a530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:30.144 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.138+0000 7f263ffff700 1 -- 192.168.123.103:0/151452148 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f263005e480 con 0x7f26480ff960 2026-03-08T23:59:30.144 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.138+0000 7f264d796700 1 -- 192.168.123.103:0/151452148 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2628005320 con 0x7f26480ff960 2026-03-08T23:59:30.144 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.141+0000 7f26467fc700 1 --2- 192.168.123.103:0/151452148 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2634038070 0x7f263403a530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:30.144 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.141+0000 7f263ffff700 1 -- 192.168.123.103:0/151452148 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f2630057e60 con 0x7f26480ff960 2026-03-08T23:59:30.144 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.144+0000 7f26467fc700 1 --2- 192.168.123.103:0/151452148 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2634038070 0x7f263403a530 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f2638006fd0 tx=0x7f2638006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:30.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.286+0000 7f264d796700 1 -- 192.168.123.103:0/151452148 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}) v1 -- 0x7f2628000bf0 con 0x7f2634038070 2026-03-08T23:59:30.292 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.292+0000 7f263ffff700 1 -- 192.168.123.103:0/151452148 <== mgr.14164 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f2628000bf0 con 0x7f2634038070 2026-03-08T23:59:30.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.294+0000 7f264d796700 1 -- 192.168.123.103:0/151452148 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2634038070 msgr2=0x7f263403a530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:30.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.294+0000 
7f264d796700 1 --2- 192.168.123.103:0/151452148 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2634038070 0x7f263403a530 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f2638006fd0 tx=0x7f2638006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:30.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.294+0000 7f264d796700 1 -- 192.168.123.103:0/151452148 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f26480ff960 msgr2=0x7f2648196ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:30.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.294+0000 7f264d796700 1 --2- 192.168.123.103:0/151452148 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f26480ff960 0x7f2648196ef0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f2630004750 tx=0x7f2630005dc0 comp rx=0 tx=0).stop 2026-03-08T23:59:30.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.295+0000 7f264d796700 1 -- 192.168.123.103:0/151452148 shutdown_connections 2026-03-08T23:59:30.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.295+0000 7f264d796700 1 --2- 192.168.123.103:0/151452148 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2634038070 0x7f263403a530 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:30.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.295+0000 7f264d796700 1 --2- 192.168.123.103:0/151452148 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f26480ff960 0x7f2648196ef0 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:30.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.295+0000 7f264d796700 1 -- 192.168.123.103:0/151452148 >> 192.168.123.103:0/151452148 conn(0x7f26480faf00 msgr2=0x7f26480690c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:30.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.296+0000 7f264d796700 1 -- 192.168.123.103:0/151452148 shutdown_connections 2026-03-08T23:59:30.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:30.296+0000 7f264d796700 1 -- 192.168.123.103:0/151452148 wait complete. 
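The mgr_command/mgr_command_reply pair above (tid 0, return 0) registers client.admin with the orchestrator's keyring distribution: placement '*' with mode 0755 makes cephadm maintain a world-readable admin keyring on every managed host, so teuthology never has to copy it again. The call and its verification:

    # have cephadm distribute client.admin to all hosts, mode 0755
    ceph orch client-keyring set client.admin '*' --mode 0755
    # list keyrings under orchestrator management
    ceph orch client-keyring ls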
2026-03-08T23:59:30.359 INFO:tasks.cephadm:Writing (initial) conf and keyring to vm06 2026-03-08T23:59:30.359 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-08T23:59:30.359 DEBUG:teuthology.orchestra.run.vm06:> dd of=/etc/ceph/ceph.conf 2026-03-08T23:59:30.377 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-08T23:59:30.377 DEBUG:teuthology.orchestra.run.vm06:> dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-08T23:59:30.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:30 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:30.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:30 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:30.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:30 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-08T23:59:30.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:30 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:30.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:30 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-08T23:59:30.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:30 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-08T23:59:30.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:30 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-08T23:59:30.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:30 vm03 ceph-mon[52346]: Deploying daemon ceph-exporter.vm03 on vm03 2026-03-08T23:59:30.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:30 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/2698287179' entity='client.admin' 2026-03-08T23:59:30.402 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:30 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:30.435 INFO:tasks.cephadm:Adding host vm06 to orchestrator... 
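While vm06 receives its initial conf and keyring, the journal shows the mgr minting credentials for the ceph-exporter and crash daemons it is deploying, and generating the minimal conf it writes to managed hosts. Both operations can be reproduced by hand; the crash entity below is hypothetical, mirroring the caps in the journal entry above:

    # emit the minimal ceph.conf cephadm distributes to managed hosts
    ceph config generate-minimal-conf
    # mint (or fetch) crash-daemon credentials the way the mgr does above
    ceph auth get-or-create client.crash.vm06 mon 'profile crash' mgr 'profile crash'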
2026-03-08T23:59:30.436 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph orch host add vm06 2026-03-08T23:59:30.787 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-08T23:59:31.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.095+0000 7f9907fff700 1 -- 192.168.123.103:0/1277076970 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99000a4a10 msgr2=0x7f99000a4e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:31.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.095+0000 7f9907fff700 1 --2- 192.168.123.103:0/1277076970 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99000a4a10 0x7f99000a4e30 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f98f8009b00 tx=0x7f98f8009e10 comp rx=0 tx=0).stop 2026-03-08T23:59:31.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.096+0000 7f9907fff700 1 -- 192.168.123.103:0/1277076970 shutdown_connections 2026-03-08T23:59:31.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.096+0000 7f9907fff700 1 --2- 192.168.123.103:0/1277076970 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99000a4a10 0x7f99000a4e30 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:31.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.096+0000 7f9907fff700 1 -- 192.168.123.103:0/1277076970 >> 192.168.123.103:0/1277076970 conn(0x7f990009fed0 msgr2=0x7f99000a2330 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:31.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.096+0000 7f9907fff700 1 -- 192.168.123.103:0/1277076970 shutdown_connections 2026-03-08T23:59:31.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.096+0000 7f9907fff700 1 -- 192.168.123.103:0/1277076970 wait complete. 
2026-03-08T23:59:31.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.097+0000 7f9907fff700 1 Processor -- start 2026-03-08T23:59:31.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.097+0000 7f9907fff700 1 -- start start 2026-03-08T23:59:31.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.097+0000 7f9907fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99001422f0 0x7f9900142710 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:31.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.097+0000 7f9907fff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f98f8012070 con 0x7f99001422f0 2026-03-08T23:59:31.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.098+0000 7f9906ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99001422f0 0x7f9900142710 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:31.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.098+0000 7f9906ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99001422f0 0x7f9900142710 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41154/0 (socket says 192.168.123.103:41154) 2026-03-08T23:59:31.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.098+0000 7f9906ffd700 1 -- 192.168.123.103:0/3927003622 learned_addr learned my addr 192.168.123.103:0/3927003622 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:31.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.098+0000 7f9906ffd700 1 -- 192.168.123.103:0/3927003622 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f98f80097e0 con 0x7f99001422f0 2026-03-08T23:59:31.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.099+0000 7f9906ffd700 1 --2- 192.168.123.103:0/3927003622 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99001422f0 0x7f9900142710 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f98f8006010 tx=0x7f98f800bba0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:31.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.099+0000 7f98effff700 1 -- 192.168.123.103:0/3927003622 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f98f801c070 con 0x7f99001422f0 2026-03-08T23:59:31.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.099+0000 7f98effff700 1 -- 192.168.123.103:0/3927003622 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f98f8003d70 con 0x7f99001422f0 2026-03-08T23:59:31.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.099+0000 7f9907fff700 1 -- 192.168.123.103:0/3927003622 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9900142c50 con 0x7f99001422f0 2026-03-08T23:59:31.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.099+0000 7f9907fff700 1 -- 192.168.123.103:0/3927003622 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f99001458d0 con 0x7f99001422f0 
2026-03-08T23:59:31.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.099+0000 7f98effff700 1 -- 192.168.123.103:0/3927003622 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f98f8017440 con 0x7f99001422f0 2026-03-08T23:59:31.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.101+0000 7f9907fff700 1 -- 192.168.123.103:0/3927003622 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9900004b50 con 0x7f99001422f0 2026-03-08T23:59:31.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.102+0000 7f98effff700 1 -- 192.168.123.103:0/3927003622 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 12) v1 ==== 45291+0+0 (secure 0 0 0) 0x7f98f800f460 con 0x7f99001422f0 2026-03-08T23:59:31.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.102+0000 7f98effff700 1 --2- 192.168.123.103:0/3927003622 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f98f00380d0 0x7f98f003a590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:31.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.102+0000 7f98effff700 1 -- 192.168.123.103:0/3927003622 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f98f804bf60 con 0x7f99001422f0 2026-03-08T23:59:31.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.104+0000 7f99067fc700 1 --2- 192.168.123.103:0/3927003622 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f98f00380d0 0x7f98f003a590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:31.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.105+0000 7f98effff700 1 -- 192.168.123.103:0/3927003622 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f98f801fb20 con 0x7f99001422f0 2026-03-08T23:59:31.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.113+0000 7f99067fc700 1 --2- 192.168.123.103:0/3927003622 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f98f00380d0 0x7f98f003a590 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f990804f8e0 tx=0x7f990804f080 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:31.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:31.225+0000 7f9907fff700 1 -- 192.168.123.103:0/3927003622 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm06", "target": ["mon-mgr", ""]}) v1 -- 0x7f990013ab70 con 0x7f98f00380d0 2026-03-08T23:59:31.507 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:31 vm03 ceph-mon[52346]: from='client.14188 -' entity='client.admin' cmd=[{"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}]: dispatch 2026-03-08T23:59:31.507 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:31 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:31.507 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:31 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:31.507 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:31 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:31.507 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:31 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:31.507 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:31 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-08T23:59:31.507 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:31 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished 2026-03-08T23:59:31.507 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:31 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-08T23:59:32.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:32.315+0000 7f98effff700 1 -- 192.168.123.103:0/3927003622 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f98f8017ac0 con 0x7f99001422f0 2026-03-08T23:59:32.481 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:32 vm03 ceph-mon[52346]: Deploying daemon crash.vm03 on vm03 2026-03-08T23:59:32.481 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:32 vm03 ceph-mon[52346]: from='client.14190 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm06", "target": ["mon-mgr", ""]}]: dispatch 2026-03-08T23:59:32.481 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:32 vm03 ceph-mon[52346]: Deploying cephadm binary to vm06 2026-03-08T23:59:32.481 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:32 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:32.481 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:32 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:32.481 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:32 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:32.481 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:32 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:33.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.158+0000 7f98effff700 1 -- 192.168.123.103:0/3927003622 <== mgr.14164 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7f990013ab70 con 0x7f98f00380d0 2026-03-08T23:59:33.160 INFO:teuthology.orchestra.run.vm03.stdout:Added host 'vm06' with addr '192.168.123.106' 2026-03-08T23:59:33.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.161+0000 7f9907fff700 1 -- 192.168.123.103:0/3927003622 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f98f00380d0 msgr2=0x7f98f003a590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:33.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.161+0000 7f9907fff700 1 --2- 192.168.123.103:0/3927003622 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f98f00380d0 0x7f98f003a590 secure :-1 s=READY pgs=14 cs=0 l=1 
rev1=1 crypto rx=0x7f990804f8e0 tx=0x7f990804f080 comp rx=0 tx=0).stop 2026-03-08T23:59:33.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.161+0000 7f9907fff700 1 -- 192.168.123.103:0/3927003622 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99001422f0 msgr2=0x7f9900142710 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:33.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.161+0000 7f9907fff700 1 --2- 192.168.123.103:0/3927003622 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99001422f0 0x7f9900142710 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f98f8006010 tx=0x7f98f800bba0 comp rx=0 tx=0).stop 2026-03-08T23:59:33.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.161+0000 7f9907fff700 1 -- 192.168.123.103:0/3927003622 shutdown_connections 2026-03-08T23:59:33.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.161+0000 7f9907fff700 1 --2- 192.168.123.103:0/3927003622 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f98f00380d0 0x7f98f003a590 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:33.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.161+0000 7f9907fff700 1 --2- 192.168.123.103:0/3927003622 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f99001422f0 0x7f9900142710 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:33.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.161+0000 7f9907fff700 1 -- 192.168.123.103:0/3927003622 >> 192.168.123.103:0/3927003622 conn(0x7f990009fed0 msgr2=0x7f99000a2330 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:33.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.162+0000 7f9907fff700 1 -- 192.168.123.103:0/3927003622 shutdown_connections 2026-03-08T23:59:33.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.162+0000 7f9907fff700 1 -- 192.168.123.103:0/3927003622 wait complete. 
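
The session that just tore itself down is one complete "ceph orch host add vm06" round trip: the CLI bootstraps a throwaway messenger, learns the mgrmap from mon.0, forwards the command to the active mgr (note the "target": ["mon-mgr", ""] in the mgr_command above), and calls mark_down on every connection once the reply lands. A sketch of the same round trip through the librados Python binding, assuming that binding is installed and that mgr_command is the mgr-targeted call it exposes (a sketch, not teuthology's own code path):

    import json
    import rados

    # Connect the same way the cephadm shell does: cluster conf plus the
    # client.admin keyring.
    with rados.Rados(conffile="/etc/ceph/ceph.conf",
                     conf={"keyring": "/etc/ceph/ceph.client.admin.keyring"}) as cluster:
        # Equivalent of the mgr_command {"prefix": "orch host add", ...} in
        # the trace; the reply text carries "Added host 'vm06' ...".
        cmd = json.dumps({"prefix": "orch host add", "hostname": "vm06"})
        ret, outbuf, outs = cluster.mgr_command(cmd, b"")
        print(ret, outbuf.decode(), outs)
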
2026-03-08T23:59:33.208 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph orch host ls --format=json 2026-03-08T23:59:33.353 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-08T23:59:33.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:33 vm03 ceph-mon[52346]: Deploying daemon node-exporter.vm03 on vm03 2026-03-08T23:59:33.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:33 vm03 ceph-mon[52346]: mgrmap e13: vm03.yvcons(active, since 6s) 2026-03-08T23:59:33.409 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:33 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:33.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.601+0000 7f8ae4410700 1 -- 192.168.123.103:0/658241739 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8adc102cb0 msgr2=0x7f8adc1030d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:33.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.601+0000 7f8ae4410700 1 --2- 192.168.123.103:0/658241739 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8adc102cb0 0x7f8adc1030d0 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f8acc009b50 tx=0x7f8acc009e60 comp rx=0 tx=0).stop 2026-03-08T23:59:33.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.602+0000 7f8ae4410700 1 -- 192.168.123.103:0/658241739 shutdown_connections 2026-03-08T23:59:33.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.602+0000 7f8ae4410700 1 --2- 192.168.123.103:0/658241739 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8adc102cb0 0x7f8adc1030d0 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:33.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.602+0000 7f8ae4410700 1 -- 192.168.123.103:0/658241739 >> 192.168.123.103:0/658241739 conn(0x7f8adc0fe250 msgr2=0x7f8adc100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:33.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.602+0000 7f8ae4410700 1 -- 192.168.123.103:0/658241739 shutdown_connections 2026-03-08T23:59:33.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.602+0000 7f8ae4410700 1 -- 192.168.123.103:0/658241739 wait complete. 
2026-03-08T23:59:33.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.602+0000 7f8ae4410700 1 Processor -- start 2026-03-08T23:59:33.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.603+0000 7f8ae4410700 1 -- start start 2026-03-08T23:59:33.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.603+0000 7f8ae4410700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8adc102cb0 0x7f8adc197eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:33.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.603+0000 7f8ae4410700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8adc1983f0 con 0x7f8adc102cb0 2026-03-08T23:59:33.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.603+0000 7f8ae21ac700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8adc102cb0 0x7f8adc197eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:33.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.603+0000 7f8ae21ac700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8adc102cb0 0x7f8adc197eb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41176/0 (socket says 192.168.123.103:41176) 2026-03-08T23:59:33.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.603+0000 7f8ae21ac700 1 -- 192.168.123.103:0/4243368681 learned_addr learned my addr 192.168.123.103:0/4243368681 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:33.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.603+0000 7f8ae21ac700 1 -- 192.168.123.103:0/4243368681 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8acc0097e0 con 0x7f8adc102cb0 2026-03-08T23:59:33.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.604+0000 7f8ae21ac700 1 --2- 192.168.123.103:0/4243368681 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8adc102cb0 0x7f8adc197eb0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f8acc004f70 tx=0x7f8acc0050d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:33.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.604+0000 7f8ad37fe700 1 -- 192.168.123.103:0/4243368681 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8acc01c070 con 0x7f8adc102cb0 2026-03-08T23:59:33.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.604+0000 7f8ad37fe700 1 -- 192.168.123.103:0/4243368681 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8acc021470 con 0x7f8adc102cb0 2026-03-08T23:59:33.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.604+0000 7f8ad37fe700 1 -- 192.168.123.103:0/4243368681 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8acc00f460 con 0x7f8adc102cb0 2026-03-08T23:59:33.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.604+0000 7f8ae4410700 1 -- 192.168.123.103:0/4243368681 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8adc1985f0 con 0x7f8adc102cb0 
2026-03-08T23:59:33.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.604+0000 7f8ae4410700 1 -- 192.168.123.103:0/4243368681 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8adc198a90 con 0x7f8adc102cb0 2026-03-08T23:59:33.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.605+0000 7f8ad37fe700 1 -- 192.168.123.103:0/4243368681 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f8acc0215e0 con 0x7f8adc102cb0 2026-03-08T23:59:33.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.605+0000 7f8ad37fe700 1 --2- 192.168.123.103:0/4243368681 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8ac80383f0 0x7f8ac803a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:33.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.605+0000 7f8ad37fe700 1 -- 192.168.123.103:0/4243368681 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f8acc04c2f0 con 0x7f8adc102cb0 2026-03-08T23:59:33.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.605+0000 7f8ae19ab700 1 --2- 192.168.123.103:0/4243368681 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8ac80383f0 0x7f8ac803a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:33.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.605+0000 7f8ae4410700 1 -- 192.168.123.103:0/4243368681 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8adc191a40 con 0x7f8adc102cb0 2026-03-08T23:59:33.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.606+0000 7f8ae19ab700 1 --2- 192.168.123.103:0/4243368681 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8ac80383f0 0x7f8ac803a8b0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f8ad8006fd0 tx=0x7f8ad8006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:33.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.608+0000 7f8ad37fe700 1 -- 192.168.123.103:0/4243368681 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f8acc026070 con 0x7f8adc102cb0 2026-03-08T23:59:33.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.715+0000 7f8ae4410700 1 -- 192.168.123.103:0/4243368681 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7f8adc0611d0 con 0x7f8ac80383f0 2026-03-08T23:59:33.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.718+0000 7f8ad37fe700 1 -- 192.168.123.103:0/4243368681 <== mgr.14164 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+155 (secure 0 0 0) 0x7f8adc0611d0 con 0x7f8ac80383f0 2026-03-08T23:59:33.719 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T23:59:33.719 INFO:teuthology.orchestra.run.vm03.stdout:[{"addr": "192.168.123.103", "hostname": "vm03", "labels": [], "status": ""}, {"addr": "192.168.123.106", "hostname": "vm06", "labels": [], "status": ""}] 2026-03-08T23:59:33.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.721+0000 7f8ae4410700 1 -- 
192.168.123.103:0/4243368681 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8ac80383f0 msgr2=0x7f8ac803a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:33.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.721+0000 7f8ae4410700 1 --2- 192.168.123.103:0/4243368681 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8ac80383f0 0x7f8ac803a8b0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f8ad8006fd0 tx=0x7f8ad8006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:33.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.721+0000 7f8ae4410700 1 -- 192.168.123.103:0/4243368681 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8adc102cb0 msgr2=0x7f8adc197eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:33.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.721+0000 7f8ae4410700 1 --2- 192.168.123.103:0/4243368681 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8adc102cb0 0x7f8adc197eb0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f8acc004f70 tx=0x7f8acc0050d0 comp rx=0 tx=0).stop 2026-03-08T23:59:33.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.721+0000 7f8ae4410700 1 -- 192.168.123.103:0/4243368681 shutdown_connections 2026-03-08T23:59:33.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.721+0000 7f8ae4410700 1 --2- 192.168.123.103:0/4243368681 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8ac80383f0 0x7f8ac803a8b0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:33.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.721+0000 7f8ae4410700 1 --2- 192.168.123.103:0/4243368681 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8adc102cb0 0x7f8adc197eb0 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:33.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.721+0000 7f8ae4410700 1 -- 192.168.123.103:0/4243368681 >> 192.168.123.103:0/4243368681 conn(0x7f8adc0fe250 msgr2=0x7f8adc0fef10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:33.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.721+0000 7f8ae4410700 1 -- 192.168.123.103:0/4243368681 shutdown_connections 2026-03-08T23:59:33.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:33.722+0000 7f8ae4410700 1 -- 192.168.123.103:0/4243368681 wait complete. 
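
Before moving on, the task shells out for "ceph orch host ls --format=json" and gets back the two-entry array printed above. A sketch of the kind of check that output supports, using the JSON exactly as logged (the assertion is illustrative; teuthology's own verification lives in tasks.cephadm and is not reproduced here):

    import json

    # stdout captured from: ceph orch host ls --format=json
    out = ('[{"addr": "192.168.123.103", "hostname": "vm03", "labels": [], "status": ""},'
           ' {"addr": "192.168.123.106", "hostname": "vm06", "labels": [], "status": ""}]')

    hosts = {h["hostname"]: h["addr"] for h in json.loads(out)}
    # Both test nodes must be registered before the mon placement is applied.
    assert hosts == {"vm03": "192.168.123.103", "vm06": "192.168.123.106"}
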
2026-03-08T23:59:33.780 INFO:tasks.cephadm:Setting crush tunables to default 2026-03-08T23:59:33.780 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd crush tunables default 2026-03-08T23:59:33.928 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-08T23:59:34.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.170+0000 7f9af8d86700 1 -- 192.168.123.103:0/1465235526 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9af4102ca0 msgr2=0x7f9af41030c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:34.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.170+0000 7f9af8d86700 1 --2- 192.168.123.103:0/1465235526 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9af4102ca0 0x7f9af41030c0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f9adc009b50 tx=0x7f9adc009e60 comp rx=0 tx=0).stop 2026-03-08T23:59:34.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.170+0000 7f9af8d86700 1 -- 192.168.123.103:0/1465235526 shutdown_connections 2026-03-08T23:59:34.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.170+0000 7f9af8d86700 1 --2- 192.168.123.103:0/1465235526 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9af4102ca0 0x7f9af41030c0 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:34.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.170+0000 7f9af8d86700 1 -- 192.168.123.103:0/1465235526 >> 192.168.123.103:0/1465235526 conn(0x7f9af40fe220 msgr2=0x7f9af4100680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:34.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.171+0000 7f9af8d86700 1 -- 192.168.123.103:0/1465235526 shutdown_connections 2026-03-08T23:59:34.171 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.171+0000 7f9af8d86700 1 -- 192.168.123.103:0/1465235526 wait complete. 
2026-03-08T23:59:34.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.171+0000 7f9af8d86700 1 Processor -- start 2026-03-08T23:59:34.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.172+0000 7f9af8d86700 1 -- start start 2026-03-08T23:59:34.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.172+0000 7f9af8d86700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9af4198010 0x7f9af4198430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:34.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.172+0000 7f9af8d86700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9af4198970 con 0x7f9af4198010 2026-03-08T23:59:34.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.172+0000 7f9af259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9af4198010 0x7f9af4198430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:34.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.172+0000 7f9af259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9af4198010 0x7f9af4198430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41192/0 (socket says 192.168.123.103:41192) 2026-03-08T23:59:34.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.172+0000 7f9af259c700 1 -- 192.168.123.103:0/319080045 learned_addr learned my addr 192.168.123.103:0/319080045 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-08T23:59:34.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.173+0000 7f9af259c700 1 -- 192.168.123.103:0/319080045 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9adc0097e0 con 0x7f9af4198010 2026-03-08T23:59:34.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.173+0000 7f9af259c700 1 --2- 192.168.123.103:0/319080045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9af4198010 0x7f9af4198430 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f9adc000c00 tx=0x7f9adc0050d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:34.174 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.173+0000 7f9aeb7fe700 1 -- 192.168.123.103:0/319080045 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9adc01c070 con 0x7f9af4198010 2026-03-08T23:59:34.174 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.173+0000 7f9aeb7fe700 1 -- 192.168.123.103:0/319080045 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9adc0056d0 con 0x7f9af4198010 2026-03-08T23:59:34.174 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.173+0000 7f9aeb7fe700 1 -- 192.168.123.103:0/319080045 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9adc021e60 con 0x7f9af4198010 2026-03-08T23:59:34.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.173+0000 7f9af8d86700 1 -- 192.168.123.103:0/319080045 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9af4198b70 con 0x7f9af4198010 2026-03-08T23:59:34.175 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.173+0000 7f9af8d86700 1 -- 192.168.123.103:0/319080045 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9af4075420 con 0x7f9af4198010 2026-03-08T23:59:34.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.175+0000 7f9aeb7fe700 1 -- 192.168.123.103:0/319080045 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f9adc021800 con 0x7f9af4198010 2026-03-08T23:59:34.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.175+0000 7f9af8d86700 1 -- 192.168.123.103:0/319080045 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9af4191b00 con 0x7f9af4198010 2026-03-08T23:59:34.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.175+0000 7f9aeb7fe700 1 --2- 192.168.123.103:0/319080045 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9ae0038440 0x7f9ae003a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:34.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.175+0000 7f9aeb7fe700 1 -- 192.168.123.103:0/319080045 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f9adc04c090 con 0x7f9af4198010 2026-03-08T23:59:34.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.177+0000 7f9af1d9b700 1 --2- 192.168.123.103:0/319080045 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9ae0038440 0x7f9ae003a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:34.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.178+0000 7f9af1d9b700 1 --2- 192.168.123.103:0/319080045 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9ae0038440 0x7f9ae003a900 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f9ae4006fd0 tx=0x7f9ae4006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:34.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.178+0000 7f9aeb7fe700 1 -- 192.168.123.103:0/319080045 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f9adc029360 con 0x7f9af4198010 2026-03-08T23:59:34.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.295+0000 7f9af8d86700 1 -- 192.168.123.103:0/319080045 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd crush tunables", "profile": "default"} v 0) v1 -- 0x7f9af404fa20 con 0x7f9af4198010 2026-03-08T23:59:34.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.318+0000 7f9aeb7fe700 1 -- 192.168.123.103:0/319080045 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd crush tunables", "profile": "default"}]=0 adjusted tunables profile to default v4) v1 ==== 124+0+0 (secure 0 0 0) 0x7f9adc026070 con 0x7f9af4198010 2026-03-08T23:59:34.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.323+0000 7f9af8d86700 1 -- 192.168.123.103:0/319080045 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9ae0038440 msgr2=0x7f9ae003a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:34.324 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.323+0000 7f9af8d86700 1 --2- 192.168.123.103:0/319080045 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9ae0038440 0x7f9ae003a900 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f9ae4006fd0 tx=0x7f9ae4006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:34.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.323+0000 7f9af8d86700 1 -- 192.168.123.103:0/319080045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9af4198010 msgr2=0x7f9af4198430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:34.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.323+0000 7f9af8d86700 1 --2- 192.168.123.103:0/319080045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9af4198010 0x7f9af4198430 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f9adc000c00 tx=0x7f9adc0050d0 comp rx=0 tx=0).stop 2026-03-08T23:59:34.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.323+0000 7f9af8d86700 1 -- 192.168.123.103:0/319080045 shutdown_connections 2026-03-08T23:59:34.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.323+0000 7f9af8d86700 1 --2- 192.168.123.103:0/319080045 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9ae0038440 0x7f9ae003a900 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:34.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.323+0000 7f9af8d86700 1 --2- 192.168.123.103:0/319080045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9af4198010 0x7f9af4198430 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:34.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.323+0000 7f9af8d86700 1 -- 192.168.123.103:0/319080045 >> 192.168.123.103:0/319080045 conn(0x7f9af40fe220 msgr2=0x7f9af40fef00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:34.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.323+0000 7f9af8d86700 1 -- 192.168.123.103:0/319080045 shutdown_connections 2026-03-08T23:59:34.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-08T23:59:34.324+0000 7f9af8d86700 1 -- 192.168.123.103:0/319080045 wait complete. 
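
Unlike the orch calls, "ceph osd crush tunables default" is a plain mon command: the trace shows it going straight to mon.0 as mon_command({"prefix": "osd crush tunables", "profile": "default"}) and being acked with "adjusted tunables profile to default". The same JSON body works over librados; a minimal sketch, assuming the same conf and keyring paths as above:

    import json
    import rados

    with rados.Rados(conffile="/etc/ceph/ceph.conf",
                     conf={"keyring": "/etc/ceph/ceph.client.admin.keyring"}) as cluster:
        # Same payload as the mon_command(...) in the trace above.
        cmd = json.dumps({"prefix": "osd crush tunables", "profile": "default"})
        ret, outbuf, outs = cluster.mon_command(cmd, b"")
        # outs carries the human-readable ack:
        # "adjusted tunables profile to default"
        print(ret, outs)
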
2026-03-08T23:59:34.324 INFO:teuthology.orchestra.run.vm03.stderr:adjusted tunables profile to default 2026-03-08T23:59:34.382 INFO:tasks.cephadm:Adding mon.vm03 on vm03 2026-03-08T23:59:34.382 INFO:tasks.cephadm:Adding mon.vm06 on vm06 2026-03-08T23:59:34.382 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph orch apply mon '2;vm03:192.168.123.103=vm03;vm06:192.168.123.106=vm06' 2026-03-08T23:59:34.525 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:34.561 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:34.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:34 vm03 ceph-mon[52346]: Added host vm06 2026-03-08T23:59:34.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:34 vm03 ceph-mon[52346]: from='client.14193 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-08T23:59:34.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:34 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/319080045' entity='client.admin' cmd=[{"prefix": "osd crush tunables", "profile": "default"}]: dispatch 2026-03-08T23:59:35.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:35 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/319080045' entity='client.admin' cmd='[{"prefix": "osd crush tunables", "profile": "default"}]': finished 2026-03-08T23:59:35.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:35 vm03 ceph-mon[52346]: osdmap e4: 0 total, 0 up, 0 in 2026-03-08T23:59:35.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:35 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:35.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:35 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:35.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:35 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:35.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:35 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:35.665 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.665+0000 7f01d750b700 1 -- 192.168.123.106:0/1784454567 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f01d0100a80 msgr2=0x7f01d0100ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:35.665 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.665+0000 7f01d750b700 1 --2- 192.168.123.106:0/1784454567 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f01d0100a80 0x7f01d0100ea0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f01c4009b00 tx=0x7f01c4009e10 comp rx=0 tx=0).stop 2026-03-08T23:59:35.665 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.666+0000 7f01d750b700 1 -- 192.168.123.106:0/1784454567 shutdown_connections 2026-03-08T23:59:35.665 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.666+0000 7f01d750b700 1 --2- 192.168.123.106:0/1784454567 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f01d0100a80 0x7f01d0100ea0 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:35.665 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.666+0000 7f01d750b700 1 -- 192.168.123.106:0/1784454567 >> 192.168.123.106:0/1784454567 conn(0x7f01d00fc020 msgr2=0x7f01d00fe460 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:35.665 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.666+0000 7f01d750b700 1 -- 192.168.123.106:0/1784454567 shutdown_connections 2026-03-08T23:59:35.665 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.666+0000 7f01d750b700 1 -- 192.168.123.106:0/1784454567 wait complete. 2026-03-08T23:59:35.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.667+0000 7f01d750b700 1 Processor -- start 2026-03-08T23:59:35.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.667+0000 7f01d750b700 1 -- start start 2026-03-08T23:59:35.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.667+0000 7f01d750b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f01d0100a80 0x7f01d0074b30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:35.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.667+0000 7f01d750b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f01d0075070 con 0x7f01d0100a80 2026-03-08T23:59:35.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.667+0000 7f01d52a7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f01d0100a80 0x7f01d0074b30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:35.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.667+0000 7f01d52a7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f01d0100a80 0x7f01d0074b30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:43800/0 (socket says 192.168.123.106:43800) 2026-03-08T23:59:35.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.667+0000 7f01d52a7700 1 -- 192.168.123.106:0/3808814668 learned_addr learned my addr 192.168.123.106:0/3808814668 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-08T23:59:35.666 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.667+0000 7f01d52a7700 1 -- 192.168.123.106:0/3808814668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f01c40097e0 con 0x7f01d0100a80 2026-03-08T23:59:35.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.668+0000 7f01d52a7700 1 --2- 192.168.123.106:0/3808814668 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f01d0100a80 0x7f01d0074b30 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f01c4009fd0 tx=0x7f01c4005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:35.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.668+0000 7f01c27fc700 1 -- 192.168.123.106:0/3808814668 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f01c401d070 con 0x7f01d0100a80 2026-03-08T23:59:35.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.668+0000 7f01d750b700 1 -- 192.168.123.106:0/3808814668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f01d0073180 con 0x7f01d0100a80 
2026-03-08T23:59:35.667 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.668+0000 7f01d750b700 1 -- 192.168.123.106:0/3808814668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f01d0073620 con 0x7f01d0100a80 2026-03-08T23:59:35.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.668+0000 7f01c27fc700 1 -- 192.168.123.106:0/3808814668 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f01c4022470 con 0x7f01d0100a80 2026-03-08T23:59:35.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.668+0000 7f01c27fc700 1 -- 192.168.123.106:0/3808814668 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f01c400f460 con 0x7f01d0100a80 2026-03-08T23:59:35.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.669+0000 7f01c27fc700 1 -- 192.168.123.106:0/3808814668 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f01c400f610 con 0x7f01d0100a80 2026-03-08T23:59:35.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.669+0000 7f01c27fc700 1 --2- 192.168.123.106:0/3808814668 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f01bc038480 0x7f01bc03a940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:35.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.669+0000 7f01c27fc700 1 -- 192.168.123.106:0/3808814668 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f01c404d4e0 con 0x7f01d0100a80 2026-03-08T23:59:35.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.669+0000 7f01d750b700 1 -- 192.168.123.106:0/3808814668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f01b4005320 con 0x7f01d0100a80 2026-03-08T23:59:35.668 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.669+0000 7f01d4aa6700 1 --2- 192.168.123.106:0/3808814668 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f01bc038480 0x7f01bc03a940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:35.669 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.670+0000 7f01d4aa6700 1 --2- 192.168.123.106:0/3808814668 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f01bc038480 0x7f01bc03a940 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f01cc006fd0 tx=0x7f01cc006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:35.672 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.673+0000 7f01c27fc700 1 -- 192.168.123.106:0/3808814668 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f01c4027070 con 0x7f01d0100a80 2026-03-08T23:59:35.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.788+0000 7f01d750b700 1 -- 192.168.123.106:0/3808814668 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "placement": "2;vm03:192.168.123.103=vm03;vm06:192.168.123.106=vm06", "target": ["mon-mgr", ""]}) v1 -- 0x7f01b4000c90 con 0x7f01bc038480 2026-03-08T23:59:35.792 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.793+0000 7f01c27fc700 1 -- 192.168.123.106:0/3808814668 <== mgr.14164 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7f01b4000c90 con 0x7f01bc038480 2026-03-08T23:59:35.792 INFO:teuthology.orchestra.run.vm06.stdout:Scheduled mon update... 2026-03-08T23:59:35.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.795+0000 7f01d750b700 1 -- 192.168.123.106:0/3808814668 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f01bc038480 msgr2=0x7f01bc03a940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:35.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.795+0000 7f01d750b700 1 --2- 192.168.123.106:0/3808814668 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f01bc038480 0x7f01bc03a940 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f01cc006fd0 tx=0x7f01cc006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:35.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.795+0000 7f01d750b700 1 -- 192.168.123.106:0/3808814668 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f01d0100a80 msgr2=0x7f01d0074b30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:35.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.795+0000 7f01d750b700 1 --2- 192.168.123.106:0/3808814668 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f01d0100a80 0x7f01d0074b30 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f01c4009fd0 tx=0x7f01c4005e70 comp rx=0 tx=0).stop 2026-03-08T23:59:35.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.795+0000 7f01d750b700 1 -- 192.168.123.106:0/3808814668 shutdown_connections 2026-03-08T23:59:35.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.795+0000 7f01d750b700 1 --2- 192.168.123.106:0/3808814668 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f01bc038480 0x7f01bc03a940 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:35.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.795+0000 7f01d750b700 1 --2- 192.168.123.106:0/3808814668 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f01d0100a80 0x7f01d0074b30 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:35.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.795+0000 7f01d750b700 1 -- 192.168.123.106:0/3808814668 >> 192.168.123.106:0/3808814668 conn(0x7f01d00fc020 msgr2=0x7f01d00fcce0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:35.794 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.795+0000 7f01d750b700 1 -- 192.168.123.106:0/3808814668 shutdown_connections 2026-03-08T23:59:35.795 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:35.796+0000 7f01d750b700 1 -- 192.168.123.106:0/3808814668 wait complete. 2026-03-08T23:59:35.854 DEBUG:teuthology.orchestra.run.vm06:mon.vm06> sudo journalctl -f -n 0 -u ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mon.vm06.service 2026-03-08T23:59:35.855 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
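
The placement argument '2;vm03:192.168.123.103=vm03;vm06:192.168.123.106=vm06' that was just scheduled packs a daemon count plus explicit host pinning into one string: a leading count, then semicolon-separated host:addr=name entries. A sketch of that grammar, with parse_placement as a hypothetical helper name (the authoritative parser is cephadm's placement-spec handling, not this function):

    def parse_placement(spec: str):
        """Split an orch apply placement like
        '2;vm03:192.168.123.103=vm03;vm06:192.168.123.106=vm06'
        into a count plus (host, addr, name) entries."""
        count_part, *host_parts = spec.split(";")
        entries = []
        for part in host_parts:
            host, rest = part.split(":", 1)
            addr, name = rest.split("=", 1)
            entries.append((host, addr, name))
        return int(count_part), entries

    count, mons = parse_placement(
        "2;vm03:192.168.123.103=vm03;vm06:192.168.123.106=vm06")
    assert count == 2 and mons[1] == ("vm06", "192.168.123.106", "vm06")
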
2026-03-08T23:59:35.855 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json 2026-03-08T23:59:36.031 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:36.067 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:36.320 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.320+0000 7f9379cf8700 1 -- 192.168.123.106:0/410442355 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9374074dc0 msgr2=0x7f9374073220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:36.320 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.320+0000 7f9379cf8700 1 --2- 192.168.123.106:0/410442355 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9374074dc0 0x7f9374073220 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f935c009b00 tx=0x7f935c009e10 comp rx=0 tx=0).stop 2026-03-08T23:59:36.320 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.321+0000 7f9379cf8700 1 -- 192.168.123.106:0/410442355 shutdown_connections 2026-03-08T23:59:36.320 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.321+0000 7f9379cf8700 1 --2- 192.168.123.106:0/410442355 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9374074dc0 0x7f9374073220 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:36.320 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.321+0000 7f9379cf8700 1 -- 192.168.123.106:0/410442355 >> 192.168.123.106:0/410442355 conn(0x7f93740fc030 msgr2=0x7f93740fe450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:36.320 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.321+0000 7f9379cf8700 1 -- 192.168.123.106:0/410442355 shutdown_connections 2026-03-08T23:59:36.321 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.321+0000 7f9379cf8700 1 -- 192.168.123.106:0/410442355 wait complete. 
2026-03-08T23:59:36.321 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.322+0000 7f9379cf8700 1 Processor -- start 2026-03-08T23:59:36.321 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.322+0000 7f9379cf8700 1 -- start start 2026-03-08T23:59:36.321 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.322+0000 7f9379cf8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9374074dc0 0x7f937419c150 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:36.321 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.322+0000 7f9379cf8700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f935c012070 con 0x7f9374074dc0 2026-03-08T23:59:36.321 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.322+0000 7f93737fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9374074dc0 0x7f937419c150 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:36.321 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.322+0000 7f93737fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9374074dc0 0x7f937419c150 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:43818/0 (socket says 192.168.123.106:43818) 2026-03-08T23:59:36.321 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.322+0000 7f93737fe700 1 -- 192.168.123.106:0/673856099 learned_addr learned my addr 192.168.123.106:0/673856099 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-08T23:59:36.322 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.323+0000 7f93737fe700 1 -- 192.168.123.106:0/673856099 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f935c0097e0 con 0x7f9374074dc0 2026-03-08T23:59:36.322 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.323+0000 7f93737fe700 1 --2- 192.168.123.106:0/673856099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9374074dc0 0x7f937419c150 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f935c000c00 tx=0x7f935c005650 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:36.322 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.323+0000 7f9370ff9700 1 -- 192.168.123.106:0/673856099 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f935c01d070 con 0x7f9374074dc0 2026-03-08T23:59:36.322 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.323+0000 7f9379cf8700 1 -- 192.168.123.106:0/673856099 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f937419c690 con 0x7f9374074dc0 2026-03-08T23:59:36.323 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.323+0000 7f9379cf8700 1 -- 192.168.123.106:0/673856099 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f937419cab0 con 0x7f9374074dc0 2026-03-08T23:59:36.323 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.324+0000 7f9370ff9700 1 -- 192.168.123.106:0/673856099 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f935c00bcb0 con 0x7f9374074dc0 2026-03-08T23:59:36.323 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.324+0000 7f9370ff9700 1 -- 192.168.123.106:0/673856099 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f935c00f460 con 0x7f9374074dc0 2026-03-08T23:59:36.324 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.324+0000 7f9370ff9700 1 -- 192.168.123.106:0/673856099 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f935c00f640 con 0x7f9374074dc0 2026-03-08T23:59:36.324 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.324+0000 7f9370ff9700 1 --2- 192.168.123.106:0/673856099 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f93600384d0 0x7f936003a990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:36.324 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.324+0000 7f9370ff9700 1 -- 192.168.123.106:0/673856099 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f935c04d530 con 0x7f9374074dc0 2026-03-08T23:59:36.324 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.324+0000 7f9372ffd700 1 --2- 192.168.123.106:0/673856099 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f93600384d0 0x7f936003a990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:36.324 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.324+0000 7f9379cf8700 1 -- 192.168.123.106:0/673856099 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f937404fa90 con 0x7f9374074dc0 2026-03-08T23:59:36.324 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.325+0000 7f9372ffd700 1 --2- 192.168.123.106:0/673856099 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f93600384d0 0x7f936003a990 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f9364006fd0 tx=0x7f9364006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:36.327 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.328+0000 7f9370ff9700 1 -- 192.168.123.106:0/673856099 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f935c026070 con 0x7f9374074dc0 2026-03-08T23:59:36.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.484+0000 7f9379cf8700 1 -- 192.168.123.106:0/673856099 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f9374195e10 con 0x7f9374074dc0 2026-03-08T23:59:36.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.485+0000 7f9370ff9700 1 -- 192.168.123.106:0/673856099 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f935c017440 con 0x7f9374074dc0 2026-03-08T23:59:36.484 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T23:59:36.484 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-08T23:59:36.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.487+0000 7f9379cf8700 1 -- 192.168.123.106:0/673856099 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f93600384d0 msgr2=0x7f936003a990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:36.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.487+0000 7f9379cf8700 1 --2- 192.168.123.106:0/673856099 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f93600384d0 0x7f936003a990 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f9364006fd0 tx=0x7f9364006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:36.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.487+0000 7f9379cf8700 1 -- 192.168.123.106:0/673856099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9374074dc0 msgr2=0x7f937419c150 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:36.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.487+0000 7f9379cf8700 1 --2- 192.168.123.106:0/673856099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9374074dc0 0x7f937419c150 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f935c000c00 tx=0x7f935c005650 comp rx=0 tx=0).stop 2026-03-08T23:59:36.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.487+0000 7f9379cf8700 1 -- 192.168.123.106:0/673856099 shutdown_connections 2026-03-08T23:59:36.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.487+0000 7f9379cf8700 1 --2- 192.168.123.106:0/673856099 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f93600384d0 0x7f936003a990 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:36.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.487+0000 7f9379cf8700 1 --2- 192.168.123.106:0/673856099 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9374074dc0 0x7f937419c150 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:36.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.487+0000 7f9379cf8700 1 -- 192.168.123.106:0/673856099 >> 192.168.123.106:0/673856099 conn(0x7f93740fc030 msgr2=0x7f93740fcca0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:36.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.487+0000 7f9379cf8700 1 -- 192.168.123.106:0/673856099 shutdown_connections 2026-03-08T23:59:36.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:36.487+0000 7f9379cf8700 1 -- 192.168.123.106:0/673856099 wait complete. 
2026-03-08T23:59:36.487 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-08T23:59:36.519 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:36 vm03 ceph-mon[52346]: Deploying daemon alertmanager.vm03 on vm03 2026-03-08T23:59:36.519 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:36 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:36.519 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:36 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:36.519 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:36 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/673856099' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-08T23:59:37.547 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-08T23:59:37.547 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json 2026-03-08T23:59:37.687 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:37.723 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:37.774 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:37 vm03 ceph-mon[52346]: from='client.14197 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "placement": "2;vm03:192.168.123.103=vm03;vm06:192.168.123.106=vm06", "target": ["mon-mgr", ""]}]: dispatch 2026-03-08T23:59:37.774 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:37 vm03 ceph-mon[52346]: Saving service mon spec with placement vm03:192.168.123.103=vm03;vm06:192.168.123.106=vm06;count:2 2026-03-08T23:59:37.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.971+0000 7ff57ea13700 1 -- 192.168.123.106:0/1825203718 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff578101590 msgr2=0x7ff578103980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:37.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.971+0000 7ff57ea13700 1 --2- 192.168.123.106:0/1825203718 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff578101590 0x7ff578103980 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7ff560009b00 tx=0x7ff560009e10 comp rx=0 tx=0).stop 2026-03-08T23:59:37.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.972+0000 7ff57ea13700 1 -- 192.168.123.106:0/1825203718 shutdown_connections 2026-03-08T23:59:37.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.972+0000 7ff57ea13700 1 --2- 192.168.123.106:0/1825203718 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff578101590 0x7ff578103980 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:37.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.972+0000 7ff57ea13700 1 -- 192.168.123.106:0/1825203718 >> 192.168.123.106:0/1825203718 conn(0x7ff5780faf00 msgr2=0x7ff5780fd340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:37.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.972+0000 7ff57ea13700 1 -- 192.168.123.106:0/1825203718 shutdown_connections 2026-03-08T23:59:37.971 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.972+0000 7ff57ea13700 1 -- 192.168.123.106:0/1825203718 wait complete. 
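
tasks.cephadm is now looping on "Waiting for 2 mons in monmap...": it re-runs "ceph mon dump -f json" and inspects the "mons" array, and the dump above still lists only rank 0 (vm03), so it polls again. A minimal sketch of such a wait loop, with run_mon_dump as a hypothetical stand-in for the cephadm shell invocation shown in the log:

    import json
    import subprocess
    import time

    def run_mon_dump() -> dict:
        # Stand-in for the logged command:
        #   cephadm ... shell ... -- ceph mon dump -f json
        out = subprocess.check_output(["ceph", "mon", "dump", "-f", "json"])
        return json.loads(out)

    # Poll until both vm03 and vm06 hold a rank in the monmap.
    while len(run_mon_dump()["mons"]) < 2:
        time.sleep(1)
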
2026-03-08T23:59:37.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.973+0000 7ff57ea13700 1 Processor -- start
2026-03-08T23:59:37.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.973+0000 7ff57ea13700 1 -- start start
2026-03-08T23:59:37.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.973+0000 7ff57ea13700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff578101590 0x7ff578197d40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:37.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.973+0000 7ff57ea13700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff578198280 con 0x7ff578101590
2026-03-08T23:59:37.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.973+0000 7ff577fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff578101590 0x7ff578197d40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:37.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.973+0000 7ff577fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff578101590 0x7ff578197d40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:43832/0 (socket says 192.168.123.106:43832)
2026-03-08T23:59:37.973 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.973+0000 7ff577fff700 1 -- 192.168.123.106:0/1787754490 learned_addr learned my addr 192.168.123.106:0/1787754490 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-08T23:59:37.973 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.974+0000 7ff577fff700 1 -- 192.168.123.106:0/1787754490 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff5600097e0 con 0x7ff578101590
2026-03-08T23:59:37.973 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.974+0000 7ff577fff700 1 --2- 192.168.123.106:0/1787754490 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff578101590 0x7ff578197d40 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7ff560004f40 tx=0x7ff560005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:37.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.974+0000 7ff5757fa700 1 -- 192.168.123.106:0/1787754490 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff56001c070 con 0x7ff578101590
2026-03-08T23:59:37.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.974+0000 7ff5757fa700 1 -- 192.168.123.106:0/1787754490 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff5600053b0 con 0x7ff578101590
2026-03-08T23:59:37.974 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.974+0000 7ff5757fa700 1 -- 192.168.123.106:0/1787754490 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff56000f460 con 0x7ff578101590
2026-03-08T23:59:37.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.974+0000 7ff57ea13700 1 -- 192.168.123.106:0/1787754490 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff578198480 con 0x7ff578101590
2026-03-08T23:59:37.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.974+0000 7ff57ea13700 1 -- 192.168.123.106:0/1787754490 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff578198920 con 0x7ff578101590
2026-03-08T23:59:37.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.975+0000 7ff5757fa700 1 -- 192.168.123.106:0/1787754490 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7ff560021470 con 0x7ff578101590
2026-03-08T23:59:37.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.975+0000 7ff57ea13700 1 -- 192.168.123.106:0/1787754490 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff5780623c0 con 0x7ff578101590
2026-03-08T23:59:37.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.975+0000 7ff5757fa700 1 --2- 192.168.123.106:0/1787754490 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff564038440 0x7ff56403a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:37.975 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.975+0000 7ff5757fa700 1 -- 192.168.123.106:0/1787754490 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7ff56004c360 con 0x7ff578101590
2026-03-08T23:59:37.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.977+0000 7ff5777fe700 1 --2- 192.168.123.106:0/1787754490 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff564038440 0x7ff56403a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:37.976 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.977+0000 7ff5777fe700 1 --2- 192.168.123.106:0/1787754490 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff564038440 0x7ff56403a900 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7ff568006fd0 tx=0x7ff568006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:37.977 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:37.978+0000 7ff5757fa700 1 -- 192.168.123.106:0/1787754490 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff56000f5e0 con 0x7ff578101590
2026-03-08T23:59:38.137 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:38.137+0000 7ff57ea13700 1 -- 192.168.123.106:0/1787754490 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7ff57802ced0 con 0x7ff578101590
2026-03-08T23:59:38.138 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:38.139+0000 7ff5757fa700 1 -- 192.168.123.106:0/1787754490 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7ff560030300 con 0x7ff578101590
2026-03-08T23:59:38.138 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:59:38.138 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-08T23:59:38.140 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:38.141+0000 7ff57ea13700 1 -- 192.168.123.106:0/1787754490 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff564038440 msgr2=0x7ff56403a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:38.141 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:38.141+0000 7ff57ea13700 1 --2- 192.168.123.106:0/1787754490 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff564038440 0x7ff56403a900 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7ff568006fd0 tx=0x7ff568006e40 comp rx=0 tx=0).stop
2026-03-08T23:59:38.141 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:38.141+0000 7ff57ea13700 1 -- 192.168.123.106:0/1787754490 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff578101590 msgr2=0x7ff578197d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:38.141 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:38.141+0000 7ff57ea13700 1 --2- 192.168.123.106:0/1787754490 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff578101590 0x7ff578197d40 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7ff560004f40 tx=0x7ff560005e70 comp rx=0 tx=0).stop
2026-03-08T23:59:38.141 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:38.141+0000 7ff57ea13700 1 -- 192.168.123.106:0/1787754490 shutdown_connections
2026-03-08T23:59:38.141 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:38.141+0000 7ff57ea13700 1 --2- 192.168.123.106:0/1787754490 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff564038440 0x7ff56403a900 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:38.141 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:38.141+0000 7ff57ea13700 1 --2- 192.168.123.106:0/1787754490 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff578101590 0x7ff578197d40 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:38.141 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:38.141+0000 7ff57ea13700 1 -- 192.168.123.106:0/1787754490 >> 192.168.123.106:0/1787754490 conn(0x7ff5780faf00 msgr2=0x7ff5780fd340 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:38.141 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:38.142+0000 7ff57ea13700 1 -- 192.168.123.106:0/1787754490 shutdown_connections
2026-03-08T23:59:38.141 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:38.142+0000 7ff57ea13700 1 -- 192.168.123.106:0/1787754490 wait complete.
2026-03-08T23:59:38.142 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1
2026-03-08T23:59:39.185 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-08T23:59:39.185 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json
2026-03-08T23:59:39.328 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-08T23:59:39.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:39 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:39.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:39 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:39.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:39 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:39.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:39 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:39.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:39 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:39.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:39 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:39.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:39 vm03 ceph-mon[52346]: Regenerating cephadm self-signed grafana TLS certificates
2026-03-08T23:59:39.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:39 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:39.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:39 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/1787754490' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-08T23:59:39.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:39 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:39.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:39 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
2026-03-08T23:59:39.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:39 vm03 ceph-mon[52346]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
2026-03-08T23:59:39.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:39 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:39.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:39 vm03 ceph-mon[52346]: Deploying daemon grafana.vm03 on vm03
2026-03-08T23:59:39.363 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-08T23:59:39.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.615+0000 7f1ffa8e1700 1 -- 192.168.123.106:0/3015655869 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ff4102c90 msgr2=0x7f1ff41030b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:39.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.615+0000 7f1ffa8e1700 1 --2- 192.168.123.106:0/3015655869 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ff4102c90 0x7f1ff41030b0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f1fe4009b00 tx=0x7f1fe4009e10 comp rx=0 tx=0).stop
2026-03-08T23:59:39.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.615+0000 7f1ffa8e1700 1 -- 192.168.123.106:0/3015655869 shutdown_connections
2026-03-08T23:59:39.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.615+0000 7f1ffa8e1700 1 --2- 192.168.123.106:0/3015655869 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ff4102c90 0x7f1ff41030b0 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:39.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.615+0000 7f1ffa8e1700 1 -- 192.168.123.106:0/3015655869 >> 192.168.123.106:0/3015655869 conn(0x7f1ff40fe230 msgr2=0x7f1ff4100670 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:39.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.616+0000 7f1ffa8e1700 1 -- 192.168.123.106:0/3015655869 shutdown_connections
2026-03-08T23:59:39.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.616+0000 7f1ffa8e1700 1 -- 192.168.123.106:0/3015655869 wait complete.
2026-03-08T23:59:39.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.616+0000 7f1ffa8e1700 1 Processor -- start
2026-03-08T23:59:39.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.616+0000 7f1ffa8e1700 1 -- start start
2026-03-08T23:59:39.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.616+0000 7f1ffa8e1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ff4102c90 0x7f1ff4197da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:39.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.616+0000 7f1ffa8e1700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1ff41982e0 con 0x7f1ff4102c90
2026-03-08T23:59:39.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.617+0000 7f1ff3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ff4102c90 0x7f1ff4197da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:39.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.617+0000 7f1ff3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ff4102c90 0x7f1ff4197da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:43848/0 (socket says 192.168.123.106:43848)
2026-03-08T23:59:39.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.617+0000 7f1ff3fff700 1 -- 192.168.123.106:0/3178802668 learned_addr learned my addr 192.168.123.106:0/3178802668 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-08T23:59:39.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.617+0000 7f1ff3fff700 1 -- 192.168.123.106:0/3178802668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1fe40097e0 con 0x7f1ff4102c90
2026-03-08T23:59:39.616 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.617+0000 7f1ff3fff700 1 --2- 192.168.123.106:0/3178802668 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ff4102c90 0x7f1ff4197da0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f1fe4004750 tx=0x7f1fe4005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:39.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.617+0000 7f1ff1ffb700 1 -- 192.168.123.106:0/3178802668 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1fe401c070 con 0x7f1ff4102c90
2026-03-08T23:59:39.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.617+0000 7f1ff1ffb700 1 -- 192.168.123.106:0/3178802668 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1fe4021470 con 0x7f1ff4102c90
2026-03-08T23:59:39.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.617+0000 7f1ff1ffb700 1 -- 192.168.123.106:0/3178802668 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1fe400f460 con 0x7f1ff4102c90
2026-03-08T23:59:39.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.617+0000 7f1ffa8e1700 1 -- 192.168.123.106:0/3178802668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1ff41984e0 con 0x7f1ff4102c90
2026-03-08T23:59:39.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.617+0000 7f1ffa8e1700 1 -- 192.168.123.106:0/3178802668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1ff41988c0 con 0x7f1ff4102c90
2026-03-08T23:59:39.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.618+0000 7f1ff1ffb700 1 -- 192.168.123.106:0/3178802668 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f1fe4021ac0 con 0x7f1ff4102c90
2026-03-08T23:59:39.617 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.618+0000 7f1ffa8e1700 1 -- 192.168.123.106:0/3178802668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1ff404fa90 con 0x7f1ff4102c90
2026-03-08T23:59:39.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.619+0000 7f1ff1ffb700 1 --2- 192.168.123.106:0/3178802668 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1fe00383f0 0x7f1fe003a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:39.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.619+0000 7f1ff1ffb700 1 -- 192.168.123.106:0/3178802668 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f1fe404c3b0 con 0x7f1ff4102c90
2026-03-08T23:59:39.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.619+0000 7f1feb5ff700 1 --2- 192.168.123.106:0/3178802668 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1fe00383f0 0x7f1fe003a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:39.618 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.619+0000 7f1feb5ff700 1 --2- 192.168.123.106:0/3178802668 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1fe00383f0 0x7f1fe003a8b0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f1fdc006fd0 tx=0x7f1fdc006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:39.620 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.621+0000 7f1ff1ffb700 1 -- 192.168.123.106:0/3178802668 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f1fe400f5c0 con 0x7f1ff4102c90
2026-03-08T23:59:39.772 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.771+0000 7f1ffa8e1700 1 -- 192.168.123.106:0/3178802668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f1ff402d090 con 0x7f1ff4102c90
2026-03-08T23:59:39.772 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.773+0000 7f1ff1ffb700 1 -- 192.168.123.106:0/3178802668 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f1fe4030300 con 0x7f1ff4102c90
2026-03-08T23:59:39.772 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:59:39.772 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-08T23:59:39.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.775+0000 7f1ffa8e1700 1 -- 192.168.123.106:0/3178802668 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1fe00383f0 msgr2=0x7f1fe003a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:39.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.775+0000 7f1ffa8e1700 1 --2- 192.168.123.106:0/3178802668 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1fe00383f0 0x7f1fe003a8b0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f1fdc006fd0 tx=0x7f1fdc006e40 comp rx=0 tx=0).stop
2026-03-08T23:59:39.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.775+0000 7f1ffa8e1700 1 -- 192.168.123.106:0/3178802668 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ff4102c90 msgr2=0x7f1ff4197da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:39.774 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.775+0000 7f1ffa8e1700 1 --2- 192.168.123.106:0/3178802668 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ff4102c90 0x7f1ff4197da0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f1fe4004750 tx=0x7f1fe4005dc0 comp rx=0 tx=0).stop
2026-03-08T23:59:39.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.776+0000 7f1ffa8e1700 1 -- 192.168.123.106:0/3178802668 shutdown_connections
2026-03-08T23:59:39.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.776+0000 7f1ffa8e1700 1 --2- 192.168.123.106:0/3178802668 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1fe00383f0 0x7f1fe003a8b0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:39.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.776+0000 7f1ffa8e1700 1 --2- 192.168.123.106:0/3178802668 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1ff4102c90 0x7f1ff4197da0 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:39.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.776+0000 7f1ffa8e1700 1 -- 192.168.123.106:0/3178802668 >> 192.168.123.106:0/3178802668 conn(0x7f1ff40fe230 msgr2=0x7f1ff40feef0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:39.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.776+0000 7f1ffa8e1700 1 -- 192.168.123.106:0/3178802668 shutdown_connections
2026-03-08T23:59:39.775 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:39.776+0000 7f1ffa8e1700 1 -- 192.168.123.106:0/3178802668 wait complete.
2026-03-08T23:59:39.776 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1
2026-03-08T23:59:40.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:40 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/3178802668' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-08T23:59:40.835 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-08T23:59:40.836 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json
2026-03-08T23:59:40.976 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-08T23:59:41.018 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-08T23:59:41.282 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.283+0000 7f88e4a62700 1 -- 192.168.123.106:0/2303998973 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88e0102cb0 msgr2=0x7f88e01030d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:41.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.283+0000 7f88e4a62700 1 --2- 192.168.123.106:0/2303998973 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88e0102cb0 0x7f88e01030d0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f88d0009b00 tx=0x7f88d0009e10 comp rx=0 tx=0).stop
2026-03-08T23:59:41.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.283+0000 7f88e4a62700 1 -- 192.168.123.106:0/2303998973 shutdown_connections
2026-03-08T23:59:41.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.283+0000 7f88e4a62700 1 --2- 192.168.123.106:0/2303998973 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88e0102cb0 0x7f88e01030d0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:41.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.283+0000 7f88e4a62700 1 -- 192.168.123.106:0/2303998973 >> 192.168.123.106:0/2303998973 conn(0x7f88e00fe250 msgr2=0x7f88e0100690 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:41.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.284+0000 7f88e4a62700 1 -- 192.168.123.106:0/2303998973 shutdown_connections
2026-03-08T23:59:41.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.284+0000 7f88e4a62700 1 -- 192.168.123.106:0/2303998973 wait complete.
2026-03-08T23:59:41.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.284+0000 7f88e4a62700 1 Processor -- start
2026-03-08T23:59:41.283 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.284+0000 7f88e4a62700 1 -- start start
2026-03-08T23:59:41.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.285+0000 7f88e4a62700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88e0102cb0 0x7f88e0197e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:41.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.285+0000 7f88e4a62700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f88e0198360 con 0x7f88e0102cb0
2026-03-08T23:59:41.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.285+0000 7f88de59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88e0102cb0 0x7f88e0197e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:41.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.285+0000 7f88de59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88e0102cb0 0x7f88e0197e20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:43876/0 (socket says 192.168.123.106:43876)
2026-03-08T23:59:41.284 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.285+0000 7f88de59c700 1 -- 192.168.123.106:0/2678735361 learned_addr learned my addr 192.168.123.106:0/2678735361 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-08T23:59:41.285 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.286+0000 7f88de59c700 1 -- 192.168.123.106:0/2678735361 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f88d00097e0 con 0x7f88e0102cb0
2026-03-08T23:59:41.285 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.286+0000 7f88de59c700 1 --2- 192.168.123.106:0/2678735361 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88e0102cb0 0x7f88e0197e20 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f88d0004d40 tx=0x7f88d0004e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:41.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.286+0000 7f88d77fe700 1 -- 192.168.123.106:0/2678735361 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f88d001c070 con 0x7f88e0102cb0
2026-03-08T23:59:41.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.286+0000 7f88d77fe700 1 -- 192.168.123.106:0/2678735361 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f88d00056f0 con 0x7f88e0102cb0
2026-03-08T23:59:41.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.286+0000 7f88d77fe700 1 -- 192.168.123.106:0/2678735361 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f88d0017440 con 0x7f88e0102cb0
2026-03-08T23:59:41.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.286+0000 7f88e4a62700 1 -- 192.168.123.106:0/2678735361 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f88e0198560 con 0x7f88e0102cb0
2026-03-08T23:59:41.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.286+0000 7f88e4a62700 1 -- 192.168.123.106:0/2678735361 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f88e0198a00 con 0x7f88e0102cb0
2026-03-08T23:59:41.286 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.287+0000 7f88d77fe700 1 -- 192.168.123.106:0/2678735361 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f88d0005210 con 0x7f88e0102cb0
2026-03-08T23:59:41.287 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.287+0000 7f88d77fe700 1 --2- 192.168.123.106:0/2678735361 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f88cc038440 0x7f88cc03a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:41.287 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.288+0000 7f88d77fe700 1 -- 192.168.123.106:0/2678735361 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f88d004c000 con 0x7f88e0102cb0
2026-03-08T23:59:41.287 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.288+0000 7f88e4a62700 1 -- 192.168.123.106:0/2678735361 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f88e0191a40 con 0x7f88e0102cb0
2026-03-08T23:59:41.287 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.288+0000 7f88d7fff700 1 --2- 192.168.123.106:0/2678735361 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f88cc038440 0x7f88cc03a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:41.288 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.289+0000 7f88d7fff700 1 --2- 192.168.123.106:0/2678735361 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f88cc038440 0x7f88cc03a900 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f88c8006fd0 tx=0x7f88c8006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:41.290 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.291+0000 7f88d77fe700 1 -- 192.168.123.106:0/2678735361 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f88d0025070 con 0x7f88e0102cb0
2026-03-08T23:59:41.445 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.446+0000 7f88e4a62700 1 -- 192.168.123.106:0/2678735361 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f88e00623c0 con 0x7f88e0102cb0
2026-03-08T23:59:41.446 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.447+0000 7f88d77fe700 1 -- 192.168.123.106:0/2678735361 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f88d000f980 con 0x7f88e0102cb0
2026-03-08T23:59:41.446 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:59:41.446 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-08T23:59:41.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.449+0000 7f88e4a62700 1 -- 192.168.123.106:0/2678735361 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f88cc038440 msgr2=0x7f88cc03a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:41.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.449+0000 7f88e4a62700 1 --2- 192.168.123.106:0/2678735361 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f88cc038440 0x7f88cc03a900 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f88c8006fd0 tx=0x7f88c8006e40 comp rx=0 tx=0).stop
2026-03-08T23:59:41.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.449+0000 7f88e4a62700 1 -- 192.168.123.106:0/2678735361 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88e0102cb0 msgr2=0x7f88e0197e20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:41.449 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.449+0000 7f88e4a62700 1 --2- 192.168.123.106:0/2678735361 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88e0102cb0 0x7f88e0197e20 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f88d0004d40 tx=0x7f88d0004e20 comp rx=0 tx=0).stop
2026-03-08T23:59:41.449 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.450+0000 7f88e4a62700 1 -- 192.168.123.106:0/2678735361 shutdown_connections
2026-03-08T23:59:41.449 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.450+0000 7f88e4a62700 1 --2- 192.168.123.106:0/2678735361 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f88cc038440 0x7f88cc03a900 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:41.449 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.450+0000 7f88e4a62700 1 --2- 192.168.123.106:0/2678735361 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f88e0102cb0 0x7f88e0197e20 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:41.449 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.450+0000 7f88e4a62700 1 -- 192.168.123.106:0/2678735361 >> 192.168.123.106:0/2678735361 conn(0x7f88e00fe250 msgr2=0x7f88e00fef10 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:41.449 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.450+0000 7f88e4a62700 1 -- 192.168.123.106:0/2678735361 shutdown_connections
2026-03-08T23:59:41.450 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:41.450+0000 7f88e4a62700 1 -- 192.168.123.106:0/2678735361 wait complete.
2026-03-08T23:59:41.450 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1
2026-03-08T23:59:41.522 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:41 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-08T23:59:41.522 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:41 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/2678735361' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-08T23:59:42.512 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-08T23:59:42.513 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json
2026-03-08T23:59:42.649 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-08T23:59:42.685 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-08T23:59:42.942 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.942+0000 7fb3143c2700 1 -- 192.168.123.106:0/1745611697 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb30c102cb0 msgr2=0x7fb30c1030d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:42.942 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.942+0000 7fb3143c2700 1 --2- 192.168.123.106:0/1745611697 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb30c102cb0 0x7fb30c1030d0 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fb2fc009b00 tx=0x7fb2fc009e10 comp rx=0 tx=0).stop
2026-03-08T23:59:42.942 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.942+0000 7fb3143c2700 1 -- 192.168.123.106:0/1745611697 shutdown_connections
2026-03-08T23:59:42.942 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.942+0000 7fb3143c2700 1 --2- 192.168.123.106:0/1745611697 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb30c102cb0 0x7fb30c1030d0 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:42.942 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.942+0000 7fb3143c2700 1 -- 192.168.123.106:0/1745611697 >> 192.168.123.106:0/1745611697 conn(0x7fb30c0fe250 msgr2=0x7fb30c100690 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:42.942 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.943+0000 7fb3143c2700 1 -- 192.168.123.106:0/1745611697 shutdown_connections
2026-03-08T23:59:42.942 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.943+0000 7fb3143c2700 1 -- 192.168.123.106:0/1745611697 wait complete.
2026-03-08T23:59:42.942 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.943+0000 7fb3143c2700 1 Processor -- start
2026-03-08T23:59:42.942 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.943+0000 7fb3143c2700 1 -- start start
2026-03-08T23:59:42.942 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.943+0000 7fb3143c2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb30c102cb0 0x7fb30c197e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:42.943 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.943+0000 7fb3143c2700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb30c198360 con 0x7fb30c102cb0
2026-03-08T23:59:42.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.944+0000 7fb31215e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb30c102cb0 0x7fb30c197e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:42.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.944+0000 7fb31215e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb30c102cb0 0x7fb30c197e20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:43886/0 (socket says 192.168.123.106:43886)
2026-03-08T23:59:42.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.944+0000 7fb31215e700 1 -- 192.168.123.106:0/393097958 learned_addr learned my addr 192.168.123.106:0/393097958 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-08T23:59:42.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.945+0000 7fb31215e700 1 -- 192.168.123.106:0/393097958 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb2fc0097e0 con 0x7fb30c102cb0
2026-03-08T23:59:42.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.945+0000 7fb31215e700 1 --2- 192.168.123.106:0/393097958 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb30c102cb0 0x7fb30c197e20 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fb2fc004d40 tx=0x7fb2fc004e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:42.944 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.945+0000 7fb3037fe700 1 -- 192.168.123.106:0/393097958 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb2fc01c070 con 0x7fb30c102cb0
2026-03-08T23:59:42.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.945+0000 7fb3143c2700 1 -- 192.168.123.106:0/393097958 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb30c198560 con 0x7fb30c102cb0
2026-03-08T23:59:42.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.945+0000 7fb3143c2700 1 -- 192.168.123.106:0/393097958 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb30c198a00 con 0x7fb30c102cb0
2026-03-08T23:59:42.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.945+0000 7fb3037fe700 1 -- 192.168.123.106:0/393097958 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb2fc0056f0 con 0x7fb30c102cb0
2026-03-08T23:59:42.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.945+0000 7fb3037fe700 1 -- 192.168.123.106:0/393097958 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb2fc017440 con 0x7fb30c102cb0
2026-03-08T23:59:42.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.946+0000 7fb3037fe700 1 -- 192.168.123.106:0/393097958 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7fb2fc0175a0 con 0x7fb30c102cb0
2026-03-08T23:59:42.945 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.946+0000 7fb3037fe700 1 --2- 192.168.123.106:0/393097958 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb2f8038440 0x7fb2f803a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:42.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.946+0000 7fb3037fe700 1 -- 192.168.123.106:0/393097958 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fb2fc04bf90 con 0x7fb30c102cb0
2026-03-08T23:59:42.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.946+0000 7fb3143c2700 1 -- 192.168.123.106:0/393097958 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb2f0005320 con 0x7fb30c102cb0
2026-03-08T23:59:42.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.946+0000 7fb31195d700 1 --2- 192.168.123.106:0/393097958 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb2f8038440 0x7fb2f803a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:42.946 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.947+0000 7fb31195d700 1 --2- 192.168.123.106:0/393097958 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb2f8038440 0x7fb2f803a900 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fb308006fd0 tx=0x7fb308006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:42.948 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:42.949+0000 7fb3037fe700 1 -- 192.168.123.106:0/393097958 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb2fc025070 con 0x7fb30c102cb0
2026-03-08T23:59:43.093 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:43.092+0000 7fb3143c2700 1 -- 192.168.123.106:0/393097958 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fb2f0005190 con 0x7fb30c102cb0
2026-03-08T23:59:43.093 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:43.094+0000 7fb3037fe700 1 -- 192.168.123.106:0/393097958 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fb2fc028970 con 0x7fb30c102cb0
2026-03-08T23:59:43.094 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-08T23:59:43.094 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-08T23:59:43.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:43.096+0000 7fb3143c2700 1 -- 192.168.123.106:0/393097958 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb2f8038440 msgr2=0x7fb2f803a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:43.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:43.096+0000 7fb3143c2700 1 --2- 192.168.123.106:0/393097958 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb2f8038440 0x7fb2f803a900 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fb308006fd0 tx=0x7fb308006e40 comp rx=0 tx=0).stop
2026-03-08T23:59:43.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:43.096+0000 7fb3143c2700 1 -- 192.168.123.106:0/393097958 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb30c102cb0 msgr2=0x7fb30c197e20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:43.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:43.096+0000 7fb3143c2700 1 --2- 192.168.123.106:0/393097958 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb30c102cb0 0x7fb30c197e20 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fb2fc004d40 tx=0x7fb2fc004e20 comp rx=0 tx=0).stop
2026-03-08T23:59:43.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:43.097+0000 7fb3143c2700 1 -- 192.168.123.106:0/393097958 shutdown_connections
2026-03-08T23:59:43.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:43.097+0000 7fb3143c2700 1 --2- 192.168.123.106:0/393097958 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fb2f8038440 0x7fb2f803a900 secure :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fb308006fd0 tx=0x7fb308006e40 comp rx=0 tx=0).stop
2026-03-08T23:59:43.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:43.097+0000 7fb3143c2700 1 --2- 192.168.123.106:0/393097958 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb30c102cb0 0x7fb30c197e20 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:43.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:43.097+0000 7fb3143c2700 1 -- 192.168.123.106:0/393097958 >> 192.168.123.106:0/393097958 conn(0x7fb30c0fe250 msgr2=0x7fb30c0fef10 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:43.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:43.097+0000 7fb3143c2700 1 -- 192.168.123.106:0/393097958 shutdown_connections
2026-03-08T23:59:43.096 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:43.097+0000 7fb3143c2700 1 -- 192.168.123.106:0/393097958 wait complete.
2026-03-08T23:59:43.097 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1
2026-03-08T23:59:43.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:43 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/393097958' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-08T23:59:44.145 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-08T23:59:44.145 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json
2026-03-08T23:59:44.289 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-08T23:59:44.324 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-08T23:59:44.560 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.561+0000 7f8af85e7700 1 -- 192.168.123.106:0/1367545723 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8af0102cb0 msgr2=0x7f8af01030d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-08T23:59:44.561 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.561+0000 7f8af85e7700 1 --2- 192.168.123.106:0/1367545723 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8af0102cb0 0x7f8af01030d0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f8ae0009b00 tx=0x7f8ae0009e10 comp rx=0 tx=0).stop
2026-03-08T23:59:44.561 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.561+0000 7f8af85e7700 1 -- 192.168.123.106:0/1367545723 shutdown_connections
2026-03-08T23:59:44.561 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.561+0000 7f8af85e7700 1 --2- 192.168.123.106:0/1367545723 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8af0102cb0 0x7f8af01030d0 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-08T23:59:44.561 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.561+0000 7f8af85e7700 1 -- 192.168.123.106:0/1367545723 >> 192.168.123.106:0/1367545723 conn(0x7f8af00fe250 msgr2=0x7f8af0100690 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-08T23:59:44.561 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.562+0000 7f8af85e7700 1 -- 192.168.123.106:0/1367545723 shutdown_connections
2026-03-08T23:59:44.561 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.562+0000 7f8af85e7700 1 -- 192.168.123.106:0/1367545723 wait complete.
2026-03-08T23:59:44.561 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.562+0000 7f8af85e7700 1 Processor -- start
2026-03-08T23:59:44.561 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.562+0000 7f8af85e7700 1 -- start start
2026-03-08T23:59:44.561 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.562+0000 7f8af85e7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8af0102cb0 0x7f8af0197e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-08T23:59:44.561 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.562+0000 7f8af85e7700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8af0198360 con 0x7f8af0102cb0
2026-03-08T23:59:44.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.563+0000 7f8af6383700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8af0102cb0 0x7f8af0197e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-08T23:59:44.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.563+0000 7f8af6383700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8af0102cb0 0x7f8af0197e20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:43902/0 (socket says 192.168.123.106:43902)
2026-03-08T23:59:44.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.563+0000 7f8af6383700 1 -- 192.168.123.106:0/2432334507 learned_addr learned my addr 192.168.123.106:0/2432334507 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-08T23:59:44.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.563+0000 7f8af6383700 1 -- 192.168.123.106:0/2432334507 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8ae00097e0 con 0x7f8af0102cb0
2026-03-08T23:59:44.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.563+0000 7f8af6383700 1 --2- 192.168.123.106:0/2432334507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8af0102cb0 0x7f8af0197e20 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f8ae0004d40 tx=0x7f8ae0004e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-08T23:59:44.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.563+0000 7f8ae77fe700 1 -- 192.168.123.106:0/2432334507 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8ae001c070 con 0x7f8af0102cb0
2026-03-08T23:59:44.562 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.563+0000 7f8af85e7700 1 -- 192.168.123.106:0/2432334507 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8af0198560 con 0x7f8af0102cb0
2026-03-08T23:59:44.563 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.563+0000 7f8af85e7700 1 -- 192.168.123.106:0/2432334507 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8af0198a00 con 0x7f8af0102cb0
2026-03-08T23:59:44.563 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.564+0000 7f8ae77fe700 1 -- 192.168.123.106:0/2432334507 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8ae00054e0 con 0x7f8af0102cb0
2026-03-08T23:59:44.563 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.564+0000 7f8ae77fe700 1 -- 192.168.123.106:0/2432334507 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8ae0003b70 con 0x7f8af0102cb0 2026-03-08T23:59:44.563 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.564+0000 7f8af85e7700 1 -- 192.168.123.106:0/2432334507 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8af0191a40 con 0x7f8af0102cb0 2026-03-08T23:59:44.564 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.564+0000 7f8ae77fe700 1 -- 192.168.123.106:0/2432334507 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f8ae000f460 con 0x7f8af0102cb0 2026-03-08T23:59:44.564 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.565+0000 7f8ae77fe700 1 --2- 192.168.123.106:0/2432334507 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8adc038440 0x7f8adc03a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:44.564 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.565+0000 7f8ae77fe700 1 -- 192.168.123.106:0/2432334507 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f8ae004d360 con 0x7f8af0102cb0 2026-03-08T23:59:44.564 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.565+0000 7f8af5b82700 1 --2- 192.168.123.106:0/2432334507 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8adc038440 0x7f8adc03a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:44.564 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.565+0000 7f8af5b82700 1 --2- 192.168.123.106:0/2432334507 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8adc038440 0x7f8adc03a900 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f8aec006fd0 tx=0x7f8aec006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:44.566 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.567+0000 7f8ae77fe700 1 -- 192.168.123.106:0/2432334507 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f8ae0029980 con 0x7f8af0102cb0 2026-03-08T23:59:44.717 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.717+0000 7f8af85e7700 1 -- 192.168.123.106:0/2432334507 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f8af002cc70 con 0x7f8af0102cb0 2026-03-08T23:59:44.718 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.719+0000 7f8ae77fe700 1 -- 192.168.123.106:0/2432334507 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f8ae0029390 con 0x7f8af0102cb0 2026-03-08T23:59:44.719 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T23:59:44.719 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-08T23:59:44.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.722+0000 7f8af85e7700 1 -- 192.168.123.106:0/2432334507 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8adc038440 msgr2=0x7f8adc03a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:44.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.722+0000 7f8af85e7700 1 --2- 192.168.123.106:0/2432334507 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8adc038440 0x7f8adc03a900 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f8aec006fd0 tx=0x7f8aec006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:44.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.722+0000 7f8af85e7700 1 -- 192.168.123.106:0/2432334507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8af0102cb0 msgr2=0x7f8af0197e20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:44.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.722+0000 7f8af85e7700 1 --2- 192.168.123.106:0/2432334507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8af0102cb0 0x7f8af0197e20 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f8ae0004d40 tx=0x7f8ae0004e20 comp rx=0 tx=0).stop 2026-03-08T23:59:44.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.722+0000 7f8af85e7700 1 -- 192.168.123.106:0/2432334507 shutdown_connections 2026-03-08T23:59:44.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.722+0000 7f8af85e7700 1 --2- 192.168.123.106:0/2432334507 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f8adc038440 0x7f8adc03a900 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:44.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.722+0000 7f8af85e7700 1 --2- 192.168.123.106:0/2432334507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8af0102cb0 0x7f8af0197e20 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:44.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.722+0000 7f8af85e7700 1 -- 192.168.123.106:0/2432334507 >> 192.168.123.106:0/2432334507 conn(0x7f8af00fe250 msgr2=0x7f8af00fef10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:44.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.722+0000 7f8af85e7700 1 -- 192.168.123.106:0/2432334507 shutdown_connections 2026-03-08T23:59:44.721 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:44.722+0000 7f8af85e7700 1 -- 192.168.123.106:0/2432334507 wait complete. 2026-03-08T23:59:44.722 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-08T23:59:45.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:44 vm03 ceph-mon[52346]: from='client.? 
192.168.123.106:0/2432334507' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-08T23:59:45.769 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-08T23:59:45.769 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json 2026-03-08T23:59:45.919 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:45.955 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:46.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.250+0000 7f00d4077700 1 -- 192.168.123.106:0/3713139470 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc1014f0 msgr2=0x7f00cc1038e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:46.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.250+0000 7f00d4077700 1 --2- 192.168.123.106:0/3713139470 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc1014f0 0x7f00cc1038e0 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f00c0009b00 tx=0x7f00c0009e10 comp rx=0 tx=0).stop 2026-03-08T23:59:46.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.250+0000 7f00d4077700 1 -- 192.168.123.106:0/3713139470 shutdown_connections 2026-03-08T23:59:46.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.250+0000 7f00d4077700 1 --2- 192.168.123.106:0/3713139470 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc1014f0 0x7f00cc1038e0 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:46.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.250+0000 7f00d4077700 1 -- 192.168.123.106:0/3713139470 >> 192.168.123.106:0/3713139470 conn(0x7f00cc0faf00 msgr2=0x7f00cc0fd340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:46.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.250+0000 7f00d4077700 1 -- 192.168.123.106:0/3713139470 shutdown_connections 2026-03-08T23:59:46.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.250+0000 7f00d4077700 1 -- 192.168.123.106:0/3713139470 wait complete. 
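Each attempt's stderr is a complete msgr2 client lifecycle at debug ms = 1: connect, _handle_peer_banner_payload, handle_hello (where the throwaway client learns its own address and nonce), READY against mon.0, mon_subscribe for config/monmap/mgrmap/osdmap, get_command_descriptions, the mon_command itself, and finally mark_down / shutdown_connections / wait complete. A small tracer over lines in this format makes the handshake easier to follow; this is a sketch assuming the exact layout shown above:

    import re
    import sys

    # Match the peer address and the connection-state field (s=...) in
    # messenger lines like:
    #   ">> [v2:192.168.123.103:3300/0,v1:...] conn(0x... secure :-1 s=READY pgs=116 ..."
    PAT = re.compile(r">> \[(v2:[^,]+),[^\]]*\].*?s=([A-Z_]+)")

    def trace(stream):
        for line in stream:
            m = PAT.search(line)
            if m:
                peer, state = m.groups()
                print(f"{peer} -> {state}")

    trace(sys.stdin)   # e.g. feed it the vm06 stderr lines above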
2026-03-08T23:59:46.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.251+0000 7f00d4077700 1 Processor -- start 2026-03-08T23:59:46.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.251+0000 7f00d4077700 1 -- start start 2026-03-08T23:59:46.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.251+0000 7f00d4077700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc1014f0 0x7f00cc197d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:46.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.251+0000 7f00d4077700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f00cc198290 con 0x7f00cc1014f0 2026-03-08T23:59:46.251 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.252+0000 7f00d1e13700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc1014f0 0x7f00cc197d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:46.251 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.252+0000 7f00d1e13700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc1014f0 0x7f00cc197d50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:52790/0 (socket says 192.168.123.106:52790) 2026-03-08T23:59:46.251 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.252+0000 7f00d1e13700 1 -- 192.168.123.106:0/2829586620 learned_addr learned my addr 192.168.123.106:0/2829586620 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-08T23:59:46.251 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.252+0000 7f00d1e13700 1 -- 192.168.123.106:0/2829586620 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f00c00097e0 con 0x7f00cc1014f0 2026-03-08T23:59:46.251 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.252+0000 7f00d1e13700 1 --2- 192.168.123.106:0/2829586620 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc1014f0 0x7f00cc197d50 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f00c0004f40 tx=0x7f00c0005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:46.251 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.252+0000 7f00beffd700 1 -- 192.168.123.106:0/2829586620 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f00c001c070 con 0x7f00cc1014f0 2026-03-08T23:59:46.251 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.252+0000 7f00d4077700 1 -- 192.168.123.106:0/2829586620 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f00cc198490 con 0x7f00cc1014f0 2026-03-08T23:59:46.251 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.252+0000 7f00d4077700 1 -- 192.168.123.106:0/2829586620 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f00cc198930 con 0x7f00cc1014f0 2026-03-08T23:59:46.252 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.253+0000 7f00beffd700 1 -- 192.168.123.106:0/2829586620 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f00c00053b0 con 0x7f00cc1014f0 
2026-03-08T23:59:46.252 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.253+0000 7f00beffd700 1 -- 192.168.123.106:0/2829586620 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f00c000f460 con 0x7f00cc1014f0 2026-03-08T23:59:46.253 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.253+0000 7f00beffd700 1 -- 192.168.123.106:0/2829586620 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f00c000f6d0 con 0x7f00cc1014f0 2026-03-08T23:59:46.253 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.254+0000 7f00beffd700 1 --2- 192.168.123.106:0/2829586620 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f00b8038440 0x7f00b803a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:46.253 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.254+0000 7f00beffd700 1 -- 192.168.123.106:0/2829586620 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f00c004d5f0 con 0x7f00cc1014f0 2026-03-08T23:59:46.253 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.254+0000 7f00d4077700 1 -- 192.168.123.106:0/2829586620 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f00cc191bf0 con 0x7f00cc1014f0 2026-03-08T23:59:46.253 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.254+0000 7f00d1612700 1 --2- 192.168.123.106:0/2829586620 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f00b8038440 0x7f00b803a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:46.253 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.254+0000 7f00d1612700 1 --2- 192.168.123.106:0/2829586620 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f00b8038440 0x7f00b803a900 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f00c8006fd0 tx=0x7f00c8006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:46.256 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.257+0000 7f00beffd700 1 -- 192.168.123.106:0/2829586620 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f00c0017440 con 0x7f00cc1014f0 2026-03-08T23:59:46.410 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.410+0000 7f00d4077700 1 -- 192.168.123.106:0/2829586620 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f00cc0623c0 con 0x7f00cc1014f0 2026-03-08T23:59:46.410 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.411+0000 7f00beffd700 1 -- 192.168.123.106:0/2829586620 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f00c001fb60 con 0x7f00cc1014f0 2026-03-08T23:59:46.411 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T23:59:46.412 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-08T23:59:46.414 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.415+0000 7f00d4077700 1 -- 192.168.123.106:0/2829586620 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f00b8038440 msgr2=0x7f00b803a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:46.414 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.415+0000 7f00d4077700 1 --2- 192.168.123.106:0/2829586620 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f00b8038440 0x7f00b803a900 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f00c8006fd0 tx=0x7f00c8006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:46.414 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.415+0000 7f00d4077700 1 -- 192.168.123.106:0/2829586620 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc1014f0 msgr2=0x7f00cc197d50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:46.414 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.415+0000 7f00d4077700 1 --2- 192.168.123.106:0/2829586620 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc1014f0 0x7f00cc197d50 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f00c0004f40 tx=0x7f00c0005e70 comp rx=0 tx=0).stop 2026-03-08T23:59:46.415 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.416+0000 7f00d4077700 1 -- 192.168.123.106:0/2829586620 shutdown_connections 2026-03-08T23:59:46.415 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.416+0000 7f00d4077700 1 --2- 192.168.123.106:0/2829586620 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f00b8038440 0x7f00b803a900 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:46.415 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.416+0000 7f00d4077700 1 --2- 192.168.123.106:0/2829586620 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00cc1014f0 0x7f00cc197d50 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:46.415 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.416+0000 7f00d4077700 1 -- 192.168.123.106:0/2829586620 >> 192.168.123.106:0/2829586620 conn(0x7f00cc0faf00 msgr2=0x7f00cc0fd340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:46.415 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.416+0000 7f00d4077700 1 -- 192.168.123.106:0/2829586620 shutdown_connections 2026-03-08T23:59:46.415 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:46.416+0000 7f00d4077700 1 -- 192.168.123.106:0/2829586620 wait complete. 2026-03-08T23:59:46.416 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-08T23:59:47.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:46 vm03 ceph-mon[52346]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-08T23:59:47.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:46 vm03 ceph-mon[52346]: from='client.? 
192.168.123.106:0/2829586620' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-08T23:59:47.495 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-08T23:59:47.495 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json 2026-03-08T23:59:47.644 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:47.685 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:48.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:48 vm03 ceph-mon[52346]: pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-08T23:59:48.347 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.348+0000 7fdad7b51700 1 -- 192.168.123.106:0/3165902852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdad01015b0 msgr2=0x7fdad01039a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:48.347 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.348+0000 7fdad7b51700 1 --2- 192.168.123.106:0/3165902852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdad01015b0 0x7fdad01039a0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7fdac0009b00 tx=0x7fdac0009e10 comp rx=0 tx=0).stop 2026-03-08T23:59:48.348 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.349+0000 7fdad7b51700 1 -- 192.168.123.106:0/3165902852 shutdown_connections 2026-03-08T23:59:48.348 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.349+0000 7fdad7b51700 1 --2- 192.168.123.106:0/3165902852 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdad01015b0 0x7fdad01039a0 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:48.348 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.349+0000 7fdad7b51700 1 -- 192.168.123.106:0/3165902852 >> 192.168.123.106:0/3165902852 conn(0x7fdad00faf00 msgr2=0x7fdad00fd340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:48.348 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.349+0000 7fdad7b51700 1 -- 192.168.123.106:0/3165902852 shutdown_connections 2026-03-08T23:59:48.348 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.349+0000 7fdad7b51700 1 -- 192.168.123.106:0/3165902852 wait complete. 
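The stdout payload repeated by each attempt is the decoded monmap: epoch 1, a single mon named vm03 at v2:192.168.123.103:3300 / v1:192.168.123.103:6789, quorum [0], min_mon_release reef. Note that the keys rendered as "disallowed_leaders: " and "removed_ranks: " carry a trailing colon and space inside the key string itself, so a consumer has to look them up with that exact spelling; the fields the poll actually needs (mons, quorum) are conventionally named. A trimmed, runnable illustration (the sample below keeps only a subset of the fields shown in the dump):

    import json

    # Trimmed from the monmap dump above; only the fields used here are kept.
    dump = json.loads('''{
        "epoch": 1,
        "min_mon_release_name": "reef",
        "disallowed_leaders: ": "",
        "mons": [{"rank": 0, "name": "vm03",
                  "addr": "192.168.123.103:6789/0"}],
        "quorum": [0]
    }''')

    assert "disallowed_leaders" not in dump       # plain key is absent
    assert dump["disallowed_leaders: "] == ""     # trailing ': ' is part of the key
    print(len(dump["mons"]), dump["quorum"])      # -> 1 [0]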
2026-03-08T23:59:48.349 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.350+0000 7fdad7b51700 1 Processor -- start 2026-03-08T23:59:48.349 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.350+0000 7fdad7b51700 1 -- start start 2026-03-08T23:59:48.349 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.350+0000 7fdad7b51700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdad01015b0 0x7fdad0195b30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:48.349 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.350+0000 7fdad7b51700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdad0196070 con 0x7fdad01015b0 2026-03-08T23:59:48.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.351+0000 7fdad58ed700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdad01015b0 0x7fdad0195b30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:48.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.351+0000 7fdad58ed700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdad01015b0 0x7fdad0195b30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:52806/0 (socket says 192.168.123.106:52806) 2026-03-08T23:59:48.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.351+0000 7fdad58ed700 1 -- 192.168.123.106:0/3127532329 learned_addr learned my addr 192.168.123.106:0/3127532329 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-08T23:59:48.350 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.351+0000 7fdad58ed700 1 -- 192.168.123.106:0/3127532329 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdac00097e0 con 0x7fdad01015b0 2026-03-08T23:59:48.351 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.352+0000 7fdad58ed700 1 --2- 192.168.123.106:0/3127532329 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdad01015b0 0x7fdad0195b30 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7fdac0004f40 tx=0x7fdac0005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:48.352 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.352+0000 7fdac6ffd700 1 -- 192.168.123.106:0/3127532329 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdac001c070 con 0x7fdad01015b0 2026-03-08T23:59:48.352 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.352+0000 7fdac6ffd700 1 -- 192.168.123.106:0/3127532329 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fdac00053b0 con 0x7fdad01015b0 2026-03-08T23:59:48.352 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.352+0000 7fdac6ffd700 1 -- 192.168.123.106:0/3127532329 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdac000f590 con 0x7fdad01015b0 2026-03-08T23:59:48.352 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.352+0000 7fdad7b51700 1 -- 192.168.123.106:0/3127532329 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdad0196270 con 0x7fdad01015b0 
2026-03-08T23:59:48.352 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.352+0000 7fdad7b51700 1 -- 192.168.123.106:0/3127532329 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdad0196710 con 0x7fdad01015b0 2026-03-08T23:59:48.352 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.353+0000 7fdac6ffd700 1 -- 192.168.123.106:0/3127532329 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7fdac0021a50 con 0x7fdad01015b0 2026-03-08T23:59:48.352 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.353+0000 7fdad7b51700 1 -- 192.168.123.106:0/3127532329 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdad018f800 con 0x7fdad01015b0 2026-03-08T23:59:48.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.353+0000 7fdac6ffd700 1 --2- 192.168.123.106:0/3127532329 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdabc0383f0 0x7fdabc03a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:48.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.353+0000 7fdac6ffd700 1 -- 192.168.123.106:0/3127532329 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fdac004c360 con 0x7fdad01015b0 2026-03-08T23:59:48.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.354+0000 7fdad50ec700 1 --2- 192.168.123.106:0/3127532329 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdabc0383f0 0x7fdabc03a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:48.353 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.354+0000 7fdad50ec700 1 --2- 192.168.123.106:0/3127532329 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdabc0383f0 0x7fdabc03a8b0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fdacc006fd0 tx=0x7fdacc006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:48.360 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.361+0000 7fdac6ffd700 1 -- 192.168.123.106:0/3127532329 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fdac000f6f0 con 0x7fdad01015b0 2026-03-08T23:59:48.504 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.505+0000 7fdad7b51700 1 -- 192.168.123.106:0/3127532329 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fdad00623c0 con 0x7fdad01015b0 2026-03-08T23:59:48.504 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.505+0000 7fdac6ffd700 1 -- 192.168.123.106:0/3127532329 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fdac0026030 con 0x7fdad01015b0 2026-03-08T23:59:48.505 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T23:59:48.505 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-08T23:59:48.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.508+0000 7fdad7b51700 1 -- 192.168.123.106:0/3127532329 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdabc0383f0 msgr2=0x7fdabc03a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:48.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.508+0000 7fdad7b51700 1 --2- 192.168.123.106:0/3127532329 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdabc0383f0 0x7fdabc03a8b0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fdacc006fd0 tx=0x7fdacc006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:48.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.508+0000 7fdad7b51700 1 -- 192.168.123.106:0/3127532329 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdad01015b0 msgr2=0x7fdad0195b30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:48.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.508+0000 7fdad7b51700 1 --2- 192.168.123.106:0/3127532329 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdad01015b0 0x7fdad0195b30 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7fdac0004f40 tx=0x7fdac0005e70 comp rx=0 tx=0).stop 2026-03-08T23:59:48.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.508+0000 7fdad7b51700 1 -- 192.168.123.106:0/3127532329 shutdown_connections 2026-03-08T23:59:48.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.508+0000 7fdad7b51700 1 --2- 192.168.123.106:0/3127532329 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fdabc0383f0 0x7fdabc03a8b0 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:48.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.508+0000 7fdad7b51700 1 --2- 192.168.123.106:0/3127532329 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdad01015b0 0x7fdad0195b30 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:48.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.508+0000 7fdad7b51700 1 -- 192.168.123.106:0/3127532329 >> 192.168.123.106:0/3127532329 conn(0x7fdad00faf00 msgr2=0x7fdad00fbbc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:48.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.508+0000 7fdad7b51700 1 -- 192.168.123.106:0/3127532329 shutdown_connections 2026-03-08T23:59:48.507 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:48.508+0000 7fdad7b51700 1 -- 192.168.123.106:0/3127532329 wait complete. 2026-03-08T23:59:48.508 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-08T23:59:49.371 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:49 vm03 ceph-mon[52346]: from='client.? 
192.168.123.106:0/3127532329' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-08T23:59:49.371 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:49 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:49.371 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:49 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:49.371 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:49 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:49.371 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:49 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:49.371 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:49 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:49.372 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:49 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:49.372 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:49 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:49.372 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:49 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:49.569 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-08T23:59:49.569 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json 2026-03-08T23:59:49.721 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:49.766 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:50.024 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.024+0000 7fa59ac2f700 1 -- 192.168.123.106:0/503678986 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa594100a80 msgr2=0x7fa594100ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:50.024 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.024+0000 7fa59ac2f700 1 --2- 192.168.123.106:0/503678986 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa594100a80 0x7fa594100ea0 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7fa584009b00 tx=0x7fa584009e10 comp rx=0 tx=0).stop 2026-03-08T23:59:50.024 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.025+0000 7fa59ac2f700 1 -- 192.168.123.106:0/503678986 shutdown_connections 2026-03-08T23:59:50.024 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.025+0000 7fa59ac2f700 1 --2- 192.168.123.106:0/503678986 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa594100a80 0x7fa594100ea0 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:50.024 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.025+0000 7fa59ac2f700 1 -- 192.168.123.106:0/503678986 >> 192.168.123.106:0/503678986 conn(0x7fa5940fc000 msgr2=0x7fa5940fe460 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:50.024 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.025+0000 7fa59ac2f700 1 -- 192.168.123.106:0/503678986 shutdown_connections 2026-03-08T23:59:50.024 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.025+0000 7fa59ac2f700 1 -- 192.168.123.106:0/503678986 wait complete. 2026-03-08T23:59:50.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.025+0000 7fa59ac2f700 1 Processor -- start 2026-03-08T23:59:50.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.026+0000 7fa59ac2f700 1 -- start start 2026-03-08T23:59:50.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.026+0000 7fa59ac2f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa594100a80 0x7fa594197dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:50.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.026+0000 7fa59ac2f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa594198300 con 0x7fa594100a80 2026-03-08T23:59:50.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.026+0000 7fa5989cb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa594100a80 0x7fa594197dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:50.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.026+0000 7fa5989cb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa594100a80 0x7fa594197dc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:52816/0 (socket says 192.168.123.106:52816) 2026-03-08T23:59:50.025 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.026+0000 7fa5989cb700 1 -- 192.168.123.106:0/3648466779 learned_addr learned my addr 192.168.123.106:0/3648466779 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-08T23:59:50.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.027+0000 7fa5989cb700 1 -- 192.168.123.106:0/3648466779 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa5840097e0 con 0x7fa594100a80 2026-03-08T23:59:50.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.027+0000 7fa5989cb700 1 --2- 192.168.123.106:0/3648466779 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa594100a80 0x7fa594197dc0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7fa584004750 tx=0x7fa584005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:50.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.027+0000 7fa591ffb700 1 -- 192.168.123.106:0/3648466779 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa58401c070 con 0x7fa594100a80 2026-03-08T23:59:50.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.027+0000 7fa591ffb700 1 -- 192.168.123.106:0/3648466779 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa584021470 con 0x7fa594100a80 2026-03-08T23:59:50.028 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.027+0000 7fa59ac2f700 1 -- 192.168.123.106:0/3648466779 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa594198500 con 0x7fa594100a80 2026-03-08T23:59:50.028 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.027+0000 7fa591ffb700 1 -- 192.168.123.106:0/3648466779 <== mon.0 
v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa58400f460 con 0x7fa594100a80 2026-03-08T23:59:50.028 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.027+0000 7fa59ac2f700 1 -- 192.168.123.106:0/3648466779 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa5941989a0 con 0x7fa594100a80 2026-03-08T23:59:50.028 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.028+0000 7fa591ffb700 1 -- 192.168.123.106:0/3648466779 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7fa584021ac0 con 0x7fa594100a80 2026-03-08T23:59:50.028 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.029+0000 7fa59ac2f700 1 -- 192.168.123.106:0/3648466779 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa594191c10 con 0x7fa594100a80 2026-03-08T23:59:50.028 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.029+0000 7fa591ffb700 1 --2- 192.168.123.106:0/3648466779 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa57c038440 0x7fa57c03a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:50.028 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.029+0000 7fa591ffb700 1 -- 192.168.123.106:0/3648466779 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fa58404c3b0 con 0x7fa594100a80 2026-03-08T23:59:50.028 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.029+0000 7fa593fff700 1 --2- 192.168.123.106:0/3648466779 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa57c038440 0x7fa57c03a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:50.029 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.030+0000 7fa593fff700 1 --2- 192.168.123.106:0/3648466779 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa57c038440 0x7fa57c03a900 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fa588006fd0 tx=0x7fa588006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:50.031 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.032+0000 7fa591ffb700 1 -- 192.168.123.106:0/3648466779 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fa58400f5c0 con 0x7fa594100a80 2026-03-08T23:59:50.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.184+0000 7fa59ac2f700 1 -- 192.168.123.106:0/3648466779 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fa5940623c0 con 0x7fa594100a80 2026-03-08T23:59:50.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.185+0000 7fa591ffb700 1 -- 192.168.123.106:0/3648466779 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fa584030300 con 0x7fa594100a80 2026-03-08T23:59:50.184 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T23:59:50.185 
INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-08T23:59:50.187 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.188+0000 7fa59ac2f700 1 -- 192.168.123.106:0/3648466779 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa57c038440 msgr2=0x7fa57c03a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:50.187 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.188+0000 7fa59ac2f700 1 --2- 192.168.123.106:0/3648466779 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa57c038440 0x7fa57c03a900 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fa588006fd0 tx=0x7fa588006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:50.187 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.188+0000 7fa59ac2f700 1 -- 192.168.123.106:0/3648466779 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa594100a80 msgr2=0x7fa594197dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:50.187 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.188+0000 7fa59ac2f700 1 --2- 192.168.123.106:0/3648466779 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa594100a80 0x7fa594197dc0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7fa584004750 tx=0x7fa584005dc0 comp rx=0 tx=0).stop 2026-03-08T23:59:50.187 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.188+0000 7fa59ac2f700 1 -- 192.168.123.106:0/3648466779 shutdown_connections 2026-03-08T23:59:50.187 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.188+0000 7fa59ac2f700 1 --2- 192.168.123.106:0/3648466779 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa57c038440 0x7fa57c03a900 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:50.187 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.188+0000 7fa59ac2f700 1 --2- 192.168.123.106:0/3648466779 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa594100a80 0x7fa594197dc0 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:50.187 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.188+0000 7fa59ac2f700 1 -- 192.168.123.106:0/3648466779 >> 192.168.123.106:0/3648466779 conn(0x7fa5940fc000 msgr2=0x7fa5940fcce0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:50.187 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.188+0000 7fa59ac2f700 1 -- 192.168.123.106:0/3648466779 shutdown_connections 2026-03-08T23:59:50.187 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:50.188+0000 7fa59ac2f700 1 -- 192.168.123.106:0/3648466779 wait complete. 
2026-03-08T23:59:50.188 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-08T23:59:50.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:50 vm03 ceph-mon[52346]: Deploying daemon prometheus.vm03 on vm03 2026-03-08T23:59:50.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:50 vm03 ceph-mon[52346]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-08T23:59:50.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:50 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/3648466779' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-08T23:59:51.235 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-08T23:59:51.235 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json 2026-03-08T23:59:51.385 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:51.424 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:51.527 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:51 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:51.701 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.701+0000 7fc5265f5700 1 -- 192.168.123.106:0/1044249769 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc520100a60 msgr2=0x7fc520100e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:51.701 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.701+0000 7fc5265f5700 1 --2- 192.168.123.106:0/1044249769 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc520100a60 0x7fc520100e80 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7fc514009b00 tx=0x7fc514009e10 comp rx=0 tx=0).stop 2026-03-08T23:59:51.701 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.702+0000 7fc5265f5700 1 -- 192.168.123.106:0/1044249769 shutdown_connections 2026-03-08T23:59:51.701 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.702+0000 7fc5265f5700 1 --2- 192.168.123.106:0/1044249769 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc520100a60 0x7fc520100e80 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:51.701 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.702+0000 7fc5265f5700 1 -- 192.168.123.106:0/1044249769 >> 192.168.123.106:0/1044249769 conn(0x7fc5200fc000 msgr2=0x7fc5200fe440 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:51.701 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.702+0000 7fc5265f5700 1 -- 192.168.123.106:0/1044249769 shutdown_connections 2026-03-08T23:59:51.701 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.702+0000 7fc5265f5700 1 -- 192.168.123.106:0/1044249769 wait complete. 
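Every throwaway CLI client is identifiable by the nonce in its messenger address (for example 192.168.123.106:0/3648466779 above), and the same nonce reappears on the monitor side in the journalctl audit entry from='client.? 192.168.123.106:0/3648466779' ... cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch. That makes it possible to match each client-side attempt to its server-side dispatch. A correlation sketch, assuming wrapped journal entries have first been rejoined onto single lines (several of the audit lines above wrap between the client.? tag and the address):

    import re
    from collections import defaultdict

    # Client side: "learned_addr learned my addr 192.168.123.106:0/3648466779 ..."
    client = re.compile(r"learned_addr learned my addr (\d+\.\d+\.\d+\.\d+:0/\d+)")
    # Mon side:    "from='client.? 192.168.123.106:0/3648466779' ... cmd=[...]: dispatch"
    audit = re.compile(r"from='client\.\?\s+(\d+\.\d+\.\d+\.\d+:0/\d+)'.*cmd=(\[.*?\]): dispatch")

    def correlate(lines):
        seen = defaultdict(dict)
        for line in lines:
            if m := client.search(line):
                seen[m.group(1)]["client"] = line
            elif m := audit.search(line):
                seen[m.group(1)]["mon"] = m.group(2)
        # Keep only nonces observed on both sides.
        return {addr: v for addr, v in seen.items()
                if "client" in v and "mon" in v}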
2026-03-08T23:59:51.702 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.702+0000 7fc5265f5700 1 Processor -- start 2026-03-08T23:59:51.702 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.703+0000 7fc5265f5700 1 -- start start 2026-03-08T23:59:51.702 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.703+0000 7fc5265f5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc52019e580 0x7fc52019e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:51.702 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.703+0000 7fc5265f5700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc52019eee0 con 0x7fc52019e580 2026-03-08T23:59:51.702 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.703+0000 7fc51ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc52019e580 0x7fc52019e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:51.703 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.703+0000 7fc51ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc52019e580 0x7fc52019e9a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:52840/0 (socket says 192.168.123.106:52840) 2026-03-08T23:59:51.703 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.703+0000 7fc51ffff700 1 -- 192.168.123.106:0/2913890757 learned_addr learned my addr 192.168.123.106:0/2913890757 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-08T23:59:51.703 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.704+0000 7fc51ffff700 1 -- 192.168.123.106:0/2913890757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc5140097e0 con 0x7fc52019e580 2026-03-08T23:59:51.703 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.704+0000 7fc51ffff700 1 --2- 192.168.123.106:0/2913890757 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc52019e580 0x7fc52019e9a0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fc514009fd0 tx=0x7fc514005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:51.703 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.704+0000 7fc51d7fa700 1 -- 192.168.123.106:0/2913890757 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc51401c070 con 0x7fc52019e580 2026-03-08T23:59:51.704 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.704+0000 7fc5265f5700 1 -- 192.168.123.106:0/2913890757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc52019f0e0 con 0x7fc52019e580 2026-03-08T23:59:51.704 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.704+0000 7fc5265f5700 1 -- 192.168.123.106:0/2913890757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc5201a1d30 con 0x7fc52019e580 2026-03-08T23:59:51.704 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.705+0000 7fc51d7fa700 1 -- 192.168.123.106:0/2913890757 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc51400b810 con 0x7fc52019e580 
2026-03-08T23:59:51.704 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.705+0000 7fc51d7fa700 1 -- 192.168.123.106:0/2913890757 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc51400f460 con 0x7fc52019e580 2026-03-08T23:59:51.705 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.706+0000 7fc51d7fa700 1 -- 192.168.123.106:0/2913890757 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7fc51400f680 con 0x7fc52019e580 2026-03-08T23:59:51.705 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.706+0000 7fc51d7fa700 1 --2- 192.168.123.106:0/2913890757 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc5080383f0 0x7fc50803a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:51.705 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.706+0000 7fc51d7fa700 1 -- 192.168.123.106:0/2913890757 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fc51404d2b0 con 0x7fc52019e580 2026-03-08T23:59:51.705 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.706+0000 7fc51f7fe700 1 --2- 192.168.123.106:0/2913890757 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc5080383f0 0x7fc50803a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:51.706 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.707+0000 7fc51f7fe700 1 --2- 192.168.123.106:0/2913890757 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc5080383f0 0x7fc50803a8b0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fc510006fd0 tx=0x7fc510006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:51.706 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.707+0000 7fc5265f5700 1 -- 192.168.123.106:0/2913890757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc50c005320 con 0x7fc52019e580 2026-03-08T23:59:51.709 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.710+0000 7fc51d7fa700 1 -- 192.168.123.106:0/2913890757 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fc514026070 con 0x7fc52019e580 2026-03-08T23:59:51.890 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.890+0000 7fc5265f5700 1 -- 192.168.123.106:0/2913890757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fc50c005190 con 0x7fc52019e580 2026-03-08T23:59:51.890 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.891+0000 7fc51d7fa700 1 -- 192.168.123.106:0/2913890757 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fc5140209c0 con 0x7fc52019e580 2026-03-08T23:59:51.891 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T23:59:51.891 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-08T23:59:51.893 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.894+0000 7fc5265f5700 1 -- 192.168.123.106:0/2913890757 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc5080383f0 msgr2=0x7fc50803a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:51.893 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.894+0000 7fc5265f5700 1 --2- 192.168.123.106:0/2913890757 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc5080383f0 0x7fc50803a8b0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fc510006fd0 tx=0x7fc510006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:51.894 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.894+0000 7fc5265f5700 1 -- 192.168.123.106:0/2913890757 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc52019e580 msgr2=0x7fc52019e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:51.894 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.895+0000 7fc5265f5700 1 --2- 192.168.123.106:0/2913890757 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc52019e580 0x7fc52019e9a0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fc514009fd0 tx=0x7fc514005e70 comp rx=0 tx=0).stop 2026-03-08T23:59:51.894 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.895+0000 7fc5265f5700 1 -- 192.168.123.106:0/2913890757 shutdown_connections 2026-03-08T23:59:51.894 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.895+0000 7fc5265f5700 1 --2- 192.168.123.106:0/2913890757 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc5080383f0 0x7fc50803a8b0 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:51.894 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.895+0000 7fc5265f5700 1 --2- 192.168.123.106:0/2913890757 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc52019e580 0x7fc52019e9a0 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:51.894 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.895+0000 7fc5265f5700 1 -- 192.168.123.106:0/2913890757 >> 192.168.123.106:0/2913890757 conn(0x7fc5200fc000 msgr2=0x7fc5200fccc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:51.894 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.895+0000 7fc5265f5700 1 -- 192.168.123.106:0/2913890757 shutdown_connections 2026-03-08T23:59:51.895 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:51.896+0000 7fc5265f5700 1 -- 192.168.123.106:0/2913890757 wait complete. 2026-03-08T23:59:51.896 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-08T23:59:52.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:52 vm03 ceph-mon[52346]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-08T23:59:52.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:52 vm03 ceph-mon[52346]: from='client.? 
192.168.123.106:0/2913890757' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-08T23:59:52.966 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-08T23:59:52.967 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json 2026-03-08T23:59:53.109 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:53.146 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:53.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.421+0000 7f65cca96700 1 -- 192.168.123.106:0/3446258462 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f65c8100a80 msgr2=0x7f65c8100ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:53.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.421+0000 7f65cca96700 1 --2- 192.168.123.106:0/3446258462 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f65c8100a80 0x7f65c8100ea0 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f65b8009b00 tx=0x7f65b8009e10 comp rx=0 tx=0).stop 2026-03-08T23:59:53.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.422+0000 7f65cca96700 1 -- 192.168.123.106:0/3446258462 shutdown_connections 2026-03-08T23:59:53.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.422+0000 7f65cca96700 1 --2- 192.168.123.106:0/3446258462 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f65c8100a80 0x7f65c8100ea0 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:53.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.422+0000 7f65cca96700 1 -- 192.168.123.106:0/3446258462 >> 192.168.123.106:0/3446258462 conn(0x7f65c80fc000 msgr2=0x7f65c80fe460 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:53.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.422+0000 7f65cca96700 1 -- 192.168.123.106:0/3446258462 shutdown_connections 2026-03-08T23:59:53.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.422+0000 7f65cca96700 1 -- 192.168.123.106:0/3446258462 wait complete. 
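The cycle above (cephadm shell -> ceph mon dump -f json -> "Waiting for 2 mons in monmap...") repeats below until the monmap lists both hosts. A minimal sketch of that poll loop, under stated assumptions: wait_for_mons and run_cephadm_shell are illustrative names of ours, not teuthology's actual implementation.

    import json
    import subprocess
    import time

    def run_cephadm_shell(image, fsid, args):
        # Mirrors the DEBUG line above:
        # cephadm --image <image> shell -c <conf> -k <keyring> --fsid <fsid> -- <cmd>
        cmd = [
            "sudo", "/home/ubuntu/cephtest/cephadm", "--image", image, "shell",
            "-c", "/etc/ceph/ceph.conf",
            "-k", "/etc/ceph/ceph.client.admin.keyring",
            "--fsid", fsid, "--",
        ] + args
        return subprocess.check_output(cmd, text=True)

    def wait_for_mons(image, fsid, want, interval=1.0):
        # Poll "ceph mon dump -f json" until the monmap lists `want` mons.
        while True:
            out = run_cephadm_shell(image, fsid, ["ceph", "mon", "dump", "-f", "json"])
            monmap = json.loads(out)
            if len(monmap.get("mons", [])) >= want:
                return monmap
            time.sleep(interval)

For this run that would be wait_for_mons("quay.io/ceph/ceph:v18.2.1", "ae8f0172-1b4a-11f1-916a-712b2ac006b7", want=2); every dump so far shows a single mon (vm03), so the task keeps polling.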
2026-03-08T23:59:53.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.423+0000 7f65cca96700 1 Processor -- start 2026-03-08T23:59:53.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.423+0000 7f65cca96700 1 -- start start 2026-03-08T23:59:53.422 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.423+0000 7f65cca96700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f65c8100a80 0x7f65c8197da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:53.423 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.423+0000 7f65cca96700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f65c81982e0 con 0x7f65c8100a80 2026-03-08T23:59:53.423 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.424+0000 7f65c659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f65c8100a80 0x7f65c8197da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:53.423 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.424+0000 7f65c659c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f65c8100a80 0x7f65c8197da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:52860/0 (socket says 192.168.123.106:52860) 2026-03-08T23:59:53.423 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.424+0000 7f65c659c700 1 -- 192.168.123.106:0/456962708 learned_addr learned my addr 192.168.123.106:0/456962708 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-08T23:59:53.423 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.424+0000 7f65c659c700 1 -- 192.168.123.106:0/456962708 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f65b80097e0 con 0x7f65c8100a80 2026-03-08T23:59:53.424 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.425+0000 7f65c659c700 1 --2- 192.168.123.106:0/456962708 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f65c8100a80 0x7f65c8197da0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f65b8004750 tx=0x7f65b8005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:53.424 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.425+0000 7f65bf7fe700 1 -- 192.168.123.106:0/456962708 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f65b801c070 con 0x7f65c8100a80 2026-03-08T23:59:53.424 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.425+0000 7f65cca96700 1 -- 192.168.123.106:0/456962708 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f65c81984e0 con 0x7f65c8100a80 2026-03-08T23:59:53.424 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.425+0000 7f65cca96700 1 -- 192.168.123.106:0/456962708 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f65c8198980 con 0x7f65c8100a80 2026-03-08T23:59:53.425 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.426+0000 7f65bf7fe700 1 -- 192.168.123.106:0/456962708 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f65b8021470 con 0x7f65c8100a80 2026-03-08T23:59:53.426 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.426+0000 7f65bf7fe700 1 -- 192.168.123.106:0/456962708 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f65b800f460 con 0x7f65c8100a80 2026-03-08T23:59:53.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.426+0000 7f65bf7fe700 1 -- 192.168.123.106:0/456962708 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f65b800f600 con 0x7f65c8100a80 2026-03-08T23:59:53.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.426+0000 7f65cca96700 1 -- 192.168.123.106:0/456962708 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f65c8191c10 con 0x7f65c8100a80 2026-03-08T23:59:53.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.426+0000 7f65bf7fe700 1 --2- 192.168.123.106:0/456962708 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f65a8038440 0x7f65a803a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:53.426 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.427+0000 7f65bf7fe700 1 -- 192.168.123.106:0/456962708 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f65b804d450 con 0x7f65c8100a80 2026-03-08T23:59:53.428 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.429+0000 7f65bffff700 1 --2- 192.168.123.106:0/456962708 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f65a8038440 0x7f65a803a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:53.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.430+0000 7f65bf7fe700 1 -- 192.168.123.106:0/456962708 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f65b8029950 con 0x7f65c8100a80 2026-03-08T23:59:53.429 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.430+0000 7f65bffff700 1 --2- 192.168.123.106:0/456962708 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f65a8038440 0x7f65a803a900 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f65b0006fd0 tx=0x7f65b0006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:53.576 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.577+0000 7f65cca96700 1 -- 192.168.123.106:0/456962708 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f65c80623c0 con 0x7f65c8100a80 2026-03-08T23:59:53.577 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.577+0000 7f65bf7fe700 1 -- 192.168.123.106:0/456962708 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f65b8030300 con 0x7f65c8100a80 2026-03-08T23:59:53.578 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T23:59:53.578 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-08T23:59:53.580 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.581+0000 7f65cca96700 1 -- 192.168.123.106:0/456962708 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f65a8038440 msgr2=0x7f65a803a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:53.580 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.581+0000 7f65cca96700 1 --2- 192.168.123.106:0/456962708 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f65a8038440 0x7f65a803a900 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f65b0006fd0 tx=0x7f65b0006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:53.580 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.581+0000 7f65cca96700 1 -- 192.168.123.106:0/456962708 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f65c8100a80 msgr2=0x7f65c8197da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:53.580 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.581+0000 7f65cca96700 1 --2- 192.168.123.106:0/456962708 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f65c8100a80 0x7f65c8197da0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f65b8004750 tx=0x7f65b8005dc0 comp rx=0 tx=0).stop 2026-03-08T23:59:53.580 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.581+0000 7f65cca96700 1 -- 192.168.123.106:0/456962708 shutdown_connections 2026-03-08T23:59:53.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.582+0000 7f65cca96700 1 --2- 192.168.123.106:0/456962708 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f65a8038440 0x7f65a803a900 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:53.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.582+0000 7f65cca96700 1 --2- 192.168.123.106:0/456962708 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f65c8100a80 0x7f65c8197da0 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:53.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.582+0000 7f65cca96700 1 -- 192.168.123.106:0/456962708 >> 192.168.123.106:0/456962708 conn(0x7f65c80fc000 msgr2=0x7f65c80fe460 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:53.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.582+0000 7f65cca96700 1 -- 192.168.123.106:0/456962708 shutdown_connections 2026-03-08T23:59:53.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:53.582+0000 7f65cca96700 1 -- 192.168.123.106:0/456962708 wait complete. 2026-03-08T23:59:53.582 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-08T23:59:54.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:54 vm03 ceph-mon[52346]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-08T23:59:54.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:54 vm03 ceph-mon[52346]: from='client.? 
192.168.123.106:0/456962708' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-08T23:59:54.642 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-08T23:59:54.642 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json 2026-03-08T23:59:54.788 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:54.825 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:55.090 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.090+0000 7fcc7687c700 1 -- 192.168.123.106:0/2424898832 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc701014f0 msgr2=0x7fcc701038e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:55.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.090+0000 7fcc7687c700 1 --2- 192.168.123.106:0/2424898832 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc701014f0 0x7fcc701038e0 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7fcc58009b00 tx=0x7fcc58009e10 comp rx=0 tx=0).stop 2026-03-08T23:59:55.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.091+0000 7fcc7687c700 1 -- 192.168.123.106:0/2424898832 shutdown_connections 2026-03-08T23:59:55.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.091+0000 7fcc7687c700 1 --2- 192.168.123.106:0/2424898832 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc701014f0 0x7fcc701038e0 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:55.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.091+0000 7fcc7687c700 1 -- 192.168.123.106:0/2424898832 >> 192.168.123.106:0/2424898832 conn(0x7fcc700faf00 msgr2=0x7fcc700fd340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:55.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.091+0000 7fcc7687c700 1 -- 192.168.123.106:0/2424898832 shutdown_connections 2026-03-08T23:59:55.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.092+0000 7fcc7687c700 1 -- 192.168.123.106:0/2424898832 wait complete. 
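Each shell invocation opens a fresh msgr2 client connection, visible in the stderr lines: connect, BANNER_CONNECTING (_handle_peer_banner_payload), HELLO_CONNECTING (handle_hello / learned_addr), then a secure READY session, followed by mon_subscribe for config, monmap, mgrmap, and osdmap. A small sketch that recovers these per-connection state transitions from such a log; the regex is ours, tuned to the "conn(... s=STATE ...)" fragments above, and is not a teuthology tool.

    import re
    import sys
    from collections import defaultdict

    # Matches e.g. "conn(0x7f65c8100a80 0x7f65c8197da0 unknown :-1 s=BANNER_CONNECTING ..."
    STATE_RE = re.compile(r"conn\((0x[0-9a-f]+).*?s=([A-Z_]+)")

    def trace_states(lines):
        seen = defaultdict(list)
        for line in lines:
            m = STATE_RE.search(line)
            if m:
                conn, state = m.groups()
                # Record only changes, so repeated READY lines collapse to one entry.
                if not seen[conn] or seen[conn][-1] != state:
                    seen[conn].append(state)
        return seen

    if __name__ == "__main__":
        for conn, states in trace_states(sys.stdin).items():
            print(conn, " -> ".join(states))

Fed this log on stdin, a mon connection would print something like NONE -> BANNER_CONNECTING -> HELLO_CONNECTING -> READY -> CLOSED.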
2026-03-08T23:59:55.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.092+0000 7fcc7687c700 1 Processor -- start 2026-03-08T23:59:55.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.092+0000 7fcc7687c700 1 -- start start 2026-03-08T23:59:55.091 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.092+0000 7fcc7687c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc701014f0 0x7fcc70195b50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:55.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.092+0000 7fcc7687c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcc70196090 con 0x7fcc701014f0 2026-03-08T23:59:55.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.093+0000 7fcc6ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc701014f0 0x7fcc70195b50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:55.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.093+0000 7fcc6ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc701014f0 0x7fcc70195b50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:52882/0 (socket says 192.168.123.106:52882) 2026-03-08T23:59:55.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.093+0000 7fcc6ffff700 1 -- 192.168.123.106:0/1142530939 learned_addr learned my addr 192.168.123.106:0/1142530939 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-08T23:59:55.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.093+0000 7fcc6ffff700 1 -- 192.168.123.106:0/1142530939 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcc580097e0 con 0x7fcc701014f0 2026-03-08T23:59:55.092 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.093+0000 7fcc6ffff700 1 --2- 192.168.123.106:0/1142530939 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc701014f0 0x7fcc70195b50 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7fcc58004f40 tx=0x7fcc58005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:55.093 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.093+0000 7fcc6d7fa700 1 -- 192.168.123.106:0/1142530939 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fcc5801c070 con 0x7fcc701014f0 2026-03-08T23:59:55.093 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.094+0000 7fcc7687c700 1 -- 192.168.123.106:0/1142530939 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcc70196290 con 0x7fcc701014f0 2026-03-08T23:59:55.093 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.094+0000 7fcc7687c700 1 -- 192.168.123.106:0/1142530939 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcc70196730 con 0x7fcc701014f0 2026-03-08T23:59:55.094 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.094+0000 7fcc6d7fa700 1 -- 192.168.123.106:0/1142530939 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcc580053b0 con 0x7fcc701014f0 
2026-03-08T23:59:55.094 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.094+0000 7fcc6d7fa700 1 -- 192.168.123.106:0/1142530939 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fcc5800f590 con 0x7fcc701014f0 2026-03-08T23:59:55.094 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.095+0000 7fcc6d7fa700 1 -- 192.168.123.106:0/1142530939 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7fcc58021a50 con 0x7fcc701014f0 2026-03-08T23:59:55.094 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.095+0000 7fcc6d7fa700 1 --2- 192.168.123.106:0/1142530939 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fcc5c038440 0x7fcc5c03a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:55.094 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.095+0000 7fcc6d7fa700 1 -- 192.168.123.106:0/1142530939 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fcc5804c2c0 con 0x7fcc701014f0 2026-03-08T23:59:55.094 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.095+0000 7fcc6f7fe700 1 --2- 192.168.123.106:0/1142530939 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fcc5c038440 0x7fcc5c03a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:55.094 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.095+0000 7fcc7687c700 1 -- 192.168.123.106:0/1142530939 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcc50005320 con 0x7fcc701014f0 2026-03-08T23:59:55.095 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.096+0000 7fcc6f7fe700 1 --2- 192.168.123.106:0/1142530939 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fcc5c038440 0x7fcc5c03a900 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fcc60006fd0 tx=0x7fcc60006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:55.098 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.099+0000 7fcc6d7fa700 1 -- 192.168.123.106:0/1142530939 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fcc58026070 con 0x7fcc701014f0 2026-03-08T23:59:55.246 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.246+0000 7fcc7687c700 1 -- 192.168.123.106:0/1142530939 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fcc50005190 con 0x7fcc701014f0 2026-03-08T23:59:55.246 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.247+0000 7fcc6d7fa700 1 -- 192.168.123.106:0/1142530939 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fcc5800f6f0 con 0x7fcc701014f0 2026-03-08T23:59:55.247 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T23:59:55.247 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-08T23:59:55.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.250+0000 7fcc7687c700 1 -- 192.168.123.106:0/1142530939 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fcc5c038440 msgr2=0x7fcc5c03a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:55.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.250+0000 7fcc7687c700 1 --2- 192.168.123.106:0/1142530939 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fcc5c038440 0x7fcc5c03a900 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fcc60006fd0 tx=0x7fcc60006e40 comp rx=0 tx=0).stop 2026-03-08T23:59:55.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.250+0000 7fcc7687c700 1 -- 192.168.123.106:0/1142530939 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc701014f0 msgr2=0x7fcc70195b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:55.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.250+0000 7fcc7687c700 1 --2- 192.168.123.106:0/1142530939 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc701014f0 0x7fcc70195b50 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7fcc58004f40 tx=0x7fcc58005e70 comp rx=0 tx=0).stop 2026-03-08T23:59:55.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.250+0000 7fcc7687c700 1 -- 192.168.123.106:0/1142530939 shutdown_connections 2026-03-08T23:59:55.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.250+0000 7fcc7687c700 1 --2- 192.168.123.106:0/1142530939 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fcc5c038440 0x7fcc5c03a900 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:55.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.250+0000 7fcc7687c700 1 --2- 192.168.123.106:0/1142530939 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc701014f0 0x7fcc70195b50 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:55.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.250+0000 7fcc7687c700 1 -- 192.168.123.106:0/1142530939 >> 192.168.123.106:0/1142530939 conn(0x7fcc700faf00 msgr2=0x7fcc700fbbc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:55.249 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.250+0000 7fcc7687c700 1 -- 192.168.123.106:0/1142530939 shutdown_connections 2026-03-08T23:59:55.250 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:55.251+0000 7fcc7687c700 1 -- 192.168.123.106:0/1142530939 wait complete. 
2026-03-08T23:59:55.250 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-08T23:59:55.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:55 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:55.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:55 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:55.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:55 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-08T23:59:55.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:55 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch 2026-03-08T23:59:55.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:55 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/1142530939' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-08T23:59:56.297 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-08T23:59:56.297 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json 2026-03-08T23:59:56.444 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:56 vm03 ceph-mon[52346]: from='mgr.14164 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished 2026-03-08T23:59:56.444 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:56 vm03 ceph-mon[52346]: mgrmap e14: vm03.yvcons(active, since 29s) 2026-03-08T23:59:56.450 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:56.500 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:56.796 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.796+0000 7fead4d72700 1 -- 192.168.123.106:0/556756456 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fead0102ca0 msgr2=0x7fead01030c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:56.796 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.796+0000 7fead4d72700 1 --2- 192.168.123.106:0/556756456 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fead0102ca0 0x7fead01030c0 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7feab8009b00 tx=0x7feab8009e10 comp rx=0 tx=0).stop 2026-03-08T23:59:56.796 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.797+0000 7fead4d72700 1 -- 192.168.123.106:0/556756456 shutdown_connections 2026-03-08T23:59:56.796 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.797+0000 7fead4d72700 1 --2- 192.168.123.106:0/556756456 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fead0102ca0 0x7fead01030c0 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:56.796 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.797+0000 7fead4d72700 1 -- 192.168.123.106:0/556756456 >> 192.168.123.106:0/556756456 conn(0x7fead00fe220 msgr2=0x7fead0100680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:56.796 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.797+0000 7fead4d72700 1 -- 192.168.123.106:0/556756456 shutdown_connections 2026-03-08T23:59:56.797 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.797+0000 7fead4d72700 1 -- 192.168.123.106:0/556756456 wait complete. 2026-03-08T23:59:56.797 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.798+0000 7fead4d72700 1 Processor -- start 2026-03-08T23:59:56.797 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.798+0000 7fead4d72700 1 -- start start 2026-03-08T23:59:56.797 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.798+0000 7fead4d72700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fead0102ca0 0x7fead0197d60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:56.797 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.798+0000 7fead4d72700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fead01982a0 con 0x7fead0102ca0 2026-03-08T23:59:56.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.798+0000 7feace59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fead0102ca0 0x7fead0197d60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:56.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.798+0000 7feace59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fead0102ca0 0x7fead0197d60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:57774/0 (socket says 192.168.123.106:57774) 2026-03-08T23:59:56.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.798+0000 7feace59c700 1 -- 192.168.123.106:0/526432610 learned_addr learned my addr 192.168.123.106:0/526432610 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-08T23:59:56.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.799+0000 7feace59c700 1 -- 192.168.123.106:0/526432610 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feab80097e0 con 0x7fead0102ca0 2026-03-08T23:59:56.798 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.799+0000 7feace59c700 1 --2- 192.168.123.106:0/526432610 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fead0102ca0 0x7fead0197d60 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7feab8004750 tx=0x7feab8005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:56.800 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.799+0000 7feac77fe700 1 -- 192.168.123.106:0/526432610 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7feab801c070 con 0x7fead0102ca0 2026-03-08T23:59:56.800 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.799+0000 7feac77fe700 1 -- 192.168.123.106:0/526432610 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7feab8021470 con 0x7fead0102ca0 2026-03-08T23:59:56.800 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.799+0000 7fead4d72700 1 -- 192.168.123.106:0/526432610 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fead01984a0 con 0x7fead0102ca0 2026-03-08T23:59:56.800 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.799+0000 7fead4d72700 1 -- 192.168.123.106:0/526432610 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fead0198940 con 0x7fead0102ca0 2026-03-08T23:59:56.800 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.800+0000 7feac77fe700 1 -- 192.168.123.106:0/526432610 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7feab800f460 con 0x7fead0102ca0 2026-03-08T23:59:56.800 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.800+0000 7feac77fe700 1 -- 192.168.123.106:0/526432610 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 14) v1 ==== 45351+0+0 (secure 0 0 0) 0x7feab8005290 con 0x7fead0102ca0 2026-03-08T23:59:56.800 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.800+0000 7fead4d72700 1 -- 192.168.123.106:0/526432610 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7feab0005320 con 0x7fead0102ca0 2026-03-08T23:59:56.800 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.800+0000 7feac77fe700 1 --2- 192.168.123.106:0/526432610 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7feabc040ca0 0x7feabc043160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:56.800 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.801+0000 7feacdd9b700 1 -- 192.168.123.106:0/526432610 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7feabc040ca0 msgr2=0x7feabc043160 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-08T23:59:56.801 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.801+0000 7feac77fe700 1 -- 192.168.123.106:0/526432610 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7feab804c320 con 0x7fead0102ca0 2026-03-08T23:59:56.801 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.801+0000 7feacdd9b700 1 --2- 192.168.123.106:0/526432610 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7feabc040ca0 0x7feabc043160 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-08T23:59:56.803 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.804+0000 7feac77fe700 1 -- 192.168.123.106:0/526432610 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7feab8021ac0 con 0x7fead0102ca0 2026-03-08T23:59:56.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.954+0000 7fead4d72700 1 -- 192.168.123.106:0/526432610 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7feab0005190 con 0x7fead0102ca0 2026-03-08T23:59:56.953 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.954+0000 7feac77fe700 1 -- 192.168.123.106:0/526432610 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7feab8026030 con 0x7fead0102ca0 2026-03-08T23:59:56.954 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T23:59:56.954 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-08T23:59:56.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.957+0000 7fead4d72700 1 -- 192.168.123.106:0/526432610 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7feabc040ca0 msgr2=0x7feabc043160 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-08T23:59:56.956 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.957+0000 7fead4d72700 1 --2- 192.168.123.106:0/526432610 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7feabc040ca0 0x7feabc043160 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:56.957 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.957+0000 7fead4d72700 1 -- 192.168.123.106:0/526432610 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fead0102ca0 msgr2=0x7fead0197d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:56.957 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.957+0000 7fead4d72700 1 --2- 192.168.123.106:0/526432610 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fead0102ca0 0x7fead0197d60 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7feab8004750 tx=0x7feab8005dc0 comp rx=0 tx=0).stop 2026-03-08T23:59:56.957 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.957+0000 7fead4d72700 1 -- 192.168.123.106:0/526432610 shutdown_connections 2026-03-08T23:59:56.957 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.957+0000 7fead4d72700 1 --2- 192.168.123.106:0/526432610 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7feabc040ca0 0x7feabc043160 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:56.957 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.957+0000 7fead4d72700 1 --2- 192.168.123.106:0/526432610 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fead0102ca0 0x7fead0197d60 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:56.957 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.957+0000 7fead4d72700 1 -- 192.168.123.106:0/526432610 >> 192.168.123.106:0/526432610 conn(0x7fead00fe220 msgr2=0x7fead00fef00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:56.957 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.957+0000 7fead4d72700 1 -- 192.168.123.106:0/526432610 shutdown_connections 2026-03-08T23:59:56.958 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:56.957+0000 7fead4d72700 1 -- 192.168.123.106:0/526432610 wait complete. 2026-03-08T23:59:56.958 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-08T23:59:57.581 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:57 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/526432610' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-08T23:59:58.023 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-08T23:59:58.024 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json 2026-03-08T23:59:58.164 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:58.201 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:58.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.482+0000 7fbfaf555700 1 -- 192.168.123.106:0/2523333085 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfa81014f0 msgr2=0x7fbfa81038e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:58.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.482+0000 7fbfaf555700 1 --2- 192.168.123.106:0/2523333085 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfa81014f0 0x7fbfa81038e0 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7fbf9c009b00 tx=0x7fbf9c009e10 comp rx=0 tx=0).stop 2026-03-08T23:59:58.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.483+0000 7fbfaf555700 1 -- 192.168.123.106:0/2523333085 shutdown_connections 2026-03-08T23:59:58.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.483+0000 7fbfaf555700 1 --2- 192.168.123.106:0/2523333085 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfa81014f0 0x7fbfa81038e0 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:58.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.483+0000 7fbfaf555700 1 -- 192.168.123.106:0/2523333085 >> 192.168.123.106:0/2523333085 conn(0x7fbfa80faf00 msgr2=0x7fbfa80fd340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:58.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.483+0000 7fbfaf555700 1 -- 192.168.123.106:0/2523333085 shutdown_connections 2026-03-08T23:59:58.482 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.483+0000 7fbfaf555700 1 -- 192.168.123.106:0/2523333085 wait complete. 
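Interleaved with the polling, the journalctl capture from vm03 shows the mgr enabling its prometheus module (mgrmap e13 -> e14) and, a few seconds later in this log, the active mgr vm03.yvcons restarting into mgrmap e15. A hedged sketch of that enable-and-wait pattern: it uses the real "ceph mgr module enable" and "ceph mgr dump" commands, but the helper is our own invention, not how cephadm itself sequences this.

    import json
    import subprocess
    import time

    def ceph(*args):
        return subprocess.check_output(("ceph",) + args, text=True)

    def enable_module_and_wait(module, timeout=60.0):
        # Enabling a mgr module publishes a new mgrmap; wait for the epoch to advance.
        before = json.loads(ceph("mgr", "dump", "-f", "json"))["epoch"]
        ceph("mgr", "module", "enable", module)
        deadline = time.time() + timeout
        while time.time() < deadline:
            if json.loads(ceph("mgr", "dump", "-f", "json"))["epoch"] > before:
                return
            time.sleep(1)
        raise TimeoutError(f"mgrmap epoch did not advance after enabling {module}")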
2026-03-08T23:59:58.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.484+0000 7fbfaf555700 1 Processor -- start 2026-03-08T23:59:58.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.484+0000 7fbfaf555700 1 -- start start 2026-03-08T23:59:58.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.484+0000 7fbfaf555700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfa81014f0 0x7fbfa8100e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:58.483 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.484+0000 7fbfaf555700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbfa8101390 con 0x7fbfa81014f0 2026-03-08T23:59:58.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.484+0000 7fbfad2f1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfa81014f0 0x7fbfa8100e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-08T23:59:58.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.485+0000 7fbfad2f1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfa81014f0 0x7fbfa8100e50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:57790/0 (socket says 192.168.123.106:57790) 2026-03-08T23:59:58.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.485+0000 7fbfad2f1700 1 -- 192.168.123.106:0/302277534 learned_addr learned my addr 192.168.123.106:0/302277534 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-08T23:59:58.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.485+0000 7fbfad2f1700 1 -- 192.168.123.106:0/302277534 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbf9c0097e0 con 0x7fbfa81014f0 2026-03-08T23:59:58.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.485+0000 7fbfad2f1700 1 --2- 192.168.123.106:0/302277534 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfa81014f0 0x7fbfa8100e50 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fbf9c004f40 tx=0x7fbf9c005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-08T23:59:58.484 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.485+0000 7fbf9a7fc700 1 -- 192.168.123.106:0/302277534 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbf9c01c070 con 0x7fbfa81014f0 2026-03-08T23:59:58.485 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.485+0000 7fbf9a7fc700 1 -- 192.168.123.106:0/302277534 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbf9c0053b0 con 0x7fbfa81014f0 2026-03-08T23:59:58.485 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.485+0000 7fbfaf555700 1 -- 192.168.123.106:0/302277534 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbfa80ff4a0 con 0x7fbfa81014f0 2026-03-08T23:59:58.485 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.485+0000 7fbf9a7fc700 1 -- 192.168.123.106:0/302277534 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbf9c00f460 con 0x7fbfa81014f0 2026-03-08T23:59:58.485 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.486+0000 7fbfaf555700 1 -- 192.168.123.106:0/302277534 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbfa80ff940 con 0x7fbfa81014f0 2026-03-08T23:59:58.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.486+0000 7fbf9a7fc700 1 -- 192.168.123.106:0/302277534 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 14) v1 ==== 45351+0+0 (secure 0 0 0) 0x7fbf9c021470 con 0x7fbfa81014f0 2026-03-08T23:59:58.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.487+0000 7fbfaf555700 1 -- 192.168.123.106:0/302277534 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbfa804fa90 con 0x7fbfa81014f0 2026-03-08T23:59:58.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.487+0000 7fbf9a7fc700 1 --2- 192.168.123.106:0/302277534 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbf94038440 0x7fbf9403a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-08T23:59:58.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.487+0000 7fbf9a7fc700 1 -- 192.168.123.106:0/302277534 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fbf9c04c3a0 con 0x7fbfa81014f0 2026-03-08T23:59:58.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.487+0000 7fbfacaf0700 1 -- 192.168.123.106:0/302277534 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbf94038440 msgr2=0x7fbf9403a900 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.103:6800/2 2026-03-08T23:59:58.486 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.487+0000 7fbfacaf0700 1 --2- 192.168.123.106:0/302277534 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbf94038440 0x7fbf9403a900 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-08T23:59:58.489 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.489+0000 7fbf9a7fc700 1 -- 192.168.123.106:0/302277534 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fbf9c0297d0 con 0x7fbfa81014f0 2026-03-08T23:59:58.638 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.638+0000 7fbfaf555700 1 -- 192.168.123.106:0/302277534 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fbfa80623f0 con 0x7fbfa81014f0 2026-03-08T23:59:58.638 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.639+0000 7fbf9a7fc700 1 -- 192.168.123.106:0/302277534 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fbf9c026030 con 0x7fbfa81014f0 2026-03-08T23:59:58.639 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-08T23:59:58.639 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: 
":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-08T23:59:58.641 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.642+0000 7fbfaf555700 1 -- 192.168.123.106:0/302277534 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbf94038440 msgr2=0x7fbf9403a900 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-08T23:59:58.641 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.642+0000 7fbfaf555700 1 --2- 192.168.123.106:0/302277534 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbf94038440 0x7fbf9403a900 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:58.642 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.642+0000 7fbfaf555700 1 -- 192.168.123.106:0/302277534 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfa81014f0 msgr2=0x7fbfa8100e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-08T23:59:58.642 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.643+0000 7fbfaf555700 1 --2- 192.168.123.106:0/302277534 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfa81014f0 0x7fbfa8100e50 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fbf9c004f40 tx=0x7fbf9c005e70 comp rx=0 tx=0).stop 2026-03-08T23:59:58.642 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.643+0000 7fbfaf555700 1 -- 192.168.123.106:0/302277534 shutdown_connections 2026-03-08T23:59:58.642 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.643+0000 7fbfaf555700 1 --2- 192.168.123.106:0/302277534 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbf94038440 0x7fbf9403a900 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:58.642 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.643+0000 7fbfaf555700 1 --2- 192.168.123.106:0/302277534 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbfa81014f0 0x7fbfa8100e50 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-08T23:59:58.642 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.643+0000 7fbfaf555700 1 -- 192.168.123.106:0/302277534 >> 192.168.123.106:0/302277534 conn(0x7fbfa80faf00 msgr2=0x7fbfa80fbbc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-08T23:59:58.643 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.643+0000 7fbfaf555700 1 -- 192.168.123.106:0/302277534 shutdown_connections 2026-03-08T23:59:58.643 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-08T23:59:58.644+0000 7fbfaf555700 1 -- 192.168.123.106:0/302277534 wait complete. 2026-03-08T23:59:58.644 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-08T23:59:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 08 23:59:59 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/302277534' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-08T23:59:59.714 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-08T23:59:59.714 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json 2026-03-08T23:59:59.849 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-08T23:59:59.889 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-09T00:00:00.173 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.173+0000 7fc2000fa700 1 -- 192.168.123.106:0/279323140 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc1f8072340 msgr2=0x7fc1f8072760 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:00:00.173 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.173+0000 7fc2000fa700 1 --2- 192.168.123.106:0/279323140 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc1f8072340 0x7fc1f8072760 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7fc1f4009b00 tx=0x7fc1f4009e10 comp rx=0 tx=0).stop 2026-03-09T00:00:00.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.174+0000 7fc2000fa700 1 -- 192.168.123.106:0/279323140 shutdown_connections 2026-03-09T00:00:00.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.174+0000 7fc2000fa700 1 --2- 192.168.123.106:0/279323140 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc1f8072340 0x7fc1f8072760 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:00.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.174+0000 7fc2000fa700 1 -- 192.168.123.106:0/279323140 >> 192.168.123.106:0/279323140 conn(0x7fc1f806d800 msgr2=0x7fc1f806fc60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:00:00.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.175+0000 7fc2000fa700 1 -- 192.168.123.106:0/279323140 shutdown_connections 2026-03-09T00:00:00.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.175+0000 7fc2000fa700 1 -- 192.168.123.106:0/279323140 wait complete. 
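While the mgr was restarting, the client's attempts to reach it at v2:192.168.123.103:6800/2 failed ("reconnect failed", then "_fault waiting 0.200000"): the messenger backs off briefly and retries. An illustrative retry-with-delay in the same spirit; the doubling backoff policy here is ours, not Ceph's actual messenger logic.

    import socket
    import time

    def connect_with_retry(addr, port, first_delay=0.2, max_delay=5.0, attempts=10):
        delay = first_delay
        for _ in range(attempts):
            try:
                return socket.create_connection((addr, port), timeout=2.0)
            except OSError:
                # Comparable to the "_fault waiting 0.200000" lines above.
                time.sleep(delay)
                delay = min(delay * 2, max_delay)
        raise ConnectionError(f"could not reach {addr}:{port}")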
2026-03-09T00:00:00.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.175+0000 7fc2000fa700 1 Processor -- start 2026-03-09T00:00:00.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.175+0000 7fc2000fa700 1 -- start start 2026-03-09T00:00:00.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.175+0000 7fc2000fa700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc1f8072340 0x7fc1f81a4d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:00:00.175 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.176+0000 7fc2000fa700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc1f81a52d0 con 0x7fc1f8072340 2026-03-09T00:00:00.176 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.177+0000 7fc1fde96700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc1f8072340 0x7fc1f81a4d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:00:00.176 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.177+0000 7fc1fde96700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc1f8072340 0x7fc1f81a4d90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:57808/0 (socket says 192.168.123.106:57808) 2026-03-09T00:00:00.176 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.177+0000 7fc1fde96700 1 -- 192.168.123.106:0/3618548020 learned_addr learned my addr 192.168.123.106:0/3618548020 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T00:00:00.176 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.177+0000 7fc1fde96700 1 -- 192.168.123.106:0/3618548020 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc1f40097e0 con 0x7fc1f8072340 2026-03-09T00:00:00.176 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.177+0000 7fc1fde96700 1 --2- 192.168.123.106:0/3618548020 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc1f8072340 0x7fc1f81a4d90 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7fc1f400b5c0 tx=0x7fc1f4005470 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:00:00.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.177+0000 7fc1eeffd700 1 -- 192.168.123.106:0/3618548020 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc1f401d070 con 0x7fc1f8072340 2026-03-09T00:00:00.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.177+0000 7fc1eeffd700 1 -- 192.168.123.106:0/3618548020 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc1f4003b60 con 0x7fc1f8072340 2026-03-09T00:00:00.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.177+0000 7fc1eeffd700 1 -- 192.168.123.106:0/3618548020 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc1f400fca0 con 0x7fc1f8072340 2026-03-09T00:00:00.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.177+0000 7fc2000fa700 1 -- 192.168.123.106:0/3618548020 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc1f81a54d0 con 0x7fc1f8072340 
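The "INFO:journalctl@ceph.mon.vm03..." lines come from teuthology following each daemon's systemd journal and folding it into the run log. A rough sketch of such a capture, assuming cephadm's ceph-<fsid>@<daemon>.service unit naming; the unit format, helper name, and output prefix are assumptions, not taken from teuthology's source.

    import subprocess

    def follow_unit(fsid, daemon):
        # e.g. follow_unit("ae8f0172-1b4a-11f1-916a-712b2ac006b7", "mon.vm03")
        unit = f"ceph-{fsid}@{daemon}.service"
        proc = subprocess.Popen(
            ["sudo", "journalctl", "-f", "-u", unit, "-n", "0"],
            stdout=subprocess.PIPE, text=True,
        )
        for line in proc.stdout:
            print(f"journalctl@ceph.{daemon}:{line}", end="")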
2026-03-09T00:00:00.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.177+0000 7fc2000fa700 1 -- 192.168.123.106:0/3618548020 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc1f81a5970 con 0x7fc1f8072340
2026-03-09T00:00:00.178 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.179+0000 7fc2000fa700 1 -- 192.168.123.106:0/3618548020 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc1f819f190 con 0x7fc1f8072340
2026-03-09T00:00:00.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.182+0000 7fc1eeffd700 1 -- 192.168.123.106:0/3618548020 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 15) v1 ==== 45072+0+0 (secure 0 0 0) 0x7fc1f4003cd0 con 0x7fc1f8072340
2026-03-09T00:00:00.182 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.182+0000 7fc1eeffd700 1 -- 192.168.123.106:0/3618548020 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fc1f404bc30 con 0x7fc1f8072340
2026-03-09T00:00:00.184 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.185+0000 7fc1eeffd700 1 -- 192.168.123.106:0/3618548020 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fc1f4015920 con 0x7fc1f8072340
2026-03-09T00:00:00.367 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.367+0000 7fc2000fa700 1 -- 192.168.123.106:0/3618548020 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fc1f80623c0 con 0x7fc1f8072340
2026-03-09T00:00:00.368 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.369+0000 7fc1eeffd700 1 -- 192.168.123.106:0/3618548020 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fc1f4015350 con 0x7fc1f8072340
2026-03-09T00:00:00.368 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:00:00.368 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-09T00:00:00.371 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.372+0000 7fc1ecff9700 1 -- 192.168.123.106:0/3618548020 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc1f8072340 msgr2=0x7fc1f81a4d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:00.371 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.372+0000 7fc1ecff9700 1 --2- 192.168.123.106:0/3618548020 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc1f8072340 0x7fc1f81a4d90 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7fc1f400b5c0 tx=0x7fc1f4005470 comp rx=0 tx=0).stop
2026-03-09T00:00:00.371 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.372+0000 7fc1ecff9700 1 -- 192.168.123.106:0/3618548020 shutdown_connections
2026-03-09T00:00:00.371 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.372+0000 7fc1ecff9700 1 --2- 192.168.123.106:0/3618548020 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc1f8072340 0x7fc1f81a4d90 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:00.371 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.372+0000 7fc1ecff9700 1 -- 192.168.123.106:0/3618548020 >> 192.168.123.106:0/3618548020 conn(0x7fc1f806d800 msgr2=0x7fc1f806fc60 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:00:00.374 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.375+0000 7fc1ecff9700 1 -- 192.168.123.106:0/3618548020 shutdown_connections
2026-03-09T00:00:00.374 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:00.375+0000 7fc1ecff9700 1 -- 192.168.123.106:0/3618548020 wait complete.
2026-03-09T00:00:00.375 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1
2026-03-09T00:00:00.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:00 vm03 ceph-mon[52346]: Active manager daemon vm03.yvcons restarted
2026-03-09T00:00:00.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:00 vm03 ceph-mon[52346]: Activating manager daemon vm03.yvcons
2026-03-09T00:00:00.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:00 vm03 ceph-mon[52346]: osdmap e5: 0 total, 0 up, 0 in
2026-03-09T00:00:00.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:00 vm03 ceph-mon[52346]: mgrmap e15: vm03.yvcons(active, starting, since 0.00450976s)
2026-03-09T00:00:00.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:00 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch
2026-03-09T00:00:00.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:00 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr metadata", "who": "vm03.yvcons", "id": "vm03.yvcons"}]: dispatch
2026-03-09T00:00:00.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:00 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-09T00:00:00.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:00 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-09T00:00:00.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:00 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-09T00:00:00.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:00 vm03 ceph-mon[52346]: Manager daemon vm03.yvcons is now available
2026-03-09T00:00:00.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:00 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:00.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:00 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.yvcons/mirror_snapshot_schedule"}]: dispatch
2026-03-09T00:00:00.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:00 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:00:00.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:00 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:00:00.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:00 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:00:00.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:00 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.yvcons/trash_purge_schedule"}]: dispatch
2026-03-09T00:00:01.293 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:01 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/3618548020' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-09T00:00:01.293 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:01 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:01.293 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:01 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:01.293 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:01 vm03 ceph-mon[52346]: mgrmap e16: vm03.yvcons(active, since 1.00852s)
2026-03-09T00:00:01.424 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-09T00:00:01.425 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json
2026-03-09T00:00:01.589 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-09T00:00:01.625 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-09T00:00:01.985 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.986+0000 7f428d8b9700 1 -- 192.168.123.106:0/3023817673 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4288100a90 msgr2=0x7f4288100eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:01.985 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.986+0000 7f428d8b9700 1 --2- 192.168.123.106:0/3023817673 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4288100a90 0x7f4288100eb0 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f4278009b00 tx=0x7f4278009e10 comp rx=0 tx=0).stop
2026-03-09T00:00:01.987 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.987+0000 7f428d8b9700 1 -- 192.168.123.106:0/3023817673 shutdown_connections
2026-03-09T00:00:01.987 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.987+0000 7f428d8b9700 1 --2- 192.168.123.106:0/3023817673 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4288100a90 0x7f4288100eb0 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:01.987 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.987+0000 7f428d8b9700 1 -- 192.168.123.106:0/3023817673 >> 192.168.123.106:0/3023817673 conn(0x7f42880fc030 msgr2=0x7f42880fe470 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:00:01.990 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.990+0000 7f428d8b9700 1 -- 192.168.123.106:0/3023817673 shutdown_connections
2026-03-09T00:00:01.990 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.990+0000 7f428d8b9700 1 -- 192.168.123.106:0/3023817673 wait complete.
2026-03-09T00:00:01.990 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.991+0000 7f428d8b9700 1 Processor -- start
2026-03-09T00:00:01.990 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.991+0000 7f428d8b9700 1 -- start start
2026-03-09T00:00:01.990 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.991+0000 7f428d8b9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4288100a90 0x7f4288195b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:00:01.990 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.991+0000 7f428d8b9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f42881960c0 con 0x7f4288100a90
2026-03-09T00:00:01.991 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.992+0000 7f4286ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4288100a90 0x7f4288195b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:00:01.991 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.992+0000 7f4286ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4288100a90 0x7f4288195b80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:57838/0 (socket says 192.168.123.106:57838)
2026-03-09T00:00:01.991 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.992+0000 7f4286ffd700 1 -- 192.168.123.106:0/4049511324 learned_addr learned my addr 192.168.123.106:0/4049511324 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T00:00:01.991 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.992+0000 7f4286ffd700 1 -- 192.168.123.106:0/4049511324 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f42780097e0 con 0x7f4288100a90
2026-03-09T00:00:01.991 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.992+0000 7f4286ffd700 1 --2- 192.168.123.106:0/4049511324 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4288100a90 0x7f4288195b80 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f4278000c00 tx=0x7f4278004740 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:00:01.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.992+0000 7f428c8b7700 1 -- 192.168.123.106:0/4049511324 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f427801c070 con 0x7f4288100a90
2026-03-09T00:00:01.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.992+0000 7f428d8b9700 1 -- 192.168.123.106:0/4049511324 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f42881962c0 con 0x7f4288100a90
2026-03-09T00:00:01.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.992+0000 7f428d8b9700 1 -- 192.168.123.106:0/4049511324 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4288196760 con 0x7f4288100a90
2026-03-09T00:00:01.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.992+0000 7f428c8b7700 1 -- 192.168.123.106:0/4049511324 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f42780053b0 con 0x7f4288100a90
2026-03-09T00:00:01.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.992+0000 7f428c8b7700 1 -- 192.168.123.106:0/4049511324 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f427800f460 con 0x7f4288100a90
2026-03-09T00:00:01.993 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.993+0000 7f428d8b9700 1 -- 192.168.123.106:0/4049511324 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f428818f820 con 0x7f4288100a90
2026-03-09T00:00:01.998 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.998+0000 7f428c8b7700 1 -- 192.168.123.106:0/4049511324 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 16) v1 ==== 45199+0+0 (secure 0 0 0) 0x7f4278005520 con 0x7f4288100a90
2026-03-09T00:00:02.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.999+0000 7f428c8b7700 1 --2- 192.168.123.106:0/4049511324 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4270038330 0x7f427003a7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:00:02.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.999+0000 7f428c8b7700 1 -- 192.168.123.106:0/4049511324 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f427804ce20 con 0x7f4288100a90
2026-03-09T00:00:02.000 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:01.999+0000 7f428c8b7700 1 -- 192.168.123.106:0/4049511324 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f427804d2a0 con 0x7f4288100a90
2026-03-09T00:00:02.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:02.001+0000 7f42867fc700 1 --2- 192.168.123.106:0/4049511324 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4270038330 0x7f427003a7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:00:02.003 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:02.003+0000 7f42867fc700 1 --2- 192.168.123.106:0/4049511324 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4270038330 0x7f427003a7f0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f427c006fd0 tx=0x7f427c006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:00:02.157 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:02.158+0000 7f428d8b9700 1 -- 192.168.123.106:0/4049511324 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f42880623c0 con 0x7f4288100a90
2026-03-09T00:00:02.158 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:02.159+0000 7f428c8b7700 1 -- 192.168.123.106:0/4049511324 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f4278026020 con 0x7f4288100a90
2026-03-09T00:00:02.158 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:00:02.158 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-09T00:00:02.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:02.161+0000 7f428d8b9700 1 -- 192.168.123.106:0/4049511324 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4270038330 msgr2=0x7f427003a7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:02.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:02.161+0000 7f428d8b9700 1 --2- 192.168.123.106:0/4049511324 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4270038330 0x7f427003a7f0 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f427c006fd0 tx=0x7f427c006e40 comp rx=0 tx=0).stop
2026-03-09T00:00:02.160 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:02.161+0000 7f428d8b9700 1 -- 192.168.123.106:0/4049511324 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4288100a90 msgr2=0x7f4288195b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:02.161 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:02.161+0000 7f428d8b9700 1 --2- 192.168.123.106:0/4049511324 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4288100a90 0x7f4288195b80 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f4278000c00 tx=0x7f4278004740 comp rx=0 tx=0).stop
2026-03-09T00:00:02.161 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:02.161+0000 7f428d8b9700 1 -- 192.168.123.106:0/4049511324 shutdown_connections
2026-03-09T00:00:02.161 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:02.161+0000 7f428d8b9700 1 --2- 192.168.123.106:0/4049511324 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4270038330 0x7f427003a7f0 secure :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f427c006fd0 tx=0x7f427c006e40 comp rx=0 tx=0).stop
2026-03-09T00:00:02.161 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:02.161+0000 7f428d8b9700 1 --2- 192.168.123.106:0/4049511324 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4288100a90 0x7f4288195b80 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:02.161 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:02.161+0000 7f428d8b9700 1 -- 192.168.123.106:0/4049511324 >> 192.168.123.106:0/4049511324 conn(0x7f42880fc030 msgr2=0x7f42880fccf0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:00:02.161 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:02.162+0000 7f428d8b9700 1 -- 192.168.123.106:0/4049511324 shutdown_connections
2026-03-09T00:00:02.161 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:02.162+0000 7f428d8b9700 1 -- 192.168.123.106:0/4049511324 wait complete.
2026-03-09T00:00:02.161 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1
2026-03-09T00:00:02.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:02 vm03 ceph-mon[52346]: [09/Mar/2026:00:00:00] ENGINE Bus STARTING
2026-03-09T00:00:02.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:02 vm03 ceph-mon[52346]: [09/Mar/2026:00:00:00] ENGINE Serving on https://192.168.123.103:7150
2026-03-09T00:00:02.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:02 vm03 ceph-mon[52346]: [09/Mar/2026:00:00:00] ENGINE Serving on http://192.168.123.103:8765
2026-03-09T00:00:02.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:02 vm03 ceph-mon[52346]: [09/Mar/2026:00:00:00] ENGINE Bus STARTED
2026-03-09T00:00:02.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:02 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:02.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:02 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:02.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:02 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:02.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:02 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/4049511324' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-09T00:00:03.235 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-09T00:00:03.235 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json
2026-03-09T00:00:03.394 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf
2026-03-09T00:00:03.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:03 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:03.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:03 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:03.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:03 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch
2026-03-09T00:00:03.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:03 vm03 ceph-mon[52346]: mgrmap e17: vm03.yvcons(active, since 2s)
2026-03-09T00:00:03.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:03 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:03.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:03 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:03.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:03 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:03.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:03 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:03.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:03 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch
2026-03-09T00:00:03.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:03 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:03.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:03 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:00:03.684 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.684+0000 7ff72a1c9700 1 -- 192.168.123.106:0/3662074214 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff71c0a5f20 msgr2=0x7ff71c0a6340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:03.684 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.684+0000 7ff72a1c9700 1 --2- 192.168.123.106:0/3662074214 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff71c0a5f20 0x7ff71c0a6340 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7ff720007780 tx=0x7ff72000c050 comp rx=0 tx=0).stop
2026-03-09T00:00:03.684 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.684+0000 7ff72a1c9700 1 -- 192.168.123.106:0/3662074214 shutdown_connections
2026-03-09T00:00:03.684 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.684+0000 7ff72a1c9700 1 --2- 192.168.123.106:0/3662074214 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff71c0a5f20 0x7ff71c0a6340 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:03.684 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.684+0000 7ff72a1c9700 1 -- 192.168.123.106:0/3662074214 >> 192.168.123.106:0/3662074214 conn(0x7ff71c09fec0 msgr2=0x7ff71c0a2320 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:00:03.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.685+0000 7ff72a1c9700 1 -- 192.168.123.106:0/3662074214 shutdown_connections
2026-03-09T00:00:03.685 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.685+0000 7ff72a1c9700 1 -- 192.168.123.106:0/3662074214 wait complete.
2026-03-09T00:00:03.686 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.687+0000 7ff72a1c9700 1 Processor -- start
2026-03-09T00:00:03.686 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.687+0000 7ff72a1c9700 1 -- start start
2026-03-09T00:00:03.686 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.687+0000 7ff72a1c9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff71c0a5f20 0x7ff71c00ae70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:00:03.688 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.688+0000 7ff72a1c9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff71c00b3b0 con 0x7ff71c0a5f20
2026-03-09T00:00:03.688 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.688+0000 7ff7291c7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff71c0a5f20 0x7ff71c00ae70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:00:03.688 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.689+0000 7ff7291c7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff71c0a5f20 0x7ff71c00ae70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:57862/0 (socket says 192.168.123.106:57862)
2026-03-09T00:00:03.688 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.689+0000 7ff7291c7700 1 -- 192.168.123.106:0/2897324322 learned_addr learned my addr 192.168.123.106:0/2897324322 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T00:00:03.689 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.690+0000 7ff7291c7700 1 -- 192.168.123.106:0/2897324322 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff720007430 con 0x7ff71c0a5f20
2026-03-09T00:00:03.692 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.691+0000 7ff7291c7700 1 --2- 192.168.123.106:0/2897324322 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff71c0a5f20 0x7ff71c00ae70 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7ff720016040 tx=0x7ff720004730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:00:03.692 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.691+0000 7ff71a7fc700 1 -- 192.168.123.106:0/2897324322 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff72000f040 con 0x7ff71c0a5f20
2026-03-09T00:00:03.692 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.691+0000 7ff72a1c9700 1 -- 192.168.123.106:0/2897324322 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff71c00b5b0 con 0x7ff71c0a5f20
2026-03-09T00:00:03.692 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.691+0000 7ff72a1c9700 1 -- 192.168.123.106:0/2897324322 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff71c00ba50 con 0x7ff71c0a5f20
2026-03-09T00:00:03.692 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.692+0000 7ff71a7fc700 1 -- 192.168.123.106:0/2897324322 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff72000a8d0 con 0x7ff71c0a5f20
2026-03-09T00:00:03.692 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.692+0000 7ff71a7fc700 1 -- 192.168.123.106:0/2897324322 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff720013430 con 0x7ff71c0a5f20
2026-03-09T00:00:03.692 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.692+0000 7ff71a7fc700 1 -- 192.168.123.106:0/2897324322 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 17) v1 ==== 45397+0+0 (secure 0 0 0) 0x7ff720013590 con 0x7ff71c0a5f20
2026-03-09T00:00:03.692 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.692+0000 7ff71a7fc700 1 --2- 192.168.123.106:0/2897324322 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff710038580 0x7ff71003aa40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:00:03.692 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.692+0000 7ff71a7fc700 1 -- 192.168.123.106:0/2897324322 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7ff72004cfb0 con 0x7ff71c0a5f20
2026-03-09T00:00:03.692 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.693+0000 7ff7289c6700 1 --2- 192.168.123.106:0/2897324322 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff710038580 0x7ff71003aa40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:00:03.693 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.694+0000 7ff72a1c9700 1 -- 192.168.123.106:0/2897324322 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff708005320 con 0x7ff71c0a5f20
2026-03-09T00:00:03.696 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.697+0000 7ff71a7fc700 1 -- 192.168.123.106:0/2897324322 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff72007a0e0 con 0x7ff71c0a5f20
2026-03-09T00:00:03.697 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.697+0000 7ff7289c6700 1 --2- 192.168.123.106:0/2897324322 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff710038580 0x7ff71003aa40 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7ff72404f8e0 tx=0x7ff72404f080 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:00:03.866 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.865+0000 7ff72a1c9700 1 -- 192.168.123.106:0/2897324322 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7ff708005190 con 0x7ff71c0a5f20
2026-03-09T00:00:03.867 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:00:03.867 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-09T00:00:03.867 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.867+0000 7ff71a7fc700 1 -- 192.168.123.106:0/2897324322 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7ff720018080 con 0x7ff71c0a5f20
2026-03-09T00:00:03.871 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.871+0000 7ff72a1c9700 1 -- 192.168.123.106:0/2897324322 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff710038580 msgr2=0x7ff71003aa40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:03.871 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.871+0000 7ff72a1c9700 1 --2- 192.168.123.106:0/2897324322 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff710038580 0x7ff71003aa40 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7ff72404f8e0 tx=0x7ff72404f080 comp rx=0 tx=0).stop
2026-03-09T00:00:03.871 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.871+0000 7ff72a1c9700 1 -- 192.168.123.106:0/2897324322 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff71c0a5f20 msgr2=0x7ff71c00ae70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:03.871 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.871+0000 7ff72a1c9700 1 --2- 192.168.123.106:0/2897324322 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff71c0a5f20 0x7ff71c00ae70 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7ff720016040 tx=0x7ff720004730 comp rx=0 tx=0).stop
2026-03-09T00:00:03.871 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.871+0000 7ff72a1c9700 1 -- 192.168.123.106:0/2897324322 shutdown_connections
2026-03-09T00:00:03.871 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.872+0000 7ff72a1c9700 1 --2- 192.168.123.106:0/2897324322 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff710038580 0x7ff71003aa40 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:03.871 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.872+0000 7ff72a1c9700 1 --2- 192.168.123.106:0/2897324322 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff71c0a5f20 0x7ff71c00ae70 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:03.871 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.872+0000 7ff72a1c9700 1 -- 192.168.123.106:0/2897324322 >> 192.168.123.106:0/2897324322 conn(0x7ff71c09fec0 msgr2=0x7ff71c0a0ba0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:00:03.872 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.872+0000 7ff72a1c9700 1 -- 192.168.123.106:0/2897324322 shutdown_connections
2026-03-09T00:00:03.872 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:03.873+0000 7ff72a1c9700 1 -- 192.168.123.106:0/2897324322 wait complete.
2026-03-09T00:00:03.873 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1
2026-03-09T00:00:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:04 vm03 ceph-mon[52346]: Updating vm03:/etc/ceph/ceph.conf
2026-03-09T00:00:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:04 vm03 ceph-mon[52346]: Updating vm06:/etc/ceph/ceph.conf
2026-03-09T00:00:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:04 vm03 ceph-mon[52346]: Updating vm03:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf
2026-03-09T00:00:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:04 vm03 ceph-mon[52346]: Updating vm06:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf
2026-03-09T00:00:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:04 vm03 ceph-mon[52346]: Updating vm03:/etc/ceph/ceph.client.admin.keyring
2026-03-09T00:00:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:04 vm03 ceph-mon[52346]: Updating vm06:/etc/ceph/ceph.client.admin.keyring
2026-03-09T00:00:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:04 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/2897324322' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-09T00:00:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T00:00:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished
2026-03-09T00:00:04.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:04.923 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-09T00:00:04.923 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json
2026-03-09T00:00:05.165 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf
2026-03-09T00:00:05.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.453+0000 7fae24937700 1 -- 192.168.123.106:0/3969332820 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae2010c210 msgr2=0x7fae2010c5f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:05.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.453+0000 7fae24937700 1 --2- 192.168.123.106:0/3969332820 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae2010c210 0x7fae2010c5f0 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7fae10007780 tx=0x7fae1000c050 comp rx=0 tx=0).stop
2026-03-09T00:00:05.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.454+0000 7fae24937700 1 -- 192.168.123.106:0/3969332820 shutdown_connections
2026-03-09T00:00:05.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.454+0000 7fae24937700 1 --2- 192.168.123.106:0/3969332820 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae2010c210 0x7fae2010c5f0 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:05.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.454+0000 7fae24937700 1 -- 192.168.123.106:0/3969332820 >> 192.168.123.106:0/3969332820 conn(0x7fae2006c970 msgr2=0x7fae2006cd80 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:00:05.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.454+0000 7fae24937700 1 -- 192.168.123.106:0/3969332820 shutdown_connections
2026-03-09T00:00:05.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.454+0000 7fae24937700 1 -- 192.168.123.106:0/3969332820 wait complete.
2026-03-09T00:00:05.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.454+0000 7fae24937700 1 Processor -- start
2026-03-09T00:00:05.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.455+0000 7fae24937700 1 -- start start
2026-03-09T00:00:05.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.455+0000 7fae24937700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae20137560 0x7fae20134580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:00:05.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.455+0000 7fae24937700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fae10003680 con 0x7fae20137560
2026-03-09T00:00:05.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.455+0000 7fae1ed9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae20137560 0x7fae20134580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:00:05.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.455+0000 7fae1ed9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae20137560 0x7fae20134580 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:57894/0 (socket says 192.168.123.106:57894)
2026-03-09T00:00:05.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.455+0000 7fae1ed9d700 1 -- 192.168.123.106:0/93976659 learned_addr learned my addr 192.168.123.106:0/93976659 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T00:00:05.454 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.455+0000 7fae1ed9d700 1 -- 192.168.123.106:0/93976659 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fae10007430 con 0x7fae20137560
2026-03-09T00:00:05.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.455+0000 7fae1ed9d700 1 --2- 192.168.123.106:0/93976659 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae20137560 0x7fae20134580 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7fae1000a010 tx=0x7fae1000c7b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:00:05.455 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.456+0000 7fae07fff700 1 -- 192.168.123.106:0/93976659 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fae1000f050 con 0x7fae20137560
2026-03-09T00:00:05.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.456+0000 7fae24937700 1 -- 192.168.123.106:0/93976659 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fae20134b20 con 0x7fae20137560
2026-03-09T00:00:05.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.456+0000 7fae24937700 1 -- 192.168.123.106:0/93976659 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fae20134fa0 con 0x7fae20137560
2026-03-09T00:00:05.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.456+0000 7fae07fff700 1 -- 192.168.123.106:0/93976659 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fae1000cb60 con 0x7fae20137560
2026-03-09T00:00:05.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.457+0000 7fae07fff700 1 -- 192.168.123.106:0/93976659 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fae10008370 con 0x7fae20137560
2026-03-09T00:00:05.456 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.457+0000 7fae07fff700 1 -- 192.168.123.106:0/93976659 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 17) v1 ==== 45397+0+0 (secure 0 0 0) 0x7fae1001a040 con 0x7fae20137560
2026-03-09T00:00:05.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.457+0000 7fae07fff700 1 --2- 192.168.123.106:0/93976659 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fae080385a0 0x7fae0803aa60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:00:05.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.458+0000 7fae07fff700 1 -- 192.168.123.106:0/93976659 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fae1004c5a0 con 0x7fae20137560
2026-03-09T00:00:05.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.458+0000 7fae1e59c700 1 --2- 192.168.123.106:0/93976659 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fae080385a0 0x7fae0803aa60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:00:05.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.458+0000 7fae1e59c700 1 --2- 192.168.123.106:0/93976659 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fae080385a0 0x7fae0803aa60 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fae1800ad30 tx=0x7fae180093f0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:00:05.457 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.458+0000 7fae24937700 1 -- 192.168.123.106:0/93976659 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fae0c005320 con 0x7fae20137560
2026-03-09T00:00:05.460 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.461+0000 7fae07fff700 1 -- 192.168.123.106:0/93976659 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fae10018020 con 0x7fae20137560
2026-03-09T00:00:05.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:05 vm03 ceph-mon[52346]: Updating vm03:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.client.admin.keyring
2026-03-09T00:00:05.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:05 vm03 ceph-mon[52346]: Updating vm06:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.client.admin.keyring
2026-03-09T00:00:05.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:05 vm03 ceph-mon[52346]: Deploying daemon ceph-exporter.vm06 on vm06
2026-03-09T00:00:05.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:05 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:05.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:05 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:05.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:05 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:05.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:05 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:05.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:05 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T00:00:05.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:05 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
2026-03-09T00:00:05.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:05 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:05.628 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.628+0000 7fae24937700 1 -- 192.168.123.106:0/93976659 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fae0c005190 con 0x7fae20137560
2026-03-09T00:00:05.630 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:00:05.630 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-09T00:00:05.630 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.630+0000 7fae07fff700 1 -- 192.168.123.106:0/93976659 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fae10020020 con 0x7fae20137560
2026-03-09T00:00:05.631 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.632+0000 7fae24937700 1 -- 192.168.123.106:0/93976659 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fae080385a0 msgr2=0x7fae0803aa60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:05.631 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.632+0000 7fae24937700 1 --2- 192.168.123.106:0/93976659 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fae080385a0 0x7fae0803aa60 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fae1800ad30 tx=0x7fae180093f0 comp rx=0 tx=0).stop
2026-03-09T00:00:05.631 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.632+0000 7fae24937700 1 -- 192.168.123.106:0/93976659 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae20137560 msgr2=0x7fae20134580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:05.631 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.632+0000 7fae24937700 1 --2- 192.168.123.106:0/93976659 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae20137560 0x7fae20134580 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7fae1000a010 tx=0x7fae1000c7b0 comp rx=0 tx=0).stop
2026-03-09T00:00:05.631 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.632+0000 7fae24937700 1 -- 192.168.123.106:0/93976659 shutdown_connections
2026-03-09T00:00:05.631 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.632+0000 7fae24937700 1 --2- 192.168.123.106:0/93976659 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fae080385a0 0x7fae0803aa60 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:05.632 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.632+0000 7fae24937700 1 --2- 192.168.123.106:0/93976659 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae20137560 0x7fae20134580 unknown :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:05.632 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.632+0000 7fae24937700 1 -- 192.168.123.106:0/93976659 >> 192.168.123.106:0/93976659 conn(0x7fae2006c970 msgr2=0x7fae2010b720 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:00:05.632 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.632+0000 7fae24937700 1 -- 192.168.123.106:0/93976659 shutdown_connections
2026-03-09T00:00:05.632 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:05.632+0000 7fae24937700 1 -- 192.168.123.106:0/93976659 wait complete.
2026-03-09T00:00:05.632 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1
2026-03-09T00:00:06.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:06 vm03 ceph-mon[52346]: Deploying daemon crash.vm06 on vm06
2026-03-09T00:00:06.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:06 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/93976659' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-09T00:00:06.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:06 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:06.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:06 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:06.694 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-09T00:00:06.694 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json
2026-03-09T00:00:06.890 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf
2026-03-09T00:00:07.197 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.196+0000 7f73a3789700 1 -- 192.168.123.106:0/4265186266 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f739c102ea0 msgr2=0x7f739c103280 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:07.197 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.196+0000 7f73a3789700 1 --2- 192.168.123.106:0/4265186266 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f739c102ea0 0x7f739c103280 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f738c009b00 tx=0x7f738c009e10 comp rx=0 tx=0).stop
2026-03-09T00:00:07.197 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.197+0000 7f73a3789700 1 -- 192.168.123.106:0/4265186266 shutdown_connections
2026-03-09T00:00:07.197 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.197+0000 7f73a3789700 1 --2- 192.168.123.106:0/4265186266 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f739c102ea0 0x7f739c103280 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:07.197 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.197+0000 7f73a3789700 1 -- 192.168.123.106:0/4265186266 >> 192.168.123.106:0/4265186266 conn(0x7f739c0fe750 msgr2=0x7f739c100b70 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:00:07.197 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.198+0000 7f73a3789700 1 -- 192.168.123.106:0/4265186266 shutdown_connections
2026-03-09T00:00:07.197 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.198+0000 7f73a3789700 1 -- 192.168.123.106:0/4265186266 wait complete.
2026-03-09T00:00:07.197 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.198+0000 7f73a3789700 1 Processor -- start 2026-03-09T00:00:07.198 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.199+0000 7f73a3789700 1 -- start start 2026-03-09T00:00:07.198 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.199+0000 7f73a3789700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f739c102ea0 0x7f739c19c510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:00:07.198 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.199+0000 7f73a3789700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f739c19ca50 con 0x7f739c102ea0 2026-03-09T00:00:07.198 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.199+0000 7f73a1525700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f739c102ea0 0x7f739c19c510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:00:07.198 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.199+0000 7f73a1525700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f739c102ea0 0x7f739c19c510 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:41722/0 (socket says 192.168.123.106:41722) 2026-03-09T00:00:07.198 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.199+0000 7f73a1525700 1 -- 192.168.123.106:0/1498767757 learned_addr learned my addr 192.168.123.106:0/1498767757 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T00:00:07.199 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.200+0000 7f73a1525700 1 -- 192.168.123.106:0/1498767757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f738c0097e0 con 0x7f739c102ea0 2026-03-09T00:00:07.199 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.200+0000 7f73a1525700 1 --2- 192.168.123.106:0/1498767757 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f739c102ea0 0x7f739c19c510 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f738c006010 tx=0x7f738c004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:00:07.199 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.200+0000 7f73927fc700 1 -- 192.168.123.106:0/1498767757 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f738c01c070 con 0x7f739c102ea0 2026-03-09T00:00:07.199 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.200+0000 7f73927fc700 1 -- 192.168.123.106:0/1498767757 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f738c021470 con 0x7f739c102ea0 2026-03-09T00:00:07.199 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.200+0000 7f73a3789700 1 -- 192.168.123.106:0/1498767757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f739c19cc50 con 0x7f739c102ea0 2026-03-09T00:00:07.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.200+0000 7f73a3789700 1 -- 192.168.123.106:0/1498767757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f739c19d070 con 0x7f739c102ea0 
2026-03-09T00:00:07.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.200+0000 7f73927fc700 1 -- 192.168.123.106:0/1498767757 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f738c00f460 con 0x7f739c102ea0 2026-03-09T00:00:07.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.201+0000 7f73927fc700 1 -- 192.168.123.106:0/1498767757 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 17) v1 ==== 45397+0+0 (secure 0 0 0) 0x7f738c00f5c0 con 0x7f739c102ea0 2026-03-09T00:00:07.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.201+0000 7f73927fc700 1 --2- 192.168.123.106:0/1498767757 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7388038530 0x7f738803a9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:00:07.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.201+0000 7f73927fc700 1 -- 192.168.123.106:0/1498767757 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f738c04d580 con 0x7f739c102ea0 2026-03-09T00:00:07.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.202+0000 7f73a0d24700 1 --2- 192.168.123.106:0/1498767757 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7388038530 0x7f738803a9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:00:07.201 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.202+0000 7f73a3789700 1 -- 192.168.123.106:0/1498767757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f739c04fa20 con 0x7f739c102ea0 2026-03-09T00:00:07.204 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.205+0000 7f73927fc700 1 -- 192.168.123.106:0/1498767757 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f738c026070 con 0x7f739c102ea0 2026-03-09T00:00:07.204 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.205+0000 7f73a0d24700 1 --2- 192.168.123.106:0/1498767757 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7388038530 0x7f738803a9f0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f7398006fd0 tx=0x7f7398006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:00:07.355 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.356+0000 7f73a3789700 1 -- 192.168.123.106:0/1498767757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f739c19d970 con 0x7f739c102ea0 2026-03-09T00:00:07.356 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.356+0000 7f73927fc700 1 -- 192.168.123.106:0/1498767757 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f738c029720 con 0x7f739c102ea0 2026-03-09T00:00:07.356 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:00:07.356 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T00:00:07.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.359+0000 7f73a3789700 1 -- 192.168.123.106:0/1498767757 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7388038530 msgr2=0x7f738803a9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:00:07.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.359+0000 7f73a3789700 1 --2- 192.168.123.106:0/1498767757 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7388038530 0x7f738803a9f0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f7398006fd0 tx=0x7f7398006e40 comp rx=0 tx=0).stop 2026-03-09T00:00:07.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.359+0000 7f73a3789700 1 -- 192.168.123.106:0/1498767757 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f739c102ea0 msgr2=0x7f739c19c510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:00:07.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.359+0000 7f73a3789700 1 --2- 192.168.123.106:0/1498767757 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f739c102ea0 0x7f739c19c510 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f738c006010 tx=0x7f738c004dc0 comp rx=0 tx=0).stop 2026-03-09T00:00:07.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.359+0000 7f73a3789700 1 -- 192.168.123.106:0/1498767757 shutdown_connections 2026-03-09T00:00:07.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.359+0000 7f73a3789700 1 --2- 192.168.123.106:0/1498767757 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7388038530 0x7f738803a9f0 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:07.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.359+0000 7f73a3789700 1 --2- 192.168.123.106:0/1498767757 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f739c102ea0 0x7f739c19c510 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:07.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.359+0000 7f73a3789700 1 -- 192.168.123.106:0/1498767757 >> 192.168.123.106:0/1498767757 conn(0x7f739c0fe750 msgr2=0x7f739c107110 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:00:07.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.359+0000 7f73a3789700 1 -- 192.168.123.106:0/1498767757 shutdown_connections 2026-03-09T00:00:07.358 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:07.359+0000 7f73a3789700 1 -- 192.168.123.106:0/1498767757 wait complete. 
2026-03-09T00:00:07.359 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-09T00:00:07.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:07 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:07.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:07 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:07.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:07 vm03 ceph-mon[52346]: Deploying daemon node-exporter.vm06 on vm06 2026-03-09T00:00:08.418 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T00:00:08.418 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json 2026-03-09T00:00:08.560 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf 2026-03-09T00:00:08.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:08 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/1498767757' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T00:00:08.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.853+0000 7f97b2c68700 1 -- 192.168.123.106:0/1333983071 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac105a10 msgr2=0x7f97ac105df0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:00:08.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.853+0000 7f97b2c68700 1 --2- 192.168.123.106:0/1333983071 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac105a10 0x7f97ac105df0 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f979c009b00 tx=0x7f979c009e10 comp rx=0 tx=0).stop 2026-03-09T00:00:08.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.854+0000 7f97b2c68700 1 -- 192.168.123.106:0/1333983071 shutdown_connections 2026-03-09T00:00:08.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.854+0000 7f97b2c68700 1 --2- 192.168.123.106:0/1333983071 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac105a10 0x7f97ac105df0 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:08.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.854+0000 7f97b2c68700 1 -- 192.168.123.106:0/1333983071 >> 192.168.123.106:0/1333983071 conn(0x7f97ac0fb430 msgr2=0x7f97ac0fd850 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:00:08.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.854+0000 7f97b2c68700 1 -- 192.168.123.106:0/1333983071 shutdown_connections 2026-03-09T00:00:08.853 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.854+0000 7f97b2c68700 1 -- 192.168.123.106:0/1333983071 wait complete. 
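Most of the volume in these poll iterations is routine AsyncMessenger teardown from each short-lived `cephadm shell` client: the mark_down, shutdown_connections, ".stop", and "wait complete." lines are a clean client exit logged at messenger debug level 1, not errors. When scanning such logs by hand, a throwaway filter helps; a sketch (the pattern list is ad hoc, tune to taste):

    import re
    import sys

    # Drop the routine per-client messenger teardown noise, keep the rest.
    NOISE = re.compile(
        r"mark_down|shutdown_connections|wait complete\.|\)\.stop\b"
    )

    for line in sys.stdin:
        if not NOISE.search(line):
            sys.stdout.write(line)

Usage: `python3 filter_noise.py < teuthology.log` (the script name is hypothetical).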
2026-03-09T00:00:08.854 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.855+0000 7f97b2c68700 1 Processor -- start 2026-03-09T00:00:08.854 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.855+0000 7f97b2c68700 1 -- start start 2026-03-09T00:00:08.854 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.855+0000 7f97b2c68700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac105a10 0x7f97ac18f670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:00:08.854 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.855+0000 7f97b2c68700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f97ac193660 con 0x7f97ac105a10 2026-03-09T00:00:08.854 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.855+0000 7f97b0a04700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac105a10 0x7f97ac18f670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:00:08.854 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.855+0000 7f97b0a04700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac105a10 0x7f97ac18f670 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:41736/0 (socket says 192.168.123.106:41736) 2026-03-09T00:00:08.854 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.855+0000 7f97b0a04700 1 -- 192.168.123.106:0/4029619127 learned_addr learned my addr 192.168.123.106:0/4029619127 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T00:00:08.854 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.855+0000 7f97b0a04700 1 -- 192.168.123.106:0/4029619127 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f979c0097e0 con 0x7f97ac105a10 2026-03-09T00:00:08.855 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.856+0000 7f97b0a04700 1 --2- 192.168.123.106:0/4029619127 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac105a10 0x7f97ac18f670 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f979c006010 tx=0x7f979c004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:00:08.855 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.856+0000 7f97a9ffb700 1 -- 192.168.123.106:0/4029619127 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f979c01c070 con 0x7f97ac105a10 2026-03-09T00:00:08.857 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.856+0000 7f97b2c68700 1 -- 192.168.123.106:0/4029619127 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f97ac18fbb0 con 0x7f97ac105a10 2026-03-09T00:00:08.857 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.856+0000 7f97a9ffb700 1 -- 192.168.123.106:0/4029619127 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f979c021470 con 0x7f97ac105a10 2026-03-09T00:00:08.857 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.856+0000 7f97a9ffb700 1 -- 192.168.123.106:0/4029619127 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f979c00f460 con 0x7f97ac105a10 
2026-03-09T00:00:08.857 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.856+0000 7f97b2c68700 1 -- 192.168.123.106:0/4029619127 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f97ac190030 con 0x7f97ac105a10 2026-03-09T00:00:08.857 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.857+0000 7f97b2c68700 1 -- 192.168.123.106:0/4029619127 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f97ac102cb0 con 0x7f97ac105a10 2026-03-09T00:00:08.858 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.858+0000 7f97a9ffb700 1 -- 192.168.123.106:0/4029619127 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 17) v1 ==== 45397+0+0 (secure 0 0 0) 0x7f979c005320 con 0x7f97ac105a10 2026-03-09T00:00:08.858 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.858+0000 7f97a9ffb700 1 --2- 192.168.123.106:0/4029619127 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f97940384e0 0x7f979403a9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:00:08.858 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.858+0000 7f97a9ffb700 1 -- 192.168.123.106:0/4029619127 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f979c04c3b0 con 0x7f97ac105a10 2026-03-09T00:00:08.862 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.863+0000 7f97abfff700 1 --2- 192.168.123.106:0/4029619127 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f97940384e0 0x7f979403a9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:00:08.862 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.863+0000 7f97a9ffb700 1 -- 192.168.123.106:0/4029619127 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f979c0215e0 con 0x7f97ac105a10 2026-03-09T00:00:08.862 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:08.863+0000 7f97abfff700 1 --2- 192.168.123.106:0/4029619127 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f97940384e0 0x7f979403a9a0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f97a0006fd0 tx=0x7f97a0006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:00:09.008 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:09.008+0000 7f97b2c68700 1 -- 192.168.123.106:0/4029619127 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f97ac0623c0 con 0x7f97ac105a10 2026-03-09T00:00:09.011 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:09.012+0000 7f97a9ffb700 1 -- 192.168.123.106:0/4029619127 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f979c026030 con 0x7f97ac105a10 2026-03-09T00:00:09.011 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:00:09.011 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T00:00:09.014 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:09.015+0000 7f97937fe700 1 -- 192.168.123.106:0/4029619127 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f97940384e0 msgr2=0x7f979403a9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:00:09.014 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:09.015+0000 7f97937fe700 1 --2- 192.168.123.106:0/4029619127 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f97940384e0 0x7f979403a9a0 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f97a0006fd0 tx=0x7f97a0006e40 comp rx=0 tx=0).stop 2026-03-09T00:00:09.014 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:09.015+0000 7f97937fe700 1 -- 192.168.123.106:0/4029619127 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac105a10 msgr2=0x7f97ac18f670 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:00:09.014 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:09.015+0000 7f97937fe700 1 --2- 192.168.123.106:0/4029619127 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac105a10 0x7f97ac18f670 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f979c006010 tx=0x7f979c004dc0 comp rx=0 tx=0).stop 2026-03-09T00:00:09.014 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:09.015+0000 7f97937fe700 1 -- 192.168.123.106:0/4029619127 shutdown_connections 2026-03-09T00:00:09.014 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:09.015+0000 7f97937fe700 1 --2- 192.168.123.106:0/4029619127 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f97940384e0 0x7f979403a9a0 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:09.014 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:09.015+0000 7f97937fe700 1 --2- 192.168.123.106:0/4029619127 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f97ac105a10 0x7f97ac18f670 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:09.014 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:09.015+0000 7f97937fe700 1 -- 192.168.123.106:0/4029619127 >> 192.168.123.106:0/4029619127 conn(0x7f97ac0fb430 msgr2=0x7f97ac0692d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:00:09.015 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:09.016+0000 7f97937fe700 1 -- 192.168.123.106:0/4029619127 shutdown_connections 2026-03-09T00:00:09.016 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:09.017+0000 7f97937fe700 1 -- 192.168.123.106:0/4029619127 wait complete. 2026-03-09T00:00:09.017 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-09T00:00:09.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:09 vm03 ceph-mon[52346]: from='client.? 
192.168.123.106:0/4029619127' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T00:00:09.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:09 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:09.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:09 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:09.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:09 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:09.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:09 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:09.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:09 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.rzcvhn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T00:00:09.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:09 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.vm06.rzcvhn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished 2026-03-09T00:00:09.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:09 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T00:00:09.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:09 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:00:10.083 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
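One quirk worth flagging in the monmap dumps above: the JSON contains keys with a literal colon and space embedded in the name, "disallowed_leaders: " and "removed_ranks: ". This appears to be faithful v18.2.1 output rather than log corruption, since it recurs identically in every dump in this run, so anything parsing these dumps by key name should normalize first. A sketch:

    import json

    def load_monmap(raw: str) -> dict:
        # The v18.2.1 dumps in this log emit a few keys with a stray
        # trailing ": " (e.g. "disallowed_leaders: "); strip that so
        # callers can index by the clean names.
        return {k.rstrip(": "): v for k, v in json.loads(raw).items()}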
2026-03-09T00:00:10.083 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json 2026-03-09T00:00:10.289 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf 2026-03-09T00:00:10.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:10 vm03 ceph-mon[52346]: Deploying daemon mgr.vm06.rzcvhn on vm06 2026-03-09T00:00:10.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:10 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:10.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:10 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:10.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:10 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:10.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:10 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:10.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:10 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:10.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:10 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T00:00:10.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:10 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:00:10.763 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.758+0000 7fc41b3cd700 1 -- 192.168.123.106:0/1016307070 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc40c0aa770 msgr2=0x7fc40c0aab50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:00:10.763 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.758+0000 7fc41b3cd700 1 --2- 192.168.123.106:0/1016307070 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc40c0aa770 0x7fc40c0aab50 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7fc410009b00 tx=0x7fc410009e10 comp rx=0 tx=0).stop 2026-03-09T00:00:10.763 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.758+0000 7fc41b3cd700 1 -- 192.168.123.106:0/1016307070 shutdown_connections 2026-03-09T00:00:10.763 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.758+0000 7fc41b3cd700 1 --2- 192.168.123.106:0/1016307070 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc40c0aa770 0x7fc40c0aab50 unknown :-1 s=CLOSED pgs=157 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:10.763 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.758+0000 7fc41b3cd700 1 -- 192.168.123.106:0/1016307070 >> 192.168.123.106:0/1016307070 conn(0x7fc40c01a430 msgr2=0x7fc40c01a840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:00:10.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.764+0000 7fc41b3cd700 1 -- 192.168.123.106:0/1016307070 shutdown_connections 2026-03-09T00:00:10.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.764+0000 7fc41b3cd700 1 -- 192.168.123.106:0/1016307070 wait complete. 
2026-03-09T00:00:10.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.764+0000 7fc41b3cd700 1 Processor -- start 2026-03-09T00:00:10.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.764+0000 7fc41b3cd700 1 -- start start 2026-03-09T00:00:10.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.764+0000 7fc41b3cd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc40c0aa770 0x7fc40c139eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:00:10.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.764+0000 7fc41b3cd700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc40c13a3f0 con 0x7fc40c0aa770 2026-03-09T00:00:10.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.764+0000 7fc41a3cb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc40c0aa770 0x7fc40c139eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:00:10.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.765+0000 7fc41a3cb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc40c0aa770 0x7fc40c139eb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:41780/0 (socket says 192.168.123.106:41780) 2026-03-09T00:00:10.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.765+0000 7fc41a3cb700 1 -- 192.168.123.106:0/1165671860 learned_addr learned my addr 192.168.123.106:0/1165671860 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T00:00:10.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.765+0000 7fc41a3cb700 1 -- 192.168.123.106:0/1165671860 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc4100097e0 con 0x7fc40c0aa770 2026-03-09T00:00:10.766 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.765+0000 7fc41a3cb700 1 --2- 192.168.123.106:0/1165671860 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc40c0aa770 0x7fc40c139eb0 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7fc410005950 tx=0x7fc410004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:00:10.766 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.765+0000 7fc40b7fe700 1 -- 192.168.123.106:0/1165671860 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc410005210 con 0x7fc40c0aa770 2026-03-09T00:00:10.766 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.765+0000 7fc41b3cd700 1 -- 192.168.123.106:0/1165671860 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc40c13a5f0 con 0x7fc40c0aa770 2026-03-09T00:00:10.766 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.765+0000 7fc41b3cd700 1 -- 192.168.123.106:0/1165671860 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc40c13aa10 con 0x7fc40c0aa770 2026-03-09T00:00:10.766 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.765+0000 7fc40b7fe700 1 -- 192.168.123.106:0/1165671860 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc410011470 con 0x7fc40c0aa770 
2026-03-09T00:00:10.766 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.765+0000 7fc40b7fe700 1 -- 192.168.123.106:0/1165671860 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc41001f410 con 0x7fc40c0aa770 2026-03-09T00:00:10.766 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.766+0000 7fc40b7fe700 1 -- 192.168.123.106:0/1165671860 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 17) v1 ==== 45397+0+0 (secure 0 0 0) 0x7fc41001f570 con 0x7fc40c0aa770 2026-03-09T00:00:10.766 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.766+0000 7fc40b7fe700 1 --2- 192.168.123.106:0/1165671860 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc400038560 0x7fc40003aa20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:00:10.766 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.767+0000 7fc40b7fe700 1 -- 192.168.123.106:0/1165671860 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fc41004dbc0 con 0x7fc40c0aa770 2026-03-09T00:00:10.766 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.767+0000 7fc419bca700 1 --2- 192.168.123.106:0/1165671860 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc400038560 0x7fc40003aa20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:00:10.766 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.767+0000 7fc419bca700 1 --2- 192.168.123.106:0/1165671860 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc400038560 0x7fc40003aa20 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7fc404006fd0 tx=0x7fc404006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:00:10.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.768+0000 7fc41b3cd700 1 -- 192.168.123.106:0/1165671860 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc3f8005320 con 0x7fc40c0aa770 2026-03-09T00:00:10.770 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.771+0000 7fc40b7fe700 1 -- 192.168.123.106:0/1165671860 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fc410015070 con 0x7fc40c0aa770 2026-03-09T00:00:10.972 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.972+0000 7fc41b3cd700 1 -- 192.168.123.106:0/1165671860 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fc3f8005190 con 0x7fc40c0aa770 2026-03-09T00:00:10.973 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.974+0000 7fc40b7fe700 1 -- 192.168.123.106:0/1165671860 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fc41000e720 con 0x7fc40c0aa770 2026-03-09T00:00:10.978 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:00:10.978 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":1,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-08T23:58:55.232252Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-09T00:00:10.986 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.985+0000 7fc41b3cd700 1 -- 192.168.123.106:0/1165671860 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc400038560 msgr2=0x7fc40003aa20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:00:10.986 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.985+0000 7fc41b3cd700 1 --2- 192.168.123.106:0/1165671860 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc400038560 0x7fc40003aa20 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7fc404006fd0 tx=0x7fc404006e40 comp rx=0 tx=0).stop 2026-03-09T00:00:10.986 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.985+0000 7fc41b3cd700 1 -- 192.168.123.106:0/1165671860 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc40c0aa770 msgr2=0x7fc40c139eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:00:10.986 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.985+0000 7fc41b3cd700 1 --2- 192.168.123.106:0/1165671860 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc40c0aa770 0x7fc40c139eb0 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7fc410005950 tx=0x7fc410004dc0 comp rx=0 tx=0).stop 2026-03-09T00:00:10.986 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.986+0000 7fc41b3cd700 1 -- 192.168.123.106:0/1165671860 shutdown_connections 2026-03-09T00:00:10.986 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.986+0000 7fc41b3cd700 1 --2- 192.168.123.106:0/1165671860 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc400038560 0x7fc40003aa20 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:10.986 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.986+0000 7fc41b3cd700 1 --2- 192.168.123.106:0/1165671860 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc40c0aa770 0x7fc40c139eb0 unknown :-1 s=CLOSED pgs=158 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:10.986 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.986+0000 7fc41b3cd700 1 -- 192.168.123.106:0/1165671860 >> 192.168.123.106:0/1165671860 conn(0x7fc40c01a430 msgr2=0x7fc40c0a35d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:00:10.986 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.986+0000 7fc41b3cd700 1 -- 192.168.123.106:0/1165671860 shutdown_connections 2026-03-09T00:00:10.986 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:10.986+0000 7fc41b3cd700 1 -- 192.168.123.106:0/1165671860 wait complete. 2026-03-09T00:00:10.987 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 1 2026-03-09T00:00:11.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:11 vm03 ceph-mon[52346]: Deploying daemon mon.vm06 on vm06 2026-03-09T00:00:11.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:11 vm03 ceph-mon[52346]: from='client.? 
192.168.123.106:0/1165671860' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T00:00:12.186 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-09T00:00:12.186 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mon dump -f json 2026-03-09T00:00:12.350 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm06/config 2026-03-09T00:00:17.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.025+0000 7fc9b9438700 1 -- 192.168.123.106:0/3300054159 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc99c0035c0 msgr2=0x7fc99c005a50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:00:17.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.025+0000 7fc9b9438700 1 --2- 192.168.123.106:0/3300054159 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc99c0035c0 0x7fc99c005a50 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7fc9b406cd60 tx=0x7fc9ac009e80 comp rx=0 tx=0).stop 2026-03-09T00:00:17.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.026+0000 7fc9b9438700 1 -- 192.168.123.106:0/3300054159 shutdown_connections 2026-03-09T00:00:17.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.026+0000 7fc9b9438700 1 --2- 192.168.123.106:0/3300054159 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc99c0035c0 0x7fc99c005a50 unknown :-1 s=CLOSED pgs=161 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:17.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.026+0000 7fc9b9438700 1 -- 192.168.123.106:0/3300054159 >> 192.168.123.106:0/3300054159 conn(0x7fc9b406ba60 msgr2=0x7fc9b406be70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:00:17.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.026+0000 7fc9b9438700 1 -- 192.168.123.106:0/3300054159 shutdown_connections 2026-03-09T00:00:17.026 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.027+0000 7fc9b9438700 1 -- 192.168.123.106:0/3300054159 wait complete. 
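Note the change in the "Inferring config" line in this iteration: earlier invocations inferred /var/lib/ceph/<fsid>/config/ceph.conf, but once mon.vm06 has been deployed on this host, `cephadm shell` picks up /var/lib/ceph/<fsid>/mon.vm06/config instead. A sketch of that preference as observed here (the function and its exact precedence are assumptions drawn from these two log lines, not cephadm's actual code):

    from pathlib import Path
    from typing import Optional

    def infer_config(fsid: str, mon_name: Optional[str]) -> Path:
        # Prefer a local mon daemon's config when one exists on this host;
        # otherwise fall back to the cluster-wide copy, matching the two
        # "Inferring config ..." variants in this log.
        base = Path("/var/lib/ceph") / fsid
        if mon_name is not None:
            mon_conf = base / f"mon.{mon_name}" / "config"
            if mon_conf.exists():
                return mon_conf
        return base / "config" / "ceph.conf"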
2026-03-09T00:00:17.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.027+0000 7fc9b9438700 1 Processor -- start 2026-03-09T00:00:17.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.027+0000 7fc9b9438700 1 -- start start 2026-03-09T00:00:17.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.027+0000 7fc9b9438700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc99c0035c0 0x7fc9b41a4eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:00:17.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.027+0000 7fc9b9438700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9b41a53f0 0x7fc9b41a85c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:00:17.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.027+0000 7fc9b9438700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc9b41a8b00 con 0x7fc9b41a53f0 2026-03-09T00:00:17.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.027+0000 7fc9b9438700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc9b41a8c40 con 0x7fc99c0035c0 2026-03-09T00:00:17.027 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.028+0000 7fc9b37fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9b41a53f0 0x7fc9b41a85c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:00:17.030 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.031+0000 7fc9b3fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc99c0035c0 0x7fc9b41a4eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:00:17.030 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.031+0000 7fc9b3fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc99c0035c0 0x7fc9b41a4eb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:60224/0 (socket says 192.168.123.106:60224) 2026-03-09T00:00:17.030 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.031+0000 7fc9b3fff700 1 -- 192.168.123.106:0/3829000185 learned_addr learned my addr 192.168.123.106:0/3829000185 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T00:00:17.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.032+0000 7fc9b3fff700 1 -- 192.168.123.106:0/3829000185 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc99c0035c0 msgr2=0x7fc9b41a4eb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 12 2026-03-09T00:00:17.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.032+0000 7fc9b3fff700 1 -- 192.168.123.106:0/3829000185 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc99c0035c0 msgr2=0x7fc9b41a4eb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed 2026-03-09T00:00:17.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.032+0000 7fc9b3fff700 1 --2- 192.168.123.106:0/3829000185 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc99c0035c0 0x7fc9b41a4eb0 unknown :-1 s=AUTH_CONNECTING pgs=0 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-09T00:00:17.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.032+0000 7fc9b3fff700 1 --2- 192.168.123.106:0/3829000185 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc99c0035c0 0x7fc9b41a4eb0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T00:00:17.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.032+0000 7fc9b37fe700 1 -- 192.168.123.106:0/3829000185 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc99c0035c0 msgr2=0x7fc9b41a4eb0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:00:17.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.032+0000 7fc9b37fe700 1 --2- 192.168.123.106:0/3829000185 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc99c0035c0 0x7fc9b41a4eb0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:17.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.032+0000 7fc9b37fe700 1 -- 192.168.123.106:0/3829000185 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc9ac0098d0 con 0x7fc9b41a53f0 2026-03-09T00:00:17.032 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.032+0000 7fc9b37fe700 1 --2- 192.168.123.106:0/3829000185 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9b41a53f0 0x7fc9b41a85c0 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7fc9a800b700 tx=0x7fc9a800bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:00:17.033 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.033+0000 7fc9b17fa700 1 -- 192.168.123.106:0/3829000185 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc9a8010820 con 0x7fc9b41a53f0 2026-03-09T00:00:17.033 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.033+0000 7fc9b17fa700 1 -- 192.168.123.106:0/3829000185 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc9a8010e60 con 0x7fc9b41a53f0 2026-03-09T00:00:17.033 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.033+0000 7fc9b17fa700 1 -- 192.168.123.106:0/3829000185 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc9a800d570 con 0x7fc9b41a53f0 2026-03-09T00:00:17.034 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.035+0000 7fc9b9438700 1 -- 192.168.123.106:0/3829000185 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc9b41a8f20 con 0x7fc9b41a53f0 2026-03-09T00:00:17.034 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.035+0000 7fc9b9438700 1 -- 192.168.123.106:0/3829000185 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc9b41af940 con 0x7fc9b41a53f0 2026-03-09T00:00:17.036 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.036+0000 7fc9b17fa700 1 -- 192.168.123.106:0/3829000185 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 90252+0+0 (secure 0 0 0) 0x7fc9a8010980 con 0x7fc9b41a53f0 2026-03-09T00:00:17.036 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.037+0000 7fc9b17fa700 1 --2- 192.168.123.106:0/3829000185 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc9a006c480 0x7fc9a006e940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:00:17.036 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.037+0000 7fc9b17fa700 1 -- 192.168.123.106:0/3829000185 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fc9a8089ff0 con 0x7fc9b41a53f0 2026-03-09T00:00:17.036 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.037+0000 7fc9b9438700 1 -- 192.168.123.106:0/3829000185 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc9b41a4cc0 con 0x7fc9b41a53f0 2026-03-09T00:00:17.037 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.038+0000 7fc9b3fff700 1 --2- 192.168.123.106:0/3829000185 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc9a006c480 0x7fc9a006e940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:00:17.038 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.039+0000 7fc9b3fff700 1 --2- 192.168.123.106:0/3829000185 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc9a006c480 0x7fc9a006e940 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fc9ac005bd0 tx=0x7fc9ac005ae0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:00:17.042 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.041+0000 7fc9b17fa700 1 -- 192.168.123.106:0/3829000185 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fc9a805f080 con 0x7fc9b41a53f0 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: mon.vm03 calling monitor election 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: mon.vm06 calling monitor election 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 
vm03 ceph-mon[52346]: from='mgr.? 192.168.123.106:0/4093995387' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/crt"}]: dispatch 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: mon.vm03 is new leader, mons vm03,vm06 in quorum (ranks 0,1) 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: monmap e2: 2 mons at {vm03=[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0],vm06=[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0]} removed_ranks: {} disallowed_leaders: {} 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: fsmap 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: osdmap e5: 0 total, 0 up, 0 in 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: mgrmap e17: vm03.yvcons(active, since 16s) 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: Standby manager daemon vm06.rzcvhn started 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: from='mgr.? 192.168.123.106:0/4093995387' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: from='mgr.? 192.168.123.106:0/4093995387' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/key"}]: dispatch 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: from='mgr.? 
192.168.123.106:0/4093995387' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: overall HEALTH_OK 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:16 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:17.207 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.206+0000 7fc9b9438700 1 -- 192.168.123.106:0/3829000185 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fc9b40623c0 con 0x7fc9b41a53f0 2026-03-09T00:00:17.210 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:00:17.210 INFO:teuthology.orchestra.run.vm06.stdout:{"epoch":2,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","modified":"2026-03-09T00:00:11.764667Z","created":"2026-03-08T23:58:55.232252Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm03","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"},{"rank":1,"name":"vm06","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:3300","nonce":0},{"type":"v1","addr":"192.168.123.106:6789","nonce":0}]},"addr":"192.168.123.106:6789/0","public_addr":"192.168.123.106:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0,1]} 2026-03-09T00:00:17.210 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.209+0000 7fc9b17fa700 1 -- 192.168.123.106:0/3829000185 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 2 v2) v1 ==== 95+0+1032 (secure 0 0 0) 0x7fc9a8014030 con 0x7fc9b41a53f0 2026-03-09T00:00:17.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.212+0000 7fc9b9438700 1 -- 192.168.123.106:0/3829000185 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc9a006c480 msgr2=0x7fc9a006e940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:00:17.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.212+0000 7fc9b9438700 1 --2- 192.168.123.106:0/3829000185 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc9a006c480 0x7fc9a006e940 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fc9ac005bd0 tx=0x7fc9ac005ae0 comp rx=0 tx=0).stop 2026-03-09T00:00:17.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.212+0000 7fc9b9438700 1 -- 192.168.123.106:0/3829000185 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9b41a53f0 msgr2=0x7fc9b41a85c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:00:17.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.212+0000 7fc9b9438700 1 --2- 192.168.123.106:0/3829000185 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9b41a53f0 0x7fc9b41a85c0 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7fc9a800b700 tx=0x7fc9a800bac0 comp rx=0 tx=0).stop 2026-03-09T00:00:17.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.212+0000 7fc9b9438700 1 -- 192.168.123.106:0/3829000185 shutdown_connections 2026-03-09T00:00:17.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.212+0000 7fc9b9438700 1 --2- 192.168.123.106:0/3829000185 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc9a006c480 0x7fc9a006e940 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:17.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.212+0000 7fc9b9438700 1 --2- 192.168.123.106:0/3829000185 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc99c0035c0 0x7fc9b41a4eb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:17.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.212+0000 7fc9b9438700 1 --2- 192.168.123.106:0/3829000185 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9b41a53f0 0x7fc9b41a85c0 secure :-1 s=CLOSED pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7fc9a800b700 tx=0x7fc9a800bac0 comp rx=0 tx=0).stop 2026-03-09T00:00:17.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.212+0000 7fc9b9438700 1 -- 192.168.123.106:0/3829000185 >> 192.168.123.106:0/3829000185 conn(0x7fc9b406ba60 msgr2=0x7fc9b410a630 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:00:17.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.213+0000 7fc9b9438700 1 -- 192.168.123.106:0/3829000185 shutdown_connections 2026-03-09T00:00:17.212 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:17.213+0000 7fc9b9438700 1 -- 192.168.123.106:0/3829000185 wait complete. 2026-03-09T00:00:17.216 INFO:teuthology.orchestra.run.vm06.stderr:dumped monmap epoch 2 2026-03-09T00:00:17.317 INFO:tasks.cephadm:Generating final ceph.conf file... 
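The monmap JSON dumped above (epoch 2, two mons, quorum ranks [0,1]) is easy to check programmatically. A minimal sketch, assuming it runs on a host with the admin keyring in place; the field names ("mons", "rank", "name", "quorum") are exactly those in the dump above, and the helper name is hypothetical:

import json
import subprocess

def quorum_names():
    # Same dump the client above performed: "ceph mon dump --format json".
    out = subprocess.check_output(["ceph", "mon", "dump", "--format", "json"])
    monmap = json.loads(out)
    by_rank = {m["rank"]: m["name"] for m in monmap["mons"]}
    # "quorum" is a list of ranks; map them back to mon names.
    return [by_rank[r] for r in monmap["quorum"]]

# For the epoch-2 monmap above this yields ['vm03', 'vm06'].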
2026-03-09T00:00:17.317 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph config generate-minimal-conf 2026-03-09T00:00:17.476 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:00:17.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.732+0000 7f6acd4d0700 1 -- 192.168.123.103:0/449465832 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8102e70 msgr2=0x7f6ac8103250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:00:17.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.732+0000 7f6acd4d0700 1 --2- 192.168.123.103:0/449465832 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8102e70 0x7f6ac8103250 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f6ab0009b00 tx=0x7f6ab0009e10 comp rx=0 tx=0).stop 2026-03-09T00:00:17.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.733+0000 7f6acd4d0700 1 -- 192.168.123.103:0/449465832 shutdown_connections 2026-03-09T00:00:17.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.733+0000 7f6acd4d0700 1 --2- 192.168.123.103:0/449465832 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8102e70 0x7f6ac8103250 unknown :-1 s=CLOSED pgs=163 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:17.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.733+0000 7f6acd4d0700 1 -- 192.168.123.103:0/449465832 >> 192.168.123.103:0/449465832 conn(0x7f6ac80fe760 msgr2=0x7f6ac8100b80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:00:17.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.733+0000 7f6acd4d0700 1 -- 192.168.123.103:0/449465832 shutdown_connections 2026-03-09T00:00:17.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.733+0000 7f6acd4d0700 1 -- 192.168.123.103:0/449465832 wait complete. 
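The cephadm invocation above is how the task regenerates a client-distributable config: `ceph config generate-minimal-conf` run inside a `cephadm shell`. A minimal sketch of the same call, with the image, fsid and paths copied from this run and a hypothetical helper name; its output is the short [global] stanza (fsid plus mon_host) that appears in the stdout lines further below:

import subprocess

def generate_minimal_conf(image, fsid):
    # Mirrors the cephadm command logged above; "--" separates cephadm's
    # own options from the ceph command it runs inside the container.
    cmd = ["sudo", "/home/ubuntu/cephtest/cephadm", "--image", image,
           "shell", "-c", "/etc/ceph/ceph.conf",
           "-k", "/etc/ceph/ceph.client.admin.keyring",
           "--fsid", fsid,
           "--", "ceph", "config", "generate-minimal-conf"]
    return subprocess.check_output(cmd).decode()

# conf = generate_minimal_conf("quay.io/ceph/ceph:v18.2.1",
#                              "ae8f0172-1b4a-11f1-916a-712b2ac006b7")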
2026-03-09T00:00:17.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.733+0000 7f6acd4d0700 1 Processor -- start 2026-03-09T00:00:17.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.734+0000 7f6acd4d0700 1 -- start start 2026-03-09T00:00:17.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.734+0000 7f6acd4d0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ac8102e70 0x7f6ac81985b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:00:17.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.734+0000 7f6acd4d0700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8198af0 0x7f6ac819cd70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:00:17.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.734+0000 7f6acd4d0700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6ac819d2b0 con 0x7f6ac8198af0 2026-03-09T00:00:17.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.734+0000 7f6acd4d0700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6ac819d420 con 0x7f6ac8102e70 2026-03-09T00:00:17.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.734+0000 7f6ac67fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8198af0 0x7f6ac819cd70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:00:17.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.734+0000 7f6ac67fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8198af0 0x7f6ac819cd70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:47506/0 (socket says 192.168.123.103:47506) 2026-03-09T00:00:17.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.734+0000 7f6ac67fc700 1 -- 192.168.123.103:0/484352416 learned_addr learned my addr 192.168.123.103:0/484352416 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:00:17.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.734+0000 7f6ac6ffd700 1 --2- 192.168.123.103:0/484352416 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ac8102e70 0x7f6ac81985b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:00:17.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.734+0000 7f6ac6ffd700 1 -- 192.168.123.103:0/484352416 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ac8102e70 msgr2=0x7f6ac81985b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 13 2026-03-09T00:00:17.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.734+0000 7f6ac6ffd700 1 -- 192.168.123.103:0/484352416 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ac8102e70 msgr2=0x7f6ac81985b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed 2026-03-09T00:00:17.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.734+0000 7f6ac6ffd700 1 --2- 192.168.123.103:0/484352416 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ac8102e70 0x7f6ac81985b0 unknown :-1 
s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-09T00:00:17.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.734+0000 7f6ac67fc700 1 -- 192.168.123.103:0/484352416 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ac8102e70 msgr2=0x7f6ac81985b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:00:17.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.735+0000 7f6ac6ffd700 1 --2- 192.168.123.103:0/484352416 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ac8102e70 0x7f6ac81985b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T00:00:17.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.735+0000 7f6ac67fc700 1 --2- 192.168.123.103:0/484352416 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ac8102e70 0x7f6ac81985b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:17.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.735+0000 7f6ac67fc700 1 -- 192.168.123.103:0/484352416 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6ab00097e0 con 0x7f6ac8198af0 2026-03-09T00:00:17.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.735+0000 7f6ac67fc700 1 --2- 192.168.123.103:0/484352416 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8198af0 0x7f6ac819cd70 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f6ab800ba70 tx=0x7f6ab800be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:00:17.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.735+0000 7f6abffff700 1 -- 192.168.123.103:0/484352416 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6ab800c7e0 con 0x7f6ac8198af0 2026-03-09T00:00:17.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.735+0000 7f6acd4d0700 1 -- 192.168.123.103:0/484352416 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6ac819d700 con 0x7f6ac8198af0 2026-03-09T00:00:17.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.735+0000 7f6abffff700 1 -- 192.168.123.103:0/484352416 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6ab800ce20 con 0x7f6ac8198af0 2026-03-09T00:00:17.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.735+0000 7f6abffff700 1 -- 192.168.123.103:0/484352416 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6ab8012550 con 0x7f6ac8198af0 2026-03-09T00:00:17.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.736+0000 7f6acd4d0700 1 -- 192.168.123.103:0/484352416 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6ac81a3010 con 0x7f6ac8198af0 2026-03-09T00:00:17.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.736+0000 7f6acd4d0700 1 -- 192.168.123.103:0/484352416 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6ac81983c0 con 0x7f6ac8198af0 2026-03-09T00:00:17.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.737+0000 7f6abffff700 1 -- 
192.168.123.103:0/484352416 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 90252+0+0 (secure 0 0 0) 0x7f6ab8014440 con 0x7f6ac8198af0 2026-03-09T00:00:17.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.737+0000 7f6abffff700 1 --2- 192.168.123.103:0/484352416 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6ab406c570 0x7f6ab406ea30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:00:17.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.737+0000 7f6abffff700 1 -- 192.168.123.103:0/484352416 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f6ab808d030 con 0x7f6ac8198af0 2026-03-09T00:00:17.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.740+0000 7f6ac6ffd700 1 --2- 192.168.123.103:0/484352416 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6ab406c570 0x7f6ab406ea30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:00:17.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.740+0000 7f6ac6ffd700 1 --2- 192.168.123.103:0/484352416 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6ab406c570 0x7f6ab406ea30 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f6ab0006010 tx=0x7f6ab000b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:00:17.740 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.741+0000 7f6abffff700 1 -- 192.168.123.103:0/484352416 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f6ab8054f80 con 0x7f6ac8198af0 2026-03-09T00:00:17.844 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:17 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:00:17.881 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:17 vm03 ceph-mon[52346]: mgrmap e18: vm03.yvcons(active, since 16s), standbys: vm06.rzcvhn 2026-03-09T00:00:17.881 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:17 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr metadata", "who": "vm06.rzcvhn", "id": "vm06.rzcvhn"}]: dispatch 2026-03-09T00:00:17.881 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:17 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:00:17.881 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:17 vm03 ceph-mon[52346]: from='client.? 
192.168.123.106:0/3829000185' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-09T00:00:17.881 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:17 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:17.881 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:17 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:17.881 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:17 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T00:00:17.881 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:17 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:00:17.881 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:17 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:00:17.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.881+0000 7f6acd4d0700 1 -- 192.168.123.103:0/484352416 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "config generate-minimal-conf"} v 0) v1 -- 0x7f6ac804fa20 con 0x7f6ac8198af0 2026-03-09T00:00:17.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.884+0000 7f6abffff700 1 -- 192.168.123.103:0/484352416 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v10) v1 ==== 76+0+235 (secure 0 0 0) 0x7f6ab80585a0 con 0x7f6ac8198af0 2026-03-09T00:00:17.884 INFO:teuthology.orchestra.run.vm03.stdout:# minimal ceph.conf for ae8f0172-1b4a-11f1-916a-712b2ac006b7 2026-03-09T00:00:17.884 INFO:teuthology.orchestra.run.vm03.stdout:[global] 2026-03-09T00:00:17.884 INFO:teuthology.orchestra.run.vm03.stdout: fsid = ae8f0172-1b4a-11f1-916a-712b2ac006b7 2026-03-09T00:00:17.884 INFO:teuthology.orchestra.run.vm03.stdout: mon_host = [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 2026-03-09T00:00:17.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.887+0000 7f6abdffb700 1 -- 192.168.123.103:0/484352416 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6ab406c570 msgr2=0x7f6ab406ea30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:00:17.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.887+0000 7f6abdffb700 1 --2- 192.168.123.103:0/484352416 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6ab406c570 0x7f6ab406ea30 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f6ab0006010 tx=0x7f6ab000b540 comp rx=0 tx=0).stop 2026-03-09T00:00:17.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.887+0000 7f6abdffb700 1 -- 192.168.123.103:0/484352416 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8198af0 msgr2=0x7f6ac819cd70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:00:17.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.888+0000 7f6abdffb700 1 --2- 192.168.123.103:0/484352416 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8198af0 0x7f6ac819cd70 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f6ab800ba70 tx=0x7f6ab800be30 comp rx=0 tx=0).stop 2026-03-09T00:00:17.887 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.888+0000 7f6abdffb700 1 -- 192.168.123.103:0/484352416 shutdown_connections 2026-03-09T00:00:17.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.888+0000 7f6abdffb700 1 --2- 192.168.123.103:0/484352416 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6ab406c570 0x7f6ab406ea30 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:17.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.888+0000 7f6abdffb700 1 --2- 192.168.123.103:0/484352416 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ac8102e70 0x7f6ac81985b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:17.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.888+0000 7f6abdffb700 1 --2- 192.168.123.103:0/484352416 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ac8198af0 0x7f6ac819cd70 unknown :-1 s=CLOSED pgs=164 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:17.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.888+0000 7f6abdffb700 1 -- 192.168.123.103:0/484352416 >> 192.168.123.103:0/484352416 conn(0x7f6ac80fe760 msgr2=0x7f6ac81089d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:00:17.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.888+0000 7f6abdffb700 1 -- 192.168.123.103:0/484352416 shutdown_connections 2026-03-09T00:00:17.889 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:17.889+0000 7f6abdffb700 1 -- 192.168.123.103:0/484352416 wait complete. 2026-03-09T00:00:17.944 INFO:tasks.cephadm:Distributing (final) config and client.admin keyring... 2026-03-09T00:00:17.944 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T00:00:17.944 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/ceph/ceph.conf 2026-03-09T00:00:17.978 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T00:00:17.978 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-09T00:00:18.049 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-09T00:00:18.049 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/etc/ceph/ceph.conf 2026-03-09T00:00:18.078 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-09T00:00:18.078 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-09T00:00:18.148 INFO:tasks.cephadm:Deploying OSDs... 2026-03-09T00:00:18.148 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T00:00:18.148 DEBUG:teuthology.orchestra.run.vm03:> dd if=/scratch_devs of=/dev/stdout 2026-03-09T00:00:18.182 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T00:00:18.183 DEBUG:teuthology.orchestra.run.vm03:> ls /dev/[sv]d? 
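Two things happen in this block: the minimal conf and admin keyring are distributed to both hosts, and the scratch-device probe begins (its `ls /dev/[sv]d?` listing continues just below). The distribution uses no scp: content is piped into `sudo dd of=<path>` so the write itself runs with root privileges. A minimal sketch, assuming plain ssh and that the content arrives on stdin; teuthology actually uses its own orchestra layer for this:

import subprocess

def push_file(host, path, data):
    # Feed the file content (bytes) to "sudo dd of=<path>" on the remote
    # host; "set -ex" matches the logged preamble and makes failures loud.
    subprocess.run(["ssh", host, "set -ex\nsudo dd of=%s" % path],
                   input=data, check=True)

# push_file("vm06", "/etc/ceph/ceph.conf", conf.encode())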
2026-03-09T00:00:18.249 INFO:teuthology.orchestra.run.vm03.stdout:/dev/vda 2026-03-09T00:00:18.249 INFO:teuthology.orchestra.run.vm03.stdout:/dev/vdb 2026-03-09T00:00:18.249 INFO:teuthology.orchestra.run.vm03.stdout:/dev/vdc 2026-03-09T00:00:18.249 INFO:teuthology.orchestra.run.vm03.stdout:/dev/vdd 2026-03-09T00:00:18.249 INFO:teuthology.orchestra.run.vm03.stdout:/dev/vde 2026-03-09T00:00:18.250 WARNING:teuthology.misc:Removing root device: /dev/vda from device list 2026-03-09T00:00:18.250 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde'] 2026-03-09T00:00:18.250 DEBUG:teuthology.orchestra.run.vm03:> stat /dev/vdb 2026-03-09T00:00:18.318 INFO:teuthology.orchestra.run.vm03.stdout: File: /dev/vdb 2026-03-09T00:00:18.318 INFO:teuthology.orchestra.run.vm03.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T00:00:18.318 INFO:teuthology.orchestra.run.vm03.stdout:Device: 6h/6d Inode: 254 Links: 1 Device type: fc,10 2026-03-09T00:00:18.318 INFO:teuthology.orchestra.run.vm03.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T00:00:18.318 INFO:teuthology.orchestra.run.vm03.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T00:00:18.318 INFO:teuthology.orchestra.run.vm03.stdout:Access: 2026-03-08 23:59:29.857962260 +0000 2026-03-09T00:00:18.318 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-08 23:53:55.411000000 +0000 2026-03-09T00:00:18.318 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-08 23:53:55.411000000 +0000 2026-03-09T00:00:18.318 INFO:teuthology.orchestra.run.vm03.stdout: Birth: 2026-03-08 23:53:53.287000000 +0000 2026-03-09T00:00:18.318 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/dev/vdb of=/dev/null count=1 2026-03-09T00:00:18.393 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records in 2026-03-09T00:00:18.393 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records out 2026-03-09T00:00:18.393 INFO:teuthology.orchestra.run.vm03.stderr:512 bytes copied, 0.000159398 s, 3.2 MB/s 2026-03-09T00:00:18.394 DEBUG:teuthology.orchestra.run.vm03:> ! 
mount | grep -v devtmpfs | grep -q /dev/vdb 2026-03-09T00:00:18.456 DEBUG:teuthology.orchestra.run.vm03:> stat /dev/vdc 2026-03-09T00:00:18.522 INFO:teuthology.orchestra.run.vm03.stdout: File: /dev/vdc 2026-03-09T00:00:18.522 INFO:teuthology.orchestra.run.vm03.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T00:00:18.522 INFO:teuthology.orchestra.run.vm03.stdout:Device: 6h/6d Inode: 255 Links: 1 Device type: fc,20 2026-03-09T00:00:18.522 INFO:teuthology.orchestra.run.vm03.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T00:00:18.522 INFO:teuthology.orchestra.run.vm03.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T00:00:18.523 INFO:teuthology.orchestra.run.vm03.stdout:Access: 2026-03-08 23:59:29.916962308 +0000 2026-03-09T00:00:18.523 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-08 23:53:55.417000000 +0000 2026-03-09T00:00:18.523 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-08 23:53:55.417000000 +0000 2026-03-09T00:00:18.523 INFO:teuthology.orchestra.run.vm03.stdout: Birth: 2026-03-08 23:53:53.291000000 +0000 2026-03-09T00:00:18.523 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/dev/vdc of=/dev/null count=1 2026-03-09T00:00:18.591 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records in 2026-03-09T00:00:18.591 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records out 2026-03-09T00:00:18.591 INFO:teuthology.orchestra.run.vm03.stderr:512 bytes copied, 0.000118151 s, 4.3 MB/s 2026-03-09T00:00:18.592 DEBUG:teuthology.orchestra.run.vm03:> ! mount | grep -v devtmpfs | grep -q /dev/vdc 2026-03-09T00:00:18.651 DEBUG:teuthology.orchestra.run.vm03:> stat /dev/vdd 2026-03-09T00:00:18.709 INFO:teuthology.orchestra.run.vm03.stdout: File: /dev/vdd 2026-03-09T00:00:18.709 INFO:teuthology.orchestra.run.vm03.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T00:00:18.709 INFO:teuthology.orchestra.run.vm03.stdout:Device: 6h/6d Inode: 256 Links: 1 Device type: fc,30 2026-03-09T00:00:18.709 INFO:teuthology.orchestra.run.vm03.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T00:00:18.709 INFO:teuthology.orchestra.run.vm03.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T00:00:18.709 INFO:teuthology.orchestra.run.vm03.stdout:Access: 2026-03-08 23:59:29.985962364 +0000 2026-03-09T00:00:18.709 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-08 23:53:55.446000000 +0000 2026-03-09T00:00:18.709 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-08 23:53:55.446000000 +0000 2026-03-09T00:00:18.709 INFO:teuthology.orchestra.run.vm03.stdout: Birth: 2026-03-08 23:53:53.295000000 +0000 2026-03-09T00:00:18.709 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/dev/vdd of=/dev/null count=1 2026-03-09T00:00:18.806 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records in 2026-03-09T00:00:18.806 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records out 2026-03-09T00:00:18.806 INFO:teuthology.orchestra.run.vm03.stderr:512 bytes copied, 0.000228198 s, 2.2 MB/s 2026-03-09T00:00:18.807 DEBUG:teuthology.orchestra.run.vm03:> ! 
mount | grep -v devtmpfs | grep -q /dev/vdd 2026-03-09T00:00:18.853 DEBUG:teuthology.orchestra.run.vm03:> stat /dev/vde 2026-03-09T00:00:18.894 INFO:teuthology.orchestra.run.vm03.stdout: File: /dev/vde 2026-03-09T00:00:18.894 INFO:teuthology.orchestra.run.vm03.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T00:00:18.894 INFO:teuthology.orchestra.run.vm03.stdout:Device: 6h/6d Inode: 257 Links: 1 Device type: fc,40 2026-03-09T00:00:18.894 INFO:teuthology.orchestra.run.vm03.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T00:00:18.894 INFO:teuthology.orchestra.run.vm03.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T00:00:18.894 INFO:teuthology.orchestra.run.vm03.stdout:Access: 2026-03-08 23:59:30.052962419 +0000 2026-03-09T00:00:18.894 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-08 23:53:55.402000000 +0000 2026-03-09T00:00:18.894 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-08 23:53:55.402000000 +0000 2026-03-09T00:00:18.894 INFO:teuthology.orchestra.run.vm03.stdout: Birth: 2026-03-08 23:53:53.326000000 +0000 2026-03-09T00:00:18.894 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/dev/vde of=/dev/null count=1 2026-03-09T00:00:18.974 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records in 2026-03-09T00:00:18.974 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records out 2026-03-09T00:00:18.974 INFO:teuthology.orchestra.run.vm03.stderr:512 bytes copied, 0.000270436 s, 1.9 MB/s 2026-03-09T00:00:18.977 DEBUG:teuthology.orchestra.run.vm03:> ! mount | grep -v devtmpfs | grep -q /dev/vde 2026-03-09T00:00:19.081 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-09T00:00:19.081 DEBUG:teuthology.orchestra.run.vm06:> dd if=/scratch_devs of=/dev/stdout 2026-03-09T00:00:19.090 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:18 vm03 ceph-mon[52346]: Updating vm03:/etc/ceph/ceph.conf 2026-03-09T00:00:19.090 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:18 vm03 ceph-mon[52346]: Updating vm06:/etc/ceph/ceph.conf 2026-03-09T00:00:19.090 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:18 vm03 ceph-mon[52346]: from='client.? 
192.168.123.103:0/484352416' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:00:19.090 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:18 vm03 ceph-mon[52346]: Updating vm06:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf 2026-03-09T00:00:19.090 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:18 vm03 ceph-mon[52346]: Updating vm03:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf 2026-03-09T00:00:19.090 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:18 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:19.090 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:18 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:19.090 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:18 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:19.090 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:18 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:19.090 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:18 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:19.090 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:18 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T00:00:19.090 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:18 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T00:00:19.090 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:18 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:00:19.098 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T00:00:19.098 DEBUG:teuthology.orchestra.run.vm06:> ls /dev/[sv]d? 2026-03-09T00:00:19.155 INFO:teuthology.orchestra.run.vm06.stdout:/dev/vda 2026-03-09T00:00:19.156 INFO:teuthology.orchestra.run.vm06.stdout:/dev/vdb 2026-03-09T00:00:19.156 INFO:teuthology.orchestra.run.vm06.stdout:/dev/vdc 2026-03-09T00:00:19.156 INFO:teuthology.orchestra.run.vm06.stdout:/dev/vdd 2026-03-09T00:00:19.156 INFO:teuthology.orchestra.run.vm06.stdout:/dev/vde 2026-03-09T00:00:19.156 WARNING:teuthology.misc:Removing root device: /dev/vda from device list 2026-03-09T00:00:19.156 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde'] 2026-03-09T00:00:19.156 DEBUG:teuthology.orchestra.run.vm06:> stat /dev/vdb 2026-03-09T00:00:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:18 vm06 ceph-mon[58395]: Updating vm03:/etc/ceph/ceph.conf 2026-03-09T00:00:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:18 vm06 ceph-mon[58395]: Updating vm06:/etc/ceph/ceph.conf 2026-03-09T00:00:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:18 vm06 ceph-mon[58395]: from='client.? 
192.168.123.103:0/484352416' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:00:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:18 vm06 ceph-mon[58395]: Updating vm06:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf 2026-03-09T00:00:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:18 vm06 ceph-mon[58395]: Updating vm03:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf 2026-03-09T00:00:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:18 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:18 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:18 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:18 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:18 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:18 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T00:00:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:18 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T00:00:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:18 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:00:19.187 INFO:teuthology.orchestra.run.vm06.stdout: File: /dev/vdb 2026-03-09T00:00:19.187 INFO:teuthology.orchestra.run.vm06.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T00:00:19.187 INFO:teuthology.orchestra.run.vm06.stdout:Device: 6h/6d Inode: 254 Links: 1 Device type: fc,10 2026-03-09T00:00:19.187 INFO:teuthology.orchestra.run.vm06.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T00:00:19.187 INFO:teuthology.orchestra.run.vm06.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T00:00:19.187 INFO:teuthology.orchestra.run.vm06.stdout:Access: 2026-03-09 00:00:02.423064059 +0000 2026-03-09T00:00:19.187 INFO:teuthology.orchestra.run.vm06.stdout:Modify: 2026-03-08 23:54:26.099000000 +0000 2026-03-09T00:00:19.187 INFO:teuthology.orchestra.run.vm06.stdout:Change: 2026-03-08 23:54:26.099000000 +0000 2026-03-09T00:00:19.187 INFO:teuthology.orchestra.run.vm06.stdout: Birth: 2026-03-08 23:54:24.282000000 +0000 2026-03-09T00:00:19.187 DEBUG:teuthology.orchestra.run.vm06:> sudo dd if=/dev/vdb of=/dev/null count=1 2026-03-09T00:00:19.250 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records in 2026-03-09T00:00:19.250 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records out 2026-03-09T00:00:19.250 INFO:teuthology.orchestra.run.vm06.stderr:512 bytes copied, 0.000154749 s, 3.3 MB/s 2026-03-09T00:00:19.251 DEBUG:teuthology.orchestra.run.vm06:> ! 
mount | grep -v devtmpfs | grep -q /dev/vdb 2026-03-09T00:00:19.308 DEBUG:teuthology.orchestra.run.vm06:> stat /dev/vdc 2026-03-09T00:00:19.365 INFO:teuthology.orchestra.run.vm06.stdout: File: /dev/vdc 2026-03-09T00:00:19.365 INFO:teuthology.orchestra.run.vm06.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T00:00:19.365 INFO:teuthology.orchestra.run.vm06.stdout:Device: 6h/6d Inode: 255 Links: 1 Device type: fc,20 2026-03-09T00:00:19.365 INFO:teuthology.orchestra.run.vm06.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T00:00:19.365 INFO:teuthology.orchestra.run.vm06.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T00:00:19.365 INFO:teuthology.orchestra.run.vm06.stdout:Access: 2026-03-09 00:00:02.494064115 +0000 2026-03-09T00:00:19.365 INFO:teuthology.orchestra.run.vm06.stdout:Modify: 2026-03-08 23:54:26.096000000 +0000 2026-03-09T00:00:19.365 INFO:teuthology.orchestra.run.vm06.stdout:Change: 2026-03-08 23:54:26.096000000 +0000 2026-03-09T00:00:19.365 INFO:teuthology.orchestra.run.vm06.stdout: Birth: 2026-03-08 23:54:24.286000000 +0000 2026-03-09T00:00:19.365 DEBUG:teuthology.orchestra.run.vm06:> sudo dd if=/dev/vdc of=/dev/null count=1 2026-03-09T00:00:19.428 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records in 2026-03-09T00:00:19.428 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records out 2026-03-09T00:00:19.428 INFO:teuthology.orchestra.run.vm06.stderr:512 bytes copied, 0.000173524 s, 3.0 MB/s 2026-03-09T00:00:19.429 DEBUG:teuthology.orchestra.run.vm06:> ! mount | grep -v devtmpfs | grep -q /dev/vdc 2026-03-09T00:00:19.486 DEBUG:teuthology.orchestra.run.vm06:> stat /dev/vdd 2026-03-09T00:00:19.542 INFO:teuthology.orchestra.run.vm06.stdout: File: /dev/vdd 2026-03-09T00:00:19.542 INFO:teuthology.orchestra.run.vm06.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T00:00:19.542 INFO:teuthology.orchestra.run.vm06.stdout:Device: 6h/6d Inode: 256 Links: 1 Device type: fc,30 2026-03-09T00:00:19.542 INFO:teuthology.orchestra.run.vm06.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T00:00:19.542 INFO:teuthology.orchestra.run.vm06.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T00:00:19.542 INFO:teuthology.orchestra.run.vm06.stdout:Access: 2026-03-09 00:00:02.562064168 +0000 2026-03-09T00:00:19.542 INFO:teuthology.orchestra.run.vm06.stdout:Modify: 2026-03-08 23:54:26.099000000 +0000 2026-03-09T00:00:19.542 INFO:teuthology.orchestra.run.vm06.stdout:Change: 2026-03-08 23:54:26.099000000 +0000 2026-03-09T00:00:19.542 INFO:teuthology.orchestra.run.vm06.stdout: Birth: 2026-03-08 23:54:24.291000000 +0000 2026-03-09T00:00:19.542 DEBUG:teuthology.orchestra.run.vm06:> sudo dd if=/dev/vdd of=/dev/null count=1 2026-03-09T00:00:19.604 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records in 2026-03-09T00:00:19.604 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records out 2026-03-09T00:00:19.604 INFO:teuthology.orchestra.run.vm06.stderr:512 bytes copied, 0.000184285 s, 2.8 MB/s 2026-03-09T00:00:19.605 DEBUG:teuthology.orchestra.run.vm06:> ! 
mount | grep -v devtmpfs | grep -q /dev/vdd 2026-03-09T00:00:19.661 DEBUG:teuthology.orchestra.run.vm06:> stat /dev/vde 2026-03-09T00:00:19.722 INFO:teuthology.orchestra.run.vm06.stdout: File: /dev/vde 2026-03-09T00:00:19.722 INFO:teuthology.orchestra.run.vm06.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file 2026-03-09T00:00:19.722 INFO:teuthology.orchestra.run.vm06.stdout:Device: 6h/6d Inode: 257 Links: 1 Device type: fc,40 2026-03-09T00:00:19.722 INFO:teuthology.orchestra.run.vm06.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk) 2026-03-09T00:00:19.722 INFO:teuthology.orchestra.run.vm06.stdout:Context: system_u:object_r:fixed_disk_device_t:s0 2026-03-09T00:00:19.722 INFO:teuthology.orchestra.run.vm06.stdout:Access: 2026-03-09 00:00:02.619064213 +0000 2026-03-09T00:00:19.722 INFO:teuthology.orchestra.run.vm06.stdout:Modify: 2026-03-08 23:54:26.100000000 +0000 2026-03-09T00:00:19.722 INFO:teuthology.orchestra.run.vm06.stdout:Change: 2026-03-08 23:54:26.100000000 +0000 2026-03-09T00:00:19.722 INFO:teuthology.orchestra.run.vm06.stdout: Birth: 2026-03-08 23:54:24.303000000 +0000 2026-03-09T00:00:19.722 DEBUG:teuthology.orchestra.run.vm06:> sudo dd if=/dev/vde of=/dev/null count=1 2026-03-09T00:00:19.787 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records in 2026-03-09T00:00:19.787 INFO:teuthology.orchestra.run.vm06.stderr:1+0 records out 2026-03-09T00:00:19.787 INFO:teuthology.orchestra.run.vm06.stderr:512 bytes copied, 0.000238947 s, 2.1 MB/s 2026-03-09T00:00:19.788 DEBUG:teuthology.orchestra.run.vm06:> ! mount | grep -v devtmpfs | grep -q /dev/vde 2026-03-09T00:00:19.845 INFO:tasks.cephadm:Deploying osd.0 on vm03 with /dev/vde... 2026-03-09T00:00:19.845 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- lvm zap /dev/vde 2026-03-09T00:00:19.889 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:19 vm03 ceph-mon[52346]: Reconfiguring mon.vm03 (unknown last config time)... 2026-03-09T00:00:19.889 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:19 vm03 ceph-mon[52346]: Reconfiguring daemon mon.vm03 on vm03 2026-03-09T00:00:19.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:19 vm06 ceph-mon[58395]: Reconfiguring mon.vm03 (unknown last config time)... 2026-03-09T00:00:19.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:19 vm06 ceph-mon[58395]: Reconfiguring daemon mon.vm03 on vm03 2026-03-09T00:00:19.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:19 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:19.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:19 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:19.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:19 vm06 ceph-mon[58395]: Reconfiguring mgr.vm03.yvcons (unknown last config time)... 
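The probe sequence above ran identically on vm03 and vm06: read `/scratch_devs` if present (it failed here with rc 1), fall back to `ls /dev/[sv]d?`, drop the root device, then accept each remaining device only if it stats cleanly, yields one 512-byte block to `dd`, and is not mounted. A minimal local sketch of that filter; the `run` helper and the hardcoded `/dev/vda` root device are assumptions standing in for teuthology's remote execution and root-device detection:

import subprocess

def run(cmd):
    # Stand-in for remote execution: run a shell command, return (rc, stdout).
    p = subprocess.run(["bash", "-c", cmd], capture_output=True, text=True)
    return p.returncode, p.stdout

def usable_scratch_devs():
    devs = run("ls /dev/[sv]d?")[1].split()
    devs = [d for d in devs if d != "/dev/vda"]  # assumed root device
    ok = []
    for dev in devs:
        if run("stat %s" % dev)[0] != 0:
            continue  # not a usable device node
        if run("sudo dd if=%s of=/dev/null count=1" % dev)[0] != 0:
            continue  # first block must be readable
        if run("! mount | grep -v devtmpfs | grep -q %s" % dev)[0] != 0:
            continue  # must not be mounted anywhere
        ok.append(dev)
    return ok

# On vm03 above this leaves ['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde'].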
2026-03-09T00:00:19.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:19 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.yvcons", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T00:00:19.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:19 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T00:00:19.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:19 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:00:19.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:19 vm06 ceph-mon[58395]: Reconfiguring daemon mgr.vm03.yvcons on vm03 2026-03-09T00:00:19.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:19 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:19.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:19 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:19.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:19 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T00:00:19.891 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:19 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:00:20.025 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:00:20.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:19 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:20.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:19 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:20.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:19 vm03 ceph-mon[52346]: Reconfiguring mgr.vm03.yvcons (unknown last config time)... 
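The dispatched `auth get-or-create` commands above show how cephadm scopes daemon keys: the "caps" array is alternating service/capability pairs (mon "profile mgr", osd "allow *", mds "allow *" for the mgr; read-only caps plus "profile ceph-exporter" for the exporter). The equivalent CLI, wrapped in a sketch whose function name is hypothetical:

import subprocess

def ensure_mgr_key(name):
    # CLI form of the mon command dispatched above; caps are passed as
    # service/capability pairs, matching the logged "caps" array.
    return subprocess.check_output(
        ["ceph", "auth", "get-or-create", "mgr.%s" % name,
         "mon", "profile mgr", "osd", "allow *", "mds", "allow *"]).decode()

# print(ensure_mgr_key("vm03.yvcons"))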
2026-03-09T00:00:20.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:19 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.yvcons", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T00:00:20.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:19 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T00:00:20.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:19 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:00:20.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:19 vm03 ceph-mon[52346]: Reconfiguring daemon mgr.vm03.yvcons on vm03 2026-03-09T00:00:20.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:19 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:20.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:19 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:20.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:19 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T00:00:20.143 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:19 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:00:20.595 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:00:20.607 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph orch daemon add osd vm03:/dev/vde 2026-03-09T00:00:20.799 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:00:21.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.096+0000 7fed7b117700 1 -- 192.168.123.103:0/3375732738 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fed74075a10 msgr2=0x7fed74077ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:00:21.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.096+0000 7fed7b117700 1 --2- 192.168.123.103:0/3375732738 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fed74075a10 0x7fed74077ea0 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7fed6c00b600 tx=0x7fed6c00b910 comp rx=0 tx=0).stop 2026-03-09T00:00:21.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.096+0000 7fed7b117700 1 -- 192.168.123.103:0/3375732738 shutdown_connections 2026-03-09T00:00:21.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.096+0000 7fed7b117700 1 --2- 192.168.123.103:0/3375732738 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fed74075a10 0x7fed74077ea0 unknown :-1 s=CLOSED pgs=165 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:21.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.096+0000 7fed7b117700 1 --2- 
192.168.123.103:0/3375732738 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed74072b20 0x7fed74072f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:21.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.096+0000 7fed7b117700 1 -- 192.168.123.103:0/3375732738 >> 192.168.123.103:0/3375732738 conn(0x7fed7406daa0 msgr2=0x7fed7406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:00:21.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.097+0000 7fed7b117700 1 -- 192.168.123.103:0/3375732738 shutdown_connections 2026-03-09T00:00:21.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.097+0000 7fed7b117700 1 -- 192.168.123.103:0/3375732738 wait complete. 2026-03-09T00:00:21.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.097+0000 7fed7b117700 1 Processor -- start 2026-03-09T00:00:21.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.097+0000 7fed7b117700 1 -- start start 2026-03-09T00:00:21.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.098+0000 7fed7b117700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fed74072b20 0x7fed740831b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:00:21.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.098+0000 7fed7b117700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed740836f0 0x7fed741b3200 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:00:21.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.098+0000 7fed7b117700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fed74083c30 con 0x7fed74072b20 2026-03-09T00:00:21.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.098+0000 7fed7b117700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fed74083da0 con 0x7fed740836f0 2026-03-09T00:00:21.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.098+0000 7fed79914700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed740836f0 0x7fed741b3200 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:00:21.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.098+0000 7fed7a115700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fed74072b20 0x7fed740831b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:00:21.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.098+0000 7fed79914700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed740836f0 0x7fed741b3200 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:38372/0 (socket says 192.168.123.103:38372) 2026-03-09T00:00:21.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.098+0000 7fed79914700 1 -- 192.168.123.103:0/3976225826 learned_addr learned my addr 192.168.123.103:0/3976225826 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:00:21.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.099+0000 7fed79914700 1 -- 
192.168.123.103:0/3976225826 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed740836f0 msgr2=0x7fed741b3200 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_bulk peer close file descriptor 12 2026-03-09T00:00:21.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.099+0000 7fed79914700 1 -- 192.168.123.103:0/3976225826 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed740836f0 msgr2=0x7fed741b3200 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).read_until read failed 2026-03-09T00:00:21.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.099+0000 7fed79914700 1 --2- 192.168.123.103:0/3976225826 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed740836f0 0x7fed741b3200 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_read_frame_preamble_main read frame preamble failed r=-1 2026-03-09T00:00:21.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.099+0000 7fed79914700 1 --2- 192.168.123.103:0/3976225826 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed740836f0 0x7fed741b3200 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-09T00:00:21.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.099+0000 7fed7a115700 1 -- 192.168.123.103:0/3976225826 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed740836f0 msgr2=0x7fed741b3200 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:00:21.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.099+0000 7fed7a115700 1 --2- 192.168.123.103:0/3976225826 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed740836f0 0x7fed741b3200 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:00:21.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.099+0000 7fed7a115700 1 -- 192.168.123.103:0/3976225826 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fed6c00b050 con 0x7fed74072b20 2026-03-09T00:00:21.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.099+0000 7fed7a115700 1 --2- 192.168.123.103:0/3976225826 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fed74072b20 0x7fed740831b0 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7fed7000b810 tx=0x7fed7000bbd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:00:21.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.100+0000 7fed6b7fe700 1 -- 192.168.123.103:0/3976225826 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fed7000d690 con 0x7fed74072b20 2026-03-09T00:00:21.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.100+0000 7fed7b117700 1 -- 192.168.123.103:0/3976225826 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fed741b3800 con 0x7fed74072b20 2026-03-09T00:00:21.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.100+0000 7fed7b117700 1 -- 192.168.123.103:0/3976225826 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fed741b3d00 con 0x7fed74072b20 2026-03-09T00:00:21.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.101+0000 7fed6b7fe700 1 -- 192.168.123.103:0/3976225826 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 
keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fed7000dcd0 con 0x7fed74072b20 2026-03-09T00:00:21.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.101+0000 7fed6b7fe700 1 -- 192.168.123.103:0/3976225826 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fed7000c410 con 0x7fed74072b20 2026-03-09T00:00:21.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.102+0000 7fed6b7fe700 1 -- 192.168.123.103:0/3976225826 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 90252+0+0 (secure 0 0 0) 0x7fed7000c570 con 0x7fed74072b20 2026-03-09T00:00:21.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.103+0000 7fed6b7fe700 1 --2- 192.168.123.103:0/3976225826 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fed6006e7a0 0x7fed60070c60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:00:21.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.103+0000 7fed6b7fe700 1 -- 192.168.123.103:0/3976225826 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fed70011070 con 0x7fed74072b20 2026-03-09T00:00:21.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.105+0000 7fed79914700 1 --2- 192.168.123.103:0/3976225826 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fed6006e7a0 0x7fed60070c60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:00:21.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.105+0000 7fed79914700 1 --2- 192.168.123.103:0/3976225826 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fed6006e7a0 0x7fed60070c60 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fed6c007ea0 tx=0x7fed6c00bd10 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:00:21.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.106+0000 7fed7b117700 1 -- 192.168.123.103:0/3976225826 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fed58005320 con 0x7fed74072b20 2026-03-09T00:00:21.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.109+0000 7fed6b7fe700 1 -- 192.168.123.103:0/3976225826 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fed7005a4c0 con 0x7fed74072b20 2026-03-09T00:00:21.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:21.243+0000 7fed7b117700 1 -- 192.168.123.103:0/3976225826 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7fed58000bf0 con 0x7fed6006e7a0 2026-03-09T00:00:21.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:21 vm03 ceph-mon[52346]: Reconfiguring ceph-exporter.vm03 (monmap changed)... 
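Putting the OSD deployment together: the harness first zaps the chosen device with ceph-volume, then hands it to the orchestrator, which is what surfaces above as the mgr_command {"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vde"}. A minimal sketch with the image, fsid and paths copied from this run:

import subprocess

CEPHADM = ["sudo", "/home/ubuntu/cephtest/cephadm",
           "--image", "quay.io/ceph/ceph:v18.2.1"]
KEYS = ["-c", "/etc/ceph/ceph.conf",
        "-k", "/etc/ceph/ceph.client.admin.keyring",
        "--fsid", "ae8f0172-1b4a-11f1-916a-712b2ac006b7"]

def deploy_osd(host, dev):
    # 1) wipe any previous LVM/partition state on the device
    subprocess.run(CEPHADM + ["ceph-volume"] + KEYS +
                   ["--", "lvm", "zap", dev], check=True)
    # 2) ask the orchestrator to create an OSD on it
    subprocess.run(CEPHADM + ["shell"] + KEYS +
                   ["--", "ceph", "orch", "daemon", "add", "osd",
                    "%s:%s" % (host, dev)], check=True)

# deploy_osd("vm03", "/dev/vde")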
2026-03-09T00:00:21.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:21 vm03 ceph-mon[52346]: Reconfiguring daemon ceph-exporter.vm03 on vm03
2026-03-09T00:00:21.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:21 vm03 ceph-mon[52346]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T00:00:21.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:21 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:21.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:21 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:21.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:21 vm03 ceph-mon[52346]: Reconfiguring crash.vm03 (monmap changed)...
2026-03-09T00:00:21.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:21 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T00:00:21.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:21 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:21.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:21 vm03 ceph-mon[52346]: Reconfiguring daemon crash.vm03 on vm03
2026-03-09T00:00:21.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:21 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:21.360 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:21 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:21.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:21 vm06 ceph-mon[58395]: Reconfiguring ceph-exporter.vm03 (monmap changed)...
2026-03-09T00:00:21.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:21 vm06 ceph-mon[58395]: Reconfiguring daemon ceph-exporter.vm03 on vm03
2026-03-09T00:00:21.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:21 vm06 ceph-mon[58395]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T00:00:21.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:21 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:21.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:21 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:21.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:21 vm06 ceph-mon[58395]: Reconfiguring crash.vm03 (monmap changed)...
2026-03-09T00:00:21.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:21 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T00:00:21.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:21 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:21.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:21 vm06 ceph-mon[58395]: Reconfiguring daemon crash.vm03 on vm03
2026-03-09T00:00:21.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:21 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:21.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:21 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:22.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:22 vm03 ceph-mon[52346]: Reconfiguring alertmanager.vm03 (dependencies changed)...
2026-03-09T00:00:22.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:22 vm03 ceph-mon[52346]: Reconfiguring daemon alertmanager.vm03 on vm03
2026-03-09T00:00:22.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:22 vm03 ceph-mon[52346]: from='client.14262 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:00:22.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:22 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-09T00:00:22.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:22 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-09T00:00:22.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:22 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:22.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:22 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:22.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:22 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:22.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:22 vm06 ceph-mon[58395]: Reconfiguring alertmanager.vm03 (dependencies changed)...
2026-03-09T00:00:22.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:22 vm06 ceph-mon[58395]: Reconfiguring daemon alertmanager.vm03 on vm03
2026-03-09T00:00:22.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:22 vm06 ceph-mon[58395]: from='client.14262 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:00:22.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:22 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-09T00:00:22.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:22 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-09T00:00:22.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:22 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:22.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:22 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:22.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:22 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:23.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:23 vm03 ceph-mon[52346]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T00:00:23.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:23 vm03 ceph-mon[52346]: Reconfiguring grafana.vm03 (dependencies changed)...
2026-03-09T00:00:23.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:23 vm03 ceph-mon[52346]: Reconfiguring daemon grafana.vm03 on vm03
2026-03-09T00:00:23.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:23 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/4100103365' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "1eefdd28-e5a7-4e98-a454-60c0bb654070"}]: dispatch
2026-03-09T00:00:23.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:23 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/4100103365' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "1eefdd28-e5a7-4e98-a454-60c0bb654070"}]': finished
2026-03-09T00:00:23.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:23 vm03 ceph-mon[52346]: osdmap e6: 1 total, 0 up, 1 in
2026-03-09T00:00:23.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:23 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T00:00:23.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:23 vm06 ceph-mon[58395]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T00:00:23.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:23 vm06 ceph-mon[58395]: Reconfiguring grafana.vm03 (dependencies changed)...
2026-03-09T00:00:23.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:23 vm06 ceph-mon[58395]: Reconfiguring daemon grafana.vm03 on vm03
2026-03-09T00:00:23.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:23 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/4100103365' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "1eefdd28-e5a7-4e98-a454-60c0bb654070"}]: dispatch
2026-03-09T00:00:23.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:23 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/4100103365' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "1eefdd28-e5a7-4e98-a454-60c0bb654070"}]': finished
2026-03-09T00:00:23.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:23 vm06 ceph-mon[58395]: osdmap e6: 1 total, 0 up, 1 in
2026-03-09T00:00:23.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:23 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T00:00:24.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:24 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/1219770330' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-09T00:00:24.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:24 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/1219770330' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-09T00:00:26.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:25 vm03 ceph-mon[52346]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T00:00:26.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:25 vm06 ceph-mon[58395]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T00:00:26.800 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:26 vm03 ceph-mon[52346]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T00:00:27.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:26 vm06 ceph-mon[58395]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T00:00:28.474 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:28 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:28.474 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:28 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:28.474 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:28 vm03 ceph-mon[52346]: Reconfiguring prometheus.vm03 (dependencies changed)...
2026-03-09T00:00:28.474 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:28 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-09T00:00:28.474 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:28 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:28.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:28 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:28.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:28 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:28.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:28 vm06 ceph-mon[58395]: Reconfiguring prometheus.vm03 (dependencies changed)...
2026-03-09T00:00:28.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:28 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-09T00:00:28.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:28 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:29.393 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:29 vm06 ceph-mon[58395]: Reconfiguring daemon prometheus.vm03 on vm03
2026-03-09T00:00:29.582 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:29 vm03 ceph-mon[52346]: Reconfiguring daemon prometheus.vm03 on vm03
2026-03-09T00:00:29.582 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:29 vm03 ceph-mon[52346]: Deploying daemon osd.0 on vm03
2026-03-09T00:00:29.582 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:29 vm03 ceph-mon[52346]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T00:00:29.582 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:29 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:29.582 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:29 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:29.582 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:29 vm03 ceph-mon[52346]: Reconfiguring ceph-exporter.vm06 (monmap changed)...
2026-03-09T00:00:29.582 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:29 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T00:00:29.583 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:29 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:29.583 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:29 vm03 ceph-mon[52346]: Reconfiguring daemon ceph-exporter.vm06 on vm06
2026-03-09T00:00:29.583 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:29 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:29.583 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:29 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:29.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:29 vm06 ceph-mon[58395]: Deploying daemon osd.0 on vm03
2026-03-09T00:00:29.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:29 vm06 ceph-mon[58395]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T00:00:29.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:29 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:29.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:29 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:29.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:29 vm06 ceph-mon[58395]: Reconfiguring ceph-exporter.vm06 (monmap changed)...
2026-03-09T00:00:29.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:29 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T00:00:29.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:29 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:29.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:29 vm06 ceph-mon[58395]: Reconfiguring daemon ceph-exporter.vm06 on vm06
2026-03-09T00:00:29.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:29 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:29.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:29 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: Reconfiguring crash.vm06 (monmap changed)...
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: Reconfiguring daemon crash.vm06 on vm06
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: Reconfiguring mgr.vm06.rzcvhn (monmap changed)...
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.rzcvhn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: Reconfiguring daemon mgr.vm06.rzcvhn on vm06
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: Reconfiguring mon.vm06 (monmap changed)...
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:30.398 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:30 vm06 ceph-mon[58395]: Reconfiguring daemon mon.vm06 on vm06
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: Reconfiguring crash.vm06 (monmap changed)...
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: Reconfiguring daemon crash.vm06 on vm06
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: Reconfiguring mgr.vm06.rzcvhn (monmap changed)...
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.rzcvhn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: Reconfiguring daemon mgr.vm06.rzcvhn on vm06
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: Reconfiguring mon.vm06 (monmap changed)...
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:30.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:30 vm03 ceph-mon[52346]: Reconfiguring daemon mon.vm06 on vm06
2026-03-09T00:00:31.778 INFO:teuthology.orchestra.run.vm03.stdout:Created osd(s) 0 on host 'vm03'
2026-03-09T00:00:31.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:31.774+0000 7fed6b7fe700 1 -- 192.168.123.103:0/3976225826 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7fed58000bf0 con 0x7fed6006e7a0
2026-03-09T00:00:31.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:31.776+0000 7fed7b117700 1 -- 192.168.123.103:0/3976225826 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fed6006e7a0 msgr2=0x7fed60070c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:31.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:31.776+0000 7fed7b117700 1 --2- 192.168.123.103:0/3976225826 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fed6006e7a0 0x7fed60070c60 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fed6c007ea0 tx=0x7fed6c00bd10 comp rx=0 tx=0).stop
2026-03-09T00:00:31.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:31.776+0000 7fed7b117700 1 -- 192.168.123.103:0/3976225826 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fed74072b20 msgr2=0x7fed740831b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:31.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:31.776+0000 7fed7b117700 1 --2- 192.168.123.103:0/3976225826 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fed74072b20 0x7fed740831b0 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7fed7000b810 tx=0x7fed7000bbd0 comp rx=0 tx=0).stop
2026-03-09T00:00:31.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:31.778+0000 7fed7b117700 1 -- 192.168.123.103:0/3976225826 shutdown_connections
2026-03-09T00:00:31.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:31.778+0000 7fed7b117700 1 --2- 192.168.123.103:0/3976225826 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fed6006e7a0 0x7fed60070c60 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:31.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:31.778+0000 7fed7b117700 1 --2- 192.168.123.103:0/3976225826 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fed74072b20 0x7fed740831b0 unknown :-1 s=CLOSED pgs=166 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:31.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:31.778+0000 7fed7b117700 1 --2- 192.168.123.103:0/3976225826 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fed740836f0 0x7fed741b3200 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:31.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:31.778+0000 7fed7b117700 1 -- 192.168.123.103:0/3976225826 >> 192.168.123.103:0/3976225826 conn(0x7fed7406daa0 msgr2=0x7fed7406ff00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:00:31.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:31.778+0000 7fed7b117700 1 -- 192.168.123.103:0/3976225826 shutdown_connections
2026-03-09T00:00:31.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:31.779+0000 7fed7b117700 1 -- 192.168.123.103:0/3976225826 wait complete.
2026-03-09T00:00:31.779 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:31.779 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm03.local:9093"}]: dispatch
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm03.local:9093"}]: dispatch
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm03.local:3000"}]: dispatch
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm03.local:3000"}]: dispatch
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm03.local:9095"}]: dispatch
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm03.local:9095"}]: dispatch
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:31.780 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:31 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:00:31.836 DEBUG:teuthology.orchestra.run.vm03:osd.0> sudo journalctl -f -n 0 -u ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.0.service
2026-03-09T00:00:31.837 INFO:tasks.cephadm:Deploying osd.1 on vm03 with /dev/vdd...
2026-03-09T00:00:31.838 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- lvm zap /dev/vdd
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm03.local:9093"}]: dispatch
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm03.local:9093"}]: dispatch
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm03.local:3000"}]: dispatch
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm03.local:3000"}]: dispatch
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm03.local:9095"}]: dispatch
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm03.local:9095"}]: dispatch
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:31 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:00:32.107 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:00:32.585 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:00:32 vm03 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0[69755]: 2026-03-09T00:00:32.434+0000 7f256a233640 -1 osd.0 0 log_to_monitors true
2026-03-09T00:00:32.760 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:00:32.772 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph orch daemon add osd vm03:/dev/vdd
2026-03-09T00:00:32.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:32 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:32.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:32 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:32.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:32 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:32.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:32 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:32.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:32 vm03 ceph-mon[52346]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T00:00:32.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:32 vm03 ceph-mon[52346]: from='osd.0 [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
2026-03-09T00:00:32.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:32 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:32.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:32 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:32.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:32 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:32.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:32 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:32.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:32 vm06 ceph-mon[58395]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T00:00:32.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:32 vm06 ceph-mon[58395]: from='osd.0 [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
2026-03-09T00:00:32.948 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:00:33.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.254+0000 7f174a776700 1 -- 192.168.123.103:0/2151496777 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f174410a700 msgr2=0x7f174410cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:33.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.254+0000 7f174a776700 1 --2- 192.168.123.103:0/2151496777 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f174410a700 0x7f174410cb90 secure :-1 s=READY pgs=173 cs=0 l=1 rev1=1 crypto rx=0x7f1734009b00 tx=0x7f1734009e10 comp rx=0 tx=0).stop
2026-03-09T00:00:33.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.254+0000 7f174a776700 1 -- 192.168.123.103:0/2151496777 shutdown_connections
2026-03-09T00:00:33.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.254+0000 7f174a776700 1 --2- 192.168.123.103:0/2151496777 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f174410a700 0x7f174410cb90 secure :-1 s=CLOSED pgs=173 cs=0 l=1 rev1=1 crypto rx=0x7f1734009b00 tx=0x7f1734009e10 comp rx=0 tx=0).stop
2026-03-09T00:00:33.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.254+0000 7f174a776700 1 --2- 192.168.123.103:0/2151496777 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1744107d90 0x7f174410a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:33.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.254+0000 7f174a776700 1 -- 192.168.123.103:0/2151496777 >> 192.168.123.103:0/2151496777 conn(0x7f174406daa0 msgr2=0x7f174406ff00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:00:33.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.255+0000 7f174a776700 1 -- 192.168.123.103:0/2151496777 shutdown_connections
2026-03-09T00:00:33.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.255+0000 7f174a776700 1 -- 192.168.123.103:0/2151496777 wait complete.
2026-03-09T00:00:33.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.256+0000 7f174a776700 1 Processor -- start
2026-03-09T00:00:33.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.256+0000 7f174a776700 1 -- start start
2026-03-09T00:00:33.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.256+0000 7f174a776700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1744107d90 0x7f1744116c50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:00:33.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.256+0000 7f174a776700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1744117190 0x7f1744076fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:00:33.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.256+0000 7f174a776700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f17441176a0 con 0x7f1744117190
2026-03-09T00:00:33.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.256+0000 7f174a776700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1744117810 con 0x7f1744107d90
2026-03-09T00:00:33.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.256+0000 7f1748f73700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1744117190 0x7f1744076fe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:00:33.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.256+0000 7f1748f73700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1744117190 0x7f1744076fe0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:40138/0 (socket says 192.168.123.103:40138)
2026-03-09T00:00:33.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.256+0000 7f1748f73700 1 -- 192.168.123.103:0/3979348676 learned_addr learned my addr 192.168.123.103:0/3979348676 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:00:33.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.257+0000 7f1748f73700 1 -- 192.168.123.103:0/3979348676 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1744107d90 msgr2=0x7f1744116c50 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-09T00:00:33.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.257+0000 7f1748f73700 1 --2- 192.168.123.103:0/3979348676 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1744107d90 0x7f1744116c50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:33.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.257+0000 7f1748f73700 1 -- 192.168.123.103:0/3979348676 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f17340097e0 con 0x7f1744117190
2026-03-09T00:00:33.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.257+0000 7f1748f73700 1 --2- 192.168.123.103:0/3979348676 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1744117190 0x7f1744076fe0 secure :-1 s=READY pgs=174 cs=0 l=1 rev1=1 crypto rx=0x7f1734000c00 tx=0x7f1734004a40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:00:33.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.257+0000 7f173a7fc700 1 -- 192.168.123.103:0/3979348676 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f173401d070 con 0x7f1744117190
2026-03-09T00:00:33.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.257+0000 7f173a7fc700 1 -- 192.168.123.103:0/3979348676 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f173400bc50 con 0x7f1744117190
2026-03-09T00:00:33.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.257+0000 7f174a776700 1 -- 192.168.123.103:0/3979348676 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1744077520 con 0x7f1744117190
2026-03-09T00:00:33.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.259+0000 7f173a7fc700 1 -- 192.168.123.103:0/3979348676 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f173400f7c0 con 0x7f1744117190
2026-03-09T00:00:33.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.259+0000 7f173a7fc700 1 -- 192.168.123.103:0/3979348676 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 90252+0+0 (secure 0 0 0) 0x7f173400f980 con 0x7f1744117190
2026-03-09T00:00:33.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.259+0000 7f173a7fc700 1 --2- 192.168.123.103:0/3979348676 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f173006c1f0 0x7f173006e6b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:00:33.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.259+0000 7f1749774700 1 --2- 192.168.123.103:0/3979348676 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f173006c1f0 0x7f173006e6b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:00:33.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.259+0000 7f1749774700 1 --2- 192.168.123.103:0/3979348676 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f173006c1f0 0x7f173006e6b0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f17441af320 tx=0x7f1740006d20 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:00:33.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.259+0000 7f174a776700 1 -- 192.168.123.103:0/3979348676 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1744077a10 con 0x7f1744117190
2026-03-09T00:00:33.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.266+0000 7f173a7fc700 1 -- 192.168.123.103:0/3979348676 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(7..7 src has 1..7) v4 ==== 1404+0+0 (secure 0 0 0) 0x7f173400bdc0 con 0x7f1744117190
2026-03-09T00:00:33.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.266+0000 7f174a776700 1 -- 192.168.123.103:0/3979348676 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f174404ea90 con 0x7f1744117190
2026-03-09T00:00:33.272 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.269+0000 7f173a7fc700 1 -- 192.168.123.103:0/3979348676 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f1734027050 con 0x7f1744117190
2026-03-09T00:00:33.386 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:33.385+0000 7f174a776700 1 -- 192.168.123.103:0/3979348676 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7f1744108a20 con 0x7f173006c1f0
2026-03-09T00:00:33.942 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:33 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:33.943 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:33 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:33.943 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:33 vm03 ceph-mon[52346]: from='osd.0 [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
2026-03-09T00:00:33.943 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:33 vm03 ceph-mon[52346]: osdmap e7: 1 total, 0 up, 1 in
2026-03-09T00:00:33.943 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:33 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T00:00:33.943 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:33 vm03 ceph-mon[52346]: from='osd.0 [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch
2026-03-09T00:00:33.943 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:33 vm03 ceph-mon[52346]: from='client.14280 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:00:33.943 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:33 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-09T00:00:33.943 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:33 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-09T00:00:33.943 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:33 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:33.943 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:00:33 vm03 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0[69755]: 2026-03-09T00:00:33.743+0000 7f255f096700 -1 osd.0 0 waiting for initial osdmap
2026-03-09T00:00:33.943 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:00:33 vm03 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0[69755]: 2026-03-09T00:00:33.759+0000 7f2558e87700 -1 osd.0 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-09T00:00:34.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:33 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:33 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:33 vm06 ceph-mon[58395]: from='osd.0 [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
2026-03-09T00:00:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:33 vm06 ceph-mon[58395]: osdmap e7: 1 total, 0 up, 1 in
2026-03-09T00:00:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:33 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T00:00:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:33 vm06 ceph-mon[58395]: from='osd.0 [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch
2026-03-09T00:00:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:33 vm06 ceph-mon[58395]: from='client.14280 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:00:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:33 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-09T00:00:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:33 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-09T00:00:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:33 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: from='osd.0 [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: osdmap e8: 1 total, 0 up, 1 in
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: Detected new or changed devices on vm03
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: pgmap v13: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/1740465463' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "6a8d6e5f-c441-499a-a4bb-8d9bc046a85f"}]: dispatch
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: osd.0 [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834] boot
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/1740465463' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "6a8d6e5f-c441-499a-a4bb-8d9bc046a85f"}]': finished
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: osdmap e9: 2 total, 1 up, 2 in
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T00:00:34.741 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:34 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: from='osd.0 [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: osdmap e8: 1 total, 0 up, 1 in
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: Detected new or changed devices on vm03
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: pgmap v13: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/1740465463' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "6a8d6e5f-c441-499a-a4bb-8d9bc046a85f"}]: dispatch
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: osd.0 [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834] boot
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/1740465463' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "6a8d6e5f-c441-499a-a4bb-8d9bc046a85f"}]': finished
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: osdmap e9: 2 total, 1 up, 2 in
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T00:00:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:34 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T00:00:36.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:35 vm03 ceph-mon[52346]: purged_snaps scrub starts
2026-03-09T00:00:36.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:35 vm03 ceph-mon[52346]: purged_snaps scrub ok
2026-03-09T00:00:36.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:35 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:36.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:35 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:36.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:35 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/437055159' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-09T00:00:36.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:35 vm06 ceph-mon[58395]: purged_snaps scrub starts
2026-03-09T00:00:36.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:35 vm06 ceph-mon[58395]: purged_snaps scrub ok
2026-03-09T00:00:36.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:35 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:36.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:35 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:36.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:35 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/437055159' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-09T00:00:37.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:37 vm03 ceph-mon[52346]: pgmap v15: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T00:00:37.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:37 vm03 ceph-mon[52346]: osdmap e10: 2 total, 1 up, 2 in
2026-03-09T00:00:37.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:37 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T00:00:37.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:37 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:37.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:37 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:37.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:37 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:37.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:37 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:00:37.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:37 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:37.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:37 vm06 ceph-mon[58395]: pgmap v15: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T00:00:37.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:37 vm06 ceph-mon[58395]: osdmap e10: 2 total, 1 up, 2 in
2026-03-09T00:00:37.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:37 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T00:00:37.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:37 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:37.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:37 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:37.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:37 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:37.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:37 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:00:37.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:37 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:39.071 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:39 vm03 ceph-mon[52346]: pgmap v17: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T00:00:39.071 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:39 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
2026-03-09T00:00:39.071 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:39 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:39.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:39 vm06 ceph-mon[58395]: pgmap v17: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T00:00:39.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:39 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
2026-03-09T00:00:39.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:39 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:40.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:40 vm03 ceph-mon[52346]: Deploying daemon osd.1 on vm03
2026-03-09T00:00:40.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:40 vm06 ceph-mon[58395]: Deploying daemon osd.1 on vm03
2026-03-09T00:00:41.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:41 vm03 ceph-mon[52346]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T00:00:41.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:41 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:41.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:41 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:00:41.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:41 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:41.257 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:41 vm06 ceph-mon[58395]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T00:00:41.257 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:41 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:41.257 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:41 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:00:41.257 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:41 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:41.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:41.725+0000 7f173a7fc700 1 -- 192.168.123.103:0/3979348676 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f1744108a20 con 0x7f173006c1f0
2026-03-09T00:00:41.725 INFO:teuthology.orchestra.run.vm03.stdout:Created osd(s) 1 on host 'vm03'
2026-03-09T00:00:41.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:41.727+0000 7f174a776700 1 -- 192.168.123.103:0/3979348676 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f173006c1f0 msgr2=0x7f173006e6b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:41.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:41.727+0000 7f174a776700 1 --2- 192.168.123.103:0/3979348676 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f173006c1f0 0x7f173006e6b0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f17441af320 tx=0x7f1740006d20 comp rx=0 tx=0).stop
2026-03-09T00:00:41.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:41.727+0000 7f174a776700 1 -- 192.168.123.103:0/3979348676 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1744117190 msgr2=0x7f1744076fe0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:41.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:41.727+0000 7f174a776700 1 --2- 192.168.123.103:0/3979348676 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1744117190 0x7f1744076fe0 secure :-1 s=READY pgs=174 cs=0 l=1 rev1=1 crypto rx=0x7f1734000c00 tx=0x7f1734004a40 comp rx=0 tx=0).stop
2026-03-09T00:00:41.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:41.728+0000 7f174a776700 1 -- 192.168.123.103:0/3979348676 shutdown_connections
2026-03-09T00:00:41.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:41.728+0000 7f174a776700 1 --2- 192.168.123.103:0/3979348676 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f173006c1f0 0x7f173006e6b0 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:41.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:41.728+0000 7f174a776700 1 --2- 192.168.123.103:0/3979348676 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1744107d90 0x7f1744116c50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:41.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:41.728+0000 7f174a776700 1 --2- 192.168.123.103:0/3979348676 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1744117190 0x7f1744076fe0 unknown :-1 s=CLOSED pgs=174 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:41.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:41.728+0000 7f174a776700 1 -- 192.168.123.103:0/3979348676 >> 192.168.123.103:0/3979348676 conn(0x7f174406daa0 msgr2=0x7f174406ff00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:00:41.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:41.728+0000 7f174a776700 1 -- 192.168.123.103:0/3979348676 shutdown_connections
2026-03-09T00:00:41.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:41.728+0000 7f174a776700 1 -- 192.168.123.103:0/3979348676 wait complete.
2026-03-09T00:00:41.785 DEBUG:teuthology.orchestra.run.vm03:osd.1> sudo journalctl -f -n 0 -u ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.1.service
2026-03-09T00:00:41.787 INFO:tasks.cephadm:Deploying osd.2 on vm03 with /dev/vdc...
2026-03-09T00:00:41.787 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- lvm zap /dev/vdc
2026-03-09T00:00:42.097 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:00:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:42 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:42 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:42 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:42 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:42 vm03 ceph-mon[52346]: pgmap v19: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T00:00:42.712 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:00:42.736 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph orch daemon add osd vm03:/dev/vdc
2026-03-09T00:00:42.907 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:00:42 vm03 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1[76366]: 2026-03-09T00:00:42.737+0000 7fb68200c640 -1 osd.1 0 log_to_monitors true
2026-03-09T00:00:42.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:42 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:42.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:42 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:42.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:42 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:42.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:42 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:42.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:42 vm06 ceph-mon[58395]: pgmap v19: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T00:00:42.982 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:00:43.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.298+0000 7fe71054d700 1 -- 192.168.123.103:0/2569903386 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7080ffe60 msgr2=0x7fe708100280 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:43.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.298+0000 7fe71054d700 1 --2- 192.168.123.103:0/2569903386 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7080ffe60 0x7fe708100280 secure :-1 s=READY pgs=180 cs=0 l=1 rev1=1 crypto rx=0x7fe704009b00 tx=0x7fe704009e10 comp rx=0 tx=0).stop
2026-03-09T00:00:43.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.299+0000 7fe71054d700 1 -- 192.168.123.103:0/2569903386 shutdown_connections
2026-03-09T00:00:43.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.299+0000 7fe71054d700 1 --2- 192.168.123.103:0/2569903386 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe708100fc0 0x7fe708101440 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:43.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.299+0000 7fe71054d700 1 --2- 192.168.123.103:0/2569903386 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7080ffe60 0x7fe708100280 unknown :-1 s=CLOSED pgs=180 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:43.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.299+0000 7fe71054d700 1 -- 192.168.123.103:0/2569903386 >> 192.168.123.103:0/2569903386 conn(0x7fe7080fb3c0 msgr2=0x7fe7080fd840 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:00:43.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.299+0000 7fe71054d700 1 -- 192.168.123.103:0/2569903386 shutdown_connections
2026-03-09T00:00:43.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.299+0000 7fe71054d700 1 -- 192.168.123.103:0/2569903386 wait complete.
2026-03-09T00:00:43.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.300+0000 7fe71054d700 1 Processor -- start
2026-03-09T00:00:43.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.300+0000 7fe71054d700 1 -- start start
2026-03-09T00:00:43.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.300+0000 7fe71054d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7080ffe60 0x7fe7080689e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:00:43.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.300+0000 7fe71054d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe708100fc0 0x7fe708068f20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:00:43.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.300+0000 7fe71054d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe708071900 con 0x7fe7080ffe60
2026-03-09T00:00:43.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.300+0000 7fe71054d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe708071a40 con 0x7fe708100fc0
2026-03-09T00:00:43.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.300+0000 7fe70dae8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe708100fc0 0x7fe708068f20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:00:43.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.300+0000 7fe70dae8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe708100fc0 0x7fe708068f20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:51568/0 (socket says 192.168.123.103:51568)
2026-03-09T00:00:43.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.300+0000 7fe70dae8700 1 -- 192.168.123.103:0/1453799837 learned_addr learned my addr 192.168.123.103:0/1453799837 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:00:43.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.301+0000 7fe70e2e9700 1 --2- 192.168.123.103:0/1453799837 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7080ffe60 0x7fe7080689e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:00:43.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.301+0000 7fe70dae8700 1 -- 192.168.123.103:0/1453799837 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7080ffe60 msgr2=0x7fe7080689e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:43.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.301+0000 7fe70dae8700 1 --2- 192.168.123.103:0/1453799837 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7080ffe60 0x7fe7080689e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:43.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.301+0000 7fe70dae8700 1 -- 192.168.123.103:0/1453799837 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe7040097e0 con 0x7fe708100fc0
2026-03-09T00:00:43.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.301+0000 7fe70dae8700 1 --2- 192.168.123.103:0/1453799837 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe708100fc0 0x7fe708068f20 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fe6f800d8d0 tx=0x7fe6f800dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:00:43.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.302+0000 7fe6ff7fe700 1 -- 192.168.123.103:0/1453799837 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe6f8009940 con 0x7fe708100fc0
2026-03-09T00:00:43.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.302+0000 7fe6ff7fe700 1 -- 192.168.123.103:0/1453799837 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe6f8010460 con 0x7fe708100fc0
2026-03-09T00:00:43.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.302+0000 7fe6ff7fe700 1 -- 192.168.123.103:0/1453799837 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe6f800f5d0 con 0x7fe708100fc0
2026-03-09T00:00:43.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.302+0000 7fe71054d700 1 -- 192.168.123.103:0/1453799837 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe708071ca0 con 0x7fe708100fc0
2026-03-09T00:00:43.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.302+0000 7fe71054d700 1 -- 192.168.123.103:0/1453799837 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe7080721f0 con 0x7fe708100fc0
2026-03-09T00:00:43.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.303+0000 7fe71054d700 1 -- 192.168.123.103:0/1453799837 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe708066e80 con 0x7fe708100fc0
2026-03-09T00:00:43.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.305+0000 7fe6ff7fe700 1 -- 192.168.123.103:0/1453799837 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 18) v1 ==== 90252+0+0 (secure 0 0 0) 0x7fe6f80105d0 con 0x7fe708100fc0
2026-03-09T00:00:43.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.306+0000 7fe6ff7fe700 1 --2- 192.168.123.103:0/1453799837 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe6f406c580 0x7fe6f406ea40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:00:43.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.306+0000 7fe6ff7fe700 1 -- 192.168.123.103:0/1453799837 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(10..10 src has 1..10) v4 ==== 1915+0+0 (secure 0 0 0) 0x7fe6f808a630 con 0x7fe708100fc0
2026-03-09T00:00:43.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.306+0000 7fe70e2e9700 1 --2- 192.168.123.103:0/1453799837 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe6f406c580 0x7fe6f406ea40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:00:43.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.307+0000 7fe70e2e9700 1 --2- 192.168.123.103:0/1453799837 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe6f406c580 0x7fe6f406ea40 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fe704000c00 tx=0x7fe704005c00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:00:43.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.308+0000 7fe6ff7fe700 1 -- 192.168.123.103:0/1453799837 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fe6f8059530 con 0x7fe708100fc0
2026-03-09T00:00:43.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:43.422+0000 7fe71054d700 1 -- 192.168.123.103:0/1453799837 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdc", "target": ["mon-mgr", ""]}) v1 -- 0x7fe7081059a0 con 0x7fe6f406c580
2026-03-09T00:00:43.696 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:43.696 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:43.696 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:43 vm03 ceph-mon[52346]: from='osd.1 [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
2026-03-09T00:00:43.696 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-09T00:00:43.696 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-09T00:00:43.696 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:43 vm06 ceph-mon[58395]: from='osd.1 [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
2026-03-09T00:00:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-09T00:00:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-09T00:00:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:44.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: from='client.24121 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:00:44.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: from='osd.1 [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
2026-03-09T00:00:44.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: osdmap e11: 2 total, 1 up, 2 in
2026-03-09T00:00:44.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T00:00:44.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: from='osd.1 [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch
2026-03-09T00:00:44.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: Detected new or changed devices on vm03
2026-03-09T00:00:44.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch
2026-03-09T00:00:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:00:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: pgmap v21: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T00:00:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:00:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/3668806223' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "57704364-d509-479a-8dff-0b9f590cc6d0"}]: dispatch
2026-03-09T00:00:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: from='osd.1 [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished
2026-03-09T00:00:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/3668806223' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "57704364-d509-479a-8dff-0b9f590cc6d0"}]': finished
2026-03-09T00:00:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: osdmap e12: 3 total, 1 up, 3 in
2026-03-09T00:00:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T00:00:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T00:00:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:44 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T00:00:44.839 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:00:44 vm03 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1[76366]: 2026-03-09T00:00:44.666+0000 7fb676e6f700 -1 osd.1 0 waiting for initial osdmap
2026-03-09T00:00:44.839 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:00:44 vm03 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1[76366]: 2026-03-09T00:00:44.688+0000 7fb673465700 -1 osd.1 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: from='client.24121 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm03:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: from='osd.1 [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: osdmap e11: 2 total, 1 up, 2 in
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: from='osd.1 [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: Detected new or changed devices on vm03
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: pgmap v21: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/3668806223' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "57704364-d509-479a-8dff-0b9f590cc6d0"}]: dispatch
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: from='osd.1 [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/3668806223' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "57704364-d509-479a-8dff-0b9f590cc6d0"}]': finished
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: osdmap e12: 3 total, 1 up, 3 in
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T00:00:44.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:44 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T00:00:46.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:45 vm06 ceph-mon[58395]: purged_snaps scrub starts
2026-03-09T00:00:46.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:45 vm06 ceph-mon[58395]: purged_snaps scrub ok
2026-03-09T00:00:46.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:46.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:46.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:00:46.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:45 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/2286321750' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-09T00:00:46.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:45 vm06 ceph-mon[58395]: osd.1 [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768] boot
2026-03-09T00:00:46.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:45 vm06 ceph-mon[58395]: osdmap e13: 3 total, 2 up, 3 in
2026-03-09T00:00:46.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T00:00:46.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T00:00:46.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:46.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:46.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:46.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:00:46.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:46.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:45 vm03 ceph-mon[52346]: purged_snaps scrub starts
2026-03-09T00:00:46.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:45 vm03 ceph-mon[52346]: purged_snaps scrub ok
2026-03-09T00:00:46.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:46.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:46.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:00:46.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:45 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/2286321750' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-09T00:00:46.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:45 vm03 ceph-mon[52346]: osd.1 [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768] boot
2026-03-09T00:00:46.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:45 vm03 ceph-mon[52346]: osdmap e13: 3 total, 2 up, 3 in
2026-03-09T00:00:46.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T00:00:46.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T00:00:46.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:46.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:46.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:46.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:00:46.242 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:47.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:46 vm06 ceph-mon[58395]: pgmap v24: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail
2026-03-09T00:00:47.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:46 vm06 ceph-mon[58395]: osdmap e14: 3 total, 2 up, 3 in
2026-03-09T00:00:47.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:46 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T00:00:47.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:46 vm03 ceph-mon[52346]: pgmap v24: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail
2026-03-09T00:00:47.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:46 vm03 ceph-mon[52346]: osdmap e14: 3 total, 2 up, 3 in
2026-03-09T00:00:47.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:46 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T00:00:49.309 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:49 vm03 ceph-mon[52346]: pgmap v26: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail
2026-03-09T00:00:49.309 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:49 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
2026-03-09T00:00:49.309 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:49 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:49.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:49 vm06 ceph-mon[58395]: pgmap v26: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail
2026-03-09T00:00:49.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:49 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
2026-03-09T00:00:49.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:49 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:50.222 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:50 vm03 ceph-mon[52346]: Deploying daemon osd.2 on vm03
2026-03-09T00:00:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:50 vm06 ceph-mon[58395]: Deploying daemon osd.2 on vm03
2026-03-09T00:00:51.051 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:51 vm03 ceph-mon[52346]: pgmap v27: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail
2026-03-09T00:00:51.051 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:51 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:51.051 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:51 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:51.051 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:51 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:00:51.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:51 vm06 ceph-mon[58395]: pgmap v27: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail
2026-03-09T00:00:51.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:51 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:51.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:51 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:51.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:51 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:00:52.490 INFO:teuthology.orchestra.run.vm03.stdout:Created osd(s) 2 on host 'vm03'
2026-03-09T00:00:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:52.486+0000 7fe6ff7fe700 1 -- 192.168.123.103:0/1453799837 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7fe7081059a0 con 0x7fe6f406c580
2026-03-09T00:00:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:52.490+0000 7fe6fd7fa700 1 -- 192.168.123.103:0/1453799837 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe6f406c580 msgr2=0x7fe6f406ea40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:52.490+0000 7fe6fd7fa700 1 --2- 192.168.123.103:0/1453799837 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe6f406c580 0x7fe6f406ea40 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fe704000c00 tx=0x7fe704005c00 comp rx=0 tx=0).stop
2026-03-09T00:00:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:52.490+0000 7fe6fd7fa700 1 -- 192.168.123.103:0/1453799837 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe708100fc0 msgr2=0x7fe708068f20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:52.490+0000 7fe6fd7fa700 1 --2- 192.168.123.103:0/1453799837 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe708100fc0 0x7fe708068f20 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fe6f800d8d0 tx=0x7fe6f800dc90 comp rx=0 tx=0).stop
2026-03-09T00:00:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:52.490+0000 7fe6fd7fa700 1 -- 192.168.123.103:0/1453799837 shutdown_connections
2026-03-09T00:00:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:52.490+0000 7fe6fd7fa700 1 --2- 192.168.123.103:0/1453799837 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe6f406c580 0x7fe6f406ea40 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:52.490+0000 7fe6fd7fa700 1 --2- 192.168.123.103:0/1453799837 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe7080ffe60 0x7fe7080689e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:52.490+0000 7fe6fd7fa700 1 --2- 192.168.123.103:0/1453799837 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe708100fc0 0x7fe708068f20 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:52.490+0000 7fe6fd7fa700 1 -- 192.168.123.103:0/1453799837 >> 192.168.123.103:0/1453799837 conn(0x7fe7080fb3c0 msgr2=0x7fe708104280 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:00:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:52.491+0000 7fe6fd7fa700 1 -- 192.168.123.103:0/1453799837 shutdown_connections
2026-03-09T00:00:52.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:00:52.491+0000 7fe6fd7fa700 1 -- 192.168.123.103:0/1453799837 wait complete.
2026-03-09T00:00:52.658 DEBUG:teuthology.orchestra.run.vm03:osd.2> sudo journalctl -f -n 0 -u ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.2.service
2026-03-09T00:00:52.659 INFO:tasks.cephadm:Deploying osd.3 on vm06 with /dev/vde...
2026-03-09T00:00:52.659 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- lvm zap /dev/vde
2026-03-09T00:00:52.809 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm06/config
2026-03-09T00:00:53.258 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:53 vm03 ceph-mon[52346]: pgmap v28: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail
2026-03-09T00:00:53.258 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:53 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:53.258 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:53 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:53.297 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:53 vm06 ceph-mon[58395]: pgmap v28: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail
2026-03-09T00:00:53.297 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:53 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:53.297 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:53 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:53.343 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:00:53.355 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph orch daemon add osd vm06:/dev/vde
2026-03-09T00:00:53.504 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm06/config
2026-03-09T00:00:53.537 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:00:53 vm03 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2[83418]: 2026-03-09T00:00:53.256+0000 7f5505d83640 -1 osd.2 0 log_to_monitors true
2026-03-09T00:00:53.782 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.781+0000 7f2f6dc18700 1 -- 192.168.123.106:0/2065019489 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2f68103120 msgr2=0x7f2f68103540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:53.782 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.781+0000 7f2f6dc18700 1 --2- 192.168.123.106:0/2065019489 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2f68103120 0x7f2f68103540 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f2f58009b00 tx=0x7f2f58009e10 comp rx=0 tx=0).stop
2026-03-09T00:00:53.782 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.782+0000 7f2f6dc18700 1 -- 192.168.123.106:0/2065019489 shutdown_connections
2026-03-09T00:00:53.782 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.782+0000 7f2f6dc18700 1 --2- 192.168.123.106:0/2065019489 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f68104320 0x7f2f68104780 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:53.782 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.782+0000 7f2f6dc18700 1 --2- 192.168.123.106:0/2065019489 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2f68103120 0x7f2f68103540 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:53.782 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.782+0000 7f2f6dc18700 1 -- 192.168.123.106:0/2065019489 >> 192.168.123.106:0/2065019489 conn(0x7f2f680fe6c0 msgr2=0x7f2f68100b00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:00:53.782 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.782+0000 7f2f6dc18700 1 -- 192.168.123.106:0/2065019489 shutdown_connections
2026-03-09T00:00:53.782 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.782+0000 7f2f6dc18700 1 -- 192.168.123.106:0/2065019489 wait complete.
2026-03-09T00:00:53.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.783+0000 7f2f6dc18700 1 Processor -- start
2026-03-09T00:00:53.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.783+0000 7f2f6dc18700 1 -- start start
2026-03-09T00:00:53.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.783+0000 7f2f6dc18700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f68103120 0x7f2f68071ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:00:53.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.783+0000 7f2f6dc18700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2f68104320 0x7f2f68072220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:00:53.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.783+0000 7f2f6dc18700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2f68072760 con 0x7f2f68103120
2026-03-09T00:00:53.783 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.783+0000 7f2f6dc18700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2f680728d0 con 0x7f2f68104320
2026-03-09T00:00:53.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.784+0000 7f2f677fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f68103120 0x7f2f68071ce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:00:53.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.784+0000 7f2f677fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f68103120 0x7f2f68071ce0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:51184/0 (socket says 192.168.123.106:51184)
2026-03-09T00:00:53.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.784+0000 7f2f677fe700 1 -- 192.168.123.106:0/3750351203 learned_addr learned my addr 192.168.123.106:0/3750351203 (peer_addr_for_me v2:192.168.123.106:0/0)
2026-03-09T00:00:53.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.784+0000 7f2f5ffff700 1 --2- 192.168.123.106:0/3750351203 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2f68104320 0x7f2f68072220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:00:53.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.784+0000 7f2f677fe700 1 -- 192.168.123.106:0/3750351203 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2f68104320 msgr2=0x7f2f68072220 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:00:53.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.784+0000 7f2f677fe700 1 --2- 192.168.123.106:0/3750351203 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2f68104320 0x7f2f68072220 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:00:53.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.784+0000 7f2f677fe700 1 -- 192.168.123.106:0/3750351203 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2f580097e0 con 0x7f2f68103120
2026-03-09T00:00:53.784 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.784+0000 7f2f5ffff700 1 --2- 192.168.123.106:0/3750351203 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2f68104320 0x7f2f68072220 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed!
2026-03-09T00:00:53.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.786+0000 7f2f677fe700 1 --2- 192.168.123.106:0/3750351203 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f68103120 0x7f2f68071ce0 secure :-1 s=READY pgs=188 cs=0 l=1 rev1=1 crypto rx=0x7f2f58004990 tx=0x7f2f580049c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:00:53.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.786+0000 7f2f657fa700 1 -- 192.168.123.106:0/3750351203 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2f5801d070 con 0x7f2f68103120
2026-03-09T00:00:53.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.786+0000 7f2f657fa700 1 -- 192.168.123.106:0/3750351203 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2f5800bc50 con 0x7f2f68103120
2026-03-09T00:00:53.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.786+0000 7f2f657fa700 1 -- 192.168.123.106:0/3750351203 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2f5800f830 con 0x7f2f68103120
2026-03-09T00:00:53.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.786+0000 7f2f6dc18700 1 -- 192.168.123.106:0/3750351203 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2f681a2550 con 0x7f2f68103120
2026-03-09T00:00:53.785 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.786+0000 7f2f6dc18700 1 -- 192.168.123.106:0/3750351203 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2f681a2a60 con 0x7f2f68103120
2026-03-09T00:00:53.786 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.786+0000 7f2f6dc18700 1 -- 192.168.123.106:0/3750351203 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2f68066e80 con 0x7f2f68103120
2026-03-09T00:00:53.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.786+0000 7f2f657fa700 1 -- 192.168.123.106:0/3750351203 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 18) v1 ==== 90252+0+0 (secure 0 0 0) 0x7f2f58022470 con 0x7f2f68103120
2026-03-09T00:00:53.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.787+0000 7f2f657fa700 1 --2- 192.168.123.106:0/3750351203 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f5406c410 0x7f2f5406e8d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:00:53.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.787+0000 7f2f657fa700 1 -- 192.168.123.106:0/3750351203 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(14..14 src has 1..14) v4 ==== 2347+0+0 (secure 0 0 0) 0x7f2f5808c250 con 0x7f2f68103120
2026-03-09T00:00:53.787 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.787+0000 7f2f5ffff700 1 --2- 192.168.123.106:0/3750351203 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f5406c410 0x7f2f5406e8d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:00:53.788 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.789+0000 7f2f5ffff700 1 --2- 192.168.123.106:0/3750351203 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f5406c410 0x7f2f5406e8d0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f2f68075410 tx=0x7f2f50008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:00:53.790 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.790+0000 7f2f657fa700 1 -- 192.168.123.106:0/3750351203 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f2f5805b260 con 0x7f2f68103120
2026-03-09T00:00:53.914 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:00:53.913+0000 7f2f6dc18700 1 -- 192.168.123.106:0/3750351203 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7f2f68108c70 con 0x7f2f5406c410
2026-03-09T00:00:54.592 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:54 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:54.592 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:54 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:54.592 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:54 vm03 ceph-mon[52346]: from='osd.2 [v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
2026-03-09T00:00:54.592 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:54 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-09T00:00:54.592 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:54 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-09T00:00:54.592 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:54 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:00:54.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:54 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:54.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:54 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:00:54.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:54 vm06 ceph-mon[58395]:
from='osd.2 [v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-09T00:00:54.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:54 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T00:00:54.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:54 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T00:00:54.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:54 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:00:55.088 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:00:54 vm03 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2[83418]: 2026-03-09T00:00:54.871+0000 7f54fabe6700 -1 osd.2 0 waiting for initial osdmap 2026-03-09T00:00:55.088 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:00:54 vm03 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2[83418]: 2026-03-09T00:00:54.880+0000 7f54f69db700 -1 osd.2 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='client.14312 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: pgmap v29: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='osd.2 [v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: osdmap e15: 3 total, 2 up, 3 in 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='osd.2 [v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: Detected new or changed devices on vm03 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 
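Editor's note: the sequence above (ceph-volume lvm zap, then ceph orch daemon add osd) is the per-device pattern this cephadm task drives for roleless OSD deployment. A minimal sketch of the same pair, reusing the image, fsid, host, and device that appear in this log; the shell variables themselves are illustrative, not part of the run:

    # Wipe any previous LVM/bluestore state, then hand the device to the orchestrator.
    IMAGE=quay.io/ceph/ceph:v18.2.1
    FSID=ae8f0172-1b4a-11f1-916a-712b2ac006b7
    DEV=/dev/vde
    sudo cephadm --image "$IMAGE" ceph-volume \
        -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
        --fsid "$FSID" -- lvm zap "$DEV"
    sudo cephadm --image "$IMAGE" shell \
        -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
        --fsid "$FSID" -- ceph orch daemon add osd vm06:"$DEV"

The mon records that follow (osd crush set-device-class, osd tree, auth get client.bootstrap-osd, config generate-minimal-conf) are the mgr preparing credentials and a minimal config for the daemon it is about to place.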
2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/308156157' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "8b49bccb-fd91-44f4-831e-a401044f0e64"}]: dispatch 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "8b49bccb-fd91-44f4-831e-a401044f0e64"}]: dispatch 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='osd.2 [v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "8b49bccb-fd91-44f4-831e-a401044f0e64"}]': finished 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: osdmap e16: 4 total, 2 up, 4 in 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:00:55.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:55 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='client.14312 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: pgmap v29: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='osd.2 [v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: osdmap e15: 3 total, 2 up, 3 in 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='osd.2 [v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: Detected new or changed devices on vm03 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:00:55.671 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='client.? 192.168.123.106:0/308156157' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "8b49bccb-fd91-44f4-831e-a401044f0e64"}]: dispatch 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "8b49bccb-fd91-44f4-831e-a401044f0e64"}]: dispatch 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='osd.2 [v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm03", "root=default"]}]': finished 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "8b49bccb-fd91-44f4-831e-a401044f0e64"}]': finished 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: osdmap e16: 4 total, 2 up, 4 in 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:00:55.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:55 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T00:00:56.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:56 vm03 ceph-mon[52346]: from='client.? 
192.168.123.106:0/409733235' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T00:00:56.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:56 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:56.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:56 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:56.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:56 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:00:56.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:56 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:00:56.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:56 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:56.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:56 vm03 ceph-mon[52346]: osd.2 [v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923] boot 2026-03-09T00:00:56.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:56 vm03 ceph-mon[52346]: osdmap e17: 4 total, 3 up, 4 in 2026-03-09T00:00:56.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:56 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T00:00:56.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:56 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:00:56.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:56 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch 2026-03-09T00:00:56.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:56 vm06 ceph-mon[58395]: from='client.? 
192.168.123.106:0/409733235' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T00:00:56.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:56 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:56.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:56 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:56.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:56 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:00:56.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:56 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:00:56.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:56 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:00:56.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:56 vm06 ceph-mon[58395]: osd.2 [v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923] boot 2026-03-09T00:00:56.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:56 vm06 ceph-mon[58395]: osdmap e17: 4 total, 3 up, 4 in 2026-03-09T00:00:56.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:56 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T00:00:56.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:56 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:00:56.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:56 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch 2026-03-09T00:00:57.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:57 vm03 ceph-mon[52346]: purged_snaps scrub starts 2026-03-09T00:00:57.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:57 vm03 ceph-mon[52346]: purged_snaps scrub ok 2026-03-09T00:00:57.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:57 vm03 ceph-mon[52346]: pgmap v33: 0 pgs: ; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-09T00:00:57.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:57 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished 2026-03-09T00:00:57.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:57 vm03 ceph-mon[52346]: osdmap e18: 4 total, 3 up, 4 in 2026-03-09T00:00:57.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:57 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:00:57.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:57 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch 
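Editor's note: once osd.2 boots, the mgr creates its built-in metadata pool; the two mon commands dispatched above correspond to the CLI calls below. A sketch of the equivalent manual invocation only (the mgr's JSON command additionally passed pg_num_min=1 and pg_num_max=32); the override flag is required here because the pool name begins with a dot:

    # Create the single-PG .mgr pool and tag it with the mgr application.
    ceph osd pool create .mgr 1 --yes-i-really-mean-it
    ceph osd pool application enable .mgr mgr --yes-i-really-mean-it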
2026-03-09T00:00:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:57 vm06 ceph-mon[58395]: purged_snaps scrub starts 2026-03-09T00:00:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:57 vm06 ceph-mon[58395]: purged_snaps scrub ok 2026-03-09T00:00:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:57 vm06 ceph-mon[58395]: pgmap v33: 0 pgs: ; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-09T00:00:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:57 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished 2026-03-09T00:00:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:57 vm06 ceph-mon[58395]: osdmap e18: 4 total, 3 up, 4 in 2026-03-09T00:00:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:57 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:00:57.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:57 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch 2026-03-09T00:00:59.066 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:00:58 vm03 sudo[88265]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vde 2026-03-09T00:00:59.066 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:00:58 vm03 sudo[88265]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-09T00:00:59.066 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:00:58 vm03 sudo[88265]: pam_unix(sudo:session): session opened for user root by (uid=167) 2026-03-09T00:00:59.066 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:00:58 vm03 sudo[88265]: pam_unix(sudo:session): session closed for user root 2026-03-09T00:00:59.066 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:00:58 vm03 sudo[88268]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vdd 2026-03-09T00:00:59.066 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:00:58 vm03 sudo[88268]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-09T00:00:59.066 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:00:58 vm03 sudo[88268]: pam_unix(sudo:session): session opened for user root by (uid=167) 2026-03-09T00:00:59.066 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:00:58 vm03 sudo[88268]: pam_unix(sudo:session): session closed for user root 2026-03-09T00:00:59.066 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:58 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished 2026-03-09T00:00:59.066 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:58 vm03 ceph-mon[52346]: osdmap e19: 4 total, 3 up, 4 in 2026-03-09T00:00:59.066 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:58 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:00:59.066 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:58 vm03 ceph-mon[52346]: pgmap v36: 1 pgs: 1 unknown; 0 B data, 79 MiB 
used, 60 GiB / 60 GiB avail 2026-03-09T00:00:59.099 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:58 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished 2026-03-09T00:00:59.099 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:58 vm06 ceph-mon[58395]: osdmap e19: 4 total, 3 up, 4 in 2026-03-09T00:00:59.099 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:58 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:00:59.099 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:58 vm06 ceph-mon[58395]: pgmap v36: 1 pgs: 1 unknown; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-09T00:00:59.338 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:00:59 vm03 sudo[88271]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vdc 2026-03-09T00:00:59.338 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:00:59 vm03 sudo[88271]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-09T00:00:59.338 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:00:59 vm03 sudo[88271]: pam_unix(sudo:session): session opened for user root by (uid=167) 2026-03-09T00:00:59.338 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:00:59 vm03 sudo[88271]: pam_unix(sudo:session): session closed for user root 2026-03-09T00:00:59.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:59 vm03 sudo[88274]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vda 2026-03-09T00:00:59.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:59 vm03 sudo[88274]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-09T00:00:59.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:59 vm03 sudo[88274]: pam_unix(sudo:session): session opened for user root by (uid=167) 2026-03-09T00:00:59.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:59 vm03 sudo[88274]: pam_unix(sudo:session): session closed for user root 2026-03-09T00:00:59.722 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:59 vm06 sudo[63841]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vda 2026-03-09T00:00:59.722 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:59 vm06 sudo[63841]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory 2026-03-09T00:00:59.722 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:59 vm06 sudo[63841]: pam_unix(sudo:session): session opened for user root by (uid=167) 2026-03-09T00:00:59.722 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:59 vm06 sudo[63841]: pam_unix(sudo:session): session closed for user root 2026-03-09T00:01:00.014 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:59 vm06 ceph-mon[58395]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-09T00:01:00.014 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:59 vm06 ceph-mon[58395]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-09T00:01:00.014 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:59 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T00:01:00.014 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:59 
vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T00:01:00.014 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:59 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T00:01:00.014 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:59 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:00.014 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:59 vm06 ceph-mon[58395]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-09T00:01:00.014 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:59 vm06 ceph-mon[58395]: Deploying daemon osd.3 on vm06 2026-03-09T00:01:00.014 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:59 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T00:01:00.014 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:59 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T00:01:00.014 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:59 vm06 ceph-mon[58395]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-09T00:01:00.014 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:59 vm06 ceph-mon[58395]: osdmap e20: 4 total, 3 up, 4 in 2026-03-09T00:01:00.014 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:00:59 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:01:00.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:59 vm03 ceph-mon[52346]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-09T00:01:00.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:59 vm03 ceph-mon[52346]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-09T00:01:00.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:59 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T00:01:00.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:59 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T00:01:00.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:59 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T00:01:00.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:59 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:00.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:59 vm03 ceph-mon[52346]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-09T00:01:00.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:59 vm03 ceph-mon[52346]: Deploying daemon osd.3 on vm06 2026-03-09T00:01:00.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:59 vm03 ceph-mon[52346]: 
from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T00:01:00.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:59 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T00:01:00.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:59 vm03 ceph-mon[52346]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-09T00:01:00.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:59 vm03 ceph-mon[52346]: osdmap e20: 4 total, 3 up, 4 in 2026-03-09T00:01:00.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:00:59 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:01:00.908 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:00.906+0000 7f2f657fa700 1 -- 192.168.123.106:0/3750351203 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f2f58022b70 con 0x7f2f68103120 2026-03-09T00:01:01.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:00 vm06 ceph-mon[58395]: pgmap v38: 1 pgs: 1 unknown; 0 B data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-09T00:01:01.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:00 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:01:01.309 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:00 vm03 ceph-mon[52346]: pgmap v38: 1 pgs: 1 unknown; 0 B data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-09T00:01:01.309 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:00 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:01:02.050 INFO:teuthology.orchestra.run.vm06.stdout:Created osd(s) 3 on host 'vm06' 2026-03-09T00:01:02.050 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:02.048+0000 7f2f657fa700 1 -- 192.168.123.106:0/3750351203 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f2f68108c70 con 0x7f2f5406c410 2026-03-09T00:01:02.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:02.051+0000 7f2f6dc18700 1 -- 192.168.123.106:0/3750351203 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f5406c410 msgr2=0x7f2f5406e8d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:02.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:02.051+0000 7f2f6dc18700 1 --2- 192.168.123.106:0/3750351203 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f5406c410 0x7f2f5406e8d0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f2f68075410 tx=0x7f2f50008040 comp rx=0 tx=0).stop 2026-03-09T00:01:02.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:02.051+0000 7f2f6dc18700 1 -- 192.168.123.106:0/3750351203 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f68103120 msgr2=0x7f2f68071ce0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:02.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:02.051+0000 7f2f6dc18700 1 --2- 192.168.123.106:0/3750351203 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f68103120 0x7f2f68071ce0 secure :-1 s=READY pgs=188 cs=0 l=1 rev1=1 
crypto rx=0x7f2f58004990 tx=0x7f2f580049c0 comp rx=0 tx=0).stop 2026-03-09T00:01:02.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:02.051+0000 7f2f6dc18700 1 -- 192.168.123.106:0/3750351203 shutdown_connections 2026-03-09T00:01:02.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:02.051+0000 7f2f6dc18700 1 --2- 192.168.123.106:0/3750351203 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2f5406c410 0x7f2f5406e8d0 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:02.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:02.051+0000 7f2f6dc18700 1 --2- 192.168.123.106:0/3750351203 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2f68103120 0x7f2f68071ce0 unknown :-1 s=CLOSED pgs=188 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:02.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:02.051+0000 7f2f6dc18700 1 --2- 192.168.123.106:0/3750351203 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2f68104320 0x7f2f68072220 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:02.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:02.051+0000 7f2f6dc18700 1 -- 192.168.123.106:0/3750351203 >> 192.168.123.106:0/3750351203 conn(0x7f2f680fe6c0 msgr2=0x7f2f68107550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:02.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:02.051+0000 7f2f6dc18700 1 -- 192.168.123.106:0/3750351203 shutdown_connections 2026-03-09T00:01:02.052 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:02.051+0000 7f2f6dc18700 1 -- 192.168.123.106:0/3750351203 wait complete. 2026-03-09T00:01:02.062 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:01 vm06 ceph-mon[58395]: mgrmap e19: vm03.yvcons(active, since 60s), standbys: vm06.rzcvhn 2026-03-09T00:01:02.062 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:01 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:01:02.062 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:01 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:02.062 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:01 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:02.098 DEBUG:teuthology.orchestra.run.vm06:osd.3> sudo journalctl -f -n 0 -u ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.3.service 2026-03-09T00:01:02.099 INFO:tasks.cephadm:Deploying osd.4 on vm06 with /dev/vdd... 
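Editor's note: with osd.3 created, the harness starts a journal follower on the new unit (journalctl -f -n 0 -u ceph-<fsid>@osd.3.service, as logged above) and moves on to the next device, so the zap/add pair repeats once per remaining device on the host. A hypothetical loop equivalent of what the task is doing here, with the device list assumed from this run and the -c/-k flags abbreviated from the full invocations shown earlier:

    # One zap + orch-add round trip per device; cephadm names each unit
    # ceph-<fsid>@osd.<id>.service, which the harness then follows.
    for DEV in /dev/vde /dev/vdd; do
        sudo cephadm --image "$IMAGE" ceph-volume --fsid "$FSID" -- lvm zap "$DEV"
        sudo cephadm --image "$IMAGE" shell --fsid "$FSID" -- ceph orch daemon add osd vm06:"$DEV"
    done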
2026-03-09T00:01:02.099 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- lvm zap /dev/vdd 2026-03-09T00:01:02.334 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm06/config 2026-03-09T00:01:02.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:01 vm03 ceph-mon[52346]: mgrmap e19: vm03.yvcons(active, since 60s), standbys: vm06.rzcvhn 2026-03-09T00:01:02.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:01 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:01:02.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:01 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:02.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:01 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:02.890 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:01:02.905 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph orch daemon add osd vm06:/dev/vdd 2026-03-09T00:01:03.056 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:03 vm06 ceph-mon[58395]: pgmap v39: 1 pgs: 1 active+clean; 449 KiB data, 81 MiB used, 60 GiB / 60 GiB avail 2026-03-09T00:01:03.056 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:03 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:03.056 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:03 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:03.056 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:03 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:03.056 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:03 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:03.099 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm06/config 2026-03-09T00:01:03.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:03 vm03 ceph-mon[52346]: pgmap v39: 1 pgs: 1 active+clean; 449 KiB data, 81 MiB used, 60 GiB / 60 GiB avail 2026-03-09T00:01:03.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:03 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:03.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:03 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:03.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:03 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:03.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:03 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:03.430 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.428+0000 7f085b59e700 1 -- 192.168.123.106:0/3582148501 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7f08540a5df0 msgr2=0x7f08540a6270 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:03.430 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.428+0000 7f085b59e700 1 --2- 192.168.123.106:0/3582148501 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f08540a5df0 0x7f08540a6270 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f0850009b00 tx=0x7f0850009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:03.430 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.429+0000 7f085b59e700 1 -- 192.168.123.106:0/3582148501 shutdown_connections 2026-03-09T00:01:03.430 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.429+0000 7f085b59e700 1 --2- 192.168.123.106:0/3582148501 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f08540a5df0 0x7f08540a6270 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:03.430 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.429+0000 7f085b59e700 1 --2- 192.168.123.106:0/3582148501 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08540a4cb0 0x7f08540a50d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:03.430 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.429+0000 7f085b59e700 1 -- 192.168.123.106:0/3582148501 >> 192.168.123.106:0/3582148501 conn(0x7f08540a0170 msgr2=0x7f08540a25d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:03.431 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.431+0000 7f085b59e700 1 -- 192.168.123.106:0/3582148501 shutdown_connections 2026-03-09T00:01:03.431 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.431+0000 7f085b59e700 1 -- 192.168.123.106:0/3582148501 wait complete. 
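Editor's note: the mark_down / shutdown_connections / "wait complete" lines above are not failures. Each cephadm shell -- ceph ... invocation spawns a short-lived RADOS client, and with messenger debugging enabled (the "1 --" prefix on these stderr lines is the debug ms level) it logs every connection teardown, followed by the usual mon_getmap / mon_subscribe / get_command_descriptions handshake when the next client starts. A sketch of a quick post-add check, offered as a suggestion rather than something this run performs:

    # Confirm the orchestrator sees the new daemons and CRUSH has placed them.
    sudo cephadm shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph orch ps
    sudo cephadm shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd tree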
2026-03-09T00:01:03.432 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.432+0000 7f085b59e700 1 Processor -- start 2026-03-09T00:01:03.432 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.432+0000 7f085b59e700 1 -- start start 2026-03-09T00:01:03.432 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.432+0000 7f085b59e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08540a4cb0 0x7f08541431b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:03.433 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.432+0000 7f085b59e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f08541436f0 0x7f0854148760 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:03.433 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.432+0000 7f085b59e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0854143c00 con 0x7f08540a4cb0 2026-03-09T00:01:03.433 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.432+0000 7f085b59e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0854143d70 con 0x7f08541436f0 2026-03-09T00:01:03.433 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.432+0000 7f085a59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08540a4cb0 0x7f08541431b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:03.433 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.432+0000 7f085a59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08540a4cb0 0x7f08541431b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.106:36802/0 (socket says 192.168.123.106:36802) 2026-03-09T00:01:03.433 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.432+0000 7f085a59c700 1 -- 192.168.123.106:0/2011981223 learned_addr learned my addr 192.168.123.106:0/2011981223 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T00:01:03.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.433+0000 7f085a59c700 1 -- 192.168.123.106:0/2011981223 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f08541436f0 msgr2=0x7f0854148760 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:01:03.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.433+0000 7f085a59c700 1 --2- 192.168.123.106:0/2011981223 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f08541436f0 0x7f0854148760 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:03.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.433+0000 7f085a59c700 1 -- 192.168.123.106:0/2011981223 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f08500097e0 con 0x7f08540a4cb0 2026-03-09T00:01:03.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.433+0000 7f085a59c700 1 --2- 192.168.123.106:0/2011981223 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08540a4cb0 0x7f08541431b0 secure :-1 s=READY pgs=192 cs=0 l=1 rev1=1 crypto rx=0x7f084c00c8f0 tx=0x7f084c00ccb0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:03.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.434+0000 7f084b7fe700 1 -- 192.168.123.106:0/2011981223 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f084c007a50 con 0x7f08540a4cb0 2026-03-09T00:01:03.434 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.434+0000 7f084b7fe700 1 -- 192.168.123.106:0/2011981223 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f084c004d10 con 0x7f08540a4cb0 2026-03-09T00:01:03.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.435+0000 7f084b7fe700 1 -- 192.168.123.106:0/2011981223 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f084c005680 con 0x7f08540a4cb0 2026-03-09T00:01:03.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.436+0000 7f085b59e700 1 -- 192.168.123.106:0/2011981223 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f08540ac940 con 0x7f08540a4cb0 2026-03-09T00:01:03.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.436+0000 7f085b59e700 1 -- 192.168.123.106:0/2011981223 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f08540ace10 con 0x7f08540a4cb0 2026-03-09T00:01:03.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.441+0000 7f084b7fe700 1 -- 192.168.123.106:0/2011981223 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f084c007bb0 con 0x7f08540a4cb0 2026-03-09T00:01:03.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.442+0000 7f084b7fe700 1 --2- 192.168.123.106:0/2011981223 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f084406c370 0x7f084406e830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:03.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.442+0000 7f084b7fe700 1 -- 192.168.123.106:0/2011981223 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(20..20 src has 1..20) v4 ==== 3165+0+0 (secure 0 0 0) 0x7f084c08ad70 con 0x7f08540a4cb0 2026-03-09T00:01:03.448 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.448+0000 7f085b59e700 1 -- 192.168.123.106:0/2011981223 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0854004510 con 0x7f08540a4cb0 2026-03-09T00:01:03.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.453+0000 7f0859d9b700 1 --2- 192.168.123.106:0/2011981223 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f084406c370 0x7f084406e830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:03.453 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.453+0000 7f084b7fe700 1 -- 192.168.123.106:0/2011981223 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f084c059940 con 0x7f08540a4cb0 2026-03-09T00:01:03.471 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.471+0000 7f0859d9b700 1 --2- 192.168.123.106:0/2011981223 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f084406c370 0x7f084406e830 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f085000b5c0 
tx=0x7f0850005c20 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:03.614 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:03.612+0000 7f085b59e700 1 -- 192.168.123.106:0/2011981223 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7f085413b740 con 0x7f084406c370 2026-03-09T00:01:04.412 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:04 vm06 ceph-mon[58395]: from='client.14332 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:01:04.412 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:04 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T00:01:04.412 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:04 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T00:01:04.412 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:04 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:04.412 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:01:04 vm06 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3[64254]: 2026-03-09T00:01:04.198+0000 7fc41403c640 -1 osd.3 0 log_to_monitors true 2026-03-09T00:01:04.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:04 vm06 ceph-mon[58395]: Detected new or changed devices on vm06 2026-03-09T00:01:04.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:04 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:04.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:04 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:04.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:04 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T00:01:04.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:04 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:04.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:04 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:01:04.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:04 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:04.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:04 vm06 ceph-mon[58395]: pgmap v40: 1 pgs: 1 active+clean; 449 KiB data, 81 MiB used, 60 GiB / 60 GiB avail 2026-03-09T00:01:04.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:04 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:01:04.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:04 vm06 ceph-mon[58395]: from='osd.3 
[v2:192.168.123.106:6800/1431049389,v1:192.168.123.106:6801/1431049389]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T00:01:04.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:04 vm06 ceph-mon[58395]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T00:01:04.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:04 vm03 ceph-mon[52346]: from='client.14332 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:01:04.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T00:01:04.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T00:01:04.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:04.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:04 vm03 ceph-mon[52346]: Detected new or changed devices on vm06 2026-03-09T00:01:04.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:04.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:04.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T00:01:04.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:04.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:01:04.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:04.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:04 vm03 ceph-mon[52346]: pgmap v40: 1 pgs: 1 active+clean; 449 KiB data, 81 MiB used, 60 GiB / 60 GiB avail 2026-03-09T00:01:04.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:04 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:01:04.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:04 vm03 ceph-mon[52346]: from='osd.3 [v2:192.168.123.106:6800/1431049389,v1:192.168.123.106:6801/1431049389]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T00:01:04.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:04 vm03 ceph-mon[52346]: from='osd.3 ' entity='osd.3' 
cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T00:01:05.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:05 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:05.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:05 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:01:05.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:05 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:05.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:05 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/1216051581' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "8c3a4d00-bb0a-4f59-b53b-83364e99627b"}]: dispatch 2026-03-09T00:01:05.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:05 vm03 ceph-mon[52346]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "8c3a4d00-bb0a-4f59-b53b-83364e99627b"}]: dispatch 2026-03-09T00:01:05.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:05 vm03 ceph-mon[52346]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-09T00:01:05.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:05 vm03 ceph-mon[52346]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "8c3a4d00-bb0a-4f59-b53b-83364e99627b"}]': finished 2026-03-09T00:01:05.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:05 vm03 ceph-mon[52346]: osdmap e21: 5 total, 3 up, 5 in 2026-03-09T00:01:05.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:05 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:01:05.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:05 vm03 ceph-mon[52346]: from='osd.3 [v2:192.168.123.106:6800/1431049389,v1:192.168.123.106:6801/1431049389]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:01:05.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:05 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:05.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:05 vm03 ceph-mon[52346]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:01:05.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:05 vm03 ceph-mon[52346]: from='client.? 
192.168.123.106:0/1624965010' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T00:01:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:05 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:05 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:01:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:05 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:05 vm06 ceph-mon[58395]: from='client.? 192.168.123.106:0/1216051581' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "8c3a4d00-bb0a-4f59-b53b-83364e99627b"}]: dispatch 2026-03-09T00:01:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:05 vm06 ceph-mon[58395]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "8c3a4d00-bb0a-4f59-b53b-83364e99627b"}]: dispatch 2026-03-09T00:01:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:05 vm06 ceph-mon[58395]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-09T00:01:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:05 vm06 ceph-mon[58395]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "8c3a4d00-bb0a-4f59-b53b-83364e99627b"}]': finished 2026-03-09T00:01:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:05 vm06 ceph-mon[58395]: osdmap e21: 5 total, 3 up, 5 in 2026-03-09T00:01:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:05 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:01:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:05 vm06 ceph-mon[58395]: from='osd.3 [v2:192.168.123.106:6800/1431049389,v1:192.168.123.106:6801/1431049389]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:01:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:05 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:05 vm06 ceph-mon[58395]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:01:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:05 vm06 ceph-mon[58395]: from='client.? 
192.168.123.106:0/1624965010' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T00:01:05.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:01:05 vm06 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3[64254]: 2026-03-09T00:01:05.718+0000 7fc408e9f700 -1 osd.3 0 waiting for initial osdmap 2026-03-09T00:01:05.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:01:05 vm06 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3[64254]: 2026-03-09T00:01:05.724+0000 7fc404c94700 -1 osd.3 22 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T00:01:07.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:06 vm06 ceph-mon[58395]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished 2026-03-09T00:01:07.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:06 vm06 ceph-mon[58395]: osdmap e22: 5 total, 3 up, 5 in 2026-03-09T00:01:07.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:06 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:01:07.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:06 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:07.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:06 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:01:07.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:06 vm06 ceph-mon[58395]: pgmap v43: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-09T00:01:07.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:06 vm03 ceph-mon[52346]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished 2026-03-09T00:01:07.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:06 vm03 ceph-mon[52346]: osdmap e22: 5 total, 3 up, 5 in 2026-03-09T00:01:07.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:06 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:01:07.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:06 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:07.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:06 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:01:07.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:06 vm03 ceph-mon[52346]: pgmap v43: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-09T00:01:08.133 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:07 vm06 ceph-mon[58395]: purged_snaps scrub starts 2026-03-09T00:01:08.133 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:07 vm06 ceph-mon[58395]: purged_snaps scrub ok 2026-03-09T00:01:08.133 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:07 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 
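The entries above trace the crush-placement half of one "orch daemon add osd" round trip for osd.3: the mgr has client.bootstrap-osd run "osd new", and the fresh daemon registers its device class and crush position (weight 0.0195 under host=vm06, root=default); the boot and the osdmap epoch bump follow just below. A minimal sketch for checking such a result by hand, using only standard ceph CLI calls (the id 3 and class hdd are simply the values this particular run produced):

    ceph osd stat -f json                    # osdmap counters: num_osds / num_up_osds / num_in_osds
    ceph osd tree -f json-pretty             # osd.3 should sit under host=vm06, root=default
    ceph osd crush get-device-class osd.3    # expect: hdd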
2026-03-09T00:01:08.133 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:07 vm06 ceph-mon[58395]: osd.3 [v2:192.168.123.106:6800/1431049389,v1:192.168.123.106:6801/1431049389] boot 2026-03-09T00:01:08.133 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:07 vm06 ceph-mon[58395]: osdmap e23: 5 total, 4 up, 5 in 2026-03-09T00:01:08.133 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:07 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:01:08.133 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:07 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:08.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:07 vm03 ceph-mon[52346]: purged_snaps scrub starts 2026-03-09T00:01:08.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:07 vm03 ceph-mon[52346]: purged_snaps scrub ok 2026-03-09T00:01:08.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:07 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:01:08.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:07 vm03 ceph-mon[52346]: osd.3 [v2:192.168.123.106:6800/1431049389,v1:192.168.123.106:6801/1431049389] boot 2026-03-09T00:01:08.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:07 vm03 ceph-mon[52346]: osdmap e23: 5 total, 4 up, 5 in 2026-03-09T00:01:08.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:07 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:01:08.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:07 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:08.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:08 vm06 ceph-mon[58395]: osdmap e24: 5 total, 4 up, 5 in 2026-03-09T00:01:08.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:08 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:08.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:08 vm06 ceph-mon[58395]: pgmap v46: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-09T00:01:08.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:08 vm06 ceph-mon[58395]: osdmap e25: 5 total, 4 up, 5 in 2026-03-09T00:01:08.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:08 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:09.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:08 vm03 ceph-mon[52346]: osdmap e24: 5 total, 4 up, 5 in 2026-03-09T00:01:09.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:08 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:09.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:08 vm03 ceph-mon[52346]: pgmap v46: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-09T00:01:09.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:08 vm03 ceph-mon[52346]: osdmap e25: 5 total, 
4 up, 5 in 2026-03-09T00:01:09.358 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:08 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:10.020 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:09 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T00:01:10.020 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:09 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:10.020 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:09 vm06 ceph-mon[58395]: Deploying daemon osd.4 on vm06 2026-03-09T00:01:10.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:09 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T00:01:10.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:09 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:10.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:09 vm03 ceph-mon[52346]: Deploying daemon osd.4 on vm06 2026-03-09T00:01:10.902 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:10 vm06 ceph-mon[58395]: pgmap v48: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-09T00:01:11.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:10 vm03 ceph-mon[52346]: pgmap v48: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-09T00:01:12.344 INFO:teuthology.orchestra.run.vm06.stdout:Created osd(s) 4 on host 'vm06' 2026-03-09T00:01:12.344 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:12.342+0000 7f084b7fe700 1 -- 192.168.123.106:0/2011981223 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f085413b740 con 0x7f084406c370 2026-03-09T00:01:12.346 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:12.345+0000 7f085b59e700 1 -- 192.168.123.106:0/2011981223 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f084406c370 msgr2=0x7f084406e830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:12.346 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:12.345+0000 7f085b59e700 1 --2- 192.168.123.106:0/2011981223 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f084406c370 0x7f084406e830 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f085000b5c0 tx=0x7f0850005c20 comp rx=0 tx=0).stop 2026-03-09T00:01:12.346 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:12.345+0000 7f085b59e700 1 -- 192.168.123.106:0/2011981223 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08540a4cb0 msgr2=0x7f08541431b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:12.346 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:12.345+0000 7f085b59e700 1 --2- 192.168.123.106:0/2011981223 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08540a4cb0 0x7f08541431b0 secure :-1 s=READY pgs=192 cs=0 l=1 rev1=1 crypto rx=0x7f084c00c8f0 tx=0x7f084c00ccb0 comp rx=0 tx=0).stop 2026-03-09T00:01:12.346 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:12.345+0000 7f085b59e700 1 -- 
192.168.123.106:0/2011981223 shutdown_connections 2026-03-09T00:01:12.346 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:12.345+0000 7f085b59e700 1 --2- 192.168.123.106:0/2011981223 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f084406c370 0x7f084406e830 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:12.346 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:12.345+0000 7f085b59e700 1 --2- 192.168.123.106:0/2011981223 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f08540a4cb0 0x7f08541431b0 unknown :-1 s=CLOSED pgs=192 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:12.346 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:12.345+0000 7f085b59e700 1 --2- 192.168.123.106:0/2011981223 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f08541436f0 0x7f0854148760 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:12.346 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:12.345+0000 7f085b59e700 1 -- 192.168.123.106:0/2011981223 >> 192.168.123.106:0/2011981223 conn(0x7f08540a0170 msgr2=0x7f08540a2460 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:12.346 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:12.345+0000 7f085b59e700 1 -- 192.168.123.106:0/2011981223 shutdown_connections 2026-03-09T00:01:12.346 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:12.345+0000 7f085b59e700 1 -- 192.168.123.106:0/2011981223 wait complete. 2026-03-09T00:01:12.399 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:12 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:12.399 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:12 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:12.399 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:12 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:01:12.414 DEBUG:teuthology.orchestra.run.vm06:osd.4> sudo journalctl -f -n 0 -u ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.4.service 2026-03-09T00:01:12.416 INFO:tasks.cephadm:Deploying osd.5 on vm06 with /dev/vdc... 
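Each device goes through the same two-step deployment that the next run lines show for /dev/vdc: a destructive ceph-volume lvm zap, then a ceph orch daemon add osd issued through cephadm shell. Condensed into a hand-runnable sketch, with the image, fsid, and keyring paths copied verbatim from this run (the harness invokes its own copy at /home/ubuntu/cephtest/cephadm; a cephadm on PATH and this device list are assumptions, substitute your own):

    for dev in /dev/vdd /dev/vdc; do
        # wipe any prior LVM/bluestore signatures so the device is reusable
        sudo cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume \
            -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
            --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- lvm zap "$dev"
        # hand the clean device to the orchestrator, which deploys the OSD daemon
        sudo cephadm --image quay.io/ceph/ceph:v18.2.1 shell \
            -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
            --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- \
            ceph orch daemon add osd "vm06:$dev"
    done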
2026-03-09T00:01:12.417 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- lvm zap /dev/vdc 2026-03-09T00:01:12.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:12 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:12.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:12 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:12.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:12 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:01:12.634 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm06/config 2026-03-09T00:01:13.281 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:01:13.296 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph orch daemon add osd vm06:/dev/vdc 2026-03-09T00:01:13.470 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:13 vm06 ceph-mon[58395]: pgmap v49: 1 pgs: 1 active+clean; 449 KiB data, 106 MiB used, 80 GiB / 80 GiB avail; 75 KiB/s, 0 objects/s recovering 2026-03-09T00:01:13.470 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:13 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:13.470 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:13 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:13.470 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:13 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:13.470 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:13 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:13.563 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm06/config 2026-03-09T00:01:13.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:13 vm03 ceph-mon[52346]: pgmap v49: 1 pgs: 1 active+clean; 449 KiB data, 106 MiB used, 80 GiB / 80 GiB avail; 75 KiB/s, 0 objects/s recovering 2026-03-09T00:01:13.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:13 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:13.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:13 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:13.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:13 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:13.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:13 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:13.868 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.866+0000 7ff80bfff700 1 -- 192.168.123.106:0/3951282557 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff80c102e70 msgr2=0x7ff80c103290 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
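When the add command returns ("Created osd(s) ... on host 'vm06'"), the daemon has only been scheduled; whether it is actually up is read back from the osdmap, which is why the harness later polls ceph osd stat (see the "Waiting for 6 OSDs to come up..." line further down, which runs "ceph osd stat -f json"). A hand-rolled sketch of that wait, assuming jq is available and that num_up_osds is the osdmap counter being waited on:

    # block until all 6 expected OSDs report up in the osdmap
    until [ "$(ceph osd stat -f json | jq -r .num_up_osds)" -eq 6 ]; do
        sleep 5
    done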
2026-03-09T00:01:13.868 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.866+0000 7ff80bfff700 1 --2- 192.168.123.106:0/3951282557 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff80c102e70 0x7ff80c103290 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7ff7fc009b00 tx=0x7ff7fc009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:13.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.869+0000 7ff80bfff700 1 -- 192.168.123.106:0/3951282557 shutdown_connections 2026-03-09T00:01:13.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.869+0000 7ff80bfff700 1 --2- 192.168.123.106:0/3951282557 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff80c104060 0x7ff80c1044e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:13.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.869+0000 7ff80bfff700 1 --2- 192.168.123.106:0/3951282557 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff80c102e70 0x7ff80c103290 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:13.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.869+0000 7ff80bfff700 1 -- 192.168.123.106:0/3951282557 >> 192.168.123.106:0/3951282557 conn(0x7ff80c0fe440 msgr2=0x7ff80c1008a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:13.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.869+0000 7ff80bfff700 1 -- 192.168.123.106:0/3951282557 shutdown_connections 2026-03-09T00:01:13.870 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.869+0000 7ff80bfff700 1 -- 192.168.123.106:0/3951282557 wait complete. 2026-03-09T00:01:13.871 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.871+0000 7ff80bfff700 1 Processor -- start 2026-03-09T00:01:13.871 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.871+0000 7ff80bfff700 1 -- start start 2026-03-09T00:01:13.871 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.871+0000 7ff80bfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff80c102e70 0x7ff80c10f6c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:13.872 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.871+0000 7ff80bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff80c104060 0x7ff80c10fc00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:13.872 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.871+0000 7ff80bfff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff80c110220 con 0x7ff80c104060 2026-03-09T00:01:13.872 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.871+0000 7ff80bfff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff80c110360 con 0x7ff80c102e70 2026-03-09T00:01:13.872 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.872+0000 7ff80affd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff80c102e70 0x7ff80c10f6c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:13.872 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.872+0000 7ff80affd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff80c102e70 
0x7ff80c10f6c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:33294/0 (socket says 192.168.123.106:33294) 2026-03-09T00:01:13.872 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.872+0000 7ff80affd700 1 -- 192.168.123.106:0/497185233 learned_addr learned my addr 192.168.123.106:0/497185233 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T00:01:13.872 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.872+0000 7ff80a7fc700 1 --2- 192.168.123.106:0/497185233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff80c104060 0x7ff80c10fc00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:13.873 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.872+0000 7ff80affd700 1 -- 192.168.123.106:0/497185233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff80c104060 msgr2=0x7ff80c10fc00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:13.873 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.872+0000 7ff80affd700 1 --2- 192.168.123.106:0/497185233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff80c104060 0x7ff80c10fc00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:13.873 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.872+0000 7ff80affd700 1 -- 192.168.123.106:0/497185233 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff7fc0097e0 con 0x7ff80c102e70 2026-03-09T00:01:13.873 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.873+0000 7ff80affd700 1 --2- 192.168.123.106:0/497185233 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff80c102e70 0x7ff80c10f6c0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7ff7fc000c00 tx=0x7ff7fc004990 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:13.873 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.873+0000 7ff7f3fff700 1 -- 192.168.123.106:0/497185233 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff7fc01d070 con 0x7ff80c102e70 2026-03-09T00:01:13.874 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.873+0000 7ff80bfff700 1 -- 192.168.123.106:0/497185233 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff80c112da0 con 0x7ff80c102e70 2026-03-09T00:01:13.874 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.873+0000 7ff80bfff700 1 -- 192.168.123.106:0/497185233 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff80c1aafc0 con 0x7ff80c102e70 2026-03-09T00:01:13.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.874+0000 7ff7f3fff700 1 -- 192.168.123.106:0/497185233 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff7fc022470 con 0x7ff80c102e70 2026-03-09T00:01:13.875 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.875+0000 7ff7f3fff700 1 -- 192.168.123.106:0/497185233 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff7fc00f670 con 0x7ff80c102e70 2026-03-09T00:01:13.876 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.876+0000 7ff7f3fff700 1 -- 192.168.123.106:0/497185233 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7ff7fc00f7d0 con 0x7ff80c102e70 2026-03-09T00:01:13.876 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.876+0000 7ff7f3fff700 1 --2- 192.168.123.106:0/497185233 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff7f406c600 0x7ff7f406eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:13.876 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.876+0000 7ff7f3fff700 1 -- 192.168.123.106:0/497185233 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(25..25 src has 1..25) v4 ==== 3697+0+0 (secure 0 0 0) 0x7ff7fc08d780 con 0x7ff80c102e70 2026-03-09T00:01:13.876 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.876+0000 7ff80a7fc700 1 --2- 192.168.123.106:0/497185233 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff7f406c600 0x7ff7f406eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:13.877 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.877+0000 7ff80bfff700 1 -- 192.168.123.106:0/497185233 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff7f8005320 con 0x7ff80c102e70 2026-03-09T00:01:13.881 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.880+0000 7ff80a7fc700 1 --2- 192.168.123.106:0/497185233 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff7f406c600 0x7ff7f406eac0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7ff80c102ba0 tx=0x7ff80000b410 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:13.881 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.881+0000 7ff7f3fff700 1 -- 192.168.123.106:0/497185233 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff7fc05c1e0 con 0x7ff80c102e70 2026-03-09T00:01:13.997 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:13.996+0000 7ff80bfff700 1 -- 192.168.123.106:0/497185233 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdc", "target": ["mon-mgr", ""]}) v1 -- 0x7ff7f8000bf0 con 0x7ff7f406c600 2026-03-09T00:01:14.260 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:01:14 vm06 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4[70662]: 2026-03-09T00:01:14.123+0000 7f78d14fb640 -1 osd.4 0 log_to_monitors true 2026-03-09T00:01:14.547 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:14 vm06 ceph-mon[58395]: pgmap v50: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 63 KiB/s, 0 objects/s recovering 2026-03-09T00:01:14.547 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:14 vm06 ceph-mon[58395]: from='client.24165 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:01:14.547 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:14 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T00:01:14.547 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:14 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T00:01:14.547 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:14 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:14.547 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:14 vm06 ceph-mon[58395]: from='osd.4 [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T00:01:14.547 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:14 vm06 ceph-mon[58395]: Detected new or changed devices on vm06 2026-03-09T00:01:14.547 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:14 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:14.547 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:14 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:14.548 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:14 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T00:01:14.548 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:14 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:14.548 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:14 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:01:14.548 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:14 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:14.548 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:14 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:01:14.548 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:14 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:14.548 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:14 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:01:14.548 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:14 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:14.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:14 vm03 ceph-mon[52346]: pgmap v50: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 63 KiB/s, 0 objects/s recovering 2026-03-09T00:01:14.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:14 vm03 ceph-mon[52346]: from='client.24165 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm06:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:01:14.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:14 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' 
entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-09T00:01:14.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:14 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-09T00:01:14.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:14 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:14.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:14 vm03 ceph-mon[52346]: from='osd.4 [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T00:01:14.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:14 vm03 ceph-mon[52346]: Detected new or changed devices on vm06 2026-03-09T00:01:14.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:14 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:14.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:14 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:14.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:14 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T00:01:14.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:14 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:14.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:14 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:01:14.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:14 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:14.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:14 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:01:14.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:14 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:14.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:14 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:01:14.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:14 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:15.269 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:01:14 vm06 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4[70662]: 2026-03-09T00:01:14.927+0000 7f78c7b61700 -1 osd.4 0 waiting for initial osdmap 2026-03-09T00:01:15.269 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:01:14 vm06 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4[70662]: 2026-03-09T00:01:14.938+0000 7f78c2153700 -1 osd.4 27 set_numa_affinity unable to identify public interface '' numa node: (2) 
No such file or directory 2026-03-09T00:01:15.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:15 vm06 ceph-mon[58395]: from='osd.4 [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893]' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-09T00:01:15.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:15 vm06 ceph-mon[58395]: osdmap e26: 5 total, 4 up, 5 in 2026-03-09T00:01:15.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:15 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:15.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:15 vm06 ceph-mon[58395]: from='osd.4 [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:01:15.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:15 vm06 ceph-mon[58395]: from='client.? 192.168.123.106:0/816545341' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "15462a2c-77f6-4f87-a9bf-e5fe4de71f8f"}]: dispatch 2026-03-09T00:01:15.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:15 vm06 ceph-mon[58395]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "15462a2c-77f6-4f87-a9bf-e5fe4de71f8f"}]: dispatch 2026-03-09T00:01:15.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:15 vm06 ceph-mon[58395]: from='osd.4 [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893]' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished 2026-03-09T00:01:15.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:15 vm06 ceph-mon[58395]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "15462a2c-77f6-4f87-a9bf-e5fe4de71f8f"}]': finished 2026-03-09T00:01:15.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:15 vm06 ceph-mon[58395]: osdmap e27: 6 total, 4 up, 6 in 2026-03-09T00:01:15.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:15 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:15.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:15 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:01:15.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:15 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:15.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:15 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:01:15.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:15 vm03 ceph-mon[52346]: from='osd.4 [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893]' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-09T00:01:15.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:15 vm03 ceph-mon[52346]: osdmap e26: 5 total, 4 up, 5 in 2026-03-09T00:01:15.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:15 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:15.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:15 vm03 ceph-mon[52346]: from='osd.4 [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:01:15.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:15 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/816545341' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "15462a2c-77f6-4f87-a9bf-e5fe4de71f8f"}]: dispatch 2026-03-09T00:01:15.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:15 vm03 ceph-mon[52346]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "15462a2c-77f6-4f87-a9bf-e5fe4de71f8f"}]: dispatch 2026-03-09T00:01:15.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:15 vm03 ceph-mon[52346]: from='osd.4 [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893]' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished 2026-03-09T00:01:15.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:15 vm03 ceph-mon[52346]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "15462a2c-77f6-4f87-a9bf-e5fe4de71f8f"}]': finished 2026-03-09T00:01:15.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:15 vm03 ceph-mon[52346]: osdmap e27: 6 total, 4 up, 6 in 2026-03-09T00:01:15.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:15 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:15.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:15 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:01:15.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:15 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:15.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:15 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:01:16.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:16 vm06 ceph-mon[58395]: from='client.? 192.168.123.106:0/2639937298' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T00:01:16.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:16 vm06 ceph-mon[58395]: osd.4 [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893] boot 2026-03-09T00:01:16.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:16 vm06 ceph-mon[58395]: osdmap e28: 6 total, 5 up, 6 in 2026-03-09T00:01:16.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:16 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:16.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:16 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:01:16.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:16 vm06 ceph-mon[58395]: pgmap v54: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail; 75 KiB/s, 0 objects/s recovering 2026-03-09T00:01:16.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:16 vm03 ceph-mon[52346]: from='client.? 
192.168.123.106:0/2639937298' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-09T00:01:16.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:16 vm03 ceph-mon[52346]: osd.4 [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893] boot 2026-03-09T00:01:16.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:16 vm03 ceph-mon[52346]: osdmap e28: 6 total, 5 up, 6 in 2026-03-09T00:01:16.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:16 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:01:16.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:16 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:01:16.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:16 vm03 ceph-mon[52346]: pgmap v54: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail; 75 KiB/s, 0 objects/s recovering 2026-03-09T00:01:18.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:17 vm03 ceph-mon[52346]: purged_snaps scrub starts 2026-03-09T00:01:18.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:17 vm03 ceph-mon[52346]: purged_snaps scrub ok 2026-03-09T00:01:18.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:17 vm03 ceph-mon[52346]: osdmap e29: 6 total, 5 up, 6 in 2026-03-09T00:01:18.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:17 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:01:18.413 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:17 vm06 ceph-mon[58395]: purged_snaps scrub starts 2026-03-09T00:01:18.413 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:17 vm06 ceph-mon[58395]: purged_snaps scrub ok 2026-03-09T00:01:18.413 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:17 vm06 ceph-mon[58395]: osdmap e29: 6 total, 5 up, 6 in 2026-03-09T00:01:18.413 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:17 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:01:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:18 vm06 ceph-mon[58395]: pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T00:01:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:18 vm03 ceph-mon[52346]: pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T00:01:20.253 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:19 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T00:01:20.253 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:19 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:20.253 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:19 vm06 ceph-mon[58395]: Deploying daemon osd.5 on vm06 2026-03-09T00:01:20.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:19 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T00:01:20.338 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:19 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:20.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:19 vm03 ceph-mon[52346]: Deploying daemon osd.5 on vm06 2026-03-09T00:01:21.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:20 vm03 ceph-mon[52346]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T00:01:21.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:20 vm06 ceph-mon[58395]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T00:01:22.332 INFO:teuthology.orchestra.run.vm06.stdout:Created osd(s) 5 on host 'vm06' 2026-03-09T00:01:22.333 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:22.328+0000 7ff7f3fff700 1 -- 192.168.123.106:0/497185233 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7ff7f8000bf0 con 0x7ff7f406c600 2026-03-09T00:01:22.333 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:22.331+0000 7ff80bfff700 1 -- 192.168.123.106:0/497185233 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff7f406c600 msgr2=0x7ff7f406eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:22.333 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:22.331+0000 7ff80bfff700 1 --2- 192.168.123.106:0/497185233 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff7f406c600 0x7ff7f406eac0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7ff80c102ba0 tx=0x7ff80000b410 comp rx=0 tx=0).stop 2026-03-09T00:01:22.333 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:22.331+0000 7ff80bfff700 1 -- 192.168.123.106:0/497185233 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff80c102e70 msgr2=0x7ff80c10f6c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:22.333 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:22.331+0000 7ff80bfff700 1 --2- 192.168.123.106:0/497185233 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff80c102e70 0x7ff80c10f6c0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7ff7fc000c00 tx=0x7ff7fc004990 comp rx=0 tx=0).stop 2026-03-09T00:01:22.333 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:22.331+0000 7ff80bfff700 1 -- 192.168.123.106:0/497185233 shutdown_connections 2026-03-09T00:01:22.333 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:22.331+0000 7ff80bfff700 1 --2- 192.168.123.106:0/497185233 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff7f406c600 0x7ff7f406eac0 secure :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7ff80c102ba0 tx=0x7ff80000b410 comp rx=0 tx=0).stop 2026-03-09T00:01:22.333 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:22.331+0000 7ff80bfff700 1 --2- 192.168.123.106:0/497185233 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff80c102e70 0x7ff80c10f6c0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:22.333 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:22.331+0000 7ff80bfff700 1 --2- 192.168.123.106:0/497185233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff80c104060 0x7ff80c10fc00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:22.333 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:22.331+0000 7ff80bfff700 1 -- 192.168.123.106:0/497185233 >> 192.168.123.106:0/497185233 conn(0x7ff80c0fe440 msgr2=0x7ff80c078df0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:22.333 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:22.331+0000 7ff80bfff700 1 -- 192.168.123.106:0/497185233 shutdown_connections 2026-03-09T00:01:22.333 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:22.331+0000 7ff80bfff700 1 -- 192.168.123.106:0/497185233 wait complete. 2026-03-09T00:01:22.348 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:22 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:22.348 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:22 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:22.348 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:22 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:01:22.404 DEBUG:teuthology.orchestra.run.vm06:osd.5> sudo journalctl -f -n 0 -u ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.5.service 2026-03-09T00:01:22.406 INFO:tasks.cephadm:Waiting for 6 OSDs to come up... 2026-03-09T00:01:22.406 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd stat -f json 2026-03-09T00:01:22.436 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:22 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:22.436 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:22 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:22.437 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:22 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:01:22.597 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:22.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.894+0000 7f557e1e2700 1 -- 192.168.123.103:0/2306828350 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f55781020d0 msgr2=0x7f5578102550 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:22.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.894+0000 7f557e1e2700 1 --2- 192.168.123.103:0/2306828350 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f55781020d0 0x7f5578102550 secure :-1 s=READY pgs=195 cs=0 l=1 rev1=1 crypto rx=0x7f5568009b00 tx=0x7f5568009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:22.895 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.895+0000 7f557e1e2700 1 -- 192.168.123.103:0/2306828350 shutdown_connections 2026-03-09T00:01:22.895 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.895+0000 7f557e1e2700 1 --2- 192.168.123.103:0/2306828350 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f55781020d0 0x7f5578102550 unknown :-1 s=CLOSED pgs=195 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:22.895 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.895+0000 7f557e1e2700 1 --2- 
192.168.123.103:0/2306828350 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5578100ef0 0x7f5578101330 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:22.895 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.895+0000 7f557e1e2700 1 -- 192.168.123.103:0/2306828350 >> 192.168.123.103:0/2306828350 conn(0x7f55780fc4c0 msgr2=0x7f55780fe920 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:22.895 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.895+0000 7f557e1e2700 1 -- 192.168.123.103:0/2306828350 shutdown_connections 2026-03-09T00:01:22.895 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.895+0000 7f557e1e2700 1 -- 192.168.123.103:0/2306828350 wait complete. 2026-03-09T00:01:22.895 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.896+0000 7f557e1e2700 1 Processor -- start 2026-03-09T00:01:22.896 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.896+0000 7f557e1e2700 1 -- start start 2026-03-09T00:01:22.896 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.897+0000 7f557e1e2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5578100ef0 0x7f55781967d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:22.896 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.897+0000 7f557e1e2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f55781020d0 0x7f5578196d10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:22.896 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.897+0000 7f557e1e2700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5578197330 con 0x7f55781020d0 2026-03-09T00:01:22.896 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.897+0000 7f557e1e2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5578197470 con 0x7f5578100ef0 2026-03-09T00:01:22.896 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.897+0000 7f55777fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5578100ef0 0x7f55781967d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:22.896 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.897+0000 7f55777fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5578100ef0 0x7f55781967d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:51454/0 (socket says 192.168.123.103:51454) 2026-03-09T00:01:22.896 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.897+0000 7f55777fe700 1 -- 192.168.123.103:0/4049939894 learned_addr learned my addr 192.168.123.103:0/4049939894 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:22.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.897+0000 7f5576ffd700 1 --2- 192.168.123.103:0/4049939894 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f55781020d0 0x7f5578196d10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:22.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.897+0000 
7f55777fe700 1 -- 192.168.123.103:0/4049939894 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f55781020d0 msgr2=0x7f5578196d10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:22.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.898+0000 7f55777fe700 1 --2- 192.168.123.103:0/4049939894 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f55781020d0 0x7f5578196d10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:22.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.898+0000 7f55777fe700 1 -- 192.168.123.103:0/4049939894 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f55680097e0 con 0x7f5578100ef0 2026-03-09T00:01:22.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.898+0000 7f5576ffd700 1 --2- 192.168.123.103:0/4049939894 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f55781020d0 0x7f5578196d10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T00:01:22.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.898+0000 7f55777fe700 1 --2- 192.168.123.103:0/4049939894 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5578100ef0 0x7f55781967d0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f556000d900 tx=0x7f556000dc10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:22.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.899+0000 7f5574ff9700 1 -- 192.168.123.103:0/4049939894 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f55600041d0 con 0x7f5578100ef0 2026-03-09T00:01:22.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.899+0000 7f557e1e2700 1 -- 192.168.123.103:0/4049939894 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f557819bf20 con 0x7f5578100ef0 2026-03-09T00:01:22.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.899+0000 7f557e1e2700 1 -- 192.168.123.103:0/4049939894 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f557819c3e0 con 0x7f5578100ef0 2026-03-09T00:01:22.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.899+0000 7f5574ff9700 1 -- 192.168.123.103:0/4049939894 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5560004330 con 0x7f5578100ef0 2026-03-09T00:01:22.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.900+0000 7f5574ff9700 1 -- 192.168.123.103:0/4049939894 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5560003da0 con 0x7f5578100ef0 2026-03-09T00:01:22.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.901+0000 7f5574ff9700 1 -- 192.168.123.103:0/4049939894 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f5560010460 con 0x7f5578100ef0 2026-03-09T00:01:22.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.901+0000 7f557e1e2700 1 -- 192.168.123.103:0/4049939894 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5558005320 con 0x7f5578100ef0 2026-03-09T00:01:22.901 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.901+0000 7f5574ff9700 1 --2- 192.168.123.103:0/4049939894 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f556406c4e0 0x7f556406e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:22.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.901+0000 7f5574ff9700 1 -- 192.168.123.103:0/4049939894 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(29..29 src has 1..29) v4 ==== 4129+0+0 (secure 0 0 0) 0x7f5560021030 con 0x7f5578100ef0 2026-03-09T00:01:22.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.902+0000 7f5576ffd700 1 --2- 192.168.123.103:0/4049939894 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f556406c4e0 0x7f556406e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:22.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.902+0000 7f5576ffd700 1 --2- 192.168.123.103:0/4049939894 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f556406c4e0 0x7f556406e9a0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f55680052d0 tx=0x7f556800b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:22.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:22.907+0000 7f5574ff9700 1 -- 192.168.123.103:0/4049939894 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f55600597c0 con 0x7f5578100ef0 2026-03-09T00:01:23.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:23.017+0000 7f557e1e2700 1 -- 192.168.123.103:0/4049939894 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f5558005190 con 0x7f5578100ef0 2026-03-09T00:01:23.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:23.018+0000 7f5574ff9700 1 -- 192.168.123.103:0/4049939894 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v29) v1 ==== 74+0+130 (secure 0 0 0) 0x7f5560059350 con 0x7f5578100ef0 2026-03-09T00:01:23.019 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:01:23.021 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:23.022+0000 7f557e1e2700 1 -- 192.168.123.103:0/4049939894 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f556406c4e0 msgr2=0x7f556406e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:23.022 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:23.022+0000 7f557e1e2700 1 --2- 192.168.123.103:0/4049939894 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f556406c4e0 0x7f556406e9a0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f55680052d0 tx=0x7f556800b540 comp rx=0 tx=0).stop 2026-03-09T00:01:23.022 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:23.023+0000 7f557e1e2700 1 -- 192.168.123.103:0/4049939894 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5578100ef0 msgr2=0x7f55781967d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:23.022 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:23.023+0000 7f557e1e2700 1 --2- 192.168.123.103:0/4049939894 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5578100ef0 0x7f55781967d0 secure :-1 s=READY 
pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f556000d900 tx=0x7f556000dc10 comp rx=0 tx=0).stop 2026-03-09T00:01:23.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:23.023+0000 7f557e1e2700 1 -- 192.168.123.103:0/4049939894 shutdown_connections 2026-03-09T00:01:23.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:23.024+0000 7f557e1e2700 1 --2- 192.168.123.103:0/4049939894 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f556406c4e0 0x7f556406e9a0 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:23.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:23.024+0000 7f557e1e2700 1 --2- 192.168.123.103:0/4049939894 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5578100ef0 0x7f55781967d0 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:23.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:23.024+0000 7f557e1e2700 1 --2- 192.168.123.103:0/4049939894 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f55781020d0 0x7f5578196d10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:23.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:23.024+0000 7f557e1e2700 1 -- 192.168.123.103:0/4049939894 >> 192.168.123.103:0/4049939894 conn(0x7f55780fc4c0 msgr2=0x7f5578105390 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:23.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:23.024+0000 7f557e1e2700 1 -- 192.168.123.103:0/4049939894 shutdown_connections 2026-03-09T00:01:23.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:23.024+0000 7f557e1e2700 1 -- 192.168.123.103:0/4049939894 wait complete. 2026-03-09T00:01:23.103 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":29,"num_osds":6,"num_up_osds":5,"osd_up_since":1773014475,"num_in_osds":6,"osd_in_since":1773014474,"num_remapped_pgs":0} 2026-03-09T00:01:23.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:23 vm03 ceph-mon[52346]: pgmap v58: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T00:01:23.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:23 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:23.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:23 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:23.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:23 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:23.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:23 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:23.281 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:23 vm03 ceph-mon[52346]: from='client.? 
192.168.123.103:0/4049939894' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T00:01:23.591 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:23 vm06 ceph-mon[58395]: pgmap v58: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T00:01:23.591 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:23 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:23.591 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:23 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:23.591 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:23 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:23.591 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:23 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:23.591 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:23 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/4049939894' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T00:01:23.591 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:01:23 vm06 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[76918]: 2026-03-09T00:01:23.299+0000 7f8fb6814640 -1 osd.5 0 log_to_monitors true 2026-03-09T00:01:24.104 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd stat -f json 2026-03-09T00:01:24.265 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:24.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:24 vm03 ceph-mon[52346]: from='osd.5 [v2:192.168.123.106:6816/1139626945,v1:192.168.123.106:6817/1139626945]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T00:01:24.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:24 vm03 ceph-mon[52346]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T00:01:24.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:24 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:24.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:24 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:24.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:24 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T00:01:24.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:24 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:24.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:24 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:01:24.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:24 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 
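
Note: the `INFO:tasks.cephadm:Waiting for 6 OSDs to come up...` record above marks the start of a readiness poll. The task re-runs `ceph osd stat -f json` through `cephadm shell` (the repeated `DEBUG:teuthology.orchestra.run.vm03:> sudo ... cephadm ... -- ceph osd stat -f json` records, roughly one second apart) and inspects `num_up_osds` in the JSON reply; at this point the reply still reports 5 of 6 OSDs up. A minimal sketch of that kind of poll, assuming the command line and fsid exactly as they appear in the log (illustrative only, not teuthology's actual implementation):

    # Sketch of the readiness poll seen above: repeat `ceph osd stat -f json`
    # until the wanted number of OSDs reports up. Command and fsid are copied
    # from the log; the function name and timeout are illustrative.
    import json
    import subprocess
    import time

    CMD = [
        "sudo", "/home/ubuntu/cephtest/cephadm",
        "--image", "quay.io/ceph/ceph:v18.2.1",
        "shell", "-c", "/etc/ceph/ceph.conf",
        "-k", "/etc/ceph/ceph.client.admin.keyring",
        "--fsid", "ae8f0172-1b4a-11f1-916a-712b2ac006b7", "--",
        "ceph", "osd", "stat", "-f", "json",
    ]

    def wait_for_osds(want: int, timeout: float = 300.0) -> dict:
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            stat = json.loads(subprocess.check_output(CMD))
            if stat["num_up_osds"] >= want:
                return stat              # e.g. {"epoch": 32, "num_up_osds": 6, ...}
            time.sleep(1)                # polls in the log are ~1 s apart
        raise TimeoutError(f"fewer than {want} OSDs up after {timeout}s")
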
2026-03-09T00:01:24.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:24 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:01:24.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:24 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:24.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:24 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:01:24.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:24 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:24.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.551+0000 7fec1675d700 1 -- 192.168.123.103:0/818358179 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec10102060 msgr2=0x7fec101024e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:24.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.551+0000 7fec1675d700 1 --2- 192.168.123.103:0/818358179 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec10102060 0x7fec101024e0 secure :-1 s=READY pgs=196 cs=0 l=1 rev1=1 crypto rx=0x7fec00009b00 tx=0x7fec00009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:24.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.552+0000 7fec1675d700 1 -- 192.168.123.103:0/818358179 shutdown_connections 2026-03-09T00:01:24.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.552+0000 7fec1675d700 1 --2- 192.168.123.103:0/818358179 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec10102060 0x7fec101024e0 unknown :-1 s=CLOSED pgs=196 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:24.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.552+0000 7fec1675d700 1 --2- 192.168.123.103:0/818358179 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fec10100f00 0x7fec10101320 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:24.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.552+0000 7fec1675d700 1 -- 192.168.123.103:0/818358179 >> 192.168.123.103:0/818358179 conn(0x7fec100fc460 msgr2=0x7fec100fe8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:24.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.552+0000 7fec1675d700 1 -- 192.168.123.103:0/818358179 shutdown_connections 2026-03-09T00:01:24.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.552+0000 7fec1675d700 1 -- 192.168.123.103:0/818358179 wait complete. 
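
Note: most of the stderr volume around each poll is messenger debug output from the short-lived `cephadm shell` client: it connects to the mons under a fresh nonce (e.g. `192.168.123.103:0/818358179`), runs one command, then emits the `mark_down` / `shutdown_connections` / `wait complete.` teardown sequence. When skimming a log like this, it can help to drop those records and keep only the audit and cluster-log events. A rough filter sketch (the patterns are illustrative, tuned to the record shapes visible here):

    # Keep audit/cluster events; drop messenger ("-- ..." / "--2- ...") chatter.
    import re
    import sys

    MSGR = re.compile(r" --(2-)? ")      # messenger and msgr2 debug records
    KEEP = re.compile(r'cmd=\[|osdmap e\d+|pgmap v\d+|Deploying daemon|Created osd')

    path = sys.argv[1] if len(sys.argv) > 1 else "teuthology.log"
    with open(path) as fh:               # e.g. python3 skim.py teuthology.log
        for line in fh:
            if MSGR.search(line) is None and KEEP.search(line):
                print(line.rstrip())
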
2026-03-09T00:01:24.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.553+0000 7fec1675d700 1 Processor -- start 2026-03-09T00:01:24.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.553+0000 7fec1675d700 1 -- start start 2026-03-09T00:01:24.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.553+0000 7fec1675d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fec10100f00 0x7fec10074b30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:24.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.554+0000 7fec1675d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec10102060 0x7fec10073180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:24.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.554+0000 7fec1675d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fec100736c0 con 0x7fec10102060 2026-03-09T00:01:24.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.554+0000 7fec1675d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fec10073800 con 0x7fec10100f00 2026-03-09T00:01:24.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.554+0000 7fec0f7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec10102060 0x7fec10073180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:24.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.554+0000 7fec0f7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec10102060 0x7fec10073180 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:47280/0 (socket says 192.168.123.103:47280) 2026-03-09T00:01:24.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.554+0000 7fec0f7fe700 1 -- 192.168.123.103:0/1484698233 learned_addr learned my addr 192.168.123.103:0/1484698233 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:24.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.554+0000 7fec0ffff700 1 --2- 192.168.123.103:0/1484698233 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fec10100f00 0x7fec10074b30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:24.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.554+0000 7fec0f7fe700 1 -- 192.168.123.103:0/1484698233 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fec10100f00 msgr2=0x7fec10074b30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:24.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.554+0000 7fec0f7fe700 1 --2- 192.168.123.103:0/1484698233 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fec10100f00 0x7fec10074b30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:24.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.554+0000 7fec0f7fe700 1 -- 192.168.123.103:0/1484698233 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fec000097e0 con 0x7fec10102060 2026-03-09T00:01:24.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.554+0000 7fec0ffff700 1 --2- 192.168.123.103:0/1484698233 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fec10100f00 0x7fec10074b30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:01:24.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.554+0000 7fec0f7fe700 1 --2- 192.168.123.103:0/1484698233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec10102060 0x7fec10073180 secure :-1 s=READY pgs=197 cs=0 l=1 rev1=1 crypto rx=0x7fec00009ad0 tx=0x7fec00005070 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:24.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.554+0000 7fec0d7fa700 1 -- 192.168.123.103:0/1484698233 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fec0001d070 con 0x7fec10102060 2026-03-09T00:01:24.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.554+0000 7fec0d7fa700 1 -- 192.168.123.103:0/1484698233 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fec0000bc50 con 0x7fec10102060 2026-03-09T00:01:24.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.554+0000 7fec0d7fa700 1 -- 192.168.123.103:0/1484698233 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fec0000f7f0 con 0x7fec10102060 2026-03-09T00:01:24.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.555+0000 7fec1675d700 1 -- 192.168.123.103:0/1484698233 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fec10073a80 con 0x7fec10102060 2026-03-09T00:01:24.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.555+0000 7fec1675d700 1 -- 192.168.123.103:0/1484698233 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fec10073f70 con 0x7fec10102060 2026-03-09T00:01:24.556 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.556+0000 7fec1675d700 1 -- 192.168.123.103:0/1484698233 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fec10066e80 con 0x7fec10102060 2026-03-09T00:01:24.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.559+0000 7fec0d7fa700 1 -- 192.168.123.103:0/1484698233 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fec0000f950 con 0x7fec10102060 2026-03-09T00:01:24.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.559+0000 7fec0d7fa700 1 --2- 192.168.123.103:0/1484698233 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7febfc06c530 0x7febfc06e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:24.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.560+0000 7fec0d7fa700 1 -- 192.168.123.103:0/1484698233 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(30..30 src has 1..30) v4 ==== 4150+0+0 (secure 0 0 0) 0x7fec0008d6a0 con 0x7fec10102060 2026-03-09T00:01:24.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.560+0000 7fec0d7fa700 1 -- 192.168.123.103:0/1484698233 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+180422 (secure 0 0 0) 0x7fec0008db70 con 0x7fec10102060 2026-03-09T00:01:24.560 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.560+0000 7fec0ffff700 1 --2- 192.168.123.103:0/1484698233 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7febfc06c530 0x7febfc06e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:24.560 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.560+0000 7fec0ffff700 1 --2- 192.168.123.103:0/1484698233 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7febfc06c530 0x7febfc06e9f0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7febf80079c0 tx=0x7febf8008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:24.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.665+0000 7fec1675d700 1 -- 192.168.123.103:0/1484698233 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7fec1019e920 con 0x7fec10102060 2026-03-09T00:01:24.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.665+0000 7fec0d7fa700 1 -- 192.168.123.103:0/1484698233 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v30) v1 ==== 74+0+130 (secure 0 0 0) 0x7fec00027090 con 0x7fec10102060 2026-03-09T00:01:24.665 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:01:24.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.668+0000 7fec1675d700 1 -- 192.168.123.103:0/1484698233 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7febfc06c530 msgr2=0x7febfc06e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:24.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.668+0000 7fec1675d700 1 --2- 192.168.123.103:0/1484698233 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7febfc06c530 0x7febfc06e9f0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7febf80079c0 tx=0x7febf8008040 comp rx=0 tx=0).stop 2026-03-09T00:01:24.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.668+0000 7fec1675d700 1 -- 192.168.123.103:0/1484698233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec10102060 msgr2=0x7fec10073180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:24.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.668+0000 7fec1675d700 1 --2- 192.168.123.103:0/1484698233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec10102060 0x7fec10073180 secure :-1 s=READY pgs=197 cs=0 l=1 rev1=1 crypto rx=0x7fec00009ad0 tx=0x7fec00005070 comp rx=0 tx=0).stop 2026-03-09T00:01:24.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.669+0000 7fec1675d700 1 -- 192.168.123.103:0/1484698233 shutdown_connections 2026-03-09T00:01:24.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.669+0000 7fec1675d700 1 --2- 192.168.123.103:0/1484698233 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7febfc06c530 0x7febfc06e9f0 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:24.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.669+0000 7fec1675d700 1 --2- 192.168.123.103:0/1484698233 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fec10100f00 0x7fec10074b30 unknown :-1 s=CLOSED pgs=0 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:24.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.669+0000 7fec1675d700 1 --2- 192.168.123.103:0/1484698233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec10102060 0x7fec10073180 unknown :-1 s=CLOSED pgs=197 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:24.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.669+0000 7fec1675d700 1 -- 192.168.123.103:0/1484698233 >> 192.168.123.103:0/1484698233 conn(0x7fec100fc460 msgr2=0x7fec10105320 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:24.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.669+0000 7fec1675d700 1 -- 192.168.123.103:0/1484698233 shutdown_connections 2026-03-09T00:01:24.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:24.669+0000 7fec1675d700 1 -- 192.168.123.103:0/1484698233 wait complete. 2026-03-09T00:01:24.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:24 vm06 ceph-mon[58395]: from='osd.5 [v2:192.168.123.106:6816/1139626945,v1:192.168.123.106:6817/1139626945]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T00:01:24.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:24 vm06 ceph-mon[58395]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T00:01:24.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:24 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:24.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:24 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:24.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:24 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T00:01:24.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:24 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:24.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:24 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:01:24.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:24 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:24.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:24 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:01:24.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:24 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:24.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:24 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:01:24.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:24 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:24.739 
INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":30,"num_osds":6,"num_up_osds":5,"osd_up_since":1773014475,"num_in_osds":6,"osd_in_since":1773014474,"num_remapped_pgs":0} 2026-03-09T00:01:25.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:25 vm03 ceph-mon[52346]: Detected new or changed devices on vm06 2026-03-09T00:01:25.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:25 vm03 ceph-mon[52346]: pgmap v59: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T00:01:25.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:25 vm03 ceph-mon[52346]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-09T00:01:25.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:25 vm03 ceph-mon[52346]: osdmap e30: 6 total, 5 up, 6 in 2026-03-09T00:01:25.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:25 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:01:25.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:25 vm03 ceph-mon[52346]: from='osd.5 [v2:192.168.123.106:6816/1139626945,v1:192.168.123.106:6817/1139626945]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:01:25.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:25 vm03 ceph-mon[52346]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:01:25.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:25 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/1484698233' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T00:01:25.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:25 vm06 ceph-mon[58395]: Detected new or changed devices on vm06 2026-03-09T00:01:25.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:25 vm06 ceph-mon[58395]: pgmap v59: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T00:01:25.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:25 vm06 ceph-mon[58395]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-09T00:01:25.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:25 vm06 ceph-mon[58395]: osdmap e30: 6 total, 5 up, 6 in 2026-03-09T00:01:25.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:25 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:01:25.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:25 vm06 ceph-mon[58395]: from='osd.5 [v2:192.168.123.106:6816/1139626945,v1:192.168.123.106:6817/1139626945]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:01:25.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:25 vm06 ceph-mon[58395]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:01:25.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:25 vm06 ceph-mon[58395]: from='client.? 
192.168.123.103:0/1484698233' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T00:01:25.671 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:01:25 vm06 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[76918]: 2026-03-09T00:01:25.298+0000 7f8face7a700 -1 osd.5 0 waiting for initial osdmap 2026-03-09T00:01:25.671 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:01:25 vm06 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[76918]: 2026-03-09T00:01:25.305+0000 7f8fa5468700 -1 osd.5 31 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T00:01:25.740 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd stat -f json 2026-03-09T00:01:25.912 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:26.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.194+0000 7fa75a871700 1 -- 192.168.123.103:0/2772260727 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7541001c0 msgr2=0x7fa754100640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:26.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.194+0000 7fa75a871700 1 --2- 192.168.123.103:0/2772260727 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7541001c0 0x7fa754100640 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7fa744009a60 tx=0x7fa744009d70 comp rx=0 tx=0).stop 2026-03-09T00:01:26.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.195+0000 7fa75a871700 1 -- 192.168.123.103:0/2772260727 shutdown_connections 2026-03-09T00:01:26.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.195+0000 7fa75a871700 1 --2- 192.168.123.103:0/2772260727 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7541001c0 0x7fa754100640 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:26.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.195+0000 7fa75a871700 1 --2- 192.168.123.103:0/2772260727 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7540ff860 0x7fa7540ffc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:26.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.196+0000 7fa75a871700 1 -- 192.168.123.103:0/2772260727 >> 192.168.123.103:0/2772260727 conn(0x7fa7540fb3c0 msgr2=0x7fa7540fd840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:26.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.196+0000 7fa75a871700 1 -- 192.168.123.103:0/2772260727 shutdown_connections 2026-03-09T00:01:26.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.196+0000 7fa75a871700 1 -- 192.168.123.103:0/2772260727 wait complete. 
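
Note: the `osd crush create-or-move` records above register osd.5 under `host=vm06` with `"weight":0.0195`. By Ceph convention the default CRUSH weight is the device capacity in TiB, so 0.0195 TiB is roughly 20 GiB per OSD, consistent with the pgmap line `100 GiB / 100 GiB avail` while only 5 of 6 OSDs were up. A quick check of that arithmetic:

    # CRUSH weight defaults to capacity in TiB; cross-check against the pgmap.
    weight_tib = 0.0195
    per_osd_gib = weight_tib * 1024       # ~19.97 GiB per OSD
    print(round(per_osd_gib * 5))         # 100 -> matches "100 GiB / 100 GiB avail"
    print(round(per_osd_gib * 6))         # 120 -> expected once osd.5 counts too

The `set_numa_affinity unable to identify public interface` error from osd.5 is commonly seen on virtual machines with no usable NUMA topology and is typically harmless in this context.
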
2026-03-09T00:01:26.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.197+0000 7fa75a871700 1 Processor -- start 2026-03-09T00:01:26.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.197+0000 7fa75a871700 1 -- start start 2026-03-09T00:01:26.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.197+0000 7fa75a871700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7540ff860 0x7fa7541989e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:26.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.197+0000 7fa75a871700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7541001c0 0x7fa754198f20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:26.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.197+0000 7fa75a871700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa754199540 con 0x7fa7540ff860 2026-03-09T00:01:26.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.197+0000 7fa75a871700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa754199680 con 0x7fa7541001c0 2026-03-09T00:01:26.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.197+0000 7fa7537fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7541001c0 0x7fa754198f20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:26.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.197+0000 7fa7537fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7541001c0 0x7fa754198f20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:58594/0 (socket says 192.168.123.103:58594) 2026-03-09T00:01:26.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.197+0000 7fa7537fe700 1 -- 192.168.123.103:0/1672831550 learned_addr learned my addr 192.168.123.103:0/1672831550 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:26.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.198+0000 7fa7537fe700 1 -- 192.168.123.103:0/1672831550 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7540ff860 msgr2=0x7fa7541989e0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:01:26.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.198+0000 7fa753fff700 1 --2- 192.168.123.103:0/1672831550 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7540ff860 0x7fa7541989e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:26.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.198+0000 7fa7537fe700 1 --2- 192.168.123.103:0/1672831550 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7540ff860 0x7fa7541989e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:26.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.198+0000 7fa7537fe700 1 -- 192.168.123.103:0/1672831550 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa73c0097e0 con 
0x7fa7541001c0 2026-03-09T00:01:26.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.198+0000 7fa753fff700 1 --2- 192.168.123.103:0/1672831550 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7540ff860 0x7fa7541989e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:01:26.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.198+0000 7fa7537fe700 1 --2- 192.168.123.103:0/1672831550 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7541001c0 0x7fa754198f20 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fa7440096a0 tx=0x7fa74400f880 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:26.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.198+0000 7fa7517fa700 1 -- 192.168.123.103:0/1672831550 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa74401d070 con 0x7fa7541001c0 2026-03-09T00:01:26.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.198+0000 7fa75a871700 1 -- 192.168.123.103:0/1672831550 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa744009710 con 0x7fa7541001c0 2026-03-09T00:01:26.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.198+0000 7fa75a871700 1 -- 192.168.123.103:0/1672831550 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa75419e430 con 0x7fa7541001c0 2026-03-09T00:01:26.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.198+0000 7fa7517fa700 1 -- 192.168.123.103:0/1672831550 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa74400fe60 con 0x7fa7541001c0 2026-03-09T00:01:26.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.198+0000 7fa7517fa700 1 -- 192.168.123.103:0/1672831550 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa7440177d0 con 0x7fa7541001c0 2026-03-09T00:01:26.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.199+0000 7fa75a871700 1 -- 192.168.123.103:0/1672831550 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa754066e80 con 0x7fa7541001c0 2026-03-09T00:01:26.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.200+0000 7fa7517fa700 1 -- 192.168.123.103:0/1672831550 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fa744021410 con 0x7fa7541001c0 2026-03-09T00:01:26.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.200+0000 7fa7517fa700 1 --2- 192.168.123.103:0/1672831550 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa74006c4e0 0x7fa74006e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:26.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.200+0000 7fa7517fa700 1 -- 192.168.123.103:0/1672831550 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(31..31 src has 1..31) v4 ==== 4166+0+0 (secure 0 0 0) 0x7fa74408c5a0 con 0x7fa7541001c0 2026-03-09T00:01:26.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.200+0000 7fa753fff700 1 --2- 192.168.123.103:0/1672831550 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa74006c4e0 0x7fa74006e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:26.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.201+0000 7fa753fff700 1 --2- 192.168.123.103:0/1672831550 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa74006c4e0 0x7fa74006e9a0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7fa73c005fd0 tx=0x7fa73c009500 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:26.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.203+0000 7fa7517fa700 1 -- 192.168.123.103:0/1672831550 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fa74405adb0 con 0x7fa7541001c0 2026-03-09T00:01:26.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.317+0000 7fa75a871700 1 -- 192.168.123.103:0/1672831550 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7fa75419e700 con 0x7fa7541001c0 2026-03-09T00:01:26.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.317+0000 7fa7517fa700 1 -- 192.168.123.103:0/1672831550 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v32) v1 ==== 74+0+130 (secure 0 0 0) 0x7fa7440260a0 con 0x7fa7541001c0 2026-03-09T00:01:26.317 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:01:26.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.320+0000 7fa75a871700 1 -- 192.168.123.103:0/1672831550 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa74006c4e0 msgr2=0x7fa74006e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:26.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.320+0000 7fa75a871700 1 --2- 192.168.123.103:0/1672831550 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa74006c4e0 0x7fa74006e9a0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7fa73c005fd0 tx=0x7fa73c009500 comp rx=0 tx=0).stop 2026-03-09T00:01:26.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.320+0000 7fa75a871700 1 -- 192.168.123.103:0/1672831550 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7541001c0 msgr2=0x7fa754198f20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:26.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.320+0000 7fa75a871700 1 --2- 192.168.123.103:0/1672831550 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7541001c0 0x7fa754198f20 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fa7440096a0 tx=0x7fa74400f880 comp rx=0 tx=0).stop 2026-03-09T00:01:26.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.320+0000 7fa75a871700 1 -- 192.168.123.103:0/1672831550 shutdown_connections 2026-03-09T00:01:26.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.320+0000 7fa75a871700 1 --2- 192.168.123.103:0/1672831550 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa74006c4e0 0x7fa74006e9a0 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:26.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.320+0000 7fa75a871700 1 --2- 192.168.123.103:0/1672831550 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7540ff860 0x7fa7541989e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:26.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.320+0000 7fa75a871700 1 --2- 192.168.123.103:0/1672831550 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7541001c0 0x7fa754198f20 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:26.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.320+0000 7fa75a871700 1 -- 192.168.123.103:0/1672831550 >> 192.168.123.103:0/1672831550 conn(0x7fa7540fb3c0 msgr2=0x7fa754107e90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:26.320 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.320+0000 7fa75a871700 1 -- 192.168.123.103:0/1672831550 shutdown_connections 2026-03-09T00:01:26.320 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.320+0000 7fa75a871700 1 -- 192.168.123.103:0/1672831550 wait complete. 2026-03-09T00:01:26.330 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:26 vm03 ceph-mon[52346]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished 2026-03-09T00:01:26.330 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:26 vm03 ceph-mon[52346]: osdmap e31: 6 total, 5 up, 6 in 2026-03-09T00:01:26.330 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:26 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:01:26.330 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:26 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:01:26.330 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:26 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:01:26.379 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":32,"num_osds":6,"num_up_osds":6,"osd_up_since":1773014486,"num_in_osds":6,"osd_in_since":1773014474,"num_remapped_pgs":0} 2026-03-09T00:01:26.379 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd dump --format=json 2026-03-09T00:01:26.533 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:26.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:26 vm06 ceph-mon[58395]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]': finished 2026-03-09T00:01:26.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:26 vm06 ceph-mon[58395]: osdmap e31: 6 total, 5 up, 6 in 2026-03-09T00:01:26.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:26 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:01:26.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:26 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:01:26.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:26 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 
5}]: dispatch 2026-03-09T00:01:26.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.795+0000 7f687e904700 1 -- 192.168.123.103:0/3157625069 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6878103cf0 msgr2=0x7f6878107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:26.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.795+0000 7f687e904700 1 --2- 192.168.123.103:0/3157625069 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6878103cf0 0x7f6878107d40 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f6868009b00 tx=0x7f6868009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:26.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.796+0000 7f687e904700 1 -- 192.168.123.103:0/3157625069 shutdown_connections 2026-03-09T00:01:26.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.796+0000 7f687e904700 1 --2- 192.168.123.103:0/3157625069 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6878103cf0 0x7f6878107d40 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:26.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.796+0000 7f687e904700 1 --2- 192.168.123.103:0/3157625069 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6878103340 0x7f6878103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:26.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.796+0000 7f687e904700 1 -- 192.168.123.103:0/3157625069 >> 192.168.123.103:0/3157625069 conn(0x7f68780feb90 msgr2=0x7f6878100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:26.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.796+0000 7f687e904700 1 -- 192.168.123.103:0/3157625069 shutdown_connections 2026-03-09T00:01:26.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.796+0000 7f687e904700 1 -- 192.168.123.103:0/3157625069 wait complete. 
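The `--` and `--2-` lines here are AsyncMessenger and msgr2 lifecycle traces emitted because the suite sets `debug ms: 1`: each short-lived `cephadm shell -- ceph ...` invocation brings up a fresh RADOS client, races connections to both monitors, and then tears everything down again (mark_down, stop, shutdown_connections, wait complete), so these blocks are expected noise rather than failures. The `ceph osd stat --format=json` output just above ({"num_osds":6,"num_up_osds":6,...}) is what the harness is actually after: all six OSDs report up, so it proceeds to `ceph osd dump`. A minimal sketch of such a readiness poll, reusing the cephadm path, image, and fsid shown in the log (illustrative only, not teuthology's own implementation):

    # Poll `ceph osd stat` through cephadm shell until every OSD is up.
    import json
    import subprocess
    import time

    FSID = "ae8f0172-1b4a-11f1-916a-712b2ac006b7"   # fsid from the log
    IMAGE = "quay.io/ceph/ceph:v18.2.1"             # image from the log

    def osd_stat():
        out = subprocess.check_output([
            "sudo", "/home/ubuntu/cephtest/cephadm", "--image", IMAGE,
            "shell", "--fsid", FSID, "--",
            "ceph", "osd", "stat", "--format=json",
        ])
        return json.loads(out)

    stat = osd_stat()
    while stat["num_up_osds"] < stat["num_osds"]:
        time.sleep(5)                               # poll interval is arbitrary
        stat = osd_stat()
    print("all %d OSDs up at epoch %d" % (stat["num_osds"], stat["epoch"]))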
2026-03-09T00:01:26.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.797+0000 7f687e904700 1 Processor -- start 2026-03-09T00:01:26.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.797+0000 7f687e904700 1 -- start start 2026-03-09T00:01:26.797 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.797+0000 7f687e904700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6878103340 0x7f6878198dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:26.797 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.798+0000 7f687e904700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6878103cf0 0x7f6878199300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:26.797 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.798+0000 7f687e904700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f68781999e0 con 0x7f6878103cf0 2026-03-09T00:01:26.797 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.798+0000 7f687e904700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f687819d770 con 0x7f6878103340 2026-03-09T00:01:26.798 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.798+0000 7f68777fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6878103cf0 0x7f6878199300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:26.798 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.798+0000 7f68777fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6878103cf0 0x7f6878199300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:40918/0 (socket says 192.168.123.103:40918) 2026-03-09T00:01:26.798 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.798+0000 7f68777fe700 1 -- 192.168.123.103:0/2360572208 learned_addr learned my addr 192.168.123.103:0/2360572208 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:26.798 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.798+0000 7f6877fff700 1 --2- 192.168.123.103:0/2360572208 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6878103340 0x7f6878198dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:26.798 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.798+0000 7f68777fe700 1 -- 192.168.123.103:0/2360572208 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6878103340 msgr2=0x7f6878198dc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:26.798 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.798+0000 7f68777fe700 1 --2- 192.168.123.103:0/2360572208 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6878103340 0x7f6878198dc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:26.798 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.798+0000 7f68777fe700 1 -- 192.168.123.103:0/2360572208 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f6860009710 con 0x7f6878103cf0 2026-03-09T00:01:26.798 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.798+0000 7f6877fff700 1 --2- 192.168.123.103:0/2360572208 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6878103340 0x7f6878198dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T00:01:26.798 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.798+0000 7f68777fe700 1 --2- 192.168.123.103:0/2360572208 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6878103cf0 0x7f6878199300 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7f686800ba30 tx=0x7f686800ba60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:26.798 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.798+0000 7f68757fa700 1 -- 192.168.123.103:0/2360572208 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f686801d070 con 0x7f6878103cf0 2026-03-09T00:01:26.798 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.798+0000 7f68757fa700 1 -- 192.168.123.103:0/2360572208 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f686800f460 con 0x7f6878103cf0 2026-03-09T00:01:26.798 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.798+0000 7f68757fa700 1 -- 192.168.123.103:0/2360572208 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6868005170 con 0x7f6878103cf0 2026-03-09T00:01:26.798 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.799+0000 7f687e904700 1 -- 192.168.123.103:0/2360572208 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f68680097e0 con 0x7f6878103cf0 2026-03-09T00:01:26.799 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.799+0000 7f687e904700 1 -- 192.168.123.103:0/2360572208 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f687819dd20 con 0x7f6878103cf0 2026-03-09T00:01:26.799 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.800+0000 7f687e904700 1 -- 192.168.123.103:0/2360572208 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f687804ea90 con 0x7f6878103cf0 2026-03-09T00:01:26.800 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.800+0000 7f68757fa700 1 -- 192.168.123.103:0/2360572208 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f686800fab0 con 0x7f6878103cf0 2026-03-09T00:01:26.800 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.801+0000 7f68757fa700 1 --2- 192.168.123.103:0/2360572208 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f686406c4e0 0x7f686406e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:26.803 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.801+0000 7f68757fa700 1 -- 192.168.123.103:0/2360572208 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(32..32 src has 1..32) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f686805ab70 con 0x7f6878103cf0 2026-03-09T00:01:26.803 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.804+0000 7f68757fa700 1 -- 192.168.123.103:0/2360572208 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+180422 (secure 0 0 0) 0x7f6868091050 con 0x7f6878103cf0 2026-03-09T00:01:26.803 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.804+0000 7f6877fff700 1 --2- 192.168.123.103:0/2360572208 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f686406c4e0 0x7f686406e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:26.804 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.804+0000 7f6877fff700 1 --2- 192.168.123.103:0/2360572208 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f686406c4e0 0x7f686406e9a0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f6860009fd0 tx=0x7f6860009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:26.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.910+0000 7f687e904700 1 -- 192.168.123.103:0/2360572208 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f6878066e80 con 0x7f6878103cf0 2026-03-09T00:01:26.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.911+0000 7f68757fa700 1 -- 192.168.123.103:0/2360572208 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v32) v1 ==== 74+0+11249 (secure 0 0 0) 0x7f6868026090 con 0x7f6878103cf0 2026-03-09T00:01:26.911 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:01:26.911 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":32,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","created":"2026-03-08T23:58:56.272672+0000","modified":"2026-03-09T00:01:26.296864+0000","last_up_change":"2026-03-09T00:01:26.296864+0000","last_in_change":"2026-03-09T00:01:14.917922+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-09T00:00:56.171195+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"20","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0
,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"1eefdd28-e5a7-4e98-a454-60c0bb654070","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6802","nonce":3077634834},{"type":"v1","addr":"192.168.123.103:6803","nonce":3077634834}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":3077634834},{"type":"v1","addr":"192.168.123.103:6805","nonce":3077634834}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6808","nonce":3077634834},{"type":"v1","addr":"192.168.123.103:6809","nonce":3077634834}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":3077634834},{"type":"v1","addr":"192.168.123.103:6807","nonce":3077634834}]},"public_addr":"192.168.123.103:6803/3077634834","cluster_addr":"192.168.123.103:6805/3077634834","heartbeat_back_addr":"192.168.123.103:6809/3077634834","heartbeat_front_addr":"192.168.123.103:6807/3077634834","state":["exists","up"]},{"osd":1,"uuid":"6a8d6e5f-c441-499a-a4bb-8d9bc046a85f","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":24,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6810","nonce":3932541768},{"type":"v1","addr":"192.168.123.103:6811","nonce":3932541768}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6812","nonce":3932541768},{"type":"v1","addr":"192.168.123.103:6813","nonce":3932541768}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6816","nonce":3932541768},{"type":"v1","addr":"192.168.123.103:6817","nonce":3932541768}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6814","nonce":3932541768},{"type":"v1","addr":"192.168.123.103:6815","nonce":3932541768}]},"public_addr":"192.168.123.103:6811/3932541768","cluster_addr":"192.168.123.103:6813/3932541768","heartbeat_back_addr":"192.168.123.103:6817/3932541768","heartbeat_front_addr":"192.168.123.103:6815/3932541768","state":["exists","up"]},{"osd":2,"uuid":"57704364-d509-479a-8dff-0b9f590cc6d0","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6818","nonce":901207923},{"type":"v1","addr":"192.168.123.103:6819","nonce":901207923}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6820","nonce":901207923},{"type":"v1","addr":"192.168.123.103:6821","nonce":901207923}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6824","nonce":901207923},{"type":"v1","addr":"192.168.123.103:6825","nonce":901207923}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6822","nonce":901207923},{"type":"v1","addr":"192.168.123.103:6823","nonce":901207923}]},"public_addr":"192.168.123.103:6819/901207923","cluster_addr":"192.168.1
23.103:6821/901207923","heartbeat_back_addr":"192.168.123.103:6825/901207923","heartbeat_front_addr":"192.168.123.103:6823/901207923","state":["exists","up"]},{"osd":3,"uuid":"8b49bccb-fd91-44f4-831e-a401044f0e64","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":23,"up_thru":27,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6800","nonce":1431049389},{"type":"v1","addr":"192.168.123.106:6801","nonce":1431049389}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6802","nonce":1431049389},{"type":"v1","addr":"192.168.123.106:6803","nonce":1431049389}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6806","nonce":1431049389},{"type":"v1","addr":"192.168.123.106:6807","nonce":1431049389}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6804","nonce":1431049389},{"type":"v1","addr":"192.168.123.106:6805","nonce":1431049389}]},"public_addr":"192.168.123.106:6801/1431049389","cluster_addr":"192.168.123.106:6803/1431049389","heartbeat_back_addr":"192.168.123.106:6807/1431049389","heartbeat_front_addr":"192.168.123.106:6805/1431049389","state":["exists","up"]},{"osd":4,"uuid":"8c3a4d00-bb0a-4f59-b53b-83364e99627b","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":28,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6808","nonce":1004293893},{"type":"v1","addr":"192.168.123.106:6809","nonce":1004293893}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6810","nonce":1004293893},{"type":"v1","addr":"192.168.123.106:6811","nonce":1004293893}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6814","nonce":1004293893},{"type":"v1","addr":"192.168.123.106:6815","nonce":1004293893}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6812","nonce":1004293893},{"type":"v1","addr":"192.168.123.106:6813","nonce":1004293893}]},"public_addr":"192.168.123.106:6809/1004293893","cluster_addr":"192.168.123.106:6811/1004293893","heartbeat_back_addr":"192.168.123.106:6815/1004293893","heartbeat_front_addr":"192.168.123.106:6813/1004293893","state":["exists","up"]},{"osd":5,"uuid":"15462a2c-77f6-4f87-a9bf-e5fe4de71f8f","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":32,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6816","nonce":1139626945},{"type":"v1","addr":"192.168.123.106:6817","nonce":1139626945}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6818","nonce":1139626945},{"type":"v1","addr":"192.168.123.106:6819","nonce":1139626945}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6822","nonce":1139626945},{"type":"v1","addr":"192.168.123.106:6823","nonce":1139626945}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6820","nonce":1139626945},{"type":"v1","addr":"192.168.123.106:6821","nonce":1139626945}]},"public_addr":"192.168.123.106:6817/1139626945","cluster_addr":"192.168.123.106:6819/1139626945","heartbeat_back_addr":"192.168.123.106:6823/1139626945","heartbeat_front_addr":"192.168.123.106:6821/1139626945","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T00:00:33.479111+0000","dead_epoch"
:0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T00:00:43.710028+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T00:00:54.267401+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T00:01:05.164542+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T00:01:15.083285+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.103:0/2723392878":"2026-03-09T23:59:59.972086+0000","192.168.123.103:0/78662117":"2026-03-09T23:59:25.439954+0000","192.168.123.103:0/2768716536":"2026-03-09T23:59:59.972086+0000","192.168.123.103:0/2535486397":"2026-03-09T23:59:59.972086+0000","192.168.123.103:0/2762219228":"2026-03-09T23:59:25.439954+0000","192.168.123.103:6801/2":"2026-03-09T23:59:11.665534+0000","192.168.123.103:0/1691249097":"2026-03-09T23:59:11.665534+0000","192.168.123.103:0/1899662013":"2026-03-09T23:59:11.665534+0000","192.168.123.103:0/1443304653":"2026-03-09T23:59:11.665534+0000","192.168.123.103:6800/2":"2026-03-09T23:59:11.665534+0000","192.168.123.103:0/2143338907":"2026-03-09T23:59:25.439954+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-09T00:01:26.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.915+0000 7f687e904700 1 -- 192.168.123.103:0/2360572208 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f686406c4e0 msgr2=0x7f686406e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:26.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.915+0000 7f687e904700 1 --2- 192.168.123.103:0/2360572208 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f686406c4e0 0x7f686406e9a0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f6860009fd0 tx=0x7f6860009450 comp rx=0 tx=0).stop 2026-03-09T00:01:26.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.915+0000 7f687e904700 1 -- 192.168.123.103:0/2360572208 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6878103cf0 msgr2=0x7f6878199300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:26.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.915+0000 7f687e904700 1 --2- 192.168.123.103:0/2360572208 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6878103cf0 0x7f6878199300 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7f686800ba30 tx=0x7f686800ba60 comp rx=0 tx=0).stop 2026-03-09T00:01:26.915 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.916+0000 7f687e904700 1 -- 192.168.123.103:0/2360572208 shutdown_connections 2026-03-09T00:01:26.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.916+0000 7f687e904700 1 --2- 192.168.123.103:0/2360572208 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f686406c4e0 0x7f686406e9a0 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:26.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.916+0000 7f687e904700 1 --2- 192.168.123.103:0/2360572208 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6878103340 0x7f6878198dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:26.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.916+0000 7f687e904700 1 --2- 192.168.123.103:0/2360572208 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6878103cf0 0x7f6878199300 unknown :-1 s=CLOSED pgs=198 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:26.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.916+0000 7f687e904700 1 -- 192.168.123.103:0/2360572208 >> 192.168.123.103:0/2360572208 conn(0x7f68780feb90 msgr2=0x7f68781075b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:26.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.916+0000 7f687e904700 1 -- 192.168.123.103:0/2360572208 shutdown_connections 2026-03-09T00:01:26.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:26.917+0000 7f687e904700 1 -- 192.168.123.103:0/2360572208 wait complete. 2026-03-09T00:01:26.984 INFO:tasks.cephadm.ceph_manager.ceph:[{'pool': 1, 'pool_name': '.mgr', 'create_time': '2026-03-09T00:00:56.171195+0000', 'flags': 1, 'flags_names': 'hashpspool', 'type': 1, 'size': 3, 'min_size': 2, 'crush_rule': 0, 'peering_crush_bucket_count': 0, 'peering_crush_bucket_target': 0, 'peering_crush_bucket_barrier': 0, 'peering_crush_bucket_mandatory_member': 2147483647, 'object_hash': 2, 'pg_autoscale_mode': 'off', 'pg_num': 1, 'pg_placement_num': 1, 'pg_placement_num_target': 1, 'pg_num_target': 1, 'pg_num_pending': 1, 'last_pg_merge_meta': {'source_pgid': '0.0', 'ready_epoch': 0, 'last_epoch_started': 0, 'last_epoch_clean': 0, 'source_version': "0'0", 'target_version': "0'0"}, 'last_change': '20', 'last_force_op_resend': '0', 'last_force_op_resend_prenautilus': '0', 'last_force_op_resend_preluminous': '0', 'auid': 0, 'snap_mode': 'selfmanaged', 'snap_seq': 0, 'snap_epoch': 0, 'pool_snaps': [], 'removed_snaps': '[]', 'quota_max_bytes': 0, 'quota_max_objects': 0, 'tiers': [], 'tier_of': -1, 'read_tier': -1, 'write_tier': -1, 'cache_mode': 'none', 'target_max_bytes': 0, 'target_max_objects': 0, 'cache_target_dirty_ratio_micro': 400000, 'cache_target_dirty_high_ratio_micro': 600000, 'cache_target_full_ratio_micro': 800000, 'cache_min_flush_age': 0, 'cache_min_evict_age': 0, 'erasure_code_profile': '', 'hit_set_params': {'type': 'none'}, 'hit_set_period': 0, 'hit_set_count': 0, 'use_gmt_hitset': True, 'min_read_recency_for_promote': 0, 'min_write_recency_for_promote': 0, 'hit_set_grade_decay_rate': 0, 'hit_set_search_last_n': 0, 'grade_table': [], 'stripe_width': 0, 'expected_num_objects': 0, 'fast_read': False, 'options': {'pg_num_max': 32, 'pg_num_min': 1}, 'application_metadata': {'mgr': {}}, 'read_balance': {'score_acting': 6, 'score_stable': 6, 'optimal_score': 0.5, 'raw_score_acting': 3, 'raw_score_stable': 3, 
'primary_affinity_weighted': 1, 'average_primary_affinity': 1, 'average_primary_affinity_weighted': 1}}] 2026-03-09T00:01:26.985 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd pool get .mgr pg_num 2026-03-09T00:01:27.134 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:27.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.394+0000 7ff97c10c700 1 -- 192.168.123.103:0/3877124635 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff9741010d0 msgr2=0x7ff9741014b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:27.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.394+0000 7ff97c10c700 1 --2- 192.168.123.103:0/3877124635 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff9741010d0 0x7ff9741014b0 secure :-1 s=READY pgs=199 cs=0 l=1 rev1=1 crypto rx=0x7ff964009b00 tx=0x7ff964009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:27.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.395+0000 7ff97c10c700 1 -- 192.168.123.103:0/3877124635 shutdown_connections 2026-03-09T00:01:27.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.395+0000 7ff97c10c700 1 --2- 192.168.123.103:0/3877124635 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff974101a80 0x7ff974105ad0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:27.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.395+0000 7ff97c10c700 1 --2- 192.168.123.103:0/3877124635 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff9741010d0 0x7ff9741014b0 unknown :-1 s=CLOSED pgs=199 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:27.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.395+0000 7ff97c10c700 1 -- 192.168.123.103:0/3877124635 >> 192.168.123.103:0/3877124635 conn(0x7ff9740fc920 msgr2=0x7ff9740fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:27.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.396+0000 7ff97c10c700 1 -- 192.168.123.103:0/3877124635 shutdown_connections 2026-03-09T00:01:27.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.396+0000 7ff97c10c700 1 -- 192.168.123.103:0/3877124635 wait complete. 
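The `tasks.cephadm.ceph_manager.ceph` entry above is the same `osd dump` payload after JSON decoding; the single quotes and `True`/`False` are Python reprs of the parsed structure, not malformed JSON. The manager then asks for the `.mgr` pool's `pg_num`. The same answer can be read straight out of the dump, with the field names exactly as they appear in the log (a sketch; `osd_dump.json` is a hypothetical file holding the payload above):

    import json

    with open("osd_dump.json") as f:       # hypothetical capture of the dump above
        dump = json.load(f)

    for pool in dump["pools"]:
        # prints ".mgr 1", matching the "pg_num: 1" the CLI returns below
        print(pool["pool_name"], pool["pg_num"])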
2026-03-09T00:01:27.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.396+0000 7ff97c10c700 1 Processor -- start 2026-03-09T00:01:27.395 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.396+0000 7ff97c10c700 1 -- start start 2026-03-09T00:01:27.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.397+0000 7ff97c10c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff9741010d0 0x7ff97419c9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:27.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.397+0000 7ff97c10c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff974101a80 0x7ff97419cef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:27.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.397+0000 7ff97c10c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff97419d580 con 0x7ff974101a80 2026-03-09T00:01:27.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.397+0000 7ff97c10c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff974196a30 con 0x7ff9741010d0 2026-03-09T00:01:27.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.397+0000 7ff9796a7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff974101a80 0x7ff97419cef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:27.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.397+0000 7ff9796a7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff974101a80 0x7ff97419cef0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:40936/0 (socket says 192.168.123.103:40936) 2026-03-09T00:01:27.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.397+0000 7ff9796a7700 1 -- 192.168.123.103:0/1568174909 learned_addr learned my addr 192.168.123.103:0/1568174909 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:27.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.397+0000 7ff979ea8700 1 --2- 192.168.123.103:0/1568174909 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff9741010d0 0x7ff97419c9b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:27.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.397+0000 7ff9796a7700 1 -- 192.168.123.103:0/1568174909 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff9741010d0 msgr2=0x7ff97419c9b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:27.396 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.397+0000 7ff9796a7700 1 --2- 192.168.123.103:0/1568174909 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff9741010d0 0x7ff97419c9b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:27.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.397+0000 7ff9796a7700 1 -- 192.168.123.103:0/1568174909 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7ff9640097e0 con 0x7ff974101a80 2026-03-09T00:01:27.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.397+0000 7ff9796a7700 1 --2- 192.168.123.103:0/1568174909 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff974101a80 0x7ff97419cef0 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7ff97000cc60 tx=0x7ff9700074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:27.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.398+0000 7ff96affd700 1 -- 192.168.123.103:0/1568174909 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff970007af0 con 0x7ff974101a80 2026-03-09T00:01:27.397 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.398+0000 7ff96affd700 1 -- 192.168.123.103:0/1568174909 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff970007c50 con 0x7ff974101a80 2026-03-09T00:01:27.398 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.398+0000 7ff96affd700 1 -- 192.168.123.103:0/1568174909 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff9700186c0 con 0x7ff974101a80 2026-03-09T00:01:27.398 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.398+0000 7ff97c10c700 1 -- 192.168.123.103:0/1568174909 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff974196cb0 con 0x7ff974101a80 2026-03-09T00:01:27.398 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.398+0000 7ff97c10c700 1 -- 192.168.123.103:0/1568174909 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff974197200 con 0x7ff974101a80 2026-03-09T00:01:27.398 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.398+0000 7ff979ea8700 1 --2- 192.168.123.103:0/1568174909 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff9741010d0 0x7ff97419c9b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
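The bootstrap sequence traced above is the standard one for every fresh CLI client: banner and hello exchange (`_handle_peer_banner_payload`, `handle_hello`), address learning (`learned_addr`), an authenticated `secure` msgr2 session to the winning monitor (the connection to the other mon is dropped with `mark_down`), then `mon_subscribe` for config and monmap, followed by mgrmap and osdmap, and a `get_command_descriptions` round-trip before the real command is dispatched. Programmatically, the pool query that follows is a single `mon_command` call through the librados Python binding, which hides this whole handshake (a sketch, assuming the admin keyring path used elsewhere in the log):

    import json
    import rados

    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf",
                          conf=dict(keyring="/etc/ceph/ceph.client.admin.keyring"))
    cluster.connect()
    cmd = json.dumps({"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"})
    ret, outbuf, errs = cluster.mon_command(cmd, b"")   # -> "pg_num: 1"
    print(ret, outbuf.decode())
    cluster.shutdown()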
2026-03-09T00:01:27.399 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.399+0000 7ff97c10c700 1 -- 192.168.123.103:0/1568174909 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff974109470 con 0x7ff974101a80 2026-03-09T00:01:27.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.401+0000 7ff96affd700 1 -- 192.168.123.103:0/1568174909 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7ff970018820 con 0x7ff974101a80 2026-03-09T00:01:27.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.401+0000 7ff96affd700 1 --2- 192.168.123.103:0/1568174909 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff96006c5b0 0x7ff96006ea70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:27.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.401+0000 7ff96affd700 1 -- 192.168.123.103:0/1568174909 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7ff97008b510 con 0x7ff974101a80 2026-03-09T00:01:27.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.402+0000 7ff979ea8700 1 --2- 192.168.123.103:0/1568174909 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff96006c5b0 0x7ff96006ea70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:27.401 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.402+0000 7ff979ea8700 1 --2- 192.168.123.103:0/1568174909 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff96006c5b0 0x7ff96006ea70 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7ff964006010 tx=0x7ff96400b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:27.402 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.402+0000 7ff96affd700 1 -- 192.168.123.103:0/1568174909 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff970059be0 con 0x7ff974101a80 2026-03-09T00:01:27.482 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:27 vm03 ceph-mon[52346]: purged_snaps scrub starts 2026-03-09T00:01:27.482 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:27 vm03 ceph-mon[52346]: purged_snaps scrub ok 2026-03-09T00:01:27.482 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:27 vm03 ceph-mon[52346]: pgmap v62: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T00:01:27.482 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:27 vm03 ceph-mon[52346]: osd.5 [v2:192.168.123.106:6816/1139626945,v1:192.168.123.106:6817/1139626945] boot 2026-03-09T00:01:27.482 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:27 vm03 ceph-mon[52346]: osdmap e32: 6 total, 6 up, 6 in 2026-03-09T00:01:27.482 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:27 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:01:27.482 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:27 vm03 ceph-mon[52346]: from='client.? 
192.168.123.103:0/1672831550' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T00:01:27.482 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:27 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/2360572208' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T00:01:27.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.508+0000 7ff97c10c700 1 -- 192.168.123.103:0/1568174909 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"} v 0) v1 -- 0x7ff974197f40 con 0x7ff974101a80 2026-03-09T00:01:27.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.509+0000 7ff96affd700 1 -- 192.168.123.103:0/1568174909 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]=0 v33) v1 ==== 93+0+10 (secure 0 0 0) 0x7ff970059770 con 0x7ff974101a80 2026-03-09T00:01:27.508 INFO:teuthology.orchestra.run.vm03.stdout:pg_num: 1 2026-03-09T00:01:27.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.511+0000 7ff97c10c700 1 -- 192.168.123.103:0/1568174909 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff96006c5b0 msgr2=0x7ff96006ea70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:27.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.511+0000 7ff97c10c700 1 --2- 192.168.123.103:0/1568174909 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff96006c5b0 0x7ff96006ea70 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7ff964006010 tx=0x7ff96400b540 comp rx=0 tx=0).stop 2026-03-09T00:01:27.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.512+0000 7ff97c10c700 1 -- 192.168.123.103:0/1568174909 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff974101a80 msgr2=0x7ff97419cef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:27.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.512+0000 7ff97c10c700 1 --2- 192.168.123.103:0/1568174909 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff974101a80 0x7ff97419cef0 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7ff97000cc60 tx=0x7ff9700074a0 comp rx=0 tx=0).stop 2026-03-09T00:01:27.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.512+0000 7ff97c10c700 1 -- 192.168.123.103:0/1568174909 shutdown_connections 2026-03-09T00:01:27.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.512+0000 7ff97c10c700 1 --2- 192.168.123.103:0/1568174909 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff96006c5b0 0x7ff96006ea70 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:27.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.512+0000 7ff97c10c700 1 --2- 192.168.123.103:0/1568174909 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff9741010d0 0x7ff97419c9b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:27.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.512+0000 7ff97c10c700 1 --2- 192.168.123.103:0/1568174909 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff974101a80 0x7ff97419cef0 unknown :-1 s=CLOSED pgs=200 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:27.511 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.512+0000 7ff97c10c700 1 -- 192.168.123.103:0/1568174909 >> 192.168.123.103:0/1568174909 conn(0x7ff9740fc920 msgr2=0x7ff9740fe040 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:27.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.512+0000 7ff97c10c700 1 -- 192.168.123.103:0/1568174909 shutdown_connections 2026-03-09T00:01:27.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.512+0000 7ff97c10c700 1 -- 192.168.123.103:0/1568174909 wait complete. 2026-03-09T00:01:27.557 INFO:tasks.cephadm:Setting up client nodes... 2026-03-09T00:01:27.557 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph auth get-or-create client.0 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-09T00:01:27.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:27 vm06 ceph-mon[58395]: purged_snaps scrub starts 2026-03-09T00:01:27.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:27 vm06 ceph-mon[58395]: purged_snaps scrub ok 2026-03-09T00:01:27.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:27 vm06 ceph-mon[58395]: pgmap v62: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-09T00:01:27.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:27 vm06 ceph-mon[58395]: osd.5 [v2:192.168.123.106:6816/1139626945,v1:192.168.123.106:6817/1139626945] boot 2026-03-09T00:01:27.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:27 vm06 ceph-mon[58395]: osdmap e32: 6 total, 6 up, 6 in 2026-03-09T00:01:27.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:27 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:01:27.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:27 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/1672831550' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-09T00:01:27.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:27 vm06 ceph-mon[58395]: from='client.? 
192.168.123.103:0/2360572208' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T00:01:27.717 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:27.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.955+0000 7fde538ee700 1 -- 192.168.123.103:0/4036003869 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde4c107d90 msgr2=0x7fde4c1081b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:27.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.955+0000 7fde538ee700 1 --2- 192.168.123.103:0/4036003869 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde4c107d90 0x7fde4c1081b0 secure :-1 s=READY pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7fde48009b00 tx=0x7fde48009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:27.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.956+0000 7fde538ee700 1 -- 192.168.123.103:0/4036003869 shutdown_connections 2026-03-09T00:01:27.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.956+0000 7fde538ee700 1 --2- 192.168.123.103:0/4036003869 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fde4c1086f0 0x7fde4c10f0d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:27.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.956+0000 7fde538ee700 1 --2- 192.168.123.103:0/4036003869 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde4c107d90 0x7fde4c1081b0 unknown :-1 s=CLOSED pgs=201 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:27.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.956+0000 7fde538ee700 1 -- 192.168.123.103:0/4036003869 >> 192.168.123.103:0/4036003869 conn(0x7fde4c06dda0 msgr2=0x7fde4c070220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:27.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.956+0000 7fde538ee700 1 -- 192.168.123.103:0/4036003869 shutdown_connections 2026-03-09T00:01:27.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.956+0000 7fde538ee700 1 -- 192.168.123.103:0/4036003869 wait complete. 
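"Setting up client nodes..." marks the next phase: teuthology mints a `client.0` key on vm03 (and `client.1` on vm06 below) with blanket `allow *` caps on mon, osd, mds, and mgr, which is fine for a throwaway test client but far broader than a production client should get. The messenger block that follows is the same per-invocation bootstrap as before. Once the key comes back, the harness installs it by piping the keyring into `sudo dd` and opening the permissions, as the `set -ex` block below shows; a sketch of that install step (the key string here is a placeholder, not the one from this run):

    import subprocess

    keyring = "[client.0]\n\tkey = <key from auth get-or-create>\n"  # placeholder
    path = "/etc/ceph/ceph.client.0.keyring"                         # path from the log
    subprocess.run(["sudo", "dd", "of=" + path], input=keyring.encode(), check=True)
    subprocess.run(["sudo", "chmod", "0644", path], check=True)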
2026-03-09T00:01:27.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.956+0000 7fde538ee700 1 Processor -- start 2026-03-09T00:01:27.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.956+0000 7fde538ee700 1 -- start start 2026-03-09T00:01:27.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.957+0000 7fde538ee700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde4c107d90 0x7fde4c10aa10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:27.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.957+0000 7fde538ee700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fde4c1086f0 0x7fde4c109060 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:27.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.957+0000 7fde538ee700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fde4c1095a0 con 0x7fde4c107d90 2026-03-09T00:01:27.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.957+0000 7fde538ee700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fde4c1096e0 con 0x7fde4c1086f0 2026-03-09T00:01:27.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.957+0000 7fde5168a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde4c107d90 0x7fde4c10aa10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:27.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.957+0000 7fde5168a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde4c107d90 0x7fde4c10aa10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:40954/0 (socket says 192.168.123.103:40954) 2026-03-09T00:01:27.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.957+0000 7fde5168a700 1 -- 192.168.123.103:0/3230861527 learned_addr learned my addr 192.168.123.103:0/3230861527 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:27.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.957+0000 7fde5168a700 1 -- 192.168.123.103:0/3230861527 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fde4c1086f0 msgr2=0x7fde4c109060 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:01:27.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.957+0000 7fde50e89700 1 --2- 192.168.123.103:0/3230861527 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fde4c1086f0 0x7fde4c109060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:27.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.957+0000 7fde5168a700 1 --2- 192.168.123.103:0/3230861527 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fde4c1086f0 0x7fde4c109060 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:27.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.957+0000 7fde5168a700 1 -- 192.168.123.103:0/3230861527 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fde480097e0 con 
0x7fde4c107d90 2026-03-09T00:01:27.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.957+0000 7fde50e89700 1 --2- 192.168.123.103:0/3230861527 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fde4c1086f0 0x7fde4c109060 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:01:27.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.957+0000 7fde5168a700 1 --2- 192.168.123.103:0/3230861527 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde4c107d90 0x7fde4c10aa10 secure :-1 s=READY pgs=202 cs=0 l=1 rev1=1 crypto rx=0x7fde4800bb70 tx=0x7fde48004690 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:27.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.958+0000 7fde427fc700 1 -- 192.168.123.103:0/3230861527 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fde4801d070 con 0x7fde4c107d90 2026-03-09T00:01:27.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.958+0000 7fde538ee700 1 -- 192.168.123.103:0/3230861527 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fde4c109960 con 0x7fde4c107d90 2026-03-09T00:01:27.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.958+0000 7fde538ee700 1 -- 192.168.123.103:0/3230861527 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fde4c109e50 con 0x7fde4c107d90 2026-03-09T00:01:27.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.958+0000 7fde427fc700 1 -- 192.168.123.103:0/3230861527 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fde48022470 con 0x7fde4c107d90 2026-03-09T00:01:27.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.958+0000 7fde427fc700 1 -- 192.168.123.103:0/3230861527 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fde4800f740 con 0x7fde4c107d90 2026-03-09T00:01:27.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.959+0000 7fde427fc700 1 -- 192.168.123.103:0/3230861527 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fde48004ca0 con 0x7fde4c107d90 2026-03-09T00:01:27.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.960+0000 7fde427fc700 1 --2- 192.168.123.103:0/3230861527 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fde3806c5b0 0x7fde3806ea70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:27.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.960+0000 7fde538ee700 1 -- 192.168.123.103:0/3230861527 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fde30005320 con 0x7fde4c107d90 2026-03-09T00:01:27.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.962+0000 7fde427fc700 1 -- 192.168.123.103:0/3230861527 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fde4808c910 con 0x7fde4c107d90 2026-03-09T00:01:27.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.962+0000 7fde427fc700 1 -- 192.168.123.103:0/3230861527 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 
0x7fde4805b090 con 0x7fde4c107d90 2026-03-09T00:01:27.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.963+0000 7fde50e89700 1 --2- 192.168.123.103:0/3230861527 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fde3806c5b0 0x7fde3806ea70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:27.966 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:27.963+0000 7fde50e89700 1 --2- 192.168.123.103:0/3230861527 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fde3806c5b0 0x7fde3806ea70 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fde3c009e20 tx=0x7fde3c009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:28.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:28.114+0000 7fde538ee700 1 -- 192.168.123.103:0/3230861527 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7fde30005190 con 0x7fde4c107d90 2026-03-09T00:01:28.119 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:28.119+0000 7fde427fc700 1 -- 192.168.123.103:0/3230861527 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v16) v1 ==== 170+0+59 (secure 0 0 0) 0x7fde4805ac20 con 0x7fde4c107d90 2026-03-09T00:01:28.119 INFO:teuthology.orchestra.run.vm03.stdout:[client.0] 2026-03-09T00:01:28.119 INFO:teuthology.orchestra.run.vm03.stdout: key = AQDYDa5pT0LlBhAAFYAw5BctokqSqOCfi13lfw== 2026-03-09T00:01:28.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:28.122+0000 7fde538ee700 1 -- 192.168.123.103:0/3230861527 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fde3806c5b0 msgr2=0x7fde3806ea70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:28.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:28.122+0000 7fde538ee700 1 --2- 192.168.123.103:0/3230861527 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fde3806c5b0 0x7fde3806ea70 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7fde3c009e20 tx=0x7fde3c009450 comp rx=0 tx=0).stop 2026-03-09T00:01:28.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:28.122+0000 7fde538ee700 1 -- 192.168.123.103:0/3230861527 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde4c107d90 msgr2=0x7fde4c10aa10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:28.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:28.122+0000 7fde538ee700 1 --2- 192.168.123.103:0/3230861527 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde4c107d90 0x7fde4c10aa10 secure :-1 s=READY pgs=202 cs=0 l=1 rev1=1 crypto rx=0x7fde4800bb70 tx=0x7fde48004690 comp rx=0 tx=0).stop 2026-03-09T00:01:28.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:28.122+0000 7fde538ee700 1 -- 192.168.123.103:0/3230861527 shutdown_connections 2026-03-09T00:01:28.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:28.122+0000 7fde538ee700 1 --2- 192.168.123.103:0/3230861527 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fde3806c5b0 0x7fde3806ea70 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:28.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:28.122+0000 7fde538ee700 1 --2- 192.168.123.103:0/3230861527 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde4c107d90 0x7fde4c10aa10 unknown :-1 s=CLOSED pgs=202 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:28.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:28.122+0000 7fde538ee700 1 --2- 192.168.123.103:0/3230861527 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fde4c1086f0 0x7fde4c109060 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:28.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:28.122+0000 7fde538ee700 1 -- 192.168.123.103:0/3230861527 >> 192.168.123.103:0/3230861527 conn(0x7fde4c06dda0 msgr2=0x7fde4c10d950 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:28.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:28.123+0000 7fde538ee700 1 -- 192.168.123.103:0/3230861527 shutdown_connections 2026-03-09T00:01:28.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:28.123+0000 7fde538ee700 1 -- 192.168.123.103:0/3230861527 wait complete. 2026-03-09T00:01:28.188 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T00:01:28.188 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/ceph/ceph.client.0.keyring 2026-03-09T00:01:28.188 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod 0644 /etc/ceph/ceph.client.0.keyring 2026-03-09T00:01:28.222 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph auth get-or-create client.1 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-09T00:01:28.370 INFO:teuthology.orchestra.run.vm06.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm06/config 2026-03-09T00:01:28.500 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:28 vm06 ceph-mon[58395]: osdmap e33: 6 total, 6 up, 6 in 2026-03-09T00:01:28.500 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:28 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/1568174909' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-09T00:01:28.500 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:28 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/3230861527' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T00:01:28.500 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:28 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/3230861527' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-09T00:01:28.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:28 vm03 ceph-mon[52346]: osdmap e33: 6 total, 6 up, 6 in 2026-03-09T00:01:28.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:28 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/1568174909' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-09T00:01:28.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:28 vm03 ceph-mon[52346]: from='client.? 
192.168.123.103:0/3230861527' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T00:01:28.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:28 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/3230861527' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-09T00:01:28.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.607+0000 7f6b55cb9700 1 -- 192.168.123.106:0/6274981 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b50104340 msgr2=0x7f6b501047a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:28.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.607+0000 7f6b55cb9700 1 --2- 192.168.123.106:0/6274981 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b50104340 0x7f6b501047a0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f6b40009b50 tx=0x7f6b40009e60 comp rx=0 tx=0).stop 2026-03-09T00:01:28.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.608+0000 7f6b55cb9700 1 -- 192.168.123.106:0/6274981 shutdown_connections 2026-03-09T00:01:28.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.608+0000 7f6b55cb9700 1 --2- 192.168.123.106:0/6274981 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b50104340 0x7f6b501047a0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:28.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.608+0000 7f6b55cb9700 1 --2- 192.168.123.106:0/6274981 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b50103140 0x7f6b50103560 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:28.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.608+0000 7f6b55cb9700 1 -- 192.168.123.106:0/6274981 >> 192.168.123.106:0/6274981 conn(0x7f6b500fe6c0 msgr2=0x7f6b50100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:28.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.608+0000 7f6b55cb9700 1 -- 192.168.123.106:0/6274981 shutdown_connections 2026-03-09T00:01:28.608 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.608+0000 7f6b55cb9700 1 -- 192.168.123.106:0/6274981 wait complete. 
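[editor's note] The keyring provisioning pattern in the lines above repeats once per client: the stdout of `ceph auth get-or-create client.N ...` (run inside `cephadm shell`) is captured and streamed into `/etc/ceph/ceph.client.N.keyring` on the target via `sudo dd`, then made world-readable with `chmod 0644`. A minimal sketch of that flow, assuming a hypothetical SSH-based `run()` helper (the real teuthology remote API differs):

```python
import subprocess

def run(host: str, cmd: str) -> str:
    """Hypothetical helper: run `cmd` on `host` over SSH, return stdout."""
    return subprocess.run(
        ["ssh", host, cmd], check=True, capture_output=True, text=True
    ).stdout

def provision_client_keyring(host: str, client_id: int) -> None:
    # Ask the cluster (via cephadm shell) to create-or-fetch the client key,
    # mirroring the `auth get-or-create` mon_command seen in the log.
    keyring = run(
        host,
        "sudo cephadm shell -- ceph auth get-or-create "
        f"client.{client_id} mon 'allow *' osd 'allow *' "
        "mds 'allow *' mgr 'allow *'",
    )
    # Write it out with dd (as the log shows) and relax permissions.
    path = f"/etc/ceph/ceph.client.{client_id}.keyring"
    subprocess.run(
        ["ssh", host, f"sudo dd of={path} && sudo chmod 0644 {path}"],
        input=keyring, check=True, text=True,
    )
```

The 0644 mode matters for the test: unprivileged workload processes on the client nodes need to read the keyring without sudo.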
2026-03-09T00:01:28.609 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.609+0000 7f6b55cb9700 1 Processor -- start 2026-03-09T00:01:28.609 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.609+0000 7f6b55cb9700 1 -- start start 2026-03-09T00:01:28.609 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.609+0000 7f6b55cb9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b50103140 0x7f6b50198a40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:28.609 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.609+0000 7f6b4f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b50103140 0x7f6b50198a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:28.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.609+0000 7f6b4f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b50103140 0x7f6b50198a40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.106:48776/0 (socket says 192.168.123.106:48776) 2026-03-09T00:01:28.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.609+0000 7f6b55cb9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b50104340 0x7f6b50198f80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:28.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.609+0000 7f6b55cb9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6b501995a0 con 0x7f6b50104340 2026-03-09T00:01:28.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.609+0000 7f6b55cb9700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6b501996e0 con 0x7f6b50103140 2026-03-09T00:01:28.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.609+0000 7f6b4f7fe700 1 -- 192.168.123.106:0/3187337094 learned_addr learned my addr 192.168.123.106:0/3187337094 (peer_addr_for_me v2:192.168.123.106:0/0) 2026-03-09T00:01:28.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.609+0000 7f6b4f7fe700 1 -- 192.168.123.106:0/3187337094 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b50104340 msgr2=0x7f6b50198f80 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T00:01:28.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.609+0000 7f6b4f7fe700 1 --2- 192.168.123.106:0/3187337094 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b50104340 0x7f6b50198f80 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:28.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.609+0000 7f6b4f7fe700 1 -- 192.168.123.106:0/3187337094 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6b400097e0 con 0x7f6b50103140 2026-03-09T00:01:28.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.610+0000 7f6b4f7fe700 1 --2- 192.168.123.106:0/3187337094 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b50103140 0x7f6b50198a40 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f6b38009fd0 tx=0x7f6b3800eea0 comp rx=0 tx=0).ready entity=mon.1 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:28.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.610+0000 7f6b4cff9700 1 -- 192.168.123.106:0/3187337094 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6b38009980 con 0x7f6b50103140 2026-03-09T00:01:28.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.610+0000 7f6b55cb9700 1 -- 192.168.123.106:0/3187337094 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6b5019e190 con 0x7f6b50103140 2026-03-09T00:01:28.610 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.610+0000 7f6b55cb9700 1 -- 192.168.123.106:0/3187337094 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6b5019e6e0 con 0x7f6b50103140 2026-03-09T00:01:28.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.610+0000 7f6b4cff9700 1 -- 192.168.123.106:0/3187337094 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6b38004500 con 0x7f6b50103140 2026-03-09T00:01:28.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.610+0000 7f6b4cff9700 1 -- 192.168.123.106:0/3187337094 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6b38010450 con 0x7f6b50103140 2026-03-09T00:01:28.611 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.611+0000 7f6b4cff9700 1 -- 192.168.123.106:0/3187337094 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f6b3800cca0 con 0x7f6b50103140 2026-03-09T00:01:28.612 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.612+0000 7f6b55cb9700 1 -- 192.168.123.106:0/3187337094 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6b30005320 con 0x7f6b50103140 2026-03-09T00:01:28.614 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.614+0000 7f6b4cff9700 1 --2- 192.168.123.106:0/3187337094 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6b3c06c5b0 0x7f6b3c06ea70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:28.614 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.614+0000 7f6b4cff9700 1 -- 192.168.123.106:0/3187337094 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f6b38014070 con 0x7f6b50103140 2026-03-09T00:01:28.614 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.614+0000 7f6b4effd700 1 --2- 192.168.123.106:0/3187337094 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6b3c06c5b0 0x7f6b3c06ea70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:28.614 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.614+0000 7f6b4effd700 1 --2- 192.168.123.106:0/3187337094 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6b3c06c5b0 0x7f6b3c06ea70 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f6b4000b5c0 tx=0x7f6b40005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:28.615 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.614+0000 7f6b4cff9700 1 -- 192.168.123.106:0/3187337094 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f6b38059f60 con 0x7f6b50103140 2026-03-09T00:01:28.759 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.759+0000 7f6b55cb9700 1 -- 192.168.123.106:0/3187337094 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7f6b300059f0 con 0x7f6b50103140 2026-03-09T00:01:28.764 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.764+0000 7f6b4cff9700 1 -- 192.168.123.106:0/3187337094 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v17) v1 ==== 170+0+59 (secure 0 0 0) 0x7f6b38059af0 con 0x7f6b50103140 2026-03-09T00:01:28.764 INFO:teuthology.orchestra.run.vm06.stdout:[client.1] 2026-03-09T00:01:28.764 INFO:teuthology.orchestra.run.vm06.stdout: key = AQDYDa5pRaVsLRAAUZQQumhgMG7QFv85/GbT7w== 2026-03-09T00:01:28.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.767+0000 7f6b55cb9700 1 -- 192.168.123.106:0/3187337094 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6b3c06c5b0 msgr2=0x7f6b3c06ea70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:28.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.767+0000 7f6b55cb9700 1 --2- 192.168.123.106:0/3187337094 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6b3c06c5b0 0x7f6b3c06ea70 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f6b4000b5c0 tx=0x7f6b40005fb0 comp rx=0 tx=0).stop 2026-03-09T00:01:28.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.767+0000 7f6b55cb9700 1 -- 192.168.123.106:0/3187337094 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b50103140 msgr2=0x7f6b50198a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:28.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.767+0000 7f6b55cb9700 1 --2- 192.168.123.106:0/3187337094 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b50103140 0x7f6b50198a40 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f6b38009fd0 tx=0x7f6b3800eea0 comp rx=0 tx=0).stop 2026-03-09T00:01:28.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.767+0000 7f6b55cb9700 1 -- 192.168.123.106:0/3187337094 shutdown_connections 2026-03-09T00:01:28.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.767+0000 7f6b55cb9700 1 --2- 192.168.123.106:0/3187337094 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f6b3c06c5b0 0x7f6b3c06ea70 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:28.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.767+0000 7f6b55cb9700 1 --2- 192.168.123.106:0/3187337094 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6b50103140 0x7f6b50198a40 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:28.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.767+0000 7f6b55cb9700 1 --2- 192.168.123.106:0/3187337094 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6b50104340 0x7f6b50198f80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:28.767 
INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.767+0000 7f6b55cb9700 1 -- 192.168.123.106:0/3187337094 >> 192.168.123.106:0/3187337094 conn(0x7f6b500fe6c0 msgr2=0x7f6b50107570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:28.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.767+0000 7f6b55cb9700 1 -- 192.168.123.106:0/3187337094 shutdown_connections 2026-03-09T00:01:28.767 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:28.767+0000 7f6b55cb9700 1 -- 192.168.123.106:0/3187337094 wait complete. 2026-03-09T00:01:28.809 DEBUG:teuthology.orchestra.run.vm06:> set -ex 2026-03-09T00:01:28.809 DEBUG:teuthology.orchestra.run.vm06:> sudo dd of=/etc/ceph/ceph.client.1.keyring 2026-03-09T00:01:28.809 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod 0644 /etc/ceph/ceph.client.1.keyring 2026-03-09T00:01:28.844 INFO:tasks.ceph:Waiting until ceph daemons up and pgs clean... 2026-03-09T00:01:28.844 INFO:tasks.cephadm.ceph_manager.ceph:waiting for mgr available 2026-03-09T00:01:28.844 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mgr dump --format=json 2026-03-09T00:01:28.984 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:29.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.233+0000 7ff8424e1700 1 -- 192.168.123.103:0/1712907652 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c074dd0 msgr2=0x7ff83c072fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:29.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.233+0000 7ff8424e1700 1 --2- 192.168.123.103:0/1712907652 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c074dd0 0x7ff83c072fc0 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7ff824009a90 tx=0x7ff824009da0 comp rx=0 tx=0).stop 2026-03-09T00:01:29.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.233+0000 7ff8424e1700 1 -- 192.168.123.103:0/1712907652 shutdown_connections 2026-03-09T00:01:29.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.233+0000 7ff8424e1700 1 --2- 192.168.123.103:0/1712907652 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff83c073500 0x7ff83c073960 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:29.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.233+0000 7ff8424e1700 1 --2- 192.168.123.103:0/1712907652 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c074dd0 0x7ff83c072fc0 unknown :-1 s=CLOSED pgs=203 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:29.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.233+0000 7ff8424e1700 1 -- 192.168.123.103:0/1712907652 >> 192.168.123.103:0/1712907652 conn(0x7ff83c078ed0 msgr2=0x7ff83c0792e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:29.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.233+0000 7ff8424e1700 1 -- 192.168.123.103:0/1712907652 shutdown_connections 2026-03-09T00:01:29.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.234+0000 7ff8424e1700 1 -- 192.168.123.103:0/1712907652 wait complete. 
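[editor's note] At "waiting for mgr available" above, the task starts polling `ceph mgr dump --format=json` through `cephadm shell` until the mgrmap reports an available active mgr. A hedged sketch of such a wait loop, under the same hypothetical SSH-helper assumption as before (the function name and timeout are illustrative, not teuthology's actual code):

```python
import json
import subprocess
import time

def wait_for_mgr_available(host: str, fsid: str, timeout: float = 300.0) -> dict:
    """Poll `ceph mgr dump` until the active mgr reports available=true.

    Returns the parsed mgrmap; raises TimeoutError otherwise.
    """
    cmd = [
        "ssh", host,
        f"sudo cephadm shell --fsid {fsid} -- ceph mgr dump --format=json",
    ]
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        out = subprocess.run(
            cmd, check=True, capture_output=True, text=True
        ).stdout
        dump = json.loads(out)
        # The mgrmap JSON (shown further below) carries a top-level
        # "available" flag plus "active_name" and a "standbys" list.
        if dump.get("available"):
            return dump
        time.sleep(5)
    raise TimeoutError(f"no available mgr within {timeout}s")
```

Polling a full `mgr dump` rather than `ceph status` keeps the check unambiguous: availability is read straight from the mgrmap epoch rather than from a formatted summary.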
2026-03-09T00:01:29.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.234+0000 7ff8424e1700 1 Processor -- start 2026-03-09T00:01:29.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.234+0000 7ff8424e1700 1 -- start start 2026-03-09T00:01:29.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.234+0000 7ff8424e1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c073500 0x7ff83c19d230 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:29.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.234+0000 7ff83bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c073500 0x7ff83c19d230 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:29.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.234+0000 7ff83bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c073500 0x7ff83c19d230 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:40976/0 (socket says 192.168.123.103:40976) 2026-03-09T00:01:29.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.234+0000 7ff8424e1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff83c074dd0 0x7ff83c19d770 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:29.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.234+0000 7ff8424e1700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff83c19de50 con 0x7ff83c073500 2026-03-09T00:01:29.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.234+0000 7ff8424e1700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff83c1a1be0 con 0x7ff83c074dd0 2026-03-09T00:01:29.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.234+0000 7ff83bfff700 1 -- 192.168.123.103:0/3701713411 learned_addr learned my addr 192.168.123.103:0/3701713411 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:29.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.235+0000 7ff83b7fe700 1 --2- 192.168.123.103:0/3701713411 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff83c074dd0 0x7ff83c19d770 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:29.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.235+0000 7ff83bfff700 1 -- 192.168.123.103:0/3701713411 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff83c074dd0 msgr2=0x7ff83c19d770 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:29.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.235+0000 7ff83bfff700 1 --2- 192.168.123.103:0/3701713411 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff83c074dd0 0x7ff83c19d770 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:29.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.235+0000 7ff83bfff700 1 -- 192.168.123.103:0/3701713411 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7ff824009710 con 0x7ff83c073500 2026-03-09T00:01:29.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.235+0000 7ff83b7fe700 1 --2- 192.168.123.103:0/3701713411 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff83c074dd0 0x7ff83c19d770 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:01:29.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.235+0000 7ff83bfff700 1 --2- 192.168.123.103:0/3701713411 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c073500 0x7ff83c19d230 secure :-1 s=READY pgs=204 cs=0 l=1 rev1=1 crypto rx=0x7ff824000c00 tx=0x7ff82400f740 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:29.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.235+0000 7ff8397fa700 1 -- 192.168.123.103:0/3701713411 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff82401d070 con 0x7ff83c073500 2026-03-09T00:01:29.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.235+0000 7ff8397fa700 1 -- 192.168.123.103:0/3701713411 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff82400fd20 con 0x7ff83c073500 2026-03-09T00:01:29.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.235+0000 7ff8397fa700 1 -- 192.168.123.103:0/3701713411 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff824017900 con 0x7ff83c073500 2026-03-09T00:01:29.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.235+0000 7ff8424e1700 1 -- 192.168.123.103:0/3701713411 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff83c1a1e60 con 0x7ff83c073500 2026-03-09T00:01:29.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.235+0000 7ff8424e1700 1 -- 192.168.123.103:0/3701713411 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff83c1a2350 con 0x7ff83c073500 2026-03-09T00:01:29.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.237+0000 7ff8424e1700 1 -- 192.168.123.103:0/3701713411 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff83c04ea90 con 0x7ff83c073500 2026-03-09T00:01:29.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.237+0000 7ff8397fa700 1 -- 192.168.123.103:0/3701713411 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7ff824017a60 con 0x7ff83c073500 2026-03-09T00:01:29.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.237+0000 7ff8397fa700 1 --2- 192.168.123.103:0/3701713411 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff82806c4e0 0x7ff82806e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:29.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.237+0000 7ff8397fa700 1 -- 192.168.123.103:0/3701713411 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7ff82408c8f0 con 0x7ff83c073500 2026-03-09T00:01:29.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.240+0000 7ff83b7fe700 1 --2- 192.168.123.103:0/3701713411 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff82806c4e0 0x7ff82806e9a0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:29.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.240+0000 7ff83b7fe700 1 --2- 192.168.123.103:0/3701713411 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff82806c4e0 0x7ff82806e9a0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7ff83c19e850 tx=0x7ff82c009500 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:29.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.240+0000 7ff8397fa700 1 -- 192.168.123.103:0/3701713411 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff8240575e0 con 0x7ff83c073500 2026-03-09T00:01:29.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.375+0000 7ff8424e1700 1 -- 192.168.123.103:0/3701713411 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "mgr dump", "format": "json"} v 0) v1 -- 0x7ff83c19e630 con 0x7ff83c073500 2026-03-09T00:01:29.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.377+0000 7ff8397fa700 1 -- 192.168.123.103:0/3701713411 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "mgr dump", "format": "json"}]=0 v19) v1 ==== 74+0+173029 (secure 0 0 0) 0x7ff82405ac00 con 0x7ff83c073500 2026-03-09T00:01:29.377 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:01:29.382 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:29 vm03 ceph-mon[52346]: pgmap v65: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:01:29.382 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:29 vm03 ceph-mon[52346]: from='client.? 192.168.123.106:0/3187337094' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T00:01:29.382 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:29 vm03 ceph-mon[52346]: from='client.? ' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T00:01:29.382 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:29 vm03 ceph-mon[52346]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-09T00:01:29.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.382+0000 7ff8424e1700 1 -- 192.168.123.103:0/3701713411 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff82806c4e0 msgr2=0x7ff82806e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:29.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.382+0000 7ff8424e1700 1 --2- 192.168.123.103:0/3701713411 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff82806c4e0 0x7ff82806e9a0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7ff83c19e850 tx=0x7ff82c009500 comp rx=0 tx=0).stop 2026-03-09T00:01:29.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.382+0000 7ff8424e1700 1 -- 192.168.123.103:0/3701713411 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c073500 msgr2=0x7ff83c19d230 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:29.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.382+0000 7ff8424e1700 1 --2- 192.168.123.103:0/3701713411 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c073500 0x7ff83c19d230 secure :-1 s=READY pgs=204 cs=0 l=1 rev1=1 crypto rx=0x7ff824000c00 tx=0x7ff82400f740 comp rx=0 tx=0).stop 2026-03-09T00:01:29.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.385+0000 7ff8424e1700 1 -- 192.168.123.103:0/3701713411 shutdown_connections 2026-03-09T00:01:29.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.385+0000 7ff8424e1700 1 --2- 192.168.123.103:0/3701713411 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff82806c4e0 0x7ff82806e9a0 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:29.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.385+0000 7ff8424e1700 1 --2- 192.168.123.103:0/3701713411 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c073500 0x7ff83c19d230 unknown :-1 s=CLOSED pgs=204 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:29.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.385+0000 7ff8424e1700 1 --2- 192.168.123.103:0/3701713411 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff83c074dd0 0x7ff83c19d770 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:29.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.385+0000 7ff8424e1700 1 -- 192.168.123.103:0/3701713411 >> 192.168.123.103:0/3701713411 conn(0x7ff83c078ed0 msgr2=0x7ff83c10f960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:29.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.385+0000 7ff8424e1700 1 -- 192.168.123.103:0/3701713411 shutdown_connections 2026-03-09T00:01:29.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.385+0000 7ff8424e1700 1 -- 192.168.123.103:0/3701713411 wait complete. 
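[editor's note] Nearly every stderr block in this section is the same msgr2 client-connection lifecycle: connect, banner exchange (`_handle_peer_banner_payload supported=3 required=0`), hello, auth, `READY` (with negotiated crypto rx/tx handlers), and finally `mark_down`/`stop`/`shutdown_connections` when the short-lived CLI process exits. The following is only a toy model of the state progression named in these log lines, not Ceph's actual ProtocolV2 implementation; note that teardown (`CLOSED`) is reachable from every state, which is why the log shows `stop` firing even from `START_CONNECT` and `AUTH_CONNECTING`:

```python
from enum import Enum, auto

class ConnState(Enum):
    NONE = auto()
    BANNER_CONNECTING = auto()
    HELLO_CONNECTING = auto()
    AUTH_CONNECTING = auto()
    READY = auto()
    CLOSED = auto()

# Forward transitions as observed in the log; CLOSED is always legal
# because mark_down/stop can interrupt a connection at any stage.
TRANSITIONS = {
    ConnState.NONE: {ConnState.BANNER_CONNECTING},
    ConnState.BANNER_CONNECTING: {ConnState.HELLO_CONNECTING},
    ConnState.HELLO_CONNECTING: {ConnState.AUTH_CONNECTING},
    ConnState.AUTH_CONNECTING: {ConnState.READY},
    ConnState.READY: set(),
    ConnState.CLOSED: set(),
}

def advance(cur: ConnState, nxt: ConnState) -> ConnState:
    """Validate one observed state change; CLOSED is accepted from anywhere."""
    if nxt is not ConnState.CLOSED and nxt not in TRANSITIONS[cur]:
        raise ValueError(f"illegal transition {cur.name} -> {nxt.name}")
    return nxt
```

The mgrmap JSON that the `mgr dump` command returned follows.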
2026-03-09T00:01:29.429 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":19,"active_gid":14223,"active_name":"vm03.yvcons","active_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6800","nonce":2},{"type":"v1","addr":"192.168.123.103:6801","nonce":2}]},"active_addr":"192.168.123.103:6801/2","active_change":"2026-03-08T23:59:59.972217+0000","active_mgr_features":4540138322906710015,"available":true,"standbys":[{"gid":14248,"name":"vm06.rzcvhn","mgr_features":4540138322906710015,"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across 
cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools 
which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt optimization","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. 
Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph","min":"","max":"","enum_allowed":[],"desc":"Container image name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_nvmeof":{"name":"container_image_nvmeof","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nvmeof:0.0.2","min":"","max":"","enum_allowed":[],"desc":"Nvme-of container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"secs","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"docker.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. 
This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host check","long_desc":"","tags":[],"see_also":[]},"inventory_list_all":{"name":"inventory_list_all","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Whether ceph-volume inventory should report more devices (mostly mappers (LVs / mpaths), partitions...)","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. 
Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage /etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. 
This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to 
retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA
_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"na
me":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"UNSAFE_TLS_v1_2":{"name":"UNSAFE_TLS_v1_2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug 
options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health 
metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this 
long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","flags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local pool","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool 
for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator backend","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log
_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to sleep","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"exclude_perf_counters":{"name":"exclude_perf_counters","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Do not include perf-counters in the metrics output","long_desc":"Gathering perf-counters from a single Prometheus exporter can degrade ceph-mgr performance, especially in large clusters. Instead, Ceph-exporter daemons are now used by default for perf-counter gathering. 
This should only be disabled when no ceph-exporters are deployed.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP 
requests","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"serve
r_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"drive_group_interval":{"name":"drive_group_interval","type":"float","level":"advanced","flags":0,"default_value":"300.0","min":"","max":"","enum_allowed":[],"desc":"interval in seconds between re-application of applied drive_groups","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered 
PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled 
snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min"
:"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"st
r","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay seconds","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","fl
ags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"default_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}]}],"modules":["cephadm","dashboard","iostat","nfs","prometheus","restful"],"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across 
cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools 
which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt optimization","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. 
Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph","min":"","max":"","enum_allowed":[],"desc":"Container image name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_nvmeof":{"name":"container_image_nvmeof","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nvmeof:0.0.2","min":"","max":"","enum_allowed":[],"desc":"Nvme-of container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"secs","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"docker.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. 
This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host check","long_desc":"","tags":[],"see_also":[]},"inventory_list_all":{"name":"inventory_list_all","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Whether ceph-volume inventory should report more devices (mostly mappers (LVs / mpaths), partitions...)","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. 
Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage /etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. 
This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to 
retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA
_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"na
me":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"UNSAFE_TLS_v1_2":{"name":"UNSAFE_TLS_v1_2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug 
options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health 
metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this 
long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","flags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local pool","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool 
for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator backend","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log
_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to sleep","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"exclude_perf_counters":{"name":"exclude_perf_counters","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Do not include perf-counters in the metrics output","long_desc":"Gathering perf-counters from a single Prometheus exporter can degrade ceph-mgr performance, especially in large clusters. Instead, Ceph-exporter daemons are now used by default for perf-counter gathering. 
This should only be disabled when no ceph-exporters are deployed.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP 
requests","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"serve
r_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"drive_group_interval":{"name":"drive_group_interval","type":"float","level":"advanced","flags":0,"default_value":"300.0","min":"","max":"","enum_allowed":[],"desc":"interval in seconds between re-application of applied drive_groups","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered 
PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled 
snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min"
:"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"st
r","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay seconds","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","fl
ags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"default_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}],"services":{"dashboard":"https://192.168.123.103:8443/","prometheus":"http://192.168.123.103:9283/"},"always_on_modules":{"octopus":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"pacific":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"quincy":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"reef":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"]},"last_failure_osd_epoch":5,"active_clients":[{"name":"devicehealth","addrvec":[{"type":"v2","addr":"192.168.123.103:0","nonce":2345556975}]},{"name":"libcephsqlite","addrvec":[{"type":"v2","addr":"192.168.123.103:0","nonce":2931592692}]},{"name":"rbd_support","addrvec":[{"type":"v2","addr":"192.168.123.103:0","nonce":3723784945}]},{"name":"volumes","addrvec":[{"type":"v2","addr":"192.168.123.103:0","nonce":2800828829}]}]} 2026-03-09T00:01:29.430 INFO:tasks.cephadm.ceph_manager.ceph:mgr available! 2026-03-09T00:01:29.431 INFO:tasks.cephadm.ceph_manager.ceph:waiting for all up 2026-03-09T00:01:29.431 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd dump --format=json 2026-03-09T00:01:29.574 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:29.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:29 vm06 ceph-mon[58395]: pgmap v65: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:01:29.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:29 vm06 ceph-mon[58395]: from='client.? 192.168.123.106:0/3187337094' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T00:01:29.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:29 vm06 ceph-mon[58395]: from='client.? ' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-09T00:01:29.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:29 vm06 ceph-mon[58395]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-09T00:01:29.792 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.793+0000 7f1c62e0c700 1 -- 192.168.123.103:0/3742004288 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c5c0ff4c0 msgr2=0x7f1c5c0ff8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:29.793 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.793+0000 7f1c62e0c700 1 --2- 192.168.123.103:0/3742004288 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c5c0ff4c0 0x7f1c5c0ff8a0 secure :-1 s=READY pgs=205 cs=0 l=1 rev1=1 crypto rx=0x7f1c4c009b00 tx=0x7f1c4c009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:29.793 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.793+0000 7f1c62e0c700 1 -- 192.168.123.103:0/3742004288 shutdown_connections 2026-03-09T00:01:29.793 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.793+0000 7f1c62e0c700 1 --2- 192.168.123.103:0/3742004288 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c5c0ffde0 0x7f1c5c10ae70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:29.793 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.793+0000 7f1c62e0c700 1 --2- 192.168.123.103:0/3742004288 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c5c0ff4c0 0x7f1c5c0ff8a0 unknown :-1 s=CLOSED pgs=205 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:29.793 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.793+0000 7f1c62e0c700 1 -- 192.168.123.103:0/3742004288 >> 192.168.123.103:0/3742004288 conn(0x7f1c5c074bd0 msgr2=0x7f1c5c074fe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:29.793 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.794+0000 7f1c62e0c700 1 -- 192.168.123.103:0/3742004288 shutdown_connections 2026-03-09T00:01:29.793 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.794+0000 7f1c62e0c700 1 -- 192.168.123.103:0/3742004288 wait complete. 
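The "waiting for all up" step above polls "ceph osd dump --format=json" through the cephadm shell until every OSD in the osdmap reports both up and in; the dump a few records below shows all six OSDs with "up":1,"in":1. A minimal sketch of such a poll loop, assuming a hypothetical shell_json() helper in place of teuthology's actual ceph_manager plumbing (fsid and image are the ones used by this run):

    # Hedged sketch of the "waiting for all up" poll seen in this log:
    # re-read "ceph osd dump --format=json" until every OSD is up and in.
    # shell_json() is a hypothetical stand-in for teuthology's cephadm-shell
    # plumbing; the real ceph_manager implementation differs.
    import json
    import subprocess
    import time

    FSID = "ae8f0172-1b4a-11f1-916a-712b2ac006b7"
    IMAGE = "quay.io/ceph/ceph:v18.2.1"

    def shell_json(*args):
        """Run a ceph command inside 'cephadm shell' and parse its JSON stdout."""
        cmd = ["sudo", "/home/ubuntu/cephtest/cephadm", "--image", IMAGE,
               "shell", "--fsid", FSID, "--", *args]
        return json.loads(subprocess.check_output(cmd))

    def wait_for_all_up(timeout=300, interval=3):
        """Return the osdmap once every OSD reports up=1 and in=1."""
        deadline = time.time() + timeout
        while time.time() < deadline:
            dump = shell_json("ceph", "osd", "dump", "--format=json")
            if all(o["up"] == 1 and o["in"] == 1 for o in dump["osds"]):
                return dump
            time.sleep(interval)
        raise TimeoutError(f"not all OSDs up/in within {timeout}s")

The back-to-back "osd dump" invocations bracketing "all up!" below are consistent with a loop of this shape taking its final sample and the harness immediately re-reading the map for the next check.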
2026-03-09T00:01:29.793 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.794+0000 7f1c62e0c700 1 Processor -- start 2026-03-09T00:01:29.793 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.794+0000 7f1c62e0c700 1 -- start start 2026-03-09T00:01:29.793 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.794+0000 7f1c62e0c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c5c0ff4c0 0x7f1c5c198d40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.794+0000 7f1c62e0c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c5c0ffde0 0x7f1c5c199280 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.794+0000 7f1c62e0c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1c5c199960 con 0x7f1c5c0ffde0 2026-03-09T00:01:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.794+0000 7f1c62e0c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1c5c19d6f0 con 0x7f1c5c0ff4c0 2026-03-09T00:01:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.794+0000 7f1c5bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c5c0ffde0 0x7f1c5c199280 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.794+0000 7f1c60ba8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c5c0ff4c0 0x7f1c5c198d40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.795+0000 7f1c60ba8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c5c0ff4c0 0x7f1c5c198d40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:58706/0 (socket says 192.168.123.103:58706) 2026-03-09T00:01:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.795+0000 7f1c60ba8700 1 -- 192.168.123.103:0/3579886649 learned_addr learned my addr 192.168.123.103:0/3579886649 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.795+0000 7f1c5bfff700 1 -- 192.168.123.103:0/3579886649 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c5c0ff4c0 msgr2=0x7f1c5c198d40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.795+0000 7f1c5bfff700 1 --2- 192.168.123.103:0/3579886649 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c5c0ff4c0 0x7f1c5c198d40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.795+0000 7f1c5bfff700 1 -- 192.168.123.103:0/3579886649 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1c4c0097e0 con 0x7f1c5c0ffde0 
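Each stderr burst like the one above is the short-lived ceph CLI inside the cephadm shell bringing up its mon client: it banners both mons (s=BANNER_CONNECTING), learns its own address from the hello exchange (s=HELLO_CONNECTING, learned_addr), drops the losing mon connection during auth, keeps one secure channel (s=READY), and subscribes to the monmap, config, mgrmap, and osdmap before sending the actual mon command. When triaging these runs it can help to collapse the messenger noise into a histogram of connection states; a small illustrative filter follows, assuming nothing beyond the "s=<STATE>" token inside the conn(...) records shown here:

    # Hedged sketch: summarize msgr2 connection states from a teuthology log.
    # Matches the "s=<STATE>" token inside conn(...) records, e.g.
    # "conn(0x7f1c5c0ff4c0 ... s=BANNER_CONNECTING pgs=0 cs=0 l=1 ...)".
    # Purely illustrative; not part of the teuthology tooling.
    import re
    import sys
    from collections import Counter

    STATE_RE = re.compile(r"conn\(0x[0-9a-f]+ .*?s=([A-Z_]+)")

    def state_histogram(lines):
        """Count msgr2 connection states (READY, CLOSED, BANNER_CONNECTING, ...)."""
        counts = Counter()
        for line in lines:
            m = STATE_RE.search(line)
            if m:
                counts[m.group(1)] += 1
        return counts

    if __name__ == "__main__":
        # e.g. feed this script the stderr lines of a run on stdin
        for state, n in state_histogram(sys.stdin).most_common():
            print(f"{n:6d}  {state}")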
2026-03-09T00:01:29.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.795+0000 7f1c5bfff700 1 --2- 192.168.123.103:0/3579886649 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c5c0ffde0 0x7f1c5c199280 secure :-1 s=READY pgs=206 cs=0 l=1 rev1=1 crypto rx=0x7f1c5c0678b0 tx=0x7f1c5000d9a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:29.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.795+0000 7f1c59ffb700 1 -- 192.168.123.103:0/3579886649 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1c500041d0 con 0x7f1c5c0ffde0 2026-03-09T00:01:29.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.795+0000 7f1c62e0c700 1 -- 192.168.123.103:0/3579886649 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1c5c19d9d0 con 0x7f1c5c0ffde0 2026-03-09T00:01:29.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.795+0000 7f1c62e0c700 1 -- 192.168.123.103:0/3579886649 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1c5c19df20 con 0x7f1c5c0ffde0 2026-03-09T00:01:29.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.796+0000 7f1c59ffb700 1 -- 192.168.123.103:0/3579886649 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1c50004330 con 0x7f1c5c0ffde0 2026-03-09T00:01:29.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.796+0000 7f1c59ffb700 1 -- 192.168.123.103:0/3579886649 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1c50003d40 con 0x7f1c5c0ffde0 2026-03-09T00:01:29.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.797+0000 7f1c59ffb700 1 -- 192.168.123.103:0/3579886649 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f1c5000f690 con 0x7f1c5c0ffde0 2026-03-09T00:01:29.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.797+0000 7f1c59ffb700 1 --2- 192.168.123.103:0/3579886649 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1c4406c5b0 0x7f1c4406ea70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:29.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.797+0000 7f1c60ba8700 1 --2- 192.168.123.103:0/3579886649 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1c4406c5b0 0x7f1c4406ea70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:29.797 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.798+0000 7f1c60ba8700 1 --2- 192.168.123.103:0/3579886649 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1c4406c5b0 0x7f1c4406ea70 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f1c4c0094d0 tx=0x7f1c4c00b560 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:29.797 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.798+0000 7f1c59ffb700 1 -- 192.168.123.103:0/3579886649 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f1c5008ab50 con 0x7f1c5c0ffde0 2026-03-09T00:01:29.797 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.798+0000 7f1c62e0c700 1 -- 192.168.123.103:0/3579886649 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1c48005320 con 0x7f1c5c0ffde0 2026-03-09T00:01:29.800 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.800+0000 7f1c59ffb700 1 -- 192.168.123.103:0/3579886649 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f1c50055790 con 0x7f1c5c0ffde0 2026-03-09T00:01:29.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.903+0000 7f1c62e0c700 1 -- 192.168.123.103:0/3579886649 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f1c48005190 con 0x7f1c5c0ffde0 2026-03-09T00:01:29.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.905+0000 7f1c59ffb700 1 -- 192.168.123.103:0/3579886649 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v33) v1 ==== 74+0+11272 (secure 0 0 0) 0x7f1c5001f020 con 0x7f1c5c0ffde0 2026-03-09T00:01:29.905 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:01:29.905 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":33,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","created":"2026-03-08T23:58:56.272672+0000","modified":"2026-03-09T00:01:27.307218+0000","last_up_change":"2026-03-09T00:01:26.296864+0000","last_in_change":"2026-03-09T00:01:14.917922+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-09T00:00:56.171195+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"20","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_
score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"1eefdd28-e5a7-4e98-a454-60c0bb654070","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6802","nonce":3077634834},{"type":"v1","addr":"192.168.123.103:6803","nonce":3077634834}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":3077634834},{"type":"v1","addr":"192.168.123.103:6805","nonce":3077634834}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6808","nonce":3077634834},{"type":"v1","addr":"192.168.123.103:6809","nonce":3077634834}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":3077634834},{"type":"v1","addr":"192.168.123.103:6807","nonce":3077634834}]},"public_addr":"192.168.123.103:6803/3077634834","cluster_addr":"192.168.123.103:6805/3077634834","heartbeat_back_addr":"192.168.123.103:6809/3077634834","heartbeat_front_addr":"192.168.123.103:6807/3077634834","state":["exists","up"]},{"osd":1,"uuid":"6a8d6e5f-c441-499a-a4bb-8d9bc046a85f","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":24,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6810","nonce":3932541768},{"type":"v1","addr":"192.168.123.103:6811","nonce":3932541768}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6812","nonce":3932541768},{"type":"v1","addr":"192.168.123.103:6813","nonce":3932541768}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6816","nonce":3932541768},{"type":"v1","addr":"192.168.123.103:6817","nonce":3932541768}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6814","nonce":3932541768},{"type":"v1","addr":"192.168.123.103:6815","nonce":3932541768}]},"public_addr":"192.168.123.103:6811/3932541768","cluster_addr":"192.168.123.103:6813/3932541768","heartbeat_back_addr":"192.168.123.103:6817/3932541768","heartbeat_front_addr":"192.168.123.103:6815/3932541768","state":["exists","up"]},{"osd":2,"uuid":"57704364-d509-479a-8dff-0b9f590cc6d0","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6818","nonce":901207923},{"type":"v1","addr":"192.168.123.103:6819","nonce":901207923}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6820","nonce":901207923},{"type":"v1","addr":"192.168.123.103:6821","nonce":901207923}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6824","nonce":901207923},{"type":"v1","addr":"192.168.123.103:6825","nonce":901207923}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6822","nonce":901207923},{"type":"v1","addr":"192.168.123.103:6823","nonce":901207923}]},"public_addr":"192.168.123.103:6819/901207923","cluster_addr":"192.168.123.103:6821/901207923","heartbeat_back_addr":"192.168.123.103:6825/901207923","heartbeat_front_addr":"192.168.123.103:6823/901207923","state":["exists","up"]},{"osd":3,"uuid":"8b49bccb-fd91-44f4-831e-a401044f0e64","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":23,"up_thru":27,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":
[{"type":"v2","addr":"192.168.123.106:6800","nonce":1431049389},{"type":"v1","addr":"192.168.123.106:6801","nonce":1431049389}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6802","nonce":1431049389},{"type":"v1","addr":"192.168.123.106:6803","nonce":1431049389}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6806","nonce":1431049389},{"type":"v1","addr":"192.168.123.106:6807","nonce":1431049389}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6804","nonce":1431049389},{"type":"v1","addr":"192.168.123.106:6805","nonce":1431049389}]},"public_addr":"192.168.123.106:6801/1431049389","cluster_addr":"192.168.123.106:6803/1431049389","heartbeat_back_addr":"192.168.123.106:6807/1431049389","heartbeat_front_addr":"192.168.123.106:6805/1431049389","state":["exists","up"]},{"osd":4,"uuid":"8c3a4d00-bb0a-4f59-b53b-83364e99627b","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":28,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6808","nonce":1004293893},{"type":"v1","addr":"192.168.123.106:6809","nonce":1004293893}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6810","nonce":1004293893},{"type":"v1","addr":"192.168.123.106:6811","nonce":1004293893}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6814","nonce":1004293893},{"type":"v1","addr":"192.168.123.106:6815","nonce":1004293893}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6812","nonce":1004293893},{"type":"v1","addr":"192.168.123.106:6813","nonce":1004293893}]},"public_addr":"192.168.123.106:6809/1004293893","cluster_addr":"192.168.123.106:6811/1004293893","heartbeat_back_addr":"192.168.123.106:6815/1004293893","heartbeat_front_addr":"192.168.123.106:6813/1004293893","state":["exists","up"]},{"osd":5,"uuid":"15462a2c-77f6-4f87-a9bf-e5fe4de71f8f","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":32,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6816","nonce":1139626945},{"type":"v1","addr":"192.168.123.106:6817","nonce":1139626945}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6818","nonce":1139626945},{"type":"v1","addr":"192.168.123.106:6819","nonce":1139626945}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6822","nonce":1139626945},{"type":"v1","addr":"192.168.123.106:6823","nonce":1139626945}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6820","nonce":1139626945},{"type":"v1","addr":"192.168.123.106:6821","nonce":1139626945}]},"public_addr":"192.168.123.106:6817/1139626945","cluster_addr":"192.168.123.106:6819/1139626945","heartbeat_back_addr":"192.168.123.106:6823/1139626945","heartbeat_front_addr":"192.168.123.106:6821/1139626945","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T00:00:33.479111+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T00:00:43.710028+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T00:00:54.267401+00
00","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T00:01:05.164542+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T00:01:15.083285+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T00:01:24.275952+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.103:0/2723392878":"2026-03-09T23:59:59.972086+0000","192.168.123.103:0/78662117":"2026-03-09T23:59:25.439954+0000","192.168.123.103:0/2768716536":"2026-03-09T23:59:59.972086+0000","192.168.123.103:0/2535486397":"2026-03-09T23:59:59.972086+0000","192.168.123.103:0/2762219228":"2026-03-09T23:59:25.439954+0000","192.168.123.103:6801/2":"2026-03-09T23:59:11.665534+0000","192.168.123.103:0/1691249097":"2026-03-09T23:59:11.665534+0000","192.168.123.103:0/1899662013":"2026-03-09T23:59:11.665534+0000","192.168.123.103:0/1443304653":"2026-03-09T23:59:11.665534+0000","192.168.123.103:6800/2":"2026-03-09T23:59:11.665534+0000","192.168.123.103:0/2143338907":"2026-03-09T23:59:25.439954+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-09T00:01:29.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.908+0000 7f1c62e0c700 1 -- 192.168.123.103:0/3579886649 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1c4406c5b0 msgr2=0x7f1c4406ea70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:29.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.908+0000 7f1c62e0c700 1 --2- 192.168.123.103:0/3579886649 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1c4406c5b0 0x7f1c4406ea70 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f1c4c0094d0 tx=0x7f1c4c00b560 comp rx=0 tx=0).stop 2026-03-09T00:01:29.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.908+0000 7f1c62e0c700 1 -- 192.168.123.103:0/3579886649 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c5c0ffde0 msgr2=0x7f1c5c199280 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:29.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.908+0000 7f1c62e0c700 1 --2- 192.168.123.103:0/3579886649 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c5c0ffde0 0x7f1c5c199280 secure :-1 s=READY pgs=206 cs=0 l=1 rev1=1 crypto rx=0x7f1c5c0678b0 tx=0x7f1c5000d9a0 comp rx=0 tx=0).stop 2026-03-09T00:01:29.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.908+0000 7f1c62e0c700 1 -- 192.168.123.103:0/3579886649 shutdown_connections 2026-03-09T00:01:29.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.908+0000 7f1c62e0c700 1 --2- 192.168.123.103:0/3579886649 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1c4406c5b0 0x7f1c4406ea70 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:29.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.908+0000 7f1c62e0c700 1 --2- 192.168.123.103:0/3579886649 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c5c0ff4c0 0x7f1c5c198d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:29.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.908+0000 7f1c62e0c700 1 --2- 192.168.123.103:0/3579886649 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c5c0ffde0 0x7f1c5c199280 unknown :-1 s=CLOSED pgs=206 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:29.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.908+0000 7f1c62e0c700 1 -- 192.168.123.103:0/3579886649 >> 192.168.123.103:0/3579886649 conn(0x7f1c5c074bd0 msgr2=0x7f1c5c107ab0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:29.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.908+0000 7f1c62e0c700 1 -- 192.168.123.103:0/3579886649 shutdown_connections 2026-03-09T00:01:29.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:29.908+0000 7f1c62e0c700 1 -- 192.168.123.103:0/3579886649 wait complete. 2026-03-09T00:01:29.950 INFO:tasks.cephadm.ceph_manager.ceph:all up! 2026-03-09T00:01:29.950 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd dump --format=json 2026-03-09T00:01:30.100 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:30.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.351+0000 7fcc0f1d3700 1 -- 192.168.123.103:0/975755366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc081032e0 msgr2=0x7fcc081036c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:30.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.351+0000 7fcc0f1d3700 1 --2- 192.168.123.103:0/975755366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc081032e0 0x7fcc081036c0 secure :-1 s=READY pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7fcbf8009b00 tx=0x7fcbf8009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:30.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.353+0000 7fcc0f1d3700 1 -- 192.168.123.103:0/975755366 shutdown_connections 2026-03-09T00:01:30.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.353+0000 7fcc0f1d3700 1 --2- 192.168.123.103:0/975755366 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcc08103c90 0x7fcc08107ce0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:30.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.353+0000 7fcc0f1d3700 1 --2- 192.168.123.103:0/975755366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc081032e0 0x7fcc081036c0 unknown :-1 s=CLOSED pgs=207 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:30.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.353+0000 7fcc0f1d3700 1 -- 192.168.123.103:0/975755366 >> 192.168.123.103:0/975755366 conn(0x7fcc080feb50 msgr2=0x7fcc08100f70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:30.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.353+0000 7fcc0f1d3700 1 -- 192.168.123.103:0/975755366 shutdown_connections 2026-03-09T00:01:30.352 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.353+0000 7fcc0f1d3700 1 -- 192.168.123.103:0/975755366 wait complete. 2026-03-09T00:01:30.353 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.353+0000 7fcc0f1d3700 1 Processor -- start 2026-03-09T00:01:30.353 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.354+0000 7fcc0f1d3700 1 -- start start 2026-03-09T00:01:30.353 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.354+0000 7fcc0f1d3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcc081032e0 0x7fcc08198e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:30.353 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.354+0000 7fcc0f1d3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc08103c90 0x7fcc08199380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:30.353 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.354+0000 7fcc0f1d3700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcc08199a60 con 0x7fcc08103c90 2026-03-09T00:01:30.353 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.354+0000 7fcc0f1d3700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcc0819d7f0 con 0x7fcc081032e0 2026-03-09T00:01:30.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.354+0000 7fcc0cf6f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcc081032e0 0x7fcc08198e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:30.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.354+0000 7fcc0cf6f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcc081032e0 0x7fcc08198e40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:58724/0 (socket says 192.168.123.103:58724) 2026-03-09T00:01:30.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.354+0000 7fcc0cf6f700 1 -- 192.168.123.103:0/1153878726 learned_addr learned my addr 192.168.123.103:0/1153878726 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:30.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.354+0000 7fcc0cf6f700 1 -- 192.168.123.103:0/1153878726 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc08103c90 msgr2=0x7fcc08199380 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:30.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.354+0000 7fcc07fff700 1 --2- 192.168.123.103:0/1153878726 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc08103c90 0x7fcc08199380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:30.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.355+0000 7fcc0cf6f700 1 --2- 192.168.123.103:0/1153878726 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc08103c90 0x7fcc08199380 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:30.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.355+0000 7fcc0cf6f700 1 -- 
192.168.123.103:0/1153878726 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcbf80097e0 con 0x7fcc081032e0 2026-03-09T00:01:30.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.355+0000 7fcc07fff700 1 --2- 192.168.123.103:0/1153878726 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc08103c90 0x7fcc08199380 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:01:30.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.355+0000 7fcc0cf6f700 1 --2- 192.168.123.103:0/1153878726 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcc081032e0 0x7fcc08198e40 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fcbf800b5c0 tx=0x7fcbf80049b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:30.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.355+0000 7fcc05ffb700 1 -- 192.168.123.103:0/1153878726 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcbf801d070 con 0x7fcc081032e0 2026-03-09T00:01:30.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.355+0000 7fcc0f1d3700 1 -- 192.168.123.103:0/1153878726 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcc0819da70 con 0x7fcc081032e0 2026-03-09T00:01:30.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.355+0000 7fcc0f1d3700 1 -- 192.168.123.103:0/1153878726 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcc0819df60 con 0x7fcc081032e0 2026-03-09T00:01:30.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.356+0000 7fcc05ffb700 1 -- 192.168.123.103:0/1153878726 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcbf800bc50 con 0x7fcc081032e0 2026-03-09T00:01:30.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.356+0000 7fcc05ffb700 1 -- 192.168.123.103:0/1153878726 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcbf8017720 con 0x7fcc081032e0 2026-03-09T00:01:30.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.357+0000 7fcc05ffb700 1 -- 192.168.123.103:0/1153878726 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fcbf8017940 con 0x7fcc081032e0 2026-03-09T00:01:30.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.357+0000 7fcc0f1d3700 1 -- 192.168.123.103:0/1153878726 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcc0810b630 con 0x7fcc081032e0 2026-03-09T00:01:30.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.357+0000 7fcc05ffb700 1 --2- 192.168.123.103:0/1153878726 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fcbf006c4e0 0x7fcbf006e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:30.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.357+0000 7fcc05ffb700 1 -- 192.168.123.103:0/1153878726 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fcbf808ce30 con 0x7fcc081032e0 2026-03-09T00:01:30.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.358+0000 7fcc07fff700 1 --2- 
192.168.123.103:0/1153878726 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fcbf006c4e0 0x7fcbf006e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:30.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.358+0000 7fcc07fff700 1 --2- 192.168.123.103:0/1153878726 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fcbf006c4e0 0x7fcbf006e9a0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fcc0819a460 tx=0x7fcbfc008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:30.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.361+0000 7fcc05ffb700 1 -- 192.168.123.103:0/1153878726 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fcbf80589a0 con 0x7fcc081032e0 2026-03-09T00:01:30.463 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:30 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/3701713411' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-09T00:01:30.463 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:30 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/3579886649' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T00:01:30.463 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:30 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:01:30.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.463+0000 7fcc0f1d3700 1 -- 192.168.123.103:0/1153878726 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7fcc0804ea90 con 0x7fcc081032e0 2026-03-09T00:01:30.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.466+0000 7fcc05ffb700 1 -- 192.168.123.103:0/1153878726 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v33) v1 ==== 74+0+11272 (secure 0 0 0) 0x7fcbf8027090 con 0x7fcc081032e0 2026-03-09T00:01:30.465 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:01:30.465 
INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":33,"fsid":"ae8f0172-1b4a-11f1-916a-712b2ac006b7","created":"2026-03-08T23:58:56.272672+0000","modified":"2026-03-09T00:01:27.307218+0000","last_up_change":"2026-03-09T00:01:26.296864+0000","last_in_change":"2026-03-09T00:01:14.917922+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-09T00:00:56.171195+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"20","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"1eefdd28-e5a7-4e98-a454-60c0bb654070","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6802","nonce":3077634834},{"type":"v1","addr":"192.168.123.103:6803","nonce":3077634834}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":3077634834},{"type":"v1","addr":"192.168.123.103:6805","nonce":3077634834}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6808","nonce":3077634834},{"type":"v1","addr":"192.168.123.103:6809","nonce":3077634834}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":3077634834},{"type":"v1","addr":"192.168.123.103:6807","nonce":3077634834}]},"public_addr":"192.168.123.103:6803/3077634834","cluster_addr":"192.168.123.103:6805/3077634834","heartbeat_back_addr":"192.168.123.103:6809/3077634834","heartbeat_front_addr":"192.168.123.103:
6807/3077634834","state":["exists","up"]},{"osd":1,"uuid":"6a8d6e5f-c441-499a-a4bb-8d9bc046a85f","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":24,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6810","nonce":3932541768},{"type":"v1","addr":"192.168.123.103:6811","nonce":3932541768}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6812","nonce":3932541768},{"type":"v1","addr":"192.168.123.103:6813","nonce":3932541768}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6816","nonce":3932541768},{"type":"v1","addr":"192.168.123.103:6817","nonce":3932541768}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6814","nonce":3932541768},{"type":"v1","addr":"192.168.123.103:6815","nonce":3932541768}]},"public_addr":"192.168.123.103:6811/3932541768","cluster_addr":"192.168.123.103:6813/3932541768","heartbeat_back_addr":"192.168.123.103:6817/3932541768","heartbeat_front_addr":"192.168.123.103:6815/3932541768","state":["exists","up"]},{"osd":2,"uuid":"57704364-d509-479a-8dff-0b9f590cc6d0","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6818","nonce":901207923},{"type":"v1","addr":"192.168.123.103:6819","nonce":901207923}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6820","nonce":901207923},{"type":"v1","addr":"192.168.123.103:6821","nonce":901207923}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6824","nonce":901207923},{"type":"v1","addr":"192.168.123.103:6825","nonce":901207923}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6822","nonce":901207923},{"type":"v1","addr":"192.168.123.103:6823","nonce":901207923}]},"public_addr":"192.168.123.103:6819/901207923","cluster_addr":"192.168.123.103:6821/901207923","heartbeat_back_addr":"192.168.123.103:6825/901207923","heartbeat_front_addr":"192.168.123.103:6823/901207923","state":["exists","up"]},{"osd":3,"uuid":"8b49bccb-fd91-44f4-831e-a401044f0e64","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":23,"up_thru":27,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6800","nonce":1431049389},{"type":"v1","addr":"192.168.123.106:6801","nonce":1431049389}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6802","nonce":1431049389},{"type":"v1","addr":"192.168.123.106:6803","nonce":1431049389}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6806","nonce":1431049389},{"type":"v1","addr":"192.168.123.106:6807","nonce":1431049389}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6804","nonce":1431049389},{"type":"v1","addr":"192.168.123.106:6805","nonce":1431049389}]},"public_addr":"192.168.123.106:6801/1431049389","cluster_addr":"192.168.123.106:6803/1431049389","heartbeat_back_addr":"192.168.123.106:6807/1431049389","heartbeat_front_addr":"192.168.123.106:6805/1431049389","state":["exists","up"]},{"osd":4,"uuid":"8c3a4d00-bb0a-4f59-b53b-83364e99627b","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":28,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6808","nonce":1004293893},{"type":"v1","addr":"192.168.123.106:6809","nonce":100
4293893}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6810","nonce":1004293893},{"type":"v1","addr":"192.168.123.106:6811","nonce":1004293893}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6814","nonce":1004293893},{"type":"v1","addr":"192.168.123.106:6815","nonce":1004293893}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6812","nonce":1004293893},{"type":"v1","addr":"192.168.123.106:6813","nonce":1004293893}]},"public_addr":"192.168.123.106:6809/1004293893","cluster_addr":"192.168.123.106:6811/1004293893","heartbeat_back_addr":"192.168.123.106:6815/1004293893","heartbeat_front_addr":"192.168.123.106:6813/1004293893","state":["exists","up"]},{"osd":5,"uuid":"15462a2c-77f6-4f87-a9bf-e5fe4de71f8f","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":32,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6816","nonce":1139626945},{"type":"v1","addr":"192.168.123.106:6817","nonce":1139626945}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6818","nonce":1139626945},{"type":"v1","addr":"192.168.123.106:6819","nonce":1139626945}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6822","nonce":1139626945},{"type":"v1","addr":"192.168.123.106:6823","nonce":1139626945}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6820","nonce":1139626945},{"type":"v1","addr":"192.168.123.106:6821","nonce":1139626945}]},"public_addr":"192.168.123.106:6817/1139626945","cluster_addr":"192.168.123.106:6819/1139626945","heartbeat_back_addr":"192.168.123.106:6823/1139626945","heartbeat_front_addr":"192.168.123.106:6821/1139626945","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T00:00:33.479111+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T00:00:43.710028+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T00:00:54.267401+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T00:01:05.164542+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T00:01:15.083285+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-09T00:01:24.275952+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.103:0/2723392878":"2026-03-09T23:59:59.972086+0000","192.168.123.103:0/78662117":"2026-03-09T23:59:25.439954+0000","192.168.123.103:0/2768716536":"2026-03-09T23:59:59.972086+0000","192.168.123.103:0/2535486397":"2026-03-09T23:59:59.972086+0000","192.168.123.103:0/2762219228":"2026-03-09T23:59:25.439954+0000","192.168.123.103:6801/2":"2026-03-09T23:59:11.665534+0000","192.168.123.103:0/1691249097":"2026-03-09T23:59:11.665534+0000","192.168.123.103:0/1899
662013":"2026-03-09T23:59:11.665534+0000","192.168.123.103:0/1443304653":"2026-03-09T23:59:11.665534+0000","192.168.123.103:6800/2":"2026-03-09T23:59:11.665534+0000","192.168.123.103:0/2143338907":"2026-03-09T23:59:25.439954+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-09T00:01:30.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.468+0000 7fcc0f1d3700 1 -- 192.168.123.103:0/1153878726 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fcbf006c4e0 msgr2=0x7fcbf006e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:30.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.468+0000 7fcc0f1d3700 1 --2- 192.168.123.103:0/1153878726 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fcbf006c4e0 0x7fcbf006e9a0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fcc0819a460 tx=0x7fcbfc008040 comp rx=0 tx=0).stop 2026-03-09T00:01:30.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.468+0000 7fcc0f1d3700 1 -- 192.168.123.103:0/1153878726 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcc081032e0 msgr2=0x7fcc08198e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:30.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.468+0000 7fcc0f1d3700 1 --2- 192.168.123.103:0/1153878726 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcc081032e0 0x7fcc08198e40 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fcbf800b5c0 tx=0x7fcbf80049b0 comp rx=0 tx=0).stop 2026-03-09T00:01:30.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.469+0000 7fcc0f1d3700 1 -- 192.168.123.103:0/1153878726 shutdown_connections 2026-03-09T00:01:30.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.469+0000 7fcc0f1d3700 1 --2- 192.168.123.103:0/1153878726 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fcbf006c4e0 0x7fcbf006e9a0 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:30.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.469+0000 7fcc0f1d3700 1 --2- 192.168.123.103:0/1153878726 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcc081032e0 0x7fcc08198e40 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:30.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.469+0000 7fcc0f1d3700 1 --2- 192.168.123.103:0/1153878726 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcc08103c90 0x7fcc08199380 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:30.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.469+0000 7fcc0f1d3700 1 -- 192.168.123.103:0/1153878726 >> 192.168.123.103:0/1153878726 conn(0x7fcc080feb50 msgr2=0x7fcc08100200 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:30.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.469+0000 7fcc0f1d3700 1 -- 192.168.123.103:0/1153878726 shutdown_connections 2026-03-09T00:01:30.468 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:30.469+0000 7fcc0f1d3700 1 -- 192.168.123.103:0/1153878726 wait complete. 2026-03-09T00:01:30.514 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph tell osd.0 flush_pg_stats 2026-03-09T00:01:30.514 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph tell osd.1 flush_pg_stats 2026-03-09T00:01:30.514 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph tell osd.2 flush_pg_stats 2026-03-09T00:01:30.514 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph tell osd.3 flush_pg_stats 2026-03-09T00:01:30.514 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph tell osd.4 flush_pg_stats 2026-03-09T00:01:30.514 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph tell osd.5 flush_pg_stats 2026-03-09T00:01:30.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:30 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/3701713411' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-09T00:01:30.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:30 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/3579886649' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T00:01:30.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:30 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:01:30.939 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:30.941 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:31.001 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:31.026 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:31.031 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:31.140 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:31.510 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:31 vm03 ceph-mon[52346]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:01:31.510 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:31 vm03 ceph-mon[52346]: from='client.? 
192.168.123.103:0/1153878726' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T00:01:31.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:31 vm06 ceph-mon[58395]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:01:31.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:31 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/1153878726' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-09T00:01:31.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.693+0000 7f58e2fcb700 1 -- 192.168.123.103:0/2258772678 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58d4094de0 msgr2=0x7f58d4095240 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.693+0000 7f58e2fcb700 1 --2- 192.168.123.103:0/2258772678 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58d4094de0 0x7f58d4095240 secure :-1 s=READY pgs=208 cs=0 l=1 rev1=1 crypto rx=0x7f58cc009b00 tx=0x7f58cc009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:31.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.709+0000 7f58e2fcb700 1 -- 192.168.123.103:0/2258772678 shutdown_connections 2026-03-09T00:01:31.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.709+0000 7f58e2fcb700 1 --2- 192.168.123.103:0/2258772678 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58d4094de0 0x7f58d4095240 unknown :-1 s=CLOSED pgs=208 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.709+0000 7f58e2fcb700 1 --2- 192.168.123.103:0/2258772678 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58d409ad30 0x7f58d409b110 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.709+0000 7f58e2fcb700 1 -- 192.168.123.103:0/2258772678 >> 192.168.123.103:0/2258772678 conn(0x7f58d400a830 msgr2=0x7f58d400ac40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:31.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.709+0000 7f58e2fcb700 1 -- 192.168.123.103:0/2258772678 shutdown_connections 2026-03-09T00:01:31.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.710+0000 7f58e2fcb700 1 -- 192.168.123.103:0/2258772678 wait complete. 
2026-03-09T00:01:31.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.714+0000 7f58e2fcb700 1 Processor -- start 2026-03-09T00:01:31.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.717+0000 7f58e2fcb700 1 -- start start 2026-03-09T00:01:31.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.717+0000 7f58e2fcb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58d4094de0 0x7f58d4144ab0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.717+0000 7f58e2fcb700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58d409ad30 0x7f58d4144ff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.717+0000 7f58e1fc9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58d4094de0 0x7f58d4144ab0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.717+0000 7f58e1fc9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58d4094de0 0x7f58d4144ab0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41052/0 (socket says 192.168.123.103:41052) 2026-03-09T00:01:31.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.717+0000 7f58e2fcb700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f58d413eaa0 con 0x7f58d4094de0 2026-03-09T00:01:31.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.717+0000 7f58e2fcb700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f58d413ec10 con 0x7f58d409ad30 2026-03-09T00:01:31.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.717+0000 7f58e1fc9700 1 -- 192.168.123.103:0/3370421512 learned_addr learned my addr 192.168.123.103:0/3370421512 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:31.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.718+0000 7f58e1fc9700 1 -- 192.168.123.103:0/3370421512 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58d409ad30 msgr2=0x7f58d4144ff0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.718+0000 7f58e1fc9700 1 --2- 192.168.123.103:0/3370421512 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58d409ad30 0x7f58d4144ff0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.718+0000 7f58e1fc9700 1 -- 192.168.123.103:0/3370421512 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f58cc0097e0 con 0x7f58d4094de0 2026-03-09T00:01:31.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.718+0000 7f58e1fc9700 1 --2- 192.168.123.103:0/3370421512 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58d4094de0 0x7f58d4144ab0 secure :-1 s=READY pgs=209 cs=0 l=1 rev1=1 crypto rx=0x7f58d800d8d0 tx=0x7f58d800dc90 comp rx=0 tx=0).ready 
entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:31.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.718+0000 7f58d2ffd700 1 -- 192.168.123.103:0/3370421512 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f58d8009940 con 0x7f58d4094de0 2026-03-09T00:01:31.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.718+0000 7f58e2fcb700 1 -- 192.168.123.103:0/3370421512 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f58d413eef0 con 0x7f58d4094de0 2026-03-09T00:01:31.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.718+0000 7f58e2fcb700 1 -- 192.168.123.103:0/3370421512 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f58d413f440 con 0x7f58d4094de0 2026-03-09T00:01:31.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.718+0000 7f58d2ffd700 1 -- 192.168.123.103:0/3370421512 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f58d8010460 con 0x7f58d4094de0 2026-03-09T00:01:31.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.718+0000 7f58d2ffd700 1 -- 192.168.123.103:0/3370421512 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f58d800f5d0 con 0x7f58d4094de0 2026-03-09T00:01:31.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.724+0000 7f58d2ffd700 1 -- 192.168.123.103:0/3370421512 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f58d8010a90 con 0x7f58d4094de0 2026-03-09T00:01:31.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.724+0000 7f58d2ffd700 1 --2- 192.168.123.103:0/3370421512 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f58c8074ef0 0x7f58c80773b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.726+0000 7f58d2ffd700 1 -- 192.168.123.103:0/3370421512 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f58d808b060 con 0x7f58d4094de0 2026-03-09T00:01:31.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.726+0000 7f58e2fcb700 1 --2- 192.168.123.103:0/3370421512 >> [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834] conn(0x7f58c0001610 0x7f58c0003ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.727+0000 7f58e2fcb700 1 -- 192.168.123.103:0/3370421512 --> [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f58c0006c00 con 0x7f58c0001610 2026-03-09T00:01:31.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.729+0000 7f58e27ca700 1 --2- 192.168.123.103:0/3370421512 >> [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834] conn(0x7f58c0001610 0x7f58c0003ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.729+0000 7f58e17c8700 1 --2- 192.168.123.103:0/3370421512 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f58c8074ef0 0x7f58c80773b0 
unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.729+0000 7f58e27ca700 1 --2- 192.168.123.103:0/3370421512 >> [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834] conn(0x7f58c0001610 0x7f58c0003ad0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:31.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.729+0000 7f58e17c8700 1 --2- 192.168.123.103:0/3370421512 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f58c8074ef0 0x7f58c80773b0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f58cc00b5c0 tx=0x7f58cc009f90 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:31.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.730+0000 7f58d2ffd700 1 -- 192.168.123.103:0/3370421512 <== osd.0 v2:192.168.123.103:6802/3077634834 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f58c0006c00 con 0x7f58c0001610 2026-03-09T00:01:31.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.771+0000 7f58e2fcb700 1 -- 192.168.123.103:0/3370421512 --> [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f58c0005c80 con 0x7f58c0001610 2026-03-09T00:01:31.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.772+0000 7f58d2ffd700 1 -- 192.168.123.103:0/3370421512 <== osd.0 v2:192.168.123.103:6802/3077634834 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f58c0005c80 con 0x7f58c0001610 2026-03-09T00:01:31.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.772+0000 7f58e2fcb700 1 -- 192.168.123.103:0/3370421512 >> [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834] conn(0x7f58c0001610 msgr2=0x7f58c0003ad0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.772+0000 7f58e2fcb700 1 --2- 192.168.123.103:0/3370421512 >> [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834] conn(0x7f58c0001610 0x7f58c0003ad0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.773+0000 7f58e2fcb700 1 -- 192.168.123.103:0/3370421512 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f58c8074ef0 msgr2=0x7f58c80773b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.773+0000 7f58e2fcb700 1 --2- 192.168.123.103:0/3370421512 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f58c8074ef0 0x7f58c80773b0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f58cc00b5c0 tx=0x7f58cc009f90 comp rx=0 tx=0).stop 2026-03-09T00:01:31.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.773+0000 7f58e2fcb700 1 -- 192.168.123.103:0/3370421512 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58d4094de0 msgr2=0x7f58d4144ab0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.773+0000 7f58e2fcb700 1 --2- 192.168.123.103:0/3370421512 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58d4094de0 0x7f58d4144ab0 secure :-1 s=READY pgs=209 cs=0 l=1 rev1=1 crypto rx=0x7f58d800d8d0 tx=0x7f58d800dc90 comp rx=0 tx=0).stop 2026-03-09T00:01:31.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.773+0000 7f58e2fcb700 1 -- 192.168.123.103:0/3370421512 shutdown_connections 2026-03-09T00:01:31.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.773+0000 7f58e2fcb700 1 --2- 192.168.123.103:0/3370421512 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f58c8074ef0 0x7f58c80773b0 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.773+0000 7f58e2fcb700 1 --2- 192.168.123.103:0/3370421512 >> [v2:192.168.123.103:6802/3077634834,v1:192.168.123.103:6803/3077634834] conn(0x7f58c0001610 0x7f58c0003ad0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.773+0000 7f58e2fcb700 1 --2- 192.168.123.103:0/3370421512 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f58d4094de0 0x7f58d4144ab0 unknown :-1 s=CLOSED pgs=209 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.773+0000 7f58e2fcb700 1 --2- 192.168.123.103:0/3370421512 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f58d409ad30 0x7f58d4144ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.773+0000 7f58e2fcb700 1 -- 192.168.123.103:0/3370421512 >> 192.168.123.103:0/3370421512 conn(0x7f58d400a830 msgr2=0x7f58d4004d40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:31.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.773+0000 7f58e2fcb700 1 -- 192.168.123.103:0/3370421512 shutdown_connections 2026-03-09T00:01:31.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.773+0000 7f58e2fcb700 1 -- 192.168.123.103:0/3370421512 wait complete. 
2026-03-09T00:01:31.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.817+0000 7f52d2db4700 1 -- 192.168.123.103:0/1114319709 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52cc10d0f0 msgr2=0x7f52cc10d570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.817+0000 7f52d2db4700 1 --2- 192.168.123.103:0/1114319709 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52cc10d0f0 0x7f52cc10d570 secure :-1 s=READY pgs=211 cs=0 l=1 rev1=1 crypto rx=0x7f52bc009b00 tx=0x7f52bc009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:31.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.819+0000 7f52d2db4700 1 -- 192.168.123.103:0/1114319709 shutdown_connections 2026-03-09T00:01:31.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.819+0000 7f52d2db4700 1 --2- 192.168.123.103:0/1114319709 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52cc10d0f0 0x7f52cc10d570 unknown :-1 s=CLOSED pgs=211 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.819+0000 7f52d2db4700 1 --2- 192.168.123.103:0/1114319709 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f52cc10f340 0x7f52cc10f720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.819+0000 7f52d2db4700 1 -- 192.168.123.103:0/1114319709 >> 192.168.123.103:0/1114319709 conn(0x7f52cc06ce20 msgr2=0x7f52cc06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:31.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.820+0000 7f52d2db4700 1 -- 192.168.123.103:0/1114319709 shutdown_connections 2026-03-09T00:01:31.822 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.823+0000 7f52d2db4700 1 -- 192.168.123.103:0/1114319709 wait complete. 
2026-03-09T00:01:31.822 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.823+0000 7f52d2db4700 1 Processor -- start 2026-03-09T00:01:31.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.823+0000 7f52d2db4700 1 -- start start 2026-03-09T00:01:31.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.823+0000 7f52d2db4700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52cc10d0f0 0x7f52cc1180b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.823+0000 7f52d2db4700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f52cc10f340 0x7f52cc1185f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.823+0000 7f52d2db4700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f52cc11c240 con 0x7f52cc10d0f0 2026-03-09T00:01:31.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.823+0000 7f52d2db4700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f52cc11c3b0 con 0x7f52cc10f340 2026-03-09T00:01:31.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.824+0000 7f52d1db2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52cc10d0f0 0x7f52cc1180b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.824+0000 7f52d1db2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52cc10d0f0 0x7f52cc1180b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41080/0 (socket says 192.168.123.103:41080) 2026-03-09T00:01:31.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.824+0000 7f52d1db2700 1 -- 192.168.123.103:0/2081328396 learned_addr learned my addr 192.168.123.103:0/2081328396 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:31.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.824+0000 7f52d15b1700 1 --2- 192.168.123.103:0/2081328396 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f52cc10f340 0x7f52cc1185f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.825+0000 7fe9f4e8c700 1 -- 192.168.123.103:0/2882629686 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9f010f660 msgr2=0x7fe9f0107d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.825+0000 7fe9f4e8c700 1 --2- 192.168.123.103:0/2882629686 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9f010f660 0x7fe9f0107d90 secure :-1 s=READY pgs=210 cs=0 l=1 rev1=1 crypto rx=0x7fe9e0009b00 tx=0x7fe9e0009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:31.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.825+0000 7fe9f4e8c700 1 -- 192.168.123.103:0/2882629686 shutdown_connections 2026-03-09T00:01:31.825 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.825+0000 7fe9f4e8c700 1 --2- 192.168.123.103:0/2882629686 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9f01082d0 0x7fe9f0108750 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.825+0000 7fe9f4e8c700 1 --2- 192.168.123.103:0/2882629686 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9f010f660 0x7fe9f0107d90 unknown :-1 s=CLOSED pgs=210 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.825+0000 7fe9f4e8c700 1 -- 192.168.123.103:0/2882629686 >> 192.168.123.103:0/2882629686 conn(0x7fe9f006d0f0 msgr2=0x7fe9f006d500 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:31.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.825+0000 7fe9f4e8c700 1 -- 192.168.123.103:0/2882629686 shutdown_connections 2026-03-09T00:01:31.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.826+0000 7f52d1db2700 1 -- 192.168.123.103:0/2081328396 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f52cc10f340 msgr2=0x7f52cc1185f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.826+0000 7f52d1db2700 1 --2- 192.168.123.103:0/2081328396 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f52cc10f340 0x7f52cc1185f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.826+0000 7f52d1db2700 1 -- 192.168.123.103:0/2081328396 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f52bc0097e0 con 0x7f52cc10d0f0 2026-03-09T00:01:31.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.826+0000 7f52d1db2700 1 --2- 192.168.123.103:0/2081328396 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52cc10d0f0 0x7f52cc1180b0 secure :-1 s=READY pgs=212 cs=0 l=1 rev1=1 crypto rx=0x7f52c800eb40 tx=0x7f52c800ef00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:31.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.826+0000 7f52c2ffd700 1 -- 192.168.123.103:0/2081328396 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f52c800ccf0 con 0x7f52cc10d0f0 2026-03-09T00:01:31.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.826+0000 7f52d2db4700 1 -- 192.168.123.103:0/2081328396 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f52cc118bf0 con 0x7f52cc10d0f0 2026-03-09T00:01:31.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.826+0000 7f52d2db4700 1 -- 192.168.123.103:0/2081328396 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f52cc1b84b0 con 0x7f52cc10d0f0 2026-03-09T00:01:31.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.827+0000 7f52c2ffd700 1 -- 192.168.123.103:0/2081328396 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f52c800ce50 con 0x7f52cc10d0f0 2026-03-09T00:01:31.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.828+0000 7f52d2db4700 1 -- 
192.168.123.103:0/2081328396 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f52b0000ff0 con 0x7f52cc10d0f0 2026-03-09T00:01:31.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.828+0000 7f52c2ffd700 1 -- 192.168.123.103:0/2081328396 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f52c80188b0 con 0x7f52cc10d0f0 2026-03-09T00:01:31.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.825+0000 7fe9f4e8c700 1 -- 192.168.123.103:0/2882629686 wait complete. 2026-03-09T00:01:31.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.829+0000 7f52c2ffd700 1 -- 192.168.123.103:0/2081328396 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f52c8018b50 con 0x7f52cc10d0f0 2026-03-09T00:01:31.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.830+0000 7f52c2ffd700 1 --2- 192.168.123.103:0/2081328396 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f52b806c330 0x7f52b806e7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.830+0000 7fe9f4e8c700 1 Processor -- start 2026-03-09T00:01:31.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.833+0000 7fe9f4e8c700 1 -- start start 2026-03-09T00:01:31.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.833+0000 7fe9f4e8c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9f01082d0 0x7fe9f0119f20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.833+0000 7fe9f4e8c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9f010f660 0x7fe9f0115f80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.833+0000 7fe9f4e8c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe9f0116550 con 0x7fe9f010f660 2026-03-09T00:01:31.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.833+0000 7fe9f4e8c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe9f01166c0 con 0x7fe9f01082d0 2026-03-09T00:01:31.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.833+0000 7fe9edd9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9f010f660 0x7fe9f0115f80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.836+0000 7fe9edd9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9f010f660 0x7fe9f0115f80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41088/0 (socket says 192.168.123.103:41088) 2026-03-09T00:01:31.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.836+0000 7fe9edd9b700 1 -- 192.168.123.103:0/221071086 learned_addr learned my addr 192.168.123.103:0/221071086 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:31.836 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.837+0000 7fe9edd9b700 1 -- 192.168.123.103:0/221071086 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9f01082d0 msgr2=0x7fe9f0119f20 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:01:31.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.837+0000 7fe9ee59c700 1 --2- 192.168.123.103:0/221071086 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9f01082d0 0x7fe9f0119f20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.837+0000 7fe9edd9b700 1 --2- 192.168.123.103:0/221071086 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9f01082d0 0x7fe9f0119f20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.838+0000 7fe9edd9b700 1 -- 192.168.123.103:0/221071086 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe9e00097e0 con 0x7fe9f010f660 2026-03-09T00:01:31.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.838+0000 7f9e1358a700 1 -- 192.168.123.103:0/126033401 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9e0c107d90 msgr2=0x7f9e0c1081f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.838+0000 7f9e1358a700 1 --2- 192.168.123.103:0/126033401 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9e0c107d90 0x7f9e0c1081f0 secure :-1 s=READY pgs=213 cs=0 l=1 rev1=1 crypto rx=0x7f9e0400b780 tx=0x7f9e0400ba90 comp rx=0 tx=0).stop 2026-03-09T00:01:31.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.838+0000 7f9e1358a700 1 -- 192.168.123.103:0/126033401 shutdown_connections 2026-03-09T00:01:31.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.838+0000 7f9e1358a700 1 --2- 192.168.123.103:0/126033401 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9e0c107d90 0x7f9e0c1081f0 unknown :-1 s=CLOSED pgs=213 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.838+0000 7f9e1358a700 1 --2- 192.168.123.103:0/126033401 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9e0c10d310 0x7f9e0c10d6f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.838+0000 7f9e1358a700 1 -- 192.168.123.103:0/126033401 >> 192.168.123.103:0/126033401 conn(0x7f9e0c06ce20 msgr2=0x7f9e0c06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:31.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.838+0000 7fe9edd9b700 1 --2- 192.168.123.103:0/221071086 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9f010f660 0x7fe9f0115f80 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7fe9e400d8d0 tx=0x7fe9e400dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:31.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.839+0000 7f9e1358a700 1 -- 192.168.123.103:0/126033401 shutdown_connections 2026-03-09T00:01:31.839 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.839+0000 7f9e1358a700 1 -- 192.168.123.103:0/126033401 wait complete. 2026-03-09T00:01:31.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.839+0000 7f9e1358a700 1 Processor -- start 2026-03-09T00:01:31.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.839+0000 7f9e1358a700 1 -- start start 2026-03-09T00:01:31.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.839+0000 7f9e1358a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9e0c10d310 0x7f9e0c07ce00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.839+0000 7f9e1358a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9e0c07d340 0x7f9e0c07d7c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.839+0000 7f9e1358a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9e0c083e70 con 0x7f9e0c10d310 2026-03-09T00:01:31.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.839+0000 7f9e1358a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9e0c081980 con 0x7f9e0c07d340 2026-03-09T00:01:31.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.839+0000 7f9e12588700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9e0c10d310 0x7f9e0c07ce00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.839+0000 7f9e11d87700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9e0c07d340 0x7f9e0c07d7c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.839+0000 7f9e11d87700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9e0c07d340 0x7f9e0c07d7c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:58794/0 (socket says 192.168.123.103:58794) 2026-03-09T00:01:31.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.839+0000 7f9e11d87700 1 -- 192.168.123.103:0/1604324132 learned_addr learned my addr 192.168.123.103:0/1604324132 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:31.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.840+0000 7f9e11d87700 1 -- 192.168.123.103:0/1604324132 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9e0c10d310 msgr2=0x7f9e0c07ce00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.840+0000 7f9e11d87700 1 --2- 192.168.123.103:0/1604324132 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9e0c10d310 0x7f9e0c07ce00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.840+0000 7f9e11d87700 1 -- 192.168.123.103:0/1604324132 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9e0400b050 con 0x7f9e0c07d340 2026-03-09T00:01:31.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.840+0000 7f9e11d87700 1 --2- 192.168.123.103:0/1604324132 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9e0c07d340 0x7f9e0c07d7c0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f9e0400b600 tx=0x7f9e04009de0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:31.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.840+0000 7f9e037fe700 1 -- 192.168.123.103:0/1604324132 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9e04020a40 con 0x7f9e0c07d340 2026-03-09T00:01:31.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.840+0000 7fe9df7fe700 1 -- 192.168.123.103:0/221071086 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe9e4009940 con 0x7fe9f010f660 2026-03-09T00:01:31.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.840+0000 7fe9df7fe700 1 -- 192.168.123.103:0/221071086 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe9e4010460 con 0x7fe9f010f660 2026-03-09T00:01:31.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.840+0000 7fe9df7fe700 1 -- 192.168.123.103:0/221071086 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe9e400f5d0 con 0x7fe9f010f660 2026-03-09T00:01:31.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.840+0000 7fe9f4e8c700 1 -- 192.168.123.103:0/221071086 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe9f01169a0 con 0x7fe9f010f660 2026-03-09T00:01:31.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.840+0000 7f9e1358a700 1 -- 192.168.123.103:0/1604324132 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9e0c081c00 con 0x7f9e0c07d340 2026-03-09T00:01:31.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.840+0000 7f9e1358a700 1 -- 192.168.123.103:0/1604324132 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9e0c0820f0 con 0x7f9e0c07d340 2026-03-09T00:01:31.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.841+0000 7f9e037fe700 1 -- 192.168.123.103:0/1604324132 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9e04020ba0 con 0x7f9e0c07d340 2026-03-09T00:01:31.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.841+0000 7f9e037fe700 1 -- 192.168.123.103:0/1604324132 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9e0401a630 con 0x7f9e0c07d340 2026-03-09T00:01:31.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.841+0000 7fe9f4e8c700 1 -- 192.168.123.103:0/221071086 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe9f01b8620 con 0x7fe9f010f660 2026-03-09T00:01:31.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.842+0000 7fe9f4e8c700 1 -- 192.168.123.103:0/221071086 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7fe9f01106d0 con 0x7fe9f010f660 2026-03-09T00:01:31.841 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.842+0000 7f9e037fe700 1 -- 192.168.123.103:0/1604324132 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f9e0401a850 con 0x7f9e0c07d340 2026-03-09T00:01:31.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.843+0000 7f9e037fe700 1 --2- 192.168.123.103:0/1604324132 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9df806c600 0x7f9df806eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.843+0000 7f9e12588700 1 --2- 192.168.123.103:0/1604324132 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9df806c600 0x7f9df806eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.843+0000 7f9e037fe700 1 -- 192.168.123.103:0/1604324132 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f9e04091cc0 con 0x7f9e0c07d340 2026-03-09T00:01:31.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.844+0000 7f9e12588700 1 --2- 192.168.123.103:0/1604324132 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9df806c600 0x7f9df806eac0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f9e08009c80 tx=0x7f9e08009400 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:31.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.844+0000 7f9e1358a700 1 --2- 192.168.123.103:0/1604324132 >> [v2:192.168.123.106:6816/1139626945,v1:192.168.123.106:6817/1139626945] conn(0x7f9df0001610 0x7f9df0003ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.844+0000 7f9e1358a700 1 -- 192.168.123.103:0/1604324132 --> [v2:192.168.123.106:6816/1139626945,v1:192.168.123.106:6817/1139626945] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f9df0006c00 con 0x7f9df0001610 2026-03-09T00:01:31.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.844+0000 7f9e12d89700 1 --2- 192.168.123.103:0/1604324132 >> [v2:192.168.123.106:6816/1139626945,v1:192.168.123.106:6817/1139626945] conn(0x7f9df0001610 0x7f9df0003ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.844+0000 7f52d15b1700 1 --2- 192.168.123.103:0/2081328396 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f52b806c330 0x7f52b806e7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.844+0000 7f9e12d89700 1 --2- 192.168.123.103:0/1604324132 >> [v2:192.168.123.106:6816/1139626945,v1:192.168.123.106:6817/1139626945] conn(0x7f9df0001610 0x7f9df0003ad0 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.5 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:31.844 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.845+0000 7f9e037fe700 1 -- 
192.168.123.103:0/1604324132 <== osd.5 v2:192.168.123.106:6816/1139626945 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f9df0006c00 con 0x7f9df0001610 2026-03-09T00:01:31.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.845+0000 7fe9df7fe700 1 -- 192.168.123.103:0/221071086 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fe9e400f730 con 0x7fe9f010f660 2026-03-09T00:01:31.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.846+0000 7fe9df7fe700 1 --2- 192.168.123.103:0/221071086 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe9d806c600 0x7fe9d806eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.846+0000 7fe9df7fe700 1 -- 192.168.123.103:0/221071086 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fe9e408c810 con 0x7fe9f010f660 2026-03-09T00:01:31.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.846+0000 7fe9df7fe700 1 --2- 192.168.123.103:0/221071086 >> [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893] conn(0x7fe9d80721a0 0x7fe9d80745c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.846+0000 7fe9df7fe700 1 -- 192.168.123.103:0/221071086 --> [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7fe9d8074c70 con 0x7fe9d80721a0 2026-03-09T00:01:31.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.846+0000 7fe9df7fe700 1 -- 192.168.123.103:0/221071086 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_get_version_reply(handle=1 version=33) v2 ==== 24+0+0 (secure 0 0 0) 0x7fe9e408cbc0 con 0x7fe9f010f660 2026-03-09T00:01:31.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.846+0000 7fe9ee59c700 1 --2- 192.168.123.103:0/221071086 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe9d806c600 0x7fe9d806eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.851+0000 7f52c2ffd700 1 -- 192.168.123.103:0/2081328396 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f52c8014070 con 0x7f52cc10d0f0 2026-03-09T00:01:31.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.851+0000 7f52c2ffd700 1 --2- 192.168.123.103:0/2081328396 >> [v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923] conn(0x7f52b8071d60 0x7f52b8074180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.851+0000 7f52c2ffd700 1 -- 192.168.123.103:0/2081328396 --> [v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f52b8074830 con 0x7f52b8071d60 2026-03-09T00:01:31.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.851+0000 7f52d15b1700 1 --2- 192.168.123.103:0/2081328396 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f52b806c330 0x7f52b806e7f0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 
crypto rx=0x7f52bc00b5c0 tx=0x7f52bc005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:31.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.852+0000 7f52d25b3700 1 --2- 192.168.123.103:0/2081328396 >> [v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923] conn(0x7f52b8071d60 0x7f52b8074180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.852+0000 7f52c2ffd700 1 -- 192.168.123.103:0/2081328396 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_get_version_reply(handle=1 version=33) v2 ==== 24+0+0 (secure 0 0 0) 0x7f52c80569f0 con 0x7f52cc10d0f0 2026-03-09T00:01:31.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.852+0000 7fe9eed9d700 1 --2- 192.168.123.103:0/221071086 >> [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893] conn(0x7fe9d80721a0 0x7fe9d80745c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.854+0000 7fe9eed9d700 1 --2- 192.168.123.103:0/221071086 >> [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893] conn(0x7fe9d80721a0 0x7fe9d80745c0 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.4 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:31.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.854+0000 7fe9ee59c700 1 --2- 192.168.123.103:0/221071086 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe9d806c600 0x7fe9d806eac0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fe9e0000c00 tx=0x7fe9e0005c00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:31.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.855+0000 7f52d25b3700 1 --2- 192.168.123.103:0/2081328396 >> [v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923] conn(0x7f52b8071d60 0x7f52b8074180 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.2 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:31.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.855+0000 7f52c2ffd700 1 -- 192.168.123.103:0/2081328396 <== osd.2 v2:192.168.123.103:6818/901207923 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f52b8074830 con 0x7f52b8071d60 2026-03-09T00:01:31.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.857+0000 7fe9df7fe700 1 -- 192.168.123.103:0/221071086 <== osd.4 v2:192.168.123.106:6808/1004293893 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7fe9d8074c70 con 0x7fe9d80721a0 2026-03-09T00:01:31.862 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.863+0000 7f9e1358a700 1 -- 192.168.123.103:0/1604324132 --> [v2:192.168.123.106:6816/1139626945,v1:192.168.123.106:6817/1139626945] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f9df0005ce0 con 0x7f9df0001610 2026-03-09T00:01:31.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.867+0000 7f9e037fe700 1 -- 192.168.123.103:0/1604324132 <== osd.5 v2:192.168.123.106:6816/1139626945 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7f9df0005ce0 con 0x7f9df0001610 
2026-03-09T00:01:31.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.868+0000 7f9e017fa700 1 -- 192.168.123.103:0/1604324132 >> [v2:192.168.123.106:6816/1139626945,v1:192.168.123.106:6817/1139626945] conn(0x7f9df0001610 msgr2=0x7f9df0003ad0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.868+0000 7f9e017fa700 1 --2- 192.168.123.103:0/1604324132 >> [v2:192.168.123.106:6816/1139626945,v1:192.168.123.106:6817/1139626945] conn(0x7f9df0001610 0x7f9df0003ad0 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.868+0000 7f9e017fa700 1 -- 192.168.123.103:0/1604324132 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9df806c600 msgr2=0x7f9df806eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.868+0000 7f9e017fa700 1 --2- 192.168.123.103:0/1604324132 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9df806c600 0x7f9df806eac0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f9e08009c80 tx=0x7f9e08009400 comp rx=0 tx=0).stop 2026-03-09T00:01:31.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.868+0000 7f9e017fa700 1 -- 192.168.123.103:0/1604324132 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9e0c07d340 msgr2=0x7f9e0c07d7c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.868+0000 7f9e017fa700 1 --2- 192.168.123.103:0/1604324132 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9e0c07d340 0x7f9e0c07d7c0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f9e0400b600 tx=0x7f9e04009de0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.868+0000 7f9e017fa700 1 -- 192.168.123.103:0/1604324132 shutdown_connections 2026-03-09T00:01:31.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.869+0000 7f9e017fa700 1 --2- 192.168.123.103:0/1604324132 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f9df806c600 0x7f9df806eac0 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.869+0000 7f9e017fa700 1 --2- 192.168.123.103:0/1604324132 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9e0c10d310 0x7f9e0c07ce00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.869+0000 7f9e017fa700 1 --2- 192.168.123.103:0/1604324132 >> [v2:192.168.123.106:6816/1139626945,v1:192.168.123.106:6817/1139626945] conn(0x7f9df0001610 0x7f9df0003ad0 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.869+0000 7f9e017fa700 1 --2- 192.168.123.103:0/1604324132 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9e0c07d340 0x7f9e0c07d7c0 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.869+0000 7f9e017fa700 1 -- 192.168.123.103:0/1604324132 >> 192.168.123.103:0/1604324132 
conn(0x7f9e0c06ce20 msgr2=0x7f9e0c071050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:31.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.869+0000 7f9e017fa700 1 -- 192.168.123.103:0/1604324132 shutdown_connections 2026-03-09T00:01:31.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.869+0000 7f9e017fa700 1 -- 192.168.123.103:0/1604324132 wait complete. 2026-03-09T00:01:31.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.869+0000 7f52d2db4700 1 -- 192.168.123.103:0/2081328396 --> [v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f52b0002ce0 con 0x7f52b8071d60 2026-03-09T00:01:31.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.873+0000 7f52c2ffd700 1 -- 192.168.123.103:0/2081328396 <== osd.2 v2:192.168.123.103:6818/901207923 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f52b0002ce0 con 0x7f52b8071d60 2026-03-09T00:01:31.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.873+0000 7f52c0ff9700 1 -- 192.168.123.103:0/2081328396 >> [v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923] conn(0x7f52b8071d60 msgr2=0x7f52b8074180 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.873+0000 7f52c0ff9700 1 --2- 192.168.123.103:0/2081328396 >> [v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923] conn(0x7f52b8071d60 0x7f52b8074180 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.873+0000 7f52c0ff9700 1 -- 192.168.123.103:0/2081328396 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f52b806c330 msgr2=0x7f52b806e7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.873+0000 7f52c0ff9700 1 --2- 192.168.123.103:0/2081328396 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f52b806c330 0x7f52b806e7f0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f52bc00b5c0 tx=0x7f52bc005fb0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.876+0000 7f52c0ff9700 1 -- 192.168.123.103:0/2081328396 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52cc10d0f0 msgr2=0x7f52cc1180b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.876+0000 7f52c0ff9700 1 --2- 192.168.123.103:0/2081328396 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52cc10d0f0 0x7f52cc1180b0 secure :-1 s=READY pgs=212 cs=0 l=1 rev1=1 crypto rx=0x7f52c800eb40 tx=0x7f52c800ef00 comp rx=0 tx=0).stop 2026-03-09T00:01:31.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.877+0000 7f52c0ff9700 1 -- 192.168.123.103:0/2081328396 shutdown_connections 2026-03-09T00:01:31.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.877+0000 7f52c0ff9700 1 --2- 192.168.123.103:0/2081328396 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f52b806c330 0x7f52b806e7f0 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.877+0000 7f52c0ff9700 1 --2- 192.168.123.103:0/2081328396 >> 
[v2:192.168.123.103:6818/901207923,v1:192.168.123.103:6819/901207923] conn(0x7f52b8071d60 0x7f52b8074180 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.877+0000 7f52c0ff9700 1 --2- 192.168.123.103:0/2081328396 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f52cc10d0f0 0x7f52cc1180b0 unknown :-1 s=CLOSED pgs=212 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.877+0000 7f52c0ff9700 1 --2- 192.168.123.103:0/2081328396 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f52cc10f340 0x7f52cc1185f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.877+0000 7f52c0ff9700 1 -- 192.168.123.103:0/2081328396 >> 192.168.123.103:0/2081328396 conn(0x7f52cc06ce20 msgr2=0x7f52cc070450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:31.878 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.879+0000 7f52c0ff9700 1 -- 192.168.123.103:0/2081328396 shutdown_connections 2026-03-09T00:01:31.878 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.879+0000 7f52c0ff9700 1 -- 192.168.123.103:0/2081328396 wait complete. 2026-03-09T00:01:31.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.879+0000 7f2fcf2e2700 1 -- 192.168.123.103:0/1265686770 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fc8107d90 msgr2=0x7f2fc8108210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.879+0000 7f2fcf2e2700 1 --2- 192.168.123.103:0/1265686770 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fc8107d90 0x7f2fc8108210 secure :-1 s=READY pgs=215 cs=0 l=1 rev1=1 crypto rx=0x7f2fb8009b00 tx=0x7f2fb8009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:31.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.883+0000 7f2fcf2e2700 1 -- 192.168.123.103:0/1265686770 shutdown_connections 2026-03-09T00:01:31.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.883+0000 7f2fcf2e2700 1 --2- 192.168.123.103:0/1265686770 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fc8107d90 0x7f2fc8108210 unknown :-1 s=CLOSED pgs=215 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.883+0000 7f2fcf2e2700 1 --2- 192.168.123.103:0/1265686770 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2fc810f2f0 0x7f2fc810f6d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.890 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.883+0000 7f2fcf2e2700 1 -- 192.168.123.103:0/1265686770 >> 192.168.123.103:0/1265686770 conn(0x7f2fc806ce10 msgr2=0x7f2fc806d220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:31.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.889+0000 7f2fcf2e2700 1 -- 192.168.123.103:0/1265686770 shutdown_connections 2026-03-09T00:01:31.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.890+0000 7f2fcf2e2700 1 -- 192.168.123.103:0/1265686770 wait complete. 
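
Each burst of stderr above is one short-lived ceph CLI process bootstrapping its MonClient: it dials every mon in the monmap in parallel (msgr2 on port 3300, legacy v1 on 6789), keeps whichever connection finishes the banner/hello/auth exchange first, marks the other down, subscribes to config and monmap (then mgrmap and osdmap), and finally tears everything down with mark_down, shutdown_connections, and "wait complete.". The same bootstrap can be driven from Python with python-rados; a minimal sketch, assuming a reachable cluster with a standard /etc/ceph/ceph.conf and client.admin keyring (neither of which comes from this run):

import json
import rados

# connect() performs the handshake logged above: race a connection to each
# mon, keep the winner, then mon_subscribe({config,monmap}).
cluster = rados.Rados(conffile='/etc/ceph/ceph.conf', name='client.admin')
cluster.connect()
try:
    print("fsid:", cluster.get_fsid())
    # mon_command() uses the same command path as the get_command_descriptions
    # and flush_pg_stats requests in the log.
    ret, outbuf, outs = cluster.mon_command(
        json.dumps({"prefix": "status", "format": "json"}), b'')
    print("status rc:", ret)
finally:
    # shutdown() produces the mark_down / shutdown_connections /
    # "wait complete." tail seen after every command here.
    cluster.shutdown()
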
2026-03-09T00:01:31.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.892+0000 7f2fcf2e2700 1 Processor -- start 2026-03-09T00:01:31.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.892+0000 7f2fcf2e2700 1 -- start start 2026-03-09T00:01:31.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.892+0000 7f2fcf2e2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fc810f2f0 0x7f2fc8118e00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.892+0000 7f2fcf2e2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2fc8113e00 0x7f2fc8114280 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.892+0000 7f2fcf2e2700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2fc81147c0 con 0x7f2fc810f2f0 2026-03-09T00:01:31.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.892+0000 7f2fcf2e2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2fc8114930 con 0x7f2fc8113e00 2026-03-09T00:01:31.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.895+0000 7f2fcc87d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2fc8113e00 0x7f2fc8114280 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.895+0000 7f2fcc87d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2fc8113e00 0x7f2fc8114280 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:58818/0 (socket says 192.168.123.103:58818) 2026-03-09T00:01:31.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.895+0000 7f2fcc87d700 1 -- 192.168.123.103:0/761505563 learned_addr learned my addr 192.168.123.103:0/761505563 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:31.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.897+0000 7f2fcd07e700 1 --2- 192.168.123.103:0/761505563 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fc810f2f0 0x7f2fc8118e00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.898+0000 7f2fcc87d700 1 -- 192.168.123.103:0/761505563 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fc810f2f0 msgr2=0x7f2fc8118e00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.898+0000 7f2fcc87d700 1 --2- 192.168.123.103:0/761505563 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fc810f2f0 0x7f2fc8118e00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.898+0000 7f2fcc87d700 1 -- 192.168.123.103:0/761505563 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2fb80097e0 con 
0x7f2fc8113e00 2026-03-09T00:01:31.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.900+0000 7f2fcc87d700 1 --2- 192.168.123.103:0/761505563 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2fc8113e00 0x7f2fc8114280 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f2fb80056a0 tx=0x7f2fb800f6d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:31.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.904+0000 7f2fbe7fc700 1 -- 192.168.123.103:0/761505563 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2fb801c070 con 0x7f2fc8113e00 2026-03-09T00:01:31.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.905+0000 7f2fcf2e2700 1 -- 192.168.123.103:0/761505563 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2fc8114bb0 con 0x7f2fc8113e00 2026-03-09T00:01:31.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.905+0000 7f2fcf2e2700 1 -- 192.168.123.103:0/761505563 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2fc81a6640 con 0x7f2fc8113e00 2026-03-09T00:01:31.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.906+0000 7f2fb3fff700 1 -- 192.168.123.103:0/761505563 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f2fac000ff0 con 0x7f2fc8113e00 2026-03-09T00:01:31.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.908+0000 7f2fbe7fc700 1 -- 192.168.123.103:0/761505563 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2fb800fca0 con 0x7f2fc8113e00 2026-03-09T00:01:31.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.908+0000 7f2fbe7fc700 1 -- 192.168.123.103:0/761505563 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2fb8017850 con 0x7f2fc8113e00 2026-03-09T00:01:31.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.909+0000 7f2fbe7fc700 1 -- 192.168.123.103:0/761505563 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f2fb80179f0 con 0x7f2fc8113e00 2026-03-09T00:01:31.921 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.917+0000 7f2fbe7fc700 1 --2- 192.168.123.103:0/761505563 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2fb406c600 0x7f2fb406eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.941 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.941+0000 7f2fcd07e700 1 --2- 192.168.123.103:0/761505563 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2fb406c600 0x7f2fb406eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:31.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.924+0000 7fe9f4e8c700 1 -- 192.168.123.103:0/221071086 --> [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7fe9f004ea90 con 0x7fe9d80721a0 2026-03-09T00:01:31.942 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.943+0000 7fe9df7fe700 1 -- 192.168.123.103:0/221071086 <== osd.4 v2:192.168.123.106:6808/1004293893 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7fe9f004ea90 
con 0x7fe9d80721a0 2026-03-09T00:01:31.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.947+0000 7f2fcd07e700 1 --2- 192.168.123.103:0/761505563 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2fb406c600 0x7f2fb406eac0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f2fc4009910 tx=0x7f2fc4008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:31.947 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.948+0000 7f2fbe7fc700 1 -- 192.168.123.103:0/761505563 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f2fb808df70 con 0x7f2fc8113e00 2026-03-09T00:01:31.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.948+0000 7f2fbe7fc700 1 --2- 192.168.123.103:0/761505563 >> [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768] conn(0x7f2fb40721a0 0x7f2fb40745c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:31.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.948+0000 7fe9f4e8c700 1 -- 192.168.123.103:0/221071086 >> [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893] conn(0x7fe9d80721a0 msgr2=0x7fe9d80745c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.948 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.948+0000 7fe9f4e8c700 1 --2- 192.168.123.103:0/221071086 >> [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893] conn(0x7fe9d80721a0 0x7fe9d80745c0 crc :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.952+0000 7fe9f4e8c700 1 -- 192.168.123.103:0/221071086 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe9d806c600 msgr2=0x7fe9d806eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.952+0000 7fe9f4e8c700 1 --2- 192.168.123.103:0/221071086 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe9d806c600 0x7fe9d806eac0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fe9e0000c00 tx=0x7fe9e0005c00 comp rx=0 tx=0).stop 2026-03-09T00:01:31.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.952+0000 7fe9f4e8c700 1 -- 192.168.123.103:0/221071086 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9f010f660 msgr2=0x7fe9f0115f80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:31.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.952+0000 7fe9f4e8c700 1 --2- 192.168.123.103:0/221071086 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9f010f660 0x7fe9f0115f80 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7fe9e400d8d0 tx=0x7fe9e400dc90 comp rx=0 tx=0).stop 2026-03-09T00:01:31.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.952+0000 7fe9f4e8c700 1 -- 192.168.123.103:0/221071086 shutdown_connections 2026-03-09T00:01:31.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.952+0000 7fe9f4e8c700 1 --2- 192.168.123.103:0/221071086 >> [v2:192.168.123.106:6808/1004293893,v1:192.168.123.106:6809/1004293893] conn(0x7fe9d80721a0 0x7fe9d80745c0 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:31.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.952+0000 
7fe9f4e8c700 1 --2- 192.168.123.103:0/221071086 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe9d806c600 0x7fe9d806eac0 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:31.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.952+0000 7fe9f4e8c700 1 --2- 192.168.123.103:0/221071086 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9f01082d0 0x7fe9f0119f20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:31.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.952+0000 7fe9f4e8c700 1 --2- 192.168.123.103:0/221071086 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9f010f660 0x7fe9f0115f80 unknown :-1 s=CLOSED pgs=214 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:31.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.952+0000 7fe9f4e8c700 1 -- 192.168.123.103:0/221071086 >> 192.168.123.103:0/221071086 conn(0x7fe9f006d0f0 msgr2=0x7fe9f006f960 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:01:31.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.953+0000 7fe9f4e8c700 1 -- 192.168.123.103:0/221071086 shutdown_connections
2026-03-09T00:01:31.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.953+0000 7fe9f4e8c700 1 -- 192.168.123.103:0/221071086 wait complete.
2026-03-09T00:01:31.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.953+0000 7f2fcd87f700 1 --2- 192.168.123.103:0/761505563 >> [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768] conn(0x7f2fb40721a0 0x7f2fb40745c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:01:31.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.956+0000 7f2fcd87f700 1 --2- 192.168.123.103:0/761505563 >> [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768] conn(0x7f2fb40721a0 0x7f2fb40745c0 crc :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:01:31.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.956+0000 7f2fbe7fc700 1 -- 192.168.123.103:0/761505563 --> [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f2fb4074c70 con 0x7f2fb40721a0
2026-03-09T00:01:31.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.957+0000 7f2fbe7fc700 1 -- 192.168.123.103:0/761505563 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_get_version_reply(handle=1 version=33) v2 ==== 24+0+0 (secure 0 0 0) 0x7f2fb808e320 con 0x7f2fc8113e00
2026-03-09T00:01:31.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:31.957+0000 7f2fbe7fc700 1 -- 192.168.123.103:0/761505563 <== osd.1 v2:192.168.123.103:6810/3932541768 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f2fb4074c70 con 0x7f2fb40721a0
2026-03-09T00:01:31.981 INFO:teuthology.orchestra.run.vm03.stdout:137438953474
2026-03-09T00:01:31.981 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd last-stat-seq osd.5
2026-03-09T00:01:32.001 INFO:teuthology.orchestra.run.vm03.stdout:73014444041
2026-03-09T00:01:32.002 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd last-stat-seq osd.2
2026-03-09T00:01:32.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.011+0000 7f2fb3fff700 1 -- 192.168.123.103:0/761505563 --> [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f2fac002d70 con 0x7f2fb40721a0
2026-03-09T00:01:32.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.012+0000 7f2fbe7fc700 1 -- 192.168.123.103:0/761505563 <== osd.1 v2:192.168.123.103:6810/3932541768 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f2fac002d70 con 0x7f2fb40721a0
2026-03-09T00:01:32.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.012+0000 7f2fb3fff700 1 -- 192.168.123.103:0/761505563 >> [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768] conn(0x7f2fb40721a0 msgr2=0x7f2fb40745c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:32.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.017+0000 7f2fb3fff700 1 --2- 192.168.123.103:0/761505563 >> [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768] conn(0x7f2fb40721a0 0x7f2fb40745c0 crc :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:32.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.019+0000 7f2fb3fff700 1 -- 192.168.123.103:0/761505563 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2fb406c600 msgr2=0x7f2fb406eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:32.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.019+0000 7f2fb3fff700 1 --2- 192.168.123.103:0/761505563 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2fb406c600 0x7f2fb406eac0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f2fc4009910 tx=0x7f2fc4008040 comp rx=0 tx=0).stop
2026-03-09T00:01:32.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.019+0000 7f2fb3fff700 1 -- 192.168.123.103:0/761505563 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2fc8113e00 msgr2=0x7f2fc8114280 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:32.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.019+0000 7f2fb3fff700 1 --2- 192.168.123.103:0/761505563 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2fc8113e00 0x7f2fc8114280 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f2fb80056a0 tx=0x7f2fb800f6d0 comp rx=0 tx=0).stop
2026-03-09T00:01:32.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.019+0000 7f2fb3fff700 1 -- 192.168.123.103:0/761505563 shutdown_connections
2026-03-09T00:01:32.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.019+0000 7f2fb3fff700 1 --2- 192.168.123.103:0/761505563 >> [v2:192.168.123.103:6810/3932541768,v1:192.168.123.103:6811/3932541768] conn(0x7f2fb40721a0 0x7f2fb40745c0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:32.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.019+0000 7f2fb3fff700 1 --2- 192.168.123.103:0/761505563 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f2fb406c600 0x7f2fb406eac0 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:32.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.019+0000
7f2fb3fff700 1 --2- 192.168.123.103:0/761505563 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2fc810f2f0 0x7f2fc8118e00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:32.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.019+0000 7f2fb3fff700 1 --2- 192.168.123.103:0/761505563 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2fc8113e00 0x7f2fc8114280 secure :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f2fb80056a0 tx=0x7f2fb800f6d0 comp rx=0 tx=0).stop 2026-03-09T00:01:32.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.019+0000 7f2fb3fff700 1 -- 192.168.123.103:0/761505563 >> 192.168.123.103:0/761505563 conn(0x7f2fc806ce10 msgr2=0x7f2fc8071560 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:32.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.020+0000 7f2fb3fff700 1 -- 192.168.123.103:0/761505563 shutdown_connections 2026-03-09T00:01:32.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.020+0000 7f2fb3fff700 1 -- 192.168.123.103:0/761505563 wait complete. 2026-03-09T00:01:32.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.023+0000 7f4c60581700 1 -- 192.168.123.103:0/3677295672 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58107d90 msgr2=0x7f4c58108210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:32.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.023+0000 7f4c60581700 1 --2- 192.168.123.103:0/3677295672 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58107d90 0x7f4c58108210 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f4c48009b00 tx=0x7f4c48009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:32.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.027+0000 7f4c60581700 1 -- 192.168.123.103:0/3677295672 shutdown_connections 2026-03-09T00:01:32.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.027+0000 7f4c60581700 1 --2- 192.168.123.103:0/3677295672 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58107d90 0x7f4c58108210 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:32.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.027+0000 7f4c60581700 1 --2- 192.168.123.103:0/3677295672 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c5810f420 0x7f4c5810f800 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:32.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.027+0000 7f4c60581700 1 -- 192.168.123.103:0/3677295672 >> 192.168.123.103:0/3677295672 conn(0x7f4c5806ce20 msgr2=0x7f4c5806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.032+0000 7f4c60581700 1 -- 192.168.123.103:0/3677295672 shutdown_connections 2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.036+0000 7f4c60581700 1 -- 192.168.123.103:0/3677295672 wait complete. 
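
Every messenger entry carries the connection pointer inside conn(...) and an s=<state> field, so the lifecycle of each connection (NONE -> BANNER_CONNECTING -> HELLO_CONNECTING -> AUTH_CONNECTING -> READY -> CLOSED) can be recovered mechanically when reading a run like this. A throwaway parser for entries shaped like the ones above; the regex and sample lines are illustrative only, since this debug output is not a stable format:

import re
from collections import OrderedDict

# Pull the first pointer out of "conn(0x...)" and the s=<STATE> field.
CONN_RE = re.compile(r'conn\((?P<ptr>0x[0-9a-f]+).*?s=(?P<state>[A-Z_]+)')

def connection_states(lines):
    states = OrderedDict()            # conn pointer -> ordered states seen
    for line in lines:
        m = CONN_RE.search(line)
        if m:
            seq = states.setdefault(m.group('ptr'), [])
            if not seq or seq[-1] != m.group('state'):  # collapse repeats
                seq.append(m.group('state'))
    return states

if __name__ == '__main__':
    sample = [  # abbreviated copies of entries from this log
        "--2- >> [v2:192.168.123.103:3300/0] conn(0x7f9e0c10d310 0x7f9e0c07ce00 unknown :-1 s=NONE pgs=0 cs=0 l=1).connect",
        "--2- >> [v2:192.168.123.103:3300/0] conn(0x7f9e0c10d310 0x7f9e0c07ce00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1)._handle_peer_banner_payload",
        "--2- >> [v2:192.168.123.103:3300/0] conn(0x7f9e0c10d310 0x7f9e0c07ce00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1).stop",
    ]
    for ptr, seq in connection_states(sample).items():
        print(ptr, '->', ' -> '.join(seq))
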
2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.036+0000 7f4c60581700 1 Processor -- start 2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.036+0000 7f4c60581700 1 -- start start 2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.037+0000 7f4c60581700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58107d90 0x7f4c58117e80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.037+0000 7f4c60581700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c5810f420 0x7f4c58112e80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.037+0000 7f4c60581700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4c581133c0 con 0x7f4c5810f420 2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.037+0000 7f4c60581700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4c58113530 con 0x7f4c58107d90 2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.037+0000 7f4c5e31d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58107d90 0x7f4c58117e80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.037+0000 7f4c5db1c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c5810f420 0x7f4c58112e80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.037+0000 7f4c5db1c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c5810f420 0x7f4c58112e80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41134/0 (socket says 192.168.123.103:41134) 2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.037+0000 7f4c5db1c700 1 -- 192.168.123.103:0/1677496681 learned_addr learned my addr 192.168.123.103:0/1677496681 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.038+0000 7f4c5db1c700 1 -- 192.168.123.103:0/1677496681 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58107d90 msgr2=0x7f4c58117e80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.038+0000 7f4c5db1c700 1 --2- 192.168.123.103:0/1677496681 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58107d90 0x7f4c58117e80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.038+0000 7f4c5db1c700 1 -- 192.168.123.103:0/1677496681 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4c480097e0 con 0x7f4c5810f420 
2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.038+0000 7f4c5db1c700 1 --2- 192.168.123.103:0/1677496681 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c5810f420 0x7f4c58112e80 secure :-1 s=READY pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7f4c480056a0 tx=0x7f4c4800bb70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.038+0000 7f4c4f7fe700 1 -- 192.168.123.103:0/1677496681 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4c4801d070 con 0x7f4c5810f420 2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.038+0000 7f4c60581700 1 -- 192.168.123.103:0/1677496681 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4c581137b0 con 0x7f4c5810f420 2026-03-09T00:01:32.040 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.038+0000 7f4c60581700 1 -- 192.168.123.103:0/1677496681 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4c58113ca0 con 0x7f4c5810f420 2026-03-09T00:01:32.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.039+0000 7f4c4f7fe700 1 -- 192.168.123.103:0/1677496681 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4c4800fb30 con 0x7f4c5810f420 2026-03-09T00:01:32.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.039+0000 7f4c4f7fe700 1 -- 192.168.123.103:0/1677496681 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4c48022d40 con 0x7f4c5810f420 2026-03-09T00:01:32.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.042+0000 7f4c4f7fe700 1 -- 192.168.123.103:0/1677496681 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f4c48017400 con 0x7f4c5810f420 2026-03-09T00:01:32.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.042+0000 7f4c4f7fe700 1 --2- 192.168.123.103:0/1677496681 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4c4406c530 0x7f4c4406e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:32.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.042+0000 7f4c4f7fe700 1 -- 192.168.123.103:0/1677496681 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f4c4808dcc0 con 0x7f4c5810f420 2026-03-09T00:01:32.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.043+0000 7f4c4d7fa700 1 --2- 192.168.123.103:0/1677496681 >> [v2:192.168.123.106:6800/1431049389,v1:192.168.123.106:6801/1431049389] conn(0x7f4c3c001610 0x7f4c3c003ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:32.045 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.043+0000 7f4c4d7fa700 1 -- 192.168.123.103:0/1677496681 --> [v2:192.168.123.106:6800/1431049389,v1:192.168.123.106:6801/1431049389] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f4c3c006c00 con 0x7f4c3c001610 2026-03-09T00:01:32.045 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.043+0000 7f4c5eb1e700 1 --2- 192.168.123.103:0/1677496681 >> [v2:192.168.123.106:6800/1431049389,v1:192.168.123.106:6801/1431049389] conn(0x7f4c3c001610 0x7f4c3c003ad0 unknown :-1 s=BANNER_CONNECTING 
pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:32.045 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.044+0000 7f4c5e31d700 1 --2- 192.168.123.103:0/1677496681 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4c4406c530 0x7f4c4406e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:32.046 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.045+0000 7f4c5eb1e700 1 --2- 192.168.123.103:0/1677496681 >> [v2:192.168.123.106:6800/1431049389,v1:192.168.123.106:6801/1431049389] conn(0x7f4c3c001610 0x7f4c3c003ad0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.3 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:32.046 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.046+0000 7f4c5e31d700 1 --2- 192.168.123.103:0/1677496681 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4c4406c530 0x7f4c4406e9f0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f4c5400a850 tx=0x7f4c54008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:32.046 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.046+0000 7f4c4f7fe700 1 -- 192.168.123.103:0/1677496681 <== osd.3 v2:192.168.123.106:6800/1431049389 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f4c3c006c00 con 0x7f4c3c001610 2026-03-09T00:01:32.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.070+0000 7f4c4d7fa700 1 -- 192.168.123.103:0/1677496681 --> [v2:192.168.123.106:6800/1431049389,v1:192.168.123.106:6801/1431049389] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f4c3c005ce0 con 0x7f4c3c001610 2026-03-09T00:01:32.077 INFO:teuthology.orchestra.run.vm03.stdout:38654705677 2026-03-09T00:01:32.077 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd last-stat-seq osd.0 2026-03-09T00:01:32.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.079+0000 7f4c4f7fe700 1 -- 192.168.123.103:0/1677496681 <== osd.3 v2:192.168.123.106:6800/1431049389 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f4c3c005ce0 con 0x7f4c3c001610 2026-03-09T00:01:32.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.080+0000 7f4c4d7fa700 1 -- 192.168.123.103:0/1677496681 >> [v2:192.168.123.106:6800/1431049389,v1:192.168.123.106:6801/1431049389] conn(0x7f4c3c001610 msgr2=0x7f4c3c003ad0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:32.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.080+0000 7f4c4d7fa700 1 --2- 192.168.123.103:0/1677496681 >> [v2:192.168.123.106:6800/1431049389,v1:192.168.123.106:6801/1431049389] conn(0x7f4c3c001610 0x7f4c3c003ad0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:32.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.080+0000 7f4c4d7fa700 1 -- 192.168.123.103:0/1677496681 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4c4406c530 msgr2=0x7f4c4406e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:32.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.080+0000 7f4c4d7fa700 1 --2- 192.168.123.103:0/1677496681 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4c4406c530 0x7f4c4406e9f0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f4c5400a850 tx=0x7f4c54008040 comp rx=0 tx=0).stop 2026-03-09T00:01:32.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.080+0000 7f4c4d7fa700 1 -- 192.168.123.103:0/1677496681 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c5810f420 msgr2=0x7f4c58112e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:32.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.080+0000 7f4c4d7fa700 1 --2- 192.168.123.103:0/1677496681 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c5810f420 0x7f4c58112e80 secure :-1 s=READY pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7f4c480056a0 tx=0x7f4c4800bb70 comp rx=0 tx=0).stop 2026-03-09T00:01:32.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.080+0000 7f4c4d7fa700 1 -- 192.168.123.103:0/1677496681 shutdown_connections 2026-03-09T00:01:32.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.080+0000 7f4c4d7fa700 1 --2- 192.168.123.103:0/1677496681 >> [v2:192.168.123.106:6800/1431049389,v1:192.168.123.106:6801/1431049389] conn(0x7f4c3c001610 0x7f4c3c003ad0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:32.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.080+0000 7f4c4d7fa700 1 --2- 192.168.123.103:0/1677496681 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4c4406c530 0x7f4c4406e9f0 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:32.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.080+0000 7f4c4d7fa700 1 --2- 192.168.123.103:0/1677496681 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c58107d90 0x7f4c58117e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:32.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.080+0000 7f4c4d7fa700 1 --2- 192.168.123.103:0/1677496681 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c5810f420 0x7f4c58112e80 unknown :-1 s=CLOSED pgs=216 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:32.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.080+0000 7f4c4d7fa700 1 -- 192.168.123.103:0/1677496681 >> 192.168.123.103:0/1677496681 conn(0x7f4c5806ce20 msgr2=0x7f4c5810d190 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:32.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.080+0000 7f4c4d7fa700 1 -- 192.168.123.103:0/1677496681 shutdown_connections 2026-03-09T00:01:32.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.080+0000 7f4c4d7fa700 1 -- 192.168.123.103:0/1677496681 wait complete. 
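The records above trace one complete msgr2 client lifecycle: each connection moves from s=NONE through BANNER_CONNECTING, HELLO_CONNECTING, and (where auth is in flight) AUTH_CONNECTING to READY, then is torn down via mark_down/stop into CLOSED before the messenger logs shutdown_connections and "wait complete". A minimal sketch for summarizing such floods, assuming nothing beyond the line format shown here (the helper and regex are illustrative, not part of teuthology):

    import re

    # Pull the connection pointer and the msgr2 state ("s=..." field) out of
    # AsyncMessenger debug lines like the ones above. "\bs=" skips the
    # "pgs=" and "cs=" fields, which lack a word boundary before the "s".
    STATE_RE = re.compile(r'conn\((?P<conn>0x[0-9a-f]+).*?\bs=(?P<state>[A-Z_]+)')

    def conn_states(log_lines):
        """Yield (conn_ptr, state) pairs, e.g. ('0x7f4c5810f420', 'READY')."""
        for line in log_lines:
            m = STATE_RE.search(line)
            if m:
                yield m.group('conn'), m.group('state')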
2026-03-09T00:01:32.106 INFO:teuthology.orchestra.run.vm03.stdout:120259084293 2026-03-09T00:01:32.106 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd last-stat-seq osd.4 2026-03-09T00:01:32.147 INFO:teuthology.orchestra.run.vm03.stdout:55834574859 2026-03-09T00:01:32.148 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd last-stat-seq osd.1 2026-03-09T00:01:32.167 INFO:teuthology.orchestra.run.vm03.stdout:98784247815 2026-03-09T00:01:32.167 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd last-stat-seq osd.3 2026-03-09T00:01:32.402 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:32.484 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:32.758 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:32.763 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:32.884 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:32.885 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:32.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.950+0000 7f84ab4ff700 1 -- 192.168.123.103:0/2935388173 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84a410d4e0 msgr2=0x7f84a410d8c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:32.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.950+0000 7f84ab4ff700 1 --2- 192.168.123.103:0/2935388173 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84a410d4e0 0x7f84a410d8c0 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7f84a0007780 tx=0x7f84a0007a90 comp rx=0 tx=0).stop 2026-03-09T00:01:32.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.950+0000 7f84ab4ff700 1 -- 192.168.123.103:0/2935388173 shutdown_connections 2026-03-09T00:01:32.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.950+0000 7f84ab4ff700 1 --2- 192.168.123.103:0/2935388173 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84a4107d90 0x7f84a4108210 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:32.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.950+0000 7f84ab4ff700 1 --2- 192.168.123.103:0/2935388173 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84a410d4e0 0x7f84a410d8c0 unknown :-1 s=CLOSED pgs=217 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:32.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.950+0000 7f84ab4ff700 1 -- 192.168.123.103:0/2935388173 >> 192.168.123.103:0/2935388173 conn(0x7f84a406d0f0 msgr2=0x7f84a406d500 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:32.951 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.950+0000 7f84ab4ff700 1 -- 192.168.123.103:0/2935388173 shutdown_connections 2026-03-09T00:01:32.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.950+0000 7f84ab4ff700 1 -- 192.168.123.103:0/2935388173 wait complete. 2026-03-09T00:01:32.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.950+0000 7f84ab4ff700 1 Processor -- start 2026-03-09T00:01:32.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.950+0000 7f84ab4ff700 1 -- start start 2026-03-09T00:01:32.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.950+0000 7f84ab4ff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84a4107d90 0x7f84a407cd60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:32.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.950+0000 7f84ab4ff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84a407d2a0 0x7f84a407d720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:32.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.950+0000 7f84ab4ff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f84a4083d80 con 0x7f84a4107d90 2026-03-09T00:01:32.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.950+0000 7f84ab4ff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f84a4081890 con 0x7f84a407d2a0 2026-03-09T00:01:32.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.951+0000 7f84a8a9a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84a407d2a0 0x7f84a407d720 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:32.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.951+0000 7f84a8a9a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84a407d2a0 0x7f84a407d720 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:58852/0 (socket says 192.168.123.103:58852) 2026-03-09T00:01:32.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.951+0000 7f84a8a9a700 1 -- 192.168.123.103:0/2490847111 learned_addr learned my addr 192.168.123.103:0/2490847111 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:32.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.951+0000 7f84a8a9a700 1 -- 192.168.123.103:0/2490847111 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84a4107d90 msgr2=0x7f84a407cd60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:32.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.951+0000 7f84a8a9a700 1 --2- 192.168.123.103:0/2490847111 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84a4107d90 0x7f84a407cd60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:32.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.951+0000 7f84a8a9a700 1 -- 192.168.123.103:0/2490847111 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f84a0007430 con 0x7f84a407d2a0 2026-03-09T00:01:32.952 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.952+0000 7f84a8a9a700 1 --2- 192.168.123.103:0/2490847111 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84a407d2a0 0x7f84a407d720 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f849c00c390 tx=0x7f849c00c750 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:32.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.952+0000 7f849a7fc700 1 -- 192.168.123.103:0/2490847111 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f849c00e030 con 0x7f84a407d2a0 2026-03-09T00:01:32.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.952+0000 7f84ab4ff700 1 -- 192.168.123.103:0/2490847111 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f84a4081b70 con 0x7f84a407d2a0 2026-03-09T00:01:32.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.952+0000 7f84ab4ff700 1 -- 192.168.123.103:0/2490847111 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f84a40820c0 con 0x7f84a407d2a0 2026-03-09T00:01:32.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.953+0000 7f849a7fc700 1 -- 192.168.123.103:0/2490847111 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f849c00f040 con 0x7f84a407d2a0 2026-03-09T00:01:32.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.953+0000 7f849a7fc700 1 -- 192.168.123.103:0/2490847111 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f849c014700 con 0x7f84a407d2a0 2026-03-09T00:01:32.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.953+0000 7f84ab4ff700 1 -- 192.168.123.103:0/2490847111 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8488005320 con 0x7f84a407d2a0 2026-03-09T00:01:32.953 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.954+0000 7f849a7fc700 1 -- 192.168.123.103:0/2490847111 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f849c009280 con 0x7f84a407d2a0 2026-03-09T00:01:32.953 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.954+0000 7f849a7fc700 1 --2- 192.168.123.103:0/2490847111 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f849006c530 0x7f849006e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:32.954 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.954+0000 7f84a929b700 1 --2- 192.168.123.103:0/2490847111 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f849006c530 0x7f849006e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:32.954 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.954+0000 7f849a7fc700 1 -- 192.168.123.103:0/2490847111 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f849c08b340 con 0x7f84a407d2a0 2026-03-09T00:01:32.954 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.955+0000 7f84a929b700 1 --2- 192.168.123.103:0/2490847111 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f849006c530 0x7f849006e9f0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto 
rx=0x7f84a0007e60 tx=0x7f84a00058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:32.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:32.958+0000 7f849a7fc700 1 -- 192.168.123.103:0/2490847111 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f849c055f00 con 0x7f84a407d2a0 2026-03-09T00:01:33.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.293+0000 7fabeb5b0700 1 -- 192.168.123.103:0/337620511 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabe4107ff0 msgr2=0x7fabe41083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.293+0000 7fabeb5b0700 1 --2- 192.168.123.103:0/337620511 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabe4107ff0 0x7fabe41083d0 secure :-1 s=READY pgs=218 cs=0 l=1 rev1=1 crypto rx=0x7fabe0007780 tx=0x7fabe000c050 comp rx=0 tx=0).stop 2026-03-09T00:01:33.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.300+0000 7fabeb5b0700 1 -- 192.168.123.103:0/337620511 shutdown_connections 2026-03-09T00:01:33.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.300+0000 7fabeb5b0700 1 --2- 192.168.123.103:0/337620511 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fabe41089a0 0x7fabe410be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.300+0000 7fabeb5b0700 1 --2- 192.168.123.103:0/337620511 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabe4107ff0 0x7fabe41083d0 unknown :-1 s=CLOSED pgs=218 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.301+0000 7fabeb5b0700 1 -- 192.168.123.103:0/337620511 >> 192.168.123.103:0/337620511 conn(0x7fabe406ce20 msgr2=0x7fabe406d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:33.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.301+0000 7fabeb5b0700 1 -- 192.168.123.103:0/337620511 shutdown_connections 2026-03-09T00:01:33.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.301+0000 7fabeb5b0700 1 -- 192.168.123.103:0/337620511 wait complete. 
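The flush/poll pattern driving these commands: the harness first sends {"prefix": "flush_pg_stats"} to each OSD (tid 2 above), takes a sequence number from the reply, then shells into the cluster with cephadm and polls `ceph osd last-stat-seq osd.N` until the monitor reports a sequence at least that new; the bare integers on stdout (38654705677, 120259084293, ...) are those sequences. A sketch of that wait, assuming a hypothetical wrapper around the same `cephadm shell` invocation the log shows (fsid and image copied from this run; the real logic lives in tasks.cephadm.ceph_manager):

    import subprocess
    import time

    FSID = "ae8f0172-1b4a-11f1-916a-712b2ac006b7"   # fsid from the run above
    IMAGE = "quay.io/ceph/ceph:v18.2.1"

    def ceph(*args):
        # Same invocation the harness logs:
        #   sudo .../cephadm --image IMAGE shell --fsid FSID -- ceph ...
        cmd = ["sudo", "/home/ubuntu/cephtest/cephadm", "--image", IMAGE,
               "shell", "--fsid", FSID, "--", "ceph"] + list(args)
        return subprocess.check_output(cmd, text=True).strip()

    def wait_for_stat_seq(osd_id, need_seq, timeout=300):
        # Poll until the mon has seen stats at least as new as the sequence
        # that flush_pg_stats returned for this OSD.
        deadline = time.time() + timeout
        got = -1
        while time.time() < deadline:
            got = int(ceph("osd", "last-stat-seq", f"osd.{osd_id}"))
            if got >= need_seq:
                return got
            time.sleep(1)
        raise TimeoutError(f"osd.{osd_id} stat seq {got} < {need_seq}")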
2026-03-09T00:01:33.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.301+0000 7fabeb5b0700 1 Processor -- start 2026-03-09T00:01:33.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.301+0000 7fabeb5b0700 1 -- start start 2026-03-09T00:01:33.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.301+0000 7fabeb5b0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fabe41089a0 0x7fabe41381a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:33.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.301+0000 7fabeb5b0700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabe41331a0 0x7fabe4133620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:33.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.301+0000 7fabeb5b0700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fabe4133b60 con 0x7fabe41331a0 2026-03-09T00:01:33.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.301+0000 7fabeb5b0700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fabe4133cd0 con 0x7fabe41089a0 2026-03-09T00:01:33.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.302+0000 7fabe8b4b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabe41331a0 0x7fabe4133620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:33.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.302+0000 7fabe8b4b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabe41331a0 0x7fabe4133620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41156/0 (socket says 192.168.123.103:41156) 2026-03-09T00:01:33.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.302+0000 7fabe8b4b700 1 -- 192.168.123.103:0/1876085556 learned_addr learned my addr 192.168.123.103:0/1876085556 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:33.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.302+0000 7fabe8b4b700 1 -- 192.168.123.103:0/1876085556 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fabe41089a0 msgr2=0x7fabe41381a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:01:33.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.302+0000 7fabe8b4b700 1 --2- 192.168.123.103:0/1876085556 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fabe41089a0 0x7fabe41381a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.302+0000 7fabe8b4b700 1 -- 192.168.123.103:0/1876085556 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fabe0007430 con 0x7fabe41331a0 2026-03-09T00:01:33.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.302+0000 7fabe8b4b700 1 --2- 192.168.123.103:0/1876085556 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabe41331a0 0x7fabe4133620 secure :-1 s=READY pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7fabd400d900 tx=0x7fabd400dcc0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:33.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.303+0000 7fabda7fc700 1 -- 192.168.123.103:0/1876085556 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fabd40098e0 con 0x7fabe41331a0 2026-03-09T00:01:33.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.303+0000 7fabeb5b0700 1 -- 192.168.123.103:0/1876085556 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fabe4133fb0 con 0x7fabe41331a0 2026-03-09T00:01:33.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.303+0000 7fabeb5b0700 1 -- 192.168.123.103:0/1876085556 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fabe407f0a0 con 0x7fabe41331a0 2026-03-09T00:01:33.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.310+0000 7fabda7fc700 1 -- 192.168.123.103:0/1876085556 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fabd4010460 con 0x7fabe41331a0 2026-03-09T00:01:33.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.310+0000 7fabda7fc700 1 -- 192.168.123.103:0/1876085556 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fabd400f5d0 con 0x7fabe41331a0 2026-03-09T00:01:33.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.310+0000 7fabda7fc700 1 -- 192.168.123.103:0/1876085556 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fabd400f770 con 0x7fabe41331a0 2026-03-09T00:01:33.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.311+0000 7fabda7fc700 1 --2- 192.168.123.103:0/1876085556 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fabd006c6d0 0x7fabd006eb90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:33.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.312+0000 7fabe934c700 1 --2- 192.168.123.103:0/1876085556 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fabd006c6d0 0x7fabd006eb90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:33.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.316+0000 7f84ab4ff700 1 -- 192.168.123.103:0/2490847111 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 5} v 0) v1 -- 0x7f8488005190 con 0x7f84a407d2a0 2026-03-09T00:01:33.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.316+0000 7fabe934c700 1 --2- 192.168.123.103:0/1876085556 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fabd006c6d0 0x7fabd006eb90 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fabe0005b40 tx=0x7fabe0005ab0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:33.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.317+0000 7fabda7fc700 1 -- 192.168.123.103:0/1876085556 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fabd408bfa0 con 0x7fabe41331a0 2026-03-09T00:01:33.319 INFO:teuthology.orchestra.run.vm03.stdout:137438953474 2026-03-09T00:01:33.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.317+0000 7f849a7fc700 1 -- 
192.168.123.103:0/2490847111 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 5}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f849c059520 con 0x7f84a407d2a0 2026-03-09T00:01:33.321 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.319+0000 7fabcffff700 1 -- 192.168.123.103:0/1876085556 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fabc8005320 con 0x7fabe41331a0 2026-03-09T00:01:33.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.322+0000 7fabda7fc700 1 -- 192.168.123.103:0/1876085556 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fabd405a670 con 0x7fabe41331a0 2026-03-09T00:01:33.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.324+0000 7f84ab4ff700 1 -- 192.168.123.103:0/2490847111 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f849006c530 msgr2=0x7f849006e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.324+0000 7f84ab4ff700 1 --2- 192.168.123.103:0/2490847111 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f849006c530 0x7f849006e9f0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f84a0007e60 tx=0x7f84a00058e0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.324+0000 7f84ab4ff700 1 -- 192.168.123.103:0/2490847111 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84a407d2a0 msgr2=0x7f84a407d720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.324+0000 7f84ab4ff700 1 --2- 192.168.123.103:0/2490847111 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84a407d2a0 0x7f84a407d720 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f849c00c390 tx=0x7f849c00c750 comp rx=0 tx=0).stop 2026-03-09T00:01:33.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.326+0000 7f84ab4ff700 1 -- 192.168.123.103:0/2490847111 shutdown_connections 2026-03-09T00:01:33.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.326+0000 7f84ab4ff700 1 --2- 192.168.123.103:0/2490847111 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f849006c530 0x7f849006e9f0 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.326+0000 7f84ab4ff700 1 --2- 192.168.123.103:0/2490847111 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84a4107d90 0x7f84a407cd60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.326+0000 7f84ab4ff700 1 --2- 192.168.123.103:0/2490847111 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84a407d2a0 0x7f84a407d720 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.326+0000 7f84ab4ff700 1 -- 192.168.123.103:0/2490847111 >> 192.168.123.103:0/2490847111 conn(0x7f84a406d0f0 msgr2=0x7f84a4072100 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:33.331 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.329+0000 7f84ab4ff700 1 -- 192.168.123.103:0/2490847111 shutdown_connections 2026-03-09T00:01:33.331 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.331+0000 7f84ab4ff700 1 -- 192.168.123.103:0/2490847111 wait complete. 2026-03-09T00:01:33.467 INFO:tasks.cephadm.ceph_manager.ceph:need seq 137438953474 got 137438953474 for osd.5 2026-03-09T00:01:33.467 DEBUG:teuthology.parallel:result is None 2026-03-09T00:01:33.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.493+0000 7f383f59e700 1 -- 192.168.123.103:0/2501269184 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f384010d0f0 msgr2=0x7f384010d570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.493+0000 7f383f59e700 1 --2- 192.168.123.103:0/2501269184 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f384010d0f0 0x7f384010d570 secure :-1 s=READY pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7f3834009b00 tx=0x7f3834009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:33.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.494+0000 7f383f59e700 1 -- 192.168.123.103:0/2501269184 shutdown_connections 2026-03-09T00:01:33.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.494+0000 7f383f59e700 1 --2- 192.168.123.103:0/2501269184 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f384010d0f0 0x7f384010d570 unknown :-1 s=CLOSED pgs=220 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.494+0000 7f383f59e700 1 --2- 192.168.123.103:0/2501269184 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f384010f340 0x7f384010f720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.494+0000 7f383f59e700 1 -- 192.168.123.103:0/2501269184 >> 192.168.123.103:0/2501269184 conn(0x7f384006ce20 msgr2=0x7f384006d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:33.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.495+0000 7f383f59e700 1 -- 192.168.123.103:0/2501269184 shutdown_connections 2026-03-09T00:01:33.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.495+0000 7f383f59e700 1 -- 192.168.123.103:0/2501269184 wait complete. 
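The "need seq 137438953474 got 137438953474 for osd.5" line is the ceph_manager confirming one of those waits, and "DEBUG:teuthology.parallel:result is None" is teuthology's parallel helper logging the return value of a spawned call as it completes. Roughly how the per-OSD waits fan out, sketched against teuthology's parallel context manager (wait_for_stat_seq is the illustrative helper above; the seq values are the two this run happened to log, while the real harness records one per OSD):

    from teuthology.parallel import parallel

    # Sequences captured from flush_pg_stats, keyed by OSD id.
    need = {2: 73014444041, 5: 137438953474}

    with parallel() as p:
        for osd_id, seq in need.items():
            p.spawn(wait_for_stat_seq, osd_id, seq)
    # Leaving the with-block waits for every spawned call; the helper logs
    # each return value as it arrives (the harness's own wait returns None,
    # hence the "result is None" lines above).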
2026-03-09T00:01:33.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.495+0000 7f383f59e700 1 Processor -- start 2026-03-09T00:01:33.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.495+0000 7f383f59e700 1 -- start start 2026-03-09T00:01:33.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.495+0000 7f383f59e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f384010d0f0 0x7f38401ab600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:33.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.495+0000 7f383f59e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f384010f340 0x7f38401abb40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:33.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.495+0000 7f383f59e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38401ac1d0 con 0x7f384010f340 2026-03-09T00:01:33.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.495+0000 7f383f59e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38401a5680 con 0x7f384010d0f0 2026-03-09T00:01:33.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.495+0000 7f383dd9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f384010f340 0x7f38401abb40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:33.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.495+0000 7f383dd9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f384010f340 0x7f38401abb40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41166/0 (socket says 192.168.123.103:41166) 2026-03-09T00:01:33.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.495+0000 7f383dd9b700 1 -- 192.168.123.103:0/2971238189 learned_addr learned my addr 192.168.123.103:0/2971238189 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:33.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.495+0000 7f383e59c700 1 --2- 192.168.123.103:0/2971238189 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f384010d0f0 0x7f38401ab600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:33.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.496+0000 7f383dd9b700 1 -- 192.168.123.103:0/2971238189 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f384010d0f0 msgr2=0x7f38401ab600 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.496+0000 7f383dd9b700 1 --2- 192.168.123.103:0/2971238189 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f384010d0f0 0x7f38401ab600 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.496+0000 7f383dd9b700 1 -- 192.168.123.103:0/2971238189 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f38340097e0 con 0x7f384010f340 2026-03-09T00:01:33.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.496+0000 7f383dd9b700 1 --2- 192.168.123.103:0/2971238189 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f384010f340 0x7f38401abb40 secure :-1 s=READY pgs=221 cs=0 l=1 rev1=1 crypto rx=0x7f3834009ad0 tx=0x7f383400bab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:33.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.496+0000 7f382f7fe700 1 -- 192.168.123.103:0/2971238189 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f383401d070 con 0x7f384010f340 2026-03-09T00:01:33.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.496+0000 7f382f7fe700 1 -- 192.168.123.103:0/2971238189 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f383400f460 con 0x7f384010f340 2026-03-09T00:01:33.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.497+0000 7f382f7fe700 1 -- 192.168.123.103:0/2971238189 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3834021620 con 0x7f384010f340 2026-03-09T00:01:33.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.497+0000 7f383f59e700 1 -- 192.168.123.103:0/2971238189 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f38401a5900 con 0x7f384010f340 2026-03-09T00:01:33.496 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.497+0000 7f383f59e700 1 -- 192.168.123.103:0/2971238189 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f38401a5d70 con 0x7f384010f340 2026-03-09T00:01:33.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.498+0000 7f383f59e700 1 -- 192.168.123.103:0/2971238189 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3840119d00 con 0x7f384010f340 2026-03-09T00:01:33.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.501+0000 7f382f7fe700 1 -- 192.168.123.103:0/2971238189 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f3834003a40 con 0x7f384010f340 2026-03-09T00:01:33.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.501+0000 7f382f7fe700 1 --2- 192.168.123.103:0/2971238189 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f382806c380 0x7f382806e840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:33.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.501+0000 7f382f7fe700 1 -- 192.168.123.103:0/2971238189 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f383408c760 con 0x7f384010f340 2026-03-09T00:01:33.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.501+0000 7f382f7fe700 1 -- 192.168.123.103:0/2971238189 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f383408cbe0 con 0x7f384010f340 2026-03-09T00:01:33.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.503+0000 7f383e59c700 1 --2- 192.168.123.103:0/2971238189 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f382806c380 0x7f382806e840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 
crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:33.505 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.506+0000 7f383e59c700 1 --2- 192.168.123.103:0/2971238189 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f382806c380 0x7f382806e840 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f3830009de0 tx=0x7f3830009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:33.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.561+0000 7fabcffff700 1 -- 192.168.123.103:0/1876085556 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 2} v 0) v1 -- 0x7fabc8005190 con 0x7fabe41331a0 2026-03-09T00:01:33.561 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.562+0000 7fabda7fc700 1 -- 192.168.123.103:0/1876085556 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 2}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7fabd4020070 con 0x7fabe41331a0 2026-03-09T00:01:33.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.568+0000 7f87cb0d8700 1 -- 192.168.123.103:0/3576953270 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87c4073130 msgr2=0x7f87c4073510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.568 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.568+0000 7f87cb0d8700 1 --2- 192.168.123.103:0/3576953270 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87c4073130 0x7f87c4073510 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7f87b8009b00 tx=0x7f87b8009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:33.569 INFO:teuthology.orchestra.run.vm03.stdout:73014444041 2026-03-09T00:01:33.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.572+0000 7f87cb0d8700 1 -- 192.168.123.103:0/3576953270 shutdown_connections 2026-03-09T00:01:33.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.572+0000 7f87cb0d8700 1 --2- 192.168.123.103:0/3576953270 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f87c4073a50 0x7f87c4111940 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.572+0000 7f87cb0d8700 1 --2- 192.168.123.103:0/3576953270 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87c4073130 0x7f87c4073510 unknown :-1 s=CLOSED pgs=222 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.572+0000 7f87cb0d8700 1 -- 192.168.123.103:0/3576953270 >> 192.168.123.103:0/3576953270 conn(0x7f87c40fc920 msgr2=0x7f87c40fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:33.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.572+0000 7f87cb0d8700 1 -- 192.168.123.103:0/3576953270 shutdown_connections 2026-03-09T00:01:33.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.573+0000 7f87cb0d8700 1 -- 192.168.123.103:0/3576953270 wait complete. 
2026-03-09T00:01:33.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.573+0000 7f87cb0d8700 1 Processor -- start 2026-03-09T00:01:33.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.575+0000 7f87cb0d8700 1 -- start start 2026-03-09T00:01:33.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.575+0000 7fabcffff700 1 -- 192.168.123.103:0/1876085556 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fabd006c6d0 msgr2=0x7fabd006eb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.575+0000 7fabcffff700 1 --2- 192.168.123.103:0/1876085556 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fabd006c6d0 0x7fabd006eb90 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fabe0005b40 tx=0x7fabe0005ab0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.575+0000 7fabcffff700 1 -- 192.168.123.103:0/1876085556 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabe41331a0 msgr2=0x7fabe4133620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.575+0000 7fabcffff700 1 --2- 192.168.123.103:0/1876085556 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabe41331a0 0x7fabe4133620 secure :-1 s=READY pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7fabd400d900 tx=0x7fabd400dcc0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.576+0000 7f87cb0d8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f87c4073130 0x7f87c419d190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:33.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.576+0000 7f87cb0d8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87c4073a50 0x7f87c419d6d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:33.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.576+0000 7f87cb0d8700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f87c419ddb0 con 0x7f87c4073a50 2026-03-09T00:01:33.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.576+0000 7f87cb0d8700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f87c41a1b40 con 0x7f87c4073130 2026-03-09T00:01:33.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.575+0000 7fabcffff700 1 -- 192.168.123.103:0/1876085556 shutdown_connections 2026-03-09T00:01:33.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.575+0000 7fabcffff700 1 --2- 192.168.123.103:0/1876085556 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fabd006c6d0 0x7fabd006eb90 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.575+0000 7fabcffff700 1 --2- 192.168.123.103:0/1876085556 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fabe41089a0 0x7fabe41381a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.575+0000 7fabcffff700 1 --2- 192.168.123.103:0/1876085556 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fabe41331a0 0x7fabe4133620 unknown :-1 s=CLOSED pgs=219 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.575+0000 7fabcffff700 1 -- 192.168.123.103:0/1876085556 >> 192.168.123.103:0/1876085556 conn(0x7fabe406ce20 msgr2=0x7fabe40705d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:33.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.575+0000 7fabcffff700 1 -- 192.168.123.103:0/1876085556 shutdown_connections 2026-03-09T00:01:33.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.575+0000 7fabcffff700 1 -- 192.168.123.103:0/1876085556 wait complete. 2026-03-09T00:01:33.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.576+0000 7f87c8e74700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f87c4073130 0x7f87c419d190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:33.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.576+0000 7f87c3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87c4073a50 0x7f87c419d6d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:33.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.576+0000 7f87c8e74700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f87c4073130 0x7f87c419d190 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:58886/0 (socket says 192.168.123.103:58886) 2026-03-09T00:01:33.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.576+0000 7f87c8e74700 1 -- 192.168.123.103:0/2072842547 learned_addr learned my addr 192.168.123.103:0/2072842547 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:33.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.577+0000 7f87c3fff700 1 -- 192.168.123.103:0/2072842547 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f87c4073130 msgr2=0x7f87c419d190 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.577+0000 7f87c3fff700 1 --2- 192.168.123.103:0/2072842547 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f87c4073130 0x7f87c419d190 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.577+0000 7f87c3fff700 1 -- 192.168.123.103:0/2072842547 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f87b80097e0 con 0x7f87c4073a50 2026-03-09T00:01:33.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.577+0000 7f87c3fff700 1 --2- 192.168.123.103:0/2072842547 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87c4073a50 0x7f87c419d6d0 secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7f87b000b6d0 tx=0x7f87b000ba90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:33.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.577+0000 7f87c1ffb700 1 -- 192.168.123.103:0/2072842547 <== mon.0 
v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f87b0011840 con 0x7f87c4073a50 2026-03-09T00:01:33.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.577+0000 7f87c1ffb700 1 -- 192.168.123.103:0/2072842547 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f87b0011e80 con 0x7f87c4073a50 2026-03-09T00:01:33.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.578+0000 7f87c1ffb700 1 -- 192.168.123.103:0/2072842547 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f87b000f550 con 0x7f87c4073a50 2026-03-09T00:01:33.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.578+0000 7f87cb0d8700 1 -- 192.168.123.103:0/2072842547 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f87c41a1e20 con 0x7f87c4073a50 2026-03-09T00:01:33.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.578+0000 7f87cb0d8700 1 -- 192.168.123.103:0/2072842547 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f87c41a22f0 con 0x7f87c4073a50 2026-03-09T00:01:33.580 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.580+0000 7f87c1ffb700 1 -- 192.168.123.103:0/2072842547 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f87b00119a0 con 0x7f87c4073a50 2026-03-09T00:01:33.580 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.581+0000 7f87c1ffb700 1 --2- 192.168.123.103:0/2072842547 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f87b406c380 0x7f87b406e840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:33.580 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.581+0000 7f87c8e74700 1 --2- 192.168.123.103:0/2072842547 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f87b406c380 0x7f87b406e840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:33.580 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.581+0000 7f87c8e74700 1 --2- 192.168.123.103:0/2072842547 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f87b406c380 0x7f87b406e840 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f87b800b5c0 tx=0x7f87b80058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:33.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.581+0000 7f87c1ffb700 1 -- 192.168.123.103:0/2072842547 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f87b008abd0 con 0x7f87c4073a50 2026-03-09T00:01:33.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.582+0000 7f87cb0d8700 1 -- 192.168.123.103:0/2072842547 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f87c410f0c0 con 0x7f87c4073a50 2026-03-09T00:01:33.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.584+0000 7f87c1ffb700 1 -- 192.168.123.103:0/2072842547 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f87b0055810 con 0x7f87c4073a50 2026-03-09T00:01:33.604 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:33 vm03 
ceph-mon[52346]: pgmap v67: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:01:33.604 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:33 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/2490847111' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-09T00:01:33.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.604+0000 7f4a837e0700 1 -- 192.168.123.103:0/3564707549 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a7c107ff0 msgr2=0x7f4a7c1083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.604+0000 7f4a837e0700 1 --2- 192.168.123.103:0/3564707549 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a7c107ff0 0x7f4a7c1083d0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f4a78007780 tx=0x7f4a7800c050 comp rx=0 tx=0).stop 2026-03-09T00:01:33.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.605+0000 7f4a837e0700 1 -- 192.168.123.103:0/3564707549 shutdown_connections 2026-03-09T00:01:33.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.605+0000 7f4a837e0700 1 --2- 192.168.123.103:0/3564707549 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a7c1089a0 0x7f4a7c10be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.605+0000 7f4a837e0700 1 --2- 192.168.123.103:0/3564707549 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a7c107ff0 0x7f4a7c1083d0 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.605+0000 7f4a837e0700 1 -- 192.168.123.103:0/3564707549 >> 192.168.123.103:0/3564707549 conn(0x7f4a7c06ce20 msgr2=0x7f4a7c06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:33.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.605+0000 7f4a837e0700 1 -- 192.168.123.103:0/3564707549 shutdown_connections 2026-03-09T00:01:33.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.605+0000 7f4a837e0700 1 -- 192.168.123.103:0/3564707549 wait complete. 
2026-03-09T00:01:33.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.605+0000 7f4a837e0700 1 Processor -- start 2026-03-09T00:01:33.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.605+0000 7f4a837e0700 1 -- start start 2026-03-09T00:01:33.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.605+0000 7f4a837e0700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a7c1089a0 0x7f4a7c1331a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:33.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.605+0000 7f4a837e0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a7c1336e0 0x7f4a7c133b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:33.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.605+0000 7f4a837e0700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4a7c07ef30 con 0x7f4a7c1089a0 2026-03-09T00:01:33.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.605+0000 7f4a837e0700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4a7c07f0a0 con 0x7f4a7c1336e0 2026-03-09T00:01:33.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.606+0000 7f4a80d7b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a7c1336e0 0x7f4a7c133b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:33.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.606+0000 7f4a80d7b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a7c1336e0 0x7f4a7c133b60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:58914/0 (socket says 192.168.123.103:58914) 2026-03-09T00:01:33.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.606+0000 7f4a80d7b700 1 -- 192.168.123.103:0/103872965 learned_addr learned my addr 192.168.123.103:0/103872965 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:33.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.606+0000 7f4a80d7b700 1 -- 192.168.123.103:0/103872965 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a7c1089a0 msgr2=0x7f4a7c1331a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.606+0000 7f4a80d7b700 1 --2- 192.168.123.103:0/103872965 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a7c1089a0 0x7f4a7c1331a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.606+0000 7f4a80d7b700 1 -- 192.168.123.103:0/103872965 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4a78007430 con 0x7f4a7c1336e0 2026-03-09T00:01:33.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.606+0000 7f4a80d7b700 1 --2- 192.168.123.103:0/103872965 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a7c1336e0 0x7f4a7c133b60 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f4a7400bf40 tx=0x7f4a7400bf70 comp rx=0 tx=0).ready entity=mon.1 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:33.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.607+0000 7f4a727fc700 1 -- 192.168.123.103:0/103872965 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4a7400cb40 con 0x7f4a7c1336e0 2026-03-09T00:01:33.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.607+0000 7f4a837e0700 1 -- 192.168.123.103:0/103872965 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4a7c07f330 con 0x7f4a7c1336e0 2026-03-09T00:01:33.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.607+0000 7f4a837e0700 1 -- 192.168.123.103:0/103872965 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4a7c07f830 con 0x7f4a7c1336e0 2026-03-09T00:01:33.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.608+0000 7f4a727fc700 1 -- 192.168.123.103:0/103872965 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4a7400cca0 con 0x7f4a7c1336e0 2026-03-09T00:01:33.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.608+0000 7f4a727fc700 1 -- 192.168.123.103:0/103872965 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4a74012720 con 0x7f4a7c1336e0 2026-03-09T00:01:33.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.608+0000 7f4a727fc700 1 -- 192.168.123.103:0/103872965 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f4a740129c0 con 0x7f4a7c1336e0 2026-03-09T00:01:33.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.609+0000 7f4a727fc700 1 --2- 192.168.123.103:0/103872965 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4a6806c600 0x7f4a6806eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:33.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.609+0000 7f4a8157c700 1 --2- 192.168.123.103:0/103872965 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4a6806c600 0x7f4a6806eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:33.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.610+0000 7f4a8157c700 1 --2- 192.168.123.103:0/103872965 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4a6806c600 0x7f4a6806eac0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f4a7800c4d0 tx=0x7f4a7800af60 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:33.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.610+0000 7f4a727fc700 1 -- 192.168.123.103:0/103872965 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f4a7408bd30 con 0x7f4a7c1336e0 2026-03-09T00:01:33.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.610+0000 7f4a837e0700 1 -- 192.168.123.103:0/103872965 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4a60005320 con 0x7f4a7c1336e0 2026-03-09T00:01:33.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.613+0000 7f4a727fc700 1 -- 192.168.123.103:0/103872965 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f4a74056a20 con 0x7f4a7c1336e0 2026-03-09T00:01:33.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:33 vm06 ceph-mon[58395]: pgmap v67: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:01:33.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:33 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/2490847111' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-09T00:01:33.691 INFO:tasks.cephadm.ceph_manager.ceph:need seq 73014444041 got 73014444041 for osd.2 2026-03-09T00:01:33.691 DEBUG:teuthology.parallel:result is None 2026-03-09T00:01:33.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.700+0000 7fd512c4b700 1 -- 192.168.123.103:0/427243960 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd50c10d0f0 msgr2=0x7fd50c10d570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.700+0000 7fd512c4b700 1 --2- 192.168.123.103:0/427243960 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd50c10d0f0 0x7fd50c10d570 secure :-1 s=READY pgs=224 cs=0 l=1 rev1=1 crypto rx=0x7fd4fc009b00 tx=0x7fd4fc009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:33.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.700+0000 7fd512c4b700 1 -- 192.168.123.103:0/427243960 shutdown_connections 2026-03-09T00:01:33.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.700+0000 7fd512c4b700 1 --2- 192.168.123.103:0/427243960 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd50c10d0f0 0x7fd50c10d570 unknown :-1 s=CLOSED pgs=224 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.700+0000 7fd512c4b700 1 --2- 192.168.123.103:0/427243960 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd50c10f340 0x7fd50c10f720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.700+0000 7fd512c4b700 1 -- 192.168.123.103:0/427243960 >> 192.168.123.103:0/427243960 conn(0x7fd50c06ce20 msgr2=0x7fd50c06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:33.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.700+0000 7fd512c4b700 1 -- 192.168.123.103:0/427243960 shutdown_connections 2026-03-09T00:01:33.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.700+0000 7fd512c4b700 1 -- 192.168.123.103:0/427243960 wait complete. 
2026-03-09T00:01:33.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.701+0000 7fd512c4b700 1 Processor -- start 2026-03-09T00:01:33.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.701+0000 7fd512c4b700 1 -- start start 2026-03-09T00:01:33.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.701+0000 7fd512c4b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd50c10d0f0 0x7fd50c1ab580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:33.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.701+0000 7fd512c4b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd50c10f340 0x7fd50c1abac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:33.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.701+0000 7fd512c4b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd50c1ac150 con 0x7fd50c10f340 2026-03-09T00:01:33.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.701+0000 7fd512c4b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd50c1a5600 con 0x7fd50c10d0f0 2026-03-09T00:01:33.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.701+0000 7fd511448700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd50c10f340 0x7fd50c1abac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:33.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.701+0000 7fd511448700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd50c10f340 0x7fd50c1abac0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41222/0 (socket says 192.168.123.103:41222) 2026-03-09T00:01:33.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.701+0000 7fd511448700 1 -- 192.168.123.103:0/2687004532 learned_addr learned my addr 192.168.123.103:0/2687004532 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:33.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.702+0000 7fd511448700 1 -- 192.168.123.103:0/2687004532 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd50c10d0f0 msgr2=0x7fd50c1ab580 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T00:01:33.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.702+0000 7fd511448700 1 --2- 192.168.123.103:0/2687004532 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd50c10d0f0 0x7fd50c1ab580 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.703+0000 7fd511448700 1 -- 192.168.123.103:0/2687004532 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd4fc0097e0 con 0x7fd50c10f340 2026-03-09T00:01:33.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.703+0000 7fd511448700 1 --2- 192.168.123.103:0/2687004532 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd50c10f340 0x7fd50c1abac0 secure :-1 s=READY pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7fd4fc004930 tx=0x7fd4fc004a10 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:33.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.703+0000 7fd502ffd700 1 -- 192.168.123.103:0/2687004532 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd4fc01d070 con 0x7fd50c10f340 2026-03-09T00:01:33.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.703+0000 7fd502ffd700 1 -- 192.168.123.103:0/2687004532 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd4fc00bc50 con 0x7fd50c10f340 2026-03-09T00:01:33.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.704+0000 7fd502ffd700 1 -- 192.168.123.103:0/2687004532 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd4fc00f7e0 con 0x7fd50c10f340 2026-03-09T00:01:33.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.704+0000 7fd512c4b700 1 -- 192.168.123.103:0/2687004532 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd50c1a5880 con 0x7fd50c10f340 2026-03-09T00:01:33.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.704+0000 7fd512c4b700 1 -- 192.168.123.103:0/2687004532 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd50c1a5cf0 con 0x7fd50c10f340 2026-03-09T00:01:33.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.705+0000 7fd502ffd700 1 -- 192.168.123.103:0/2687004532 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fd4fc00f940 con 0x7fd50c10f340 2026-03-09T00:01:33.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.705+0000 7fd500ff9700 1 -- 192.168.123.103:0/2687004532 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd50c04f2e0 con 0x7fd50c10f340 2026-03-09T00:01:33.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.706+0000 7fd502ffd700 1 --2- 192.168.123.103:0/2687004532 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd4f806c2e0 0x7fd4f806e7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:33.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.706+0000 7fd502ffd700 1 -- 192.168.123.103:0/2687004532 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fd4fc08ccf0 con 0x7fd50c10f340 2026-03-09T00:01:33.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.708+0000 7fd502ffd700 1 -- 192.168.123.103:0/2687004532 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fd4fc05b460 con 0x7fd50c10f340 2026-03-09T00:01:33.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.711+0000 7fd511c49700 1 --2- 192.168.123.103:0/2687004532 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd4f806c2e0 0x7fd4f806e7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:33.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.717+0000 7fd511c49700 1 --2- 192.168.123.103:0/2687004532 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd4f806c2e0 0x7fd4f806e7a0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fd50800a910 
tx=0x7fd508005c10 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:33.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.782+0000 7f4a837e0700 1 -- 192.168.123.103:0/103872965 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 1} v 0) v1 -- 0x7f4a600059f0 con 0x7f4a7c1336e0 2026-03-09T00:01:33.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.784+0000 7f4a727fc700 1 -- 192.168.123.103:0/103872965 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 1}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f4a74019030 con 0x7f4a7c1336e0 2026-03-09T00:01:33.783 INFO:teuthology.orchestra.run.vm03.stdout:55834574858 2026-03-09T00:01:33.784 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.785+0000 7f4a837e0700 1 -- 192.168.123.103:0/103872965 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4a6806c600 msgr2=0x7f4a6806eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.784 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.785+0000 7f4a837e0700 1 --2- 192.168.123.103:0/103872965 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4a6806c600 0x7f4a6806eac0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f4a7800c4d0 tx=0x7f4a7800af60 comp rx=0 tx=0).stop 2026-03-09T00:01:33.784 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.785+0000 7f4a837e0700 1 -- 192.168.123.103:0/103872965 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a7c1336e0 msgr2=0x7f4a7c133b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.784 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.785+0000 7f4a837e0700 1 --2- 192.168.123.103:0/103872965 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a7c1336e0 0x7f4a7c133b60 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f4a7400bf40 tx=0x7f4a7400bf70 comp rx=0 tx=0).stop 2026-03-09T00:01:33.784 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.785+0000 7f4a837e0700 1 -- 192.168.123.103:0/103872965 shutdown_connections 2026-03-09T00:01:33.785 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.785+0000 7f4a837e0700 1 --2- 192.168.123.103:0/103872965 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4a6806c600 0x7f4a6806eac0 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.785 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.785+0000 7f4a837e0700 1 --2- 192.168.123.103:0/103872965 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a7c1089a0 0x7f4a7c1331a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.785 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.785+0000 7f4a837e0700 1 --2- 192.168.123.103:0/103872965 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a7c1336e0 0x7f4a7c133b60 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.785 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.785+0000 7f4a837e0700 1 -- 192.168.123.103:0/103872965 >> 192.168.123.103:0/103872965 conn(0x7f4a7c06ce20 msgr2=0x7f4a7c070550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:33.785 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.786+0000 7f4a837e0700 1 -- 
192.168.123.103:0/103872965 shutdown_connections 2026-03-09T00:01:33.785 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.786+0000 7f4a837e0700 1 -- 192.168.123.103:0/103872965 wait complete. 2026-03-09T00:01:33.838 INFO:tasks.cephadm.ceph_manager.ceph:need seq 55834574859 got 55834574858 for osd.1 2026-03-09T00:01:33.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.849+0000 7f383f59e700 1 -- 192.168.123.103:0/2971238189 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 4} v 0) v1 -- 0x7f384004f2e0 con 0x7f384010f340 2026-03-09T00:01:33.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.854+0000 7f382f7fe700 1 -- 192.168.123.103:0/2971238189 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 4}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f383405ae30 con 0x7f384010f340 2026-03-09T00:01:33.856 INFO:teuthology.orchestra.run.vm03.stdout:120259084293 2026-03-09T00:01:33.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.860+0000 7f383f59e700 1 -- 192.168.123.103:0/2971238189 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f382806c380 msgr2=0x7f382806e840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.860+0000 7f383f59e700 1 --2- 192.168.123.103:0/2971238189 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f382806c380 0x7f382806e840 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f3830009de0 tx=0x7f3830009450 comp rx=0 tx=0).stop 2026-03-09T00:01:33.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.860+0000 7f383f59e700 1 -- 192.168.123.103:0/2971238189 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f384010f340 msgr2=0x7f38401abb40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.860+0000 7f383f59e700 1 --2- 192.168.123.103:0/2971238189 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f384010f340 0x7f38401abb40 secure :-1 s=READY pgs=221 cs=0 l=1 rev1=1 crypto rx=0x7f3834009ad0 tx=0x7f383400bab0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.860+0000 7f383f59e700 1 -- 192.168.123.103:0/2971238189 shutdown_connections 2026-03-09T00:01:33.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.860+0000 7f383f59e700 1 --2- 192.168.123.103:0/2971238189 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f382806c380 0x7f382806e840 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.860+0000 7f383f59e700 1 --2- 192.168.123.103:0/2971238189 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f384010d0f0 0x7f38401ab600 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.860+0000 7f383f59e700 1 --2- 192.168.123.103:0/2971238189 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f384010f340 0x7f38401abb40 unknown :-1 s=CLOSED pgs=221 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.860+0000 7f383f59e700 1 -- 192.168.123.103:0/2971238189 >> 
192.168.123.103:0/2971238189 conn(0x7f384006ce20 msgr2=0x7f3840070510 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:33.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.861+0000 7f383f59e700 1 -- 192.168.123.103:0/2971238189 shutdown_connections 2026-03-09T00:01:33.860 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.861+0000 7f383f59e700 1 -- 192.168.123.103:0/2971238189 wait complete. 2026-03-09T00:01:33.881 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.880+0000 7f87cb0d8700 1 -- 192.168.123.103:0/2072842547 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 0} v 0) v1 -- 0x7f87c4066e80 con 0x7f87c4073a50 2026-03-09T00:01:33.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.883+0000 7f87c1ffb700 1 -- 192.168.123.103:0/2072842547 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 0}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f87b0015020 con 0x7f87c4073a50 2026-03-09T00:01:33.883 INFO:teuthology.orchestra.run.vm03.stdout:38654705677 2026-03-09T00:01:33.886 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.887+0000 7f87cb0d8700 1 -- 192.168.123.103:0/2072842547 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f87b406c380 msgr2=0x7f87b406e840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.887+0000 7f87cb0d8700 1 --2- 192.168.123.103:0/2072842547 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f87b406c380 0x7f87b406e840 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f87b800b5c0 tx=0x7f87b80058e0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.887+0000 7f87cb0d8700 1 -- 192.168.123.103:0/2072842547 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87c4073a50 msgr2=0x7f87c419d6d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.887+0000 7f87cb0d8700 1 --2- 192.168.123.103:0/2072842547 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87c4073a50 0x7f87c419d6d0 secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7f87b000b6d0 tx=0x7f87b000ba90 comp rx=0 tx=0).stop 2026-03-09T00:01:33.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.887+0000 7f87cb0d8700 1 -- 192.168.123.103:0/2072842547 shutdown_connections 2026-03-09T00:01:33.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.887+0000 7f87cb0d8700 1 --2- 192.168.123.103:0/2072842547 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f87b406c380 0x7f87b406e840 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.887+0000 7f87cb0d8700 1 --2- 192.168.123.103:0/2072842547 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f87c4073130 0x7f87c419d190 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.887+0000 7f87cb0d8700 1 --2- 192.168.123.103:0/2072842547 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f87c4073a50 0x7f87c419d6d0 unknown :-1 s=CLOSED pgs=223 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.889 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.887+0000 7f87cb0d8700 1 -- 192.168.123.103:0/2072842547 >> 192.168.123.103:0/2072842547 conn(0x7f87c40fc920 msgr2=0x7f87c4103450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:33.906 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.906+0000 7fd500ff9700 1 -- 192.168.123.103:0/2687004532 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 3} v 0) v1 -- 0x7fd50c04ea90 con 0x7fd50c10f340 2026-03-09T00:01:33.906 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.907+0000 7fd502ffd700 1 -- 192.168.123.103:0/2687004532 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 3}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7fd4fc027090 con 0x7fd50c10f340 2026-03-09T00:01:33.908 INFO:teuthology.orchestra.run.vm03.stdout:98784247814 2026-03-09T00:01:33.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.892+0000 7f87cb0d8700 1 -- 192.168.123.103:0/2072842547 shutdown_connections 2026-03-09T00:01:33.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.911+0000 7f87cb0d8700 1 -- 192.168.123.103:0/2072842547 wait complete. 2026-03-09T00:01:33.918 INFO:tasks.cephadm.ceph_manager.ceph:need seq 120259084293 got 120259084293 for osd.4 2026-03-09T00:01:33.918 DEBUG:teuthology.parallel:result is None 2026-03-09T00:01:33.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.920+0000 7fd500ff9700 1 -- 192.168.123.103:0/2687004532 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd4f806c2e0 msgr2=0x7fd4f806e7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.920+0000 7fd500ff9700 1 --2- 192.168.123.103:0/2687004532 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd4f806c2e0 0x7fd4f806e7a0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fd50800a910 tx=0x7fd508005c10 comp rx=0 tx=0).stop 2026-03-09T00:01:33.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.920+0000 7fd500ff9700 1 -- 192.168.123.103:0/2687004532 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd50c10f340 msgr2=0x7fd50c1abac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:33.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.920+0000 7fd500ff9700 1 --2- 192.168.123.103:0/2687004532 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd50c10f340 0x7fd50c1abac0 secure :-1 s=READY pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7fd4fc004930 tx=0x7fd4fc004a10 comp rx=0 tx=0).stop 2026-03-09T00:01:33.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.921+0000 7fd500ff9700 1 -- 192.168.123.103:0/2687004532 shutdown_connections 2026-03-09T00:01:33.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.921+0000 7fd500ff9700 1 --2- 192.168.123.103:0/2687004532 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd4f806c2e0 0x7fd4f806e7a0 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.921+0000 7fd500ff9700 1 --2- 192.168.123.103:0/2687004532 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd50c10d0f0 0x7fd50c1ab580 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.920 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.921+0000 7fd500ff9700 1 --2- 192.168.123.103:0/2687004532 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd50c10f340 0x7fd50c1abac0 unknown :-1 s=CLOSED pgs=225 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:33.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.921+0000 7fd500ff9700 1 -- 192.168.123.103:0/2687004532 >> 192.168.123.103:0/2687004532 conn(0x7fd50c06ce20 msgr2=0x7fd50c070450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:33.921 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.921+0000 7fd500ff9700 1 -- 192.168.123.103:0/2687004532 shutdown_connections 2026-03-09T00:01:33.921 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:33.922+0000 7fd500ff9700 1 -- 192.168.123.103:0/2687004532 wait complete. 2026-03-09T00:01:33.959 INFO:tasks.cephadm.ceph_manager.ceph:need seq 38654705677 got 38654705677 for osd.0 2026-03-09T00:01:33.959 DEBUG:teuthology.parallel:result is None 2026-03-09T00:01:33.981 INFO:tasks.cephadm.ceph_manager.ceph:need seq 98784247815 got 98784247814 for osd.3 2026-03-09T00:01:34.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:34 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/1876085556' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-09T00:01:34.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:34 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/103872965' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-09T00:01:34.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:34 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/2971238189' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-09T00:01:34.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:34 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/2072842547' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-09T00:01:34.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:34 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/2687004532' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-09T00:01:34.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:34 vm06 ceph-mon[58395]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:01:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:34 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/1876085556' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-09T00:01:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:34 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/103872965' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-09T00:01:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:34 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/2971238189' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-09T00:01:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:34 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/2072842547' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-09T00:01:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:34 vm03 ceph-mon[52346]: from='client.? 
192.168.123.103:0/2687004532' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-09T00:01:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:34 vm03 ceph-mon[52346]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:01:34.839 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd last-stat-seq osd.1 2026-03-09T00:01:34.974 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:34.982 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd last-stat-seq osd.3 2026-03-09T00:01:35.197 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:35.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.254+0000 7f7ad359e700 1 -- 192.168.123.103:0/1593478248 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ad410dad0 msgr2=0x7f7ad410bdd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:35.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.254+0000 7f7ad359e700 1 --2- 192.168.123.103:0/1593478248 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ad410dad0 0x7f7ad410bdd0 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7f7acc009fb0 tx=0x7f7acc00b4c0 comp rx=0 tx=0).stop 2026-03-09T00:01:35.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.254+0000 7f7ad359e700 1 -- 192.168.123.103:0/1593478248 shutdown_connections 2026-03-09T00:01:35.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.254+0000 7f7ad359e700 1 --2- 192.168.123.103:0/1593478248 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ad410dad0 0x7f7ad410bdd0 unknown :-1 s=CLOSED pgs=226 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:35.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.254+0000 7f7ad359e700 1 --2- 192.168.123.103:0/1593478248 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7ad410d120 0x7f7ad410d500 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:35.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.254+0000 7f7ad359e700 1 -- 192.168.123.103:0/1593478248 >> 192.168.123.103:0/1593478248 conn(0x7f7ad406ce10 msgr2=0x7f7ad406d220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:35.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.254+0000 7f7ad359e700 1 -- 192.168.123.103:0/1593478248 shutdown_connections 2026-03-09T00:01:35.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.254+0000 7f7ad359e700 1 -- 192.168.123.103:0/1593478248 wait complete. 
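Each `ceph` command in this phase is wrapped in `cephadm shell` against the cluster fsid, as in the `sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd last-stat-seq osd.1` invocation just above. That is why every call builds and tears down a fresh mon session and, with `debug ms: 1` set for clients, emits the msgr connect/mark_down chatter filling this section. A minimal wrapper reproducing that command shape (the helper name is illustrative; the paths, image, and fsid are copied from the log):

    import subprocess

    FSID = "ae8f0172-1b4a-11f1-916a-712b2ac006b7"   # from the log above
    IMAGE = "quay.io/ceph/ceph:v18.2.1"
    CEPHADM = "/home/ubuntu/cephtest/cephadm"

    def cephadm_ceph(*args):
        """Run one 'ceph ...' command inside a throwaway cephadm shell container,
        mirroring the 'sudo cephadm --image ... shell --fsid ... -- ceph ...' lines."""
        cmd = ["sudo", CEPHADM, "--image", IMAGE,
               "shell", "--fsid", FSID, "--", "ceph", *args]
        return subprocess.check_output(cmd, text=True).strip()

    # e.g. the per-OSD stat-seq query seen in the log:
    # cephadm_ceph("osd", "last-stat-seq", "osd.1")  ->  "55834574859"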
2026-03-09T00:01:35.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.255+0000 7f7ad359e700 1 Processor -- start 2026-03-09T00:01:35.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.255+0000 7f7ad359e700 1 -- start start 2026-03-09T00:01:35.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.256+0000 7f7ad359e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ad410d120 0x7f7ad4078e30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:35.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.256+0000 7f7ad359e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7ad4079370 0x7f7ad40797f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:35.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.256+0000 7f7ad359e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7ad407d9b0 con 0x7f7ad410d120 2026-03-09T00:01:35.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.256+0000 7f7ad359e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7ad407daf0 con 0x7f7ad4079370 2026-03-09T00:01:35.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.256+0000 7f7ad259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ad410d120 0x7f7ad4078e30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:35.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.256+0000 7f7ad259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ad410d120 0x7f7ad4078e30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36248/0 (socket says 192.168.123.103:36248) 2026-03-09T00:01:35.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.256+0000 7f7ad259c700 1 -- 192.168.123.103:0/3032982539 learned_addr learned my addr 192.168.123.103:0/3032982539 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:35.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.256+0000 7f7ad259c700 1 -- 192.168.123.103:0/3032982539 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7ad4079370 msgr2=0x7f7ad40797f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:35.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.256+0000 7f7ad259c700 1 --2- 192.168.123.103:0/3032982539 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7ad4079370 0x7f7ad40797f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:35.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.256+0000 7f7ad259c700 1 -- 192.168.123.103:0/3032982539 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7acc00b050 con 0x7f7ad410d120 2026-03-09T00:01:35.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.256+0000 7f7ad259c700 1 --2- 192.168.123.103:0/3032982539 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ad410d120 0x7f7ad4078e30 secure :-1 s=READY pgs=227 cs=0 l=1 rev1=1 crypto rx=0x7f7ac400ba70 tx=0x7f7ac400be30 comp rx=0 tx=0).ready 
entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:35.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.256+0000 7f7ac37fe700 1 -- 192.168.123.103:0/3032982539 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7ac400c760 con 0x7f7ad410d120 2026-03-09T00:01:35.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.256+0000 7f7ad359e700 1 -- 192.168.123.103:0/3032982539 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7ad407ddd0 con 0x7f7ad410d120 2026-03-09T00:01:35.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.256+0000 7f7ad359e700 1 -- 192.168.123.103:0/3032982539 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7ad407e320 con 0x7f7ad410d120 2026-03-09T00:01:35.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.258+0000 7f7ac37fe700 1 -- 192.168.123.103:0/3032982539 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7ac400cda0 con 0x7f7ad410d120 2026-03-09T00:01:35.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.258+0000 7f7ac37fe700 1 -- 192.168.123.103:0/3032982539 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7ac4012550 con 0x7f7ad410d120 2026-03-09T00:01:35.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.259+0000 7f7ac37fe700 1 -- 192.168.123.103:0/3032982539 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f7ac400c8c0 con 0x7f7ad410d120 2026-03-09T00:01:35.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.259+0000 7f7ac37fe700 1 --2- 192.168.123.103:0/3032982539 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7abc06c610 0x7f7abc06ead0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:35.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.259+0000 7f7ac37fe700 1 -- 192.168.123.103:0/3032982539 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f7ac408a6c0 con 0x7f7ad410d120 2026-03-09T00:01:35.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.259+0000 7f7ad1d9b700 1 --2- 192.168.123.103:0/3032982539 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7abc06c610 0x7f7abc06ead0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:35.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.260+0000 7f7ad1d9b700 1 --2- 192.168.123.103:0/3032982539 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7abc06c610 0x7f7abc06ead0 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f7acc00bcd0 tx=0x7f7acc009410 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:35.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.260+0000 7f7ad359e700 1 -- 192.168.123.103:0/3032982539 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7ab4005320 con 0x7f7ad410d120 2026-03-09T00:01:35.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.263+0000 7f7ac37fe700 1 -- 192.168.123.103:0/3032982539 <== mon.0 v2:192.168.123.103:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f7ac4055280 con 0x7f7ad410d120 2026-03-09T00:01:35.403 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.403+0000 7f7ad359e700 1 -- 192.168.123.103:0/3032982539 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 1} v 0) v1 -- 0x7f7ab40059f0 con 0x7f7ad410d120 2026-03-09T00:01:35.403 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.404+0000 7f7ac37fe700 1 -- 192.168.123.103:0/3032982539 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 1}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f7ac4019030 con 0x7f7ad410d120 2026-03-09T00:01:35.403 INFO:teuthology.orchestra.run.vm03.stdout:55834574859 2026-03-09T00:01:35.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.407+0000 7f7ac17fa700 1 -- 192.168.123.103:0/3032982539 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7abc06c610 msgr2=0x7f7abc06ead0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:35.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.407+0000 7f7ac17fa700 1 --2- 192.168.123.103:0/3032982539 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7abc06c610 0x7f7abc06ead0 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f7acc00bcd0 tx=0x7f7acc009410 comp rx=0 tx=0).stop 2026-03-09T00:01:35.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.407+0000 7f7ac17fa700 1 -- 192.168.123.103:0/3032982539 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ad410d120 msgr2=0x7f7ad4078e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:35.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.407+0000 7f7ac17fa700 1 --2- 192.168.123.103:0/3032982539 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ad410d120 0x7f7ad4078e30 secure :-1 s=READY pgs=227 cs=0 l=1 rev1=1 crypto rx=0x7f7ac400ba70 tx=0x7f7ac400be30 comp rx=0 tx=0).stop 2026-03-09T00:01:35.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.408+0000 7f7ac17fa700 1 -- 192.168.123.103:0/3032982539 shutdown_connections 2026-03-09T00:01:35.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.408+0000 7f7ac17fa700 1 --2- 192.168.123.103:0/3032982539 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7abc06c610 0x7f7abc06ead0 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:35.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.408+0000 7f7ac17fa700 1 --2- 192.168.123.103:0/3032982539 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7ad410d120 0x7f7ad4078e30 unknown :-1 s=CLOSED pgs=227 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:35.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.408+0000 7f7ac17fa700 1 --2- 192.168.123.103:0/3032982539 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7ad4079370 0x7f7ad40797f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:35.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.408+0000 7f7ac17fa700 1 -- 192.168.123.103:0/3032982539 >> 192.168.123.103:0/3032982539 conn(0x7f7ad406ce10 msgr2=0x7f7ad4071360 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:35.407 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.408+0000 7f7ac17fa700 1 -- 192.168.123.103:0/3032982539 shutdown_connections 2026-03-09T00:01:35.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.408+0000 7f7ac17fa700 1 -- 192.168.123.103:0/3032982539 wait complete. 2026-03-09T00:01:35.475 INFO:tasks.cephadm.ceph_manager.ceph:need seq 55834574859 got 55834574859 for osd.1 2026-03-09T00:01:35.475 DEBUG:teuthology.parallel:result is None 2026-03-09T00:01:35.475 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.475+0000 7f84dbfff700 1 -- 192.168.123.103:0/2033277124 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84dc10f340 msgr2=0x7f84dc10f720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:35.475 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.475+0000 7f84dbfff700 1 --2- 192.168.123.103:0/2033277124 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84dc10f340 0x7f84dc10f720 secure :-1 s=READY pgs=228 cs=0 l=1 rev1=1 crypto rx=0x7f84cc009b50 tx=0x7f84cc009e60 comp rx=0 tx=0).stop 2026-03-09T00:01:35.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.476+0000 7f84dbfff700 1 -- 192.168.123.103:0/2033277124 shutdown_connections 2026-03-09T00:01:35.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.476+0000 7f84dbfff700 1 --2- 192.168.123.103:0/2033277124 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84dc10d0f0 0x7f84dc10d570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:35.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.476+0000 7f84dbfff700 1 --2- 192.168.123.103:0/2033277124 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84dc10f340 0x7f84dc10f720 unknown :-1 s=CLOSED pgs=228 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:35.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.476+0000 7f84dbfff700 1 -- 192.168.123.103:0/2033277124 >> 192.168.123.103:0/2033277124 conn(0x7f84dc06ce20 msgr2=0x7f84dc06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:35.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.477+0000 7f84dbfff700 1 -- 192.168.123.103:0/2033277124 shutdown_connections 2026-03-09T00:01:35.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.477+0000 7f84dbfff700 1 -- 192.168.123.103:0/2033277124 wait complete. 
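The `need seq X got Y for osd.N` lines are the harness flushing PG stats and then polling the monitor until each OSD's published stat sequence catches up with the sequence the flush reported; note that osd.1 first came back with 55834574858 against a needed 55834574859 and converged on the retry above. A sketch of that poll loop, assuming the `cephadm_ceph` wrapper from the earlier sketch; `flush_and_wait` is an illustrative name, not the actual teuthology helper:

    import time

    def flush_and_wait(osd_ids, timeout=300):
        """Flush PG stats on each OSD, then poll 'osd last-stat-seq' until the
        mon has seen stats at least as new as the flush (mirrors the need/got
        log lines above)."""
        # 'ceph tell osd.N flush_pg_stats' prints the stat seq the flush reached.
        need = {osd: int(cephadm_ceph("tell", "osd.%d" % osd, "flush_pg_stats"))
                for osd in osd_ids}
        deadline = time.time() + timeout
        for osd, want in need.items():
            while True:
                got = int(cephadm_ceph("osd", "last-stat-seq", "osd.%d" % osd))
                print("need seq %d got %d for osd.%d" % (want, got, osd))
                if got >= want:
                    break           # mon caught up for this OSD
                if time.time() > deadline:
                    raise TimeoutError("osd.%d never reached stat seq %d" % (osd, want))
                time.sleep(1)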
2026-03-09T00:01:35.478 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.477+0000 7f84dbfff700 1 Processor -- start 2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.477+0000 7f84dbfff700 1 -- start start 2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.478+0000 7f84dbfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84dc10d0f0 0x7f84dc19cf30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.478+0000 7f84dbfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84dc10f340 0x7f84dc19d470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.478+0000 7f84dbfff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f84dc19db50 con 0x7f84dc10f340 2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.478+0000 7f84dbfff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f84dc1a1890 con 0x7f84dc10d0f0 2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.478+0000 7f84da7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84dc10f340 0x7f84dc19d470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.478+0000 7f84da7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84dc10f340 0x7f84dc19d470 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36264/0 (socket says 192.168.123.103:36264) 2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.478+0000 7f84da7fc700 1 -- 192.168.123.103:0/1746625801 learned_addr learned my addr 192.168.123.103:0/1746625801 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.478+0000 7f84daffd700 1 --2- 192.168.123.103:0/1746625801 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84dc10d0f0 0x7f84dc19cf30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.478+0000 7f84da7fc700 1 -- 192.168.123.103:0/1746625801 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84dc10d0f0 msgr2=0x7f84dc19cf30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.478+0000 7f84da7fc700 1 --2- 192.168.123.103:0/1746625801 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84dc10d0f0 0x7f84dc19cf30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.478+0000 7f84da7fc700 1 -- 192.168.123.103:0/1746625801 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f84cc0097e0 con 0x7f84dc10f340
2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.478+0000 7f84da7fc700 1 --2- 192.168.123.103:0/1746625801 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84dc10f340 0x7f84dc19d470 secure :-1 s=READY pgs=229 cs=0 l=1 rev1=1 crypto rx=0x7f84d40060b0 tx=0x7f84d400d6a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.478+0000 7f84c3fff700 1 -- 192.168.123.103:0/1746625801 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f84d4015400 con 0x7f84dc10f340
2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.478+0000 7f84c3fff700 1 -- 192.168.123.103:0/1746625801 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f84d400f040 con 0x7f84dc10f340
2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.478+0000 7f84c3fff700 1 -- 192.168.123.103:0/1746625801 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f84d40149b0 con 0x7f84dc10f340
2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.479+0000 7f84dbfff700 1 -- 192.168.123.103:0/1746625801 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f84dc1a1b70 con 0x7f84dc10f340
2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.480+0000 7f84c3fff700 1 -- 192.168.123.103:0/1746625801 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f84d4014b10 con 0x7f84dc10f340
2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.480+0000 7f84dbfff700 1 -- 192.168.123.103:0/1746625801 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f84dc1a20c0 con 0x7f84dc10f340
2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.480+0000 7f84c3fff700 1 --2- 192.168.123.103:0/1746625801 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f84c406c4f0 0x7f84c406e9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.480+0000 7f84daffd700 1 --2- 192.168.123.103:0/1746625801 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f84c406c4f0 0x7f84c406e9b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.480+0000 7f84dbfff700 1 -- 192.168.123.103:0/1746625801 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f84dc197340 con 0x7f84dc10f340
2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.484+0000 7f84c3fff700 1 -- 192.168.123.103:0/1746625801 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f84d4059e50 con 0x7f84dc10f340
2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.484+0000 7f84daffd700 1 --2- 192.168.123.103:0/1746625801 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f84c406c4f0 0x7f84c406e9b0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f84cc01c5d0 tx=0x7f84cc0058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:01:35.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.484+0000 7f84c3fff700 1 -- 192.168.123.103:0/1746625801 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f84d401a020 con 0x7f84dc10f340
2026-03-09T00:01:35.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.586+0000 7f84dbfff700 1 -- 192.168.123.103:0/1746625801 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 3} v 0) v1 -- 0x7f84dc04f2e0 con 0x7f84dc10f340
2026-03-09T00:01:35.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.587+0000 7f84c3fff700 1 -- 192.168.123.103:0/1746625801 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 3}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f84d4071e60 con 0x7f84dc10f340
2026-03-09T00:01:35.586 INFO:teuthology.orchestra.run.vm03.stdout:98784247816
2026-03-09T00:01:35.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.588+0000 7f84dbfff700 1 -- 192.168.123.103:0/1746625801 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f84c406c4f0 msgr2=0x7f84c406e9b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:35.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.588+0000 7f84dbfff700 1 --2- 192.168.123.103:0/1746625801 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f84c406c4f0 0x7f84c406e9b0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f84cc01c5d0 tx=0x7f84cc0058e0 comp rx=0 tx=0).stop
2026-03-09T00:01:35.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.589+0000 7f84dbfff700 1 -- 192.168.123.103:0/1746625801 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84dc10f340 msgr2=0x7f84dc19d470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:35.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.589+0000 7f84dbfff700 1 --2- 192.168.123.103:0/1746625801 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84dc10f340 0x7f84dc19d470 secure :-1 s=READY pgs=229 cs=0 l=1 rev1=1 crypto rx=0x7f84d40060b0 tx=0x7f84d400d6a0 comp rx=0 tx=0).stop
2026-03-09T00:01:35.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.589+0000 7f84dbfff700 1 -- 192.168.123.103:0/1746625801 shutdown_connections
2026-03-09T00:01:35.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.589+0000 7f84dbfff700 1 --2- 192.168.123.103:0/1746625801 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f84c406c4f0 0x7f84c406e9b0 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:35.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.589+0000 7f84dbfff700 1 --2- 192.168.123.103:0/1746625801 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84dc10d0f0 0x7f84dc19cf30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:35.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.589+0000 7f84dbfff700 1 --2- 192.168.123.103:0/1746625801 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84dc10f340 0x7f84dc19d470 unknown :-1 s=CLOSED pgs=229 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:35.588 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.589+0000 7f84dbfff700 1 -- 192.168.123.103:0/1746625801 >> 192.168.123.103:0/1746625801 conn(0x7f84dc06ce20 msgr2=0x7f84dc109c90 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:01:35.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.589+0000 7f84dbfff700 1 -- 192.168.123.103:0/1746625801 shutdown_connections
2026-03-09T00:01:35.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:35.589+0000 7f84dbfff700 1 -- 192.168.123.103:0/1746625801 wait complete.
2026-03-09T00:01:35.630 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:35 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/3032982539' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch
2026-03-09T00:01:35.648 INFO:tasks.cephadm.ceph_manager.ceph:need seq 98784247815 got 98784247816 for osd.3
2026-03-09T00:01:35.648 DEBUG:teuthology.parallel:result is None
2026-03-09T00:01:35.648 INFO:tasks.cephadm.ceph_manager.ceph:waiting for clean
2026-03-09T00:01:35.648 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph pg dump --format=json
2026-03-09T00:01:35.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:35 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/3032982539' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch
2026-03-09T00:01:35.825 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:01:36.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.047+0000 7ff6b1d29700 1 -- 192.168.123.103:0/2660875964 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6ac0684d0 msgr2=0x7ff6ac0688b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:36.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.047+0000 7ff6b1d29700 1 --2- 192.168.123.103:0/2660875964 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6ac0684d0 0x7ff6ac0688b0 secure :-1 s=READY pgs=230 cs=0 l=1 rev1=1 crypto rx=0x7ff694009b30 tx=0x7ff694009e40 comp rx=0 tx=0).stop
2026-03-09T00:01:36.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.047+0000 7ff6b1d29700 1 -- 192.168.123.103:0/2660875964 shutdown_connections
2026-03-09T00:01:36.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.047+0000 7ff6b1d29700 1 --2- 192.168.123.103:0/2660875964 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff6ac068df0 0x7ff6ac10d5b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:36.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.047+0000 7ff6b1d29700 1 --2- 192.168.123.103:0/2660875964 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6ac0684d0 0x7ff6ac0688b0 unknown :-1 s=CLOSED pgs=230 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:36.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.047+0000 7ff6b1d29700 1 -- 192.168.123.103:0/2660875964 >> 192.168.123.103:0/2660875964 conn(0x7ff6ac075960 msgr2=0x7ff6ac075d70 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:01:36.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.047+0000 7ff6b1d29700 1 -- 192.168.123.103:0/2660875964 shutdown_connections
2026-03-09T00:01:36.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.047+0000 7ff6b1d29700 1 -- 192.168.123.103:0/2660875964 wait complete.
2026-03-09T00:01:36.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.047+0000 7ff6b1d29700 1 Processor -- start
2026-03-09T00:01:36.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.047+0000 7ff6b1d29700 1 -- start start
2026-03-09T00:01:36.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.048+0000 7ff6b1d29700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff6ac0684d0 0x7ff6ac1099d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:01:36.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.048+0000 7ff6b1d29700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6ac068df0 0x7ff6ac109f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:01:36.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.048+0000 7ff6b1d29700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff6ac10a5f0 con 0x7ff6ac068df0
2026-03-09T00:01:36.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.048+0000 7ff6b1d29700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff6ac105c90 con 0x7ff6ac0684d0
2026-03-09T00:01:36.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.048+0000 7ff6aaffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6ac068df0 0x7ff6ac109f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:01:36.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.048+0000 7ff6aaffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6ac068df0 0x7ff6ac109f10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36284/0 (socket says 192.168.123.103:36284)
2026-03-09T00:01:36.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.048+0000 7ff6aaffd700 1 -- 192.168.123.103:0/1787616217 learned_addr learned my addr 192.168.123.103:0/1787616217 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:01:36.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.048+0000 7ff6aaffd700 1 -- 192.168.123.103:0/1787616217 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff6ac0684d0 msgr2=0x7ff6ac1099d0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down
2026-03-09T00:01:36.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.048+0000 7ff6aaffd700 1 --2- 192.168.123.103:0/1787616217 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff6ac0684d0 0x7ff6ac1099d0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:36.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.048+0000 7ff6aaffd700 1 -- 192.168.123.103:0/1787616217 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff6940097e0 con 0x7ff6ac068df0
2026-03-09T00:01:36.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.048+0000 7ff6aaffd700 1 --2- 192.168.123.103:0/1787616217 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6ac068df0 0x7ff6ac109f10 secure :-1 s=READY pgs=231 cs=0 l=1 rev1=1 crypto rx=0x7ff69c00d8d0 tx=0x7ff69c00dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:01:36.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.049+0000 7ff6a8ff9700 1 -- 192.168.123.103:0/1787616217 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff69c009940 con 0x7ff6ac068df0
2026-03-09T00:01:36.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.049+0000 7ff6b1d29700 1 -- 192.168.123.103:0/1787616217 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff6ac105f70 con 0x7ff6ac068df0
2026-03-09T00:01:36.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.049+0000 7ff6b1d29700 1 -- 192.168.123.103:0/1787616217 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff6ac1064c0 con 0x7ff6ac068df0
2026-03-09T00:01:36.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.049+0000 7ff6a8ff9700 1 -- 192.168.123.103:0/1787616217 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff69c010460 con 0x7ff6ac068df0
2026-03-09T00:01:36.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.049+0000 7ff6a8ff9700 1 -- 192.168.123.103:0/1787616217 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff69c00f5d0 con 0x7ff6ac068df0
2026-03-09T00:01:36.050 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.050+0000 7ff6a8ff9700 1 -- 192.168.123.103:0/1787616217 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7ff69c0105d0 con 0x7ff6ac068df0
2026-03-09T00:01:36.054 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.050+0000 7ff6b1d29700 1 -- 192.168.123.103:0/1787616217 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff68c005320 con 0x7ff6ac068df0
2026-03-09T00:01:36.054 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.054+0000 7ff6a8ff9700 1 --2- 192.168.123.103:0/1787616217 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff69806c4e0 0x7ff69806e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:01:36.054 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.055+0000 7ff6a8ff9700 1 -- 192.168.123.103:0/1787616217 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7ff69c08b270 con 0x7ff6ac068df0
2026-03-09T00:01:36.054 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.055+0000 7ff6a8ff9700 1 -- 192.168.123.103:0/1787616217 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff69c055780 con 0x7ff6ac068df0
2026-03-09T00:01:36.054 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.055+0000 7ff6ab7fe700 1 --2- 192.168.123.103:0/1787616217 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff69806c4e0 0x7ff69806e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:01:36.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.055+0000 7ff6ab7fe700 1 --2- 192.168.123.103:0/1787616217 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff69806c4e0 0x7ff69806e9a0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7ff694006010 tx=0x7ff694005e20 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:01:36.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.155+0000 7ff6b1d29700 1 -- 192.168.123.103:0/1787616217 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7ff68c000bf0 con 0x7ff69806c4e0
2026-03-09T00:01:36.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.157+0000 7ff6a8ff9700 1 -- 192.168.123.103:0/1787616217 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+19079 (secure 0 0 0) 0x7ff68c000bf0 con 0x7ff69806c4e0
2026-03-09T00:01:36.157 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:01:36.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.160+0000 7ff6b1d29700 1 -- 192.168.123.103:0/1787616217 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff69806c4e0 msgr2=0x7ff69806e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:36.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.160+0000 7ff6b1d29700 1 --2- 192.168.123.103:0/1787616217 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff69806c4e0 0x7ff69806e9a0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7ff694006010 tx=0x7ff694005e20 comp rx=0 tx=0).stop
2026-03-09T00:01:36.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.160+0000 7ff6b1d29700 1 -- 192.168.123.103:0/1787616217 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6ac068df0 msgr2=0x7ff6ac109f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:36.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.160+0000 7ff6b1d29700 1 --2- 192.168.123.103:0/1787616217 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6ac068df0 0x7ff6ac109f10 secure :-1 s=READY pgs=231 cs=0 l=1 rev1=1 crypto rx=0x7ff69c00d8d0 tx=0x7ff69c00dc90 comp rx=0 tx=0).stop
2026-03-09T00:01:36.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.160+0000 7ff6b1d29700 1 -- 192.168.123.103:0/1787616217 shutdown_connections
2026-03-09T00:01:36.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.160+0000 7ff6b1d29700 1 --2- 192.168.123.103:0/1787616217 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7ff69806c4e0 0x7ff69806e9a0 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:36.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.160+0000 7ff6b1d29700 1 --2- 192.168.123.103:0/1787616217 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff6ac0684d0 0x7ff6ac1099d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:36.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.160+0000 7ff6b1d29700 1 --2- 192.168.123.103:0/1787616217 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff6ac068df0 0x7ff6ac109f10 unknown :-1 s=CLOSED pgs=231 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:36.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.160+0000 7ff6b1d29700 1 -- 192.168.123.103:0/1787616217 >> 192.168.123.103:0/1787616217 conn(0x7ff6ac075960
msgr2=0x7ff6ac0fe9e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:36.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.160+0000 7ff6b1d29700 1 -- 192.168.123.103:0/1787616217 shutdown_connections 2026-03-09T00:01:36.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.160+0000 7ff6b1d29700 1 -- 192.168.123.103:0/1787616217 wait complete. 2026-03-09T00:01:36.160 INFO:teuthology.orchestra.run.vm03.stderr:dumped all 2026-03-09T00:01:36.202 INFO:teuthology.orchestra.run.vm03.stdout:{"pg_ready":true,"pg_map":{"version":69,"stamp":"2026-03-09T00:01:35.993940+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":163640,"kb_used_data":3080,"kb_used_omap":0,"kb_used_meta":160512,"kb_avail":125640904,"statfs":{"total":128823853056,"available":128656285696,"internally_reserved":0,"allocated":3153920,"data_stored":2042049,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":164364288},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_res
erved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"9.693716"},"pg_stats":[{"pgid":"1.0","version":"20'76","reported_seq":137,"reported_epoch":32,"state":"active+clean","last_fresh":"2026-03-09T00:01:26.302492+0000","last_change":"2026-03-09T00:01:15.929289+0000","last_active":"2026-03-09T00:01:26.302492+0000","last_peered":"2026-03-09T00:01:26.302492+0000","last_clean":"2026-03-09T00:01:26.302492+0000","last_became_active":"2026-03-09T00:01:15.929171+0000","last_became_peered":"2026-03-09T00:01:15.929171+0000","last_unstale":"2026-03-09T00:01:26.302492+0000","last_undegraded":"2026-03-09T00:01:26.302492+0000","last_fullsized":"2026-03-09T00:01:26.302492+0000","mapping_epoch":27,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":28,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-09T00:00:56.871199+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-09T00:00:56.871199+0000","last_clean_scrub_stamp":"2026-03-09T00:00:56.871199+0000","objects_scrubbed":0,"log_size":76,"log_dups_size":0,"ondisk_log_size":76,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-10T05:15:51.501050+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_object
s":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":4}],"osd_stats":[{"osd":5,"up_from":32,"seq":137438953475,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110813,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.496}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.63200000000000001}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.623}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.56100000000000005}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.61199999999999999}]}]},{"osd":4,"up_from":28,"seq":120259084294,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110813,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.66200000000000003}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.622}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.67700000000000005}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.45600000000000002}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.57999999999999996}]}]},{"osd":3,"up_from":23,"seq":98784247816,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570093,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.40200000000000002}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.47899999999999998}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.56499999999999995}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.438}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.38600000000000001}]}]},{"osd":2,"up_from":17,"seq":73014444042,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27044,"kb_used_data":284,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940380,"statfs":{"total":21470642176,"available":21442949120,"internally_reserved":0,"allocated":290816,"data_stored":110144,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.30499999999999999}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.35099999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.39900000000000002}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.52000000000000002}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59299999999999997}]}]},{"osd":0,"up_from":9,"seq":38654705678,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570093,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.629}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.61099999999999999}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.62}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.65800000000000003}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.64400000000000002}]}]},{"osd":1,"up_from":13,"seq":55834574860,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570093,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.318}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.29699999999999999}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48799999999999999}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.496}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42999999999999999}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-09T00:01:36.202 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph pg dump --format=json 2026-03-09T00:01:36.335 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:36.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.568+0000 7fc9a8b46700 1 -- 192.168.123.103:0/1311817285 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a4074dd0 msgr2=0x7fc9a4072fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:36.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.568+0000 7fc9a8b46700 1 --2- 192.168.123.103:0/1311817285 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a4074dd0 0x7fc9a4072fc0 secure :-1 s=READY pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7fc98c009b30 tx=0x7fc98c009e40 comp rx=0 tx=0).stop 2026-03-09T00:01:36.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.568+0000 7fc9a8b46700 1 -- 192.168.123.103:0/1311817285 shutdown_connections 2026-03-09T00:01:36.568 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.568+0000 7fc9a8b46700 1 --2- 192.168.123.103:0/1311817285 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9a4073500 0x7fc9a4073960 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:36.568 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.568+0000 7fc9a8b46700 1 --2- 192.168.123.103:0/1311817285 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a4074dd0 0x7fc9a4072fc0 unknown :-1 s=CLOSED pgs=232 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:36.568 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.568+0000 7fc9a8b46700 1 -- 192.168.123.103:0/1311817285 >> 192.168.123.103:0/1311817285 conn(0x7fc9a4078ed0 msgr2=0x7fc9a40792e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:01:36.568 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.569+0000 7fc9a8b46700 1 -- 192.168.123.103:0/1311817285 shutdown_connections
2026-03-09T00:01:36.568 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.569+0000 7fc9a8b46700 1 -- 192.168.123.103:0/1311817285 wait complete.
2026-03-09T00:01:36.568 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.569+0000 7fc9a8b46700 1 Processor -- start
2026-03-09T00:01:36.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.569+0000 7fc9a8b46700 1 -- start start
2026-03-09T00:01:36.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.570+0000 7fc9a8b46700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9a4073500 0x7fc9a419d510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:01:36.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.570+0000 7fc9a8b46700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a419da50 0x7fc9a41a1ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:01:36.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.570+0000 7fc9a8b46700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc9a419e070 con 0x7fc9a419da50
2026-03-09T00:01:36.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.570+0000 7fc9a8b46700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc9a419e1e0 con 0x7fc9a4073500
2026-03-09T00:01:36.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.570+0000 7fc9a1d9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a419da50 0x7fc9a41a1ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:01:36.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.570+0000 7fc9a1d9b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a419da50 0x7fc9a41a1ec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36308/0 (socket says 192.168.123.103:36308)
2026-03-09T00:01:36.569 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.570+0000 7fc9a1d9b700 1 -- 192.168.123.103:0/427055887 learned_addr learned my addr 192.168.123.103:0/427055887 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:01:36.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.570+0000 7fc9a1d9b700 1 -- 192.168.123.103:0/427055887 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9a4073500 msgr2=0x7fc9a419d510 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down
2026-03-09T00:01:36.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.570+0000 7fc9a1d9b700 1 --2- 192.168.123.103:0/427055887 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9a4073500 0x7fc9a419d510 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:36.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.570+0000 7fc9a1d9b700 1 -- 192.168.123.103:0/427055887 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc98c0097e0 con 0x7fc9a419da50
2026-03-09T00:01:36.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.570+0000 7fc9a1d9b700 1 --2- 192.168.123.103:0/427055887 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a419da50 0x7fc9a41a1ec0 secure :-1 s=READY pgs=233 cs=0 l=1 rev1=1 crypto rx=0x7fc99400b820 tx=0x7fc99400bbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:01:36.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.570+0000 7fc99b7fe700 1 -- 192.168.123.103:0/427055887 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc99400d6c0 con 0x7fc9a419da50
2026-03-09T00:01:36.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.571+0000 7fc99b7fe700 1 -- 192.168.123.103:0/427055887 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc99400dd00 con 0x7fc9a419da50
2026-03-09T00:01:36.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.571+0000 7fc99b7fe700 1 -- 192.168.123.103:0/427055887 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc994015420 con 0x7fc9a419da50
2026-03-09T00:01:36.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.571+0000 7fc9a8b46700 1 -- 192.168.123.103:0/427055887 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc9a41a24c0 con 0x7fc9a419da50
2026-03-09T00:01:36.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.571+0000 7fc9a8b46700 1 -- 192.168.123.103:0/427055887 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc9a41a2a10 con 0x7fc9a419da50
2026-03-09T00:01:36.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.572+0000 7fc9a8b46700 1 -- 192.168.123.103:0/427055887 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc9a41027c0 con 0x7fc9a419da50
2026-03-09T00:01:36.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.572+0000 7fc99b7fe700 1 -- 192.168.123.103:0/427055887 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fc99400c440 con 0x7fc9a419da50
2026-03-09T00:01:36.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.572+0000 7fc99b7fe700 1 --2- 192.168.123.103:0/427055887 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc99006c490 0x7fc99006e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:01:36.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.575+0000 7fc99b7fe700 1 -- 192.168.123.103:0/427055887 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fc994011070 con 0x7fc9a419da50
2026-03-09T00:01:36.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.575+0000 7fc99b7fe700 1 -- 192.168.123.103:0/427055887 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fc994090050 con 0x7fc9a419da50
2026-03-09T00:01:36.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.575+0000 7fc9a259c700 1 --2- 192.168.123.103:0/427055887 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc99006c490 0x7fc99006e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:01:36.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.576+0000 7fc9a259c700 1 --2- 192.168.123.103:0/427055887 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc99006c490 0x7fc99006e950 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7fc98c00b580 tx=0x7fc98c000bc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:01:36.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:36 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/1746625801' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch
2026-03-09T00:01:36.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:36 vm06 ceph-mon[58395]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-09T00:01:36.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:36 vm06 ceph-mon[58395]: from='client.14436 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
2026-03-09T00:01:36.675 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:36 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/1746625801' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch
2026-03-09T00:01:36.675 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:36 vm03 ceph-mon[52346]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-09T00:01:36.675 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:36 vm03 ceph-mon[52346]: from='client.14436 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
2026-03-09T00:01:36.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.676+0000 7fc9a8b46700 1 -- 192.168.123.103:0/427055887 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7fc9a419e870 con 0x7fc99006c490
2026-03-09T00:01:36.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.677+0000 7fc99b7fe700 1 -- 192.168.123.103:0/427055887 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+19079 (secure 0 0 0) 0x7fc9a419e870 con 0x7fc99006c490
2026-03-09T00:01:36.677 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:01:36.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.679+0000 7fc9a8b46700 1 -- 192.168.123.103:0/427055887 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc99006c490 msgr2=0x7fc99006e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:36.678 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.679+0000 7fc9a8b46700 1 --2- 192.168.123.103:0/427055887 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc99006c490 0x7fc99006e950 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7fc98c00b580 tx=0x7fc98c000bc0 comp rx=0 tx=0).stop
2026-03-09T00:01:36.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.679+0000 7fc9a8b46700 1 -- 192.168.123.103:0/427055887 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a419da50 msgr2=0x7fc9a41a1ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:36.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.679+0000 7fc9a8b46700 1 --2- 192.168.123.103:0/427055887 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a419da50 0x7fc9a41a1ec0 secure :-1 s=READY pgs=233 cs=0 l=1 rev1=1 crypto rx=0x7fc99400b820 tx=0x7fc99400bbe0 comp rx=0 tx=0).stop
2026-03-09T00:01:36.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.680+0000 7fc9a8b46700 1 -- 192.168.123.103:0/427055887 shutdown_connections
2026-03-09T00:01:36.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.680+0000 7fc9a8b46700 1 --2- 192.168.123.103:0/427055887 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc99006c490 0x7fc99006e950 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:36.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.680+0000 7fc9a8b46700 1 --2- 192.168.123.103:0/427055887 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9a4073500 0x7fc9a419d510 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:36.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.680+0000 7fc9a8b46700 1 --2- 192.168.123.103:0/427055887 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a419da50 0x7fc9a41a1ec0 unknown :-1 s=CLOSED pgs=233 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:36.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.680+0000 7fc9a8b46700 1 -- 192.168.123.103:0/427055887 >> 192.168.123.103:0/427055887 conn(0x7fc9a4078ed0 msgr2=0x7fc9a410fb20 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:01:36.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.680+0000 7fc9a8b46700 1 -- 192.168.123.103:0/427055887 shutdown_connections
2026-03-09T00:01:36.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:36.681+0000 7fc9a8b46700 1 -- 192.168.123.103:0/427055887 wait complete.
2026-03-09T00:01:36.681 INFO:teuthology.orchestra.run.vm03.stderr:dumped all 2026-03-09T00:01:36.721 INFO:teuthology.orchestra.run.vm03.stdout:{"pg_ready":true,"pg_map":{"version":69,"stamp":"2026-03-09T00:01:35.993940+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":163640,"kb_used_data":3080,"kb_used_omap":0,"kb_used_meta":160512,"kb_avail":125640904,"statfs":{"total":128823853056,"available":128656285696,"internally_reserved":0,"allocated":3153920,"data_stored":2042049,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":164364288},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"9.693716"},"pg_stats":[{"pgid":"1.0","version":"20'76","reported_seq":137,"reported_epoch":32,"state":"active+clean","last_fresh":"2026
-03-09T00:01:26.302492+0000","last_change":"2026-03-09T00:01:15.929289+0000","last_active":"2026-03-09T00:01:26.302492+0000","last_peered":"2026-03-09T00:01:26.302492+0000","last_clean":"2026-03-09T00:01:26.302492+0000","last_became_active":"2026-03-09T00:01:15.929171+0000","last_became_peered":"2026-03-09T00:01:15.929171+0000","last_unstale":"2026-03-09T00:01:26.302492+0000","last_undegraded":"2026-03-09T00:01:26.302492+0000","last_fullsized":"2026-03-09T00:01:26.302492+0000","mapping_epoch":27,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":28,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-09T00:00:56.871199+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-09T00:00:56.871199+0000","last_clean_scrub_stamp":"2026-03-09T00:00:56.871199+0000","objects_scrubbed":0,"log_size":76,"log_dups_size":0,"ondisk_log_size":76,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-10T05:15:51.501050+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_stor
e_stats":4}],"osd_stats":[{"osd":5,"up_from":32,"seq":137438953475,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110813,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.496}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.63200000000000001}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.623}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.56100000000000005}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.61199999999999999}]}]},{"osd":4,"up_from":28,"seq":120259084294,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110813,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.66200000000000003}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.622}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.67700000000000005}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.45600000000000002}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.57999999999999996}]}]},{"osd":3,"up_from":23,"seq":98784247816,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570093,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.40200000000000002}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.47899999999999998}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.56499999999999995}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.438}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.38600000000000001}]}]},{"osd":2,"up_from":17,"seq":73014444042,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27044,"kb_used_data":284,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940380,"statfs":{"total":21470642176,"available":21442949120,"internally_reserved":0,"allocated":290816,"data_stored":110144,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.30499999999999999}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.35099999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.39900000000000002}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.52000000000000002}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59299999999999997}]}]},{"osd":0,"up_from":9,"seq":38654705678,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570093,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.629}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.61099999999999999}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.62}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.65800000000000003}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.64400000000000002}]}]},{"osd":1,"up_from":13,"seq":55834574860,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570093,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.318}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.29699999999999999}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.48799999999999999}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.496}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42999999999999999}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-09T00:01:36.721 INFO:tasks.cephadm.ceph_manager.ceph:clean! 2026-03-09T00:01:36.721 INFO:tasks.ceph:Waiting until ceph cluster ceph is healthy... 2026-03-09T00:01:36.722 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy 2026-03-09T00:01:36.722 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph health --format=json 2026-03-09T00:01:36.854 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:37.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.080+0000 7f82dfaed700 1 -- 192.168.123.103:0/1131414978 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82d8073130 msgr2=0x7f82d8073510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:37.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.080+0000 7f82dfaed700 1 --2- 192.168.123.103:0/1131414978 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82d8073130 0x7f82d8073510 secure :-1 s=READY pgs=234 cs=0 l=1 rev1=1 crypto rx=0x7f82c8009ab0 tx=0x7f82c8009dc0 comp rx=0 tx=0).stop 2026-03-09T00:01:37.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.081+0000 7f82dfaed700 1 -- 192.168.123.103:0/1131414978 shutdown_connections 2026-03-09T00:01:37.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.081+0000 7f82dfaed700 1 --2- 192.168.123.103:0/1131414978 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82d8073a50 0x7f82d8111970 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:37.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.081+0000 7f82dfaed700 1 --2- 192.168.123.103:0/1131414978 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82d8073130 0x7f82d8073510 unknown :-1 s=CLOSED pgs=234 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:37.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.081+0000 7f82dfaed700 1 -- 192.168.123.103:0/1131414978 >> 192.168.123.103:0/1131414978 conn(0x7f82d80fc970 msgr2=0x7f82d80fed90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:37.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.081+0000 7f82dfaed700 1 -- 192.168.123.103:0/1131414978 shutdown_connections 2026-03-09T00:01:37.080 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.081+0000 7f82dfaed700 1 -- 192.168.123.103:0/1131414978 wait complete. 2026-03-09T00:01:37.081 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.082+0000 7f82dfaed700 1 Processor -- start 2026-03-09T00:01:37.081 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.082+0000 7f82dfaed700 1 -- start start 2026-03-09T00:01:37.081 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.082+0000 7f82dfaed700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82d8073130 0x7f82d819d190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:37.081 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.082+0000 7f82dfaed700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82d8073a50 0x7f82d819d6d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:37.081 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.082+0000 7f82dfaed700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f82d819ddb0 con 0x7f82d8073130 2026-03-09T00:01:37.081 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.082+0000 7f82dfaed700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f82d81a1b40 con 0x7f82d8073a50 2026-03-09T00:01:37.081 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.082+0000 7f82dd889700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82d8073130 0x7f82d819d190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:37.082 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.082+0000 7f82dd889700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82d8073130 0x7f82d819d190 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36336/0 (socket says 192.168.123.103:36336) 2026-03-09T00:01:37.082 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.082+0000 7f82dd889700 1 -- 192.168.123.103:0/2502048756 learned_addr learned my addr 192.168.123.103:0/2502048756 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:37.082 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.082+0000 7f82dd889700 1 -- 192.168.123.103:0/2502048756 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82d8073a50 msgr2=0x7f82d819d6d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:37.082 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.082+0000 7f82dd088700 1 --2- 192.168.123.103:0/2502048756 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82d8073a50 0x7f82d819d6d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload 
supported=3 required=0 2026-03-09T00:01:37.083 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.082+0000 7f82dd889700 1 --2- 192.168.123.103:0/2502048756 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82d8073a50 0x7f82d819d6d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:37.083 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.083+0000 7f82dd889700 1 -- 192.168.123.103:0/2502048756 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f82c8009710 con 0x7f82d8073130 2026-03-09T00:01:37.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.083+0000 7f82dd088700 1 --2- 192.168.123.103:0/2502048756 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82d8073a50 0x7f82d819d6d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:01:37.084 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.083+0000 7f82dd889700 1 --2- 192.168.123.103:0/2502048756 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82d8073130 0x7f82d819d190 secure :-1 s=READY pgs=235 cs=0 l=1 rev1=1 crypto rx=0x7f82c8000c00 tx=0x7f82c800f800 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:37.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.083+0000 7f82ceffd700 1 -- 192.168.123.103:0/2502048756 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f82c801d070 con 0x7f82d8073130 2026-03-09T00:01:37.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.083+0000 7f82dfaed700 1 -- 192.168.123.103:0/2502048756 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f82d81a1dc0 con 0x7f82d8073130 2026-03-09T00:01:37.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.083+0000 7f82dfaed700 1 -- 192.168.123.103:0/2502048756 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f82d81a2310 con 0x7f82d8073130 2026-03-09T00:01:37.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.084+0000 7f82ceffd700 1 -- 192.168.123.103:0/2502048756 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f82c800fb80 con 0x7f82d8073130 2026-03-09T00:01:37.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.084+0000 7f82ceffd700 1 -- 192.168.123.103:0/2502048756 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f82c8017760 con 0x7f82d8073130 2026-03-09T00:01:37.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.084+0000 7f82ceffd700 1 -- 192.168.123.103:0/2502048756 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f82c800fcf0 con 0x7f82d8073130 2026-03-09T00:01:37.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.084+0000 7f82dfaed700 1 -- 192.168.123.103:0/2502048756 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f82bc005320 con 0x7f82d8073130 2026-03-09T00:01:37.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.088+0000 7f82ceffd700 1 --2- 192.168.123.103:0/2502048756 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f82c406c4e0 0x7f82c406e9a0 unknown :-1 s=NONE pgs=0 cs=0 
l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:37.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.088+0000 7f82ceffd700 1 -- 192.168.123.103:0/2502048756 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f82c808c570 con 0x7f82d8073130 2026-03-09T00:01:37.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.088+0000 7f82ceffd700 1 -- 192.168.123.103:0/2502048756 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f82c808ea40 con 0x7f82d8073130 2026-03-09T00:01:37.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.088+0000 7f82dd088700 1 --2- 192.168.123.103:0/2502048756 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f82c406c4e0 0x7f82c406e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:37.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.089+0000 7f82dd088700 1 --2- 192.168.123.103:0/2502048756 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f82c406c4e0 0x7f82c406e9a0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f82d819e7b0 tx=0x7f82d4009500 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:37.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.209+0000 7f82dfaed700 1 -- 192.168.123.103:0/2502048756 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "format": "json"} v 0) v1 -- 0x7f82bc005f70 con 0x7f82d8073130 2026-03-09T00:01:37.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.211+0000 7f82ceffd700 1 -- 192.168.123.103:0/2502048756 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "format": "json"}]=0 v0) v1 ==== 72+0+46 (secure 0 0 0) 0x7f82c8057200 con 0x7f82d8073130 2026-03-09T00:01:37.210 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:01:37.210 INFO:teuthology.orchestra.run.vm03.stdout:{"status":"HEALTH_OK","checks":{},"mutes":[]} 2026-03-09T00:01:37.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.213+0000 7f82dfaed700 1 -- 192.168.123.103:0/2502048756 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f82c406c4e0 msgr2=0x7f82c406e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:37.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.213+0000 7f82dfaed700 1 --2- 192.168.123.103:0/2502048756 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f82c406c4e0 0x7f82c406e9a0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f82d819e7b0 tx=0x7f82d4009500 comp rx=0 tx=0).stop 2026-03-09T00:01:37.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.213+0000 7f82dfaed700 1 -- 192.168.123.103:0/2502048756 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82d8073130 msgr2=0x7f82d819d190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:37.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.213+0000 7f82dfaed700 1 --2- 192.168.123.103:0/2502048756 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82d8073130 0x7f82d819d190 secure :-1 s=READY pgs=235 cs=0 l=1 rev1=1 crypto rx=0x7f82c8000c00 tx=0x7f82c800f800 comp rx=0 tx=0).stop 2026-03-09T00:01:37.213 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.214+0000 7f82dfaed700 1 -- 192.168.123.103:0/2502048756 shutdown_connections 2026-03-09T00:01:37.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.214+0000 7f82dfaed700 1 --2- 192.168.123.103:0/2502048756 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f82c406c4e0 0x7f82c406e9a0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:37.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.214+0000 7f82dfaed700 1 --2- 192.168.123.103:0/2502048756 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82d8073130 0x7f82d819d190 unknown :-1 s=CLOSED pgs=235 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:37.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.214+0000 7f82dfaed700 1 --2- 192.168.123.103:0/2502048756 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82d8073a50 0x7f82d819d6d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:37.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.214+0000 7f82dfaed700 1 -- 192.168.123.103:0/2502048756 >> 192.168.123.103:0/2502048756 conn(0x7f82d80fc970 msgr2=0x7f82d8103480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:37.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.214+0000 7f82dfaed700 1 -- 192.168.123.103:0/2502048756 shutdown_connections 2026-03-09T00:01:37.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.214+0000 7f82dfaed700 1 -- 192.168.123.103:0/2502048756 wait complete. 2026-03-09T00:01:37.254 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy done 2026-03-09T00:01:37.255 INFO:tasks.cephadm:Setup complete, yielding 2026-03-09T00:01:37.255 INFO:teuthology.run_tasks:Running task print... 2026-03-09T00:01:37.256 INFO:teuthology.task.print:**** done end installing v18.2.1 cephadm ... 2026-03-09T00:01:37.256 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T00:01:37.258 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T00:01:37.258 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- bash -c 'ceph config set mgr mgr/cephadm/use_repo_digest true --force' 2026-03-09T00:01:37.392 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:37.467 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:37 vm03 ceph-mon[52346]: from='client.14440 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-09T00:01:37.467 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:37 vm03 ceph-mon[52346]: from='client.? 
192.168.123.103:0/2502048756' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-09T00:01:37.640 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.639+0000 7fd64c614700 1 -- 192.168.123.103:0/1905578892 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6440ff1c0 msgr2=0x7fd6440ff5e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:37.640 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.639+0000 7fd64c614700 1 --2- 192.168.123.103:0/1905578892 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6440ff1c0 0x7fd6440ff5e0 secure :-1 s=READY pgs=236 cs=0 l=1 rev1=1 crypto rx=0x7fd640009b00 tx=0x7fd640009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:37.640 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.641+0000 7fd64c614700 1 -- 192.168.123.103:0/1905578892 shutdown_connections 2026-03-09T00:01:37.640 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.641+0000 7fd64c614700 1 --2- 192.168.123.103:0/1905578892 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6440fff30 0x7fd6441003b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:37.640 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.641+0000 7fd64c614700 1 --2- 192.168.123.103:0/1905578892 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6440ff1c0 0x7fd6440ff5e0 unknown :-1 s=CLOSED pgs=236 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:37.640 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.641+0000 7fd64c614700 1 -- 192.168.123.103:0/1905578892 >> 192.168.123.103:0/1905578892 conn(0x7fd6440fab00 msgr2=0x7fd6440fcf60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:37.640 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.641+0000 7fd64c614700 1 -- 192.168.123.103:0/1905578892 shutdown_connections 2026-03-09T00:01:37.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.641+0000 7fd64c614700 1 -- 192.168.123.103:0/1905578892 wait complete. 
2026-03-09T00:01:37.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.641+0000 7fd64c614700 1 Processor -- start 2026-03-09T00:01:37.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.642+0000 7fd64c614700 1 -- start start 2026-03-09T00:01:37.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.642+0000 7fd64c614700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6440ff1c0 0x7fd644071d40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:37.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.642+0000 7fd64c614700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6440fff30 0x7fd644072280 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:37.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.642+0000 7fd64c614700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd6440727c0 con 0x7fd6440ff1c0 2026-03-09T00:01:37.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.642+0000 7fd64c614700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd644072900 con 0x7fd6440fff30 2026-03-09T00:01:37.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.642+0000 7fd64a3b0700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6440ff1c0 0x7fd644071d40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:37.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.642+0000 7fd64a3b0700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6440ff1c0 0x7fd644071d40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36346/0 (socket says 192.168.123.103:36346) 2026-03-09T00:01:37.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.642+0000 7fd64a3b0700 1 -- 192.168.123.103:0/839448048 learned_addr learned my addr 192.168.123.103:0/839448048 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:37.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.642+0000 7fd64a3b0700 1 -- 192.168.123.103:0/839448048 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6440fff30 msgr2=0x7fd644072280 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:01:37.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.642+0000 7fd64a3b0700 1 --2- 192.168.123.103:0/839448048 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6440fff30 0x7fd644072280 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:37.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.642+0000 7fd64a3b0700 1 -- 192.168.123.103:0/839448048 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd6400097e0 con 0x7fd6440ff1c0 2026-03-09T00:01:37.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.643+0000 7fd64a3b0700 1 --2- 192.168.123.103:0/839448048 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6440ff1c0 0x7fd644071d40 secure :-1 s=READY pgs=237 cs=0 l=1 rev1=1 crypto rx=0x7fd640005230 tx=0x7fd6400056c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:37.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.643+0000 7fd63b7fe700 1 -- 192.168.123.103:0/839448048 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd64001d070 con 0x7fd6440ff1c0 2026-03-09T00:01:37.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.643+0000 7fd64c614700 1 -- 192.168.123.103:0/839448048 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd6441a2540 con 0x7fd6440ff1c0 2026-03-09T00:01:37.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.643+0000 7fd64c614700 1 -- 192.168.123.103:0/839448048 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd6441a2a30 con 0x7fd6440ff1c0 2026-03-09T00:01:37.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.643+0000 7fd63b7fe700 1 -- 192.168.123.103:0/839448048 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd64000bc50 con 0x7fd6440ff1c0 2026-03-09T00:01:37.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.643+0000 7fd63b7fe700 1 -- 192.168.123.103:0/839448048 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd64000f800 con 0x7fd6440ff1c0 2026-03-09T00:01:37.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.644+0000 7fd63b7fe700 1 -- 192.168.123.103:0/839448048 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fd64000f960 con 0x7fd6440ff1c0 2026-03-09T00:01:37.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.644+0000 7fd63b7fe700 1 --2- 192.168.123.103:0/839448048 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd63006c530 0x7fd63006e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:37.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.644+0000 7fd63b7fe700 1 -- 192.168.123.103:0/839448048 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fd64008dd00 con 0x7fd6440ff1c0 2026-03-09T00:01:37.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.645+0000 7fd649baf700 1 --2- 192.168.123.103:0/839448048 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd63006c530 0x7fd63006e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:37.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.645+0000 7fd649baf700 1 --2- 192.168.123.103:0/839448048 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd63006c530 0x7fd63006e9f0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fd644072e30 tx=0x7fd634006cb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:37.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.646+0000 7fd64c614700 1 -- 192.168.123.103:0/839448048 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd628005320 con 0x7fd6440ff1c0 2026-03-09T00:01:37.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.648+0000 7fd63b7fe700 1 -- 192.168.123.103:0/839448048 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 
v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fd640058970 con 0x7fd6440ff1c0 2026-03-09T00:01:37.758 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.758+0000 7fd64c614700 1 -- 192.168.123.103:0/839448048 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/use_repo_digest}] v 0) v1 -- 0x7fd628005190 con 0x7fd6440ff1c0 2026-03-09T00:01:37.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.765+0000 7fd63b7fe700 1 -- 192.168.123.103:0/839448048 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/use_repo_digest}]=0 v14) v1 ==== 143+0+0 (secure 0 0 0) 0x7fd640027020 con 0x7fd6440ff1c0 2026-03-09T00:01:37.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.772+0000 7fd64c614700 1 -- 192.168.123.103:0/839448048 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd63006c530 msgr2=0x7fd63006e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:37.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.772+0000 7fd64c614700 1 --2- 192.168.123.103:0/839448048 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd63006c530 0x7fd63006e9f0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fd644072e30 tx=0x7fd634006cb0 comp rx=0 tx=0).stop 2026-03-09T00:01:37.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.772+0000 7fd64c614700 1 -- 192.168.123.103:0/839448048 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6440ff1c0 msgr2=0x7fd644071d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:37.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.772+0000 7fd64c614700 1 --2- 192.168.123.103:0/839448048 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6440ff1c0 0x7fd644071d40 secure :-1 s=READY pgs=237 cs=0 l=1 rev1=1 crypto rx=0x7fd640005230 tx=0x7fd6400056c0 comp rx=0 tx=0).stop 2026-03-09T00:01:37.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.773+0000 7fd64c614700 1 -- 192.168.123.103:0/839448048 shutdown_connections 2026-03-09T00:01:37.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.774+0000 7fd64c614700 1 --2- 192.168.123.103:0/839448048 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fd63006c530 0x7fd63006e9f0 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:37.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.774+0000 7fd64c614700 1 --2- 192.168.123.103:0/839448048 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6440ff1c0 0x7fd644071d40 unknown :-1 s=CLOSED pgs=237 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:37.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.774+0000 7fd64c614700 1 --2- 192.168.123.103:0/839448048 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6440fff30 0x7fd644072280 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:37.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.774+0000 7fd64c614700 1 -- 192.168.123.103:0/839448048 >> 192.168.123.103:0/839448048 conn(0x7fd6440fab00 msgr2=0x7fd6441085c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:37.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.774+0000 7fd64c614700 1 -- 192.168.123.103:0/839448048 shutdown_connections 2026-03-09T00:01:37.773 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:37.774+0000 7fd64c614700 1 -- 192.168.123.103:0/839448048 wait complete. 2026-03-09T00:01:37.822 INFO:teuthology.run_tasks:Running task print... 2026-03-09T00:01:37.824 INFO:teuthology.task.print:**** done cephadm.shell ceph config set mgr... 2026-03-09T00:01:37.824 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T00:01:37.826 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T00:01:37.826 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- bash -c 'ceph orch status' 2026-03-09T00:01:37.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:37 vm06 ceph-mon[58395]: from='client.14440 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-09T00:01:37.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:37 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/2502048756' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-09T00:01:37.981 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:38.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.229+0000 7f343ceff700 1 -- 192.168.123.103:0/2427602103 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3438103120 msgr2=0x7f3438103540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:38.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.229+0000 7f343ceff700 1 --2- 192.168.123.103:0/2427602103 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3438103120 0x7f3438103540 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7f3428009b00 tx=0x7f3428009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:38.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.230+0000 7f343ceff700 1 -- 192.168.123.103:0/2427602103 shutdown_connections 2026-03-09T00:01:38.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.230+0000 7f343ceff700 1 --2- 192.168.123.103:0/2427602103 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3438104320 0x7f3438104780 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:38.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.230+0000 7f343ceff700 1 --2- 192.168.123.103:0/2427602103 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3438103120 0x7f3438103540 unknown :-1 s=CLOSED pgs=238 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:38.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.230+0000 7f343ceff700 1 -- 192.168.123.103:0/2427602103 >> 192.168.123.103:0/2427602103 conn(0x7f34380fe6c0 msgr2=0x7f3438100b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:38.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.230+0000 7f343ceff700 1 -- 192.168.123.103:0/2427602103 shutdown_connections 2026-03-09T00:01:38.229 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.230+0000 7f343ceff700 1 -- 192.168.123.103:0/2427602103 wait complete. 
2026-03-09T00:01:38.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.231+0000 7f343ceff700 1 Processor -- start 2026-03-09T00:01:38.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.231+0000 7f343ceff700 1 -- start start 2026-03-09T00:01:38.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.231+0000 7f343ceff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3438103120 0x7f34381989a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:38.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.231+0000 7f343ceff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3438104320 0x7f3438198ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:38.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.231+0000 7f343ceff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3438199470 con 0x7f3438104320 2026-03-09T00:01:38.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.231+0000 7f343ceff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f34381995b0 con 0x7f3438103120 2026-03-09T00:01:38.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.231+0000 7f343659c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3438103120 0x7f34381989a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:38.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.231+0000 7f343659c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3438103120 0x7f34381989a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:38926/0 (socket says 192.168.123.103:38926) 2026-03-09T00:01:38.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.231+0000 7f343659c700 1 -- 192.168.123.103:0/378459535 learned_addr learned my addr 192.168.123.103:0/378459535 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:38.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.231+0000 7f3435d9b700 1 --2- 192.168.123.103:0/378459535 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3438104320 0x7f3438198ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:38.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.232+0000 7f343659c700 1 -- 192.168.123.103:0/378459535 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3438104320 msgr2=0x7f3438198ee0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:38.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.232+0000 7f343659c700 1 --2- 192.168.123.103:0/378459535 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3438104320 0x7f3438198ee0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:38.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.232+0000 7f343659c700 1 -- 192.168.123.103:0/378459535 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f34280097e0 con 
0x7f3438103120 2026-03-09T00:01:38.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.232+0000 7f3435d9b700 1 --2- 192.168.123.103:0/378459535 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3438104320 0x7f3438198ee0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T00:01:38.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.232+0000 7f343659c700 1 --2- 192.168.123.103:0/378459535 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3438103120 0x7f34381989a0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f3428004990 tx=0x7f34280049c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:38.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.232+0000 7f342f7fe700 1 -- 192.168.123.103:0/378459535 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f342801d070 con 0x7f3438103120 2026-03-09T00:01:38.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.232+0000 7f342f7fe700 1 -- 192.168.123.103:0/378459535 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f342800bc50 con 0x7f3438103120 2026-03-09T00:01:38.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.233+0000 7f343ceff700 1 -- 192.168.123.103:0/378459535 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f343819e010 con 0x7f3438103120 2026-03-09T00:01:38.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.233+0000 7f342f7fe700 1 -- 192.168.123.103:0/378459535 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f342800f7c0 con 0x7f3438103120 2026-03-09T00:01:38.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.233+0000 7f343ceff700 1 -- 192.168.123.103:0/378459535 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f343819e4d0 con 0x7f3438103120 2026-03-09T00:01:38.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.234+0000 7f343ceff700 1 -- 192.168.123.103:0/378459535 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3438066e80 con 0x7f3438103120 2026-03-09T00:01:38.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.235+0000 7f342f7fe700 1 -- 192.168.123.103:0/378459535 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f3428022ae0 con 0x7f3438103120 2026-03-09T00:01:38.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.235+0000 7f342f7fe700 1 --2- 192.168.123.103:0/378459535 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f342406c4e0 0x7f342406e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:38.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.235+0000 7f342f7fe700 1 -- 192.168.123.103:0/378459535 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f342808d730 con 0x7f3438103120 2026-03-09T00:01:38.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.235+0000 7f3435d9b700 1 --2- 192.168.123.103:0/378459535 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f342406c4e0 0x7f342406e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto 
rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:38.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.236+0000 7f3435d9b700 1 --2- 192.168.123.103:0/378459535 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f342406c4e0 0x7f342406e9a0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f342000afd0 tx=0x7f342000a380 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:38.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.237+0000 7f342f7fe700 1 -- 192.168.123.103:0/378459535 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f342805bf20 con 0x7f3438103120 2026-03-09T00:01:38.350 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.351+0000 7f343ceff700 1 -- 192.168.123.103:0/378459535 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3438108c70 con 0x7f342406c4e0 2026-03-09T00:01:38.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.352+0000 7f342f7fe700 1 -- 192.168.123.103:0/378459535 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+43 (secure 0 0 0) 0x7f3438108c70 con 0x7f342406c4e0 2026-03-09T00:01:38.351 INFO:teuthology.orchestra.run.vm03.stdout:Backend: cephadm 2026-03-09T00:01:38.351 INFO:teuthology.orchestra.run.vm03.stdout:Available: Yes 2026-03-09T00:01:38.351 INFO:teuthology.orchestra.run.vm03.stdout:Paused: No 2026-03-09T00:01:38.353 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.354+0000 7f343ceff700 1 -- 192.168.123.103:0/378459535 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f342406c4e0 msgr2=0x7f342406e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:38.353 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.354+0000 7f343ceff700 1 --2- 192.168.123.103:0/378459535 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f342406c4e0 0x7f342406e9a0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f342000afd0 tx=0x7f342000a380 comp rx=0 tx=0).stop 2026-03-09T00:01:38.353 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.354+0000 7f343ceff700 1 -- 192.168.123.103:0/378459535 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3438103120 msgr2=0x7f34381989a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:38.353 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.354+0000 7f343ceff700 1 --2- 192.168.123.103:0/378459535 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3438103120 0x7f34381989a0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f3428004990 tx=0x7f34280049c0 comp rx=0 tx=0).stop 2026-03-09T00:01:38.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.354+0000 7f343ceff700 1 -- 192.168.123.103:0/378459535 shutdown_connections 2026-03-09T00:01:38.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.354+0000 7f343ceff700 1 --2- 192.168.123.103:0/378459535 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f342406c4e0 0x7f342406e9a0 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:38.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.354+0000 7f343ceff700 1 --2- 192.168.123.103:0/378459535 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3438103120 0x7f34381989a0 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:38.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.354+0000 7f343ceff700 1 --2- 192.168.123.103:0/378459535 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3438104320 0x7f3438198ee0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:38.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.354+0000 7f343ceff700 1 -- 192.168.123.103:0/378459535 >> 192.168.123.103:0/378459535 conn(0x7f34380fe6c0 msgr2=0x7f3438107550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:38.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.355+0000 7f343ceff700 1 -- 192.168.123.103:0/378459535 shutdown_connections 2026-03-09T00:01:38.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.355+0000 7f343ceff700 1 -- 192.168.123.103:0/378459535 wait complete. 2026-03-09T00:01:38.415 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- bash -c 'ceph orch ps' 2026-03-09T00:01:38.547 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:38.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.773+0000 7fc7f49b2700 1 -- 192.168.123.103:0/342953827 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc7ec104360 msgr2=0x7fc7ec1047c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:38.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.773+0000 7fc7f49b2700 1 --2- 192.168.123.103:0/342953827 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc7ec104360 0x7fc7ec1047c0 secure :-1 s=READY pgs=239 cs=0 l=1 rev1=1 crypto rx=0x7fc7e0009b50 tx=0x7fc7e0009e60 comp rx=0 tx=0).stop 2026-03-09T00:01:38.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.774+0000 7fc7f49b2700 1 -- 192.168.123.103:0/342953827 shutdown_connections 2026-03-09T00:01:38.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.774+0000 7fc7f49b2700 1 --2- 192.168.123.103:0/342953827 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc7ec104360 0x7fc7ec1047c0 unknown :-1 s=CLOSED pgs=239 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:38.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.774+0000 7fc7f49b2700 1 --2- 192.168.123.103:0/342953827 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc7ec103160 0x7fc7ec103580 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:38.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.774+0000 7fc7f49b2700 1 -- 192.168.123.103:0/342953827 >> 192.168.123.103:0/342953827 conn(0x7fc7ec0fe700 msgr2=0x7fc7ec100b40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:38.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.774+0000 7fc7f49b2700 1 -- 192.168.123.103:0/342953827 shutdown_connections 2026-03-09T00:01:38.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.774+0000 7fc7f49b2700 1 -- 192.168.123.103:0/342953827 wait complete. 
2026-03-09T00:01:38.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.774+0000 7fc7f49b2700 1 Processor -- start 2026-03-09T00:01:38.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.774+0000 7fc7f49b2700 1 -- start start 2026-03-09T00:01:38.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.775+0000 7fc7f49b2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc7ec103160 0x7fc7ec198a60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:38.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.775+0000 7fc7f49b2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc7ec104360 0x7fc7ec198fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:38.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.775+0000 7fc7f49b2700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc7ec1995c0 con 0x7fc7ec104360 2026-03-09T00:01:38.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.775+0000 7fc7f49b2700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc7ec199700 con 0x7fc7ec103160 2026-03-09T00:01:38.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.775+0000 7fc7f1f4d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc7ec104360 0x7fc7ec198fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:38.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.775+0000 7fc7f1f4d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc7ec104360 0x7fc7ec198fa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36388/0 (socket says 192.168.123.103:36388) 2026-03-09T00:01:38.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.775+0000 7fc7f1f4d700 1 -- 192.168.123.103:0/3038093991 learned_addr learned my addr 192.168.123.103:0/3038093991 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:38.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.775+0000 7fc7f1f4d700 1 -- 192.168.123.103:0/3038093991 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc7ec103160 msgr2=0x7fc7ec198a60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:38.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.775+0000 7fc7f274e700 1 --2- 192.168.123.103:0/3038093991 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc7ec103160 0x7fc7ec198a60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:38.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.775+0000 7fc7f1f4d700 1 --2- 192.168.123.103:0/3038093991 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc7ec103160 0x7fc7ec198a60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:38.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.775+0000 7fc7f1f4d700 1 -- 192.168.123.103:0/3038093991 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fc7e00097e0 con 0x7fc7ec104360 2026-03-09T00:01:38.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.776+0000 7fc7f1f4d700 1 --2- 192.168.123.103:0/3038093991 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc7ec104360 0x7fc7ec198fa0 secure :-1 s=READY pgs=240 cs=0 l=1 rev1=1 crypto rx=0x7fc7e0004ce0 tx=0x7fc7e0005f00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:38.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.776+0000 7fc7e77fe700 1 -- 192.168.123.103:0/3038093991 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc7e001d070 con 0x7fc7ec104360 2026-03-09T00:01:38.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.776+0000 7fc7e77fe700 1 -- 192.168.123.103:0/3038093991 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc7e000bcc0 con 0x7fc7ec104360 2026-03-09T00:01:38.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.776+0000 7fc7e77fe700 1 -- 192.168.123.103:0/3038093991 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc7e000f810 con 0x7fc7ec104360 2026-03-09T00:01:38.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.776+0000 7fc7f49b2700 1 -- 192.168.123.103:0/3038093991 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc7ec19e150 con 0x7fc7ec104360 2026-03-09T00:01:38.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.776+0000 7fc7f49b2700 1 -- 192.168.123.103:0/3038093991 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc7ec19e640 con 0x7fc7ec104360 2026-03-09T00:01:38.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.776+0000 7fc7f274e700 1 --2- 192.168.123.103:0/3038093991 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc7ec103160 0x7fc7ec198a60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-09T00:01:38.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.777+0000 7fc7f49b2700 1 -- 192.168.123.103:0/3038093991 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc7ec066e80 con 0x7fc7ec104360 2026-03-09T00:01:38.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.779+0000 7fc7e77fe700 1 -- 192.168.123.103:0/3038093991 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fc7e0022b50 con 0x7fc7ec104360 2026-03-09T00:01:38.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.779+0000 7fc7e77fe700 1 --2- 192.168.123.103:0/3038093991 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc7dc06c4e0 0x7fc7dc06e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:38.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.779+0000 7fc7e77fe700 1 -- 192.168.123.103:0/3038093991 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fc7e008d420 con 0x7fc7ec104360 2026-03-09T00:01:38.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.781+0000 7fc7e77fe700 1 -- 192.168.123.103:0/3038093991 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fc7e005baf0 con 0x7fc7ec104360 2026-03-09T00:01:38.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.781+0000 7fc7f274e700 1 --2- 192.168.123.103:0/3038093991 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc7dc06c4e0 0x7fc7dc06e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:38.781 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.782+0000 7fc7f274e700 1 --2- 192.168.123.103:0/3038093991 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc7dc06c4e0 0x7fc7dc06e9a0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fc7d8005fd0 tx=0x7fc7d8005f00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:38.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.887+0000 7fc7f49b2700 1 -- 192.168.123.103:0/3038093991 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fc7ec108cb0 con 0x7fc7dc06c4e0 2026-03-09T00:01:38.887 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:38 vm03 ceph-mon[52346]: from='client.? 
192.168.123.103:0/839448048' entity='client.admin'
2026-03-09T00:01:38.887 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:38 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:01:38.887 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:38 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:01:38.887 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:38 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:01:38.887 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:38 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:01:38.887 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:38 vm03 ceph-mon[52346]: pgmap v70: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-09T00:01:38.887 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:38 vm03 ceph-mon[52346]: from='client.24249 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.892+0000 7fc7e77fe700 1 -- 192.168.123.103:0/3038093991 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+2640 (secure 0 0 0) 0x7fc7ec108cb0 con 0x7fc7dc06c4e0
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (76s) 43s ago 2m 22.5M - 0.25.0 c8568f914cd2 9b05d2f3502a
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (2m) 43s ago 2m 7801k - 18.2.1 5be31c24972a b93f8a220f71
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (93s) 16s ago 93s 8078k - 18.2.1 5be31c24972a d06aea65065e
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (2m) 43s ago 2m 7402k - 18.2.1 5be31c24972a 320f8ef2d2cb
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (92s) 16s ago 92s 7411k - 18.2.1 5be31c24972a d9eb9a54d81d
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (71s) 43s ago 110s 77.3M - 9.4.7 954c08fa6188 9db2e5805e97
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:9283,8765,8443 running (2m) 43s ago 2m 491M - 18.2.1 5be31c24972a e48c90025d56
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (88s) 16s ago 88s 449M - 18.2.1 5be31c24972a 4c6a564e9efa
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (2m) 43s ago 2m 45.1M 2048M 18.2.1 5be31c24972a f9863944dcfb
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (87s) 16s ago 87s 44.8M 2048M 18.2.1 5be31c24972a 1e39c7ad3e9f
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (2m) 43s ago 2m 13.8M - 1.5.0 0da6a335fe13 750af7597536
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (89s) 16s ago 89s 14.0M - 1.5.0 0da6a335fe13 a82b7dc84593
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (68s) 43s ago 68s 41.3M 4096M 18.2.1 5be31c24972a 7582c56d43e3
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (58s) 43s ago 58s 40.7M 4096M 18.2.1 5be31c24972a 7bc729875521
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (48s) 43s ago 48s 36.7M 4096M 18.2.1 5be31c24972a 00566abbcc16
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (37s) 16s ago 37s 43.4M 4096M 18.2.1 5be31c24972a 1ece32056ab6
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (27s) 16s ago 27s 39.1M 4096M 18.2.1 5be31c24972a ee6260a1124c
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (17s) 16s ago 17s 15.0M 4096M 18.2.1 5be31c24972a f51e8cd94301
2026-03-09T00:01:38.892 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (70s) 43s ago 104s 30.3M - 2.43.0 a07b618ecd1d a4a1b4f06180
2026-03-09T00:01:38.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.894+0000 7fc7f49b2700 1 -- 192.168.123.103:0/3038093991 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc7dc06c4e0 msgr2=0x7fc7dc06e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:38.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.894+0000 7fc7f49b2700 1 --2- 192.168.123.103:0/3038093991 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc7dc06c4e0 0x7fc7dc06e9a0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fc7d8005fd0 tx=0x7fc7d8005f00 comp rx=0 tx=0).stop
2026-03-09T00:01:38.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.894+0000 7fc7f49b2700 1 -- 192.168.123.103:0/3038093991 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc7ec104360 msgr2=0x7fc7ec198fa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:38.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.894+0000 7fc7f49b2700 1 --2- 192.168.123.103:0/3038093991 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc7ec104360 0x7fc7ec198fa0 secure :-1 s=READY pgs=240 cs=0 l=1 rev1=1 crypto rx=0x7fc7e0004ce0 tx=0x7fc7e0005f00 comp rx=0 tx=0).stop
2026-03-09T00:01:38.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.895+0000 7fc7f49b2700 1 -- 192.168.123.103:0/3038093991 shutdown_connections
2026-03-09T00:01:38.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.895+0000 7fc7f49b2700 1 --2- 192.168.123.103:0/3038093991 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc7dc06c4e0 0x7fc7dc06e9a0 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:38.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.895+0000 7fc7f49b2700 1 --2- 192.168.123.103:0/3038093991 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc7ec103160 0x7fc7ec198a60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:38.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.895+0000 7fc7f49b2700 1 --2- 192.168.123.103:0/3038093991 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc7ec104360 0x7fc7ec198fa0 unknown :-1 s=CLOSED pgs=240 cs=0 l=1 rev1=1 crypto
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:38.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.895+0000 7fc7f49b2700 1 -- 192.168.123.103:0/3038093991 >> 192.168.123.103:0/3038093991 conn(0x7fc7ec0fe700 msgr2=0x7fc7ec107590 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:38.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.895+0000 7fc7f49b2700 1 -- 192.168.123.103:0/3038093991 shutdown_connections 2026-03-09T00:01:38.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:38.895+0000 7fc7f49b2700 1 -- 192.168.123.103:0/3038093991 wait complete. 2026-03-09T00:01:38.932 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- bash -c 'ceph orch ls' 2026-03-09T00:01:39.069 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:38 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/839448048' entity='client.admin' 2026-03-09T00:01:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:38 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:01:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:38 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:38 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:01:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:38 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:38 vm06 ceph-mon[58395]: pgmap v70: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:01:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:38 vm06 ceph-mon[58395]: from='client.24249 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:01:39.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.296+0000 7f79b211f700 1 -- 192.168.123.103:0/2253191268 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac0737f0 msgr2=0x7f79ac073c70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:39.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.296+0000 7f79b211f700 1 --2- 192.168.123.103:0/2253191268 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac0737f0 0x7f79ac073c70 secure :-1 s=READY pgs=241 cs=0 l=1 rev1=1 crypto rx=0x7f799c009b50 tx=0x7f799c009e60 comp rx=0 tx=0).stop 2026-03-09T00:01:39.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.297+0000 7f79b211f700 1 -- 192.168.123.103:0/2253191268 shutdown_connections 2026-03-09T00:01:39.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.297+0000 7f79b211f700 1 --2- 192.168.123.103:0/2253191268 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac0737f0 0x7f79ac073c70 unknown :-1 s=CLOSED pgs=241 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-09T00:01:39.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.297+0000 7f79b211f700 1 --2- 192.168.123.103:0/2253191268 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f79ac074dc0 0x7f79ac073220 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:39.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.297+0000 7f79b211f700 1 -- 192.168.123.103:0/2253191268 >> 192.168.123.103:0/2253191268 conn(0x7f79ac0fc4d0 msgr2=0x7f79ac0fe930 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:39.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.298+0000 7f79b211f700 1 -- 192.168.123.103:0/2253191268 shutdown_connections 2026-03-09T00:01:39.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.298+0000 7f79b211f700 1 -- 192.168.123.103:0/2253191268 wait complete. 2026-03-09T00:01:39.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.298+0000 7f79b211f700 1 Processor -- start 2026-03-09T00:01:39.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.298+0000 7f79b211f700 1 -- start start 2026-03-09T00:01:39.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.298+0000 7f79b211f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac0737f0 0x7f79ac198aa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:39.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.298+0000 7f79b211f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f79ac074dc0 0x7f79ac198fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:39.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.298+0000 7f79b211f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f79ac199600 con 0x7f79ac0737f0 2026-03-09T00:01:39.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.298+0000 7f79b211f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f79ac199740 con 0x7f79ac074dc0 2026-03-09T00:01:39.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.299+0000 7f79ab7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac0737f0 0x7f79ac198aa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:39.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.299+0000 7f79ab7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac0737f0 0x7f79ac198aa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36402/0 (socket says 192.168.123.103:36402) 2026-03-09T00:01:39.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.299+0000 7f79ab7fe700 1 -- 192.168.123.103:0/1005284854 learned_addr learned my addr 192.168.123.103:0/1005284854 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:39.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.299+0000 7f79ab7fe700 1 -- 192.168.123.103:0/1005284854 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f79ac074dc0 msgr2=0x7f79ac198fe0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:01:39.299 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.299+0000 7f79aaffd700 1 --2- 192.168.123.103:0/1005284854 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f79ac074dc0 0x7f79ac198fe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:39.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.299+0000 7f79ab7fe700 1 --2- 192.168.123.103:0/1005284854 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f79ac074dc0 0x7f79ac198fe0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:39.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.299+0000 7f79ab7fe700 1 -- 192.168.123.103:0/1005284854 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f799c0097e0 con 0x7f79ac0737f0 2026-03-09T00:01:39.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.299+0000 7f79ab7fe700 1 --2- 192.168.123.103:0/1005284854 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac0737f0 0x7f79ac198aa0 secure :-1 s=READY pgs=242 cs=0 l=1 rev1=1 crypto rx=0x7f799400d900 tx=0x7f799400dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:39.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.300+0000 7f79a8ff9700 1 -- 192.168.123.103:0/1005284854 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f79940098e0 con 0x7f79ac0737f0 2026-03-09T00:01:39.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.300+0000 7f79a8ff9700 1 -- 192.168.123.103:0/1005284854 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7994010460 con 0x7f79ac0737f0 2026-03-09T00:01:39.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.300+0000 7f79a8ff9700 1 -- 192.168.123.103:0/1005284854 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f799400f5d0 con 0x7f79ac0737f0 2026-03-09T00:01:39.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.300+0000 7f79b211f700 1 -- 192.168.123.103:0/1005284854 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f79ac19e1f0 con 0x7f79ac0737f0 2026-03-09T00:01:39.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.300+0000 7f79b211f700 1 -- 192.168.123.103:0/1005284854 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f79ac100c70 con 0x7f79ac0737f0 2026-03-09T00:01:39.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.301+0000 7f79a8ff9700 1 -- 192.168.123.103:0/1005284854 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f799400f730 con 0x7f79ac0737f0 2026-03-09T00:01:39.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.301+0000 7f79b211f700 1 -- 192.168.123.103:0/1005284854 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f79ac10be70 con 0x7f79ac0737f0 2026-03-09T00:01:39.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.301+0000 7f79a8ff9700 1 --2- 192.168.123.103:0/1005284854 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f799806c4e0 0x7f799806e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto 
rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:01:39.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.301+0000 7f79a8ff9700 1 -- 192.168.123.103:0/1005284854 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f799408ab20 con 0x7f79ac0737f0
2026-03-09T00:01:39.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.303+0000 7f79aaffd700 1 --2- 192.168.123.103:0/1005284854 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f799806c4e0 0x7f799806e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:01:39.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.304+0000 7f79aaffd700 1 --2- 192.168.123.103:0/1005284854 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f799806c4e0 0x7f799806e9a0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f799c006010 tx=0x7f799c0058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:01:39.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.304+0000 7f79a8ff9700 1 -- 192.168.123.103:0/1005284854 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f799405a2f0 con 0x7f79ac0737f0
2026-03-09T00:01:39.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.407+0000 7f79b211f700 1 -- 192.168.123.103:0/1005284854 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f79ac108450 con 0x7f799806c4e0
2026-03-09T00:01:39.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.410+0000 7f79a8ff9700 1 -- 192.168.123.103:0/1005284854 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1150 (secure 0 0 0) 0x7f79ac108450 con 0x7f799806c4e0
2026-03-09T00:01:39.410 INFO:teuthology.orchestra.run.vm03.stdout:NAME PORTS RUNNING REFRESHED AGE PLACEMENT
2026-03-09T00:01:39.410 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager ?:9093,9094 1/1 43s ago 2m count:1
2026-03-09T00:01:39.410 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter 2/2 43s ago 2m *
2026-03-09T00:01:39.410 INFO:teuthology.orchestra.run.vm03.stdout:crash 2/2 43s ago 2m *
2026-03-09T00:01:39.410 INFO:teuthology.orchestra.run.vm03.stdout:grafana ?:3000 1/1 43s ago 2m count:1
2026-03-09T00:01:39.410 INFO:teuthology.orchestra.run.vm03.stdout:mgr 2/2 43s ago 2m count:2
2026-03-09T00:01:39.410 INFO:teuthology.orchestra.run.vm03.stdout:mon 2/2 43s ago 2m vm03:192.168.123.103=vm03;vm06:192.168.123.106=vm06;count:2
2026-03-09T00:01:39.410 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter ?:9100 2/2 43s ago 2m *
2026-03-09T00:01:39.410 INFO:teuthology.orchestra.run.vm03.stdout:osd 6 43s ago -
2026-03-09T00:01:39.410 INFO:teuthology.orchestra.run.vm03.stdout:prometheus ?:9095 1/1 43s ago 2m count:1
2026-03-09T00:01:39.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.412+0000 7f79b211f700 1 -- 192.168.123.103:0/1005284854 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f799806c4e0 msgr2=0x7f799806e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:39.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.412+0000 7f79b211f700 1 --2- 192.168.123.103:0/1005284854 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2]
conn(0x7f799806c4e0 0x7f799806e9a0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f799c006010 tx=0x7f799c0058e0 comp rx=0 tx=0).stop 2026-03-09T00:01:39.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.412+0000 7f79b211f700 1 -- 192.168.123.103:0/1005284854 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac0737f0 msgr2=0x7f79ac198aa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:39.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.412+0000 7f79b211f700 1 --2- 192.168.123.103:0/1005284854 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac0737f0 0x7f79ac198aa0 secure :-1 s=READY pgs=242 cs=0 l=1 rev1=1 crypto rx=0x7f799400d900 tx=0x7f799400dcc0 comp rx=0 tx=0).stop 2026-03-09T00:01:39.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.412+0000 7f79b211f700 1 -- 192.168.123.103:0/1005284854 shutdown_connections 2026-03-09T00:01:39.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.412+0000 7f79b211f700 1 --2- 192.168.123.103:0/1005284854 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f799806c4e0 0x7f799806e9a0 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:39.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.412+0000 7f79b211f700 1 --2- 192.168.123.103:0/1005284854 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac0737f0 0x7f79ac198aa0 unknown :-1 s=CLOSED pgs=242 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:39.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.412+0000 7f79b211f700 1 --2- 192.168.123.103:0/1005284854 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f79ac074dc0 0x7f79ac198fe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:39.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.412+0000 7f79b211f700 1 -- 192.168.123.103:0/1005284854 >> 192.168.123.103:0/1005284854 conn(0x7f79ac0fc4d0 msgr2=0x7f79ac106d30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:39.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.412+0000 7f79b211f700 1 -- 192.168.123.103:0/1005284854 shutdown_connections 2026-03-09T00:01:39.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.412+0000 7f79b211f700 1 -- 192.168.123.103:0/1005284854 wait complete. 
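
Each of the 'ceph orch ...' probes above is executed through the cephadm shell wrapper visible in the DEBUG lines: the cephadm binary staged under /home/ubuntu/cephtest pins the container image and the cluster fsid, then runs the command via bash -c. A rough Python equivalent of that invocation pattern (illustrative only, not teuthology's actual helper; paths, image and fsid are copied from the log):

    import subprocess

    FSID = 'ae8f0172-1b4a-11f1-916a-712b2ac006b7'
    IMAGE = 'quay.io/ceph/ceph:v18.2.1'

    def cephadm_shell(cmd):
        # Mirrors the 'sudo .../cephadm --image ... shell -c ... -k ...
        # --fsid ... -- bash -c <cmd>' lines logged by
        # teuthology.orchestra.run above.
        argv = ['sudo', '/home/ubuntu/cephtest/cephadm', '--image', IMAGE,
                'shell', '-c', '/etc/ceph/ceph.conf',
                '-k', '/etc/ceph/ceph.client.admin.keyring',
                '--fsid', FSID, '--', 'bash', '-c', cmd]
        return subprocess.check_output(argv, text=True)

    print(cephadm_shell('ceph orch ps'))

Running inside the shell container keeps the client binaries at exactly the pinned 18.2.1 version regardless of what is installed on the host.
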
2026-03-09T00:01:39.469 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- bash -c 'ceph orch host ls' 2026-03-09T00:01:39.600 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:39.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.827+0000 7fe572331700 1 -- 192.168.123.103:0/614867885 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe56c103140 msgr2=0x7fe56c103560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:39.827 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.827+0000 7fe572331700 1 --2- 192.168.123.103:0/614867885 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe56c103140 0x7fe56c103560 secure :-1 s=READY pgs=243 cs=0 l=1 rev1=1 crypto rx=0x7fe554009b50 tx=0x7fe554009e60 comp rx=0 tx=0).stop 2026-03-09T00:01:39.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.828+0000 7fe572331700 1 -- 192.168.123.103:0/614867885 shutdown_connections 2026-03-09T00:01:39.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.828+0000 7fe572331700 1 --2- 192.168.123.103:0/614867885 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe56c104340 0x7fe56c1047a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:39.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.828+0000 7fe572331700 1 --2- 192.168.123.103:0/614867885 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe56c103140 0x7fe56c103560 unknown :-1 s=CLOSED pgs=243 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:39.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.828+0000 7fe572331700 1 -- 192.168.123.103:0/614867885 >> 192.168.123.103:0/614867885 conn(0x7fe56c0fe6c0 msgr2=0x7fe56c100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:39.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.829+0000 7fe572331700 1 -- 192.168.123.103:0/614867885 shutdown_connections 2026-03-09T00:01:39.828 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.829+0000 7fe572331700 1 -- 192.168.123.103:0/614867885 wait complete. 
2026-03-09T00:01:39.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.829+0000 7fe572331700 1 Processor -- start 2026-03-09T00:01:39.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.830+0000 7fe572331700 1 -- start start 2026-03-09T00:01:39.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.830+0000 7fe572331700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe56c103140 0x7fe56c19ce80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:39.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.830+0000 7fe572331700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe56c104340 0x7fe56c19d3c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:39.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.830+0000 7fe56bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe56c103140 0x7fe56c19ce80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:39.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.830+0000 7fe56bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe56c103140 0x7fe56c19ce80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36420/0 (socket says 192.168.123.103:36420) 2026-03-09T00:01:39.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.830+0000 7fe572331700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe56c19d9e0 con 0x7fe56c103140 2026-03-09T00:01:39.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.830+0000 7fe572331700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe56c19db20 con 0x7fe56c104340 2026-03-09T00:01:39.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.830+0000 7fe56bfff700 1 -- 192.168.123.103:0/3954790846 learned_addr learned my addr 192.168.123.103:0/3954790846 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:39.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.830+0000 7fe56b7fe700 1 --2- 192.168.123.103:0/3954790846 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe56c104340 0x7fe56c19d3c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:39.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.831+0000 7fe56bfff700 1 -- 192.168.123.103:0/3954790846 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe56c104340 msgr2=0x7fe56c19d3c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:39.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.831+0000 7fe56bfff700 1 --2- 192.168.123.103:0/3954790846 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe56c104340 0x7fe56c19d3c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:39.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.831+0000 7fe56bfff700 1 -- 192.168.123.103:0/3954790846 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fe5540097e0 con 0x7fe56c103140 2026-03-09T00:01:39.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.831+0000 7fe56b7fe700 1 --2- 192.168.123.103:0/3954790846 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe56c104340 0x7fe56c19d3c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T00:01:39.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.831+0000 7fe56bfff700 1 --2- 192.168.123.103:0/3954790846 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe56c103140 0x7fe56c19ce80 secure :-1 s=READY pgs=244 cs=0 l=1 rev1=1 crypto rx=0x7fe554009b20 tx=0x7fe5540052a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:39.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.831+0000 7fe5697fa700 1 -- 192.168.123.103:0/3954790846 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe55401d070 con 0x7fe56c103140 2026-03-09T00:01:39.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.831+0000 7fe5697fa700 1 -- 192.168.123.103:0/3954790846 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fe554022470 con 0x7fe56c103140 2026-03-09T00:01:39.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.831+0000 7fe5697fa700 1 -- 192.168.123.103:0/3954790846 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe55400f670 con 0x7fe56c103140 2026-03-09T00:01:39.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.831+0000 7fe572331700 1 -- 192.168.123.103:0/3954790846 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe56c1a2570 con 0x7fe56c103140 2026-03-09T00:01:39.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.831+0000 7fe572331700 1 -- 192.168.123.103:0/3954790846 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe56c1a2a60 con 0x7fe56c103140 2026-03-09T00:01:39.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.833+0000 7fe5697fa700 1 -- 192.168.123.103:0/3954790846 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fe55400f7d0 con 0x7fe56c103140 2026-03-09T00:01:39.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.833+0000 7fe572331700 1 -- 192.168.123.103:0/3954790846 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe56c066e80 con 0x7fe56c103140 2026-03-09T00:01:39.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.836+0000 7fe5697fa700 1 --2- 192.168.123.103:0/3954790846 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe55806c4e0 0x7fe55806e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:39.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.836+0000 7fe56b7fe700 1 --2- 192.168.123.103:0/3954790846 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe55806c4e0 0x7fe55806e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:39.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.837+0000 7fe5697fa700 1 -- 192.168.123.103:0/3954790846 <== mon.0 
v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fe55408c840 con 0x7fe56c103140
2026-03-09T00:01:39.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.837+0000 7fe56b7fe700 1 --2- 192.168.123.103:0/3954790846 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe55806c4e0 0x7fe55806e9a0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7fe55c009fd0 tx=0x7fe55c009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:01:39.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.837+0000 7fe5697fa700 1 -- 192.168.123.103:0/3954790846 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fe55405bb80 con 0x7fe56c103140
2026-03-09T00:01:39.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.948+0000 7fe572331700 1 -- 192.168.123.103:0/3954790846 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""]}) v1 -- 0x7fe56c108c90 con 0x7fe55806c4e0
2026-03-09T00:01:39.949 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.949+0000 7fe5697fa700 1 -- 192.168.123.103:0/3954790846 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+139 (secure 0 0 0) 0x7fe56c108c90 con 0x7fe55806c4e0
2026-03-09T00:01:39.949 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:39 vm03 ceph-mon[52346]: from='client.14456 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:01:39.949 INFO:teuthology.orchestra.run.vm03.stdout:HOST ADDR LABELS STATUS
2026-03-09T00:01:39.949 INFO:teuthology.orchestra.run.vm03.stdout:vm03 192.168.123.103
2026-03-09T00:01:39.949 INFO:teuthology.orchestra.run.vm03.stdout:vm06 192.168.123.106
2026-03-09T00:01:39.949 INFO:teuthology.orchestra.run.vm03.stdout:2 hosts in cluster
2026-03-09T00:01:39.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.952+0000 7fe572331700 1 -- 192.168.123.103:0/3954790846 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe55806c4e0 msgr2=0x7fe55806e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:39.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.952+0000 7fe572331700 1 --2- 192.168.123.103:0/3954790846 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe55806c4e0 0x7fe55806e9a0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7fe55c009fd0 tx=0x7fe55c009450 comp rx=0 tx=0).stop
2026-03-09T00:01:39.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.952+0000 7fe572331700 1 -- 192.168.123.103:0/3954790846 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe56c103140 msgr2=0x7fe56c19ce80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:39.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.952+0000 7fe572331700 1 --2- 192.168.123.103:0/3954790846 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe56c103140 0x7fe56c19ce80 secure :-1 s=READY pgs=244 cs=0 l=1 rev1=1 crypto rx=0x7fe554009b20 tx=0x7fe5540052a0 comp rx=0 tx=0).stop
2026-03-09T00:01:39.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.953+0000 7fe572331700 1 -- 192.168.123.103:0/3954790846 shutdown_connections
2026-03-09T00:01:39.952
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.953+0000 7fe572331700 1 --2- 192.168.123.103:0/3954790846 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fe55806c4e0 0x7fe55806e9a0 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:39.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.953+0000 7fe572331700 1 --2- 192.168.123.103:0/3954790846 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe56c103140 0x7fe56c19ce80 unknown :-1 s=CLOSED pgs=244 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:39.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.953+0000 7fe572331700 1 --2- 192.168.123.103:0/3954790846 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe56c104340 0x7fe56c19d3c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:39.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.953+0000 7fe572331700 1 -- 192.168.123.103:0/3954790846 >> 192.168.123.103:0/3954790846 conn(0x7fe56c0fe6c0 msgr2=0x7fe56c107570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:39.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.953+0000 7fe572331700 1 -- 192.168.123.103:0/3954790846 shutdown_connections 2026-03-09T00:01:39.953 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:39.953+0000 7fe572331700 1 -- 192.168.123.103:0/3954790846 wait complete. 2026-03-09T00:01:40.016 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- bash -c 'ceph orch device ls' 2026-03-09T00:01:40.160 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:40.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:39 vm06 ceph-mon[58395]: from='client.14456 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:01:40.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.594+0000 7f14cdd7d700 1 -- 192.168.123.103:0/1774217058 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14c8074dc0 msgr2=0x7f14c8073220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:40.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.595+0000 7f14c67fc700 1 -- 192.168.123.103:0/1774217058 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f14b000ba40 con 0x7f14c8074dc0 2026-03-09T00:01:40.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.595+0000 7f14cdd7d700 1 --2- 192.168.123.103:0/1774217058 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14c8074dc0 0x7f14c8073220 secure :-1 s=READY pgs=245 cs=0 l=1 rev1=1 crypto rx=0x7f14b0009b00 tx=0x7f14b0009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:40.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.595+0000 7f14cdd7d700 1 -- 192.168.123.103:0/1774217058 shutdown_connections 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.595+0000 7f14cdd7d700 1 --2- 192.168.123.103:0/1774217058 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14c80737f0 0x7f14c8073c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:40.608 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.595+0000 7f14cdd7d700 1 --2- 192.168.123.103:0/1774217058 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14c8074dc0 0x7f14c8073220 unknown :-1 s=CLOSED pgs=245 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.595+0000 7f14cdd7d700 1 -- 192.168.123.103:0/1774217058 >> 192.168.123.103:0/1774217058 conn(0x7f14c80fc460 msgr2=0x7f14c80fe8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.595+0000 7f14cdd7d700 1 -- 192.168.123.103:0/1774217058 shutdown_connections 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.595+0000 7f14cdd7d700 1 -- 192.168.123.103:0/1774217058 wait complete. 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.596+0000 7f14cdd7d700 1 Processor -- start 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.596+0000 7f14cdd7d700 1 -- start start 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.596+0000 7f14cdd7d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14c80737f0 0x7f14c819ce40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.596+0000 7f14cdd7d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14c8074dc0 0x7f14c819d380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.596+0000 7f14cdd7d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f14c819d9a0 con 0x7f14c8074dc0 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.596+0000 7f14cdd7d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f14c819dae0 con 0x7f14c80737f0 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.596+0000 7f14c77fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14c80737f0 0x7f14c819ce40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.596+0000 7f14c6ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14c8074dc0 0x7f14c819d380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.596+0000 7f14c77fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14c80737f0 0x7f14c819ce40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:39004/0 (socket says 192.168.123.103:39004) 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.596+0000 7f14c77fe700 1 -- 192.168.123.103:0/4291549869 learned_addr learned my addr 192.168.123.103:0/4291549869 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:40.608 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.596+0000 7f14c6ffd700 1 -- 192.168.123.103:0/4291549869 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14c80737f0 msgr2=0x7f14c819ce40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.596+0000 7f14c6ffd700 1 --2- 192.168.123.103:0/4291549869 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14c80737f0 0x7f14c819ce40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.596+0000 7f14c6ffd700 1 -- 192.168.123.103:0/4291549869 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f14b00097e0 con 0x7f14c8074dc0 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.596+0000 7f14c6ffd700 1 --2- 192.168.123.103:0/4291549869 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14c8074dc0 0x7f14c819d380 secure :-1 s=READY pgs=246 cs=0 l=1 rev1=1 crypto rx=0x7f14b800b810 tx=0x7f14b800bb20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.597+0000 7f14c4ff9700 1 -- 192.168.123.103:0/4291549869 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f14b800d610 con 0x7f14c8074dc0 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.597+0000 7f14c4ff9700 1 -- 192.168.123.103:0/4291549869 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f14b800dc50 con 0x7f14c8074dc0 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.597+0000 7f14c4ff9700 1 -- 192.168.123.103:0/4291549869 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f14b8017400 con 0x7f14c8074dc0 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.597+0000 7f14cdd7d700 1 -- 192.168.123.103:0/4291549869 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f14c81a2590 con 0x7f14c8074dc0 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.597+0000 7f14cdd7d700 1 -- 192.168.123.103:0/4291549869 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f14c81a2b60 con 0x7f14c8074dc0 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.598+0000 7f14cdd7d700 1 -- 192.168.123.103:0/4291549869 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f14c8066e80 con 0x7f14c8074dc0 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.601+0000 7f14c4ff9700 1 -- 192.168.123.103:0/4291549869 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f14b801e030 con 0x7f14c8074dc0 2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.602+0000 7f14c4ff9700 1 --2- 192.168.123.103:0/4291549869 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f14b406c600 0x7f14b406eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:40.608 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.602+0000 7f14c77fe700 1 --2- 192.168.123.103:0/4291549869 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f14b406c600 0x7f14b406eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.602+0000 7f14c4ff9700 1 -- 192.168.123.103:0/4291549869 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f14b808af00 con 0x7f14c8074dc0
2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.602+0000 7f14c77fe700 1 --2- 192.168.123.103:0/4291549869 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f14b406c600 0x7f14b406eac0 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f14b0009fd0 tx=0x7f14b000b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:01:40.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.602+0000 7f14c4ff9700 1 -- 192.168.123.103:0/4291549869 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f14b808b380 con 0x7f14c8074dc0
2026-03-09T00:01:40.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.708+0000 7f14cdd7d700 1 -- 192.168.123.103:0/4291549869 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch device ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f14c81a2e40 con 0x7f14b406c600
2026-03-09T00:01:40.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.710+0000 7f14c4ff9700 1 -- 192.168.123.103:0/4291549869 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1278 (secure 0 0 0) 0x7f14c81a2e40 con 0x7f14b406c600
2026-03-09T00:01:40.709 INFO:teuthology.orchestra.run.vm03.stdout:HOST PATH TYPE DEVICE ID SIZE AVAILABLE REFRESHED REJECT REASONS
2026-03-09T00:01:40.709 INFO:teuthology.orchestra.run.vm03.stdout:vm03 /dev/vdb hdd DWNBRSTVMM03001 20.0G Yes 46s ago
2026-03-09T00:01:40.709 INFO:teuthology.orchestra.run.vm03.stdout:vm03 /dev/vdc hdd DWNBRSTVMM03002 20.0G No 46s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected
2026-03-09T00:01:40.709 INFO:teuthology.orchestra.run.vm03.stdout:vm03 /dev/vdd hdd DWNBRSTVMM03003 20.0G No 46s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected
2026-03-09T00:01:40.709 INFO:teuthology.orchestra.run.vm03.stdout:vm03 /dev/vde hdd DWNBRSTVMM03004 20.0G No 46s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected
2026-03-09T00:01:40.709 INFO:teuthology.orchestra.run.vm03.stdout:vm06 /dev/vdb hdd DWNBRSTVMM06001 20.0G Yes 17s ago
2026-03-09T00:01:40.709 INFO:teuthology.orchestra.run.vm03.stdout:vm06 /dev/vdc hdd DWNBRSTVMM06002 20.0G No 17s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected
2026-03-09T00:01:40.710 INFO:teuthology.orchestra.run.vm03.stdout:vm06 /dev/vdd hdd DWNBRSTVMM06003 20.0G No 17s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected
2026-03-09T00:01:40.710 INFO:teuthology.orchestra.run.vm03.stdout:vm06 /dev/vde hdd DWNBRSTVMM06004 20.0G No 17s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected
2026-03-09T00:01:40.711
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.712+0000 7f14cdd7d700 1 -- 192.168.123.103:0/4291549869 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f14b406c600 msgr2=0x7f14b406eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:40.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.712+0000 7f14cdd7d700 1 --2- 192.168.123.103:0/4291549869 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f14b406c600 0x7f14b406eac0 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f14b0009fd0 tx=0x7f14b000b540 comp rx=0 tx=0).stop 2026-03-09T00:01:40.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.712+0000 7f14cdd7d700 1 -- 192.168.123.103:0/4291549869 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14c8074dc0 msgr2=0x7f14c819d380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:40.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.712+0000 7f14cdd7d700 1 --2- 192.168.123.103:0/4291549869 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14c8074dc0 0x7f14c819d380 secure :-1 s=READY pgs=246 cs=0 l=1 rev1=1 crypto rx=0x7f14b800b810 tx=0x7f14b800bb20 comp rx=0 tx=0).stop 2026-03-09T00:01:40.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.712+0000 7f14cdd7d700 1 -- 192.168.123.103:0/4291549869 shutdown_connections 2026-03-09T00:01:40.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.712+0000 7f14cdd7d700 1 --2- 192.168.123.103:0/4291549869 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f14b406c600 0x7f14b406eac0 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:40.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.712+0000 7f14cdd7d700 1 --2- 192.168.123.103:0/4291549869 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14c80737f0 0x7f14c819ce40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:40.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.712+0000 7f14cdd7d700 1 --2- 192.168.123.103:0/4291549869 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14c8074dc0 0x7f14c819d380 unknown :-1 s=CLOSED pgs=246 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:40.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.712+0000 7f14cdd7d700 1 -- 192.168.123.103:0/4291549869 >> 192.168.123.103:0/4291549869 conn(0x7f14c80fc460 msgr2=0x7f14c8102890 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:40.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.712+0000 7f14cdd7d700 1 -- 192.168.123.103:0/4291549869 shutdown_connections 2026-03-09T00:01:40.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:40.712+0000 7f14cdd7d700 1 -- 192.168.123.103:0/4291549869 wait complete. 2026-03-09T00:01:40.945 INFO:teuthology.run_tasks:Running task cephadm.shell... 
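
The device listing above is what drives OSD placement: only the /dev/vdb disks are AVAILABLE, while the others are rejected for existing filesystems/LVM. The same data can be consumed as JSON for scripted checks; a minimal sketch, assuming the 'devices'/'path'/'available' field names that recent releases emit from 'ceph orch device ls --format json' (verify the layout before relying on it):

    import json
    import subprocess

    def available_devices():
        # One entry per host, each carrying a 'devices' list; field names
        # here are assumptions taken from recent Ceph releases.
        hosts = json.loads(subprocess.check_output(
            ['ceph', 'orch', 'device', 'ls', '--format', 'json']))
        for host in hosts:
            for dev in host.get('devices', []):
                if dev.get('available'):
                    yield host.get('name'), dev.get('path')

    print(list(available_devices()))
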
2026-03-09T00:01:40.948 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local
2026-03-09T00:01:40.948 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- bash -c 'ceph fs volume create cephfs --placement=4'
2026-03-09T00:01:41.089 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:01:41.214 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:40 vm03 ceph-mon[52346]: from='client.14460 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:01:41.214 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:40 vm03 ceph-mon[52346]: from='client.14464 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:01:41.214 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:40 vm03 ceph-mon[52346]: pgmap v71: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-09T00:01:41.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.318+0000 7f0cbb422700 1 -- 192.168.123.103:0/329144922 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0cb4104340 msgr2=0x7f0cb41047a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:41.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.318+0000 7f0cbb422700 1 --2- 192.168.123.103:0/329144922 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0cb4104340 0x7f0cb41047a0 secure :-1 s=READY pgs=247 cs=0 l=1 rev1=1 crypto rx=0x7f0cb0009b00 tx=0x7f0cb0009e10 comp rx=0 tx=0).stop
2026-03-09T00:01:41.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.318+0000 7f0cbb422700 1 -- 192.168.123.103:0/329144922 shutdown_connections
2026-03-09T00:01:41.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.318+0000 7f0cbb422700 1 --2- 192.168.123.103:0/329144922 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0cb4104340 0x7f0cb41047a0 unknown :-1 s=CLOSED pgs=247 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:41.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.318+0000 7f0cbb422700 1 --2- 192.168.123.103:0/329144922 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0cb4103140 0x7f0cb4103560 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:41.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.318+0000 7f0cbb422700 1 -- 192.168.123.103:0/329144922 >> 192.168.123.103:0/329144922 conn(0x7f0cb40fe6c0 msgr2=0x7f0cb4100b20 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:01:41.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.319+0000 7f0cbb422700 1 -- 192.168.123.103:0/329144922 shutdown_connections
2026-03-09T00:01:41.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.319+0000 7f0cbb422700 1 -- 192.168.123.103:0/329144922 wait complete.
2026-03-09T00:01:41.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.319+0000 7f0cbb422700 1 Processor -- start
2026-03-09T00:01:41.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.319+0000 7f0cbb422700 1 -- start start
2026-03-09T00:01:41.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.319+0000 7f0cbb422700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0cb4103140 0x7f0cb41989c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:01:41.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.319+0000 7f0cbb422700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0cb4104340 0x7f0cb4198f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:01:41.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.319+0000 7f0cbb422700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0cb4199490 con 0x7f0cb4103140
2026-03-09T00:01:41.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.319+0000 7f0cbb422700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0cb41995d0 con 0x7f0cb4104340
2026-03-09T00:01:41.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.319+0000 7f0cb89bd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0cb4104340 0x7f0cb4198f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:01:41.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.319+0000 7f0cb89bd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0cb4104340 0x7f0cb4198f00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:39014/0 (socket says 192.168.123.103:39014)
2026-03-09T00:01:41.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.319+0000 7f0cb89bd700 1 -- 192.168.123.103:0/174083108 learned_addr learned my addr 192.168.123.103:0/174083108 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:01:41.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.320+0000 7f0cb91be700 1 --2- 192.168.123.103:0/174083108 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0cb4103140 0x7f0cb41989c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:01:41.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.320+0000 7f0cb91be700 1 -- 192.168.123.103:0/174083108 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0cb4104340 msgr2=0x7f0cb4198f00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:41.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.320+0000 7f0cb91be700 1 --2- 192.168.123.103:0/174083108 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0cb4104340 0x7f0cb4198f00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:41.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.320+0000 7f0cb91be700 1 -- 192.168.123.103:0/174083108 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0cb00097e0 con 0x7f0cb4103140
2026-03-09T00:01:41.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.320+0000 7f0cb89bd700 1 --2- 192.168.123.103:0/174083108 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0cb4104340 0x7f0cb4198f00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed!
2026-03-09T00:01:41.320 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.320+0000 7f0cb91be700 1 --2- 192.168.123.103:0/174083108 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0cb4103140 0x7f0cb41989c0 secure :-1 s=READY pgs=248 cs=0 l=1 rev1=1 crypto rx=0x7f0ca400cc30 tx=0x7f0ca400cf40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:01:41.321 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.321+0000 7f0caa7fc700 1 -- 192.168.123.103:0/174083108 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ca40041d0 con 0x7f0cb4103140
2026-03-09T00:01:41.321 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.321+0000 7f0caa7fc700 1 -- 192.168.123.103:0/174083108 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0ca4007c90 con 0x7f0cb4103140
2026-03-09T00:01:41.321 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.321+0000 7f0caa7fc700 1 -- 192.168.123.103:0/174083108 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ca4003de0 con 0x7f0cb4103140
2026-03-09T00:01:41.321 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.321+0000 7f0cbb422700 1 -- 192.168.123.103:0/174083108 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0cb419e090 con 0x7f0cb4103140
2026-03-09T00:01:41.321 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.321+0000 7f0cbb422700 1 -- 192.168.123.103:0/174083108 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0cb419e5b0 con 0x7f0cb4103140
2026-03-09T00:01:41.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.322+0000 7f0caa7fc700 1 -- 192.168.123.103:0/174083108 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f0ca40077b0 con 0x7f0cb4103140
2026-03-09T00:01:41.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.322+0000 7f0cbb422700 1 -- 192.168.123.103:0/174083108 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0cb4066e80 con 0x7f0cb4103140
2026-03-09T00:01:41.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.323+0000 7f0caa7fc700 1 --2- 192.168.123.103:0/174083108 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0ca006c4e0 0x7f0ca006e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:01:41.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.323+0000 7f0caa7fc700 1 -- 192.168.123.103:0/174083108 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f0ca4013070 con 0x7f0cb4103140
2026-03-09T00:01:41.324 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.323+0000 7f0cb89bd700 1 --2- 192.168.123.103:0/174083108 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0ca006c4e0 0x7f0ca006e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:01:41.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.324+0000 7f0cb89bd700 1 --2- 192.168.123.103:0/174083108 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0ca006c4e0 0x7f0ca006e9a0 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f0cb0005850 tx=0x7f0cb000b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:01:41.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.325+0000 7f0caa7fc700 1 -- 192.168.123.103:0/174083108 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f0ca4059ad0 con 0x7f0cb4103140
2026-03-09T00:01:41.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:40 vm06 ceph-mon[58395]: from='client.14460 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:01:41.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:40 vm06 ceph-mon[58395]: from='client.14464 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:01:41.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:40 vm06 ceph-mon[58395]: pgmap v71: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-09T00:01:41.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:41.438+0000 7f0cbb422700 1 -- 192.168.123.103:0/174083108 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}) v1 -- 0x7f0cb419e890 con 0x7f0ca006c4e0
2026-03-09T00:01:42.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:42 vm06 ceph-mon[58395]: from='client.14468 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:01:42.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:42 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
2026-03-09T00:01:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:42 vm03 ceph-mon[52346]: from='client.14468 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:01:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:42 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
2026-03-09T00:01:42.969 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:42.968+0000 7f0caa7fc700 1 -- 192.168.123.103:0/174083108 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f0cb419e890 con 0x7f0ca006c4e0
2026-03-09T00:01:42.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:42.971+0000 7f0cbb422700 1 -- 192.168.123.103:0/174083108 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0ca006c4e0 msgr2=0x7f0ca006e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:42.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:42.971+0000 7f0cbb422700 1 --2- 192.168.123.103:0/174083108 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0ca006c4e0 0x7f0ca006e9a0 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f0cb0005850 tx=0x7f0cb000b540 comp rx=0 tx=0).stop
2026-03-09T00:01:42.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:42.971+0000 7f0cbb422700 1 -- 192.168.123.103:0/174083108 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0cb4103140 msgr2=0x7f0cb41989c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:42.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:42.971+0000 7f0cbb422700 1 --2- 192.168.123.103:0/174083108 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0cb4103140 0x7f0cb41989c0 secure :-1 s=READY pgs=248 cs=0 l=1 rev1=1 crypto rx=0x7f0ca400cc30 tx=0x7f0ca400cf40 comp rx=0 tx=0).stop
2026-03-09T00:01:42.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:42.971+0000 7f0cbb422700 1 -- 192.168.123.103:0/174083108 shutdown_connections
2026-03-09T00:01:42.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:42.971+0000 7f0cbb422700 1 --2- 192.168.123.103:0/174083108 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0ca006c4e0 0x7f0ca006e9a0 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:42.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:42.971+0000 7f0cbb422700 1 --2- 192.168.123.103:0/174083108 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0cb4103140 0x7f0cb41989c0 unknown :-1 s=CLOSED pgs=248 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:42.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:42.971+0000 7f0cbb422700 1 --2- 192.168.123.103:0/174083108 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0cb4104340 0x7f0cb4198f00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:42.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:42.971+0000 7f0cbb422700 1 -- 192.168.123.103:0/174083108 >> 192.168.123.103:0/174083108 conn(0x7f0cb40fe6c0 msgr2=0x7f0cb4107570 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:01:42.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:42.971+0000 7f0cbb422700 1 -- 192.168.123.103:0/174083108 shutdown_connections
2026-03-09T00:01:42.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:42.971+0000 7f0cbb422700 1 -- 192.168.123.103:0/174083108 wait complete.
2026-03-09T00:01:43.020 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- bash -c 'ceph fs dump'
2026-03-09T00:01:43.212 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:42 vm03 ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03[52342]: 2026-03-09T00:01:42.954+0000 7f1c2fe9f700 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: from='client.14472 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': finished
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: osdmap e34: 6 total, 6 up, 6 in
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: pgmap v73: 33 pgs: 32 unknown, 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: osdmap e35: 6 total, 6 up, 6 in
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: osdmap e36: 6 total, 6 up, 6 in
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: fsmap cephfs:0
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.sejksk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.sejksk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
2026-03-09T00:01:43.243 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:43 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: from='client.14472 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': finished
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: osdmap e34: 6 total, 6 up, 6 in
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: pgmap v73: 33 pgs: 32 unknown, 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: osdmap e35: 6 total, 6 up, 6 in
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: osdmap e36: 6 total, 6 up, 6 in
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: fsmap cephfs:0
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.sejksk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.sejksk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
2026-03-09T00:01:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:43 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:01:43.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.521+0000 7f85be6fc700 1 -- 192.168.123.103:0/4028364868 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85b8103120 msgr2=0x7f85b8103540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:43.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.521+0000 7f85be6fc700 1 --2- 192.168.123.103:0/4028364868 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85b8103120 0x7f85b8103540 secure :-1 s=READY pgs=249 cs=0 l=1 rev1=1 crypto rx=0x7f85a0009ab0 tx=0x7f85a0009dc0 comp rx=0 tx=0).stop
2026-03-09T00:01:43.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.531+0000 7f85be6fc700 1 -- 192.168.123.103:0/4028364868 shutdown_connections
2026-03-09T00:01:43.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.531+0000 7f85be6fc700 1 --2- 192.168.123.103:0/4028364868 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85b8104320 0x7f85b8104780 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:43.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.532+0000 7f85be6fc700 1 --2- 192.168.123.103:0/4028364868 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85b8103120 0x7f85b8103540 unknown :-1 s=CLOSED pgs=249 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:43.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.532+0000 7f85be6fc700 1 -- 192.168.123.103:0/4028364868 >> 192.168.123.103:0/4028364868 conn(0x7f85b80fe6c0 msgr2=0x7f85b8100b00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:01:43.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.534+0000 7f85be6fc700 1 -- 192.168.123.103:0/4028364868 shutdown_connections
2026-03-09T00:01:43.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.535+0000 7f85be6fc700 1 -- 192.168.123.103:0/4028364868 wait complete.
2026-03-09T00:01:43.534 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.535+0000 7f85be6fc700 1 Processor -- start
2026-03-09T00:01:43.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.535+0000 7f85be6fc700 1 -- start start
2026-03-09T00:01:43.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.535+0000 7f85be6fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85b8103120 0x7f85b81989d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:01:43.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.536+0000 7f85be6fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85b8104320 0x7f85b8198f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:01:43.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.536+0000 7f85be6fc700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85b8199530 con 0x7f85b8103120
2026-03-09T00:01:43.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.536+0000 7f85be6fc700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85b8199670 con 0x7f85b8104320
2026-03-09T00:01:43.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.536+0000 7f85b7fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85b8103120 0x7f85b81989d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:01:43.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.536+0000 7f85b7fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85b8103120 0x7f85b81989d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36470/0 (socket says 192.168.123.103:36470)
2026-03-09T00:01:43.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.536+0000 7f85b7fff700 1 -- 192.168.123.103:0/4256036017 learned_addr learned my addr 192.168.123.103:0/4256036017 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:01:43.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.536+0000 7f85b77fe700 1 --2- 192.168.123.103:0/4256036017 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85b8104320 0x7f85b8198f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:01:43.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.536+0000 7f85b7fff700 1 -- 192.168.123.103:0/4256036017 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85b8104320 msgr2=0x7f85b8198f10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:43.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.536+0000 7f85b7fff700 1 --2- 192.168.123.103:0/4256036017 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85b8104320 0x7f85b8198f10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:43.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.536+0000 7f85b7fff700 1 -- 192.168.123.103:0/4256036017 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f85a0009710 con 0x7f85b8103120
2026-03-09T00:01:43.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.536+0000 7f85b7fff700 1 --2- 192.168.123.103:0/4256036017 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85b8103120 0x7f85b81989d0 secure :-1 s=READY pgs=250 cs=0 l=1 rev1=1 crypto rx=0x7f85a000bc80 tx=0x7f85a000bfd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:01:43.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.536+0000 7f85b57fa700 1 -- 192.168.123.103:0/4256036017 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f85a001d070 con 0x7f85b8103120
2026-03-09T00:01:43.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.536+0000 7f85be6fc700 1 -- 192.168.123.103:0/4256036017 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f85b819e0c0 con 0x7f85b8103120
2026-03-09T00:01:43.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.536+0000 7f85be6fc700 1 -- 192.168.123.103:0/4256036017 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f85b819e5b0 con 0x7f85b8103120
2026-03-09T00:01:43.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.537+0000 7f85b57fa700 1 -- 192.168.123.103:0/4256036017 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f85a000fcf0 con 0x7f85b8103120
2026-03-09T00:01:43.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.537+0000 7f85b57fa700 1 -- 192.168.123.103:0/4256036017 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f85a0017860 con 0x7f85b8103120
2026-03-09T00:01:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.537+0000 7f85be6fc700 1 -- 192.168.123.103:0/4256036017 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8598005320 con 0x7f85b8103120
2026-03-09T00:01:43.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.538+0000 7f85b57fa700 1 -- 192.168.123.103:0/4256036017 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f85a00179c0 con 0x7f85b8103120
2026-03-09T00:01:43.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.539+0000 7f85b57fa700 1 --2- 192.168.123.103:0/4256036017 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f85a406c3a0 0x7f85a406e860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:01:43.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.539+0000 7f85b57fa700 1 -- 192.168.123.103:0/4256036017 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(36..36 src has 1..36) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f85a008c710 con 0x7f85b8103120
2026-03-09T00:01:43.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.540+0000 7f85b77fe700 1 --2- 192.168.123.103:0/4256036017 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f85a406c3a0 0x7f85a406e860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:01:43.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.540+0000 7f85b77fe700 1 --2- 192.168.123.103:0/4256036017 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f85a406c3a0 0x7f85a406e860 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f85a8005fd0 tx=0x7f85a8009500 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:01:43.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.542+0000 7f85b57fa700 1 -- 192.168.123.103:0/4256036017 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f85a005ac70 con 0x7f85b8103120
2026-03-09T00:01:43.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.700+0000 7f85be6fc700 1 -- 192.168.123.103:0/4256036017 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f8598006200 con 0x7f85b8103120
2026-03-09T00:01:43.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.701+0000 7f85b57fa700 1 -- 192.168.123.103:0/4256036017 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 2 v2) v1 ==== 75+0+1093 (secure 0 0 0) 0x7f85a008ae60 con 0x7f85b8103120
2026-03-09T00:01:43.701 INFO:teuthology.orchestra.run.vm03.stdout:e2
2026-03-09T00:01:43.701 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-09T00:01:43.701 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:01:43.701 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1
2026-03-09T00:01:43.701 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:01:43.701 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1)
2026-03-09T00:01:43.701 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs
2026-03-09T00:01:43.701 INFO:teuthology.orchestra.run.vm03.stdout:epoch 2
2026-03-09T00:01:43.701 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-09T00:01:43.701 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T00:01:42.952984+0000
2026-03-09T00:01:43.701 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T00:01:42.953016+0000
2026-03-09T00:01:43.701 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0
2026-03-09T00:01:43.701 INFO:teuthology.orchestra.run.vm03.stdout:root 0
2026-03-09T00:01:43.701 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60
2026-03-09T00:01:43.701 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {}
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:in
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:up {}
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:failed
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:damaged
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:stopped
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3]
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:inline_data disabled
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 0
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:01:43.702 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:01:43.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.704+0000 7f85be6fc700 1 -- 192.168.123.103:0/4256036017 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f85a406c3a0 msgr2=0x7f85a406e860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:43.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.704+0000 7f85be6fc700 1 --2- 192.168.123.103:0/4256036017 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f85a406c3a0 0x7f85a406e860 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f85a8005fd0 tx=0x7f85a8009500 comp rx=0 tx=0).stop
2026-03-09T00:01:43.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.704+0000 7f85be6fc700 1 -- 192.168.123.103:0/4256036017 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85b8103120 msgr2=0x7f85b81989d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:43.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.704+0000 7f85be6fc700 1 --2- 192.168.123.103:0/4256036017 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85b8103120 0x7f85b81989d0 secure :-1 s=READY pgs=250 cs=0 l=1 rev1=1 crypto rx=0x7f85a000bc80 tx=0x7f85a000bfd0 comp rx=0 tx=0).stop
2026-03-09T00:01:43.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.704+0000 7f85be6fc700 1 -- 192.168.123.103:0/4256036017 shutdown_connections
2026-03-09T00:01:43.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.704+0000 7f85be6fc700 1 --2- 192.168.123.103:0/4256036017 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f85a406c3a0 0x7f85a406e860 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:43.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.704+0000 7f85be6fc700 1 --2- 192.168.123.103:0/4256036017 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85b8103120 0x7f85b81989d0 unknown :-1 s=CLOSED pgs=250 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:43.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.704+0000 7f85be6fc700 1 --2- 192.168.123.103:0/4256036017 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85b8104320 0x7f85b8198f10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:43.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.704+0000 7f85be6fc700 1 -- 192.168.123.103:0/4256036017 >> 192.168.123.103:0/4256036017 conn(0x7f85b80fe6c0 msgr2=0x7f85b8107550 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:01:43.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.705+0000 7f85be6fc700 1 -- 192.168.123.103:0/4256036017 shutdown_connections
2026-03-09T00:01:43.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:43.705+0000 7f85be6fc700 1 -- 192.168.123.103:0/4256036017 wait complete.
2026-03-09T00:01:43.705 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 2
2026-03-09T00:01:43.872 INFO:teuthology.run_tasks:Running task cephadm.shell...
2026-03-09T00:01:43.875 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local
2026-03-09T00:01:43.875 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- bash -c 'ceph fs set cephfs max_mds 1'
2026-03-09T00:01:44.086 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:01:44.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:44 vm03 ceph-mon[52346]: Saving service mds.cephfs spec with placement count:4
2026-03-09T00:01:44.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:44 vm03 ceph-mon[52346]: Deploying daemon mds.cephfs.vm03.sejksk on vm03
2026-03-09T00:01:44.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:44 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/4256036017' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T00:01:44.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:44 vm03 ceph-mon[52346]: osdmap e37: 6 total, 6 up, 6 in
2026-03-09T00:01:44.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:44 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:01:44.099 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:44 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:01:44.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.340+0000 7f0bb1957700 1 -- 192.168.123.103:0/1257263606 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0bac103140 msgr2=0x7f0bac103560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:44.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.340+0000 7f0bb1957700 1 --2- 192.168.123.103:0/1257263606 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0bac103140 0x7f0bac103560 secure :-1 s=READY pgs=253 cs=0 l=1 rev1=1 crypto rx=0x7f0b94009b50 tx=0x7f0b94009e60 comp rx=0 tx=0).stop
2026-03-09T00:01:44.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.341+0000 7f0bb1957700 1 -- 192.168.123.103:0/1257263606 shutdown_connections
2026-03-09T00:01:44.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.341+0000 7f0bb1957700 1 --2- 192.168.123.103:0/1257263606 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bac104340 0x7f0bac1047a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:44.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.341+0000 7f0bb1957700 1 --2- 192.168.123.103:0/1257263606 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0bac103140 0x7f0bac103560 unknown :-1 s=CLOSED pgs=253 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:44.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.341+0000 7f0bb1957700 1 -- 192.168.123.103:0/1257263606 >> 192.168.123.103:0/1257263606 conn(0x7f0bac0fe6c0 msgr2=0x7f0bac100b20 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:01:44.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.341+0000 7f0bb1957700 1 -- 192.168.123.103:0/1257263606 shutdown_connections
2026-03-09T00:01:44.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.341+0000 7f0bb1957700 1 -- 192.168.123.103:0/1257263606 wait complete.
2026-03-09T00:01:44.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.342+0000 7f0bb1957700 1 Processor -- start
2026-03-09T00:01:44.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.342+0000 7f0bb1957700 1 -- start start
2026-03-09T00:01:44.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.342+0000 7f0bb1957700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0bac103140 0x7f0bac078b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:01:44.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.342+0000 7f0bb1957700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bac104340 0x7f0bac079080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:01:44.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.342+0000 7f0bb1957700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0bac075630 con 0x7f0bac103140
2026-03-09T00:01:44.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.342+0000 7f0bb1957700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0bac0757a0 con 0x7f0bac104340
2026-03-09T00:01:44.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.342+0000 7f0baaffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0bac103140 0x7f0bac078b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:01:44.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.342+0000 7f0baa7fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bac104340 0x7f0bac079080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:01:44.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.342+0000 7f0baaffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0bac103140 0x7f0bac078b40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36504/0 (socket says 192.168.123.103:36504)
2026-03-09T00:01:44.341 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.342+0000 7f0baaffd700 1 -- 192.168.123.103:0/2829723539 learned_addr learned my addr 192.168.123.103:0/2829723539 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:01:44.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.342+0000 7f0baaffd700 1 -- 192.168.123.103:0/2829723539 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bac104340 msgr2=0x7f0bac079080 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:44.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.342+0000 7f0baaffd700 1 --2- 192.168.123.103:0/2829723539 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bac104340 0x7f0bac079080 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:44.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.342+0000 7f0baaffd700 1 -- 192.168.123.103:0/2829723539 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0b940097e0 con 0x7f0bac103140
2026-03-09T00:01:44.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.343+0000 7f0baaffd700 1 --2- 192.168.123.103:0/2829723539 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0bac103140 0x7f0bac078b40 secure :-1 s=READY pgs=254 cs=0 l=1 rev1=1 crypto rx=0x7f0b94004ce0 tx=0x7f0b94005790 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:01:44.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.343+0000 7f0bb0955700 1 -- 192.168.123.103:0/2829723539 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0b9401d070 con 0x7f0bac103140
2026-03-09T00:01:44.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.343+0000 7f0bb0955700 1 -- 192.168.123.103:0/2829723539 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0b9400bb00 con 0x7f0bac103140
2026-03-09T00:01:44.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.343+0000 7f0bb0955700 1 -- 192.168.123.103:0/2829723539 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0b9400f670 con 0x7f0bac103140
2026-03-09T00:01:44.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.343+0000 7f0bb1957700 1 -- 192.168.123.103:0/2829723539 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0bac075a20 con 0x7f0bac103140
2026-03-09T00:01:44.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.343+0000 7f0bb1957700 1 -- 192.168.123.103:0/2829723539 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0bac075f10 con 0x7f0bac103140
2026-03-09T00:01:44.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.344+0000 7f0bb0955700 1 -- 192.168.123.103:0/2829723539 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f0b9400bc70 con 0x7f0bac103140
2026-03-09T00:01:44.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.344+0000 7f0bb1957700 1 -- 192.168.123.103:0/2829723539 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0bac066e80 con 0x7f0bac103140
2026-03-09T00:01:44.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.344+0000 7f0bb0955700 1 --2- 192.168.123.103:0/2829723539 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0b9806c490 0x7f0b9806e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:01:44.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.344+0000 7f0bb0955700 1 -- 192.168.123.103:0/2829723539 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f0b9408cca0 con 0x7f0bac103140
2026-03-09T00:01:44.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.347+0000 7f0baa7fc700 1 --2- 192.168.123.103:0/2829723539 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0b9806c490 0x7f0b9806e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:01:44.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.347+0000 7f0baa7fc700 1 --2- 192.168.123.103:0/2829723539 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0b9806c490 0x7f0b9806e950 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f0b9c007c60 tx=0x7f0b9c0073d0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:01:44.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.347+0000 7f0bb0955700 1 -- 192.168.123.103:0/2829723539 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f0b9405b060 con 0x7f0bac103140
2026-03-09T00:01:44.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:44 vm06 ceph-mon[58395]: Saving service mds.cephfs spec with placement count:4
2026-03-09T00:01:44.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:44 vm06 ceph-mon[58395]: Deploying daemon mds.cephfs.vm03.sejksk on vm03
2026-03-09T00:01:44.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:44 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/4256036017' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T00:01:44.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:44 vm06 ceph-mon[58395]: osdmap e37: 6 total, 6 up, 6 in
2026-03-09T00:01:44.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:44 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:01:44.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:44 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:01:44.472 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:44.472+0000 7f0bb1957700 1 -- 192.168.123.103:0/2829723539 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"} v 0) v1 -- 0x7f0bac1a2eb0 con 0x7f0bac103140
2026-03-09T00:01:45.083 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.083+0000 7f0bb0955700 1 -- 192.168.123.103:0/2829723539 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]=0 v3) v1 ==== 105+0+0 (secure 0 0 0) 0x7f0b940270b0 con 0x7f0bac103140
2026-03-09T00:01:45.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.089+0000 7f0bb1957700 1 -- 192.168.123.103:0/2829723539 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0b9806c490 msgr2=0x7f0b9806e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:45.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.089+0000 7f0bb1957700 1 --2- 192.168.123.103:0/2829723539 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0b9806c490 0x7f0b9806e950 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f0b9c007c60 tx=0x7f0b9c0073d0 comp rx=0 tx=0).stop
2026-03-09T00:01:45.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.089+0000 7f0bb1957700 1 -- 192.168.123.103:0/2829723539 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0bac103140 msgr2=0x7f0bac078b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:01:45.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.089+0000 7f0bb1957700 1 --2- 192.168.123.103:0/2829723539 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0bac103140 0x7f0bac078b40 secure :-1 s=READY pgs=254 cs=0 l=1 rev1=1 crypto rx=0x7f0b94004ce0 tx=0x7f0b94005790 comp rx=0 tx=0).stop
2026-03-09T00:01:45.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.090+0000 7f0bb1957700 1 -- 192.168.123.103:0/2829723539 shutdown_connections
2026-03-09T00:01:45.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.090+0000 7f0bb1957700 1 --2- 192.168.123.103:0/2829723539 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f0b9806c490 0x7f0b9806e950 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:45.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.090+0000 7f0bb1957700 1 --2- 192.168.123.103:0/2829723539 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0bac103140 0x7f0bac078b40 unknown :-1 s=CLOSED pgs=254 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:45.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.090+0000 7f0bb1957700 1 --2- 192.168.123.103:0/2829723539 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0bac104340 0x7f0bac079080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:01:45.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.090+0000 7f0bb1957700 1 -- 192.168.123.103:0/2829723539 >> 192.168.123.103:0/2829723539 conn(0x7f0bac0fe6c0 msgr2=0x7f0bac107570 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:01:45.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.090+0000 7f0bb1957700 1 -- 192.168.123.103:0/2829723539 shutdown_connections
2026-03-09T00:01:45.090 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.090+0000 7f0bb1957700 1 -- 192.168.123.103:0/2829723539 wait complete.
2026-03-09T00:01:45.162 INFO:teuthology.run_tasks:Running task cephadm.shell...
2026-03-09T00:01:45.165 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local
2026-03-09T00:01:45.165 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- bash -c 'ceph fs set cephfs allow_standby_replay false'
2026-03-09T00:01:45.328 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: pgmap v77: 65 pgs: 14 creating+peering, 50 unknown, 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.vlrwtl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.vlrwtl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: Deploying daemon mds.cephfs.vm06.vlrwtl on vm06
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/2829723539' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: osdmap e38: 6 total, 6 up, 6 in
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.ralade", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.ralade", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:01:45.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: mds.? [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] up:boot
2026-03-09T00:01:45.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/2829723539' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished
2026-03-09T00:01:45.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: mds.?
[v2:192.168.123.106:6824/3799306593,v1:192.168.123.106:6825/3799306593] up:boot 2026-03-09T00:01:45.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: daemon mds.cephfs.vm06.vlrwtl assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-09T00:01:45.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T00:01:45.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-09T00:01:45.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: Cluster is now healthy 2026-03-09T00:01:45.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: fsmap cephfs:0 2 up:standby 2026-03-09T00:01:45.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.sejksk"}]: dispatch 2026-03-09T00:01:45.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vlrwtl"}]: dispatch 2026-03-09T00:01:45.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: fsmap cephfs:1 {0=cephfs.vm06.vlrwtl=up:creating} 1 up:standby 2026-03-09T00:01:45.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:45 vm03 ceph-mon[52346]: daemon mds.cephfs.vm06.vlrwtl is now active in filesystem cephfs as rank 0 2026-03-09T00:01:45.405 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: pgmap v77: 65 pgs: 14 creating+peering, 50 unknown, 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:01:45.405 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:45.405 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.vlrwtl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T00:01:45.405 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.vlrwtl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T00:01:45.405 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:45.405 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: Deploying daemon mds.cephfs.vm06.vlrwtl on vm06 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: from='client.? 
192.168.123.103:0/2829723539' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled) 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: osdmap e38: 6 total, 6 up, 6 in 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.ralade", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.ralade", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: mds.? [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] up:boot 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/2829723539' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: mds.? 
[v2:192.168.123.106:6824/3799306593,v1:192.168.123.106:6825/3799306593] up:boot 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: daemon mds.cephfs.vm06.vlrwtl assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: Cluster is now healthy 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: fsmap cephfs:0 2 up:standby 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.sejksk"}]: dispatch 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vlrwtl"}]: dispatch 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: fsmap cephfs:1 {0=cephfs.vm06.vlrwtl=up:creating} 1 up:standby 2026-03-09T00:01:45.406 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:45 vm06 ceph-mon[58395]: daemon mds.cephfs.vm06.vlrwtl is now active in filesystem cephfs as rank 0 2026-03-09T00:01:45.744 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.744+0000 7fbbc2be4700 1 -- 192.168.123.103:0/1578021795 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc104340 msgr2=0x7fbbbc1047a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:45.744 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.744+0000 7fbbc2be4700 1 --2- 192.168.123.103:0/1578021795 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc104340 0x7fbbbc1047a0 secure :-1 s=READY pgs=255 cs=0 l=1 rev1=1 crypto rx=0x7fbbb0009b00 tx=0x7fbbb0009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:45.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.746+0000 7fbbc2be4700 1 -- 192.168.123.103:0/1578021795 shutdown_connections 2026-03-09T00:01:45.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.746+0000 7fbbc2be4700 1 --2- 192.168.123.103:0/1578021795 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc104340 0x7fbbbc1047a0 unknown :-1 s=CLOSED pgs=255 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:45.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.746+0000 7fbbc2be4700 1 --2- 192.168.123.103:0/1578021795 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbbc103140 0x7fbbbc103560 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:45.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.746+0000 7fbbc2be4700 1 -- 192.168.123.103:0/1578021795 >> 192.168.123.103:0/1578021795 conn(0x7fbbbc0fe6c0 msgr2=0x7fbbbc100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:45.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.747+0000 7fbbc2be4700 1 -- 
192.168.123.103:0/1578021795 shutdown_connections 2026-03-09T00:01:45.746 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.747+0000 7fbbc2be4700 1 -- 192.168.123.103:0/1578021795 wait complete. 2026-03-09T00:01:45.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.747+0000 7fbbc2be4700 1 Processor -- start 2026-03-09T00:01:45.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.748+0000 7fbbc2be4700 1 -- start start 2026-03-09T00:01:45.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.748+0000 7fbbc2be4700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc103140 0x7fbbbc198ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:45.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.748+0000 7fbbc2be4700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbbc104340 0x7fbbbc199010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:45.747 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.748+0000 7fbbc2be4700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbbbc199630 con 0x7fbbbc103140 2026-03-09T00:01:45.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.748+0000 7fbbc2be4700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbbbc199770 con 0x7fbbbc104340 2026-03-09T00:01:45.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.748+0000 7fbbbbfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbbc104340 0x7fbbbc199010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:45.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.748+0000 7fbbbbfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbbc104340 0x7fbbbc199010 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:57804/0 (socket says 192.168.123.103:57804) 2026-03-09T00:01:45.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.748+0000 7fbbbbfff700 1 -- 192.168.123.103:0/565400591 learned_addr learned my addr 192.168.123.103:0/565400591 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:45.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.749+0000 7fbbc0980700 1 --2- 192.168.123.103:0/565400591 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc103140 0x7fbbbc198ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:45.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.749+0000 7fbbbbfff700 1 -- 192.168.123.103:0/565400591 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc103140 msgr2=0x7fbbbc198ad0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:45.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.749+0000 7fbbbbfff700 1 --2- 192.168.123.103:0/565400591 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc103140 0x7fbbbc198ad0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:45.748 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.749+0000 7fbbbbfff700 1 -- 192.168.123.103:0/565400591 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbbb00097e0 con 0x7fbbbc104340 2026-03-09T00:01:45.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.749+0000 7fbbc0980700 1 --2- 192.168.123.103:0/565400591 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc103140 0x7fbbbc198ad0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:01:45.748 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.749+0000 7fbbbbfff700 1 --2- 192.168.123.103:0/565400591 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbbc104340 0x7fbbbc199010 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fbbb0009fd0 tx=0x7fbbb0004ab0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:45.749 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.749+0000 7fbbb9ffb700 1 -- 192.168.123.103:0/565400591 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbbb001d070 con 0x7fbbbc104340 2026-03-09T00:01:45.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.749+0000 7fbbb9ffb700 1 -- 192.168.123.103:0/565400591 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbbb000bd10 con 0x7fbbbc104340 2026-03-09T00:01:45.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.749+0000 7fbbb9ffb700 1 -- 192.168.123.103:0/565400591 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbbb000f890 con 0x7fbbbc104340 2026-03-09T00:01:45.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.749+0000 7fbbc2be4700 1 -- 192.168.123.103:0/565400591 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbbbc19e1c0 con 0x7fbbbc104340 2026-03-09T00:01:45.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.749+0000 7fbbc2be4700 1 -- 192.168.123.103:0/565400591 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbbbc19e730 con 0x7fbbbc104340 2026-03-09T00:01:45.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.751+0000 7fbbb9ffb700 1 -- 192.168.123.103:0/565400591 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fbbb0022b70 con 0x7fbbbc104340 2026-03-09T00:01:45.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.751+0000 7fbbc2be4700 1 -- 192.168.123.103:0/565400591 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbbbc066e80 con 0x7fbbbc104340 2026-03-09T00:01:45.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.752+0000 7fbbb9ffb700 1 --2- 192.168.123.103:0/565400591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbba406c4e0 0x7fbba406e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:45.751 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.752+0000 7fbbb9ffb700 1 -- 192.168.123.103:0/565400591 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fbbb008d050 con 0x7fbbbc104340 2026-03-09T00:01:45.751 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.752+0000 7fbbc0980700 1 --2- 192.168.123.103:0/565400591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbba406c4e0 0x7fbba406e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:45.753 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.754+0000 7fbbc0980700 1 --2- 192.168.123.103:0/565400591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbba406c4e0 0x7fbba406e9a0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fbbbc1041a0 tx=0x7fbbac008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:45.757 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.755+0000 7fbbb9ffb700 1 -- 192.168.123.103:0/565400591 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fbbb005b360 con 0x7fbbbc104340 2026-03-09T00:01:45.882 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:45.883+0000 7fbbc2be4700 1 -- 192.168.123.103:0/565400591 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"} v 0) v1 -- 0x7fbbbc19ea10 con 0x7fbbbc104340 2026-03-09T00:01:46.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.094+0000 7fbbb9ffb700 1 -- 192.168.123.103:0/565400591 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]=0 v5) v1 ==== 122+0+0 (secure 0 0 0) 0x7fbbb0027070 con 0x7fbbbc104340 2026-03-09T00:01:46.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.097+0000 7fbbc2be4700 1 -- 192.168.123.103:0/565400591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbba406c4e0 msgr2=0x7fbba406e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:46.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.097+0000 7fbbc2be4700 1 --2- 192.168.123.103:0/565400591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbba406c4e0 0x7fbba406e9a0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fbbbc1041a0 tx=0x7fbbac008040 comp rx=0 tx=0).stop 2026-03-09T00:01:46.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.097+0000 7fbbc2be4700 1 -- 192.168.123.103:0/565400591 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbbc104340 msgr2=0x7fbbbc199010 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:46.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.097+0000 7fbbc2be4700 1 --2- 192.168.123.103:0/565400591 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbbc104340 0x7fbbbc199010 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fbbb0009fd0 tx=0x7fbbb0004ab0 comp rx=0 tx=0).stop 2026-03-09T00:01:46.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.097+0000 7fbbc2be4700 1 -- 192.168.123.103:0/565400591 shutdown_connections 2026-03-09T00:01:46.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.097+0000 7fbbc2be4700 1 --2- 192.168.123.103:0/565400591 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fbba406c4e0 0x7fbba406e9a0 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
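At this point both `fs set` commands issued so far, max_mds=1 and allow_standby_replay=false, have come back with mon_command_ack(...)=0, so the MDSMap should carry both settings before the upgrade steps begin. A minimal spot-check sketch, assuming the same client.admin keyring and `cephadm shell` environment the tasks above use (the grep pattern is illustrative; exact field spelling varies slightly across releases):

    # Inside "cephadm shell" on host.a: re-read the settings the two
    # preceding cephadm.shell tasks applied. Expect "max_mds 1" and no
    # allow_standby_replay flag in the output.
    ceph fs get cephfs | grep -E 'max_mds|standby'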
2026-03-09T00:01:46.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.097+0000 7fbbc2be4700 1 --2- 192.168.123.103:0/565400591 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbbbc103140 0x7fbbbc198ad0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:46.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.097+0000 7fbbc2be4700 1 --2- 192.168.123.103:0/565400591 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbbbc104340 0x7fbbbc199010 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:46.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.097+0000 7fbbc2be4700 1 -- 192.168.123.103:0/565400591 >> 192.168.123.103:0/565400591 conn(0x7fbbbc0fe6c0 msgr2=0x7fbbbc107570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:46.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.097+0000 7fbbc2be4700 1 -- 192.168.123.103:0/565400591 shutdown_connections 2026-03-09T00:01:46.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.097+0000 7fbbc2be4700 1 -- 192.168.123.103:0/565400591 wait complete. 2026-03-09T00:01:46.140 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T00:01:46.142 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T00:01:46.142 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- bash -c 'ceph fs set cephfs inline_data true --yes-i-really-really-mean-it' 2026-03-09T00:01:46.293 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:46.320 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:46 vm03 ceph-mon[52346]: Deploying daemon mds.cephfs.vm03.ralade on vm03 2026-03-09T00:01:46.320 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:46 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/565400591' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-09T00:01:46.320 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:46 vm03 ceph-mon[52346]: from='client.? 
' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-09T00:01:46.320 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:46 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:46.320 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:46 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:46.320 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:46 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:46.320 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:46 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.ixduim", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T00:01:46.320 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:46 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.ixduim", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T00:01:46.320 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:46 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:46.320 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:46 vm03 ceph-mon[52346]: mds.? [v2:192.168.123.106:6824/3799306593,v1:192.168.123.106:6825/3799306593] up:active 2026-03-09T00:01:46.320 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:46 vm03 ceph-mon[52346]: from='client.? ' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]': finished 2026-03-09T00:01:46.320 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:46 vm03 ceph-mon[52346]: mds.? [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] up:boot 2026-03-09T00:01:46.320 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:46 vm03 ceph-mon[52346]: fsmap cephfs:1 {0=cephfs.vm06.vlrwtl=up:active} 2 up:standby 2026-03-09T00:01:46.320 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:46 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.ralade"}]: dispatch 2026-03-09T00:01:46.320 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:46 vm03 ceph-mon[52346]: fsmap cephfs:1 {0=cephfs.vm06.vlrwtl=up:active} 2 up:standby 2026-03-09T00:01:46.401 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:46 vm06 ceph-mon[58395]: Deploying daemon mds.cephfs.vm03.ralade on vm03 2026-03-09T00:01:46.401 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:46 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/565400591' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-09T00:01:46.401 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:46 vm06 ceph-mon[58395]: from='client.? 
' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-09T00:01:46.401 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:46 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:46.401 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:46 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:46.401 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:46 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:46.401 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:46 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.ixduim", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T00:01:46.401 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:46 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.ixduim", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-09T00:01:46.401 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:46 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:46.401 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:46 vm06 ceph-mon[58395]: mds.? [v2:192.168.123.106:6824/3799306593,v1:192.168.123.106:6825/3799306593] up:active 2026-03-09T00:01:46.401 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:46 vm06 ceph-mon[58395]: from='client.? ' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]': finished 2026-03-09T00:01:46.401 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:46 vm06 ceph-mon[58395]: mds.? 
[v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] up:boot 2026-03-09T00:01:46.401 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:46 vm06 ceph-mon[58395]: fsmap cephfs:1 {0=cephfs.vm06.vlrwtl=up:active} 2 up:standby 2026-03-09T00:01:46.401 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:46 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.ralade"}]: dispatch 2026-03-09T00:01:46.401 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:46 vm06 ceph-mon[58395]: fsmap cephfs:1 {0=cephfs.vm06.vlrwtl=up:active} 2 up:standby 2026-03-09T00:01:46.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.536+0000 7f702740c700 1 -- 192.168.123.103:0/2613551374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7020104380 msgr2=0x7f70201047e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:46.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.536+0000 7f702740c700 1 --2- 192.168.123.103:0/2613551374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7020104380 0x7f70201047e0 secure :-1 s=READY pgs=258 cs=0 l=1 rev1=1 crypto rx=0x7f701c009b00 tx=0x7f701c009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:46.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.538+0000 7f702740c700 1 -- 192.168.123.103:0/2613551374 shutdown_connections 2026-03-09T00:01:46.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.538+0000 7f702740c700 1 --2- 192.168.123.103:0/2613551374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7020104380 0x7f70201047e0 unknown :-1 s=CLOSED pgs=258 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:46.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.538+0000 7f702740c700 1 --2- 192.168.123.103:0/2613551374 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7020103180 0x7f70201035a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:46.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.538+0000 7f702740c700 1 -- 192.168.123.103:0/2613551374 >> 192.168.123.103:0/2613551374 conn(0x7f70200fe720 msgr2=0x7f7020100b60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:46.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.540+0000 7f702740c700 1 -- 192.168.123.103:0/2613551374 shutdown_connections 2026-03-09T00:01:46.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.540+0000 7f702740c700 1 -- 192.168.123.103:0/2613551374 wait complete. 
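Every cephadm.shell task above bootstraps a fresh librados client, which is why the stderr repeats the same msgr2 lifecycle each time: connect, _handle_peer_banner_payload, ready, then mark_down, stop, shutdown_connections, and "wait complete." once the command returns. When triaging a run it is often enough to strip that chatter and look only at the monitor commands and their acks. A rough bash sketch, where `teuthology.log` is a hypothetical name for this captured log:

    # Keep only the administrative round-trips, dropping the messenger
    # lifecycle noise that dominates the stderr stream.
    grep -E 'mon_command(_ack)?\(' teuthology.log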
2026-03-09T00:01:46.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.542+0000 7f702740c700 1 Processor -- start 2026-03-09T00:01:46.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.549+0000 7f702740c700 1 -- start start 2026-03-09T00:01:46.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.549+0000 7f702740c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7020103180 0x7f7020071cd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:46.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.549+0000 7f702740c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7020104380 0x7f7020072210 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:46.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.549+0000 7f702740c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7020072830 con 0x7f7020103180 2026-03-09T00:01:46.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.549+0000 7f702740c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f70201a6800 con 0x7f7020104380 2026-03-09T00:01:46.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.550+0000 7f70249a7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7020104380 0x7f7020072210 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:46.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.550+0000 7f70249a7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7020104380 0x7f7020072210 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:57834/0 (socket says 192.168.123.103:57834) 2026-03-09T00:01:46.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.550+0000 7f70249a7700 1 -- 192.168.123.103:0/3153555182 learned_addr learned my addr 192.168.123.103:0/3153555182 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:46.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.550+0000 7f70251a8700 1 --2- 192.168.123.103:0/3153555182 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7020103180 0x7f7020071cd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:46.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.550+0000 7f70249a7700 1 -- 192.168.123.103:0/3153555182 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7020103180 msgr2=0x7f7020071cd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:46.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.550+0000 7f70249a7700 1 --2- 192.168.123.103:0/3153555182 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7020103180 0x7f7020071cd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:46.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.550+0000 7f70249a7700 1 -- 192.168.123.103:0/3153555182 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f701c0097e0 con 0x7f7020104380 2026-03-09T00:01:46.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.551+0000 7f70249a7700 1 --2- 192.168.123.103:0/3153555182 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7020104380 0x7f7020072210 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f701c005230 tx=0x7f701c004c30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:46.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.552+0000 7f70127fc700 1 -- 192.168.123.103:0/3153555182 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f701c01d070 con 0x7f7020104380 2026-03-09T00:01:46.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.552+0000 7f702740c700 1 -- 192.168.123.103:0/3153555182 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f70201a69a0 con 0x7f7020104380 2026-03-09T00:01:46.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.552+0000 7f702740c700 1 -- 192.168.123.103:0/3153555182 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f70201a6e40 con 0x7f7020104380 2026-03-09T00:01:46.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.553+0000 7f70127fc700 1 -- 192.168.123.103:0/3153555182 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f701c00bc50 con 0x7f7020104380 2026-03-09T00:01:46.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.553+0000 7f70127fc700 1 -- 192.168.123.103:0/3153555182 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f701c00f780 con 0x7f7020104380 2026-03-09T00:01:46.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.554+0000 7f70127fc700 1 -- 192.168.123.103:0/3153555182 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f701c00f8e0 con 0x7f7020104380 2026-03-09T00:01:46.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.554+0000 7f70127fc700 1 --2- 192.168.123.103:0/3153555182 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f700c06c530 0x7f700c06e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:46.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.555+0000 7f70127fc700 1 -- 192.168.123.103:0/3153555182 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f701c08df60 con 0x7f7020104380 2026-03-09T00:01:46.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.556+0000 7f702740c700 1 -- 192.168.123.103:0/3153555182 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7004005320 con 0x7f7020104380 2026-03-09T00:01:46.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.556+0000 7f70251a8700 1 --2- 192.168.123.103:0/3153555182 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f700c06c530 0x7f700c06e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:46.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.559+0000 7f70251a8700 1 --2- 192.168.123.103:0/3153555182 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f700c06c530 0x7f700c06e9f0 
secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f70201a2770 tx=0x7f7014009380 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:46.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.559+0000 7f70127fc700 1 -- 192.168.123.103:0/3153555182 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f701c0bb0e0 con 0x7f7020104380 2026-03-09T00:01:46.692 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:46.693+0000 7f702740c700 1 -- 192.168.123.103:0/3153555182 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true} v 0) v1 -- 0x7f7004005190 con 0x7f7020104380 2026-03-09T00:01:47.151 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.152+0000 7f70127fc700 1 -- 192.168.123.103:0/3153555182 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]=0 inline data enabled v7) v1 ==== 168+0+0 (secure 0 0 0) 0x7f701c027080 con 0x7f7020104380 2026-03-09T00:01:47.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.154+0000 7f702740c700 1 -- 192.168.123.103:0/3153555182 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f700c06c530 msgr2=0x7f700c06e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:47.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.154+0000 7f702740c700 1 --2- 192.168.123.103:0/3153555182 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f700c06c530 0x7f700c06e9f0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f70201a2770 tx=0x7f7014009380 comp rx=0 tx=0).stop 2026-03-09T00:01:47.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.154+0000 7f702740c700 1 -- 192.168.123.103:0/3153555182 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7020104380 msgr2=0x7f7020072210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:47.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.154+0000 7f702740c700 1 --2- 192.168.123.103:0/3153555182 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7020104380 0x7f7020072210 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f701c005230 tx=0x7f701c004c30 comp rx=0 tx=0).stop 2026-03-09T00:01:47.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.156+0000 7f702740c700 1 -- 192.168.123.103:0/3153555182 shutdown_connections 2026-03-09T00:01:47.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.156+0000 7f702740c700 1 --2- 192.168.123.103:0/3153555182 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f700c06c530 0x7f700c06e9f0 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:47.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.156+0000 7f702740c700 1 --2- 192.168.123.103:0/3153555182 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7020103180 0x7f7020071cd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:47.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.156+0000 7f702740c700 1 --2- 192.168.123.103:0/3153555182 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7f7020104380 0x7f7020072210 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:47.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.156+0000 7f702740c700 1 -- 192.168.123.103:0/3153555182 >> 192.168.123.103:0/3153555182 conn(0x7f70200fe720 msgr2=0x7f70201075b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:47.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.156+0000 7f702740c700 1 -- 192.168.123.103:0/3153555182 shutdown_connections 2026-03-09T00:01:47.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.156+0000 7f702740c700 1 -- 192.168.123.103:0/3153555182 wait complete. 2026-03-09T00:01:47.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:47 vm06 ceph-mon[58395]: pgmap v79: 65 pgs: 14 creating+peering, 14 unknown, 37 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 767 B/s wr, 5 op/s 2026-03-09T00:01:47.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:47 vm06 ceph-mon[58395]: Deploying daemon mds.cephfs.vm06.ixduim on vm06 2026-03-09T00:01:47.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:47 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/3153555182' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]: dispatch 2026-03-09T00:01:47.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:47 vm06 ceph-mon[58395]: from='client.? ' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]: dispatch 2026-03-09T00:01:47.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:47 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:47.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:47 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:47.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:47 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:47.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:47 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:47.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:47 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:47.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:47 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:01:47.158 INFO:teuthology.orchestra.run.vm03.stderr:inline data enabled 2026-03-09T00:01:47.213 INFO:teuthology.run_tasks:Running task cephadm.shell... 
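The ack above carries the mon's confirmation string, echoed on stderr as "inline data enabled", and the cephadm.shell task now starting re-reads the map with `ceph fs dump`. An equivalent one-off check from host.a would be the sketch below, assuming a `cephadm shell` session with the same conf and keyring (the dump's field spelling for the deprecated inline-data feature varies by release):

    # Inside "cephadm shell": the deprecated inline_data flag set two
    # tasks ago should now read as enabled in the MDSMap dump.
    ceph fs dump | grep -i inline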
2026-03-09T00:01:47.216 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T00:01:47.216 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- bash -c 'ceph fs dump' 2026-03-09T00:01:47.390 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:47.422 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:47 vm03 ceph-mon[52346]: pgmap v79: 65 pgs: 14 creating+peering, 14 unknown, 37 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 767 B/s wr, 5 op/s 2026-03-09T00:01:47.422 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:47 vm03 ceph-mon[52346]: Deploying daemon mds.cephfs.vm06.ixduim on vm06 2026-03-09T00:01:47.422 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:47 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/3153555182' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]: dispatch 2026-03-09T00:01:47.422 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:47 vm03 ceph-mon[52346]: from='client.? ' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]: dispatch 2026-03-09T00:01:47.422 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:47 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:47.422 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:47 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:47.422 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:47 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:47.422 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:47 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:47.422 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:47 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:47.422 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:47 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:01:47.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.691+0000 7f507c7b7700 1 -- 192.168.123.103:0/2560485181 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5074107d90 msgr2=0x7f507410a1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:47.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.691+0000 7f507c7b7700 1 --2- 192.168.123.103:0/2560485181 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5074107d90 0x7f507410a1c0 secure :-1 s=READY pgs=260 cs=0 l=1 rev1=1 crypto rx=0x7f5070009b00 tx=0x7f5070009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:47.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.691+0000 7f507c7b7700 1 -- 192.168.123.103:0/2560485181 shutdown_connections 2026-03-09T00:01:47.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.691+0000 7f507c7b7700 1 --2- 192.168.123.103:0/2560485181 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7f507410a700 0x7f507410cb90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:47.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.691+0000 7f507c7b7700 1 --2- 192.168.123.103:0/2560485181 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5074107d90 0x7f507410a1c0 unknown :-1 s=CLOSED pgs=260 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:47.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.691+0000 7f507c7b7700 1 -- 192.168.123.103:0/2560485181 >> 192.168.123.103:0/2560485181 conn(0x7f507406dae0 msgr2=0x7f507406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:47.692 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.693+0000 7f507c7b7700 1 -- 192.168.123.103:0/2560485181 shutdown_connections 2026-03-09T00:01:47.692 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.693+0000 7f507c7b7700 1 -- 192.168.123.103:0/2560485181 wait complete. 2026-03-09T00:01:47.692 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.693+0000 7f507c7b7700 1 Processor -- start 2026-03-09T00:01:47.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.693+0000 7f507c7b7700 1 -- start start 2026-03-09T00:01:47.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.694+0000 7f507c7b7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5074107d90 0x7f50741a52f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:47.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.694+0000 7f507c7b7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f507410a700 0x7f50741a5830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:47.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.694+0000 7f507c7b7700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f50741a5e50 con 0x7f507410a700 2026-03-09T00:01:47.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.694+0000 7f507c7b7700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f50741a5f90 con 0x7f5074107d90 2026-03-09T00:01:47.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.694+0000 7f5079d52700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f507410a700 0x7f50741a5830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:47.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.694+0000 7f5079d52700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f507410a700 0x7f50741a5830 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41734/0 (socket says 192.168.123.103:41734) 2026-03-09T00:01:47.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.694+0000 7f5079d52700 1 -- 192.168.123.103:0/3980263072 learned_addr learned my addr 192.168.123.103:0/3980263072 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:47.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.694+0000 7f507a553700 1 --2- 192.168.123.103:0/3980263072 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5074107d90 0x7f50741a52f0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:47.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.694+0000 7f507a553700 1 -- 192.168.123.103:0/3980263072 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f507410a700 msgr2=0x7f50741a5830 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:47.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.694+0000 7f507a553700 1 --2- 192.168.123.103:0/3980263072 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f507410a700 0x7f50741a5830 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:47.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.694+0000 7f507a553700 1 -- 192.168.123.103:0/3980263072 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f50700097e0 con 0x7f5074107d90 2026-03-09T00:01:47.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.694+0000 7f507a553700 1 --2- 192.168.123.103:0/3980263072 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5074107d90 0x7f50741a52f0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f5070005fd0 tx=0x7f507000bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:47.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.694+0000 7f506b7fe700 1 -- 192.168.123.103:0/3980263072 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f507001d070 con 0x7f5074107d90 2026-03-09T00:01:47.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.695+0000 7f507c7b7700 1 -- 192.168.123.103:0/3980263072 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f50741aa9e0 con 0x7f5074107d90 2026-03-09T00:01:47.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.695+0000 7f507c7b7700 1 -- 192.168.123.103:0/3980263072 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f50741aaea0 con 0x7f5074107d90 2026-03-09T00:01:47.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.699+0000 7f507c7b7700 1 -- 192.168.123.103:0/3980263072 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f507419f580 con 0x7f5074107d90 2026-03-09T00:01:47.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.699+0000 7f506b7fe700 1 -- 192.168.123.103:0/3980263072 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f507000bdf0 con 0x7f5074107d90 2026-03-09T00:01:47.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.700+0000 7f506b7fe700 1 -- 192.168.123.103:0/3980263072 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5070021620 con 0x7f5074107d90 2026-03-09T00:01:47.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.700+0000 7f506b7fe700 1 -- 192.168.123.103:0/3980263072 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f507002b430 con 0x7f5074107d90 2026-03-09T00:01:47.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.701+0000 7f506b7fe700 1 --2- 192.168.123.103:0/3980263072 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] 
conn(0x7f506006c600 0x7f506006eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:47.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.701+0000 7f5079d52700 1 --2- 192.168.123.103:0/3980263072 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f506006c600 0x7f506006eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:47.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.701+0000 7f5079d52700 1 --2- 192.168.123.103:0/3980263072 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f506006c600 0x7f506006eac0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f506c0060b0 tx=0x7f506c009040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:47.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.701+0000 7f506b7fe700 1 -- 192.168.123.103:0/3980263072 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f507008dc70 con 0x7f5074107d90 2026-03-09T00:01:47.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.702+0000 7f506b7fe700 1 -- 192.168.123.103:0/3980263072 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f507005c000 con 0x7f5074107d90 2026-03-09T00:01:47.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.846+0000 7f507c7b7700 1 -- 192.168.123.103:0/3980263072 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f507404ea90 con 0x7f5074107d90 2026-03-09T00:01:47.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.847+0000 7f506b7fe700 1 -- 192.168.123.103:0/3980263072 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 7 v7) v1 ==== 75+0+1773 (secure 0 0 0) 0x7f507005bb90 con 0x7f5074107d90 2026-03-09T00:01:47.849 INFO:teuthology.orchestra.run.vm03.stdout:e7 2026-03-09T00:01:47.849 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T00:01:47.849 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T00:01:47.849 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-09T00:01:47.849 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:01:47.849 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:epoch 7 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T00:01:42.952984+0000 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T00:01:47.147440+0000 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-09T00:01:47.850 
INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 0 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:in 0 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:up {0=24277} 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.vlrwtl{0:24277} state up:active seq 2 addr [v2:192.168.123.106:6824/3799306593,v1:192.168.123.106:6825/3799306593] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons: 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.sejksk{-1:14480} state up:standby seq 1 addr [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.ralade{-1:14492} state up:standby seq 1 addr [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T00:01:47.850 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.ixduim{-1:24287} state up:standby seq 1 addr [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T00:01:47.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.850+0000 7f507c7b7700 1 -- 192.168.123.103:0/3980263072 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f506006c600 msgr2=0x7f506006eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:47.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.850+0000 7f507c7b7700 1 --2- 192.168.123.103:0/3980263072 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f506006c600 0x7f506006eac0 secure :-1 s=READY pgs=100 
cs=0 l=1 rev1=1 crypto rx=0x7f506c0060b0 tx=0x7f506c009040 comp rx=0 tx=0).stop 2026-03-09T00:01:47.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.850+0000 7f507c7b7700 1 -- 192.168.123.103:0/3980263072 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5074107d90 msgr2=0x7f50741a52f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:47.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.850+0000 7f507c7b7700 1 --2- 192.168.123.103:0/3980263072 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5074107d90 0x7f50741a52f0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f5070005fd0 tx=0x7f507000bac0 comp rx=0 tx=0).stop 2026-03-09T00:01:47.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.851+0000 7f507c7b7700 1 -- 192.168.123.103:0/3980263072 shutdown_connections 2026-03-09T00:01:47.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.851+0000 7f507c7b7700 1 --2- 192.168.123.103:0/3980263072 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f506006c600 0x7f506006eac0 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:47.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.851+0000 7f507c7b7700 1 --2- 192.168.123.103:0/3980263072 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5074107d90 0x7f50741a52f0 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:47.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.851+0000 7f507c7b7700 1 --2- 192.168.123.103:0/3980263072 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f507410a700 0x7f50741a5830 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:47.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.851+0000 7f507c7b7700 1 -- 192.168.123.103:0/3980263072 >> 192.168.123.103:0/3980263072 conn(0x7f507406dae0 msgr2=0x7f507406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:47.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.851+0000 7f507c7b7700 1 -- 192.168.123.103:0/3980263072 shutdown_connections 2026-03-09T00:01:47.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:47.851+0000 7f507c7b7700 1 -- 192.168.123.103:0/3980263072 wait complete. 2026-03-09T00:01:47.855 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 7 2026-03-09T00:01:47.904 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- bash -c 'ceph --format=json fs dump | jq -e ".filesystems | length == 1"' 2026-03-09T00:01:48.088 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:48.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:48 vm03 ceph-mon[52346]: Health check failed: 1 filesystem with deprecated feature inline_data (FS_INLINE_DATA_DEPRECATED) 2026-03-09T00:01:48.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:48 vm03 ceph-mon[52346]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]': finished 2026-03-09T00:01:48.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:48 vm03 ceph-mon[52346]: mds.? [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] up:boot 2026-03-09T00:01:48.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:48 vm03 ceph-mon[52346]: fsmap cephfs:1 {0=cephfs.vm06.vlrwtl=up:active} 3 up:standby 2026-03-09T00:01:48.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:48 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.ixduim"}]: dispatch 2026-03-09T00:01:48.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:48 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:48.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:48 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:48.248 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:48 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/3980263072' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T00:01:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.373+0000 7f44c3fff700 1 -- 192.168.123.103:0/3157076606 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f44c4100c90 msgr2=0x7f44c41010b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.373+0000 7f44c3fff700 1 --2- 192.168.123.103:0/3157076606 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f44c4100c90 0x7f44c41010b0 secure :-1 s=READY pgs=261 cs=0 l=1 rev1=1 crypto rx=0x7f44ac009b00 tx=0x7f44ac009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.374+0000 7f44c3fff700 1 -- 192.168.123.103:0/3157076606 shutdown_connections 2026-03-09T00:01:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.374+0000 7f44c3fff700 1 --2- 192.168.123.103:0/3157076606 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f44c4101e90 0x7f44c41022f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.374+0000 7f44c3fff700 1 --2- 192.168.123.103:0/3157076606 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f44c4100c90 0x7f44c41010b0 unknown :-1 s=CLOSED pgs=261 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.374+0000 7f44c3fff700 1 -- 192.168.123.103:0/3157076606 >> 192.168.123.103:0/3157076606 conn(0x7f44c40fc210 msgr2=0x7f44c40fe670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.374+0000 7f44c3fff700 1 -- 192.168.123.103:0/3157076606 shutdown_connections 2026-03-09T00:01:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.374+0000 7f44c3fff700 1 -- 192.168.123.103:0/3157076606 wait complete. 
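The audit record just above ('fs set' with fs_name cephfs, var inline_data, val true, and the yes_i_really_really_mean_it acknowledgement) is the mon-command form of the CLI call sketched below, which is also what trips the FS_INLINE_DATA_DEPRECATED health check logged at the start of this block:

    # CLI equivalent of the audited mon command; inline data is a
    # deprecated CephFS feature, hence the confirmation flag.
    ceph fs set cephfs inline_data true --yes-i-really-really-mean-it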
2026-03-09T00:01:48.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.374+0000 7f44c3fff700 1 Processor -- start 2026-03-09T00:01:48.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.374+0000 7f44c3fff700 1 -- start start 2026-03-09T00:01:48.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.375+0000 7f44c3fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f44c4100c90 0x7f44c4196560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:48.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.375+0000 7f44c3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f44c4101e90 0x7f44c4196aa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:48.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.375+0000 7f44c3fff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f44c41970c0 con 0x7f44c4101e90 2026-03-09T00:01:48.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.375+0000 7f44c3fff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f44c4197200 con 0x7f44c4100c90 2026-03-09T00:01:48.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.375+0000 7f44c27fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f44c4101e90 0x7f44c4196aa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:48.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.375+0000 7f44c27fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f44c4101e90 0x7f44c4196aa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41756/0 (socket says 192.168.123.103:41756) 2026-03-09T00:01:48.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.375+0000 7f44c27fc700 1 -- 192.168.123.103:0/1630742026 learned_addr learned my addr 192.168.123.103:0/1630742026 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:48.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.375+0000 7f44c27fc700 1 -- 192.168.123.103:0/1630742026 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f44c4100c90 msgr2=0x7f44c4196560 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:01:48.374 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.375+0000 7f44c2ffd700 1 --2- 192.168.123.103:0/1630742026 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f44c4100c90 0x7f44c4196560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:48.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.375+0000 7f44c27fc700 1 --2- 192.168.123.103:0/1630742026 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f44c4100c90 0x7f44c4196560 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:48.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.375+0000 7f44c27fc700 1 -- 192.168.123.103:0/1630742026 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f44ac0097e0 con 
0x7f44c4101e90 2026-03-09T00:01:48.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.375+0000 7f44c27fc700 1 --2- 192.168.123.103:0/1630742026 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f44c4101e90 0x7f44c4196aa0 secure :-1 s=READY pgs=262 cs=0 l=1 rev1=1 crypto rx=0x7f44b400d8d0 tx=0x7f44b400dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:48.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.375+0000 7f44c2ffd700 1 --2- 192.168.123.103:0/1630742026 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f44c4100c90 0x7f44c4196560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:01:48.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.376+0000 7f44bbfff700 1 -- 192.168.123.103:0/1630742026 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f44b4009940 con 0x7f44c4101e90 2026-03-09T00:01:48.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.376+0000 7f44bbfff700 1 -- 192.168.123.103:0/1630742026 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f44b4010460 con 0x7f44c4101e90 2026-03-09T00:01:48.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.376+0000 7f44bbfff700 1 -- 192.168.123.103:0/1630742026 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f44b400f5d0 con 0x7f44c4101e90 2026-03-09T00:01:48.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.376+0000 7f44c3fff700 1 -- 192.168.123.103:0/1630742026 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f44c4073020 con 0x7f44c4101e90 2026-03-09T00:01:48.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.376+0000 7f44c3fff700 1 -- 192.168.123.103:0/1630742026 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f44c4073570 con 0x7f44c4101e90 2026-03-09T00:01:48.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.377+0000 7f44bbfff700 1 -- 192.168.123.103:0/1630742026 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f44b40105d0 con 0x7f44c4101e90 2026-03-09T00:01:48.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.377+0000 7f44bbfff700 1 --2- 192.168.123.103:0/1630742026 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f44b006c600 0x7f44b006eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:48.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.377+0000 7f44bbfff700 1 -- 192.168.123.103:0/1630742026 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f44b408b470 con 0x7f44c4101e90 2026-03-09T00:01:48.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.378+0000 7f44c3fff700 1 -- 192.168.123.103:0/1630742026 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f44c404ea90 con 0x7f44c4101e90 2026-03-09T00:01:48.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.378+0000 7f44c2ffd700 1 --2- 192.168.123.103:0/1630742026 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f44b006c600 0x7f44b006eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:48.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.381+0000 7f44bbfff700 1 -- 192.168.123.103:0/1630742026 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f44b4059570 con 0x7f44c4101e90 2026-03-09T00:01:48.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.386+0000 7f44c2ffd700 1 --2- 192.168.123.103:0/1630742026 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f44b006c600 0x7f44b006eac0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f44ac000c00 tx=0x7f44ac005c00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:48.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:48 vm06 ceph-mon[58395]: Health check failed: 1 filesystem with deprecated feature inline_data (FS_INLINE_DATA_DEPRECATED) 2026-03-09T00:01:48.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:48 vm06 ceph-mon[58395]: from='client.? ' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]': finished 2026-03-09T00:01:48.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:48 vm06 ceph-mon[58395]: mds.? [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] up:boot 2026-03-09T00:01:48.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:48 vm06 ceph-mon[58395]: fsmap cephfs:1 {0=cephfs.vm06.vlrwtl=up:active} 3 up:standby 2026-03-09T00:01:48.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:48 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.ixduim"}]: dispatch 2026-03-09T00:01:48.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:48 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:48.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:48 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:48.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:48 vm06 ceph-mon[58395]: from='client.? 
192.168.123.103:0/3980263072' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T00:01:48.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.538+0000 7f44c3fff700 1 -- 192.168.123.103:0/1630742026 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f44c4073ce0 con 0x7f44c4101e90 2026-03-09T00:01:48.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.539+0000 7f44bbfff700 1 -- 192.168.123.103:0/1630742026 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 8 v8) v1 ==== 93+0+3969 (secure 0 0 0) 0x7f44b4016020 con 0x7f44c4101e90 2026-03-09T00:01:48.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.542+0000 7f44b9ffb700 1 -- 192.168.123.103:0/1630742026 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f44b006c600 msgr2=0x7f44b006eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:48.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.542+0000 7f44b9ffb700 1 --2- 192.168.123.103:0/1630742026 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f44b006c600 0x7f44b006eac0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f44ac000c00 tx=0x7f44ac005c00 comp rx=0 tx=0).stop 2026-03-09T00:01:48.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.542+0000 7f44b9ffb700 1 -- 192.168.123.103:0/1630742026 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f44c4101e90 msgr2=0x7f44c4196aa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:48.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.542+0000 7f44b9ffb700 1 --2- 192.168.123.103:0/1630742026 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f44c4101e90 0x7f44c4196aa0 secure :-1 s=READY pgs=262 cs=0 l=1 rev1=1 crypto rx=0x7f44b400d8d0 tx=0x7f44b400dc90 comp rx=0 tx=0).stop 2026-03-09T00:01:48.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.542+0000 7f44b9ffb700 1 -- 192.168.123.103:0/1630742026 shutdown_connections 2026-03-09T00:01:48.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.542+0000 7f44b9ffb700 1 --2- 192.168.123.103:0/1630742026 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f44b006c600 0x7f44b006eac0 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:48.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.542+0000 7f44b9ffb700 1 --2- 192.168.123.103:0/1630742026 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f44c4100c90 0x7f44c4196560 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:48.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.542+0000 7f44b9ffb700 1 --2- 192.168.123.103:0/1630742026 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f44c4101e90 0x7f44c4196aa0 unknown :-1 s=CLOSED pgs=262 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:48.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.542+0000 7f44b9ffb700 1 -- 192.168.123.103:0/1630742026 >> 192.168.123.103:0/1630742026 conn(0x7f44c40fc210 msgr2=0x7f44c41050c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:48.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.542+0000 7f44b9ffb700 1 -- 192.168.123.103:0/1630742026 shutdown_connections 
2026-03-09T00:01:48.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:48.542+0000 7f44b9ffb700 1 -- 192.168.123.103:0/1630742026 wait complete. 2026-03-09T00:01:48.544 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 8 2026-03-09T00:01:48.551 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T00:01:48.595 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- bash -c 'while ! ceph --format=json mds versions | jq -e ". | add == 4"; do sleep 1; done' 2026-03-09T00:01:48.780 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:49.053 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.053+0000 7fa01edbc700 1 -- 192.168.123.103:0/1436390174 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa0100a61c0 msgr2=0x7fa0100a65e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:49.054 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.053+0000 7fa01edbc700 1 --2- 192.168.123.103:0/1436390174 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa0100a61c0 0x7fa0100a65e0 secure :-1 s=READY pgs=263 cs=0 l=1 rev1=1 crypto rx=0x7fa0140099b0 tx=0x7fa014009cc0 comp rx=0 tx=0).stop 2026-03-09T00:01:49.054 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.053+0000 7fa01edbc700 1 -- 192.168.123.103:0/1436390174 shutdown_connections 2026-03-09T00:01:49.054 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.053+0000 7fa01edbc700 1 --2- 192.168.123.103:0/1436390174 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa0100a4800 0x7fa0100a4c60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:49.054 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.053+0000 7fa01edbc700 1 --2- 192.168.123.103:0/1436390174 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa0100a61c0 0x7fa0100a65e0 unknown :-1 s=CLOSED pgs=263 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:49.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.053+0000 7fa01edbc700 1 -- 192.168.123.103:0/1436390174 >> 192.168.123.103:0/1436390174 conn(0x7fa0100a0160 msgr2=0x7fa0100a25c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:49.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.055+0000 7fa01edbc700 1 -- 192.168.123.103:0/1436390174 shutdown_connections 2026-03-09T00:01:49.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.055+0000 7fa01edbc700 1 -- 192.168.123.103:0/1436390174 wait complete. 
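The `true` printed to stdout above is the one-filesystem assertion succeeding. These assertions run through `cephadm shell`, which executes the quoted bash pipeline inside a v18.2.1 container against the given fsid; the check itself leans on `jq -e`, which derives its exit status from the filter's last output value. A minimal standalone sketch of the same assertion:

    # jq -e exits 0 only when the last output value is neither false nor
    # null, so the pipeline's exit status encodes "exactly one filesystem".
    ceph --format=json fs dump | jq -e '.filesystems | length == 1'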
2026-03-09T00:01:49.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.055+0000 7fa01edbc700 1 Processor -- start 2026-03-09T00:01:49.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.055+0000 7fa01edbc700 1 -- start start 2026-03-09T00:01:49.058 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.055+0000 7fa01edbc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa0100a4800 0x7fa0100110c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:49.058 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.055+0000 7fa01edbc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa010011600 0x7fa010012650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:49.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.055+0000 7fa01edbc700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa010011a80 con 0x7fa0100a4800 2026-03-09T00:01:49.059 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.055+0000 7fa01edbc700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa010011bf0 con 0x7fa010011600 2026-03-09T00:01:49.060 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.056+0000 7fa01d5b9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa010011600 0x7fa010012650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:49.062 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.056+0000 7fa01d5b9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa010011600 0x7fa010012650 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:57890/0 (socket says 192.168.123.103:57890) 2026-03-09T00:01:49.062 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.056+0000 7fa01d5b9700 1 -- 192.168.123.103:0/3579847978 learned_addr learned my addr 192.168.123.103:0/3579847978 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:49.062 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.056+0000 7fa01ddba700 1 --2- 192.168.123.103:0/3579847978 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa0100a4800 0x7fa0100110c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:49.062 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.057+0000 7fa01d5b9700 1 -- 192.168.123.103:0/3579847978 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa0100a4800 msgr2=0x7fa0100110c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:49.062 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.057+0000 7fa01d5b9700 1 --2- 192.168.123.103:0/3579847978 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa0100a4800 0x7fa0100110c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:49.062 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.057+0000 7fa01d5b9700 1 -- 192.168.123.103:0/3579847978 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fa014009650 con 0x7fa010011600 2026-03-09T00:01:49.062 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.058+0000 7fa01d5b9700 1 --2- 192.168.123.103:0/3579847978 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa010011600 0x7fa010012650 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fa01804f8e0 tx=0x7fa01806f7d0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:49.063 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.058+0000 7fa00effd700 1 -- 192.168.123.103:0/3579847978 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa018074070 con 0x7fa010011600 2026-03-09T00:01:49.063 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.059+0000 7fa00effd700 1 -- 192.168.123.103:0/3579847978 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa018069320 con 0x7fa010011600 2026-03-09T00:01:49.063 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.059+0000 7fa00effd700 1 -- 192.168.123.103:0/3579847978 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa0180799e0 con 0x7fa010011600 2026-03-09T00:01:49.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.059+0000 7fa01edbc700 1 -- 192.168.123.103:0/3579847978 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa010012b90 con 0x7fa010011600 2026-03-09T00:01:49.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.059+0000 7fa01edbc700 1 -- 192.168.123.103:0/3579847978 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa010013010 con 0x7fa010011600 2026-03-09T00:01:49.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.061+0000 7fa00effd700 1 -- 192.168.123.103:0/3579847978 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fa018079b40 con 0x7fa010011600 2026-03-09T00:01:49.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.061+0000 7fa00effd700 1 --2- 192.168.123.103:0/3579847978 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa00406c330 0x7fa00406e7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:49.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.061+0000 7fa00effd700 1 -- 192.168.123.103:0/3579847978 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fa0180eda30 con 0x7fa010011600 2026-03-09T00:01:49.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.063+0000 7fa01ddba700 1 --2- 192.168.123.103:0/3579847978 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa00406c330 0x7fa00406e7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:49.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.064+0000 7fa01ddba700 1 --2- 192.168.123.103:0/3579847978 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa00406c330 0x7fa00406e7f0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7fa014005fd0 tx=0x7fa01400b560 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:49.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.066+0000 7fa01edbc700 1 -- 
192.168.123.103:0/3579847978 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa010004f40 con 0x7fa010011600 2026-03-09T00:01:49.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.070+0000 7fa00effd700 1 -- 192.168.123.103:0/3579847978 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fa0180bbd60 con 0x7fa010011600 2026-03-09T00:01:49.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.225+0000 7fa01edbc700 1 -- 192.168.123.103:0/3579847978 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "mds versions", "format": "json"} v 0) v1 -- 0x7fa0100132f0 con 0x7fa010011600 2026-03-09T00:01:49.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.226+0000 7fa00effd700 1 -- 192.168.123.103:0/3579847978 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "mds versions", "format": "json"}]=0 v8) v1 ==== 78+0+83 (secure 0 0 0) 0x7fa0180bb8f0 con 0x7fa010011600 2026-03-09T00:01:49.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.233+0000 7fa01edbc700 1 -- 192.168.123.103:0/3579847978 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa00406c330 msgr2=0x7fa00406e7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:49.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.233+0000 7fa01edbc700 1 --2- 192.168.123.103:0/3579847978 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa00406c330 0x7fa00406e7f0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7fa014005fd0 tx=0x7fa01400b560 comp rx=0 tx=0).stop 2026-03-09T00:01:49.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.233+0000 7fa01edbc700 1 -- 192.168.123.103:0/3579847978 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa010011600 msgr2=0x7fa010012650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:49.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.233+0000 7fa01edbc700 1 --2- 192.168.123.103:0/3579847978 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa010011600 0x7fa010012650 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fa01804f8e0 tx=0x7fa01806f7d0 comp rx=0 tx=0).stop 2026-03-09T00:01:49.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.234+0000 7fa01edbc700 1 -- 192.168.123.103:0/3579847978 shutdown_connections 2026-03-09T00:01:49.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.234+0000 7fa01edbc700 1 --2- 192.168.123.103:0/3579847978 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa00406c330 0x7fa00406e7f0 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:49.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.234+0000 7fa01edbc700 1 --2- 192.168.123.103:0/3579847978 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa0100a4800 0x7fa0100110c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:49.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.234+0000 7fa01edbc700 1 --2- 192.168.123.103:0/3579847978 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa010011600 0x7fa010012650 secure :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7fa01804f8e0 tx=0x7fa01806f7d0 comp rx=0 tx=0).stop 
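The `mds versions` ack above serves the polling loop dispatched a moment earlier: `ceph --format=json mds versions` returns an object mapping each version string to the number of MDS daemons running it, and jq's `add` sums those counts, so the loop blocks until all four MDS daemons (one active plus three standbys, per the fsmap above) have registered. An equivalent sketch:

    # Block until the per-version daemon counts sum to 4; jq's `add`
    # applied to an object sums its values.
    until ceph --format=json mds versions | jq -e 'add == 4' >/dev/null; do
        sleep 1
    done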
2026-03-09T00:01:49.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.234+0000 7fa01edbc700 1 -- 192.168.123.103:0/3579847978 >> 192.168.123.103:0/3579847978 conn(0x7fa0100a0160 msgr2=0x7fa0100a0e60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:49.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.234+0000 7fa01edbc700 1 -- 192.168.123.103:0/3579847978 shutdown_connections 2026-03-09T00:01:49.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.234+0000 7fa01edbc700 1 -- 192.168.123.103:0/3579847978 wait complete. 2026-03-09T00:01:49.245 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T00:01:49.291 INFO:teuthology.run_tasks:Running task fs.pre_upgrade_save... 2026-03-09T00:01:49.295 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 2026-03-09T00:01:49.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:49 vm03 ceph-mon[52346]: pgmap v80: 65 pgs: 65 active+clean; 450 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s wr, 7 op/s 2026-03-09T00:01:49.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:49 vm03 ceph-mon[52346]: Dropping low affinity active daemon mds.cephfs.vm06.vlrwtl in favor of higher affinity standby. 2026-03-09T00:01:49.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:49 vm03 ceph-mon[52346]: Replacing daemon mds.cephfs.vm06.vlrwtl as rank 0 with standby daemon mds.cephfs.vm03.sejksk 2026-03-09T00:01:49.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:49 vm03 ceph-mon[52346]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T00:01:49.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:49 vm03 ceph-mon[52346]: osdmap e39: 6 total, 6 up, 6 in 2026-03-09T00:01:49.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:49 vm03 ceph-mon[52346]: mds.? [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] up:standby 2026-03-09T00:01:49.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:49 vm03 ceph-mon[52346]: fsmap cephfs:1/1 {0=cephfs.vm03.sejksk=up:replay} 2 up:standby 2026-03-09T00:01:49.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:49 vm03 ceph-mon[52346]: from='client.? 
192.168.123.103:0/1630742026' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T00:01:49.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:49 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:49.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:49 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:49.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:49 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:49.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:49 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:01:49.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:49 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:49.333 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:49 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:01:49.476 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:49.531 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:49 vm06 ceph-mon[58395]: pgmap v80: 65 pgs: 65 active+clean; 450 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.2 KiB/s wr, 7 op/s 2026-03-09T00:01:49.531 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:49 vm06 ceph-mon[58395]: Dropping low affinity active daemon mds.cephfs.vm06.vlrwtl in favor of higher affinity standby. 2026-03-09T00:01:49.531 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:49 vm06 ceph-mon[58395]: Replacing daemon mds.cephfs.vm06.vlrwtl as rank 0 with standby daemon mds.cephfs.vm03.sejksk 2026-03-09T00:01:49.531 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:49 vm06 ceph-mon[58395]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T00:01:49.531 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:49 vm06 ceph-mon[58395]: osdmap e39: 6 total, 6 up, 6 in 2026-03-09T00:01:49.531 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:49 vm06 ceph-mon[58395]: mds.? [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] up:standby 2026-03-09T00:01:49.531 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:49 vm06 ceph-mon[58395]: fsmap cephfs:1/1 {0=cephfs.vm03.sejksk=up:replay} 2 up:standby 2026-03-09T00:01:49.531 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:49 vm06 ceph-mon[58395]: from='client.? 
192.168.123.103:0/1630742026' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T00:01:49.531 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:49 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:49.531 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:49 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:49.531 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:49 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:49.531 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:49 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:01:49.531 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:49 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:49.531 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:49 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:01:49.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.759+0000 7f3939ae8700 1 -- 192.168.123.103:0/3829314074 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f392c099ed0 msgr2=0x7f392c09a2b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:49.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.759+0000 7f3939ae8700 1 --2- 192.168.123.103:0/3829314074 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f392c099ed0 0x7f392c09a2b0 secure :-1 s=READY pgs=264 cs=0 l=1 rev1=1 crypto rx=0x7f3928009b00 tx=0x7f3928009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:49.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.759+0000 7f3939ae8700 1 -- 192.168.123.103:0/3829314074 shutdown_connections 2026-03-09T00:01:49.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.759+0000 7f3939ae8700 1 --2- 192.168.123.103:0/3829314074 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f392c0980a0 0x7f392c098520 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:49.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.759+0000 7f3939ae8700 1 --2- 192.168.123.103:0/3829314074 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f392c099ed0 0x7f392c09a2b0 unknown :-1 s=CLOSED pgs=264 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:49.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.759+0000 7f3939ae8700 1 -- 192.168.123.103:0/3829314074 >> 192.168.123.103:0/3829314074 conn(0x7f392c00daf0 msgr2=0x7f392c00df00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:49.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.760+0000 7f3939ae8700 1 -- 192.168.123.103:0/3829314074 shutdown_connections 2026-03-09T00:01:49.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.760+0000 7f3939ae8700 1 -- 192.168.123.103:0/3829314074 wait complete. 
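The monitor records above show rank 0 changing hands: the active mds.cephfs.vm06.vlrwtl is dropped in favor of a higher-affinity standby and replaced by mds.cephfs.vm03.sejksk, which re-enters service through up:replay (the JSON dump below catches it at up:reconnect), raising a transient FS_DEGRADED while the rank is not yet up:active. Per-daemon rank and state can be read straight out of the `fs dump --format=json` payload that fs.pre_upgrade_save captures below; a minimal sketch, using only field names visible in that dump:

    # Print "<name> rank=<rank> <state>" for each in-map MDS;
    # .info[] iterates the gid-keyed object's values.
    ceph fs dump --format=json |
      jq -r '.filesystems[].mdsmap.info[] | "\(.name) rank=\(.rank) \(.state)"'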
2026-03-09T00:01:49.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.761+0000 7f3939ae8700 1 Processor -- start 2026-03-09T00:01:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.761+0000 7f3939ae8700 1 -- start start 2026-03-09T00:01:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.761+0000 7f3939ae8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f392c0980a0 0x7f392c1438e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.761+0000 7f3939ae8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f392c099ed0 0x7f392c00bdd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.761+0000 7f3939ae8700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f392c143fe0 con 0x7f392c0980a0 2026-03-09T00:01:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.761+0000 7f3939ae8700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f392c144150 con 0x7f392c099ed0 2026-03-09T00:01:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.761+0000 7f3938ae6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f392c0980a0 0x7f392c1438e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.761+0000 7f3938ae6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f392c0980a0 0x7f392c1438e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41782/0 (socket says 192.168.123.103:41782) 2026-03-09T00:01:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.761+0000 7f3938ae6700 1 -- 192.168.123.103:0/722983608 learned_addr learned my addr 192.168.123.103:0/722983608 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.761+0000 7f3933fff700 1 --2- 192.168.123.103:0/722983608 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f392c099ed0 0x7f392c00bdd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.762+0000 7f3938ae6700 1 -- 192.168.123.103:0/722983608 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f392c099ed0 msgr2=0x7f392c00bdd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.762+0000 7f3938ae6700 1 --2- 192.168.123.103:0/722983608 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f392c099ed0 0x7f392c00bdd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.762+0000 7f3938ae6700 1 -- 192.168.123.103:0/722983608 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f39280097e0 con 
0x7f392c0980a0 2026-03-09T00:01:49.762 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.762+0000 7f3938ae6700 1 --2- 192.168.123.103:0/722983608 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f392c0980a0 0x7f392c1438e0 secure :-1 s=READY pgs=265 cs=0 l=1 rev1=1 crypto rx=0x7f392800b5c0 tx=0x7f39280049e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:49.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.762+0000 7f3931ffb700 1 -- 192.168.123.103:0/722983608 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f392801d070 con 0x7f392c0980a0 2026-03-09T00:01:49.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.762+0000 7f3931ffb700 1 -- 192.168.123.103:0/722983608 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3928004b80 con 0x7f392c0980a0 2026-03-09T00:01:49.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.763+0000 7f3931ffb700 1 -- 192.168.123.103:0/722983608 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f392800f670 con 0x7f392c0980a0 2026-03-09T00:01:49.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.763+0000 7f3939ae8700 1 -- 192.168.123.103:0/722983608 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f392c00c450 con 0x7f392c0980a0 2026-03-09T00:01:49.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.763+0000 7f3939ae8700 1 -- 192.168.123.103:0/722983608 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f392c00c8c0 con 0x7f392c0980a0 2026-03-09T00:01:49.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.763+0000 7f3939ae8700 1 -- 192.168.123.103:0/722983608 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f392c00cba0 con 0x7f392c0980a0 2026-03-09T00:01:49.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.768+0000 7f3931ffb700 1 -- 192.168.123.103:0/722983608 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f392800bc50 con 0x7f392c0980a0 2026-03-09T00:01:49.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.769+0000 7f3931ffb700 1 --2- 192.168.123.103:0/722983608 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f392406c380 0x7f392406e840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:49.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.769+0000 7f3931ffb700 1 -- 192.168.123.103:0/722983608 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f392808d740 con 0x7f392c0980a0 2026-03-09T00:01:49.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.769+0000 7f3931ffb700 1 -- 192.168.123.103:0/722983608 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f392808dbc0 con 0x7f392c0980a0 2026-03-09T00:01:49.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.769+0000 7f3933fff700 1 --2- 192.168.123.103:0/722983608 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f392406c380 0x7f392406e840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:49.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.770+0000 7f3933fff700 1 --2- 192.168.123.103:0/722983608 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f392406c380 0x7f392406e840 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f3920006fd0 tx=0x7f3920008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:49.898 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:01:49.898 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":9,"default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14492,"name":"cephfs.vm03.ralade","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/3870847623","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3870847623},{"type":"v1","addr":"192.168.123.103:6829","nonce":3870847623}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":5},{"gid":24287,"name":"cephfs.vm06.ixduim","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/1001012017","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":1001012017},{"type":"v1","addr":"192.168.123.106:6827","nonce":1001012017}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":7},{"gid":24297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1090058295","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1090058295},{"type":"v1","addr":"192.168.123.106:6825","nonce":1090058295}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":9}],"filesystems":[{"mdsmap":{"epoch":9,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:01:49.417295+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"last_failure":0,"last_failure_osd_epoch":39,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14480},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14480":{"gid":14480,"name":"cephfs.vm03.sejksk","rank":0,"incarnation":8,"state":"up:reconnect","state_seq":3,"addr":"192.168.123.103:6827/3708505754","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":3708505754},{"type":"v1","addr":"192.168.123.103:6827","nonce":3708505754}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1},"id":1}]} 2026-03-09T00:01:49.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.897+0000 7f3939ae8700 1 -- 192.168.123.103:0/722983608 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f392c0044c0 con 0x7f392c0980a0 2026-03-09T00:01:49.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.898+0000 7f3931ffb700 1 -- 192.168.123.103:0/722983608 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 9 v9) v1 ==== 93+0+4754 (secure 0 0 0) 0x7f3928057fe0 con 0x7f392c0980a0 2026-03-09T00:01:49.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.901+0000 7f3939ae8700 1 -- 192.168.123.103:0/722983608 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f392406c380 msgr2=0x7f392406e840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:49.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.901+0000 7f3939ae8700 1 --2- 192.168.123.103:0/722983608 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f392406c380 0x7f392406e840 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f3920006fd0 tx=0x7f3920008040 comp rx=0 tx=0).stop 2026-03-09T00:01:49.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.901+0000 7f3939ae8700 1 -- 192.168.123.103:0/722983608 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f392c0980a0 msgr2=0x7f392c1438e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:49.900 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.901+0000 7f3939ae8700 1 --2- 192.168.123.103:0/722983608 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f392c0980a0 0x7f392c1438e0 secure :-1 s=READY pgs=265 cs=0 l=1 rev1=1 crypto rx=0x7f392800b5c0 tx=0x7f39280049e0 comp rx=0 tx=0).stop 2026-03-09T00:01:49.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.902+0000 7f3939ae8700 1 -- 192.168.123.103:0/722983608 shutdown_connections 2026-03-09T00:01:49.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.902+0000 7f3939ae8700 1 --2- 192.168.123.103:0/722983608 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f392406c380 0x7f392406e840 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:49.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.902+0000 7f3939ae8700 1 --2- 192.168.123.103:0/722983608 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f392c0980a0 0x7f392c1438e0 unknown :-1 s=CLOSED pgs=265 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:49.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.902+0000 7f3939ae8700 1 --2- 192.168.123.103:0/722983608 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f392c099ed0 0x7f392c00bdd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:49.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.902+0000 7f3939ae8700 1 -- 192.168.123.103:0/722983608 >> 192.168.123.103:0/722983608 conn(0x7f392c00daf0 msgr2=0x7f392c095fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:49.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.903+0000 7f3939ae8700 1 -- 192.168.123.103:0/722983608 shutdown_connections 2026-03-09T00:01:49.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:49.903+0000 7f3939ae8700 1 -- 192.168.123.103:0/722983608 wait complete. 2026-03-09T00:01:49.904 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 9 2026-03-09T00:01:49.943 DEBUG:tasks.fs:fs fscid=1,name=cephfs state = {'epoch': 9, 'max_mds': 1, 'flags': 18} 2026-03-09T00:01:49.943 INFO:teuthology.run_tasks:Running task kclient... 2026-03-09T00:01:49.953 INFO:tasks.kclient:Mounting kernel clients... 
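The large JSON blob above is the monitor's reply to "ceph fs dump --format=json": one filesystem (fscid 1, epoch 9, max_mds 1) whose single active rank, cephfs.vm03.sejksk, is still in up:reconnect, plus three standbys. The DEBUG:tasks.fs line that follows condenses it to {'epoch': 9, 'max_mds': 1, 'flags': 18}. A minimal sketch for pulling out just those fields by hand, assuming jq is available on the node as it is elsewhere in this job:

    ceph fs dump --format=json | jq -r '
      .filesystems[] | .mdsmap
      | "\(.fs_name) epoch=\(.epoch) max_mds=\(.max_mds) "
        + ([.info[] | "\(.name)=\(.state)"] | join(" "))'
    # expected at this point in the run:
    #   cephfs epoch=9 max_mds=1 cephfs.vm03.sejksk=up:reconnect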
2026-03-09T00:01:49.953 INFO:tasks.kclient:config is {'client.0': {}, 'client.1': {}} 2026-03-09T00:01:49.953 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:01:49.953 DEBUG:teuthology.orchestra.run.vm03:> ip netns list 2026-03-09T00:01:50.027 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:01:50.027 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link delete ceph-brx 2026-03-09T00:01:50.060 INFO:teuthology.orchestra.run.vm03.stderr:Cannot find device "ceph-brx" 2026-03-09T00:01:50.062 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T00:01:50.062 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:01:50.062 DEBUG:teuthology.orchestra.run.vm06:> ip netns list 2026-03-09T00:01:50.078 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:01:50.078 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link delete ceph-brx 2026-03-09T00:01:50.146 INFO:teuthology.orchestra.run.vm06.stderr:Cannot find device "ceph-brx" 2026-03-09T00:01:50.148 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-09T00:01:50.148 INFO:tasks.kclient:client.0 config is {'syntax': 'v1'} 2026-03-09T00:01:50.148 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-09T00:01:50.148 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-09T00:01:50.148 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs ls 2026-03-09T00:01:50.333 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:50.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:50 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/3579847978' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-09T00:01:50.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:50 vm06 ceph-mon[58395]: mds.? [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] up:reconnect 2026-03-09T00:01:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:50 vm06 ceph-mon[58395]: mds.? [v2:192.168.123.106:6824/1090058295,v1:192.168.123.106:6825/1090058295] up:boot 2026-03-09T00:01:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:50 vm06 ceph-mon[58395]: fsmap cephfs:1/1 {0=cephfs.vm03.sejksk=up:reconnect} 3 up:standby 2026-03-09T00:01:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:50 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vlrwtl"}]: dispatch 2026-03-09T00:01:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:50 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:50 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:50 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/722983608' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T00:01:50.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:50 vm03 ceph-mon[52346]: from='client.? 
192.168.123.103:0/3579847978' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-09T00:01:50.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:50 vm03 ceph-mon[52346]: mds.? [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] up:reconnect 2026-03-09T00:01:50.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:50 vm03 ceph-mon[52346]: mds.? [v2:192.168.123.106:6824/1090058295,v1:192.168.123.106:6825/1090058295] up:boot 2026-03-09T00:01:50.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:50 vm03 ceph-mon[52346]: fsmap cephfs:1/1 {0=cephfs.vm03.sejksk=up:reconnect} 3 up:standby 2026-03-09T00:01:50.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:50 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vlrwtl"}]: dispatch 2026-03-09T00:01:50.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:50 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:50.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:50 vm03 ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:50.429 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:50 vm03 ceph-mon[52346]: from='client.? 192.168.123.103:0/722983608' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T00:01:50.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.585+0000 7f331e749700 1 -- 192.168.123.103:0/2555158358 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f331810f420 msgr2=0x7f331810f800 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:50.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.585+0000 7f3316ffd700 1 -- 192.168.123.103:0/2555158358 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f330800ba40 con 0x7f331810f420 2026-03-09T00:01:50.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.585+0000 7f331e749700 1 --2- 192.168.123.103:0/2555158358 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f331810f420 0x7f331810f800 secure :-1 s=READY pgs=266 cs=0 l=1 rev1=1 crypto rx=0x7f3308009b50 tx=0x7f3308009e60 comp rx=0 tx=0).stop 2026-03-09T00:01:50.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.585+0000 7f331e749700 1 -- 192.168.123.103:0/2555158358 shutdown_connections 2026-03-09T00:01:50.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.585+0000 7f331e749700 1 --2- 192.168.123.103:0/2555158358 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3318107d90 0x7f3318108210 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:50.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.585+0000 7f331e749700 1 --2- 192.168.123.103:0/2555158358 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f331810f420 0x7f331810f800 secure :-1 s=CLOSED pgs=266 cs=0 l=1 rev1=1 crypto rx=0x7f3308009b50 tx=0x7f3308009e60 comp rx=0 tx=0).stop 2026-03-09T00:01:50.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.585+0000 7f331e749700 1 -- 192.168.123.103:0/2555158358 >> 192.168.123.103:0/2555158358 conn(0x7f331806ce20 msgr2=0x7f331806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:50.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.585+0000 
7f331e749700 1 -- 192.168.123.103:0/2555158358 shutdown_connections 2026-03-09T00:01:50.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.585+0000 7f331e749700 1 -- 192.168.123.103:0/2555158358 wait complete. 2026-03-09T00:01:50.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.585+0000 7f331e749700 1 Processor -- start 2026-03-09T00:01:50.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.585+0000 7f331e749700 1 -- start start 2026-03-09T00:01:50.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.586+0000 7f331e749700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3318107d90 0x7f33181130c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:50.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.586+0000 7f331e749700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33181180c0 0x7f3318113600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:50.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.586+0000 7f331e749700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3318113c70 con 0x7f33181180c0 2026-03-09T00:01:50.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.586+0000 7f331e749700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3318113de0 con 0x7f3318107d90 2026-03-09T00:01:50.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.587+0000 7f3317fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3318107d90 0x7f33181130c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:50.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.587+0000 7f33177fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33181180c0 0x7f3318113600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:50.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.587+0000 7f3317fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3318107d90 0x7f33181130c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:57924/0 (socket says 192.168.123.103:57924) 2026-03-09T00:01:50.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.587+0000 7f3317fff700 1 -- 192.168.123.103:0/46905831 learned_addr learned my addr 192.168.123.103:0/46905831 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:50.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.587+0000 7f3317fff700 1 -- 192.168.123.103:0/46905831 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33181180c0 msgr2=0x7f3318113600 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:50.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.587+0000 7f3317fff700 1 --2- 192.168.123.103:0/46905831 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33181180c0 0x7f3318113600 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:50.587 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.587+0000 7f3317fff700 1 -- 192.168.123.103:0/46905831 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f33080097e0 con 0x7f3318107d90 2026-03-09T00:01:50.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.588+0000 7f33177fe700 1 --2- 192.168.123.103:0/46905831 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33181180c0 0x7f3318113600 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:01:50.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.588+0000 7f3317fff700 1 --2- 192.168.123.103:0/46905831 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3318107d90 0x7f33181130c0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f3308000c00 tx=0x7f3308004970 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:50.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.588+0000 7f33157fa700 1 -- 192.168.123.103:0/46905831 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f330801d070 con 0x7f3318107d90 2026-03-09T00:01:50.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.588+0000 7f331e749700 1 -- 192.168.123.103:0/46905831 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f33181b8470 con 0x7f3318107d90 2026-03-09T00:01:50.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.588+0000 7f331e749700 1 -- 192.168.123.103:0/46905831 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f33181b8880 con 0x7f3318107d90 2026-03-09T00:01:50.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.589+0000 7f33157fa700 1 -- 192.168.123.103:0/46905831 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3308022470 con 0x7f3318107d90 2026-03-09T00:01:50.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.589+0000 7f33157fa700 1 -- 192.168.123.103:0/46905831 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f330800f670 con 0x7f3318107d90 2026-03-09T00:01:50.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.589+0000 7f33157fa700 1 -- 192.168.123.103:0/46905831 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f33080225e0 con 0x7f3318107d90 2026-03-09T00:01:50.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.590+0000 7f33157fa700 1 --2- 192.168.123.103:0/46905831 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f330006c330 0x7f330006e7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:50.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.590+0000 7f33157fa700 1 -- 192.168.123.103:0/46905831 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f330808ccc0 con 0x7f3318107d90 2026-03-09T00:01:50.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.590+0000 7f32feffd700 1 -- 192.168.123.103:0/46905831 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f32f80052f0 con 0x7f3318107d90 2026-03-09T00:01:50.590 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.590+0000 7f33177fe700 1 --2- 192.168.123.103:0/46905831 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f330006c330 0x7f330006e7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:50.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.593+0000 7f33157fa700 1 -- 192.168.123.103:0/46905831 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3308055e60 con 0x7f3318107d90 2026-03-09T00:01:50.597 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.594+0000 7f33177fe700 1 --2- 192.168.123.103:0/46905831 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f330006c330 0x7f330006e7f0 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f330c009dd0 tx=0x7f330c009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:50.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.731+0000 7f32feffd700 1 -- 192.168.123.103:0/46905831 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7f32f80061d0 con 0x7f3318107d90 2026-03-09T00:01:50.733 INFO:teuthology.orchestra.run.vm03.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ] 2026-03-09T00:01:50.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.733+0000 7f33157fa700 1 -- 192.168.123.103:0/46905831 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v10) v1 ==== 53+0+83 (secure 0 0 0) 0x7f3308027070 con 0x7f3318107d90 2026-03-09T00:01:50.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.736+0000 7f32feffd700 1 -- 192.168.123.103:0/46905831 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f330006c330 msgr2=0x7f330006e7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:50.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.736+0000 7f32feffd700 1 --2- 192.168.123.103:0/46905831 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f330006c330 0x7f330006e7f0 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f330c009dd0 tx=0x7f330c009450 comp rx=0 tx=0).stop 2026-03-09T00:01:50.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.736+0000 7f32feffd700 1 -- 192.168.123.103:0/46905831 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3318107d90 msgr2=0x7f33181130c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:50.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.736+0000 7f32feffd700 1 --2- 192.168.123.103:0/46905831 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3318107d90 0x7f33181130c0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f3308000c00 tx=0x7f3308004970 comp rx=0 tx=0).stop 2026-03-09T00:01:50.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.736+0000 7f32feffd700 1 -- 192.168.123.103:0/46905831 shutdown_connections 2026-03-09T00:01:50.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.736+0000 7f32feffd700 1 --2- 192.168.123.103:0/46905831 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f330006c330 0x7f330006e7f0 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
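Before each mount the kclient task confirms the target filesystem actually exists, shelling into the cluster with cephadm and running "ceph fs ls"; the "name: cephfs, metadata pool: cephfs.cephfs.meta, ..." line above is that check for client.0. The earlier exit-status-1 results from "sudo ip link delete ceph-brx" on both nodes are pre-cleanup of a bridge that does not exist yet, not failures. A scripted version of the same existence check, a sketch using the cephadm path and fsid from this job:

    sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 \
        shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- \
        ceph fs ls --format=json \
      | jq -e 'any(.[]; .name == "cephfs")'   # exit 0 iff the fs is present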
2026-03-09T00:01:50.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.736+0000 7f32feffd700 1 --2- 192.168.123.103:0/46905831 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3318107d90 0x7f33181130c0 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:50.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.736+0000 7f32feffd700 1 --2- 192.168.123.103:0/46905831 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f33181180c0 0x7f3318113600 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:50.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.736+0000 7f32feffd700 1 -- 192.168.123.103:0/46905831 >> 192.168.123.103:0/46905831 conn(0x7f331806ce20 msgr2=0x7f331810d2f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:50.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.737+0000 7f32feffd700 1 -- 192.168.123.103:0/46905831 shutdown_connections 2026-03-09T00:01:50.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50.737+0000 7f32feffd700 1 -- 192.168.123.103:0/46905831 wait complete. 2026-03-09T00:01:50.800 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster 2026-03-09T00:01:50.800 INFO:tasks.cephfs.mount:Mounting Ceph FS. Following are details of mount; remember "None" represents Python type None - 2026-03-09T00:01:50.800 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm03.local 2026-03-09T00:01:50.800 INFO:tasks.cephfs.mount:self.client.name = client.0 2026-03-09T00:01:50.800 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-09T00:01:50.800 INFO:tasks.cephfs.mount:self.cephfs_name = None 2026-03-09T00:01:50.800 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None 2026-03-09T00:01:50.800 INFO:tasks.cephfs.mount:self.client_keyring_path = None 2026-03-09T00:01:50.800 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.0' 2026-03-09T00:01:50.800 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:01:50.800 DEBUG:teuthology.orchestra.run.vm03:> ip addr 2026-03-09T00:01:50.819 INFO:teuthology.orchestra.run.vm03.stdout:1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 2026-03-09T00:01:50.819 INFO:teuthology.orchestra.run.vm03.stdout: link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 2026-03-09T00:01:50.819 INFO:teuthology.orchestra.run.vm03.stdout: inet 127.0.0.1/8 scope host lo 2026-03-09T00:01:50.819 INFO:teuthology.orchestra.run.vm03.stdout: valid_lft forever preferred_lft forever 2026-03-09T00:01:50.819 INFO:teuthology.orchestra.run.vm03.stdout: inet6 ::1/128 scope host 2026-03-09T00:01:50.819 INFO:teuthology.orchestra.run.vm03.stdout: valid_lft forever preferred_lft forever 2026-03-09T00:01:50.819 INFO:teuthology.orchestra.run.vm03.stdout:2: eth0: mtu 1500 qdisc fq_codel state UP group default qlen 1000 2026-03-09T00:01:50.819 INFO:teuthology.orchestra.run.vm03.stdout: link/ether 52:55:00:00:00:03 brd ff:ff:ff:ff:ff:ff 2026-03-09T00:01:50.819 INFO:teuthology.orchestra.run.vm03.stdout: altname enp0s3 2026-03-09T00:01:50.819 INFO:teuthology.orchestra.run.vm03.stdout: altname ens3 2026-03-09T00:01:50.819 INFO:teuthology.orchestra.run.vm03.stdout: inet 192.168.123.103/24 brd 192.168.123.255 scope global dynamic noprefixroute eth0 2026-03-09T00:01:50.819 INFO:teuthology.orchestra.run.vm03.stdout: valid_lft 3126sec preferred_lft 3126sec 
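The kernel mounts in this task run inside private network namespaces addressed out of 192.168.144.0/20, and the commands that follow give the host its side of that plumbing: a ceph-brx bridge holding the gateway address 192.168.159.254/20, IP forwarding enabled, and FORWARD plus MASQUERADE iptables rules so the namespaced clients can reach the monitors on 192.168.123.0/24. Some quick host-side checks of the result (illustrative only; the task itself does not run these):

    ip -br addr show ceph-brx            # expect 192.168.159.254/20
    ip route show dev ceph-brx           # expect 192.168.144.0/20, link scope
    cat /proc/sys/net/ipv4/ip_forward    # expect 1
    sudo iptables -t nat -S POSTROUTING | grep MASQUERADE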
2026-03-09T00:01:50.819 INFO:teuthology.orchestra.run.vm03.stdout: inet6 fe80::5055:ff:fe00:3/64 scope link noprefixroute 2026-03-09T00:01:50.819 INFO:teuthology.orchestra.run.vm03.stdout: valid_lft forever preferred_lft forever 2026-03-09T00:01:50.820 INFO:tasks.cephfs.mount:Setuping the 'ceph-brx' with 192.168.159.254/20 2026-03-09T00:01:50.820 DEBUG:teuthology.orchestra.run.vm03:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T00:01:50.820 DEBUG:teuthology.orchestra.run.vm03:> set -e 2026-03-09T00:01:50.820 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link add name ceph-brx type bridge 2026-03-09T00:01:50.820 DEBUG:teuthology.orchestra.run.vm03:> sudo ip addr flush dev ceph-brx 2026-03-09T00:01:50.820 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link set ceph-brx up 2026-03-09T00:01:50.820 DEBUG:teuthology.orchestra.run.vm03:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx 2026-03-09T00:01:50.820 DEBUG:teuthology.orchestra.run.vm03:> ') 2026-03-09T00:01:50.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T00:01:50.970 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:50 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-09T00:01:50.973 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:01:50.973 DEBUG:teuthology.orchestra.run.vm03:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward 2026-03-09T00:01:51.043 INFO:teuthology.orchestra.run.vm03.stdout:1 2026-03-09T00:01:51.044 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:01:51.044 DEBUG:teuthology.orchestra.run.vm03:> ip r 2026-03-09T00:01:51.098 INFO:teuthology.orchestra.run.vm03.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.103 metric 100 2026-03-09T00:01:51.098 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.103 metric 100 2026-03-09T00:01:51.098 INFO:teuthology.orchestra.run.vm03.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254 2026-03-09T00:01:51.098 DEBUG:teuthology.orchestra.run.vm03:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T00:01:51.098 DEBUG:teuthology.orchestra.run.vm03:> set -e 2026-03-09T00:01:51.098 DEBUG:teuthology.orchestra.run.vm03:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT 2026-03-09T00:01:51.098 DEBUG:teuthology.orchestra.run.vm03:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT 2026-03-09T00:01:51.098 DEBUG:teuthology.orchestra.run.vm03:> sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE 2026-03-09T00:01:51.098 DEBUG:teuthology.orchestra.run.vm03:> ') 2026-03-09T00:01:51.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:51 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T00:01:51.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:51 vm03.local ceph-mon[52346]: pgmap v82: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 339 B/s rd, 2.3 KiB/s wr, 7 op/s 2026-03-09T00:01:51.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:51 vm03.local ceph-mon[52346]: mds.? [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] up:rejoin 2026-03-09T00:01:51.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:51 vm03.local ceph-mon[52346]: mds.? 
[v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] up:standby 2026-03-09T00:01:51.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:51 vm03.local ceph-mon[52346]: fsmap cephfs:1/1 {0=cephfs.vm03.sejksk=up:rejoin} 3 up:standby 2026-03-09T00:01:51.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:51 vm03.local ceph-mon[52346]: daemon mds.cephfs.vm03.sejksk is now active in filesystem cephfs as rank 0 2026-03-09T00:01:51.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:51 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:51.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:51 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:51.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:51 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:51.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:51 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:01:51.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:51 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:51.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:51 vm03.local ceph-mon[52346]: from='client.? 192.168.123.103:0/46905831' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-09T00:01:51.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:51 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-09T00:01:51.239 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:01:51.239 DEBUG:teuthology.orchestra.run.vm03:> ip netns list 2026-03-09T00:01:51.294 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:01:51.294 DEBUG:teuthology.orchestra.run.vm03:> ip netns list-id 2026-03-09T00:01:51.350 DEBUG:teuthology.orchestra.run.vm03:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T00:01:51.350 DEBUG:teuthology.orchestra.run.vm03:> set -e 2026-03-09T00:01:51.350 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-09T00:01:51.350 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.0 0 2026-03-09T00:01:51.350 DEBUG:teuthology.orchestra.run.vm03:> ') 2026-03-09T00:01:51.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:51 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T00:01:51.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:51 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
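Each mountpoint gets its own namespace: here ceph-ns--home-ubuntu-cephtest-mnt.0 is created and given netns id 0, and the commands that follow wire it to ceph-brx through a veth pair (veth0 inside the namespace, brx.0 enslaved to the bridge) with a default route via 192.168.159.254. Keeping the mount behind its own veth lets the test manipulate that one client's connectivity without touching the host. A sketch for inspecting the namespace once the veth is up (assumed commands, not part of the task):

    sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip -br addr show veth0
    sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip route
    # expect: default via 192.168.159.254 dev veth0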
2026-03-09T00:01:51.454 INFO:tasks.cephfs.mount:Setuping the netns 'ceph-ns--home-ubuntu-cephtest-mnt.0' with 192.168.144.1/20 2026-03-09T00:01:51.455 DEBUG:teuthology.orchestra.run.vm03:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T00:01:51.455 DEBUG:teuthology.orchestra.run.vm03:> set -e 2026-03-09T00:01:51.455 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.0 type veth peer name brx.0 2026-03-09T00:01:51.455 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0 2026-03-09T00:01:51.455 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set veth0 up 2026-03-09T00:01:51.455 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set lo up 2026-03-09T00:01:51.455 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip route add default via 192.168.159.254 2026-03-09T00:01:51.455 DEBUG:teuthology.orchestra.run.vm03:> ') 2026-03-09T00:01:51.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:51 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T00:01:51.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:51 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-09T00:01:51.596 DEBUG:teuthology.orchestra.run.vm03:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-09T00:01:51.596 DEBUG:teuthology.orchestra.run.vm03:> set -e 2026-03-09T00:01:51.596 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link set brx.0 up 2026-03-09T00:01:51.596 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link set dev brx.0 master ceph-brx 2026-03-09T00:01:51.596 DEBUG:teuthology.orchestra.run.vm03:> ') 2026-03-09T00:01:51.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:51 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T00:01:51.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:51 vm06 ceph-mon[58395]: pgmap v82: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 339 B/s rd, 2.3 KiB/s wr, 7 op/s 2026-03-09T00:01:51.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:51 vm06 ceph-mon[58395]: mds.? [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] up:rejoin 2026-03-09T00:01:51.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:51 vm06 ceph-mon[58395]: mds.? 
[v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] up:standby 2026-03-09T00:01:51.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:51 vm06 ceph-mon[58395]: fsmap cephfs:1/1 {0=cephfs.vm03.sejksk=up:rejoin} 3 up:standby 2026-03-09T00:01:51.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:51 vm06 ceph-mon[58395]: daemon mds.cephfs.vm03.sejksk is now active in filesystem cephfs as rank 0 2026-03-09T00:01:51.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:51 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:51.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:51 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:51.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:51 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:01:51.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:51 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:01:51.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:51 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:51.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:51 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/46905831' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-09T00:01:51.692 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:51 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-09T00:01:51.696 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-09T00:01:51.696 DEBUG:teuthology.orchestra.run.vm03:> mkdir -p -v /home/ubuntu/cephtest/mnt.0 2026-03-09T00:01:51.750 INFO:teuthology.orchestra.run.vm03.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.0' 2026-03-09T00:01:51.750 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-09T00:01:51.750 DEBUG:teuthology.orchestra.run.vm03:> chmod 0000 /home/ubuntu/cephtest/mnt.0 2026-03-09T00:01:51.806 INFO:tasks.cephfs.kernel_mount:mounting using device: :/ 2026-03-09T00:01:51.806 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:01:51.806 DEBUG:teuthology.orchestra.run.vm03:> sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage /bin/mount -t ceph :/ /home/ubuntu/cephtest/mnt.0 -v -o norequire_active_mds,conf=/etc/ceph/ceph.conf,norbytes,name=0,mds_namespace=cephfs 2026-03-09T00:01:51.887 INFO:teuthology.orchestra.run.vm03.stdout:parsing options: rw,norequire_active_mds,conf=/etc/ceph/ceph.conf,norbytes,name=0,mds_namespace=cephfs 2026-03-09T00:01:51.887 INFO:teuthology.orchestra.run.vm03.stdout:mount.ceph: options "norequire_active_mds,norbytes,name=0,mds_namespace=cephfs". 2026-03-09T00:01:51.887 INFO:teuthology.orchestra.run.vm03.stdout:invalid new device string format 2026-03-09T00:01:51.944 INFO:teuthology.orchestra.run.vm03.stdout:parsing options: rw,norequire_active_mds,conf=/etc/ceph/ceph.conf,norbytes,name=0,mds_namespace=cephfs 2026-03-09T00:01:51.944 INFO:teuthology.orchestra.run.vm03.stdout:mount.ceph: options "norequire_active_mds,norbytes,name=0,mds_namespace=cephfs". 
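The "invalid new device string format" lines here are mount.ceph probing, not an error: the task passes the bare source ":/" (this job pins kclient syntax to v1), so the helper rejects it as the newer name@fsid.fsname=/path device form, resolves the monitor addresses, and, as the next lines show, retries with the legacy mon1,mon2:/path syntax on the v1 port 6789. Hand-written equivalents of the two forms it is choosing between (illustrative; in the run mount.ceph performs this resolution itself):

    # legacy (v1) device syntax, which this mount ends up using:
    sudo mount -t ceph 192.168.123.103:6789,192.168.123.106:6789:/ \
        /home/ubuntu/cephtest/mnt.0 -o name=0,mds_namespace=cephfs
    # newer device syntax the same helper builds when allowed to:
    sudo mount -t ceph 0@ae8f0172-1b4a-11f1-916a-712b2ac006b7.cephfs=/ \
        /home/ubuntu/cephtest/mnt.0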
2026-03-09T00:01:51.944 INFO:teuthology.orchestra.run.vm03.stdout:invalid new device string format 2026-03-09T00:01:51.944 INFO:teuthology.orchestra.run.vm03.stdout:mount.ceph: resolved to: "192.168.123.103:3300,192.168.123.106:3300" 2026-03-09T00:01:51.944 INFO:teuthology.orchestra.run.vm03.stdout:mount.ceph: switching to using v1 address with old syntax 2026-03-09T00:01:51.956 INFO:teuthology.orchestra.run.vm03.stdout:parsing options: rw,norequire_active_mds,conf=/etc/ceph/ceph.conf,norbytes,name=0,mds_namespace=cephfs 2026-03-09T00:01:51.956 INFO:teuthology.orchestra.run.vm03.stdout:mount.ceph: options "norequire_active_mds,norbytes,name=0,mds_namespace=cephfs". 2026-03-09T00:01:51.956 INFO:teuthology.orchestra.run.vm03.stdout:invalid new device string format 2026-03-09T00:01:51.956 INFO:teuthology.orchestra.run.vm03.stdout:mount.ceph: resolved to: "192.168.123.103:3300,192.168.123.106:3300" 2026-03-09T00:01:51.956 INFO:teuthology.orchestra.run.vm03.stdout:mount.ceph: switching to using v1 address with old syntax 2026-03-09T00:01:51.956 INFO:teuthology.orchestra.run.vm03.stdout:mount.ceph: resolved to: "192.168.123.103:6789,192.168.123.106:6789" 2026-03-09T00:01:51.956 INFO:teuthology.orchestra.run.vm03.stdout:mount.ceph: trying mount with old device syntax: 192.168.123.103:6789,192.168.123.106:6789:/ 2026-03-09T00:01:51.956 INFO:teuthology.orchestra.run.vm03.stdout:mount.ceph: options "norequire_active_mds,norbytes,name=0,mds_namespace=cephfs,key=0,fsid=ae8f0172-1b4a-11f1-916a-712b2ac006b7" will pass to kernel 2026-03-09T00:01:51.957 INFO:teuthology.orchestra.run.vm03.stdout:mount: /home/ubuntu/cephtest/mnt.0 does not contain SELinux labels. 2026-03-09T00:01:51.957 INFO:teuthology.orchestra.run.vm03.stdout: You just mounted a file system that supports labels which does not 2026-03-09T00:01:51.957 INFO:teuthology.orchestra.run.vm03.stdout: contain labels, onto an SELinux box. It is likely that confined 2026-03-09T00:01:51.957 INFO:teuthology.orchestra.run.vm03.stdout: applications will generate AVC messages and not be allowed access to 2026-03-09T00:01:51.957 INFO:teuthology.orchestra.run.vm03.stdout: this file system. For more details see restorecon(8) and mount(8). 2026-03-09T00:01:51.958 INFO:tasks.cephfs.kernel_mount:mount command passed 2026-03-09T00:01:51.958 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:01:51.958 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.0 2026-03-09T00:01:52.021 DEBUG:teuthology.orchestra.run.vm03:> (cd /home/ubuntu/cephtest/mnt.0 && exec stdin-killer --timeout=900 -- bash -c 'getfattr --only-values -n ceph.client_id .') 2026-03-09T00:01:52.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52 stdin-killer INFO: expiration expected; waiting 900 seconds for command to complete 2026-03-09T00:01:52.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-09T00:01:52.103 INFO:teuthology.orchestra.run.vm03.stdout:client24309 2026-03-09T00:01:52.103 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T00:01:52.103 DEBUG:teuthology.orchestra.run.vm03:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-09T00:01:52.159 DEBUG:teuthology.orchestra.run.vm03:> (cd /home/ubuntu/cephtest/mnt.0 && exec stdin-killer --timeout=900 -- bash -c 'getfattr --only-values -n ceph.client_id .') 2026-03-09T00:01:52.183 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:52 vm03.local ceph-mon[52346]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T00:01:52.183 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:52 vm03.local ceph-mon[52346]: mds.? [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] up:active 2026-03-09T00:01:52.183 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:52 vm03.local ceph-mon[52346]: mds.? [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] up:standby 2026-03-09T00:01:52.183 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:52 vm03.local ceph-mon[52346]: fsmap cephfs:1 {0=cephfs.vm03.sejksk=up:active} 3 up:standby 2026-03-09T00:01:52.183 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:52 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:52.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52 stdin-killer INFO: expiration expected; waiting 900 seconds for command to complete 2026-03-09T00:01:52.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-09T00:01:52.224 INFO:teuthology.orchestra.run.vm03.stdout:client24309 2026-03-09T00:01:52.225 DEBUG:teuthology.orchestra.run.vm03:> (cd /home/ubuntu/cephtest/mnt.0 && exec stdin-killer --timeout=900 -- bash -c 'getfattr --only-values -n ceph.client_id .') 2026-03-09T00:01:52.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52 stdin-killer INFO: expiration expected; waiting 900 seconds for command to complete 2026-03-09T00:01:52.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-09T00:01:52.304 INFO:teuthology.orchestra.run.vm03.stdout:client24309 2026-03-09T00:01:52.304 DEBUG:teuthology.orchestra.run.vm03:> (cd /home/ubuntu/cephtest/mnt.0 && exec stdin-killer --timeout=300 -- bash -c 'sudo dd if=/sys/kernel/debug/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7.client24309/status') 2026-03-09T00:01:52.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-09T00:01:52.388 INFO:teuthology.orchestra.run.vm03.stdout:instance: client.24309 (3)192.168.123.103:0/929198358 2026-03-09T00:01:52.388 INFO:teuthology.orchestra.run.vm03.stdout:blocklisted: false 2026-03-09T00:01:52.388 INFO:teuthology.orchestra.run.vm03.stderr:0+1 records in 2026-03-09T00:01:52.388 INFO:teuthology.orchestra.run.vm03.stderr:0+1 records out 2026-03-09T00:01:52.388 INFO:teuthology.orchestra.run.vm03.stderr:73 bytes copied, 0.000100328 s, 728 kB/s 2026-03-09T00:01:52.389 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
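With the mount up, the task learns which kernel client instance it owns by reading the virtual xattr ceph.client_id at the mount root (client24309 here), then reads that instance's debugfs status file to confirm it is healthy and, in particular, not blocklisted, before repeating the whole procedure for client.1 on vm06. The same two checks by hand (a sketch; paths taken from this run):

    cd /home/ubuntu/cephtest/mnt.0
    cid=$(getfattr --only-values -n ceph.client_id .)   # -> client24309
    sudo cat /sys/kernel/debug/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7.${cid}/status
    # expect a "blocklisted: false" line, as in the dd output above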
2026-03-09T00:01:52.393 INFO:tasks.kclient:client.1 config is {'syntax': 'v1'} 2026-03-09T00:01:52.393 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-09T00:01:52.393 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.1 2026-03-09T00:01:52.393 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs ls 2026-03-09T00:01:52.567 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:52.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:52 vm06 ceph-mon[58395]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-09T00:01:52.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:52 vm06 ceph-mon[58395]: mds.? [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] up:active 2026-03-09T00:01:52.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:52 vm06 ceph-mon[58395]: mds.? [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] up:standby 2026-03-09T00:01:52.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:52 vm06 ceph-mon[58395]: fsmap cephfs:1 {0=cephfs.vm03.sejksk=up:active} 3 up:standby 2026-03-09T00:01:52.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:52 vm06 ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:52.816 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.816+0000 7f04fa129700 1 -- 192.168.123.103:0/2770901674 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04f4069740 msgr2=0x7f04f4069bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:52.816 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.816+0000 7f04fa129700 1 --2- 192.168.123.103:0/2770901674 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04f4069740 0x7f04f4069bc0 secure :-1 s=READY pgs=267 cs=0 l=1 rev1=1 crypto rx=0x7f04e4009b50 tx=0x7f04e4009e60 comp rx=0 tx=0).stop 2026-03-09T00:01:52.816 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.817+0000 7f04fa129700 1 -- 192.168.123.103:0/2770901674 shutdown_connections 2026-03-09T00:01:52.816 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.817+0000 7f04fa129700 1 --2- 192.168.123.103:0/2770901674 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04f4069740 0x7f04f4069bc0 unknown :-1 s=CLOSED pgs=267 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:52.816 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.817+0000 7f04fa129700 1 --2- 192.168.123.103:0/2770901674 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f04f4105f30 0x7f04f4069200 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:52.816 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.817+0000 7f04fa129700 1 -- 192.168.123.103:0/2770901674 >> 192.168.123.103:0/2770901674 conn(0x7f04f4076b30 msgr2=0x7f04f4076f40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:52.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.817+0000 7f04fa129700 1 -- 192.168.123.103:0/2770901674 shutdown_connections 2026-03-09T00:01:52.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.818+0000 7f04fa129700 1 -- 192.168.123.103:0/2770901674 wait complete. 
2026-03-09T00:01:52.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.818+0000 7f04fa129700 1 Processor -- start 2026-03-09T00:01:52.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.818+0000 7f04fa129700 1 -- start start 2026-03-09T00:01:52.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.818+0000 7f04fa129700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04f4069740 0x7f04f419d2b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:52.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.818+0000 7f04fa129700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f04f4105f30 0x7f04f419d7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:52.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.818+0000 7f04fa129700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f04f419ded0 con 0x7f04f4069740 2026-03-09T00:01:52.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.818+0000 7f04fa129700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f04f41a1c60 con 0x7f04f4105f30 2026-03-09T00:01:52.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.818+0000 7f04f37fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04f4069740 0x7f04f419d2b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:52.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.819+0000 7f04f37fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04f4069740 0x7f04f419d2b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41810/0 (socket says 192.168.123.103:41810) 2026-03-09T00:01:52.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.819+0000 7f04f37fe700 1 -- 192.168.123.103:0/1379363720 learned_addr learned my addr 192.168.123.103:0/1379363720 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:52.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.819+0000 7f04f37fe700 1 -- 192.168.123.103:0/1379363720 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f04f4105f30 msgr2=0x7f04f419d7f0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T00:01:52.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.819+0000 7f04f37fe700 1 --2- 192.168.123.103:0/1379363720 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f04f4105f30 0x7f04f419d7f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:52.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.819+0000 7f04f37fe700 1 -- 192.168.123.103:0/1379363720 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f04e40097e0 con 0x7f04f4069740 2026-03-09T00:01:52.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.819+0000 7f04f37fe700 1 --2- 192.168.123.103:0/1379363720 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04f4069740 0x7f04f419d2b0 secure :-1 s=READY pgs=268 cs=0 l=1 rev1=1 crypto rx=0x7f04dc00eb10 tx=0x7f04dc00eed0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:52.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.819+0000 7f04f0ff9700 1 -- 192.168.123.103:0/1379363720 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f04dc00cca0 con 0x7f04f4069740 2026-03-09T00:01:52.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.819+0000 7f04f0ff9700 1 -- 192.168.123.103:0/1379363720 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f04dc00ce00 con 0x7f04f4069740 2026-03-09T00:01:52.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.819+0000 7f04f0ff9700 1 -- 192.168.123.103:0/1379363720 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f04dc0189c0 con 0x7f04f4069740 2026-03-09T00:01:52.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.819+0000 7f04fa129700 1 -- 192.168.123.103:0/1379363720 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f04f41a1f40 con 0x7f04f4069740 2026-03-09T00:01:52.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.819+0000 7f04fa129700 1 -- 192.168.123.103:0/1379363720 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f04f41a2490 con 0x7f04f4069740 2026-03-09T00:01:52.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.821+0000 7f04f0ff9700 1 -- 192.168.123.103:0/1379363720 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f04dc018b20 con 0x7f04f4069740 2026-03-09T00:01:52.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.821+0000 7f04f0ff9700 1 --2- 192.168.123.103:0/1379363720 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f04e006c600 0x7f04e006eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:52.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.821+0000 7f04f0ff9700 1 -- 192.168.123.103:0/1379363720 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f04dc014070 con 0x7f04f4069740 2026-03-09T00:01:52.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.822+0000 7f04fa129700 1 -- 192.168.123.103:0/1379363720 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f04f404ea90 con 0x7f04f4069740 2026-03-09T00:01:52.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.822+0000 7f04f2ffd700 1 --2- 192.168.123.103:0/1379363720 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f04e006c600 0x7f04e006eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:52.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.823+0000 7f04f2ffd700 1 --2- 192.168.123.103:0/1379363720 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f04e006c600 0x7f04e006eac0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f04e4005b40 tx=0x7f04e4005ab0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:52.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.825+0000 7f04f0ff9700 1 -- 192.168.123.103:0/1379363720 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f04dc057ac0 con 0x7f04f4069740 2026-03-09T00:01:52.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.956+0000 7f04fa129700 1 -- 192.168.123.103:0/1379363720 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7f04f419e610 con 0x7f04f4069740 2026-03-09T00:01:52.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.956+0000 7f04f0ff9700 1 -- 192.168.123.103:0/1379363720 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v11) v1 ==== 53+0+83 (secure 0 0 0) 0x7f04dc05b0e0 con 0x7f04f4069740 2026-03-09T00:01:52.956 INFO:teuthology.orchestra.run.vm03.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ] 2026-03-09T00:01:52.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.959+0000 7f04fa129700 1 -- 192.168.123.103:0/1379363720 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f04e006c600 msgr2=0x7f04e006eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:52.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.959+0000 7f04fa129700 1 --2- 192.168.123.103:0/1379363720 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f04e006c600 0x7f04e006eac0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f04e4005b40 tx=0x7f04e4005ab0 comp rx=0 tx=0).stop 2026-03-09T00:01:52.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.959+0000 7f04fa129700 1 -- 192.168.123.103:0/1379363720 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04f4069740 msgr2=0x7f04f419d2b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:52.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.959+0000 7f04fa129700 1 --2- 192.168.123.103:0/1379363720 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04f4069740 0x7f04f419d2b0 secure :-1 s=READY pgs=268 cs=0 l=1 rev1=1 crypto rx=0x7f04dc00eb10 tx=0x7f04dc00eed0 comp rx=0 tx=0).stop 2026-03-09T00:01:52.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.959+0000 7f04fa129700 1 -- 192.168.123.103:0/1379363720 shutdown_connections 2026-03-09T00:01:52.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.959+0000 7f04fa129700 1 --2- 192.168.123.103:0/1379363720 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f04e006c600 0x7f04e006eac0 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:52.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.959+0000 7f04fa129700 1 --2- 192.168.123.103:0/1379363720 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f04f4069740 0x7f04f419d2b0 unknown :-1 s=CLOSED pgs=268 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:52.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.959+0000 7f04fa129700 1 --2- 192.168.123.103:0/1379363720 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f04f4105f30 0x7f04f419d7f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:52.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.959+0000 7f04fa129700 1 -- 192.168.123.103:0/1379363720 >> 192.168.123.103:0/1379363720 conn(0x7f04f4076b30 msgr2=0x7f04f4101890 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:52.960 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.961+0000 7f04fa129700 1 -- 192.168.123.103:0/1379363720 shutdown_connections
2026-03-09T00:01:52.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:52.961+0000 7f04fa129700 1 -- 192.168.123.103:0/1379363720 wait complete.
2026-03-09T00:01:53.018 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster
2026-03-09T00:01:53.018 INFO:tasks.cephfs.mount:Mounting Ceph FS. Following are details of mount; remember "None" represents Python type None -
2026-03-09T00:01:53.018 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm06.local
2026-03-09T00:01:53.018 INFO:tasks.cephfs.mount:self.client.name = client.1
2026-03-09T00:01:53.018 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.1
2026-03-09T00:01:53.018 INFO:tasks.cephfs.mount:self.cephfs_name = None
2026-03-09T00:01:53.018 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None
2026-03-09T00:01:53.018 INFO:tasks.cephfs.mount:self.client_keyring_path = None
2026-03-09T00:01:53.018 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.1'
2026-03-09T00:01:53.018 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T00:01:53.018 DEBUG:teuthology.orchestra.run.vm06:> ip addr
2026-03-09T00:01:53.036 INFO:teuthology.orchestra.run.vm06.stdout:1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
2026-03-09T00:01:53.036 INFO:teuthology.orchestra.run.vm06.stdout: link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
2026-03-09T00:01:53.036 INFO:teuthology.orchestra.run.vm06.stdout: inet 127.0.0.1/8 scope host lo
2026-03-09T00:01:53.036 INFO:teuthology.orchestra.run.vm06.stdout: valid_lft forever preferred_lft forever
2026-03-09T00:01:53.036 INFO:teuthology.orchestra.run.vm06.stdout: inet6 ::1/128 scope host
2026-03-09T00:01:53.036 INFO:teuthology.orchestra.run.vm06.stdout: valid_lft forever preferred_lft forever
2026-03-09T00:01:53.036 INFO:teuthology.orchestra.run.vm06.stdout:2: eth0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc fq_codel state UP group default qlen 1000
2026-03-09T00:01:53.036 INFO:teuthology.orchestra.run.vm06.stdout: link/ether 52:55:00:00:00:06 brd ff:ff:ff:ff:ff:ff
2026-03-09T00:01:53.036 INFO:teuthology.orchestra.run.vm06.stdout: altname enp0s3
2026-03-09T00:01:53.036 INFO:teuthology.orchestra.run.vm06.stdout: altname ens3
2026-03-09T00:01:53.036 INFO:teuthology.orchestra.run.vm06.stdout: inet 192.168.123.106/24 brd 192.168.123.255 scope global dynamic noprefixroute eth0
2026-03-09T00:01:53.036 INFO:teuthology.orchestra.run.vm06.stdout: valid_lft 3155sec preferred_lft 3155sec
2026-03-09T00:01:53.036 INFO:teuthology.orchestra.run.vm06.stdout: inet6 fe80::5055:ff:fe00:6/64 scope link noprefixroute
2026-03-09T00:01:53.036 INFO:teuthology.orchestra.run.vm06.stdout: valid_lft forever preferred_lft forever
2026-03-09T00:01:53.036 INFO:tasks.cephfs.mount:Setting up the 'ceph-brx' with 192.168.159.254/20
2026-03-09T00:01:53.036 DEBUG:teuthology.orchestra.run.vm06:> (cd / && exec stdin-killer --timeout=300 -- bash -c '
2026-03-09T00:01:53.036 DEBUG:teuthology.orchestra.run.vm06:> set -e
2026-03-09T00:01:53.036 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link add name ceph-brx type bridge
2026-03-09T00:01:53.036 DEBUG:teuthology.orchestra.run.vm06:> sudo ip addr flush dev ceph-brx
2026-03-09T00:01:53.036 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link set ceph-brx up
2026-03-09T00:01:53.036 DEBUG:teuthology.orchestra.run.vm06:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx
2026-03-09T00:01:53.036 DEBUG:teuthology.orchestra.run.vm06:> ')
2026-03-09T00:01:53.112 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:53 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete
2026-03-09T00:01:53.190 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:53 stdin-killer INFO: command exited with status 0: exiting normally with same code!
2026-03-09T00:01:53.196 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T00:01:53.197 DEBUG:teuthology.orchestra.run.vm06:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward
2026-03-09T00:01:53.263 INFO:teuthology.orchestra.run.vm06.stdout:1
2026-03-09T00:01:53.264 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T00:01:53.264 DEBUG:teuthology.orchestra.run.vm06:> ip r
2026-03-09T00:01:53.317 INFO:teuthology.orchestra.run.vm06.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.106 metric 100
2026-03-09T00:01:53.317 INFO:teuthology.orchestra.run.vm06.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.106 metric 100
2026-03-09T00:01:53.317 INFO:teuthology.orchestra.run.vm06.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254
2026-03-09T00:01:53.318 DEBUG:teuthology.orchestra.run.vm06:> (cd / && exec stdin-killer --timeout=300 -- bash -c '
2026-03-09T00:01:53.318 DEBUG:teuthology.orchestra.run.vm06:> set -e
2026-03-09T00:01:53.318 DEBUG:teuthology.orchestra.run.vm06:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT
2026-03-09T00:01:53.318 DEBUG:teuthology.orchestra.run.vm06:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT
2026-03-09T00:01:53.318 DEBUG:teuthology.orchestra.run.vm06:> sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE
2026-03-09T00:01:53.318 DEBUG:teuthology.orchestra.run.vm06:> ')
2026-03-09T00:01:53.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:53 vm03.local ceph-mon[52346]: pgmap v83: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 639 B/s rd, 1.7 KiB/s wr, 6 op/s
2026-03-09T00:01:53.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:53 vm03.local ceph-mon[52346]: from='client.? 192.168.123.103:0/1379363720' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
2026-03-09T00:01:53.390 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:53 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete
2026-03-09T00:01:53.397 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:53 vm06 ceph-mon[58395]: pgmap v83: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 639 B/s rd, 1.7 KiB/s wr, 6 op/s
2026-03-09T00:01:53.397 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:53 vm06 ceph-mon[58395]: from='client.? 192.168.123.103:0/1379363720' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
2026-03-09T00:01:53.444 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:53 stdin-killer INFO: command exited with status 0: exiting normally with same code!
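For reference, the bridge and NAT setup that tasks.cephfs.mount drives through stdin-killer above can be collapsed into a single standalone script. This is a sketch built only from the commands logged in this run, assuming eth0 is the host's uplink interface:

#!/bin/sh -e
# Create the ceph-brx bridge and give it the top host address of the /20.
sudo ip link add name ceph-brx type bridge
sudo ip addr flush dev ceph-brx
sudo ip link set ceph-brx up
sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx
# Route between the bridge and the uplink, masquerading bridge-originated traffic.
echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward
sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT
sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT
sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE

The "ip r" output above confirms the result: assigning the address makes the kernel add a 192.168.144.0/20 route onto ceph-brx.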
2026-03-09T00:01:53.448 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T00:01:53.449 DEBUG:teuthology.orchestra.run.vm06:> ip netns list
2026-03-09T00:01:53.503 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T00:01:53.503 DEBUG:teuthology.orchestra.run.vm06:> ip netns list-id
2026-03-09T00:01:53.557 DEBUG:teuthology.orchestra.run.vm06:> (cd / && exec stdin-killer --timeout=300 -- bash -c '
2026-03-09T00:01:53.557 DEBUG:teuthology.orchestra.run.vm06:> set -e
2026-03-09T00:01:53.557 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.1
2026-03-09T00:01:53.557 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.1 0
2026-03-09T00:01:53.557 DEBUG:teuthology.orchestra.run.vm06:> ')
2026-03-09T00:01:53.629 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:53 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete
2026-03-09T00:01:53.650 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:53 stdin-killer INFO: command exited with status 0: exiting normally with same code!
2026-03-09T00:01:53.654 INFO:tasks.cephfs.mount:Setting up the netns 'ceph-ns--home-ubuntu-cephtest-mnt.1' with 192.168.144.1/20
2026-03-09T00:01:53.654 DEBUG:teuthology.orchestra.run.vm06:> (cd / && exec stdin-killer --timeout=300 -- bash -c '
2026-03-09T00:01:53.654 DEBUG:teuthology.orchestra.run.vm06:> set -e
2026-03-09T00:01:53.654 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.1 type veth peer name brx.0
2026-03-09T00:01:53.654 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0
2026-03-09T00:01:53.654 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set veth0 up
2026-03-09T00:01:53.654 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set lo up
2026-03-09T00:01:53.654 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip route add default via 192.168.159.254
2026-03-09T00:01:53.654 DEBUG:teuthology.orchestra.run.vm06:> ')
2026-03-09T00:01:53.726 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:53 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete
2026-03-09T00:01:53.779 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:53 stdin-killer INFO: command exited with status 0: exiting normally with same code!
2026-03-09T00:01:53.782 DEBUG:teuthology.orchestra.run.vm06:> (cd / && exec stdin-killer --timeout=300 -- bash -c '
2026-03-09T00:01:53.782 DEBUG:teuthology.orchestra.run.vm06:> set -e
2026-03-09T00:01:53.782 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link set brx.0 up
2026-03-09T00:01:53.782 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link set dev brx.0 master ceph-brx
2026-03-09T00:01:53.782 DEBUG:teuthology.orchestra.run.vm06:> ')
2026-03-09T00:01:53.855 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:53 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete
2026-03-09T00:01:53.882 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:53 stdin-killer INFO: command exited with status 0: exiting normally with same code!
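Each kernel mount then gets its own network namespace wired onto that bridge through a veth pair, presumably so the suite can later disrupt the client's network without touching the host. A standalone sketch of the sequence just logged, with the namespace name and addresses copied from this run:

#!/bin/sh -e
NS=ceph-ns--home-ubuntu-cephtest-mnt.1
sudo ip netns add $NS
sudo ip netns set $NS 0
# veth0 lives inside the namespace; its peer brx.0 stays in the root namespace.
sudo ip link add veth0 netns $NS type veth peer name brx.0
sudo ip netns exec $NS ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0
sudo ip netns exec $NS ip link set veth0 up
sudo ip netns exec $NS ip link set lo up
sudo ip netns exec $NS ip route add default via 192.168.159.254
# Attach the root-side peer as a port of ceph-brx.
sudo ip link set brx.0 up
sudo ip link set dev brx.0 master ceph-brx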
2026-03-09T00:01:53.884 INFO:teuthology.orchestra.run:Running command with timeout 60
2026-03-09T00:01:53.884 DEBUG:teuthology.orchestra.run.vm06:> mkdir -p -v /home/ubuntu/cephtest/mnt.1
2026-03-09T00:01:53.938 INFO:teuthology.orchestra.run.vm06.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.1'
2026-03-09T00:01:53.938 INFO:teuthology.orchestra.run:Running command with timeout 60
2026-03-09T00:01:53.938 DEBUG:teuthology.orchestra.run.vm06:> chmod 0000 /home/ubuntu/cephtest/mnt.1
2026-03-09T00:01:53.991 INFO:tasks.cephfs.kernel_mount:mounting using device: :/
2026-03-09T00:01:53.991 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T00:01:53.991 DEBUG:teuthology.orchestra.run.vm06:> sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage /bin/mount -t ceph :/ /home/ubuntu/cephtest/mnt.1 -v -o norequire_active_mds,conf=/etc/ceph/ceph.conf,norbytes,name=1,mds_namespace=cephfs
2026-03-09T00:01:54.067 INFO:teuthology.orchestra.run.vm06.stdout:parsing options: rw,norequire_active_mds,conf=/etc/ceph/ceph.conf,norbytes,name=1,mds_namespace=cephfs
2026-03-09T00:01:54.067 INFO:teuthology.orchestra.run.vm06.stdout:mount.ceph: options "norequire_active_mds,norbytes,name=1,mds_namespace=cephfs".
2026-03-09T00:01:54.067 INFO:teuthology.orchestra.run.vm06.stdout:invalid new device string format
2026-03-09T00:01:54.125 INFO:teuthology.orchestra.run.vm06.stdout:parsing options: rw,norequire_active_mds,conf=/etc/ceph/ceph.conf,norbytes,name=1,mds_namespace=cephfs
2026-03-09T00:01:54.125 INFO:teuthology.orchestra.run.vm06.stdout:mount.ceph: options "norequire_active_mds,norbytes,name=1,mds_namespace=cephfs".
2026-03-09T00:01:54.125 INFO:teuthology.orchestra.run.vm06.stdout:invalid new device string format
2026-03-09T00:01:54.125 INFO:teuthology.orchestra.run.vm06.stdout:mount.ceph: resolved to: "192.168.123.103:3300,192.168.123.106:3300"
2026-03-09T00:01:54.125 INFO:teuthology.orchestra.run.vm06.stdout:mount.ceph: switching to using v1 address with old syntax
2026-03-09T00:01:54.135 INFO:teuthology.orchestra.run.vm06.stdout:parsing options: rw,norequire_active_mds,conf=/etc/ceph/ceph.conf,norbytes,name=1,mds_namespace=cephfs
2026-03-09T00:01:54.135 INFO:teuthology.orchestra.run.vm06.stdout:mount.ceph: options "norequire_active_mds,norbytes,name=1,mds_namespace=cephfs".
2026-03-09T00:01:54.135 INFO:teuthology.orchestra.run.vm06.stdout:invalid new device string format
2026-03-09T00:01:54.135 INFO:teuthology.orchestra.run.vm06.stdout:mount.ceph: resolved to: "192.168.123.103:3300,192.168.123.106:3300"
2026-03-09T00:01:54.135 INFO:teuthology.orchestra.run.vm06.stdout:mount.ceph: switching to using v1 address with old syntax
2026-03-09T00:01:54.135 INFO:teuthology.orchestra.run.vm06.stdout:mount.ceph: resolved to: "192.168.123.103:6789,192.168.123.106:6789"
2026-03-09T00:01:54.135 INFO:teuthology.orchestra.run.vm06.stdout:mount.ceph: trying mount with old device syntax: 192.168.123.103:6789,192.168.123.106:6789:/
2026-03-09T00:01:54.135 INFO:teuthology.orchestra.run.vm06.stdout:mount.ceph: options "norequire_active_mds,norbytes,name=1,mds_namespace=cephfs,key=1,fsid=ae8f0172-1b4a-11f1-916a-712b2ac006b7" will pass to kernel
2026-03-09T00:01:54.135 INFO:teuthology.orchestra.run.vm06.stdout:mount: /home/ubuntu/cephtest/mnt.1 does not contain SELinux labels.
2026-03-09T00:01:54.135 INFO:teuthology.orchestra.run.vm06.stdout: You just mounted a file system that supports labels which does not
2026-03-09T00:01:54.135 INFO:teuthology.orchestra.run.vm06.stdout: contain labels, onto an SELinux box. It is likely that confined
2026-03-09T00:01:54.135 INFO:teuthology.orchestra.run.vm06.stdout: applications will generate AVC messages and not be allowed access to
2026-03-09T00:01:54.136 INFO:teuthology.orchestra.run.vm06.stdout: this file system. For more details see restorecon(8) and mount(8).
2026-03-09T00:01:54.136 INFO:tasks.cephfs.kernel_mount:mount command passed
2026-03-09T00:01:54.136 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T00:01:54.136 DEBUG:teuthology.orchestra.run.vm06:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.1
2026-03-09T00:01:54.199 DEBUG:teuthology.orchestra.run.vm06:> (cd /home/ubuntu/cephtest/mnt.1 && exec stdin-killer --timeout=900 -- bash -c 'getfattr --only-values -n ceph.client_id .')
2026-03-09T00:01:54.272 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:54 stdin-killer INFO: expiration expected; waiting 900 seconds for command to complete
2026-03-09T00:01:54.273 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:54 stdin-killer INFO: command exited with status 0: exiting normally with same code!
2026-03-09T00:01:54.277 INFO:teuthology.orchestra.run.vm06.stdout:client24313
2026-03-09T00:01:54.277 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-09T00:01:54.277 DEBUG:teuthology.orchestra.run.vm06:> dd if=/proc/self/mounts of=/dev/stdout
2026-03-09T00:01:54.333 DEBUG:teuthology.orchestra.run.vm06:> (cd /home/ubuntu/cephtest/mnt.1 && exec stdin-killer --timeout=900 -- bash -c 'getfattr --only-values -n ceph.client_id .')
2026-03-09T00:01:54.409 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:54 stdin-killer INFO: expiration expected; waiting 900 seconds for command to complete
2026-03-09T00:01:54.410 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:54 stdin-killer INFO: command exited with status 0: exiting normally with same code!
2026-03-09T00:01:54.415 INFO:teuthology.orchestra.run.vm06.stdout:client24313
2026-03-09T00:01:54.415 DEBUG:teuthology.orchestra.run.vm06:> (cd /home/ubuntu/cephtest/mnt.1 && exec stdin-killer --timeout=900 -- bash -c 'getfattr --only-values -n ceph.client_id .')
2026-03-09T00:01:54.490 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:54 stdin-killer INFO: expiration expected; waiting 900 seconds for command to complete
2026-03-09T00:01:54.491 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:54 stdin-killer INFO: command exited with status 0: exiting normally with same code!
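The mount itself runs inside that namespace via nsenter. mount.ceph first tries the new device-string syntax, falls back to v1 monitor addresses with the old ip:port:/ device syntax, and fills key and fsid from the local config, which is exactly the retry sequence printed above. Stripped of the teuthology wrappers (adjust-ulimits, ceph-coverage), the operation is a sketch like:

# Mount the kernel CephFS client inside the per-mount namespace.
sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 \
  /bin/mount -t ceph :/ /home/ubuntu/cephtest/mnt.1 -v \
  -o norequire_active_mds,conf=/etc/ceph/ceph.conf,norbytes,name=1,mds_namespace=cephfs
# Ask the mounted fs which client instance it got (the same check the task repeats).
cd /home/ubuntu/cephtest/mnt.1 && getfattr --only-values -n ceph.client_id .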
2026-03-09T00:01:54.495 INFO:teuthology.orchestra.run.vm06.stdout:client24313
2026-03-09T00:01:54.495 DEBUG:teuthology.orchestra.run.vm06:> (cd /home/ubuntu/cephtest/mnt.1 && exec stdin-killer --timeout=300 -- bash -c 'sudo dd if=/sys/kernel/debug/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7.client24313/status')
2026-03-09T00:01:54.568 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:54 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete
2026-03-09T00:01:54.578 INFO:teuthology.orchestra.run.vm06.stdout:instance: client.24313 (3)192.168.144.1:0/748981855
2026-03-09T00:01:54.578 INFO:teuthology.orchestra.run.vm06.stdout:blocklisted: false
2026-03-09T00:01:54.578 INFO:teuthology.orchestra.run.vm06.stderr:0+1 records in
2026-03-09T00:01:54.578 INFO:teuthology.orchestra.run.vm06.stderr:0+1 records out
2026-03-09T00:01:54.578 INFO:teuthology.orchestra.run.vm06.stderr:71 bytes copied, 0.000101401 s, 700 kB/s
2026-03-09T00:01:54.581 INFO:teuthology.orchestra.run.vm06.stderr:2026-03-09T00:01:54 stdin-killer INFO: command exited with status 0: exiting normally with same code!
2026-03-09T00:01:54.583 INFO:teuthology.run_tasks:Running task print...
2026-03-09T00:01:54.585 INFO:teuthology.task.print:**** done client
2026-03-09T00:01:54.585 INFO:teuthology.run_tasks:Running task parallel...
2026-03-09T00:01:54.588 INFO:teuthology.task.parallel:starting parallel...
2026-03-09T00:01:54.588 INFO:teuthology.task.parallel:In parallel, running task sequential...
2026-03-09T00:01:54.589 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell...
2026-03-09T00:01:54.589 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local
2026-03-09T00:01:54.589 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force'
2026-03-09T00:01:54.589 INFO:teuthology.task.parallel:In parallel, running task sequential...
2026-03-09T00:01:54.589 INFO:teuthology.task.sequential:In sequential, running task workunit...
2026-03-09T00:01:54.591 INFO:tasks.workunit:Pulling workunits from ref 569c3e99c9b32a51b4eaf08731c728f4513ed589
2026-03-09T00:01:54.591 INFO:tasks.workunit:Making a separate scratch dir for every client...
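cephadm.shell executes each command in a containerized shell pinned to this cluster's image and fsid, so the CLI version matches the v18.2.1 daemons regardless of what is installed on the host. The equivalent manual invocation, a sketch using the image and fsid from this run, is:

sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 \
  shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
  --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 \
  -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force'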
2026-03-09T00:01:54.591 INFO:tasks.workunit:timeout=3h
2026-03-09T00:01:54.591 INFO:tasks.workunit:cleanup=True
2026-03-09T00:01:54.591 DEBUG:teuthology.orchestra.run.vm03:> stat -- /home/ubuntu/cephtest/mnt.0
2026-03-09T00:01:54.610 INFO:teuthology.orchestra.run.vm03.stdout: File: /home/ubuntu/cephtest/mnt.0
2026-03-09T00:01:54.610 INFO:teuthology.orchestra.run.vm03.stdout: Size: 0 Blocks: 0 IO Block: 65536 directory
2026-03-09T00:01:54.610 INFO:teuthology.orchestra.run.vm03.stdout:Device: 5dh/93d Inode: 1 Links: 2
2026-03-09T00:01:54.610 INFO:teuthology.orchestra.run.vm03.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root)
2026-03-09T00:01:54.610 INFO:teuthology.orchestra.run.vm03.stdout:Context: system_u:object_r:unlabeled_t:s0
2026-03-09T00:01:54.610 INFO:teuthology.orchestra.run.vm03.stdout:Access: 1970-01-01 00:00:00.000000000 +0000
2026-03-09T00:01:54.610 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-09 00:01:45.090850600 +0000
2026-03-09T00:01:54.610 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-09 00:01:52.021115768 +0000
2026-03-09T00:01:54.610 INFO:teuthology.orchestra.run.vm03.stdout: Birth: 2026-03-09 00:01:45.090850600 +0000
2026-03-09T00:01:54.610 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.0
2026-03-09T00:01:54.610 DEBUG:teuthology.orchestra.run.vm03:> cd -- /home/ubuntu/cephtest/mnt.0 && sudo install -d -m 0755 --owner=ubuntu -- client.0
2026-03-09T00:01:54.674 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:54 vm03.local ceph-mon[52346]: pgmap v84: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.1 KiB/s rd, 1.8 KiB/s wr, 8 op/s
2026-03-09T00:01:54.681 DEBUG:teuthology.orchestra.run.vm06:> stat -- /home/ubuntu/cephtest/mnt.1
2026-03-09T00:01:54.697 INFO:teuthology.orchestra.run.vm06.stdout: File: /home/ubuntu/cephtest/mnt.1
2026-03-09T00:01:54.697 INFO:teuthology.orchestra.run.vm06.stdout: Size: 1 Blocks: 0 IO Block: 65536 directory
2026-03-09T00:01:54.697 INFO:teuthology.orchestra.run.vm06.stdout:Device: 5ah/90d Inode: 1 Links: 3
2026-03-09T00:01:54.697 INFO:teuthology.orchestra.run.vm06.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root)
2026-03-09T00:01:54.698 INFO:teuthology.orchestra.run.vm06.stdout:Context: system_u:object_r:unlabeled_t:s0
2026-03-09T00:01:54.698 INFO:teuthology.orchestra.run.vm06.stdout:Access: 1970-01-01 00:00:00.000000000 +0000
2026-03-09T00:01:54.698 INFO:teuthology.orchestra.run.vm06.stdout:Modify: 2026-03-09 00:01:54.676117977 +0000
2026-03-09T00:01:54.698 INFO:teuthology.orchestra.run.vm06.stdout:Change: 2026-03-09 00:01:54.676117977 +0000
2026-03-09T00:01:54.698 INFO:teuthology.orchestra.run.vm06.stdout: Birth: 2026-03-09 00:01:45.090850600 +0000
2026-03-09T00:01:54.698 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.1
2026-03-09T00:01:54.698 DEBUG:teuthology.orchestra.run.vm06:> cd -- /home/ubuntu/cephtest/mnt.1 && sudo install -d -m 0755 --owner=ubuntu -- client.1
2026-03-09T00:01:54.745 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:01:54.760 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:54 vm06.local ceph-mon[58395]: pgmap v84: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.1 KiB/s rd, 1.8 KiB/s wr, 8 op/s
2026-03-09T00:01:54.767 DEBUG:teuthology.orchestra.run.vm03:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git
/home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-09T00:01:54.767 DEBUG:teuthology.orchestra.run.vm06:> rm -rf /home/ubuntu/cephtest/clone.client.1 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.1 && cd /home/ubuntu/cephtest/clone.client.1 && git checkout 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-09T00:01:54.792 INFO:tasks.workunit.client.0.vm03.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'... 2026-03-09T00:01:54.824 INFO:tasks.workunit.client.1.vm06.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.1'... 2026-03-09T00:01:54.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.986+0000 7f5d21e10700 1 -- 192.168.123.103:0/2834919130 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c100ef0 msgr2=0x7f5d1c101330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:54.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.986+0000 7f5d21e10700 1 --2- 192.168.123.103:0/2834919130 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c100ef0 0x7f5d1c101330 secure :-1 s=READY pgs=269 cs=0 l=1 rev1=1 crypto rx=0x7f5d04009b00 tx=0x7f5d04009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:54.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.987+0000 7f5d21e10700 1 -- 192.168.123.103:0/2834919130 shutdown_connections 2026-03-09T00:01:54.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.987+0000 7f5d21e10700 1 --2- 192.168.123.103:0/2834919130 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d1c1020d0 0x7f5d1c102550 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:54.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.987+0000 7f5d21e10700 1 --2- 192.168.123.103:0/2834919130 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c100ef0 0x7f5d1c101330 unknown :-1 s=CLOSED pgs=269 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:54.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.987+0000 7f5d21e10700 1 -- 192.168.123.103:0/2834919130 >> 192.168.123.103:0/2834919130 conn(0x7f5d1c0fc4c0 msgr2=0x7f5d1c0fe920 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:54.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.987+0000 7f5d21e10700 1 -- 192.168.123.103:0/2834919130 shutdown_connections 2026-03-09T00:01:54.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.987+0000 7f5d21e10700 1 -- 192.168.123.103:0/2834919130 wait complete. 
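The workunit task stages the qa tree by cloning the suite repo at the pinned suite sha1 onto every client. Per client this reduces to the sketch below, with the repo URL and commit taken from this run:

rm -rf /home/ubuntu/cephtest/clone.client.0
git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0
cd /home/ubuntu/cephtest/clone.client.0
git checkout 569c3e99c9b32a51b4eaf08731c728f4513ed589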
2026-03-09T00:01:54.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.988+0000 7f5d21e10700 1 Processor -- start 2026-03-09T00:01:54.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.988+0000 7f5d21e10700 1 -- start start 2026-03-09T00:01:54.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.988+0000 7f5d21e10700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c100ef0 0x7f5d1c194610 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:54.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.988+0000 7f5d21e10700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d1c1020d0 0x7f5d1c194b50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:54.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.988+0000 7f5d21e10700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d1c1950e0 con 0x7f5d1c100ef0 2026-03-09T00:01:54.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.988+0000 7f5d21e10700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d1c195220 con 0x7f5d1c1020d0 2026-03-09T00:01:54.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.988+0000 7f5d1b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c100ef0 0x7f5d1c194610 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:54.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.988+0000 7f5d1b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c100ef0 0x7f5d1c194610 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55626/0 (socket says 192.168.123.103:55626) 2026-03-09T00:01:54.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.988+0000 7f5d1b7fe700 1 -- 192.168.123.103:0/961298630 learned_addr learned my addr 192.168.123.103:0/961298630 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:54.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.989+0000 7f5d1affd700 1 --2- 192.168.123.103:0/961298630 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d1c1020d0 0x7f5d1c194b50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:54.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.989+0000 7f5d1b7fe700 1 -- 192.168.123.103:0/961298630 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d1c1020d0 msgr2=0x7f5d1c194b50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:54.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.989+0000 7f5d1b7fe700 1 --2- 192.168.123.103:0/961298630 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d1c1020d0 0x7f5d1c194b50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:54.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.989+0000 7f5d1b7fe700 1 -- 192.168.123.103:0/961298630 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5d040097e0 con 
0x7f5d1c100ef0 2026-03-09T00:01:54.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.989+0000 7f5d1b7fe700 1 --2- 192.168.123.103:0/961298630 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c100ef0 0x7f5d1c194610 secure :-1 s=READY pgs=270 cs=0 l=1 rev1=1 crypto rx=0x7f5d04009fd0 tx=0x7f5d04004ab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:54.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.989+0000 7f5d18ff9700 1 -- 192.168.123.103:0/961298630 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d0401d070 con 0x7f5d1c100ef0 2026-03-09T00:01:54.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.989+0000 7f5d18ff9700 1 -- 192.168.123.103:0/961298630 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5d0400bc50 con 0x7f5d1c100ef0 2026-03-09T00:01:54.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.989+0000 7f5d18ff9700 1 -- 192.168.123.103:0/961298630 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d0400f7e0 con 0x7f5d1c100ef0 2026-03-09T00:01:54.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.989+0000 7f5d21e10700 1 -- 192.168.123.103:0/961298630 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5d1c199c80 con 0x7f5d1c100ef0 2026-03-09T00:01:54.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.989+0000 7f5d21e10700 1 -- 192.168.123.103:0/961298630 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5d1c19a140 con 0x7f5d1c100ef0 2026-03-09T00:01:54.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.991+0000 7f5d21e10700 1 -- 192.168.123.103:0/961298630 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5d1c066e80 con 0x7f5d1c100ef0 2026-03-09T00:01:54.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.991+0000 7f5d18ff9700 1 -- 192.168.123.103:0/961298630 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f5d0400f940 con 0x7f5d1c100ef0 2026-03-09T00:01:54.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.991+0000 7f5d18ff9700 1 --2- 192.168.123.103:0/961298630 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5d080708f0 0x7f5d08072db0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:54.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.991+0000 7f5d18ff9700 1 -- 192.168.123.103:0/961298630 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f5d0408cf50 con 0x7f5d1c100ef0 2026-03-09T00:01:54.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.992+0000 7f5d1affd700 1 --2- 192.168.123.103:0/961298630 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5d080708f0 0x7f5d08072db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:54.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.992+0000 7f5d1affd700 1 --2- 192.168.123.103:0/961298630 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5d080708f0 0x7f5d08072db0 secure :-1 s=READY pgs=107 
cs=0 l=1 rev1=1 crypto rx=0x7f5d0c00bab0 tx=0x7f5d0c005d50 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:54.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:54.993+0000 7f5d18ff9700 1 -- 192.168.123.103:0/961298630 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f5d0405af00 con 0x7f5d1c100ef0 2026-03-09T00:01:55.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.094+0000 7f5d21e10700 1 -- 192.168.123.103:0/961298630 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}] v 0) v1 -- 0x7f5d1c19a480 con 0x7f5d1c100ef0 2026-03-09T00:01:55.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.094+0000 7f5d18ff9700 1 -- 192.168.123.103:0/961298630 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}]=0 v15) v1 ==== 155+0+0 (secure 0 0 0) 0x7f5d0400bdc0 con 0x7f5d1c100ef0 2026-03-09T00:01:55.095 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.096+0000 7f5d21e10700 1 -- 192.168.123.103:0/961298630 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5d080708f0 msgr2=0x7f5d08072db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:55.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.096+0000 7f5d21e10700 1 --2- 192.168.123.103:0/961298630 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5d080708f0 0x7f5d08072db0 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f5d0c00bab0 tx=0x7f5d0c005d50 comp rx=0 tx=0).stop 2026-03-09T00:01:55.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.096+0000 7f5d21e10700 1 -- 192.168.123.103:0/961298630 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c100ef0 msgr2=0x7f5d1c194610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:55.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.096+0000 7f5d21e10700 1 --2- 192.168.123.103:0/961298630 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c100ef0 0x7f5d1c194610 secure :-1 s=READY pgs=270 cs=0 l=1 rev1=1 crypto rx=0x7f5d04009fd0 tx=0x7f5d04004ab0 comp rx=0 tx=0).stop 2026-03-09T00:01:55.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.096+0000 7f5d21e10700 1 -- 192.168.123.103:0/961298630 shutdown_connections 2026-03-09T00:01:55.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.096+0000 7f5d21e10700 1 --2- 192.168.123.103:0/961298630 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5d080708f0 0x7f5d08072db0 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:55.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.096+0000 7f5d21e10700 1 --2- 192.168.123.103:0/961298630 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d1c100ef0 0x7f5d1c194610 unknown :-1 s=CLOSED pgs=270 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:55.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.096+0000 7f5d21e10700 1 --2- 192.168.123.103:0/961298630 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d1c1020d0 0x7f5d1c194b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:55.096
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.096+0000 7f5d21e10700 1 -- 192.168.123.103:0/961298630 >> 192.168.123.103:0/961298630 conn(0x7f5d1c0fc4c0 msgr2=0x7f5d1c105390 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:55.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.097+0000 7f5d21e10700 1 -- 192.168.123.103:0/961298630 shutdown_connections 2026-03-09T00:01:55.096 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.097+0000 7f5d21e10700 1 -- 192.168.123.103:0/961298630 wait complete. 2026-03-09T00:01:55.154 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force' 2026-03-09T00:01:55.334 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:55.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.563+0000 7f1c28152700 1 -- 192.168.123.103:0/2804865423 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c20101990 msgr2=0x7f1c20103d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:55.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.563+0000 7f1c28152700 1 --2- 192.168.123.103:0/2804865423 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c20101990 0x7f1c20103d80 secure :-1 s=READY pgs=271 cs=0 l=1 rev1=1 crypto rx=0x7f1c10009b00 tx=0x7f1c10009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:55.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.564+0000 7f1c28152700 1 -- 192.168.123.103:0/2804865423 shutdown_connections 2026-03-09T00:01:55.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.564+0000 7f1c28152700 1 --2- 192.168.123.103:0/2804865423 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c201042c0 0x7f1c201066b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:55.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.564+0000 7f1c28152700 1 --2- 192.168.123.103:0/2804865423 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c20101990 0x7f1c20103d80 unknown :-1 s=CLOSED pgs=271 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:55.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.564+0000 7f1c28152700 1 -- 192.168.123.103:0/2804865423 >> 192.168.123.103:0/2804865423 conn(0x7f1c200fb360 msgr2=0x7f1c200fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:55.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.565+0000 7f1c28152700 1 -- 192.168.123.103:0/2804865423 shutdown_connections 2026-03-09T00:01:55.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.565+0000 7f1c28152700 1 -- 192.168.123.103:0/2804865423 wait complete. 
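Each config set above shows up in the messenger trace as a mon_command followed by a mon_command_ack with return code 0. To verify the values independently of the trace, a read-back from the same cephadm shell is a sketch like (ceph config get being the standard inverse of config set):

ceph config get mon mon_warn_on_insecure_global_id_reclaim
ceph config get mon mon_warn_on_insecure_global_id_reclaim_allowed
# Both are expected to print: false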
2026-03-09T00:01:55.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.565+0000 7f1c28152700 1 Processor -- start 2026-03-09T00:01:55.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.565+0000 7f1c28152700 1 -- start start 2026-03-09T00:01:55.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.565+0000 7f1c28152700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c20101990 0x7f1c201968c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:55.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.565+0000 7f1c28152700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c201042c0 0x7f1c20196e00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:55.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.565+0000 7f1c28152700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1c20197420 con 0x7f1c201042c0 2026-03-09T00:01:55.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.565+0000 7f1c28152700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1c20197560 con 0x7f1c20101990 2026-03-09T00:01:55.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.566+0000 7f1c256ed700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c201042c0 0x7f1c20196e00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:55.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.566+0000 7f1c256ed700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c201042c0 0x7f1c20196e00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55638/0 (socket says 192.168.123.103:55638) 2026-03-09T00:01:55.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.566+0000 7f1c256ed700 1 -- 192.168.123.103:0/747248327 learned_addr learned my addr 192.168.123.103:0/747248327 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:55.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.566+0000 7f1c25eee700 1 --2- 192.168.123.103:0/747248327 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c20101990 0x7f1c201968c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:55.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.566+0000 7f1c256ed700 1 -- 192.168.123.103:0/747248327 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c20101990 msgr2=0x7f1c201968c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:55.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.566+0000 7f1c256ed700 1 --2- 192.168.123.103:0/747248327 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c20101990 0x7f1c201968c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:55.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.566+0000 7f1c256ed700 1 -- 192.168.123.103:0/747248327 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1c100097e0 con 
0x7f1c201042c0 2026-03-09T00:01:55.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.566+0000 7f1c25eee700 1 --2- 192.168.123.103:0/747248327 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c20101990 0x7f1c201968c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T00:01:55.566 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.566+0000 7f1c256ed700 1 --2- 192.168.123.103:0/747248327 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c201042c0 0x7f1c20196e00 secure :-1 s=READY pgs=272 cs=0 l=1 rev1=1 crypto rx=0x7f1c1c00b700 tx=0x7f1c1c00bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:55.566 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.567+0000 7f1c16ffd700 1 -- 192.168.123.103:0/747248327 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1c1c010840 con 0x7f1c201042c0 2026-03-09T00:01:55.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.567+0000 7f1c16ffd700 1 -- 192.168.123.103:0/747248327 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1c1c010e80 con 0x7f1c201042c0 2026-03-09T00:01:55.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.567+0000 7f1c28152700 1 -- 192.168.123.103:0/747248327 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1c200ff7e0 con 0x7f1c201042c0 2026-03-09T00:01:55.567 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.567+0000 7f1c28152700 1 -- 192.168.123.103:0/747248327 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1c200ffd30 con 0x7f1c201042c0 2026-03-09T00:01:55.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.567+0000 7f1c28152700 1 -- 192.168.123.103:0/747248327 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1c20190a60 con 0x7f1c201042c0 2026-03-09T00:01:55.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.568+0000 7f1c16ffd700 1 -- 192.168.123.103:0/747248327 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1c1c010b50 con 0x7f1c201042c0 2026-03-09T00:01:55.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.569+0000 7f1c16ffd700 1 -- 192.168.123.103:0/747248327 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f1c1c00f3e0 con 0x7f1c201042c0 2026-03-09T00:01:55.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.569+0000 7f1c16ffd700 1 --2- 192.168.123.103:0/747248327 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1c0c06c4e0 0x7f1c0c06e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:55.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.569+0000 7f1c16ffd700 1 -- 192.168.123.103:0/747248327 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f1c1c08bbe0 con 0x7f1c201042c0 2026-03-09T00:01:55.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.571+0000 7f1c25eee700 1 --2- 192.168.123.103:0/747248327 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1c0c06c4e0 0x7f1c0c06e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 
crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:55.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.571+0000 7f1c25eee700 1 --2- 192.168.123.103:0/747248327 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1c0c06c4e0 0x7f1c0c06e9a0 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f1c10006010 tx=0x7f1c1000b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:55.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.571+0000 7f1c16ffd700 1 -- 192.168.123.103:0/747248327 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f1c1c059e10 con 0x7f1c201042c0 2026-03-09T00:01:55.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.671+0000 7f1c28152700 1 -- 192.168.123.103:0/747248327 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}] v 0) v1 -- 0x7f1c200ff970 con 0x7f1c201042c0 2026-03-09T00:01:55.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.672+0000 7f1c16ffd700 1 -- 192.168.123.103:0/747248327 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}]=0 v15) v1 ==== 163+0+0 (secure 0 0 0) 0x7f1c200ff970 con 0x7f1c201042c0 2026-03-09T00:01:55.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.674+0000 7f1c28152700 1 -- 192.168.123.103:0/747248327 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1c0c06c4e0 msgr2=0x7f1c0c06e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:55.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.674+0000 7f1c28152700 1 --2- 192.168.123.103:0/747248327 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1c0c06c4e0 0x7f1c0c06e9a0 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f1c10006010 tx=0x7f1c1000b540 comp rx=0 tx=0).stop 2026-03-09T00:01:55.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.674+0000 7f1c28152700 1 -- 192.168.123.103:0/747248327 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c201042c0 msgr2=0x7f1c20196e00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:55.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.674+0000 7f1c28152700 1 --2- 192.168.123.103:0/747248327 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c201042c0 0x7f1c20196e00 secure :-1 s=READY pgs=272 cs=0 l=1 rev1=1 crypto rx=0x7f1c1c00b700 tx=0x7f1c1c00bac0 comp rx=0 tx=0).stop 2026-03-09T00:01:55.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.674+0000 7f1c28152700 1 -- 192.168.123.103:0/747248327 shutdown_connections 2026-03-09T00:01:55.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.674+0000 7f1c28152700 1 --2- 192.168.123.103:0/747248327 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f1c0c06c4e0 0x7f1c0c06e9a0 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:55.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.674+0000 7f1c28152700 1 --2- 192.168.123.103:0/747248327 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c20101990 0x7f1c201968c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp
rx=0 tx=0).stop 2026-03-09T00:01:55.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.674+0000 7f1c28152700 1 --2- 192.168.123.103:0/747248327 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c201042c0 0x7f1c20196e00 unknown :-1 s=CLOSED pgs=272 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:55.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.674+0000 7f1c28152700 1 -- 192.168.123.103:0/747248327 >> 192.168.123.103:0/747248327 conn(0x7f1c200fb360 msgr2=0x7f1c200fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:55.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.674+0000 7f1c28152700 1 -- 192.168.123.103:0/747248327 shutdown_connections 2026-03-09T00:01:55.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:55.674+0000 7f1c28152700 1 -- 192.168.123.103:0/747248327 wait complete. 2026-03-09T00:01:55.735 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set global log_to_journald false --force' 2026-03-09T00:01:55.876 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:56.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.108+0000 7fa3eb9fc700 1 -- 192.168.123.103:0/3640608748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e41042c0 msgr2=0x7fa3e41066b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:56.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.108+0000 7fa3eb9fc700 1 --2- 192.168.123.103:0/3640608748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e41042c0 0x7fa3e41066b0 secure :-1 s=READY pgs=273 cs=0 l=1 rev1=1 crypto rx=0x7fa3e0009b50 tx=0x7fa3e0009e60 comp rx=0 tx=0).stop 2026-03-09T00:01:56.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.109+0000 7fa3eb9fc700 1 -- 192.168.123.103:0/3640608748 shutdown_connections 2026-03-09T00:01:56.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.109+0000 7fa3eb9fc700 1 --2- 192.168.123.103:0/3640608748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e41042c0 0x7fa3e41066b0 unknown :-1 s=CLOSED pgs=273 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:56.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.109+0000 7fa3eb9fc700 1 --2- 192.168.123.103:0/3640608748 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa3e4101990 0x7fa3e4103d80 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:56.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.109+0000 7fa3eb9fc700 1 -- 192.168.123.103:0/3640608748 >> 192.168.123.103:0/3640608748 conn(0x7fa3e40fb380 msgr2=0x7fa3e40fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:56.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.109+0000 7fa3eb9fc700 1 -- 192.168.123.103:0/3640608748 shutdown_connections 2026-03-09T00:01:56.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.109+0000 7fa3eb9fc700 1 -- 192.168.123.103:0/3640608748 wait complete. 
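Setting log_to_journald to false stops the daemons from logging via journald, which quiets the journalctl@ceph.* streams interleaved in this log. If file logging is wanted instead, the usual companions are set the same way; the option names below are standard Ceph logging options, not commands taken from this run:

ceph config set global log_to_file true
ceph config set global mon_cluster_log_to_file true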
2026-03-09T00:01:56.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.110+0000 7fa3eb9fc700 1 Processor -- start 2026-03-09T00:01:56.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.110+0000 7fa3eb9fc700 1 -- start start 2026-03-09T00:01:56.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.110+0000 7fa3eb9fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa3e4101990 0x7fa3e41945b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:56.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.110+0000 7fa3eb9fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e41042c0 0x7fa3e4194af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:56.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.110+0000 7fa3eb9fc700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa3e4195110 con 0x7fa3e41042c0 2026-03-09T00:01:56.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.110+0000 7fa3eb9fc700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa3e4195250 con 0x7fa3e4101990 2026-03-09T00:01:56.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.110+0000 7fa3e8f97700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e41042c0 0x7fa3e4194af0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:56.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.110+0000 7fa3e8f97700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e41042c0 0x7fa3e4194af0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55654/0 (socket says 192.168.123.103:55654) 2026-03-09T00:01:56.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.110+0000 7fa3e8f97700 1 -- 192.168.123.103:0/2813410887 learned_addr learned my addr 192.168.123.103:0/2813410887 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:56.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.111+0000 7fa3e9798700 1 --2- 192.168.123.103:0/2813410887 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa3e4101990 0x7fa3e41945b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:56.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.111+0000 7fa3e8f97700 1 -- 192.168.123.103:0/2813410887 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa3e4101990 msgr2=0x7fa3e41945b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:56.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.111+0000 7fa3e8f97700 1 --2- 192.168.123.103:0/2813410887 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa3e4101990 0x7fa3e41945b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:56.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.111+0000 7fa3e8f97700 1 -- 192.168.123.103:0/2813410887 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fa3e00097e0 con 0x7fa3e41042c0 2026-03-09T00:01:56.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.111+0000 7fa3e8f97700 1 --2- 192.168.123.103:0/2813410887 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e41042c0 0x7fa3e4194af0 secure :-1 s=READY pgs=274 cs=0 l=1 rev1=1 crypto rx=0x7fa3e000b5c0 tx=0x7fa3e0005250 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:56.111 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.111+0000 7fa3da7fc700 1 -- 192.168.123.103:0/2813410887 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa3e001d070 con 0x7fa3e41042c0 2026-03-09T00:01:56.111 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.111+0000 7fa3da7fc700 1 -- 192.168.123.103:0/2813410887 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa3e000bc30 con 0x7fa3e41042c0 2026-03-09T00:01:56.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.111+0000 7fa3da7fc700 1 -- 192.168.123.103:0/2813410887 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa3e000f8b0 con 0x7fa3e41042c0 2026-03-09T00:01:56.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.111+0000 7fa3eb9fc700 1 -- 192.168.123.103:0/2813410887 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa3e4199ca0 con 0x7fa3e41042c0 2026-03-09T00:01:56.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.111+0000 7fa3eb9fc700 1 -- 192.168.123.103:0/2813410887 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa3e419a190 con 0x7fa3e41042c0 2026-03-09T00:01:56.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.113+0000 7fa3eb9fc700 1 -- 192.168.123.103:0/2813410887 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa3e418e7d0 con 0x7fa3e41042c0 2026-03-09T00:01:56.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.114+0000 7fa3da7fc700 1 -- 192.168.123.103:0/2813410887 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fa3e0022b30 con 0x7fa3e41042c0 2026-03-09T00:01:56.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.114+0000 7fa3da7fc700 1 --2- 192.168.123.103:0/2813410887 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa3d0070a10 0x7fa3d0072ed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:56.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.114+0000 7fa3da7fc700 1 -- 192.168.123.103:0/2813410887 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fa3e008cf90 con 0x7fa3e41042c0 2026-03-09T00:01:56.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.114+0000 7fa3e9798700 1 --2- 192.168.123.103:0/2813410887 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa3d0070a10 0x7fa3d0072ed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:56.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.115+0000 7fa3e9798700 1 --2- 192.168.123.103:0/2813410887 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa3d0070a10 0x7fa3d0072ed0 
secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fa3d4005950 tx=0x7fa3d40058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:56.115 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.116+0000 7fa3da7fc700 1 -- 192.168.123.103:0/2813410887 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fa3e005b2c0 con 0x7fa3e41042c0 2026-03-09T00:01:56.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.216+0000 7fa3eb9fc700 1 -- 192.168.123.103:0/2813410887 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=log_to_journald}] v 0) v1 -- 0x7fa3e4066e80 con 0x7fa3e41042c0 2026-03-09T00:01:56.215 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.216+0000 7fa3da7fc700 1 -- 192.168.123.103:0/2813410887 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=log_to_journald}]=0 v15) v1 ==== 135+0+0 (secure 0 0 0) 0x7fa3e005ae50 con 0x7fa3e41042c0 2026-03-09T00:01:56.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.218+0000 7fa3eb9fc700 1 -- 192.168.123.103:0/2813410887 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa3d0070a10 msgr2=0x7fa3d0072ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:56.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.218+0000 7fa3eb9fc700 1 --2- 192.168.123.103:0/2813410887 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa3d0070a10 0x7fa3d0072ed0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fa3d4005950 tx=0x7fa3d40058e0 comp rx=0 tx=0).stop 2026-03-09T00:01:56.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.219+0000 7fa3eb9fc700 1 -- 192.168.123.103:0/2813410887 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e41042c0 msgr2=0x7fa3e4194af0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:56.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.219+0000 7fa3eb9fc700 1 --2- 192.168.123.103:0/2813410887 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e41042c0 0x7fa3e4194af0 secure :-1 s=READY pgs=274 cs=0 l=1 rev1=1 crypto rx=0x7fa3e000b5c0 tx=0x7fa3e0005250 comp rx=0 tx=0).stop 2026-03-09T00:01:56.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.219+0000 7fa3eb9fc700 1 -- 192.168.123.103:0/2813410887 shutdown_connections 2026-03-09T00:01:56.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.219+0000 7fa3eb9fc700 1 --2- 192.168.123.103:0/2813410887 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fa3d0070a10 0x7fa3d0072ed0 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:56.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.219+0000 7fa3eb9fc700 1 --2- 192.168.123.103:0/2813410887 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa3e4101990 0x7fa3e41945b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:56.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.219+0000 7fa3eb9fc700 1 --2- 192.168.123.103:0/2813410887 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa3e41042c0 0x7fa3e4194af0 unknown :-1 s=CLOSED pgs=274 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:56.218 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.219+0000 7fa3eb9fc700 1 -- 192.168.123.103:0/2813410887 >> 192.168.123.103:0/2813410887 conn(0x7fa3e40fb380 msgr2=0x7fa3e40fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:56.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.219+0000 7fa3eb9fc700 1 -- 192.168.123.103:0/2813410887 shutdown_connections 2026-03-09T00:01:56.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.220+0000 7fa3eb9fc700 1 -- 192.168.123.103:0/2813410887 wait complete. 2026-03-09T00:01:56.277 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1 --daemon-types mgr' 2026-03-09T00:01:56.413 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:01:56.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.641+0000 7f4c13c39700 1 -- 192.168.123.103:0/3093077833 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c0c103150 msgr2=0x7f4c0c103570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:01:56.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.641+0000 7f4c13c39700 1 --2- 192.168.123.103:0/3093077833 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c0c103150 0x7f4c0c103570 secure :-1 s=READY pgs=275 cs=0 l=1 rev1=1 crypto rx=0x7f4c08009b00 tx=0x7f4c08009e10 comp rx=0 tx=0).stop 2026-03-09T00:01:56.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.642+0000 7f4c13c39700 1 -- 192.168.123.103:0/3093077833 shutdown_connections 2026-03-09T00:01:56.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.642+0000 7f4c13c39700 1 --2- 192.168.123.103:0/3093077833 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c104350 0x7f4c0c1047b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:56.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.642+0000 7f4c13c39700 1 --2- 192.168.123.103:0/3093077833 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c0c103150 0x7f4c0c103570 unknown :-1 s=CLOSED pgs=275 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:56.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.642+0000 7f4c13c39700 1 -- 192.168.123.103:0/3093077833 >> 192.168.123.103:0/3093077833 conn(0x7f4c0c0fe6d0 msgr2=0x7f4c0c100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:01:56.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.642+0000 7f4c13c39700 1 -- 192.168.123.103:0/3093077833 shutdown_connections 2026-03-09T00:01:56.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.642+0000 7f4c13c39700 1 -- 192.168.123.103:0/3093077833 wait complete. 
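The upgrade command issued at 00:01:56.277 above starts a staggered upgrade: --daemon-types mgr restricts the first pass to the mgr daemons, so the cephadm module (which runs inside the active mgr) is itself on the target code before any other daemon is touched. A sketch of the staggered sequence this suite exercises, using the target image from this run; the later passes are illustrative, not commands visible in this log:

    IMAGE=quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df
    # Pass 1 (what the harness runs here): mgr daemons only.
    ceph orch upgrade start --image "$IMAGE" --daemon-types mgr
    # Later passes would widen the scope once the mgrs are done, e.g.:
    #   ceph orch upgrade start --image "$IMAGE" --daemon-types mon,osd
    #   ceph orch upgrade start --image "$IMAGE"   # everything remaining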
2026-03-09T00:01:56.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.643+0000 7f4c13c39700 1 Processor -- start 2026-03-09T00:01:56.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.643+0000 7f4c13c39700 1 -- start start 2026-03-09T00:01:56.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.643+0000 7f4c13c39700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c103150 0x7f4c0c198a00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:56.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.643+0000 7f4c13c39700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c0c104350 0x7f4c0c198f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:56.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.643+0000 7f4c13c39700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4c0c199560 con 0x7f4c0c104350 2026-03-09T00:01:56.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.643+0000 7f4c13c39700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4c0c1996a0 con 0x7f4c0c103150 2026-03-09T00:01:56.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.643+0000 7f4c119d5700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c103150 0x7f4c0c198a00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:56.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.643+0000 7f4c119d5700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c103150 0x7f4c0c198a00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:40698/0 (socket says 192.168.123.103:40698) 2026-03-09T00:01:56.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.643+0000 7f4c119d5700 1 -- 192.168.123.103:0/2720049248 learned_addr learned my addr 192.168.123.103:0/2720049248 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:01:56.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.643+0000 7f4c119d5700 1 -- 192.168.123.103:0/2720049248 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c0c104350 msgr2=0x7f4c0c198f40 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:01:56.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.644+0000 7f4c111d4700 1 --2- 192.168.123.103:0/2720049248 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c0c104350 0x7f4c0c198f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:56.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.644+0000 7f4c119d5700 1 --2- 192.168.123.103:0/2720049248 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c0c104350 0x7f4c0c198f40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:01:56.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.644+0000 7f4c119d5700 1 -- 192.168.123.103:0/2720049248 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4c080097e0 con 
0x7f4c0c103150 2026-03-09T00:01:56.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.644+0000 7f4c119d5700 1 --2- 192.168.123.103:0/2720049248 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c103150 0x7f4c0c198a00 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f4c08006010 tx=0x7f4c080049e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:56.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.644+0000 7f4c111d4700 1 --2- 192.168.123.103:0/2720049248 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c0c104350 0x7f4c0c198f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:01:56.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.644+0000 7f4c02ffd700 1 -- 192.168.123.103:0/2720049248 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4c0801d070 con 0x7f4c0c103150 2026-03-09T00:01:56.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.644+0000 7f4c13c39700 1 -- 192.168.123.103:0/2720049248 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4c0c19e0f0 con 0x7f4c0c103150 2026-03-09T00:01:56.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.644+0000 7f4c13c39700 1 -- 192.168.123.103:0/2720049248 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4c0c19e5e0 con 0x7f4c0c103150 2026-03-09T00:01:56.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.645+0000 7f4c02ffd700 1 -- 192.168.123.103:0/2720049248 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4c0800bc50 con 0x7f4c0c103150 2026-03-09T00:01:56.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.645+0000 7f4c02ffd700 1 -- 192.168.123.103:0/2720049248 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4c0800f810 con 0x7f4c0c103150 2026-03-09T00:01:56.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.645+0000 7f4c02ffd700 1 -- 192.168.123.103:0/2720049248 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f4c0800fa30 con 0x7f4c0c103150 2026-03-09T00:01:56.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.645+0000 7f4c13c39700 1 -- 192.168.123.103:0/2720049248 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4c0c066e80 con 0x7f4c0c103150 2026-03-09T00:01:56.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.646+0000 7f4c02ffd700 1 --2- 192.168.123.103:0/2720049248 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4bf806c290 0x7f4bf806e750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:01:56.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.646+0000 7f4c111d4700 1 --2- 192.168.123.103:0/2720049248 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4bf806c290 0x7f4bf806e750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:01:56.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.646+0000 7f4c111d4700 1 --2- 192.168.123.103:0/2720049248 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4bf806c290 0x7f4bf806e750 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f4bfc005fd0 tx=0x7f4bfc005ee0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:01:56.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.647+0000 7f4c02ffd700 1 -- 192.168.123.103:0/2720049248 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f4c0808cfe0 con 0x7f4c0c103150 2026-03-09T00:01:56.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.648+0000 7f4c02ffd700 1 -- 192.168.123.103:0/2720049248 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f4c0805c4e0 con 0x7f4c0c103150 2026-03-09T00:01:56.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:01:56.759+0000 7f4c13c39700 1 -- 192.168.123.103:0/2720049248 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "daemon_types": "mgr", "target": ["mon-mgr", ""]}) v1 -- 0x7f4c0c108d50 con 0x7f4bf806c290 2026-03-09T00:01:57.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:57 vm03.local ceph-mon[52346]: pgmap v85: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 8.0 KiB/s rd, 1.3 KiB/s wr, 9 op/s 2026-03-09T00:01:57.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:57 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:57.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:57 vm06.local ceph-mon[58395]: pgmap v85: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 8.0 KiB/s rd, 1.3 KiB/s wr, 9 op/s 2026-03-09T00:01:57.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:57 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:01:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:58 vm03.local ceph-mon[52346]: from='client.24321 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "daemon_types": "mgr", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:01:58.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:58 vm06.local ceph-mon[58395]: from='client.24321 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "daemon_types": "mgr", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:01:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:01:59 vm06.local ceph-mon[58395]: pgmap v86: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 8.0 KiB/s rd, 1.8 KiB/s wr, 9 op/s 2026-03-09T00:01:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:01:59 vm03.local ceph-mon[52346]: pgmap v86: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 8.0 KiB/s rd, 1.8 KiB/s wr, 9 op/s 2026-03-09T00:02:01.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:01 vm03.local ceph-mon[52346]: pgmap v87: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 6.9 KiB/s rd, 1.6 KiB/s wr, 7 op/s 2026-03-09T00:02:01.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:01 
vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:02:01.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:01 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:02:01.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:01 vm06.local ceph-mon[58395]: pgmap v87: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 6.9 KiB/s rd, 1.6 KiB/s wr, 7 op/s 2026-03-09T00:02:01.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:01 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:02:01.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:01 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:02:03.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:03 vm03.local ceph-mon[52346]: pgmap v88: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 6.5 KiB/s rd, 938 B/s wr, 6 op/s 2026-03-09T00:02:03.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:03 vm06.local ceph-mon[58395]: pgmap v88: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 6.5 KiB/s rd, 938 B/s wr, 6 op/s 2026-03-09T00:02:04.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:04 vm03.local ceph-mon[52346]: pgmap v89: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 6.2 KiB/s rd, 938 B/s wr, 6 op/s 2026-03-09T00:02:04.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:04 vm06.local ceph-mon[58395]: pgmap v89: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 6.2 KiB/s rd, 938 B/s wr, 6 op/s 2026-03-09T00:02:07.068 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:07 vm03.local ceph-mon[52346]: pgmap v90: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 4.3 KiB/s rd, 767 B/s wr, 4 op/s 2026-03-09T00:02:07.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:07 vm06.local ceph-mon[58395]: pgmap v90: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 4.3 KiB/s rd, 767 B/s wr, 4 op/s 2026-03-09T00:02:09.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:09 vm03.local ceph-mon[52346]: pgmap v91: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 767 B/s wr, 0 op/s 2026-03-09T00:02:09.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:09 vm06.local ceph-mon[58395]: pgmap v91: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 767 B/s wr, 0 op/s 2026-03-09T00:02:11.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:11 vm03.local ceph-mon[52346]: pgmap v92: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 85 B/s wr, 0 op/s 2026-03-09T00:02:11.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:11 vm06.local ceph-mon[58395]: pgmap v92: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 85 B/s wr, 0 op/s 2026-03-09T00:02:13.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:13 vm03.local ceph-mon[52346]: pgmap v93: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 85 B/s wr, 0 op/s 2026-03-09T00:02:13.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:13 vm06.local 
ceph-mon[58395]: pgmap v93: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 85 B/s wr, 0 op/s 2026-03-09T00:02:15.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:15 vm03.local ceph-mon[52346]: pgmap v94: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:02:15.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:15 vm06.local ceph-mon[58395]: pgmap v94: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:02:16.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:16 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:02:16.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:16 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:02:17.030 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:17 vm03.local ceph-mon[52346]: pgmap v95: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:02:17.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:17 vm06.local ceph-mon[58395]: pgmap v95: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:02:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:19 vm03.local ceph-mon[52346]: pgmap v96: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:02:19.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:19 vm06.local ceph-mon[58395]: pgmap v96: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:02:21.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:21 vm03.local ceph-mon[52346]: pgmap v97: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:02:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:21 vm06.local ceph-mon[58395]: pgmap v97: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:02:23.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:23 vm06.local ceph-mon[58395]: pgmap v98: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:02:23.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:23 vm03.local ceph-mon[52346]: pgmap v98: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:02:24.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:24 vm03.local ceph-mon[52346]: pgmap v99: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:02:24.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:24 vm06.local ceph-mon[58395]: pgmap v99: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:02:27.057 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:27 vm03.local ceph-mon[52346]: pgmap v100: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:02:27.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:27 vm06.local ceph-mon[58395]: pgmap v100: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:02:29.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:29 vm06.local ceph-mon[58395]: pgmap v101: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB 
/ 120 GiB avail 2026-03-09T00:02:29.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:29 vm03.local ceph-mon[52346]: pgmap v101: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:02:30.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:30 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:02:30.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:30 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:02:31.822 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:31 vm03.local ceph-mon[52346]: pgmap v102: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:02:31.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:31 vm06.local ceph-mon[58395]: pgmap v102: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-09T00:02:32.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:32 vm03.local ceph-mon[52346]: pgmap v103: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 85 B/s wr, 0 op/s 2026-03-09T00:02:32.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:32 vm06.local ceph-mon[58395]: pgmap v103: 65 pgs: 65 active+clean; 456 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 85 B/s wr, 0 op/s 2026-03-09T00:02:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:34 vm03.local ceph-mon[52346]: pgmap v104: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 0 op/s 2026-03-09T00:02:34.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:34 vm06.local ceph-mon[58395]: pgmap v104: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 0 op/s 2026-03-09T00:02:37.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:37 vm03.local ceph-mon[52346]: pgmap v105: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 597 B/s wr, 0 op/s 2026-03-09T00:02:37.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:37 vm06.local ceph-mon[58395]: pgmap v105: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 597 B/s wr, 0 op/s 2026-03-09T00:02:39.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:38 vm03.local ceph-mon[52346]: pgmap v106: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 597 B/s wr, 0 op/s 2026-03-09T00:02:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:38 vm06.local ceph-mon[58395]: pgmap v106: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 597 B/s wr, 0 op/s 2026-03-09T00:02:42.105 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:41 vm03.local ceph-mon[52346]: pgmap v107: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 597 B/s wr, 0 op/s 2026-03-09T00:02:42.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:41 vm06.local ceph-mon[58395]: pgmap v107: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 597 B/s wr, 0 op/s 2026-03-09T00:02:42.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:42.305+0000 7f4c02ffd700 1 -- 192.168.123.103:0/2720049248 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+89 (secure 0 0 0) 0x7f4c0c108d50 con 
0x7f4bf806c290 2026-03-09T00:02:42.307 INFO:teuthology.orchestra.run.vm03.stdout:Initiating upgrade to quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T00:02:42.308 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:42.308+0000 7f4c13c39700 1 -- 192.168.123.103:0/2720049248 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4bf806c290 msgr2=0x7f4bf806e750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:02:42.308 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:42.308+0000 7f4c13c39700 1 --2- 192.168.123.103:0/2720049248 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4bf806c290 0x7f4bf806e750 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f4bfc005fd0 tx=0x7f4bfc005ee0 comp rx=0 tx=0).stop 2026-03-09T00:02:42.308 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:42.308+0000 7f4c13c39700 1 -- 192.168.123.103:0/2720049248 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c103150 msgr2=0x7f4c0c198a00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:02:42.308 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:42.308+0000 7f4c13c39700 1 --2- 192.168.123.103:0/2720049248 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c103150 0x7f4c0c198a00 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f4c08006010 tx=0x7f4c080049e0 comp rx=0 tx=0).stop 2026-03-09T00:02:42.308 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:42.308+0000 7f4c13c39700 1 -- 192.168.123.103:0/2720049248 shutdown_connections 2026-03-09T00:02:42.308 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:42.308+0000 7f4c13c39700 1 --2- 192.168.123.103:0/2720049248 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f4bf806c290 0x7f4bf806e750 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:42.308 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:42.308+0000 7f4c13c39700 1 --2- 192.168.123.103:0/2720049248 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4c0c103150 0x7f4c0c198a00 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:42.308 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:42.308+0000 7f4c13c39700 1 --2- 192.168.123.103:0/2720049248 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4c0c104350 0x7f4c0c198f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:42.308 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:42.308+0000 7f4c13c39700 1 -- 192.168.123.103:0/2720049248 >> 192.168.123.103:0/2720049248 conn(0x7f4c0c0fe6d0 msgr2=0x7f4c0c107580 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:02:42.308 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:42.308+0000 7f4c13c39700 1 -- 192.168.123.103:0/2720049248 shutdown_connections 2026-03-09T00:02:42.308 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:42.308+0000 7f4c13c39700 1 -- 192.168.123.103:0/2720049248 wait complete. 2026-03-09T00:02:42.382 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'while ceph orch upgrade status | jq '"'"'.in_progress'"'"' | grep true && ! 
ceph orch upgrade status | jq '"'"'.message'"'"' | grep Error ; do ceph orch ps ; ceph versions ; ceph orch upgrade status ; sleep 30 ; done' 2026-03-09T00:02:42.785 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:02:43.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.295+0000 7f5c360c9700 1 -- 192.168.123.103:0/4138661983 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5c30103140 msgr2=0x7f5c30103560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.295+0000 7f5c360c9700 1 --2- 192.168.123.103:0/4138661983 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5c30103140 0x7f5c30103560 secure :-1 s=READY pgs=276 cs=0 l=1 rev1=1 crypto rx=0x7f5c20009b00 tx=0x7f5c20009e10 comp rx=0 tx=0).stop 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.296+0000 7f5c360c9700 1 -- 192.168.123.103:0/4138661983 shutdown_connections 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.296+0000 7f5c360c9700 1 --2- 192.168.123.103:0/4138661983 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c30104340 0x7f5c301047a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.296+0000 7f5c360c9700 1 --2- 192.168.123.103:0/4138661983 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5c30103140 0x7f5c30103560 unknown :-1 s=CLOSED pgs=276 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.296+0000 7f5c360c9700 1 -- 192.168.123.103:0/4138661983 >> 192.168.123.103:0/4138661983 conn(0x7f5c300fe6c0 msgr2=0x7f5c30100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.296+0000 7f5c360c9700 1 -- 192.168.123.103:0/4138661983 shutdown_connections 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.296+0000 7f5c360c9700 1 -- 192.168.123.103:0/4138661983 wait complete. 
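The shell loop launched at 00:02:42.382 is obscured by teuthology's '"'"' quote escaping; unescaped, it polls the orchestrator every 30 seconds and keeps running while the upgrade reports in_progress and its status message contains no error:

    while ceph orch upgrade status | jq '.in_progress' | grep true \
          && ! ceph orch upgrade status | jq '.message' | grep Error ; do
        ceph orch ps              # placement and running version of each daemon
        ceph versions             # version histogram per daemon type
        ceph orch upgrade status
        sleep 30
    done

Each iteration records daemon placement and version counts, giving the log a 30-second cadence of upgrade progress until the condition fails.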
2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.296+0000 7f5c360c9700 1 Processor -- start 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.296+0000 7f5c360c9700 1 -- start start 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.297+0000 7f5c360c9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c30103140 0x7f5c30198a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.297+0000 7f5c360c9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5c30104340 0x7f5c30198f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.297+0000 7f5c360c9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5c30199580 con 0x7f5c30104340 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.297+0000 7f5c360c9700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5c301996c0 con 0x7f5c30103140 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.297+0000 7f5c2f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c30103140 0x7f5c30198a20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.297+0000 7f5c2f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c30103140 0x7f5c30198a20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:49530/0 (socket says 192.168.123.103:49530) 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.297+0000 7f5c2f7fe700 1 -- 192.168.123.103:0/1738155205 learned_addr learned my addr 192.168.123.103:0/1738155205 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.297+0000 7f5c2effd700 1 --2- 192.168.123.103:0/1738155205 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5c30104340 0x7f5c30198f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.297+0000 7f5c2f7fe700 1 -- 192.168.123.103:0/1738155205 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5c30104340 msgr2=0x7f5c30198f60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.297+0000 7f5c2f7fe700 1 --2- 192.168.123.103:0/1738155205 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5c30104340 0x7f5c30198f60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.297+0000 7f5c2f7fe700 1 -- 192.168.123.103:0/1738155205 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f5c200097e0 con 0x7f5c30103140 2026-03-09T00:02:43.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.297+0000 7f5c2f7fe700 1 --2- 192.168.123.103:0/1738155205 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c30103140 0x7f5c30198a20 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f5c200052d0 tx=0x7f5c2000bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:02:43.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.299+0000 7f5c2cff9700 1 -- 192.168.123.103:0/1738155205 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5c2001d070 con 0x7f5c30103140 2026-03-09T00:02:43.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.299+0000 7f5c360c9700 1 -- 192.168.123.103:0/1738155205 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5c3019e110 con 0x7f5c30103140 2026-03-09T00:02:43.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.299+0000 7f5c360c9700 1 -- 192.168.123.103:0/1738155205 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5c3019e600 con 0x7f5c30103140 2026-03-09T00:02:43.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.299+0000 7f5c2cff9700 1 -- 192.168.123.103:0/1738155205 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5c2000bdf0 con 0x7f5c30103140 2026-03-09T00:02:43.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.299+0000 7f5c2cff9700 1 -- 192.168.123.103:0/1738155205 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5c20021c30 con 0x7f5c30103140 2026-03-09T00:02:43.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.300+0000 7f5c360c9700 1 -- 192.168.123.103:0/1738155205 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5c1c005320 con 0x7f5c30103140 2026-03-09T00:02:43.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.300+0000 7f5c2cff9700 1 -- 192.168.123.103:0/1738155205 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f5c20021410 con 0x7f5c30103140 2026-03-09T00:02:43.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.300+0000 7f5c2cff9700 1 --2- 192.168.123.103:0/1738155205 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5c1806c600 0x7f5c1806eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:02:43.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.300+0000 7f5c2cff9700 1 -- 192.168.123.103:0/1738155205 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f5c2008ceb0 con 0x7f5c30103140 2026-03-09T00:02:43.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.301+0000 7f5c2effd700 1 --2- 192.168.123.103:0/1738155205 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5c1806c600 0x7f5c1806eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:02:43.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.301+0000 7f5c2effd700 1 --2- 192.168.123.103:0/1738155205 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5c1806c600 0x7f5c1806eac0 
secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f5c24009c00 tx=0x7f5c24009380 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:02:43.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.304+0000 7f5c2cff9700 1 -- 192.168.123.103:0/1738155205 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f5c2005b1e0 con 0x7f5c30103140 2026-03-09T00:02:43.443 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:43 vm03.local ceph-mon[52346]: pgmap v108: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 682 B/s wr, 0 op/s 2026-03-09T00:02:43.444 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:43 vm03.local ceph-mon[52346]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T00:02:43.444 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:43 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:02:43.444 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:43 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:02:43.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.442+0000 7f5c360c9700 1 -- 192.168.123.103:0/1738155205 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f5c1c000bf0 con 0x7f5c1806c600 2026-03-09T00:02:43.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.445+0000 7f5c2cff9700 1 -- 192.168.123.103:0/1738155205 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+259 (secure 0 0 0) 0x7f5c1c000bf0 con 0x7f5c1806c600 2026-03-09T00:02:43.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.447+0000 7f5c167fc700 1 -- 192.168.123.103:0/1738155205 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5c1806c600 msgr2=0x7f5c1806eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:02:43.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.447+0000 7f5c167fc700 1 --2- 192.168.123.103:0/1738155205 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5c1806c600 0x7f5c1806eac0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f5c24009c00 tx=0x7f5c24009380 comp rx=0 tx=0).stop 2026-03-09T00:02:43.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.447+0000 7f5c167fc700 1 -- 192.168.123.103:0/1738155205 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c30103140 msgr2=0x7f5c30198a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:02:43.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.447+0000 7f5c167fc700 1 --2- 192.168.123.103:0/1738155205 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c30103140 0x7f5c30198a20 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f5c200052d0 tx=0x7f5c2000bac0 comp rx=0 tx=0).stop 2026-03-09T00:02:43.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.447+0000 7f5c167fc700 1 -- 192.168.123.103:0/1738155205 shutdown_connections 2026-03-09T00:02:43.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.447+0000 7f5c167fc700 1 --2- 192.168.123.103:0/1738155205 >> 
[v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f5c1806c600 0x7f5c1806eac0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:43.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.447+0000 7f5c167fc700 1 --2- 192.168.123.103:0/1738155205 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5c30103140 0x7f5c30198a20 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:43.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.447+0000 7f5c167fc700 1 --2- 192.168.123.103:0/1738155205 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5c30104340 0x7f5c30198f60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:43.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.447+0000 7f5c167fc700 1 -- 192.168.123.103:0/1738155205 >> 192.168.123.103:0/1738155205 conn(0x7f5c300fe6c0 msgr2=0x7f5c30107570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:02:43.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.448+0000 7f5c167fc700 1 -- 192.168.123.103:0/1738155205 shutdown_connections 2026-03-09T00:02:43.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.448+0000 7f5c167fc700 1 -- 192.168.123.103:0/1738155205 wait complete. 2026-03-09T00:02:43.459 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T00:02:43.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.536+0000 7fad4bfff700 1 -- 192.168.123.103:0/1968824749 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad4c072b20 msgr2=0x7fad4c072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:02:43.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.536+0000 7fad4bfff700 1 --2- 192.168.123.103:0/1968824749 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad4c072b20 0x7fad4c072f40 secure :-1 s=READY pgs=277 cs=0 l=1 rev1=1 crypto rx=0x7fad3c009b50 tx=0x7fad3c009e60 comp rx=0 tx=0).stop 2026-03-09T00:02:43.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.536+0000 7fad4bfff700 1 -- 192.168.123.103:0/1968824749 shutdown_connections 2026-03-09T00:02:43.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.536+0000 7fad4bfff700 1 --2- 192.168.123.103:0/1968824749 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fad4c075a10 0x7fad4c077ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:43.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.536+0000 7fad4bfff700 1 --2- 192.168.123.103:0/1968824749 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad4c072b20 0x7fad4c072f40 unknown :-1 s=CLOSED pgs=277 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:43.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.536+0000 7fad4bfff700 1 -- 192.168.123.103:0/1968824749 >> 192.168.123.103:0/1968824749 conn(0x7fad4c06daa0 msgr2=0x7fad4c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.536+0000 7fad4bfff700 1 -- 192.168.123.103:0/1968824749 shutdown_connections 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.536+0000 7fad4bfff700 1 -- 192.168.123.103:0/1968824749 wait complete. 
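The bare `true` on stdout at 00:02:43.459 is the loop's first `.in_progress` probe: `ceph orch upgrade status` emits JSON (which is why it can be piped straight into jq), and the only fields the loop relies on are in_progress and message. A one-shot equivalent of the loop's continue condition, assuming just those two fields:

    # Exit 0 while the upgrade is running cleanly, non-zero otherwise.
    ceph orch upgrade status \
        | jq -e '.in_progress == true and ((.message // "") | contains("Error") | not)'

Once the mgr daemons reach the target image the module stops reporting in_progress, the condition fails, and the harness moves on to its next check.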
2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.536+0000 7fad4bfff700 1 Processor -- start 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.536+0000 7fad4bfff700 1 -- start start 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.536+0000 7fad4bfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fad4c075a10 0x7fad4c083080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.536+0000 7fad4bfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad4c0835c0 0x7fad4c1b3090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.536+0000 7fad4bfff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fad4c083b00 con 0x7fad4c0835c0 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.536+0000 7fad4bfff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fad4c083c70 con 0x7fad4c075a10 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.537+0000 7fad4a7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad4c0835c0 0x7fad4c1b3090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.537+0000 7fad4a7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad4c0835c0 0x7fad4c1b3090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42522/0 (socket says 192.168.123.103:42522) 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.537+0000 7fad4a7fc700 1 -- 192.168.123.103:0/1078126734 learned_addr learned my addr 192.168.123.103:0/1078126734 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.537+0000 7fad4a7fc700 1 -- 192.168.123.103:0/1078126734 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fad4c075a10 msgr2=0x7fad4c083080 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.537+0000 7fad4affd700 1 --2- 192.168.123.103:0/1078126734 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fad4c075a10 0x7fad4c083080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.537+0000 7fad4a7fc700 1 --2- 192.168.123.103:0/1078126734 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fad4c075a10 0x7fad4c083080 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.537+0000 7fad4a7fc700 1 -- 192.168.123.103:0/1078126734 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fad3c0097e0 con 
0x7fad4c0835c0 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.537+0000 7fad4a7fc700 1 --2- 192.168.123.103:0/1078126734 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad4c0835c0 0x7fad4c1b3090 secure :-1 s=READY pgs=278 cs=0 l=1 rev1=1 crypto rx=0x7fad4400c390 tx=0x7fad4400c750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.537+0000 7fad33fff700 1 -- 192.168.123.103:0/1078126734 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fad440090d0 con 0x7fad4c0835c0 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.537+0000 7fad33fff700 1 -- 192.168.123.103:0/1078126734 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fad4400f040 con 0x7fad4c0835c0 2026-03-09T00:02:43.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.537+0000 7fad33fff700 1 -- 192.168.123.103:0/1078126734 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fad44014670 con 0x7fad4c0835c0 2026-03-09T00:02:43.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.537+0000 7fad4bfff700 1 -- 192.168.123.103:0/1078126734 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fad4c1b3690 con 0x7fad4c0835c0 2026-03-09T00:02:43.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.537+0000 7fad4bfff700 1 -- 192.168.123.103:0/1078126734 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fad4c1b3b60 con 0x7fad4c0835c0 2026-03-09T00:02:43.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.539+0000 7fad33fff700 1 -- 192.168.123.103:0/1078126734 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fad440147d0 con 0x7fad4c0835c0 2026-03-09T00:02:43.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.539+0000 7fad33fff700 1 --2- 192.168.123.103:0/1078126734 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fad3406e820 0x7fad34070ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:02:43.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.539+0000 7fad33fff700 1 -- 192.168.123.103:0/1078126734 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fad4408c9f0 con 0x7fad4c0835c0 2026-03-09T00:02:43.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.540+0000 7fad4affd700 1 --2- 192.168.123.103:0/1078126734 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fad3406e820 0x7fad34070ce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:02:43.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.540+0000 7fad4affd700 1 --2- 192.168.123.103:0/1078126734 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fad3406e820 0x7fad34070ce0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fad3c005fd0 tx=0x7fad3c011040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:02:43.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.540+0000 7fad4bfff700 1 -- 192.168.123.103:0/1078126734 
--> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fad4c04ea90 con 0x7fad4c0835c0
2026-03-09T00:02:43.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.544+0000 7fad33fff700 1 -- 192.168.123.103:0/1078126734 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fad44057290 con 0x7fad4c0835c0
2026-03-09T00:02:43.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.669+0000 7fad4bfff700 1 -- 192.168.123.103:0/1078126734 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fad4c077730 con 0x7fad3406e820
2026-03-09T00:02:43.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:43 vm06.local ceph-mon[58395]: pgmap v108: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 682 B/s wr, 0 op/s
2026-03-09T00:02:43.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:43 vm06.local ceph-mon[58395]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T00:02:43.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:43 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:02:43.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:43 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:02:43.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.672+0000 7fad33fff700 1 -- 192.168.123.103:0/1078126734 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+259 (secure 0 0 0) 0x7fad4c077730 con 0x7fad3406e820
2026-03-09T00:02:43.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.679+0000 7fad31ffb700 1 -- 192.168.123.103:0/1078126734 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fad3406e820 msgr2=0x7fad34070ce0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:02:43.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.679+0000 7fad31ffb700 1 --2- 192.168.123.103:0/1078126734 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fad3406e820 0x7fad34070ce0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fad3c005fd0 tx=0x7fad3c011040 comp rx=0 tx=0).stop
2026-03-09T00:02:43.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.679+0000 7fad31ffb700 1 -- 192.168.123.103:0/1078126734 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad4c0835c0 msgr2=0x7fad4c1b3090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:02:43.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.679+0000 7fad31ffb700 1 --2- 192.168.123.103:0/1078126734 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad4c0835c0 0x7fad4c1b3090 secure :-1 s=READY pgs=278 cs=0 l=1 rev1=1 crypto rx=0x7fad4400c390 tx=0x7fad4400c750 comp rx=0 tx=0).stop
2026-03-09T00:02:43.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.679+0000 7fad31ffb700 1 -- 192.168.123.103:0/1078126734 shutdown_connections
2026-03-09T00:02:43.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.679+0000 7fad31ffb700 1 --2- 192.168.123.103:0/1078126734 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2]
conn(0x7fad3406e820 0x7fad34070ce0 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:43.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.679+0000 7fad31ffb700 1 --2- 192.168.123.103:0/1078126734 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fad4c075a10 0x7fad4c083080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:43.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.679+0000 7fad31ffb700 1 --2- 192.168.123.103:0/1078126734 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad4c0835c0 0x7fad4c1b3090 unknown :-1 s=CLOSED pgs=278 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:43.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.679+0000 7fad31ffb700 1 -- 192.168.123.103:0/1078126734 >> 192.168.123.103:0/1078126734 conn(0x7fad4c06daa0 msgr2=0x7fad4c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:02:43.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.679+0000 7fad31ffb700 1 -- 192.168.123.103:0/1078126734 shutdown_connections 2026-03-09T00:02:43.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.679+0000 7fad31ffb700 1 -- 192.168.123.103:0/1078126734 wait complete. 2026-03-09T00:02:43.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.768+0000 7f26819aa700 1 -- 192.168.123.103:0/2234544071 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f267c104060 msgr2=0x7f267c1044e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:02:43.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.768+0000 7f26819aa700 1 --2- 192.168.123.103:0/2234544071 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f267c104060 0x7f267c1044e0 secure :-1 s=READY pgs=279 cs=0 l=1 rev1=1 crypto rx=0x7f2668009b00 tx=0x7f2668009e10 comp rx=0 tx=0).stop 2026-03-09T00:02:43.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.770+0000 7f26819aa700 1 -- 192.168.123.103:0/2234544071 shutdown_connections 2026-03-09T00:02:43.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.770+0000 7f26819aa700 1 --2- 192.168.123.103:0/2234544071 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f267c104060 0x7f267c1044e0 unknown :-1 s=CLOSED pgs=279 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:43.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.770+0000 7f26819aa700 1 --2- 192.168.123.103:0/2234544071 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f267c102e70 0x7f267c103290 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:43.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.770+0000 7f26819aa700 1 -- 192.168.123.103:0/2234544071 >> 192.168.123.103:0/2234544071 conn(0x7f267c0fe440 msgr2=0x7f267c1008a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:02:43.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.770+0000 7f26819aa700 1 -- 192.168.123.103:0/2234544071 shutdown_connections 2026-03-09T00:02:43.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.773+0000 7f26819aa700 1 -- 192.168.123.103:0/2234544071 wait complete. 
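The stderr bursts above are client-side messenger debug: each one-shot ceph invocation opens a fresh RADOS session, and with "debug ms = 1" delivered to clients through the mon config (the "config(28 keys)" messages), every command logs its full connect/READY/mark_down cycle. A minimal sketch of toggling that per invocation, assuming a reachable cluster and an admin keyring (any Ceph config option can be passed as a CLI flag):

    # Illustrative only: reproduce the messenger chatter seen above for a
    # single command, then run the same command with it silenced.
    ceph --debug-ms 1 status   # emits the --2- connect/READY/stop lines on stderr
    ceph --debug-ms 0 status   # same command, messenger debug suppressed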
2026-03-09T00:02:43.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.773+0000 7f26819aa700 1 Processor -- start 2026-03-09T00:02:43.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.773+0000 7f26819aa700 1 -- start start 2026-03-09T00:02:43.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.773+0000 7f26819aa700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f267c102e70 0x7f267c1a1510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:02:43.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.773+0000 7f26819aa700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f267c104060 0x7f267c1a1af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:02:43.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.773+0000 7f26819aa700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f267c1a2130 con 0x7f267c104060 2026-03-09T00:02:43.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.773+0000 7f26819aa700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f267c1a22a0 con 0x7f267c102e70 2026-03-09T00:02:43.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.774+0000 7f26809a8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f267c102e70 0x7f267c1a1510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:02:43.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.774+0000 7f26809a8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f267c102e70 0x7f267c1a1510 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:49556/0 (socket says 192.168.123.103:49556) 2026-03-09T00:02:43.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.774+0000 7f26809a8700 1 -- 192.168.123.103:0/2562644363 learned_addr learned my addr 192.168.123.103:0/2562644363 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:02:43.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.774+0000 7f267bfff700 1 --2- 192.168.123.103:0/2562644363 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f267c104060 0x7f267c1a1af0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:02:43.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.774+0000 7f26809a8700 1 -- 192.168.123.103:0/2562644363 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f267c104060 msgr2=0x7f267c1a1af0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:02:43.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.774+0000 7f26809a8700 1 --2- 192.168.123.103:0/2562644363 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f267c104060 0x7f267c1a1af0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:43.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.774+0000 7f26809a8700 1 -- 192.168.123.103:0/2562644363 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f26680097e0 con 0x7f267c102e70 2026-03-09T00:02:43.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.775+0000 7f26809a8700 1 --2- 192.168.123.103:0/2562644363 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f267c102e70 0x7f267c1a1510 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f267000b700 tx=0x7f267000ba10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:02:43.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.775+0000 7f2679ffb700 1 -- 192.168.123.103:0/2562644363 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2670011840 con 0x7f267c102e70 2026-03-09T00:02:43.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.775+0000 7f26819aa700 1 -- 192.168.123.103:0/2562644363 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f267c1a6a20 con 0x7f267c102e70 2026-03-09T00:02:43.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.775+0000 7f26819aa700 1 -- 192.168.123.103:0/2562644363 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f267c1a6f70 con 0x7f267c102e70 2026-03-09T00:02:43.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.775+0000 7f2679ffb700 1 -- 192.168.123.103:0/2562644363 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2670011e80 con 0x7f267c102e70 2026-03-09T00:02:43.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.775+0000 7f2679ffb700 1 -- 192.168.123.103:0/2562644363 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f26700103e0 con 0x7f267c102e70 2026-03-09T00:02:43.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.776+0000 7f2679ffb700 1 -- 192.168.123.103:0/2562644363 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f26700119a0 con 0x7f267c102e70 2026-03-09T00:02:43.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.777+0000 7f26819aa700 1 -- 192.168.123.103:0/2562644363 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2660005320 con 0x7f267c102e70 2026-03-09T00:02:43.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.777+0000 7f2679ffb700 1 --2- 192.168.123.103:0/2562644363 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f266c06c4e0 0x7f266c06e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:02:43.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.777+0000 7f2679ffb700 1 -- 192.168.123.103:0/2562644363 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f267008b510 con 0x7f267c102e70 2026-03-09T00:02:43.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.777+0000 7f267bfff700 1 --2- 192.168.123.103:0/2562644363 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f266c06c4e0 0x7f266c06e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:02:43.777 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.777+0000 7f267bfff700 1 --2- 192.168.123.103:0/2562644363 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f266c06c4e0 0x7f266c06e9a0 
secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f2668005fd0 tx=0x7f2668009f90 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:02:43.780 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.779+0000 7f2679ffb700 1 -- 192.168.123.103:0/2562644363 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f26700597c0 con 0x7f267c102e70
2026-03-09T00:02:43.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.946+0000 7f26819aa700 1 -- 192.168.123.103:0/2562644363 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f2660000bf0 con 0x7f266c06c4e0
2026-03-09T00:02:43.955 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (2m) 53s ago 3m 22.5M - 0.25.0 c8568f914cd2 9b05d2f3502a
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (3m) 53s ago 3m 7964k - 18.2.1 5be31c24972a b93f8a220f71
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (2m) 54s ago 2m 8220k - 18.2.1 5be31c24972a d06aea65065e
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (3m) 53s ago 3m 7402k - 18.2.1 5be31c24972a 320f8ef2d2cb
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (2m) 54s ago 2m 7411k - 18.2.1 5be31c24972a d9eb9a54d81d
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (2m) 53s ago 2m 80.7M - 9.4.7 954c08fa6188 9db2e5805e97
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (57s) 53s ago 57s 14.0M - 18.2.1 5be31c24972a 404501ca3f76
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (59s) 53s ago 59s 14.4M - 18.2.1 5be31c24972a b71cb8823eff
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (57s) 54s ago 57s 16.1M - 18.2.1 5be31c24972a 868f24dd3b07
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 running (59s) 54s ago 59s 10.7M - 18.2.1 5be31c24972a 84dbd6c37a69
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:9283,8765,8443 running (3m) 53s ago 3m 503M - 18.2.1 5be31c24972a e48c90025d56
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (2m) 54s ago 2m 450M - 18.2.1 5be31c24972a 4c6a564e9efa
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (3m) 53s ago 3m 51.2M 2048M 18.2.1 5be31c24972a f9863944dcfb
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (2m) 54s ago 2m 48.2M 2048M 18.2.1 5be31c24972a 1e39c7ad3e9f
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (3m) 53s ago 3m 14.0M - 1.5.0 0da6a335fe13 750af7597536
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (2m) 54s ago 2m 14.0M - 1.5.0 0da6a335fe13 a82b7dc84593
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (2m) 53s ago 2m 49.0M 4096M 18.2.1 5be31c24972a 7582c56d43e3
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (2m) 53s ago 2m 48.8M 4096M 18.2.1 5be31c24972a 7bc729875521
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (113s) 53s ago 113s 48.6M 4096M 18.2.1 5be31c24972a 00566abbcc16
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (102s) 54s ago 102s 46.8M 4096M 18.2.1 5be31c24972a 1ece32056ab6
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (92s) 54s ago 92s 42.5M 4096M 18.2.1 5be31c24972a ee6260a1124c
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (82s) 54s ago 82s 45.5M 4096M 18.2.1 5be31c24972a f51e8cd94301
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (2m) 53s ago 2m 36.8M - 2.43.0 a07b618ecd1d a4a1b4f06180
2026-03-09T00:02:43.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.954+0000 7f2679ffb700 1 -- 192.168.123.103:0/2562644363 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 0) 0x7f2660000bf0 con 0x7f266c06c4e0
2026-03-09T00:02:43.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.962+0000 7f26677fe700 1 -- 192.168.123.103:0/2562644363 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f266c06c4e0 msgr2=0x7f266c06e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:02:43.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.962+0000 7f26677fe700 1 --2- 192.168.123.103:0/2562644363 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f266c06c4e0 0x7f266c06e9a0 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f2668005fd0 tx=0x7f2668009f90 comp rx=0 tx=0).stop
2026-03-09T00:02:43.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.962+0000 7f26677fe700 1 -- 192.168.123.103:0/2562644363 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f267c102e70 msgr2=0x7f267c1a1510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:02:43.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.962+0000 7f26677fe700 1 --2- 192.168.123.103:0/2562644363 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f267c102e70 0x7f267c1a1510 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f267000b700 tx=0x7f267000ba10 comp rx=0 tx=0).stop
2026-03-09T00:02:43.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.962+0000 7f26677fe700 1 -- 192.168.123.103:0/2562644363 shutdown_connections
2026-03-09T00:02:43.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.962+0000 7f26677fe700 1 --2- 192.168.123.103:0/2562644363 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f266c06c4e0 0x7f266c06e9a0 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:02:43.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.962+0000 7f26677fe700 1 --2- 192.168.123.103:0/2562644363 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f267c102e70 0x7f267c1a1510 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:02:43.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.962+0000 7f26677fe700 1 --2- 192.168.123.103:0/2562644363 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f267c104060 0x7f267c1a1af0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:02:43.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.962+0000 7f26677fe700 1 -- 192.168.123.103:0/2562644363 >> 192.168.123.103:0/2562644363 conn(0x7f267c0fe440 msgr2=0x7f267c107320 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:02:43.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.963+0000 7f26677fe700 1 -- 192.168.123.103:0/2562644363 shutdown_connections
2026-03-09T00:02:43.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:43.964+0000 7f26677fe700 1 -- 192.168.123.103:0/2562644363 wait complete.
2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.051+0000 7fc48a4ef700 1 -- 192.168.123.103:0/2642968545 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc48410a700 msgr2=0x7fc48410cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.051+0000 7fc48a4ef700 1 --2- 192.168.123.103:0/2642968545 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc48410a700 0x7fc48410cb90 secure :-1 s=READY pgs=280 cs=0 l=1 rev1=1 crypto rx=0x7fc474009b00 tx=0x7fc474009e10 comp rx=0 tx=0).stop
2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.054+0000 7fc48a4ef700 1 -- 192.168.123.103:0/2642968545 shutdown_connections
2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.054+0000 7fc48a4ef700 1 --2- 192.168.123.103:0/2642968545 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc48410a700 0x7fc48410cb90 unknown :-1 s=CLOSED pgs=280 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.054+0000 7fc48a4ef700 1 --2- 192.168.123.103:0/2642968545 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc484107d90 0x7fc48410a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.054+0000 7fc48a4ef700 1 -- 192.168.123.103:0/2642968545 >> 192.168.123.103:0/2642968545 conn(0x7fc48406daa0 msgr2=0x7fc48406ff00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.054+0000 7fc48a4ef700 1 -- 192.168.123.103:0/2642968545 shutdown_connections
2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.054+0000 7fc48a4ef700 1 -- 192.168.123.103:0/2642968545 wait complete.
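The orch ps listing above is the human-readable rendering; the same inventory is available as JSON for scripted checks. An illustrative sketch (not part of the suite; assumes jq is installed on the host) that counts daemons still reporting the pre-upgrade version:

    # Count daemons whose reported version is still 18.2.1; as the
    # staggered upgrade proceeds this number should fall to zero.
    ceph orch ps --format json | jq '[.[] | select(.version == "18.2.1")] | length'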
2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.054+0000 7fc48a4ef700 1 Processor -- start 2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.054+0000 7fc48a4ef700 1 -- start start 2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.054+0000 7fc48a4ef700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc484107d90 0x7fc484116ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.054+0000 7fc48a4ef700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc48410a700 0x7fc484117220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.054+0000 7fc48a4ef700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc484117840 con 0x7fc48410a700 2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.054+0000 7fc48a4ef700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc484117980 con 0x7fc484107d90 2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.054+0000 7fc4894ed700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc484107d90 0x7fc484116ce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.054+0000 7fc4894ed700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc484107d90 0x7fc484116ce0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:49582/0 (socket says 192.168.123.103:49582) 2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.054+0000 7fc4894ed700 1 -- 192.168.123.103:0/993063543 learned_addr learned my addr 192.168.123.103:0/993063543 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.055+0000 7fc4894ed700 1 -- 192.168.123.103:0/993063543 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc48410a700 msgr2=0x7fc484117220 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.055+0000 7fc4894ed700 1 --2- 192.168.123.103:0/993063543 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc48410a700 0x7fc484117220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.055+0000 7fc4894ed700 1 -- 192.168.123.103:0/993063543 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc4740097e0 con 0x7fc484107d90 2026-03-09T00:02:44.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.055+0000 7fc4894ed700 1 --2- 192.168.123.103:0/993063543 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc484107d90 0x7fc484116ce0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fc48000d8d0 tx=0x7fc48000dc90 comp rx=0 tx=0).ready entity=mon.1 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:02:44.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.056+0000 7fc47a7fc700 1 -- 192.168.123.103:0/993063543 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc480009940 con 0x7fc484107d90 2026-03-09T00:02:44.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.056+0000 7fc48a4ef700 1 -- 192.168.123.103:0/993063543 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc4841b3520 con 0x7fc484107d90 2026-03-09T00:02:44.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.056+0000 7fc48a4ef700 1 -- 192.168.123.103:0/993063543 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc4841b3a70 con 0x7fc484107d90 2026-03-09T00:02:44.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.056+0000 7fc47a7fc700 1 -- 192.168.123.103:0/993063543 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc480010460 con 0x7fc484107d90 2026-03-09T00:02:44.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.056+0000 7fc47a7fc700 1 -- 192.168.123.103:0/993063543 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc48000f5d0 con 0x7fc484107d90 2026-03-09T00:02:44.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.057+0000 7fc48a4ef700 1 -- 192.168.123.103:0/993063543 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc468005320 con 0x7fc484107d90 2026-03-09T00:02:44.061 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.061+0000 7fc47a7fc700 1 -- 192.168.123.103:0/993063543 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fc4800105d0 con 0x7fc484107d90 2026-03-09T00:02:44.061 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.061+0000 7fc47a7fc700 1 --2- 192.168.123.103:0/993063543 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc47006c530 0x7fc47006e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:02:44.061 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.062+0000 7fc47a7fc700 1 -- 192.168.123.103:0/993063543 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fc48008bc60 con 0x7fc484107d90 2026-03-09T00:02:44.061 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.062+0000 7fc488cec700 1 --2- 192.168.123.103:0/993063543 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc47006c530 0x7fc47006e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:02:44.061 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.062+0000 7fc47a7fc700 1 -- 192.168.123.103:0/993063543 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fc4800b8180 con 0x7fc484107d90 2026-03-09T00:02:44.061 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.062+0000 7fc488cec700 1 --2- 192.168.123.103:0/993063543 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc47006c530 0x7fc47006e9f0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fc47400b5c0 tx=0x7fc474009f90 
comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:02:44.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.247+0000 7fc48a4ef700 1 -- 192.168.123.103:0/993063543 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fc468005cc0 con 0x7fc484107d90
2026-03-09T00:02:44.255 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:02:44.255 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T00:02:44.255 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2
2026-03-09T00:02:44.255 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:02:44.255 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T00:02:44.255 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2
2026-03-09T00:02:44.255 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:02:44.256 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T00:02:44.256 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6
2026-03-09T00:02:44.256 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:02:44.256 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T00:02:44.256 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-09T00:02:44.256 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:02:44.256 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T00:02:44.256 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 14
2026-03-09T00:02:44.256 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-09T00:02:44.256 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:02:44.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.254+0000 7fc47a7fc700 1 -- 192.168.123.103:0/993063543 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7fc480059f90 con 0x7fc484107d90
2026-03-09T00:02:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.259+0000 7fc46ffff700 1 -- 192.168.123.103:0/993063543 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc47006c530 msgr2=0x7fc47006e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:02:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.259+0000 7fc46ffff700 1 --2- 192.168.123.103:0/993063543 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc47006c530 0x7fc47006e9f0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fc47400b5c0 tx=0x7fc474009f90 comp rx=0 tx=0).stop
2026-03-09T00:02:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.259+0000 7fc46ffff700 1 -- 192.168.123.103:0/993063543 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc484107d90 msgr2=0x7fc484116ce0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:02:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.259+0000 7fc46ffff700 1 --2- 192.168.123.103:0/993063543 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc484107d90 0x7fc484116ce0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fc48000d8d0 tx=0x7fc48000dc90 comp rx=0 tx=0).stop
2026-03-09T00:02:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.259+0000 7fc46ffff700 1 -- 192.168.123.103:0/993063543 shutdown_connections
2026-03-09T00:02:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.259+0000 7fc46ffff700 1 --2- 192.168.123.103:0/993063543 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc47006c530 0x7fc47006e9f0 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:02:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.259+0000 7fc46ffff700 1 --2- 192.168.123.103:0/993063543 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc484107d90 0x7fc484116ce0 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:02:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.259+0000 7fc46ffff700 1 --2- 192.168.123.103:0/993063543 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc48410a700 0x7fc484117220 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:02:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.259+0000 7fc46ffff700 1 -- 192.168.123.103:0/993063543 >> 192.168.123.103:0/993063543 conn(0x7fc48406daa0 msgr2=0x7fc48406ff00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:02:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.260+0000 7fc46ffff700 1 -- 192.168.123.103:0/993063543 shutdown_connections
2026-03-09T00:02:44.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.260+0000 7fc46ffff700 1 -- 192.168.123.103:0/993063543 wait complete.
2026-03-09T00:02:44.315 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:44 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:02:44.315 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:44 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:02:44.315 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:44 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:02:44.315 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:44 vm03.local ceph-mon[52346]: from='client.? 192.168.123.103:0/993063543' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.357+0000 7f78bffff700 1 -- 192.168.123.103:0/2575273830 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f78c0075a10 msgr2=0x7f78c0077ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.357+0000 7f78bffff700 1 --2- 192.168.123.103:0/2575273830 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f78c0075a10 0x7f78c0077ea0 secure :-1 s=READY pgs=281 cs=0 l=1 rev1=1 crypto rx=0x7f78b800b780 tx=0x7f78b800ba90 comp rx=0 tx=0).stop
2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.357+0000 7f78bffff700 1 -- 192.168.123.103:0/2575273830 shutdown_connections
2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.357+0000 7f78bffff700 1 --2- 192.168.123.103:0/2575273830 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f78c0075a10 0x7f78c0077ea0 unknown :-1 s=CLOSED pgs=281 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.357+0000 7f78bffff700 1 --2- 192.168.123.103:0/2575273830 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f78c0072b20 0x7f78c0072f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.357+0000 7f78bffff700 1 -- 192.168.123.103:0/2575273830 >> 192.168.123.103:0/2575273830 conn(0x7f78c006daa0 msgr2=0x7f78c006ff00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.358+0000 7f78bffff700 1 -- 192.168.123.103:0/2575273830 shutdown_connections
2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.358+0000 7f78bffff700 1 -- 192.168.123.103:0/2575273830 wait complete.
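Since ceph versions already emits JSON (as above), a convergence check reduces to asserting a single key in the "overall" map. A minimal sketch of that kind of assertion (assumes jq; exits non-zero until every daemon reports the same build):

    # Succeeds only once mons, mgrs, osds and mdss all report one version.
    ceph versions | jq -e '.overall | length == 1'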
2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.358+0000 7f78bffff700 1 Processor -- start 2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.358+0000 7f78bffff700 1 -- start start 2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.358+0000 7f78bffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f78c0072b20 0x7f78c0083990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.358+0000 7f78bffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f78c012bdb0 0x7f78c012e240 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.358+0000 7f78bffff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f78c012e780 con 0x7f78c012bdb0 2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.358+0000 7f78bffff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f78c012e8f0 con 0x7f78c0072b20 2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.358+0000 7f78be7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f78c012bdb0 0x7f78c012e240 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.358+0000 7f78be7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f78c012bdb0 0x7f78c012e240 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42584/0 (socket says 192.168.123.103:42584) 2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.358+0000 7f78be7fc700 1 -- 192.168.123.103:0/1601869656 learned_addr learned my addr 192.168.123.103:0/1601869656 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.358+0000 7f78beffd700 1 --2- 192.168.123.103:0/1601869656 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f78c0072b20 0x7f78c0083990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.359+0000 7f78be7fc700 1 -- 192.168.123.103:0/1601869656 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f78c0072b20 msgr2=0x7f78c0083990 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.359+0000 7f78be7fc700 1 --2- 192.168.123.103:0/1601869656 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f78c0072b20 0x7f78c0083990 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.359+0000 7f78be7fc700 1 -- 192.168.123.103:0/1601869656 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f78b800b050 con 0x7f78c012bdb0 2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.359+0000 7f78be7fc700 1 --2- 192.168.123.103:0/1601869656 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f78c012bdb0 0x7f78c012e240 secure :-1 s=READY pgs=282 cs=0 l=1 rev1=1 crypto rx=0x7f78b800b750 tx=0x7f78b8009300 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.359+0000 7f78a7fff700 1 -- 192.168.123.103:0/1601869656 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f78b8003bb0 con 0x7f78c012bdb0 2026-03-09T00:02:44.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.360+0000 7f78a7fff700 1 -- 192.168.123.103:0/1601869656 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f78b8003d10 con 0x7f78c012bdb0 2026-03-09T00:02:44.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.360+0000 7f78a7fff700 1 -- 192.168.123.103:0/1601869656 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f78b8004790 con 0x7f78c012bdb0 2026-03-09T00:02:44.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.360+0000 7f78bffff700 1 -- 192.168.123.103:0/1601869656 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f78c012eb70 con 0x7f78c012bdb0 2026-03-09T00:02:44.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.360+0000 7f78bffff700 1 -- 192.168.123.103:0/1601869656 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f78c012f060 con 0x7f78c012bdb0 2026-03-09T00:02:44.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.361+0000 7f78a7fff700 1 -- 192.168.123.103:0/1601869656 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f78b80077d0 con 0x7f78c012bdb0 2026-03-09T00:02:44.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.361+0000 7f78a7fff700 1 --2- 192.168.123.103:0/1601869656 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f78a806c530 0x7f78a806e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:02:44.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.362+0000 7f78a7fff700 1 -- 192.168.123.103:0/1601869656 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f78b808c7f0 con 0x7f78c012bdb0 2026-03-09T00:02:44.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.363+0000 7f78beffd700 1 --2- 192.168.123.103:0/1601869656 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f78a806c530 0x7f78a806e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:02:44.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.363+0000 7f78beffd700 1 --2- 192.168.123.103:0/1601869656 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f78a806c530 0x7f78a806e9f0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f78b0009910 tx=0x7f78b0008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:02:44.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.363+0000 7f78a5ffb700 1 -- 
192.168.123.103:0/1601869656 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f78c004ea90 con 0x7f78c012bdb0 2026-03-09T00:02:44.369 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.366+0000 7f78a7fff700 1 -- 192.168.123.103:0/1601869656 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f78b805ab20 con 0x7f78c012bdb0 2026-03-09T00:02:44.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.544+0000 7f78a5ffb700 1 -- 192.168.123.103:0/1601869656 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f78c0061cc0 con 0x7f78a806c530 2026-03-09T00:02:44.548 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T00:02:44.548 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", 2026-03-09T00:02:44.548 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-09T00:02:44.548 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading daemons of type(s) mgr", 2026-03-09T00:02:44.548 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [], 2026-03-09T00:02:44.548 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "", 2026-03-09T00:02:44.548 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image", 2026-03-09T00:02:44.548 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-09T00:02:44.549 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T00:02:44.549 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.547+0000 7f78a7fff700 1 -- 192.168.123.103:0/1601869656 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+351 (secure 0 0 0) 0x7f78c0061cc0 con 0x7f78a806c530 2026-03-09T00:02:44.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.549+0000 7f78a5ffb700 1 -- 192.168.123.103:0/1601869656 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f78a806c530 msgr2=0x7f78a806e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:02:44.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.549+0000 7f78a5ffb700 1 --2- 192.168.123.103:0/1601869656 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f78a806c530 0x7f78a806e9f0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f78b0009910 tx=0x7f78b0008040 comp rx=0 tx=0).stop 2026-03-09T00:02:44.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.549+0000 7f78a5ffb700 1 -- 192.168.123.103:0/1601869656 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f78c012bdb0 msgr2=0x7f78c012e240 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:02:44.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.549+0000 7f78a5ffb700 1 --2- 192.168.123.103:0/1601869656 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f78c012bdb0 0x7f78c012e240 secure :-1 s=READY pgs=282 cs=0 l=1 rev1=1 crypto rx=0x7f78b800b750 tx=0x7f78b8009300 comp rx=0 tx=0).stop 2026-03-09T00:02:44.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.551+0000 7f78a5ffb700 1 -- 192.168.123.103:0/1601869656 shutdown_connections 2026-03-09T00:02:44.551 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.551+0000 7f78a5ffb700 1 --2- 192.168.123.103:0/1601869656 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f78a806c530 0x7f78a806e9f0 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:44.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.551+0000 7f78a5ffb700 1 --2- 192.168.123.103:0/1601869656 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f78c0072b20 0x7f78c0083990 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:44.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.551+0000 7f78a5ffb700 1 --2- 192.168.123.103:0/1601869656 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f78c012bdb0 0x7f78c012e240 unknown :-1 s=CLOSED pgs=282 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:02:44.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.551+0000 7f78a5ffb700 1 -- 192.168.123.103:0/1601869656 >> 192.168.123.103:0/1601869656 conn(0x7f78c006daa0 msgr2=0x7f78c006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:02:44.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.551+0000 7f78a5ffb700 1 -- 192.168.123.103:0/1601869656 shutdown_connections 2026-03-09T00:02:44.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:02:44.551+0000 7f78a5ffb700 1 -- 192.168.123.103:0/1601869656 wait complete. 2026-03-09T00:02:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:44 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:02:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:44 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:02:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:44 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:02:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:44 vm06.local ceph-mon[58395]: from='client.? 
192.168.123.103:0/993063543' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:02:45.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:45 vm03.local ceph-mon[52346]: from='client.24323 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:02:45.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:45 vm03.local ceph-mon[52346]: from='client.14544 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:02:45.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:45 vm03.local ceph-mon[52346]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T00:02:45.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:45 vm03.local ceph-mon[52346]: from='client.24327 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:02:45.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:45 vm03.local ceph-mon[52346]: pgmap v109: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 597 B/s wr, 0 op/s
2026-03-09T00:02:45.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:45 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:02:45.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:45 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:02:45.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:45 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:02:45.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:45 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:02:45.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:45 vm06.local ceph-mon[58395]: from='client.24323 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:02:45.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:45 vm06.local ceph-mon[58395]: from='client.14544 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:02:45.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:45 vm06.local ceph-mon[58395]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T00:02:45.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:45 vm06.local ceph-mon[58395]: from='client.24327 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:02:45.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:45 vm06.local ceph-mon[58395]: pgmap v109: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 597 B/s wr, 0 op/s
2026-03-09T00:02:45.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:45 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:02:45.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:45 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:02:45.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:45 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:02:45.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:45 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr:Note: switching to '569c3e99c9b32a51b4eaf08731c728f4513ed589'.
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr:
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr:You are in 'detached HEAD' state. You can look around, make experimental
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr:changes and commit them, and you can discard any commits you make in this
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr:state without impacting any branches by switching back to a branch.
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr:
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr:If you want to create a new branch to retain commits you create, you may
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr:do so (now or later) by using -c with the switch command. Example:
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr:
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr: git switch -c <new-branch-name>
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr:
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr:Or undo this operation with:
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr:
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr: git switch -
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr:
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr:Turn off this advice by setting config variable advice.detachedHead to false
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr:
2026-03-09T00:02:46.536 INFO:tasks.workunit.client.1.vm06.stderr:HEAD is now at 569c3e99c9b qa/rgw: bucket notifications use pynose
2026-03-09T00:02:46.541 DEBUG:teuthology.orchestra.run.vm06:> cd -- /home/ubuntu/cephtest/clone.client.1/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.1
2026-03-09T00:02:46.597 INFO:tasks.workunit.client.1.vm06.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done
2026-03-09T00:02:46.599 INFO:tasks.workunit.client.1.vm06.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io'
2026-03-09T00:02:46.599 INFO:tasks.workunit.client.1.vm06.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test
2026-03-09T00:02:46.643 INFO:tasks.workunit.client.1.vm06.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io
2026-03-09T00:02:46.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:46 vm06.local ceph-mon[58395]: from='client.14554 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:02:46.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:46 vm06.local ceph-mon[58395]: Upgrade: Target is version 19.2.3-678-ge911bdeb (unknown)
2026-03-09T00:02:46.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:46 vm06.local ceph-mon[58395]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc']
2026-03-09T00:02:46.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:46 vm06.local ceph-mon[58395]: Upgrade: Need to upgrade myself (mgr.vm03.yvcons)
2026-03-09T00:02:46.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:46 vm06.local ceph-mon[58395]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm06
2026-03-09T00:02:46.678 INFO:tasks.workunit.client.1.vm06.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read
2026-03-09T00:02:46.706 INFO:tasks.workunit.client.1.vm06.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io'
2026-03-09T00:02:46.707 INFO:tasks.workunit.client.1.vm06.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs'
2026-03-09T00:02:46.707 INFO:tasks.workunit.client.1.vm06.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc
2026-03-09T00:02:46.736 INFO:tasks.workunit.client.1.vm06.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs'
2026-03-09T00:02:46.739 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-09T00:02:46.739 DEBUG:teuthology.orchestra.run.vm06:> dd if=/home/ubuntu/cephtest/workunits.list.client.1 of=/dev/stdout
2026-03-09T00:02:46.794 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.1...
2026-03-09T00:02:46.795 INFO:tasks.workunit:Running workunit suites/fsstress.sh...
2026-03-09T00:02:46.795 DEBUG:teuthology.orchestra.run.vm06:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && cd -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="1" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.1 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.1 CEPH_MNT=/home/ubuntu/cephtest/mnt.1 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.1/qa/workunits/suites/fsstress.sh
2026-03-09T00:02:46.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:46 vm03.local ceph-mon[52346]: from='client.14554 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:02:46.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:46 vm03.local ceph-mon[52346]: Upgrade: Target is version 19.2.3-678-ge911bdeb (unknown)
2026-03-09T00:02:46.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:46 vm03.local ceph-mon[52346]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc']
2026-03-09T00:02:46.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:46 vm03.local ceph-mon[52346]: Upgrade: Need to upgrade myself (mgr.vm03.yvcons)
2026-03-09T00:02:46.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:46 vm03.local ceph-mon[52346]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm06
2026-03-09T00:02:46.860 INFO:tasks.workunit.client.1.vm06.stderr:+ mkdir -p fsstress
2026-03-09T00:02:46.862 INFO:tasks.workunit.client.1.vm06.stderr:+ pushd fsstress
2026-03-09T00:02:46.862 INFO:tasks.workunit.client.1.vm06.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp
2026-03-09T00:02:46.862 INFO:tasks.workunit.client.1.vm06.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz
2026-03-09T00:02:47.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:47 vm06.local ceph-mon[58395]: pgmap v110: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 170 B/s wr, 0 op/s
2026-03-09T00:02:47.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:47 vm03.local ceph-mon[52346]: pgmap v110: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 170 B/s wr, 0 op/s
2026-03-09T00:02:48.576 INFO:tasks.workunit.client.1.vm06.stderr:+ tar xzf ltp-full.tgz
2026-03-09T00:02:49.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:49 vm06.local ceph-mon[58395]: pgmap v111: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 85 B/s wr, 0 op/s
2026-03-09T00:02:49.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:49 vm03.local ceph-mon[52346]: pgmap v111: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 85 B/s wr, 0 op/s
2026-03-09T00:02:51.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:51 vm06.local ceph-mon[58395]: pgmap v112: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 85 B/s wr, 0 op/s
2026-03-09T00:02:51.837 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:51 vm03.local ceph-mon[52346]: pgmap v112: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 85 B/s wr, 0 op/s
2026-03-09T00:02:52.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:52 vm03.local ceph-mon[52346]: pgmap v113: 65 pgs: 65 active+clean; 1.6 MiB data, 164 MiB used, 120 GiB / 120 GiB avail; 102 KiB/s wr, 0 op/s
2026-03-09T00:02:52.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:52 vm06.local ceph-mon[58395]: pgmap v113: 65 pgs: 65 active+clean; 1.6 MiB data, 164 MiB used, 120 GiB / 120 GiB avail; 102 KiB/s wr, 0 op/s
2026-03-09T00:02:55.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:54 vm03.local ceph-mon[52346]: pgmap v114: 65 pgs: 65 active+clean; 6.8 MiB data, 177 MiB used, 120 GiB / 120 GiB avail; 556 KiB/s wr, 16 op/s
2026-03-09T00:02:55.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:54 vm06.local ceph-mon[58395]: pgmap v114: 65 pgs: 65 active+clean; 6.8 MiB data, 177 MiB used, 120 GiB / 120 GiB avail; 556 KiB/s wr, 16 op/s
2026-03-09T00:02:57.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:56 vm03.local ceph-mon[52346]: pgmap v115: 65 pgs: 65 active+clean; 22 MiB data, 243 MiB used, 120 GiB / 120 GiB avail; 1.9 MiB/s wr, 149 op/s
2026-03-09T00:02:57.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:56 vm06.local ceph-mon[58395]: pgmap v115: 65 pgs: 65 active+clean; 22 MiB data, 243 MiB used, 120 GiB / 120 GiB avail; 1.9 MiB/s wr, 149 op/s
2026-03-09T00:02:58.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:02:58 vm06.local ceph-mon[58395]: pgmap v116: 65 pgs: 65 active+clean; 36 MiB data, 279 MiB used, 120 GiB / 120 GiB avail; 3.1 MiB/s wr, 230 op/s
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr:Note: switching to '569c3e99c9b32a51b4eaf08731c728f4513ed589'.
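The mon entries above show cephadm resolving the upgrade target (19.2.3-678-ge911bdeb) from the ceph-ci image digest and the active mgr scheduling its own redeploy first, which is the staggered mgr upgrade step of this test. A minimal sketch of the polling loop a driver can use to wait for such an upgrade to converge, assuming an admin keyring on the host and that "ceph orch upgrade status --format json" reports an "in_progress" flag (treat the exact field name as an assumption):

import json
import subprocess
import time

def upgrade_status():
    # Shell out like the cephadm.shell task steps do; --format json avoids
    # scraping the human-readable status text.
    out = subprocess.check_output(
        ["ceph", "orch", "upgrade", "status", "--format", "json"])
    return json.loads(out)

def wait_for_upgrade(poll_secs=15, timeout_secs=3600):
    # Poll until cephadm reports the upgrade finished, or give up.
    deadline = time.monotonic() + timeout_secs
    while time.monotonic() < deadline:
        status = upgrade_status()
        if not status.get("in_progress"):
            return status
        time.sleep(poll_secs)
    raise TimeoutError("cephadm upgrade did not converge in time")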
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr:
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr:You are in 'detached HEAD' state. You can look around, make experimental
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr:changes and commit them, and you can discard any commits you make in this
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr:state without impacting any branches by switching back to a branch.
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr:
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr:If you want to create a new branch to retain commits you create, you may
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr:do so (now or later) by using -c with the switch command. Example:
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr:
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr: git switch -c <new-branch-name>
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr:
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr:Or undo this operation with:
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr:
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr: git switch -
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr:
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr:Turn off this advice by setting config variable advice.detachedHead to false
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr:
2026-03-09T00:02:59.020 INFO:tasks.workunit.client.0.vm03.stderr:HEAD is now at 569c3e99c9b qa/rgw: bucket notifications use pynose
2026-03-09T00:02:59.025 DEBUG:teuthology.orchestra.run.vm03:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0
2026-03-09T00:02:59.047 INFO:tasks.workunit.client.0.vm03.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done
2026-03-09T00:02:59.049 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io'
2026-03-09T00:02:59.049 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test
2026-03-09T00:02:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:02:58 vm03.local ceph-mon[52346]: pgmap v116: 65 pgs: 65 active+clean; 36 MiB data, 279 MiB used, 120 GiB / 120 GiB avail; 3.1 MiB/s wr, 230 op/s
2026-03-09T00:02:59.129 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io
2026-03-09T00:02:59.167 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read
2026-03-09T00:02:59.195 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io'
2026-03-09T00:02:59.197 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs'
2026-03-09T00:02:59.197 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc
2026-03-09T00:02:59.224 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs'
2026-03-09T00:02:59.228 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-09T00:02:59.228 DEBUG:teuthology.orchestra.run.vm03:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout
2026-03-09T00:02:59.286 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.0...
2026-03-09T00:02:59.287 INFO:tasks.workunit:Running workunit suites/fsstress.sh...
2026-03-09T00:02:59.287 DEBUG:teuthology.orchestra.run.vm03:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/fsstress.sh
2026-03-09T00:02:59.352 INFO:tasks.workunit.client.0.vm03.stderr:+ mkdir -p fsstress
2026-03-09T00:02:59.355 INFO:tasks.workunit.client.0.vm03.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp
2026-03-09T00:02:59.355 INFO:tasks.workunit.client.0.vm03.stderr:+ pushd fsstress
2026-03-09T00:02:59.355 INFO:tasks.workunit.client.0.vm03.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz
2026-03-09T00:03:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:00 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:03:00.837 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:00 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:03:01.126 INFO:tasks.workunit.client.0.vm03.stderr:+ tar xzf ltp-full.tgz
2026-03-09T00:03:02.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:01 vm03.local ceph-mon[52346]: pgmap v117: 65 pgs: 65 active+clean; 40 MiB data, 285 MiB used, 120 GiB / 120 GiB avail; 3.4 MiB/s wr, 233 op/s
2026-03-09T00:03:02.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:01 vm06.local ceph-mon[58395]: pgmap v117: 65 pgs: 65 active+clean; 40 MiB data, 285 MiB used, 120 GiB / 120 GiB avail; 3.4 MiB/s wr, 233 op/s
2026-03-09T00:03:03.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:03 vm03.local ceph-mon[52346]: pgmap v118: 65 pgs: 65 active+clean; 50 MiB data, 332 MiB used, 120 GiB / 120 GiB avail; 4.3 MiB/s wr, 321 op/s
2026-03-09T00:03:03.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:03 vm06.local ceph-mon[58395]: pgmap v118: 65 pgs: 65 active+clean; 50 MiB data, 332 MiB used, 120 GiB / 120 GiB avail; 4.3 MiB/s wr, 321 op/s
2026-03-09T00:03:05.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:05 vm03.local ceph-mon[52346]: pgmap v119: 65 pgs: 65 active+clean; 65 MiB data, 428 MiB used, 120 GiB / 120 GiB avail; 5.5 MiB/s wr, 437 op/s
2026-03-09T00:03:05.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:05 vm06.local ceph-mon[58395]: pgmap v119: 65 pgs: 65 active+clean; 65 MiB data, 428 MiB used, 120 GiB / 120 GiB avail; 5.5 MiB/s wr, 437 op/s
2026-03-09T00:03:06.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:06 vm06.local ceph-mon[58395]: pgmap v120: 65 pgs: 65 active+clean; 75 MiB data, 640 MiB used, 119 GiB / 120 GiB avail; 6.0 MiB/s wr, 557 op/s
2026-03-09T00:03:07.072 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:06 vm03.local ceph-mon[52346]: pgmap v120: 65 pgs: 65 active+clean; 75 MiB data, 640 MiB used, 119 GiB / 120 GiB avail; 6.0 MiB/s wr, 557 op/s
2026-03-09T00:03:09.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:09 vm06.local ceph-mon[58395]: pgmap v121: 65 pgs: 65 active+clean; 97 MiB data, 706 MiB used, 119 GiB / 120 GiB avail; 6.5 MiB/s wr, 507 op/s
2026-03-09T00:03:09.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:09 vm03.local ceph-mon[52346]: pgmap v121: 65 pgs: 65 active+clean; 97 MiB data, 706 MiB used, 119 GiB / 120 GiB avail; 6.5 MiB/s wr, 507 op/s
2026-03-09T00:03:10.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:10 vm03.local ceph-mon[52346]: pgmap v122: 65 pgs: 65 active+clean; 107 MiB data, 737 MiB used, 119 GiB / 120 GiB avail; 6.2 MiB/s wr, 484 op/s
2026-03-09T00:03:10.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:10 vm06.local ceph-mon[58395]: pgmap v122: 65 pgs: 65 active+clean; 107 MiB data, 737 MiB used, 119 GiB / 120 GiB avail; 6.2 MiB/s wr, 484 op/s
2026-03-09T00:03:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:13 vm06.local ceph-mon[58395]: pgmap v123: 65 pgs: 65 active+clean; 117 MiB data, 855 MiB used, 119 GiB / 120 GiB avail; 6.8 MiB/s wr, 599 op/s
2026-03-09T00:03:13.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:13 vm03.local ceph-mon[52346]: pgmap v123: 65 pgs: 65 active+clean; 117 MiB data, 855 MiB used, 119 GiB / 120 GiB avail; 6.8 MiB/s wr, 599 op/s
2026-03-09T00:03:14.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.641+0000 7f7164444700 1 -- 192.168.123.103:0/1624891744 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f715c10a700 msgr2=0x7f715c10cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:14.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.641+0000 7f7164444700 1 --2- 192.168.123.103:0/1624891744 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f715c10a700 0x7f715c10cb90 secure :-1 s=READY pgs=283 cs=0 l=1 rev1=1 crypto rx=0x7f715401c320 tx=0x7f715401c630 comp rx=0 tx=0).stop
2026-03-09T00:03:14.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.642+0000 7f7164444700 1 -- 192.168.123.103:0/1624891744 shutdown_connections
2026-03-09T00:03:14.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.642+0000 7f7164444700 1 --2- 192.168.123.103:0/1624891744 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f715c10a700 0x7f715c10cb90 unknown :-1 s=CLOSED pgs=283 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:14.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.642+0000 7f7164444700 1 --2- 192.168.123.103:0/1624891744 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f715c107d90 0x7f715c10a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:14.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.642+0000 7f7164444700 1 -- 192.168.123.103:0/1624891744 >> 192.168.123.103:0/1624891744 conn(0x7f715c06dad0 msgr2=0x7f715c06ff30 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:03:14.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.642+0000 7f7164444700 1 -- 192.168.123.103:0/1624891744 shutdown_connections
2026-03-09T00:03:14.641 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.642+0000 7f7164444700 1 -- 192.168.123.103:0/1624891744 wait complete.
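The pgmap summaries above track the fsstress workload ramping up (460 KiB to 117 MiB of data, write rates climbing from a few hundred B/s to 6.8 MiB/s) while both clients unpack and run the LTP tarball. A small sketch that extracts the write-rate series from such mon journal lines; the helper and regex are our own, not part of teuthology:

import re

# Matches the pgmap summaries embedded in the mon journal lines above, e.g.
# "pgmap v123: 65 pgs: 65 active+clean; 117 MiB data, ...; 6.8 MiB/s wr, 599 op/s"
PGMAP_RE = re.compile(
    r"pgmap v(?P<ver>\d+):.*?;\s*(?P<data>[\d.]+)\s*(?P<dunit>KiB|MiB|GiB) data"
    r".*?(?P<wr>[\d.]+)\s*(?P<wunit>B|KiB|MiB)/s wr")

SCALE = {"B": 1, "KiB": 2**10, "MiB": 2**20, "GiB": 2**30}

def write_rates(lines):
    # Yield (pgmap_version, write_bytes_per_sec) for every summary seen.
    for line in lines:
        m = PGMAP_RE.search(line)
        if m:
            yield int(m.group("ver")), float(m.group("wr")) * SCALE[m.group("wunit")]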
2026-03-09T00:03:14.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.642+0000 7f7164444700 1 Processor -- start
2026-03-09T00:03:14.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.642+0000 7f7164444700 1 -- start start
2026-03-09T00:03:14.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.643+0000 7f7164444700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f715c107d90 0x7f715c19cec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:03:14.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.643+0000 7f7164444700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f715c10a700 0x7f715c19d400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:03:14.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.643+0000 7f7164444700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f715c19da20 con 0x7f715c10a700
2026-03-09T00:03:14.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.643+0000 7f7164444700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f715c19db60 con 0x7f715c107d90
2026-03-09T00:03:14.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.643+0000 7f71619df700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f715c10a700 0x7f715c19d400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:03:14.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.643+0000 7f71621e0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f715c107d90 0x7f715c19cec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:03:14.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.643+0000 7f71619df700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f715c10a700 0x7f715c19d400 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57766/0 (socket says 192.168.123.103:57766)
2026-03-09T00:03:14.642 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.643+0000 7f71619df700 1 -- 192.168.123.103:0/4153878193 learned_addr learned my addr 192.168.123.103:0/4153878193 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:03:14.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.643+0000 7f71619df700 1 -- 192.168.123.103:0/4153878193 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f715c107d90 msgr2=0x7f715c19cec0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:14.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.643+0000 7f71619df700 1 --2- 192.168.123.103:0/4153878193 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f715c107d90 0x7f715c19cec0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:14.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.643+0000 7f71619df700 1 -- 192.168.123.103:0/4153878193 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f715401c060 con 0x7f715c10a700
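The reconnect above is the msgr2 client handshake in order: connect, _handle_peer_banner_payload, handle_hello (where learned_addr fixes the client's own address), then mon_subscribe once the winning mon connection is up; the connection to the other mon is marked down mid-handshake. A sketch of a checker that follows per-connection states through such lines (our own regex and state list; conn pointers can be reused across clients, so the result is only indicative):

import re

# Client-side msgr2 states in the order the handshake above passes through them.
ORDER = ["NONE", "BANNER_CONNECTING", "HELLO_CONNECTING", "AUTH_CONNECTING", "READY"]

STATE_RE = re.compile(r"conn\((?P<ptr>0x[0-9a-f]+)[^)]*\bs=(?P<state>[A-Z_]+)")

def check_handshake_order(lines):
    # Print any connection whose state appears to move backwards through ORDER.
    last = {}
    for line in lines:
        for m in STATE_RE.finditer(line):
            state = m.group("state")
            if state not in ORDER:
                continue  # skip CLOSED, START_CONNECT, STATE_* socket states
            ptr, idx = m.group("ptr"), ORDER.index(state)
            if idx < last.get(ptr, -1):
                print(f"conn {ptr}: regressed to {state}")
            last[ptr] = idx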
2026-03-09T00:03:14.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.644+0000 7f71619df700 1 --2- 192.168.123.103:0/4153878193 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f715c10a700 0x7f715c19d400 secure :-1 s=READY pgs=284 cs=0 l=1 rev1=1 crypto rx=0x7f7154007dc0 tx=0x7f7154003ab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:03:14.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.644+0000 7f71537fe700 1 -- 192.168.123.103:0/4153878193 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7154003e30 con 0x7f715c10a700
2026-03-09T00:03:14.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.644+0000 7f7164444700 1 -- 192.168.123.103:0/4153878193 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f715c1a25b0 con 0x7f715c10a700
2026-03-09T00:03:14.643 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.644+0000 7f7164444700 1 -- 192.168.123.103:0/4153878193 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f715c1a2b00 con 0x7f715c10a700
2026-03-09T00:03:14.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.646+0000 7f71537fe700 1 -- 192.168.123.103:0/4153878193 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f715401f050 con 0x7f715c10a700
2026-03-09T00:03:14.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.646+0000 7f71537fe700 1 -- 192.168.123.103:0/4153878193 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7154023a20 con 0x7f715c10a700
2026-03-09T00:03:14.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.646+0000 7f71537fe700 1 -- 192.168.123.103:0/4153878193 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f715402a040 con 0x7f715c10a700
2026-03-09T00:03:14.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.647+0000 7f71537fe700 1 --2- 192.168.123.103:0/4153878193 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f714806c380 0x7f714806e840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:03:14.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.647+0000 7f71621e0700 1 --2- 192.168.123.103:0/4153878193 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f714806c380 0x7f714806e840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:03:14.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.647+0000 7f71537fe700 1 -- 192.168.123.103:0/4153878193 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f715409e780 con 0x7f715c10a700
2026-03-09T00:03:14.649 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.648+0000 7f7164444700 1 -- 192.168.123.103:0/4153878193 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7140005320 con 0x7f715c10a700
2026-03-09T00:03:14.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.650+0000 7f71537fe700 1 -- 192.168.123.103:0/4153878193 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f715406cab0 con 0x7f715c10a700
2026-03-09T00:03:14.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.654+0000 7f71621e0700 1 --2- 192.168.123.103:0/4153878193 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f714806c380 0x7f714806e840 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f7158009910 tx=0x7f7158008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:03:14.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.779+0000 7f7164444700 1 -- 192.168.123.103:0/4153878193 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7140000bf0 con 0x7f714806c380
2026-03-09T00:03:14.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.780+0000 7f71537fe700 1 -- 192.168.123.103:0/4153878193 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+434 (secure 0 0 0) 0x7f7140000bf0 con 0x7f714806c380
2026-03-09T00:03:14.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.783+0000 7f71517fa700 1 -- 192.168.123.103:0/4153878193 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f714806c380 msgr2=0x7f714806e840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:14.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.783+0000 7f71517fa700 1 --2- 192.168.123.103:0/4153878193 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f714806c380 0x7f714806e840 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f7158009910 tx=0x7f7158008040 comp rx=0 tx=0).stop
2026-03-09T00:03:14.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.783+0000 7f71517fa700 1 -- 192.168.123.103:0/4153878193 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f715c10a700 msgr2=0x7f715c19d400 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:14.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.783+0000 7f71517fa700 1 --2- 192.168.123.103:0/4153878193 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f715c10a700 0x7f715c19d400 secure :-1 s=READY pgs=284 cs=0 l=1 rev1=1 crypto rx=0x7f7154007dc0 tx=0x7f7154003ab0 comp rx=0 tx=0).stop
2026-03-09T00:03:14.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.783+0000 7f71517fa700 1 -- 192.168.123.103:0/4153878193 shutdown_connections
2026-03-09T00:03:14.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.783+0000 7f71517fa700 1 --2- 192.168.123.103:0/4153878193 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f714806c380 0x7f714806e840 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:14.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.783+0000 7f71517fa700 1 --2- 192.168.123.103:0/4153878193 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f715c107d90 0x7f715c19cec0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:14.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.783+0000 7f71517fa700 1 --2- 192.168.123.103:0/4153878193 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f715c10a700 0x7f715c19d400 unknown :-1 s=CLOSED pgs=284 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:14.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.783+0000 7f71517fa700 1 -- 192.168.123.103:0/4153878193 >> 192.168.123.103:0/4153878193 conn(0x7f715c06dad0 msgr2=0x7f715c10c3b0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:03:14.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.783+0000 7f71517fa700 1 -- 192.168.123.103:0/4153878193 shutdown_connections
2026-03-09T00:03:14.783 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.783+0000 7f71517fa700 1 -- 192.168.123.103:0/4153878193 wait complete.
2026-03-09T00:03:14.792 INFO:teuthology.orchestra.run.vm03.stdout:true
2026-03-09T00:03:14.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.873+0000 7f305ddf5700 1 -- 192.168.123.103:0/44899795 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3058075a10 msgr2=0x7f3058077ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:14.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.873+0000 7f305ddf5700 1 --2- 192.168.123.103:0/44899795 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3058075a10 0x7f3058077ea0 secure :-1 s=READY pgs=285 cs=0 l=1 rev1=1 crypto rx=0x7f305000b600 tx=0x7f305000b910 comp rx=0 tx=0).stop
2026-03-09T00:03:14.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.873+0000 7f305ddf5700 1 -- 192.168.123.103:0/44899795 shutdown_connections
2026-03-09T00:03:14.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.873+0000 7f305ddf5700 1 --2- 192.168.123.103:0/44899795 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3058075a10 0x7f3058077ea0 unknown :-1 s=CLOSED pgs=285 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:14.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.873+0000 7f305ddf5700 1 --2- 192.168.123.103:0/44899795 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3058072b20 0x7f3058072f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:14.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.873+0000 7f305ddf5700 1 -- 192.168.123.103:0/44899795 >> 192.168.123.103:0/44899795 conn(0x7f305806daa0 msgr2=0x7f305806ff00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:03:14.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.873+0000 7f305ddf5700 1 -- 192.168.123.103:0/44899795 shutdown_connections
2026-03-09T00:03:14.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.873+0000 7f305ddf5700 1 -- 192.168.123.103:0/44899795 wait complete.
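The lone "true" on stdout is the jq verdict from one of the task's while-loop checks; everything around it is a full RADOS client being built and torn down for that single command, which is why each probe is bracketed by learned_addr/mon_subscribe on the way in and mark_down/shutdown_connections on the way out. A minimal python-rados sketch of the same lifecycle, assuming python3-rados and a readable /etc/ceph/ceph.conf:

import json
import rados

# One short-lived client per command, mirroring the CLI invocations above.
cluster = rados.Rados(conffile="/etc/ceph/ceph.conf", name="client.admin")
cluster.connect()  # banner/hello handshake, learned_addr, mon_subscribe
try:
    # Roughly what `ceph versions` does after the CLI fetches
    # get_command_descriptions (the ~180 KB mon reply seen above).
    ret, outbuf, outs = cluster.mon_command(
        json.dumps({"prefix": "versions"}), b"")
    print(ret, outbuf.decode())
finally:
    cluster.shutdown()  # mark_down + shutdown_connections, as logged above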
2026-03-09T00:03:14.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.874+0000 7f305ddf5700 1 Processor -- start
2026-03-09T00:03:14.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.874+0000 7f305ddf5700 1 -- start start
2026-03-09T00:03:14.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.874+0000 7f305ddf5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3058072b20 0x7f3058083130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:03:14.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.874+0000 7f305ddf5700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3058075a10 0x7f3058083670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:03:14.874 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.874+0000 7f305ddf5700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3058083ce0 con 0x7f3058072b20
2026-03-09T00:03:14.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.874+0000 7f305ddf5700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f30581b3140 con 0x7f3058075a10
2026-03-09T00:03:14.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.874+0000 7f305cdf3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3058072b20 0x7f3058083130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:03:14.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.874+0000 7f305cdf3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3058072b20 0x7f3058083130 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57786/0 (socket says 192.168.123.103:57786)
2026-03-09T00:03:14.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.874+0000 7f305cdf3700 1 -- 192.168.123.103:0/1098250460 learned_addr learned my addr 192.168.123.103:0/1098250460 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:03:14.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.874+0000 7f3057fff700 1 --2- 192.168.123.103:0/1098250460 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3058075a10 0x7f3058083670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:03:14.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.875+0000 7f305cdf3700 1 -- 192.168.123.103:0/1098250460 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3058075a10 msgr2=0x7f3058083670 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:14.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.875+0000 7f305cdf3700 1 --2- 192.168.123.103:0/1098250460 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3058075a10 0x7f3058083670 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:14.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.875+0000 7f305cdf3700 1 -- 192.168.123.103:0/1098250460 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f305000b050 con 0x7f3058072b20
2026-03-09T00:03:14.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.875+0000 7f305cdf3700 1 --2- 192.168.123.103:0/1098250460 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3058072b20 0x7f3058083130 secure :-1 s=READY pgs=286 cs=0 l=1 rev1=1 crypto rx=0x7f304800ba70 tx=0x7f304800bd80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:03:14.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.875+0000 7f3055ffb700 1 -- 192.168.123.103:0/1098250460 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f304800c700 con 0x7f3058072b20
2026-03-09T00:03:14.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.875+0000 7f3055ffb700 1 -- 192.168.123.103:0/1098250460 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f304800cd40 con 0x7f3058072b20
2026-03-09T00:03:14.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.875+0000 7f3055ffb700 1 -- 192.168.123.103:0/1098250460 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3048012340 con 0x7f3058072b20
2026-03-09T00:03:14.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.875+0000 7f305ddf5700 1 -- 192.168.123.103:0/1098250460 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f30581b3420 con 0x7f3058072b20
2026-03-09T00:03:14.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.875+0000 7f305ddf5700 1 -- 192.168.123.103:0/1098250460 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f30581b3940 con 0x7f3058072b20
2026-03-09T00:03:14.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.877+0000 7f3055ffb700 1 -- 192.168.123.103:0/1098250460 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f30480124c0 con 0x7f3058072b20
2026-03-09T00:03:14.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.877+0000 7f3055ffb700 1 --2- 192.168.123.103:0/1098250460 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f304006c600 0x7f304006eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:03:14.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.877+0000 7f3055ffb700 1 -- 192.168.123.103:0/1098250460 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f304808b8a0 con 0x7f3058072b20
2026-03-09T00:03:14.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.877+0000 7f3057fff700 1 --2- 192.168.123.103:0/1098250460 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f304006c600 0x7f304006eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:03:14.877 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.878+0000 7f3057fff700 1 --2- 192.168.123.103:0/1098250460 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f304006c600 0x7f304006eac0 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f305000bd90 tx=0x7f30500096a0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:03:14.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.878+0000 7f305ddf5700 1 -- 192.168.123.103:0/1098250460 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3044005320 con 0x7f3058072b20
2026-03-09T00:03:14.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:14.881+0000 7f3055ffb700 1 -- 192.168.123.103:0/1098250460 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f30480561f0 con 0x7f3058072b20
2026-03-09T00:03:15.005 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.005+0000 7f305ddf5700 1 -- 192.168.123.103:0/1098250460 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3044000bf0 con 0x7f304006c600
2026-03-09T00:03:15.008 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.009+0000 7f3055ffb700 1 -- 192.168.123.103:0/1098250460 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+434 (secure 0 0 0) 0x7f3044000bf0 con 0x7f304006c600
2026-03-09T00:03:15.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.011+0000 7f305ddf5700 1 -- 192.168.123.103:0/1098250460 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f304006c600 msgr2=0x7f304006eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:15.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.011+0000 7f305ddf5700 1 --2- 192.168.123.103:0/1098250460 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f304006c600 0x7f304006eac0 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f305000bd90 tx=0x7f30500096a0 comp rx=0 tx=0).stop
2026-03-09T00:03:15.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.011+0000 7f305ddf5700 1 -- 192.168.123.103:0/1098250460 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3058072b20 msgr2=0x7f3058083130 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:15.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.011+0000 7f305ddf5700 1 --2- 192.168.123.103:0/1098250460 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3058072b20 0x7f3058083130 secure :-1 s=READY pgs=286 cs=0 l=1 rev1=1 crypto rx=0x7f304800ba70 tx=0x7f304800bd80 comp rx=0 tx=0).stop
2026-03-09T00:03:15.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.011+0000 7f305ddf5700 1 -- 192.168.123.103:0/1098250460 shutdown_connections
2026-03-09T00:03:15.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.011+0000 7f305ddf5700 1 --2- 192.168.123.103:0/1098250460 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f304006c600 0x7f304006eac0 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:15.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.011+0000 7f305ddf5700 1 --2- 192.168.123.103:0/1098250460 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3058072b20 0x7f3058083130 unknown :-1 s=CLOSED pgs=286 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:15.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.011+0000 7f305ddf5700 1 --2- 192.168.123.103:0/1098250460 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3058075a10 0x7f3058083670 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:15.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.011+0000 7f305ddf5700 1 -- 192.168.123.103:0/1098250460 >> 192.168.123.103:0/1098250460 conn(0x7f305806daa0 msgr2=0x7f305806ff00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:03:15.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.012+0000 7f305ddf5700 1 -- 192.168.123.103:0/1098250460 shutdown_connections
2026-03-09T00:03:15.011 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.012+0000 7f305ddf5700 1 -- 192.168.123.103:0/1098250460 wait complete.
2026-03-09T00:03:15.101 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.102+0000 7fc9a88fe700 1 -- 192.168.123.103:0/431692013 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a4075a10 msgr2=0x7fc9a4077ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:15.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.102+0000 7fc9a88fe700 1 --2- 192.168.123.103:0/431692013 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a4075a10 0x7fc9a4077ea0 secure :-1 s=READY pgs=287 cs=0 l=1 rev1=1 crypto rx=0x7fc99c00b600 tx=0x7fc99c00b910 comp rx=0 tx=0).stop
2026-03-09T00:03:15.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.102+0000 7fc9a88fe700 1 -- 192.168.123.103:0/431692013 shutdown_connections
2026-03-09T00:03:15.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.102+0000 7fc9a88fe700 1 --2- 192.168.123.103:0/431692013 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a4075a10 0x7fc9a4077ea0 unknown :-1 s=CLOSED pgs=287 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:15.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.102+0000 7fc9a88fe700 1 --2- 192.168.123.103:0/431692013 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9a4072b20 0x7fc9a4072f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:15.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.102+0000 7fc9a88fe700 1 -- 192.168.123.103:0/431692013 >> 192.168.123.103:0/431692013 conn(0x7fc9a406daa0 msgr2=0x7fc9a406ff00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:03:15.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.102+0000 7fc9a88fe700 1 -- 192.168.123.103:0/431692013 shutdown_connections
2026-03-09T00:03:15.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.102+0000 7fc9a88fe700 1 -- 192.168.123.103:0/431692013 wait complete.
2026-03-09T00:03:15.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.102+0000 7fc9a88fe700 1 Processor -- start 2026-03-09T00:03:15.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.102+0000 7fc9a88fe700 1 -- start start 2026-03-09T00:03:15.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.103+0000 7fc9a88fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9a4072b20 0x7fc9a40831b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:03:15.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.103+0000 7fc9a88fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a40836f0 0x7fc9a41b3200 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:03:15.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.103+0000 7fc9a88fe700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc9a4083c30 con 0x7fc9a40836f0 2026-03-09T00:03:15.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.103+0000 7fc9a88fe700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc9a4083da0 con 0x7fc9a4072b20 2026-03-09T00:03:15.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.103+0000 7fc9a259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a40836f0 0x7fc9a41b3200 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:03:15.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.103+0000 7fc9a259c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a40836f0 0x7fc9a41b3200 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57800/0 (socket says 192.168.123.103:57800) 2026-03-09T00:03:15.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.103+0000 7fc9a259c700 1 -- 192.168.123.103:0/243468320 learned_addr learned my addr 192.168.123.103:0/243468320 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:03:15.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.103+0000 7fc9a259c700 1 -- 192.168.123.103:0/243468320 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9a4072b20 msgr2=0x7fc9a40831b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:03:15.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.103+0000 7fc9a259c700 1 --2- 192.168.123.103:0/243468320 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9a4072b20 0x7fc9a40831b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.103+0000 7fc9a259c700 1 -- 192.168.123.103:0/243468320 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc99c00b050 con 0x7fc9a40836f0 2026-03-09T00:03:15.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.103+0000 7fc9a259c700 1 --2- 192.168.123.103:0/243468320 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a40836f0 0x7fc9a41b3200 secure :-1 s=READY pgs=288 cs=0 l=1 rev1=1 crypto rx=0x7fc99c00bd90 tx=0x7fc99c003ce0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:03:15.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.103+0000 7fc98bfff700 1 -- 192.168.123.103:0/243468320 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc99c00e030 con 0x7fc9a40836f0 2026-03-09T00:03:15.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.104+0000 7fc9a88fe700 1 -- 192.168.123.103:0/243468320 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc9a41b37a0 con 0x7fc9a40836f0 2026-03-09T00:03:15.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.104+0000 7fc9a88fe700 1 -- 192.168.123.103:0/243468320 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc9a41b3ca0 con 0x7fc9a40836f0 2026-03-09T00:03:15.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.105+0000 7fc98bfff700 1 -- 192.168.123.103:0/243468320 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc99c0048e0 con 0x7fc9a40836f0 2026-03-09T00:03:15.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.105+0000 7fc98bfff700 1 -- 192.168.123.103:0/243468320 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc99c01cd20 con 0x7fc9a40836f0 2026-03-09T00:03:15.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.105+0000 7fc98bfff700 1 -- 192.168.123.103:0/243468320 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fc99c012430 con 0x7fc9a40836f0 2026-03-09T00:03:15.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.106+0000 7fc98bfff700 1 --2- 192.168.123.103:0/243468320 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc98c06e8f0 0x7fc98c070db0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:03:15.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.106+0000 7fc9a2d9d700 1 --2- 192.168.123.103:0/243468320 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc98c06e8f0 0x7fc98c070db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:03:15.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.107+0000 7fc9a2d9d700 1 --2- 192.168.123.103:0/243468320 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc98c06e8f0 0x7fc98c070db0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fc9940097c0 tx=0x7fc994006cd0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:03:15.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.107+0000 7fc98bfff700 1 -- 192.168.123.103:0/243468320 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fc99c08e100 con 0x7fc9a40836f0 2026-03-09T00:03:15.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.107+0000 7fc9a88fe700 1 -- 192.168.123.103:0/243468320 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc990005320 con 0x7fc9a40836f0 2026-03-09T00:03:15.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.110+0000 7fc98bfff700 1 -- 192.168.123.103:0/243468320 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fc99c05c430 con 0x7fc9a40836f0 2026-03-09T00:03:15.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.239+0000 7fc9a88fe700 1 -- 192.168.123.103:0/243468320 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fc990000bf0 con 0x7fc98c06e8f0 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.244+0000 7fc98bfff700 1 -- 192.168.123.103:0/243468320 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 0) 0x7fc990000bf0 con 0x7fc98c06e8f0 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (2m) 84s ago 3m 22.5M - 0.25.0 c8568f914cd2 9b05d2f3502a 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (3m) 84s ago 3m 7964k - 18.2.1 5be31c24972a b93f8a220f71 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (3m) 85s ago 3m 8220k - 18.2.1 5be31c24972a d06aea65065e 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (3m) 84s ago 3m 7402k - 18.2.1 5be31c24972a 320f8ef2d2cb 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (3m) 85s ago 3m 7411k - 18.2.1 5be31c24972a d9eb9a54d81d 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (2m) 84s ago 3m 80.7M - 9.4.7 954c08fa6188 9db2e5805e97 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (89s) 84s ago 89s 14.0M - 18.2.1 5be31c24972a 404501ca3f76 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (91s) 84s ago 91s 14.4M - 18.2.1 5be31c24972a b71cb8823eff 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (88s) 85s ago 88s 16.1M - 18.2.1 5be31c24972a 868f24dd3b07 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 running (90s) 85s ago 90s 10.7M - 18.2.1 5be31c24972a 84dbd6c37a69 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:9283,8765,8443 running (4m) 84s ago 4m 503M - 18.2.1 5be31c24972a e48c90025d56 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (3m) 85s ago 3m 450M - 18.2.1 5be31c24972a 4c6a564e9efa 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (4m) 84s ago 4m 51.2M 2048M 18.2.1 5be31c24972a f9863944dcfb 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (3m) 85s ago 3m 48.2M 2048M 18.2.1 5be31c24972a 1e39c7ad3e9f 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (3m) 84s ago 3m 14.0M - 1.5.0 0da6a335fe13 750af7597536 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (3m) 85s ago 3m 14.0M - 1.5.0 0da6a335fe13 a82b7dc84593 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (2m) 84s ago 2m 49.0M 4096M 18.2.1 5be31c24972a 7582c56d43e3 2026-03-09T00:03:15.244 
INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (2m) 84s ago 2m 48.8M 4096M 18.2.1 5be31c24972a 7bc729875521 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (2m) 84s ago 2m 48.6M 4096M 18.2.1 5be31c24972a 00566abbcc16 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (2m) 85s ago 2m 46.8M 4096M 18.2.1 5be31c24972a 1ece32056ab6 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (2m) 85s ago 2m 42.5M 4096M 18.2.1 5be31c24972a ee6260a1124c 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (114s) 85s ago 114s 45.5M 4096M 18.2.1 5be31c24972a f51e8cd94301 2026-03-09T00:03:15.244 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (2m) 84s ago 3m 36.8M - 2.43.0 a07b618ecd1d a4a1b4f06180 2026-03-09T00:03:15.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.247+0000 7fc9a88fe700 1 -- 192.168.123.103:0/243468320 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc98c06e8f0 msgr2=0x7fc98c070db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:03:15.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.247+0000 7fc9a88fe700 1 --2- 192.168.123.103:0/243468320 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc98c06e8f0 0x7fc98c070db0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fc9940097c0 tx=0x7fc994006cd0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.247+0000 7fc9a88fe700 1 -- 192.168.123.103:0/243468320 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a40836f0 msgr2=0x7fc9a41b3200 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:03:15.246 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.247+0000 7fc9a88fe700 1 --2- 192.168.123.103:0/243468320 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a40836f0 0x7fc9a41b3200 secure :-1 s=READY pgs=288 cs=0 l=1 rev1=1 crypto rx=0x7fc99c00bd90 tx=0x7fc99c003ce0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.247+0000 7fc9a88fe700 1 -- 192.168.123.103:0/243468320 shutdown_connections 2026-03-09T00:03:15.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.247+0000 7fc9a88fe700 1 --2- 192.168.123.103:0/243468320 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7fc98c06e8f0 0x7fc98c070db0 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.247+0000 7fc9a88fe700 1 --2- 192.168.123.103:0/243468320 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc9a4072b20 0x7fc9a40831b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.248+0000 7fc9a88fe700 1 --2- 192.168.123.103:0/243468320 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc9a40836f0 0x7fc9a41b3200 unknown :-1 s=CLOSED pgs=288 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.248+0000 7fc9a88fe700 1 -- 192.168.123.103:0/243468320 >> 192.168.123.103:0/243468320 conn(0x7fc9a406daa0 msgr2=0x7fc9a406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:03:15.247 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.248+0000 7fc9a88fe700 1 -- 192.168.123.103:0/243468320 shutdown_connections 2026-03-09T00:03:15.247 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.248+0000 7fc9a88fe700 1 -- 192.168.123.103:0/243468320 wait complete. 2026-03-09T00:03:15.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.322+0000 7f02f709c700 1 -- 192.168.123.103:0/1323229216 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f02f0072b20 msgr2=0x7f02f0072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:03:15.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.322+0000 7f02f709c700 1 --2- 192.168.123.103:0/1323229216 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f02f0072b20 0x7f02f0072f40 secure :-1 s=READY pgs=289 cs=0 l=1 rev1=1 crypto rx=0x7f02ec009ab0 tx=0x7f02ec009dc0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.322+0000 7f02f709c700 1 -- 192.168.123.103:0/1323229216 shutdown_connections 2026-03-09T00:03:15.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.322+0000 7f02f709c700 1 --2- 192.168.123.103:0/1323229216 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f02f0075a10 0x7f02f0077ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.322+0000 7f02f709c700 1 --2- 192.168.123.103:0/1323229216 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f02f0072b20 0x7f02f0072f40 unknown :-1 s=CLOSED pgs=289 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.322+0000 7f02f709c700 1 -- 192.168.123.103:0/1323229216 >> 192.168.123.103:0/1323229216 conn(0x7f02f006daa0 msgr2=0x7f02f006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:03:15.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:15 vm03.local ceph-mon[52346]: pgmap v124: 65 pgs: 65 active+clean; 139 MiB data, 986 MiB used, 119 GiB / 120 GiB avail; 7.8 MiB/s wr, 702 op/s 2026-03-09T00:03:15.322 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:15 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:03:15.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.322+0000 7f02f709c700 1 -- 192.168.123.103:0/1323229216 shutdown_connections 2026-03-09T00:03:15.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.322+0000 7f02f709c700 1 -- 192.168.123.103:0/1323229216 wait complete. 
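The `ceph orch ps` listing above shows every cephadm daemon on both hosts (mons, mgrs, four mds for cephfs, six osds, plus the monitoring stack) in running state, with all Ceph daemons still at 18.2.1 before the staggered mgr upgrade begins. Below is a minimal sketch of the kind of readiness check this enables, run against the JSON form of the same command; the per-record field names (daemon_type, daemon_id, status_desc, version) are assumptions about cephadm's JSON output, not something this log shows:

    import json
    import subprocess

    CEPH_DAEMONS = ("mon", "mgr", "mds", "osd")

    def all_daemons_healthy(expected_version="18.2.1"):
        # "ceph orch ps --format json" returns one record per daemon -- the
        # JSON counterpart of the table logged above.
        out = subprocess.check_output(
            ["ceph", "orch", "ps", "--format", "json"], text=True)
        ok = True
        for d in json.loads(out):
            name = "%s.%s" % (d.get("daemon_type"), d.get("daemon_id"))
            status = d.get("status_desc", "")  # e.g. "running" (assumed field name)
            version = d.get("version", "")
            if not status.startswith("running"):
                print("%s is '%s', not running" % (name, status))
                ok = False
            # node-exporter, grafana etc. report their own versions, so only
            # Ceph daemons are compared against the expected Ceph release.
            if d.get("daemon_type") in CEPH_DAEMONS and version != expected_version:
                print("%s at %s, expected %s" % (name, version, expected_version))
                ok = False
        return ok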
2026-03-09T00:03:15.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.323+0000 7f02f709c700 1 Processor -- start 2026-03-09T00:03:15.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.323+0000 7f02f709c700 1 -- start start 2026-03-09T00:03:15.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.323+0000 7f02f709c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f02f0075a10 0x7f02f0083080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:03:15.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.323+0000 7f02f709c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f02f00835c0 0x7f02f01b3090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:03:15.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.323+0000 7f02f709c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f02f0083b00 con 0x7f02f00835c0 2026-03-09T00:03:15.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.323+0000 7f02f709c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f02f0083c70 con 0x7f02f0075a10 2026-03-09T00:03:15.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.323+0000 7f02f5899700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f02f00835c0 0x7f02f01b3090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:03:15.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.323+0000 7f02f5899700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f02f00835c0 0x7f02f01b3090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57822/0 (socket says 192.168.123.103:57822) 2026-03-09T00:03:15.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.323+0000 7f02f5899700 1 -- 192.168.123.103:0/3778457501 learned_addr learned my addr 192.168.123.103:0/3778457501 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:03:15.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.323+0000 7f02f5899700 1 -- 192.168.123.103:0/3778457501 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f02f0075a10 msgr2=0x7f02f0083080 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:03:15.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.323+0000 7f02f5899700 1 --2- 192.168.123.103:0/3778457501 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f02f0075a10 0x7f02f0083080 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.323+0000 7f02f5899700 1 -- 192.168.123.103:0/3778457501 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f02ec009710 con 0x7f02f00835c0 2026-03-09T00:03:15.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.323+0000 7f02f5899700 1 --2- 192.168.123.103:0/3778457501 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f02f00835c0 0x7f02f01b3090 secure :-1 s=READY pgs=290 cs=0 l=1 rev1=1 crypto rx=0x7f02e800e530 tx=0x7f02e800e8f0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:03:15.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.324+0000 7f02e77fe700 1 -- 192.168.123.103:0/3778457501 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f02e80090d0 con 0x7f02f00835c0 2026-03-09T00:03:15.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.324+0000 7f02f709c700 1 -- 192.168.123.103:0/3778457501 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f02f01b3690 con 0x7f02f00835c0 2026-03-09T00:03:15.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.324+0000 7f02f709c700 1 -- 192.168.123.103:0/3778457501 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f02f01b3b90 con 0x7f02f00835c0 2026-03-09T00:03:15.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.326+0000 7f02e77fe700 1 -- 192.168.123.103:0/3778457501 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f02e800f040 con 0x7f02f00835c0 2026-03-09T00:03:15.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.326+0000 7f02e77fe700 1 -- 192.168.123.103:0/3778457501 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f02e8014820 con 0x7f02f00835c0 2026-03-09T00:03:15.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.326+0000 7f02e77fe700 1 -- 192.168.123.103:0/3778457501 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f02e8014980 con 0x7f02f00835c0 2026-03-09T00:03:15.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.326+0000 7f02e77fe700 1 --2- 192.168.123.103:0/3778457501 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f02dc06c6d0 0x7f02dc06eb90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:03:15.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.326+0000 7f02e77fe700 1 -- 192.168.123.103:0/3778457501 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f02e808cbd0 con 0x7f02f00835c0 2026-03-09T00:03:15.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.327+0000 7f02f609a700 1 --2- 192.168.123.103:0/3778457501 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f02dc06c6d0 0x7f02dc06eb90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:03:15.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.327+0000 7f02f709c700 1 -- 192.168.123.103:0/3778457501 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f02d4005320 con 0x7f02f00835c0 2026-03-09T00:03:15.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.329+0000 7f02e77fe700 1 -- 192.168.123.103:0/3778457501 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f02e8057470 con 0x7f02f00835c0 2026-03-09T00:03:15.336 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.337+0000 7f02f609a700 1 --2- 192.168.123.103:0/3778457501 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f02dc06c6d0 0x7f02dc06eb90 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f02ec000c00 
tx=0x7f02ec005d20 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:03:15.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.489+0000 7f02f709c700 1 -- 192.168.123.103:0/3778457501 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f02d4006200 con 0x7f02f00835c0
2026-03-09T00:03:15.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.490+0000 7f02e77fe700 1 -- 192.168.123.103:0/3778457501 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f02e805aa90 con 0x7f02f00835c0
2026-03-09T00:03:15.490 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:03:15.490 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T00:03:15.490 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2
2026-03-09T00:03:15.490 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:03:15.490 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T00:03:15.490 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2
2026-03-09T00:03:15.490 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:03:15.490 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T00:03:15.490 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6
2026-03-09T00:03:15.490 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:03:15.490 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T00:03:15.490 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-09T00:03:15.490 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:03:15.490 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T00:03:15.490 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 14
2026-03-09T00:03:15.490 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-09T00:03:15.490 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:03:15.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.492+0000 7f02f709c700 1 -- 192.168.123.103:0/3778457501 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f02dc06c6d0 msgr2=0x7f02dc06eb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:15.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.492+0000 7f02f709c700 1 --2- 192.168.123.103:0/3778457501 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f02dc06c6d0 0x7f02dc06eb90 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f02ec000c00 tx=0x7f02ec005d20 comp rx=0 tx=0).stop
2026-03-09T00:03:15.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.493+0000 7f02f709c700 1 -- 192.168.123.103:0/3778457501 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f02f00835c0 msgr2=0x7f02f01b3090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:15.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.493+0000 7f02f709c700 1 --2- 192.168.123.103:0/3778457501 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f02f00835c0 0x7f02f01b3090 secure :-1 s=READY pgs=290 cs=0 l=1 rev1=1 crypto rx=0x7f02e800e530
tx=0x7f02e800e8f0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.493+0000 7f02f709c700 1 -- 192.168.123.103:0/3778457501 shutdown_connections 2026-03-09T00:03:15.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.493+0000 7f02f709c700 1 --2- 192.168.123.103:0/3778457501 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f02dc06c6d0 0x7f02dc06eb90 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.493+0000 7f02f709c700 1 --2- 192.168.123.103:0/3778457501 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f02f0075a10 0x7f02f0083080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.493+0000 7f02f709c700 1 --2- 192.168.123.103:0/3778457501 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f02f00835c0 0x7f02f01b3090 unknown :-1 s=CLOSED pgs=290 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.493+0000 7f02f709c700 1 -- 192.168.123.103:0/3778457501 >> 192.168.123.103:0/3778457501 conn(0x7f02f006daa0 msgr2=0x7f02f006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:03:15.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.493+0000 7f02f709c700 1 -- 192.168.123.103:0/3778457501 shutdown_connections 2026-03-09T00:03:15.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.493+0000 7f02f709c700 1 -- 192.168.123.103:0/3778457501 wait complete. 2026-03-09T00:03:15.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.576+0000 7f7f70864700 1 -- 192.168.123.103:0/523132495 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f6c072b20 msgr2=0x7f7f6c072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:03:15.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.576+0000 7f7f70864700 1 --2- 192.168.123.103:0/523132495 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f6c072b20 0x7f7f6c072f40 secure :-1 s=READY pgs=291 cs=0 l=1 rev1=1 crypto rx=0x7f7f5c009b50 tx=0x7f7f5c009e60 comp rx=0 tx=0).stop 2026-03-09T00:03:15.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.576+0000 7f7f70864700 1 -- 192.168.123.103:0/523132495 shutdown_connections 2026-03-09T00:03:15.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.576+0000 7f7f70864700 1 --2- 192.168.123.103:0/523132495 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7f6c075a10 0x7f7f6c077ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.576+0000 7f7f70864700 1 --2- 192.168.123.103:0/523132495 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f6c072b20 0x7f7f6c072f40 unknown :-1 s=CLOSED pgs=291 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.576+0000 7f7f70864700 1 -- 192.168.123.103:0/523132495 >> 192.168.123.103:0/523132495 conn(0x7f7f6c06daa0 msgr2=0x7f7f6c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:03:15.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.577+0000 7f7f70864700 1 -- 
192.168.123.103:0/523132495 shutdown_connections 2026-03-09T00:03:15.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.577+0000 7f7f70864700 1 -- 192.168.123.103:0/523132495 wait complete. 2026-03-09T00:03:15.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.577+0000 7f7f70864700 1 Processor -- start 2026-03-09T00:03:15.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.577+0000 7f7f70864700 1 -- start start 2026-03-09T00:03:15.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.577+0000 7f7f70864700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7f6c075a10 0x7f7f6c083080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:03:15.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.577+0000 7f7f70864700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f6c0835c0 0x7f7f6c1b3090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:03:15.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.577+0000 7f7f70864700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7f6c083b00 con 0x7f7f6c0835c0 2026-03-09T00:03:15.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.577+0000 7f7f70864700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7f6c083c70 con 0x7f7f6c075a10 2026-03-09T00:03:15.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.577+0000 7f7f6affd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f6c0835c0 0x7f7f6c1b3090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:03:15.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.577+0000 7f7f6affd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f6c0835c0 0x7f7f6c1b3090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57836/0 (socket says 192.168.123.103:57836) 2026-03-09T00:03:15.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.577+0000 7f7f6affd700 1 -- 192.168.123.103:0/1887881633 learned_addr learned my addr 192.168.123.103:0/1887881633 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:03:15.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.578+0000 7f7f6affd700 1 -- 192.168.123.103:0/1887881633 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7f6c075a10 msgr2=0x7f7f6c083080 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:03:15.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.578+0000 7f7f6affd700 1 --2- 192.168.123.103:0/1887881633 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7f6c075a10 0x7f7f6c083080 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.578+0000 7f7f6affd700 1 -- 192.168.123.103:0/1887881633 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7f5c0097e0 con 0x7f7f6c0835c0 2026-03-09T00:03:15.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.578+0000 7f7f6affd700 1 --2- 192.168.123.103:0/1887881633 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f6c0835c0 0x7f7f6c1b3090 secure :-1 s=READY pgs=292 cs=0 l=1 rev1=1 crypto rx=0x7f7f6400c420 tx=0x7f7f6400c7e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:03:15.578 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.578+0000 7f7f68ff9700 1 -- 192.168.123.103:0/1887881633 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7f6400ce90 con 0x7f7f6c0835c0 2026-03-09T00:03:15.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.578+0000 7f7f70864700 1 -- 192.168.123.103:0/1887881633 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7f6c1b3690 con 0x7f7f6c0835c0 2026-03-09T00:03:15.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.578+0000 7f7f70864700 1 -- 192.168.123.103:0/1887881633 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7f6c1b3b60 con 0x7f7f6c0835c0 2026-03-09T00:03:15.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.580+0000 7f7f68ff9700 1 -- 192.168.123.103:0/1887881633 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7f6400f040 con 0x7f7f6c0835c0 2026-03-09T00:03:15.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.580+0000 7f7f68ff9700 1 -- 192.168.123.103:0/1887881633 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7f640147b0 con 0x7f7f6c0835c0 2026-03-09T00:03:15.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.580+0000 7f7f68ff9700 1 -- 192.168.123.103:0/1887881633 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f7f64014910 con 0x7f7f6c0835c0 2026-03-09T00:03:15.579 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.580+0000 7f7f68ff9700 1 --2- 192.168.123.103:0/1887881633 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f5406e8f0 0x7f7f54070db0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:03:15.580 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.580+0000 7f7f68ff9700 1 -- 192.168.123.103:0/1887881633 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f7f6408cb20 con 0x7f7f6c0835c0 2026-03-09T00:03:15.580 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.581+0000 7f7f6b7fe700 1 --2- 192.168.123.103:0/1887881633 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f5406e8f0 0x7f7f54070db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:03:15.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.582+0000 7f7f6b7fe700 1 --2- 192.168.123.103:0/1887881633 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f5406e8f0 0x7f7f54070db0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f7f5c005cb0 tx=0x7f7f5c005be0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:03:15.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.582+0000 7f7f70864700 1 -- 192.168.123.103:0/1887881633 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7f58005320 con 0x7f7f6c0835c0 
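The mon_command({"prefix": "versions"}) round trip above is the check behind the JSON block printed a little earlier: 2 mons, 2 mgrs, 6 osds and 4 mds all report ceph version 18.2.1 reef (stable), 14 daemons overall, so the cluster is still homogeneous at the moment the upgrade starts. A short sketch of reducing that output to a pass/fail check, assuming nothing beyond the structure visible in the log (per-class maps from version banner to daemon count, plus an "overall" rollup):

    import json
    import subprocess

    def cluster_is_single_version():
        # "ceph versions" prints JSON shaped like the block logged above.
        out = subprocess.check_output(["ceph", "versions"], text=True)
        overall = json.loads(out).get("overall", {})
        if len(overall) == 1:
            banner, count = next(iter(overall.items()))
            print("all %d daemons on: %s" % (count, banner))
            return True
        print("mixed versions: %s" % json.dumps(overall, indent=2))
        return False

During a staggered upgrade the same call is what exposes the mixed state: "overall" carries two version banners until the last daemon has been redeployed.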
2026-03-09T00:03:15.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.584+0000 7f7f68ff9700 1 -- 192.168.123.103:0/1887881633 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f7f640573c0 con 0x7f7f6c0835c0
2026-03-09T00:03:15.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:15 vm06.local ceph-mon[58395]: pgmap v124: 65 pgs: 65 active+clean; 139 MiB data, 986 MiB used, 119 GiB / 120 GiB avail; 7.8 MiB/s wr, 702 op/s
2026-03-09T00:03:15.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:15 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:03:15.697 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.697+0000 7f7f70864700 1 -- 192.168.123.103:0/1887881633 --> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7f58000bf0 con 0x7f7f5406e8f0
2026-03-09T00:03:15.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.698+0000 7f7f68ff9700 1 -- 192.168.123.103:0/1887881633 <== mgr.14223 v2:192.168.123.103:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+434 (secure 0 0 0) 0x7f7f58000bf0 con 0x7f7f5406e8f0
2026-03-09T00:03:15.698 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:03:15.698 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T00:03:15.698 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true,
2026-03-09T00:03:15.698 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading daemons of type(s) mgr",
2026-03-09T00:03:15.698 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [],
2026-03-09T00:03:15.698 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "0/2 daemons upgraded",
2026-03-09T00:03:15.698 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc image on host vm06",
2026-03-09T00:03:15.698 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false
2026-03-09T00:03:15.698 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:03:15.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.701+0000 7f7f527fc700 1 -- 192.168.123.103:0/1887881633 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f5406e8f0 msgr2=0x7f7f54070db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:15.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.701+0000 7f7f527fc700 1 --2- 192.168.123.103:0/1887881633 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f5406e8f0 0x7f7f54070db0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f7f5c005cb0 tx=0x7f7f5c005be0 comp rx=0 tx=0).stop
2026-03-09T00:03:15.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.701+0000 7f7f527fc700 1 -- 192.168.123.103:0/1887881633 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f6c0835c0 msgr2=0x7f7f6c1b3090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:15.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.701+0000 7f7f527fc700 1 --2- 192.168.123.103:0/1887881633 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f6c0835c0
0x7f7f6c1b3090 secure :-1 s=READY pgs=292 cs=0 l=1 rev1=1 crypto rx=0x7f7f6400c420 tx=0x7f7f6400c7e0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.701+0000 7f7f527fc700 1 -- 192.168.123.103:0/1887881633 shutdown_connections 2026-03-09T00:03:15.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.701+0000 7f7f527fc700 1 --2- 192.168.123.103:0/1887881633 >> [v2:192.168.123.103:6800/2,v1:192.168.123.103:6801/2] conn(0x7f7f5406e8f0 0x7f7f54070db0 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.701+0000 7f7f527fc700 1 --2- 192.168.123.103:0/1887881633 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7f6c075a10 0x7f7f6c083080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.701+0000 7f7f527fc700 1 --2- 192.168.123.103:0/1887881633 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7f6c0835c0 0x7f7f6c1b3090 unknown :-1 s=CLOSED pgs=292 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:15.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.701+0000 7f7f527fc700 1 -- 192.168.123.103:0/1887881633 >> 192.168.123.103:0/1887881633 conn(0x7f7f6c06daa0 msgr2=0x7f7f6c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:03:15.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.701+0000 7f7f527fc700 1 -- 192.168.123.103:0/1887881633 shutdown_connections 2026-03-09T00:03:15.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:15.701+0000 7f7f527fc700 1 -- 192.168.123.103:0/1887881633 wait complete. 2026-03-09T00:03:16.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:16 vm06.local ceph-mon[58395]: from='client.14558 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:03:16.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:16 vm06.local ceph-mon[58395]: from='client.14562 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:03:16.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:16 vm06.local ceph-mon[58395]: from='client.14566 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:03:16.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:16 vm06.local ceph-mon[58395]: from='client.? 
192.168.123.103:0/3778457501' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:03:16.428 INFO:tasks.workunit.client.1.vm06.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress
2026-03-09T00:03:16.429 INFO:tasks.workunit.client.1.vm06.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp
2026-03-09T00:03:16.429 INFO:tasks.workunit.client.1.vm06.stderr:+ make
2026-03-09T00:03:16.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:16 vm03.local ceph-mon[52346]: from='client.14558 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:03:16.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:16 vm03.local ceph-mon[52346]: from='client.14562 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:03:16.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:16 vm03.local ceph-mon[52346]: from='client.14566 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:03:16.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:16 vm03.local ceph-mon[52346]: from='client.? 192.168.123.103:0/3778457501' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:03:17.057 INFO:tasks.workunit.client.1.vm06.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress
2026-03-09T00:03:17.188 INFO:tasks.workunit.client.1.vm06.stderr:++ readlink -f fsstress
2026-03-09T00:03:17.188 INFO:tasks.workunit.client.1.vm06.stderr:+ BIN=/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress
2026-03-09T00:03:17.189 INFO:tasks.workunit.client.1.vm06.stderr:+ popd
2026-03-09T00:03:17.189 INFO:tasks.workunit.client.1.vm06.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp
2026-03-09T00:03:17.189 INFO:tasks.workunit.client.1.vm06.stderr:+ popd
2026-03-09T00:03:17.189 INFO:tasks.workunit.client.1.vm06.stdout:~/cephtest/mnt.1/client.1/tmp
2026-03-09T00:03:17.189 INFO:tasks.workunit.client.1.vm06.stderr:++ mktemp -d -p .
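The orch upgrade status reply logged shortly before this point ("Upgrading daemons of type(s) mgr", "0/2 daemons upgraded", image pull in progress on vm06) is a staggered upgrade mid-flight: only the mgr daemons are being moved to the target image while the fsstress workload below keeps running. A hedged sketch of the polling loop a test like this needs, using only the fields present in that reply (in_progress, is_paused, progress, message); the timeout and interval values here are arbitrary:

    import json
    import subprocess
    import time

    def wait_for_upgrade_complete(timeout=1800, interval=10):
        # "ceph orch upgrade status" prints JSON, as logged above; poll it
        # until in_progress goes false, failing fast if cephadm pauses.
        deadline = time.time() + timeout
        while time.time() < deadline:
            out = subprocess.check_output(
                ["ceph", "orch", "upgrade", "status"], text=True)
            status = json.loads(out)
            print(status.get("progress"), "-", status.get("message"))
            if status.get("is_paused"):
                raise RuntimeError("upgrade paused: %s" % status.get("message"))
            if not status.get("in_progress"):
                return
            time.sleep(interval)
        raise TimeoutError("upgrade still in progress after %ss" % timeout)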
2026-03-09T00:03:17.190 INFO:tasks.workunit.client.1.vm06.stderr:+ T=./tmp.42dJIYM0SW 2026-03-09T00:03:17.190 INFO:tasks.workunit.client.1.vm06.stderr:+ /home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.42dJIYM0SW -l 1 -n 1000 -p 10 -v 2026-03-09T00:03:17.191 INFO:tasks.workunit.client.1.vm06.stdout:seed = 1772945191 2026-03-09T00:03:17.194 INFO:tasks.workunit.client.1.vm06.stdout:7/0: mkdir d0 0 2026-03-09T00:03:17.194 INFO:tasks.workunit.client.1.vm06.stdout:7/1: write - no filename 2026-03-09T00:03:17.195 INFO:tasks.workunit.client.1.vm06.stdout:7/2: creat d0/f1 x:0 0 0 2026-03-09T00:03:17.195 INFO:tasks.workunit.client.1.vm06.stdout:7/3: creat d0/f2 x:0 0 0 2026-03-09T00:03:17.196 INFO:tasks.workunit.client.1.vm06.stdout:8/0: mknod c0 0 2026-03-09T00:03:17.196 INFO:tasks.workunit.client.1.vm06.stdout:8/1: chown c0 631549 1 2026-03-09T00:03:17.198 INFO:tasks.workunit.client.1.vm06.stdout:7/4: rename d0/f1 to d0/f3 0 2026-03-09T00:03:17.198 INFO:tasks.workunit.client.1.vm06.stdout:7/5: write d0/f3 [106382,63674] 0 2026-03-09T00:03:17.202 INFO:tasks.workunit.client.1.vm06.stdout:6/0: stat - no entries 2026-03-09T00:03:17.202 INFO:tasks.workunit.client.1.vm06.stdout:9/0: mknod c0 0 2026-03-09T00:03:17.202 INFO:tasks.workunit.client.1.vm06.stdout:8/2: rename c0 to c1 0 2026-03-09T00:03:17.202 INFO:tasks.workunit.client.1.vm06.stdout:8/3: dread - no filename 2026-03-09T00:03:17.202 INFO:tasks.workunit.client.1.vm06.stdout:8/4: write - no filename 2026-03-09T00:03:17.216 INFO:tasks.workunit.client.1.vm06.stdout:5/0: chown . 14314174 1 2026-03-09T00:03:17.216 INFO:tasks.workunit.client.1.vm06.stdout:5/1: dwrite - no filename 2026-03-09T00:03:17.220 INFO:tasks.workunit.client.1.vm06.stdout:9/1: mkdir d1 0 2026-03-09T00:03:17.220 INFO:tasks.workunit.client.1.vm06.stdout:6/1: symlink l0 0 2026-03-09T00:03:17.220 INFO:tasks.workunit.client.1.vm06.stdout:6/2: write - no filename 2026-03-09T00:03:17.220 INFO:tasks.workunit.client.1.vm06.stdout:8/5: mknod c2 0 2026-03-09T00:03:17.237 INFO:tasks.workunit.client.1.vm06.stdout:8/6: creat f3 x:0 0 0 2026-03-09T00:03:17.237 INFO:tasks.workunit.client.1.vm06.stdout:8/7: write f3 [990054,30340] 0 2026-03-09T00:03:17.237 INFO:tasks.workunit.client.1.vm06.stdout:8/8: write f3 [1730627,45930] 0 2026-03-09T00:03:17.237 INFO:tasks.workunit.client.1.vm06.stdout:5/2: mkdir d0 0 2026-03-09T00:03:17.237 INFO:tasks.workunit.client.1.vm06.stdout:5/3: fsync - no filename 2026-03-09T00:03:17.253 INFO:tasks.workunit.client.1.vm06.stdout:8/9: symlink l4 0 2026-03-09T00:03:17.973 INFO:tasks.workunit.client.1.vm06.stdout:8/10: write f3 [140912,63544] 0 2026-03-09T00:03:17.973 INFO:tasks.workunit.client.1.vm06.stdout:8/11: rmdir - no directory 2026-03-09T00:03:17.973 INFO:tasks.workunit.client.1.vm06.stdout:8/12: creat f5 x:0 0 0 2026-03-09T00:03:18.398 INFO:tasks.workunit.client.1.vm06.stdout:5/4: rename d0 to d0/d1 22 2026-03-09T00:03:18.399 INFO:tasks.workunit.client.1.vm06.stdout:4/0: dwrite - no filename 2026-03-09T00:03:18.399 INFO:tasks.workunit.client.1.vm06.stdout:4/1: write - no filename 2026-03-09T00:03:18.399 INFO:tasks.workunit.client.1.vm06.stdout:4/2: dread - no filename 2026-03-09T00:03:18.399 INFO:tasks.workunit.client.1.vm06.stdout:9/2: chown c0 1 1 2026-03-09T00:03:18.399 INFO:tasks.workunit.client.1.vm06.stdout:9/3: chown d1 76394926 1 2026-03-09T00:03:18.399 INFO:tasks.workunit.client.1.vm06.stdout:9/4: write - no filename 2026-03-09T00:03:18.399 
INFO:tasks.workunit.client.1.vm06.stdout:9/5: dwrite - no filename 2026-03-09T00:03:18.399 INFO:tasks.workunit.client.1.vm06.stdout:9/6: chown d1 0 1 2026-03-09T00:03:18.399 INFO:tasks.workunit.client.1.vm06.stdout:9/7: truncate - no filename 2026-03-09T00:03:18.399 INFO:tasks.workunit.client.1.vm06.stdout:9/8: readlink - no filename 2026-03-09T00:03:18.399 INFO:tasks.workunit.client.1.vm06.stdout:6/3: dwrite - no filename 2026-03-09T00:03:18.399 INFO:tasks.workunit.client.1.vm06.stdout:3/0: dwrite - no filename 2026-03-09T00:03:18.399 INFO:tasks.workunit.client.1.vm06.stdout:0/0: write - no filename 2026-03-09T00:03:18.399 INFO:tasks.workunit.client.1.vm06.stdout:2/0: getdents . 0 2026-03-09T00:03:18.402 INFO:tasks.workunit.client.1.vm06.stdout:4/3: symlink l0 0 2026-03-09T00:03:18.403 INFO:tasks.workunit.client.1.vm06.stdout:9/9: link c0 d1/c2 0 2026-03-09T00:03:18.403 INFO:tasks.workunit.client.1.vm06.stdout:3/1: mknod c0 0 2026-03-09T00:03:18.403 INFO:tasks.workunit.client.1.vm06.stdout:3/2: fsync - no filename 2026-03-09T00:03:18.403 INFO:tasks.workunit.client.1.vm06.stdout:1/0: creat f0 x:0 0 0 2026-03-09T00:03:18.403 INFO:tasks.workunit.client.1.vm06.stdout:1/1: truncate f0 742426 0 2026-03-09T00:03:18.403 INFO:tasks.workunit.client.1.vm06.stdout:5/5: rmdir d0 0 2026-03-09T00:03:18.403 INFO:tasks.workunit.client.1.vm06.stdout:5/6: rmdir - no directory 2026-03-09T00:03:18.403 INFO:tasks.workunit.client.1.vm06.stdout:5/7: unlink - no file 2026-03-09T00:03:18.403 INFO:tasks.workunit.client.1.vm06.stdout:0/1: mknod c0 0 2026-03-09T00:03:18.403 INFO:tasks.workunit.client.1.vm06.stdout:0/2: write - no filename 2026-03-09T00:03:18.404 INFO:tasks.workunit.client.1.vm06.stdout:2/1: mkdir d0 0 2026-03-09T00:03:18.404 INFO:tasks.workunit.client.1.vm06.stdout:2/2: fsync - no filename 2026-03-09T00:03:18.404 INFO:tasks.workunit.client.1.vm06.stdout:2/3: write - no filename 2026-03-09T00:03:18.404 INFO:tasks.workunit.client.1.vm06.stdout:2/4: chown d0 164312 1 2026-03-09T00:03:18.404 INFO:tasks.workunit.client.1.vm06.stdout:2/5: dread - no filename 2026-03-09T00:03:18.404 INFO:tasks.workunit.client.1.vm06.stdout:2/6: write - no filename 2026-03-09T00:03:18.404 INFO:tasks.workunit.client.1.vm06.stdout:4/4: creat f1 x:0 0 0 2026-03-09T00:03:18.404 INFO:tasks.workunit.client.1.vm06.stdout:4/5: dread - f1 zero size 2026-03-09T00:03:18.405 INFO:tasks.workunit.client.1.vm06.stdout:9/10: mkdir d1/d3 0 2026-03-09T00:03:18.405 INFO:tasks.workunit.client.1.vm06.stdout:9/11: dwrite - no filename 2026-03-09T00:03:18.405 INFO:tasks.workunit.client.1.vm06.stdout:9/12: dwrite - no filename 2026-03-09T00:03:18.406 INFO:tasks.workunit.client.1.vm06.stdout:3/3: symlink l1 0 2026-03-09T00:03:18.406 INFO:tasks.workunit.client.1.vm06.stdout:3/4: dwrite - no filename 2026-03-09T00:03:18.411 INFO:tasks.workunit.client.1.vm06.stdout:1/2: mknod c1 0 2026-03-09T00:03:18.412 INFO:tasks.workunit.client.1.vm06.stdout:5/8: creat f2 x:0 0 0 2026-03-09T00:03:18.412 INFO:tasks.workunit.client.1.vm06.stdout:5/9: dread - f2 zero size 2026-03-09T00:03:18.412 INFO:tasks.workunit.client.1.vm06.stdout:5/10: write f2 [68372,86155] 0 2026-03-09T00:03:18.413 INFO:tasks.workunit.client.1.vm06.stdout:0/3: creat f1 x:0 0 0 2026-03-09T00:03:18.413 INFO:tasks.workunit.client.1.vm06.stdout:0/4: write f1 [826915,6872] 0 2026-03-09T00:03:18.413 INFO:tasks.workunit.client.1.vm06.stdout:0/5: creat f2 x:0 0 0 2026-03-09T00:03:18.414 INFO:tasks.workunit.client.1.vm06.stdout:2/7: rmdir d0 0 2026-03-09T00:03:18.414 
INFO:tasks.workunit.client.1.vm06.stdout:2/8: truncate - no filename 2026-03-09T00:03:18.414 INFO:tasks.workunit.client.1.vm06.stdout:2/9: chown . 746248820 1 2026-03-09T00:03:18.414 INFO:tasks.workunit.client.1.vm06.stdout:4/6: symlink l2 0 2026-03-09T00:03:18.415 INFO:tasks.workunit.client.1.vm06.stdout:9/13: mkdir d1/d4 0 2026-03-09T00:03:18.415 INFO:tasks.workunit.client.1.vm06.stdout:9/14: fsync - no filename 2026-03-09T00:03:18.415 INFO:tasks.workunit.client.1.vm06.stdout:9/15: write - no filename 2026-03-09T00:03:18.415 INFO:tasks.workunit.client.1.vm06.stdout:9/16: chown d1/c2 5311175 1 2026-03-09T00:03:18.415 INFO:tasks.workunit.client.1.vm06.stdout:9/17: dread - no filename 2026-03-09T00:03:18.421 INFO:tasks.workunit.client.1.vm06.stdout:5/11: mknod c3 0 2026-03-09T00:03:18.426 INFO:tasks.workunit.client.1.vm06.stdout:0/6: mkdir d3 0 2026-03-09T00:03:18.426 INFO:tasks.workunit.client.1.vm06.stdout:0/7: getdents d3 0 2026-03-09T00:03:18.426 INFO:tasks.workunit.client.1.vm06.stdout:2/10: mknod c1 0 2026-03-09T00:03:18.426 INFO:tasks.workunit.client.1.vm06.stdout:2/11: write - no filename 2026-03-09T00:03:18.426 INFO:tasks.workunit.client.1.vm06.stdout:4/7: mknod c3 0 2026-03-09T00:03:18.426 INFO:tasks.workunit.client.1.vm06.stdout:4/8: fdatasync f1 0 2026-03-09T00:03:18.427 INFO:tasks.workunit.client.1.vm06.stdout:2/12: creat f2 x:0 0 0 2026-03-09T00:03:18.427 INFO:tasks.workunit.client.1.vm06.stdout:2/13: write f2 [626970,92734] 0 2026-03-09T00:03:18.427 INFO:tasks.workunit.client.1.vm06.stdout:2/14: write f2 [996182,43082] 0 2026-03-09T00:03:18.427 INFO:tasks.workunit.client.1.vm06.stdout:2/15: chown f2 160827 1 2026-03-09T00:03:18.427 INFO:tasks.workunit.client.1.vm06.stdout:4/9: rename c3 to c4 0 2026-03-09T00:03:18.430 INFO:tasks.workunit.client.1.vm06.stdout:0/8: mknod d3/c4 0 2026-03-09T00:03:18.430 INFO:tasks.workunit.client.1.vm06.stdout:0/9: chown f2 3 1 2026-03-09T00:03:18.432 INFO:tasks.workunit.client.1.vm06.stdout:0/10: creat d3/f5 x:0 0 0 2026-03-09T00:03:18.432 INFO:tasks.workunit.client.1.vm06.stdout:0/11: creat d3/f6 x:0 0 0 2026-03-09T00:03:18.530 INFO:tasks.workunit.client.1.vm06.stdout:0/12: dread f1 [0,4194304] 0 2026-03-09T00:03:18.530 INFO:tasks.workunit.client.1.vm06.stdout:0/13: creat d3/f7 x:0 0 0 2026-03-09T00:03:18.532 INFO:tasks.workunit.client.1.vm06.stdout:0/14: rename c0 to d3/c8 0 2026-03-09T00:03:18.550 INFO:tasks.workunit.client.1.vm06.stdout:4/10: dwrite f1 [0,4194304] 0 2026-03-09T00:03:18.550 INFO:tasks.workunit.client.1.vm06.stdout:4/11: creat f5 x:0 0 0 2026-03-09T00:03:18.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:18 vm03.local ceph-mon[52346]: from='client.14574 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:03:18.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:18 vm03.local ceph-mon[52346]: pgmap v125: 65 pgs: 65 active+clean; 155 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 7.9 MiB/s wr, 767 op/s 2026-03-09T00:03:18.612 INFO:tasks.workunit.client.1.vm06.stdout:2/16: dread f2 [0,4194304] 0 2026-03-09T00:03:18.618 INFO:tasks.workunit.client.1.vm06.stdout:1/3: dwrite f0 [0,4194304] 0 2026-03-09T00:03:18.635 INFO:tasks.workunit.client.1.vm06.stdout:2/17: creat f3 x:0 0 0 2026-03-09T00:03:18.635 INFO:tasks.workunit.client.1.vm06.stdout:2/18: creat f4 x:0 0 0 2026-03-09T00:03:18.635 INFO:tasks.workunit.client.1.vm06.stdout:8/13: getdents . 
0 2026-03-09T00:03:18.635 INFO:tasks.workunit.client.1.vm06.stdout:8/14: fsync f5 0 2026-03-09T00:03:18.635 INFO:tasks.workunit.client.1.vm06.stdout:8/15: stat f5 0 2026-03-09T00:03:18.635 INFO:tasks.workunit.client.1.vm06.stdout:8/16: write f5 [233193,7888] 0 2026-03-09T00:03:18.651 INFO:tasks.workunit.client.1.vm06.stdout:0/15: dwrite d3/f6 [0,4194304] 0 2026-03-09T00:03:18.659 INFO:tasks.workunit.client.1.vm06.stdout:3/5: getdents . 0 2026-03-09T00:03:18.676 INFO:tasks.workunit.client.1.vm06.stdout:5/12: dwrite f2 [0,4194304] 0 2026-03-09T00:03:18.676 INFO:tasks.workunit.client.1.vm06.stdout:5/13: fdatasync f2 0 2026-03-09T00:03:18.676 INFO:tasks.workunit.client.1.vm06.stdout:5/14: chown c3 45074240 1 2026-03-09T00:03:18.676 INFO:tasks.workunit.client.1.vm06.stdout:5/15: truncate f2 4467803 0 2026-03-09T00:03:18.684 INFO:tasks.workunit.client.1.vm06.stdout:4/12: dwrite f5 [0,4194304] 0 2026-03-09T00:03:18.684 INFO:tasks.workunit.client.1.vm06.stdout:4/13: creat f6 x:0 0 0 2026-03-09T00:03:18.732 INFO:tasks.workunit.client.1.vm06.stdout:4/14: dwrite f6 [0,4194304] 0 2026-03-09T00:03:18.732 INFO:tasks.workunit.client.1.vm06.stdout:5/16: dwrite f2 [0,4194304] 0 2026-03-09T00:03:18.732 INFO:tasks.workunit.client.1.vm06.stdout:5/17: chown f2 21021 1 2026-03-09T00:03:18.741 INFO:tasks.workunit.client.1.vm06.stdout:5/18: write f2 [2273043,99501] 0 2026-03-09T00:03:18.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:18 vm06.local ceph-mon[58395]: from='client.14574 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:03:18.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:18 vm06.local ceph-mon[58395]: pgmap v125: 65 pgs: 65 active+clean; 155 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 7.9 MiB/s wr, 767 op/s 2026-03-09T00:03:19.289 INFO:tasks.workunit.client.1.vm06.stdout:2/19: mknod c5 0 2026-03-09T00:03:19.289 INFO:tasks.workunit.client.1.vm06.stdout:2/20: rmdir - no directory 2026-03-09T00:03:19.289 INFO:tasks.workunit.client.1.vm06.stdout:2/21: creat f6 x:0 0 0 2026-03-09T00:03:19.289 INFO:tasks.workunit.client.1.vm06.stdout:2/22: fdatasync f3 0 2026-03-09T00:03:19.289 INFO:tasks.workunit.client.1.vm06.stdout:2/23: write f3 [295223,96948] 0 2026-03-09T00:03:19.290 INFO:tasks.workunit.client.1.vm06.stdout:1/4: symlink l2 0 2026-03-09T00:03:19.291 INFO:tasks.workunit.client.1.vm06.stdout:8/17: symlink l6 0 2026-03-09T00:03:19.291 INFO:tasks.workunit.client.1.vm06.stdout:8/18: fdatasync f5 0 2026-03-09T00:03:19.291 INFO:tasks.workunit.client.1.vm06.stdout:8/19: write f5 [810853,95052] 0 2026-03-09T00:03:19.300 INFO:tasks.workunit.client.1.vm06.stdout:3/6: mknod c2 0 2026-03-09T00:03:19.300 INFO:tasks.workunit.client.1.vm06.stdout:1/5: dread f0 [0,4194304] 0 2026-03-09T00:03:19.305 INFO:tasks.workunit.client.1.vm06.stdout:1/6: dread f0 [0,4194304] 0 2026-03-09T00:03:19.306 INFO:tasks.workunit.client.1.vm06.stdout:1/7: creat f3 x:0 0 0 2026-03-09T00:03:19.322 INFO:tasks.workunit.client.1.vm06.stdout:2/24: mkdir d7 0 2026-03-09T00:03:19.322 INFO:tasks.workunit.client.1.vm06.stdout:2/25: chown f6 15 1 2026-03-09T00:03:19.325 INFO:tasks.workunit.client.1.vm06.stdout:0/16: unlink f2 0 2026-03-09T00:03:19.331 INFO:tasks.workunit.client.1.vm06.stdout:3/7: unlink c2 0 2026-03-09T00:03:19.337 INFO:tasks.workunit.client.1.vm06.stdout:3/8: read - no filename 2026-03-09T00:03:19.337 INFO:tasks.workunit.client.1.vm06.stdout:3/9: dwrite - no filename 2026-03-09T00:03:19.337 INFO:tasks.workunit.client.1.vm06.stdout:2/26: 
link f4 d7/f8 0 2026-03-09T00:03:19.337 INFO:tasks.workunit.client.1.vm06.stdout:2/27: stat f6 0 2026-03-09T00:03:19.337 INFO:tasks.workunit.client.1.vm06.stdout:2/28: write d7/f8 [223788,97074] 0 2026-03-09T00:03:19.337 INFO:tasks.workunit.client.1.vm06.stdout:0/17: getdents d3 0 2026-03-09T00:03:19.338 INFO:tasks.workunit.client.1.vm06.stdout:0/18: truncate f1 755495 0 2026-03-09T00:03:19.343 INFO:tasks.workunit.client.1.vm06.stdout:0/19: dread d3/f6 [0,4194304] 0 2026-03-09T00:03:19.344 INFO:tasks.workunit.client.1.vm06.stdout:0/20: getdents d3 0 2026-03-09T00:03:19.346 INFO:tasks.workunit.client.1.vm06.stdout:0/21: mknod d3/c9 0 2026-03-09T00:03:19.355 INFO:tasks.workunit.client.1.vm06.stdout:8/20: dwrite f5 [0,4194304] 0 2026-03-09T00:03:19.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:19 vm03.local ceph-mon[52346]: pgmap v126: 65 pgs: 65 active+clean; 168 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 8.1 MiB/s wr, 713 op/s 2026-03-09T00:03:19.611 INFO:tasks.workunit.client.1.vm06.stdout:8/21: dwrite f5 [0,4194304] 0 2026-03-09T00:03:19.611 INFO:tasks.workunit.client.1.vm06.stdout:2/29: dwrite d7/f8 [0,4194304] 0 2026-03-09T00:03:19.615 INFO:tasks.workunit.client.1.vm06.stdout:2/30: mknod d7/c9 0 2026-03-09T00:03:19.615 INFO:tasks.workunit.client.1.vm06.stdout:2/31: fdatasync d7/f8 0 2026-03-09T00:03:19.615 INFO:tasks.workunit.client.1.vm06.stdout:2/32: truncate f2 1981407 0 2026-03-09T00:03:19.618 INFO:tasks.workunit.client.1.vm06.stdout:2/33: mkdir d7/da 0 2026-03-09T00:03:19.624 INFO:tasks.workunit.client.1.vm06.stdout:2/34: write d7/f8 [3011595,68682] 0 2026-03-09T00:03:19.624 INFO:tasks.workunit.client.1.vm06.stdout:2/35: fdatasync f6 0 2026-03-09T00:03:19.629 INFO:tasks.workunit.client.1.vm06.stdout:2/36: dread d7/f8 [0,4194304] 0 2026-03-09T00:03:19.631 INFO:tasks.workunit.client.1.vm06.stdout:2/37: mkdir d7/da/db 0 2026-03-09T00:03:19.631 INFO:tasks.workunit.client.1.vm06.stdout:2/38: stat c5 0 2026-03-09T00:03:19.632 INFO:tasks.workunit.client.1.vm06.stdout:2/39: creat d7/da/fc x:0 0 0 2026-03-09T00:03:19.635 INFO:tasks.workunit.client.1.vm06.stdout:2/40: rename f4 to d7/fd 0 2026-03-09T00:03:19.635 INFO:tasks.workunit.client.1.vm06.stdout:2/41: write f3 [1048422,26506] 0 2026-03-09T00:03:19.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:19 vm06.local ceph-mon[58395]: pgmap v126: 65 pgs: 65 active+clean; 168 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 8.1 MiB/s wr, 713 op/s 2026-03-09T00:03:19.878 INFO:tasks.workunit.client.1.vm06.stdout:0/22: truncate d3/f6 468539 0 2026-03-09T00:03:19.878 INFO:tasks.workunit.client.1.vm06.stdout:0/23: dread - d3/f5 zero size 2026-03-09T00:03:19.879 INFO:tasks.workunit.client.1.vm06.stdout:0/24: creat d3/fa x:0 0 0 2026-03-09T00:03:19.880 INFO:tasks.workunit.client.1.vm06.stdout:0/25: symlink d3/lb 0 2026-03-09T00:03:19.881 INFO:tasks.workunit.client.1.vm06.stdout:0/26: mkdir d3/dc 0 2026-03-09T00:03:19.881 INFO:tasks.workunit.client.1.vm06.stdout:0/27: mknod d3/dc/cd 0 2026-03-09T00:03:19.882 INFO:tasks.workunit.client.1.vm06.stdout:0/28: rmdir d3/dc 39 2026-03-09T00:03:19.883 INFO:tasks.workunit.client.1.vm06.stdout:0/29: mknod d3/ce 0 2026-03-09T00:03:19.883 INFO:tasks.workunit.client.1.vm06.stdout:0/30: write d3/f5 [353958,59997] 0 2026-03-09T00:03:19.883 INFO:tasks.workunit.client.1.vm06.stdout:0/31: chown d3/dc/cd 45254 1 2026-03-09T00:03:19.894 INFO:tasks.workunit.client.1.vm06.stdout:6/4: sync 2026-03-09T00:03:19.894 INFO:tasks.workunit.client.1.vm06.stdout:6/5: chown l0 2765 1 2026-03-09T00:03:19.894 
INFO:tasks.workunit.client.1.vm06.stdout:6/6: write - no filename 2026-03-09T00:03:19.894 INFO:tasks.workunit.client.1.vm06.stdout:6/7: write - no filename 2026-03-09T00:03:19.894 INFO:tasks.workunit.client.1.vm06.stdout:7/6: sync 2026-03-09T00:03:19.894 INFO:tasks.workunit.client.1.vm06.stdout:9/18: sync 2026-03-09T00:03:19.894 INFO:tasks.workunit.client.1.vm06.stdout:6/8: creat f1 x:0 0 0 2026-03-09T00:03:19.914 INFO:tasks.workunit.client.1.vm06.stdout:7/7: mknod d0/c4 0 2026-03-09T00:03:19.914 INFO:tasks.workunit.client.1.vm06.stdout:9/19: mkdir d1/d4/d5 0 2026-03-09T00:03:19.914 INFO:tasks.workunit.client.1.vm06.stdout:6/9: mknod c2 0 2026-03-09T00:03:19.914 INFO:tasks.workunit.client.1.vm06.stdout:6/10: chown c2 742515999 1 2026-03-09T00:03:19.914 INFO:tasks.workunit.client.1.vm06.stdout:6/11: write f1 [1014619,126778] 0 2026-03-09T00:03:19.914 INFO:tasks.workunit.client.1.vm06.stdout:7/8: rename d0/f3 to d0/f5 0 2026-03-09T00:03:19.914 INFO:tasks.workunit.client.1.vm06.stdout:7/9: creat d0/f6 x:0 0 0 2026-03-09T00:03:19.914 INFO:tasks.workunit.client.1.vm06.stdout:7/10: write d0/f6 [491818,126766] 0 2026-03-09T00:03:19.914 INFO:tasks.workunit.client.1.vm06.stdout:9/20: unlink d1/c2 0 2026-03-09T00:03:19.914 INFO:tasks.workunit.client.1.vm06.stdout:9/21: dwrite - no filename 2026-03-09T00:03:19.914 INFO:tasks.workunit.client.1.vm06.stdout:1/8: getdents . 0 2026-03-09T00:03:19.914 INFO:tasks.workunit.client.1.vm06.stdout:1/9: fdatasync f0 0 2026-03-09T00:03:19.914 INFO:tasks.workunit.client.1.vm06.stdout:9/22: creat d1/d4/f6 x:0 0 0 2026-03-09T00:03:19.914 INFO:tasks.workunit.client.1.vm06.stdout:1/10: mknod c4 0 2026-03-09T00:03:19.914 INFO:tasks.workunit.client.1.vm06.stdout:1/11: chown l2 79735234 1 2026-03-09T00:03:19.914 INFO:tasks.workunit.client.1.vm06.stdout:9/23: creat d1/d4/d5/f7 x:0 0 0 2026-03-09T00:03:19.914 INFO:tasks.workunit.client.1.vm06.stdout:1/12: truncate f0 1935808 0 2026-03-09T00:03:19.921 INFO:tasks.workunit.client.1.vm06.stdout:0/32: write f1 [1210783,83602] 0 2026-03-09T00:03:20.036 INFO:tasks.workunit.client.1.vm06.stdout:2/42: truncate f3 315877 0 2026-03-09T00:03:20.036 INFO:tasks.workunit.client.1.vm06.stdout:2/43: write d7/fd [1316617,36932] 0 2026-03-09T00:03:20.037 INFO:tasks.workunit.client.1.vm06.stdout:2/44: mkdir d7/da/db/de 0 2026-03-09T00:03:20.038 INFO:tasks.workunit.client.1.vm06.stdout:2/45: symlink d7/da/db/de/lf 0 2026-03-09T00:03:20.038 INFO:tasks.workunit.client.1.vm06.stdout:2/46: mknod d7/da/db/de/c10 0 2026-03-09T00:03:20.054 INFO:tasks.workunit.client.1.vm06.stdout:5/19: dwrite f2 [4194304,4194304] 0 2026-03-09T00:03:20.062 INFO:tasks.workunit.client.1.vm06.stdout:4/15: dwrite f6 [4194304,4194304] 0 2026-03-09T00:03:20.066 INFO:tasks.workunit.client.1.vm06.stdout:9/24: dwrite d1/d4/f6 [0,4194304] 0 2026-03-09T00:03:20.067 INFO:tasks.workunit.client.1.vm06.stdout:9/25: write d1/d4/d5/f7 [178886,83144] 0 2026-03-09T00:03:20.067 INFO:tasks.workunit.client.1.vm06.stdout:5/20: getdents . 
0 2026-03-09T00:03:20.067 INFO:tasks.workunit.client.1.vm06.stdout:5/21: fsync f2 0 2026-03-09T00:03:20.067 INFO:tasks.workunit.client.1.vm06.stdout:5/22: fsync f2 0 2026-03-09T00:03:20.070 INFO:tasks.workunit.client.1.vm06.stdout:7/11: dread d0/f6 [0,4194304] 0 2026-03-09T00:03:20.070 INFO:tasks.workunit.client.1.vm06.stdout:7/12: write d0/f2 [352229,88018] 0 2026-03-09T00:03:20.074 INFO:tasks.workunit.client.1.vm06.stdout:5/23: mknod c4 0 2026-03-09T00:03:20.076 INFO:tasks.workunit.client.1.vm06.stdout:9/26: unlink c0 0 2026-03-09T00:03:20.076 INFO:tasks.workunit.client.1.vm06.stdout:9/27: write d1/d4/d5/f7 [237281,86524] 0 2026-03-09T00:03:20.076 INFO:tasks.workunit.client.1.vm06.stdout:9/28: write d1/d4/d5/f7 [625874,119839] 0 2026-03-09T00:03:20.077 INFO:tasks.workunit.client.1.vm06.stdout:7/13: dread d0/f6 [0,4194304] 0 2026-03-09T00:03:20.077 INFO:tasks.workunit.client.1.vm06.stdout:7/14: chown d0/f5 13091418 1 2026-03-09T00:03:20.102 INFO:tasks.workunit.client.1.vm06.stdout:5/24: dread f2 [0,4194304] 0 2026-03-09T00:03:20.114 INFO:tasks.workunit.client.1.vm06.stdout:5/25: dread f2 [4194304,4194304] 0 2026-03-09T00:03:20.115 INFO:tasks.workunit.client.1.vm06.stdout:5/26: mkdir d5 0 2026-03-09T00:03:20.116 INFO:tasks.workunit.client.1.vm06.stdout:5/27: write f2 [4085645,130638] 0 2026-03-09T00:03:20.116 INFO:tasks.workunit.client.1.vm06.stdout:5/28: mknod d5/c6 0 2026-03-09T00:03:20.117 INFO:tasks.workunit.client.1.vm06.stdout:5/29: creat d5/f7 x:0 0 0 2026-03-09T00:03:20.134 INFO:tasks.workunit.client.1.vm06.stdout:6/12: dwrite f1 [0,4194304] 0 2026-03-09T00:03:20.134 INFO:tasks.workunit.client.1.vm06.stdout:6/13: creat f3 x:0 0 0 2026-03-09T00:03:20.136 INFO:tasks.workunit.client.1.vm06.stdout:6/14: mkdir d4 0 2026-03-09T00:03:20.138 INFO:tasks.workunit.client.1.vm06.stdout:6/15: rename f3 to d4/f5 0 2026-03-09T00:03:20.143 INFO:tasks.workunit.client.1.vm06.stdout:6/16: write d4/f5 [315762,41225] 0 2026-03-09T00:03:20.147 INFO:tasks.workunit.client.1.vm06.stdout:9/29: fdatasync d1/d4/d5/f7 0 2026-03-09T00:03:20.147 INFO:tasks.workunit.client.1.vm06.stdout:9/30: link d1/d4/d5/f7 d1/d4/d5/f8 0 2026-03-09T00:03:20.147 INFO:tasks.workunit.client.1.vm06.stdout:9/31: readlink - no filename 2026-03-09T00:03:20.149 INFO:tasks.workunit.client.1.vm06.stdout:9/32: dread d1/d4/d5/f8 [0,4194304] 0 2026-03-09T00:03:20.149 INFO:tasks.workunit.client.1.vm06.stdout:9/33: chown d1 2328168 1 2026-03-09T00:03:20.149 INFO:tasks.workunit.client.1.vm06.stdout:9/34: mkdir d1/d4/d5/d9 0 2026-03-09T00:03:20.150 INFO:tasks.workunit.client.1.vm06.stdout:9/35: creat d1/fa x:0 0 0 2026-03-09T00:03:20.162 INFO:tasks.workunit.client.1.vm06.stdout:4/16: dwrite f1 [0,4194304] 0 2026-03-09T00:03:20.163 INFO:tasks.workunit.client.1.vm06.stdout:0/33: dwrite f1 [0,4194304] 0 2026-03-09T00:03:20.163 INFO:tasks.workunit.client.1.vm06.stdout:7/15: dwrite d0/f6 [0,4194304] 0 2026-03-09T00:03:20.163 INFO:tasks.workunit.client.1.vm06.stdout:7/16: fdatasync d0/f6 0 2026-03-09T00:03:20.163 INFO:tasks.workunit.client.1.vm06.stdout:7/17: creat d0/f7 x:0 0 0 2026-03-09T00:03:20.172 INFO:tasks.workunit.client.1.vm06.stdout:6/17: dread d4/f5 [0,4194304] 0 2026-03-09T00:03:20.176 INFO:tasks.workunit.client.1.vm06.stdout:4/17: read f1 [3928093,117275] 0 2026-03-09T00:03:20.176 INFO:tasks.workunit.client.1.vm06.stdout:4/18: chown l2 632491 1 2026-03-09T00:03:20.181 INFO:tasks.workunit.client.1.vm06.stdout:0/34: rename d3/c9 to d3/cf 0 2026-03-09T00:03:20.192 INFO:tasks.workunit.client.1.vm06.stdout:2/47: truncate d7/fd 3669091 0 
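The client.1 workunit above built fsstress from ltp-full-20091231 and is running it as fsstress -d <tmpdir> -l 1 -n 1000 -p 10 -v: one loop of 1000 operations in each of 10 processes, verbose. Each verbose line has the shape "proc/op: operation args status", where status is 0 on success or an errno ("rename d0 to d0/d1 22" above is EINVAL, "rmdir d3/dc 39" is ENOTEMPTY, and the chown lines ending in 1 are consistent with EPERM for an unprivileged user). A small sketch of a scraper that tallies such lines when triaging a run; it assumes only that line shape:

    import re
    import sys
    from collections import Counter

    # fsstress -v lines look like "7/4: rename d0/f1 to d0/f3 0"; lines with
    # no trailing status ("sync", "... - no filename") are skipped.
    LINE = re.compile(r"(\d+)/(\d+): (\S+) .*?(\d+)$")

    def tally(stream):
        ops, failures = Counter(), Counter()
        for line in stream:
            m = LINE.search(line)
            if not m:
                continue
            _proc, _idx, op, status = m.groups()
            ops[op] += 1
            if status != "0":
                failures["%s (errno %s)" % (op, status)] += 1
        return ops, failures

    if __name__ == "__main__":
        ops, failures = tally(sys.stdin)
        for op, n in ops.most_common():
            print("%-10s %6d" % (op, n))
        for what, n in failures.most_common():
            print("FAIL %-24s %6d" % (what, n))

Feeding it the raw teuthology log works too, since LINE.search ignores the INFO:tasks.workunit prefix on each line.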
2026-03-09T00:03:20.247 INFO:tasks.workunit.client.1.vm06.stdout:2/48: creat d7/da/db/de/f11 x:0 0 0 2026-03-09T00:03:20.253 INFO:tasks.workunit.client.1.vm06.stdout:6/18: truncate f1 3754427 0 2026-03-09T00:03:20.261 INFO:tasks.workunit.client.1.vm06.stdout:4/19: truncate f6 2766322 0 2026-03-09T00:03:20.263 INFO:tasks.workunit.client.1.vm06.stdout:6/19: creat d4/f6 x:0 0 0 2026-03-09T00:03:20.265 INFO:tasks.workunit.client.1.vm06.stdout:9/36: rename d1/d4/d5 to d1/db 0 2026-03-09T00:03:20.266 INFO:tasks.workunit.client.1.vm06.stdout:6/20: mknod d4/c7 0 2026-03-09T00:03:20.266 INFO:tasks.workunit.client.1.vm06.stdout:6/21: creat d4/f8 x:0 0 0 2026-03-09T00:03:20.266 INFO:tasks.workunit.client.1.vm06.stdout:6/22: dread - d4/f6 zero size 2026-03-09T00:03:20.269 INFO:tasks.workunit.client.1.vm06.stdout:9/37: creat d1/db/d9/fc x:0 0 0 2026-03-09T00:03:20.272 INFO:tasks.workunit.client.1.vm06.stdout:9/38: symlink d1/d3/ld 0 2026-03-09T00:03:20.284 INFO:tasks.workunit.client.1.vm06.stdout:9/39: write d1/db/f8 [859854,112645] 0 2026-03-09T00:03:20.284 INFO:tasks.workunit.client.1.vm06.stdout:9/40: write d1/db/f8 [1478853,58370] 0 2026-03-09T00:03:20.398 INFO:tasks.workunit.client.1.vm06.stdout:7/18: dwrite d0/f2 [0,4194304] 0 2026-03-09T00:03:20.419 INFO:tasks.workunit.client.1.vm06.stdout:0/35: dwrite d3/f6 [0,4194304] 0 2026-03-09T00:03:20.419 INFO:tasks.workunit.client.1.vm06.stdout:0/36: creat d3/f10 x:0 0 0 2026-03-09T00:03:20.419 INFO:tasks.workunit.client.1.vm06.stdout:0/37: truncate d3/fa 315375 0 2026-03-09T00:03:20.419 INFO:tasks.workunit.client.1.vm06.stdout:0/38: write d3/fa [509619,9628] 0 2026-03-09T00:03:20.419 INFO:tasks.workunit.client.1.vm06.stdout:0/39: readlink d3/lb 0 2026-03-09T00:03:20.630 INFO:tasks.workunit.client.1.vm06.stdout:9/41: write d1/d4/f6 [1479391,125640] 0 2026-03-09T00:03:20.630 INFO:tasks.workunit.client.1.vm06.stdout:9/42: write d1/db/d9/fc [71934,12128] 0 2026-03-09T00:03:20.744 INFO:tasks.workunit.client.1.vm06.stdout:4/20: dwrite f5 [4194304,4194304] 0 2026-03-09T00:03:20.744 INFO:tasks.workunit.client.1.vm06.stdout:4/21: creat f7 x:0 0 0 2026-03-09T00:03:20.744 INFO:tasks.workunit.client.1.vm06.stdout:4/22: chown l0 232664491 1 2026-03-09T00:03:20.744 INFO:tasks.workunit.client.1.vm06.stdout:4/23: rmdir - no directory 2026-03-09T00:03:20.744 INFO:tasks.workunit.client.1.vm06.stdout:4/24: write f1 [1934675,37519] 0 2026-03-09T00:03:20.747 INFO:tasks.workunit.client.1.vm06.stdout:4/25: symlink l8 0 2026-03-09T00:03:20.747 INFO:tasks.workunit.client.1.vm06.stdout:4/26: dread - f7 zero size 2026-03-09T00:03:20.786 INFO:tasks.workunit.client.1.vm06.stdout:4/27: write f5 [4556219,85989] 0 2026-03-09T00:03:20.816 INFO:tasks.workunit.client.1.vm06.stdout:0/40: dwrite d3/fa [0,4194304] 0 2026-03-09T00:03:20.816 INFO:tasks.workunit.client.1.vm06.stdout:0/41: write d3/f7 [803079,108157] 0 2026-03-09T00:03:20.816 INFO:tasks.workunit.client.1.vm06.stdout:0/42: write d3/f5 [934908,106848] 0 2026-03-09T00:03:20.829 INFO:tasks.workunit.client.1.vm06.stdout:0/43: write d3/f6 [108328,111717] 0 2026-03-09T00:03:20.829 INFO:tasks.workunit.client.1.vm06.stdout:0/44: creat d3/f11 x:0 0 0 2026-03-09T00:03:20.829 INFO:tasks.workunit.client.1.vm06.stdout:0/45: chown d3/f5 0 1 2026-03-09T00:03:20.829 INFO:tasks.workunit.client.1.vm06.stdout:0/46: stat d3/lb 0 2026-03-09T00:03:20.830 INFO:tasks.workunit.client.1.vm06.stdout:0/47: mknod d3/c12 0 2026-03-09T00:03:20.830 INFO:tasks.workunit.client.1.vm06.stdout:0/48: link d3/cf d3/c13 0 2026-03-09T00:03:20.831 
INFO:tasks.workunit.client.1.vm06.stdout:0/49: creat d3/dc/f14 x:0 0 0 2026-03-09T00:03:20.833 INFO:tasks.workunit.client.1.vm06.stdout:0/50: symlink d3/l15 0 2026-03-09T00:03:20.833 INFO:tasks.workunit.client.1.vm06.stdout:0/51: chown d3/f11 10576 1 2026-03-09T00:03:20.833 INFO:tasks.workunit.client.1.vm06.stdout:0/52: chown d3 53064134 1 2026-03-09T00:03:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:21 vm06.local ceph-mon[58395]: pgmap v127: 65 pgs: 65 active+clean; 175 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 6.7 MiB/s wr, 649 op/s 2026-03-09T00:03:21.759 INFO:tasks.workunit.client.1.vm06.stdout:7/19: dwrite d0/f5 [0,4194304] 0 2026-03-09T00:03:21.759 INFO:tasks.workunit.client.1.vm06.stdout:7/20: creat d0/f8 x:0 0 0 2026-03-09T00:03:21.761 INFO:tasks.workunit.client.1.vm06.stdout:7/21: link d0/c4 d0/c9 0 2026-03-09T00:03:21.761 INFO:tasks.workunit.client.1.vm06.stdout:7/22: read - d0/f8 zero size 2026-03-09T00:03:21.761 INFO:tasks.workunit.client.1.vm06.stdout:7/23: write d0/f7 [822553,81417] 0 2026-03-09T00:03:21.762 INFO:tasks.workunit.client.1.vm06.stdout:7/24: rename d0/f8 to d0/fa 0 2026-03-09T00:03:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:21 vm03.local ceph-mon[52346]: pgmap v127: 65 pgs: 65 active+clean; 175 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 6.7 MiB/s wr, 649 op/s 2026-03-09T00:03:22.295 INFO:tasks.workunit.client.1.vm06.stdout:2/49: dwrite d7/f8 [0,4194304] 0 2026-03-09T00:03:22.295 INFO:tasks.workunit.client.1.vm06.stdout:2/50: readlink d7/da/db/de/lf 0 2026-03-09T00:03:22.295 INFO:tasks.workunit.client.1.vm06.stdout:2/51: chown d7/da/db/de/lf 22 1 2026-03-09T00:03:22.295 INFO:tasks.workunit.client.1.vm06.stdout:2/52: write d7/f8 [4461499,35258] 0 2026-03-09T00:03:22.295 INFO:tasks.workunit.client.1.vm06.stdout:2/53: fdatasync f3 0 2026-03-09T00:03:22.295 INFO:tasks.workunit.client.1.vm06.stdout:2/54: fdatasync f3 0 2026-03-09T00:03:22.295 INFO:tasks.workunit.client.1.vm06.stdout:2/55: chown c1 7466 1 2026-03-09T00:03:22.297 INFO:tasks.workunit.client.1.vm06.stdout:2/56: creat d7/da/db/f12 x:0 0 0 2026-03-09T00:03:22.297 INFO:tasks.workunit.client.1.vm06.stdout:2/57: fdatasync f6 0 2026-03-09T00:03:22.298 INFO:tasks.workunit.client.1.vm06.stdout:2/58: mknod d7/da/db/c13 0 2026-03-09T00:03:22.298 INFO:tasks.workunit.client.1.vm06.stdout:2/59: stat f2 0 2026-03-09T00:03:22.298 INFO:tasks.workunit.client.1.vm06.stdout:2/60: readlink d7/da/db/de/lf 0 2026-03-09T00:03:22.298 INFO:tasks.workunit.client.1.vm06.stdout:2/61: mknod d7/da/c14 0 2026-03-09T00:03:22.298 INFO:tasks.workunit.client.1.vm06.stdout:2/62: truncate d7/da/db/f12 449371 0 2026-03-09T00:03:22.490 INFO:tasks.workunit.client.1.vm06.stdout:2/63: dread d7/f8 [0,4194304] 0 2026-03-09T00:03:22.490 INFO:tasks.workunit.client.1.vm06.stdout:2/64: fsync d7/da/fc 0 2026-03-09T00:03:22.530 INFO:tasks.workunit.client.1.vm06.stdout:2/65: dread f3 [0,4194304] 0 2026-03-09T00:03:22.530 INFO:tasks.workunit.client.1.vm06.stdout:2/66: symlink d7/da/l15 0 2026-03-09T00:03:22.531 INFO:tasks.workunit.client.1.vm06.stdout:2/67: truncate f6 567609 0 2026-03-09T00:03:22.531 INFO:tasks.workunit.client.1.vm06.stdout:2/68: mknod d7/da/db/c16 0 2026-03-09T00:03:22.533 INFO:tasks.workunit.client.1.vm06.stdout:2/69: rename d7/da/fc to d7/f17 0 2026-03-09T00:03:22.533 INFO:tasks.workunit.client.1.vm06.stdout:2/70: creat d7/da/f18 x:0 0 0 2026-03-09T00:03:22.533 INFO:tasks.workunit.client.1.vm06.stdout:2/71: read - d7/da/db/de/f11 zero size 2026-03-09T00:03:22.534 
INFO:tasks.workunit.client.1.vm06.stdout:2/72: creat d7/f19 x:0 0 0 2026-03-09T00:03:22.534 INFO:tasks.workunit.client.1.vm06.stdout:2/73: readlink d7/da/db/de/lf 0 2026-03-09T00:03:22.534 INFO:tasks.workunit.client.1.vm06.stdout:2/74: readlink d7/da/db/de/lf 0 2026-03-09T00:03:22.535 INFO:tasks.workunit.client.1.vm06.stdout:2/75: dread d7/da/db/f12 [0,4194304] 0 2026-03-09T00:03:22.535 INFO:tasks.workunit.client.1.vm06.stdout:2/76: chown d7/da/db/de 0 1 2026-03-09T00:03:22.535 INFO:tasks.workunit.client.1.vm06.stdout:2/77: truncate d7/da/db/de/f11 60749 0 2026-03-09T00:03:22.557 INFO:tasks.workunit.client.1.vm06.stdout:2/78: dread d7/da/db/de/f11 [0,4194304] 0 2026-03-09T00:03:22.557 INFO:tasks.workunit.client.1.vm06.stdout:2/79: chown f2 6 1 2026-03-09T00:03:22.561 INFO:tasks.workunit.client.1.vm06.stdout:2/80: mkdir d7/d1a 0 2026-03-09T00:03:22.589 INFO:tasks.workunit.client.1.vm06.stdout:2/81: read d7/da/db/de/f11 [52534,53345] 0 2026-03-09T00:03:22.589 INFO:tasks.workunit.client.1.vm06.stdout:2/82: write d7/da/f18 [261347,72433] 0 2026-03-09T00:03:23.090 INFO:tasks.workunit.client.1.vm06.stdout:9/43: dread d1/db/f8 [0,4194304] 0 2026-03-09T00:03:23.090 INFO:tasks.workunit.client.1.vm06.stdout:9/44: creat d1/d4/fe x:0 0 0 2026-03-09T00:03:23.090 INFO:tasks.workunit.client.1.vm06.stdout:9/45: write d1/db/f8 [1889178,95507] 0 2026-03-09T00:03:23.090 INFO:tasks.workunit.client.1.vm06.stdout:9/46: write d1/d4/fe [895180,50136] 0 2026-03-09T00:03:23.151 INFO:tasks.workunit.client.1.vm06.stdout:9/47: dwrite d1/d4/fe [0,4194304] 0 2026-03-09T00:03:23.151 INFO:tasks.workunit.client.1.vm06.stdout:9/48: write d1/fa [362685,89954] 0 2026-03-09T00:03:23.151 INFO:tasks.workunit.client.1.vm06.stdout:9/49: creat d1/d4/ff x:0 0 0 2026-03-09T00:03:23.154 INFO:tasks.workunit.client.1.vm06.stdout:9/50: rename d1/fa to d1/db/d9/f10 0 2026-03-09T00:03:23.157 INFO:tasks.workunit.client.1.vm06.stdout:9/51: creat d1/d3/f11 x:0 0 0 2026-03-09T00:03:23.159 INFO:tasks.workunit.client.1.vm06.stdout:9/52: mkdir d1/d3/d12 0 2026-03-09T00:03:23.163 INFO:tasks.workunit.client.1.vm06.stdout:0/53: dwrite d3/f6 [0,4194304] 0 2026-03-09T00:03:23.171 INFO:tasks.workunit.client.1.vm06.stdout:0/54: mknod d3/dc/c16 0 2026-03-09T00:03:23.172 INFO:tasks.workunit.client.1.vm06.stdout:3/10: sync 2026-03-09T00:03:23.172 INFO:tasks.workunit.client.1.vm06.stdout:8/22: sync 2026-03-09T00:03:23.172 INFO:tasks.workunit.client.1.vm06.stdout:8/23: write f3 [2220265,41482] 0 2026-03-09T00:03:23.172 INFO:tasks.workunit.client.1.vm06.stdout:8/24: chown l6 242 1 2026-03-09T00:03:23.172 INFO:tasks.workunit.client.1.vm06.stdout:1/13: sync 2026-03-09T00:03:23.177 INFO:tasks.workunit.client.1.vm06.stdout:6/23: getdents d4 0 2026-03-09T00:03:23.177 INFO:tasks.workunit.client.1.vm06.stdout:6/24: stat d4/f6 0 2026-03-09T00:03:23.177 INFO:tasks.workunit.client.1.vm06.stdout:6/25: rename d4 to d4/d9 22 2026-03-09T00:03:23.183 INFO:tasks.workunit.client.1.vm06.stdout:7/25: truncate d0/f2 427216 0 2026-03-09T00:03:23.183 INFO:tasks.workunit.client.1.vm06.stdout:7/26: chown d0/f7 73 1 2026-03-09T00:03:23.183 INFO:tasks.workunit.client.1.vm06.stdout:5/30: sync 2026-03-09T00:03:23.183 INFO:tasks.workunit.client.1.vm06.stdout:4/28: getdents . 
0 2026-03-09T00:03:23.188 INFO:tasks.workunit.client.1.vm06.stdout:8/25: link f3 f7 0 2026-03-09T00:03:23.189 INFO:tasks.workunit.client.1.vm06.stdout:1/14: symlink l5 0 2026-03-09T00:03:23.189 INFO:tasks.workunit.client.1.vm06.stdout:1/15: stat l2 0 2026-03-09T00:03:23.191 INFO:tasks.workunit.client.1.vm06.stdout:4/29: fsync f1 0 2026-03-09T00:03:23.191 INFO:tasks.workunit.client.1.vm06.stdout:6/26: rmdir d4 39 2026-03-09T00:03:23.191 INFO:tasks.workunit.client.1.vm06.stdout:6/27: readlink l0 0 2026-03-09T00:03:23.192 INFO:tasks.workunit.client.1.vm06.stdout:7/27: mkdir d0/db 0 2026-03-09T00:03:23.193 INFO:tasks.workunit.client.1.vm06.stdout:5/31: truncate f2 3459153 0 2026-03-09T00:03:23.195 INFO:tasks.workunit.client.1.vm06.stdout:8/26: truncate f5 3602702 0 2026-03-09T00:03:23.196 INFO:tasks.workunit.client.1.vm06.stdout:1/16: mkdir d6 0 2026-03-09T00:03:23.196 INFO:tasks.workunit.client.1.vm06.stdout:1/17: chown f3 232114384 1 2026-03-09T00:03:23.205 INFO:tasks.workunit.client.1.vm06.stdout:4/30: mkdir d9 0 2026-03-09T00:03:23.205 INFO:tasks.workunit.client.1.vm06.stdout:4/31: readlink l0 0 2026-03-09T00:03:23.205 INFO:tasks.workunit.client.1.vm06.stdout:6/28: creat d4/fa x:0 0 0 2026-03-09T00:03:23.205 INFO:tasks.workunit.client.1.vm06.stdout:6/29: creat d4/fb x:0 0 0 2026-03-09T00:03:23.205 INFO:tasks.workunit.client.1.vm06.stdout:6/30: truncate d4/fb 772894 0 2026-03-09T00:03:23.212 INFO:tasks.workunit.client.1.vm06.stdout:4/32: dread f6 [0,4194304] 0 2026-03-09T00:03:23.218 INFO:tasks.workunit.client.1.vm06.stdout:6/31: link d4/fb d4/fc 0 2026-03-09T00:03:23.223 INFO:tasks.workunit.client.1.vm06.stdout:4/33: unlink f6 0 2026-03-09T00:03:23.227 INFO:tasks.workunit.client.1.vm06.stdout:4/34: rmdir d9 0 2026-03-09T00:03:23.227 INFO:tasks.workunit.client.1.vm06.stdout:4/35: chown l8 31106111 1 2026-03-09T00:03:23.227 INFO:tasks.workunit.client.1.vm06.stdout:4/36: readlink l0 0 2026-03-09T00:03:23.229 INFO:tasks.workunit.client.1.vm06.stdout:6/32: dread f1 [0,4194304] 0 2026-03-09T00:03:23.244 INFO:tasks.workunit.client.1.vm06.stdout:7/28: truncate d0/f5 2754678 0 2026-03-09T00:03:23.254 INFO:tasks.workunit.client.1.vm06.stdout:2/83: getdents d7 0 2026-03-09T00:03:23.266 INFO:tasks.workunit.client.1.vm06.stdout:2/84: dread d7/da/f18 [0,4194304] 0 2026-03-09T00:03:23.266 INFO:tasks.workunit.client.1.vm06.stdout:2/85: fsync f2 0 2026-03-09T00:03:23.269 INFO:tasks.workunit.client.1.vm06.stdout:2/86: mkdir d7/d1b 0 2026-03-09T00:03:23.269 INFO:tasks.workunit.client.1.vm06.stdout:2/87: dread - d7/f17 zero size 2026-03-09T00:03:23.275 INFO:tasks.workunit.client.1.vm06.stdout:6/33: dwrite d4/f8 [0,4194304] 0 2026-03-09T00:03:23.275 INFO:tasks.workunit.client.1.vm06.stdout:2/88: mkdir d7/da/d1c 0 2026-03-09T00:03:23.275 INFO:tasks.workunit.client.1.vm06.stdout:2/89: read d7/da/db/de/f11 [38217,68498] 0 2026-03-09T00:03:23.280 INFO:tasks.workunit.client.1.vm06.stdout:2/90: mknod d7/da/db/de/c1d 0 2026-03-09T00:03:23.318 INFO:tasks.workunit.client.1.vm06.stdout:4/37: dwrite f7 [0,4194304] 0 2026-03-09T00:03:23.333 INFO:tasks.workunit.client.1.vm06.stdout:4/38: dread f1 [0,4194304] 0 2026-03-09T00:03:23.505 INFO:tasks.workunit.client.1.vm06.stdout:2/91: dwrite d7/fd [0,4194304] 0 2026-03-09T00:03:23.505 INFO:tasks.workunit.client.1.vm06.stdout:2/92: chown f2 14585 1 2026-03-09T00:03:23.505 INFO:tasks.workunit.client.1.vm06.stdout:2/93: rename d7/da to d7/da/d1e 22 2026-03-09T00:03:23.505 INFO:tasks.workunit.client.1.vm06.stdout:2/94: write d7/da/db/f12 [1439664,103817] 0 2026-03-09T00:03:23.512 
INFO:tasks.workunit.client.1.vm06.stdout:2/95: dread d7/da/db/de/f11 [0,4194304] 0 2026-03-09T00:03:23.657 INFO:tasks.workunit.client.1.vm06.stdout:4/39: dwrite f1 [0,4194304] 0 2026-03-09T00:03:23.657 INFO:tasks.workunit.client.1.vm06.stdout:4/40: mknod ca 0 2026-03-09T00:03:23.658 INFO:tasks.workunit.client.1.vm06.stdout:4/41: creat fb x:0 0 0 2026-03-09T00:03:23.658 INFO:tasks.workunit.client.1.vm06.stdout:4/42: read f5 [6644929,64825] 0 2026-03-09T00:03:23.658 INFO:tasks.workunit.client.1.vm06.stdout:4/43: dread - fb zero size 2026-03-09T00:03:23.669 INFO:tasks.workunit.client.1.vm06.stdout:0/55: truncate d3/fa 3395833 0 2026-03-09T00:03:23.669 INFO:tasks.workunit.client.1.vm06.stdout:0/56: readlink d3/lb 0 2026-03-09T00:03:23.673 INFO:tasks.workunit.client.1.vm06.stdout:0/57: truncate f1 263809 0 2026-03-09T00:03:23.675 INFO:tasks.workunit.client.1.vm06.stdout:0/58: link d3/f6 d3/f17 0 2026-03-09T00:03:23.675 INFO:tasks.workunit.client.1.vm06.stdout:0/59: write d3/f10 [89942,44039] 0 2026-03-09T00:03:23.682 INFO:tasks.workunit.client.1.vm06.stdout:6/34: rmdir d4 39 2026-03-09T00:03:23.683 INFO:tasks.workunit.client.1.vm06.stdout:8/27: write f5 [2202677,49064] 0 2026-03-09T00:03:23.683 INFO:tasks.workunit.client.1.vm06.stdout:6/35: write d4/fc [562438,25468] 0 2026-03-09T00:03:23.684 INFO:tasks.workunit.client.1.vm06.stdout:7/29: read d0/f2 [358294,42136] 0 2026-03-09T00:03:23.689 INFO:tasks.workunit.client.1.vm06.stdout:8/28: symlink l8 0 2026-03-09T00:03:23.689 INFO:tasks.workunit.client.1.vm06.stdout:6/36: mkdir d4/dd 0 2026-03-09T00:03:23.704 INFO:tasks.workunit.client.1.vm06.stdout:0/60: rename d3/dc to d3/d18 0 2026-03-09T00:03:23.714 INFO:tasks.workunit.client.1.vm06.stdout:0/61: fdatasync d3/f6 0 2026-03-09T00:03:23.715 INFO:tasks.workunit.client.1.vm06.stdout:8/29: creat f9 x:0 0 0 2026-03-09T00:03:23.715 INFO:tasks.workunit.client.1.vm06.stdout:8/30: truncate f7 3035347 0 2026-03-09T00:03:23.715 INFO:tasks.workunit.client.1.vm06.stdout:8/31: creat fa x:0 0 0 2026-03-09T00:03:23.715 INFO:tasks.workunit.client.1.vm06.stdout:0/62: unlink d3/f6 0 2026-03-09T00:03:23.715 INFO:tasks.workunit.client.1.vm06.stdout:0/63: creat d3/f19 x:0 0 0 2026-03-09T00:03:23.715 INFO:tasks.workunit.client.1.vm06.stdout:8/32: mkdir db 0 2026-03-09T00:03:23.715 INFO:tasks.workunit.client.1.vm06.stdout:0/64: unlink d3/f5 0 2026-03-09T00:03:23.715 INFO:tasks.workunit.client.1.vm06.stdout:8/33: symlink db/lc 0 2026-03-09T00:03:23.715 INFO:tasks.workunit.client.1.vm06.stdout:8/34: mkdir db/dd 0 2026-03-09T00:03:23.717 INFO:tasks.workunit.client.1.vm06.stdout:8/35: unlink fa 0 2026-03-09T00:03:23.976 INFO:tasks.workunit.client.1.vm06.stdout:2/96: dread d7/da/db/f12 [0,4194304] 0 2026-03-09T00:03:23.985 INFO:tasks.workunit.client.1.vm06.stdout:2/97: rename d7/fd to d7/da/d1c/f1f 0 2026-03-09T00:03:23.987 INFO:tasks.workunit.client.1.vm06.stdout:2/98: dread d7/da/f18 [0,4194304] 0 2026-03-09T00:03:23.987 INFO:tasks.workunit.client.1.vm06.stdout:2/99: chown f2 124595734 1 2026-03-09T00:03:24.417 INFO:tasks.workunit.client.1.vm06.stdout:5/32: dwrite f2 [0,4194304] 0 2026-03-09T00:03:24.431 INFO:tasks.workunit.client.1.vm06.stdout:6/37: dwrite d4/f8 [0,4194304] 0 2026-03-09T00:03:24.432 INFO:tasks.workunit.client.1.vm06.stdout:6/38: rmdir d4/dd 0 2026-03-09T00:03:24.432 INFO:tasks.workunit.client.1.vm06.stdout:6/39: read d4/fc [621853,125165] 0 2026-03-09T00:03:24.433 INFO:tasks.workunit.client.1.vm06.stdout:6/40: symlink d4/le 0 2026-03-09T00:03:24.433 INFO:tasks.workunit.client.1.vm06.stdout:6/41: dread - 
d4/fa zero size 2026-03-09T00:03:24.433 INFO:tasks.workunit.client.1.vm06.stdout:6/42: write d4/fc [1475382,79561] 0 2026-03-09T00:03:24.435 INFO:tasks.workunit.client.1.vm06.stdout:6/43: dread d4/f5 [0,4194304] 0 2026-03-09T00:03:24.441 INFO:tasks.workunit.client.1.vm06.stdout:6/44: creat d4/ff x:0 0 0 2026-03-09T00:03:24.441 INFO:tasks.workunit.client.1.vm06.stdout:6/45: chown f1 117803 1 2026-03-09T00:03:24.441 INFO:tasks.workunit.client.1.vm06.stdout:6/46: unlink d4/f8 0 2026-03-09T00:03:24.471 INFO:tasks.workunit.client.1.vm06.stdout:8/36: dread f5 [0,4194304] 0 2026-03-09T00:03:24.484 INFO:tasks.workunit.client.1.vm06.stdout:8/37: chown db/dd 38273 1 2026-03-09T00:03:24.486 INFO:tasks.workunit.client.1.vm06.stdout:8/38: dread f3 [0,4194304] 0 2026-03-09T00:03:24.714 INFO:tasks.workunit.client.1.vm06.stdout:2/100: dwrite d7/da/db/f12 [0,4194304] 0 2026-03-09T00:03:24.714 INFO:tasks.workunit.client.1.vm06.stdout:7/30: dread d0/f2 [0,4194304] 0 2026-03-09T00:03:24.720 INFO:tasks.workunit.client.1.vm06.stdout:1/18: dwrite f0 [0,4194304] 0 2026-03-09T00:03:24.720 INFO:tasks.workunit.client.1.vm06.stdout:7/31: dread d0/f5 [0,4194304] 0 2026-03-09T00:03:24.723 INFO:tasks.workunit.client.1.vm06.stdout:1/19: creat d6/f7 x:0 0 0 2026-03-09T00:03:24.723 INFO:tasks.workunit.client.1.vm06.stdout:2/101: getdents d7/da/db/de 0 2026-03-09T00:03:24.723 INFO:tasks.workunit.client.1.vm06.stdout:2/102: creat d7/da/f20 x:0 0 0 2026-03-09T00:03:24.723 INFO:tasks.workunit.client.1.vm06.stdout:1/20: mknod d6/c8 0 2026-03-09T00:03:24.723 INFO:tasks.workunit.client.1.vm06.stdout:1/21: creat d6/f9 x:0 0 0 2026-03-09T00:03:24.723 INFO:tasks.workunit.client.1.vm06.stdout:1/22: write d6/f9 [680499,40155] 0 2026-03-09T00:03:24.723 INFO:tasks.workunit.client.1.vm06.stdout:1/23: creat d6/fa x:0 0 0 2026-03-09T00:03:24.724 INFO:tasks.workunit.client.1.vm06.stdout:2/103: symlink d7/da/d1c/l21 0 2026-03-09T00:03:24.724 INFO:tasks.workunit.client.1.vm06.stdout:2/104: write f3 [1158435,58828] 0 2026-03-09T00:03:24.724 INFO:tasks.workunit.client.1.vm06.stdout:2/105: write f6 [880524,88057] 0 2026-03-09T00:03:24.732 INFO:tasks.workunit.client.1.vm06.stdout:7/32: write d0/f6 [1370694,52067] 0 2026-03-09T00:03:24.734 INFO:tasks.workunit.client.1.vm06.stdout:2/106: dread d7/da/db/de/f11 [0,4194304] 0 2026-03-09T00:03:24.735 INFO:tasks.workunit.client.1.vm06.stdout:2/107: dread d7/da/f18 [0,4194304] 0 2026-03-09T00:03:24.736 INFO:tasks.workunit.client.1.vm06.stdout:7/33: dread d0/f2 [0,4194304] 0 2026-03-09T00:03:24.736 INFO:tasks.workunit.client.1.vm06.stdout:7/34: chown d0/db 6412669 1 2026-03-09T00:03:24.804 INFO:tasks.workunit.client.1.vm06.stdout:4/44: getdents . 
0 2026-03-09T00:03:24.804 INFO:tasks.workunit.client.1.vm06.stdout:4/45: write f5 [5467819,22848] 0 2026-03-09T00:03:24.807 INFO:tasks.workunit.client.1.vm06.stdout:0/65: write f1 [903824,118424] 0 2026-03-09T00:03:24.971 INFO:tasks.workunit.client.1.vm06.stdout:5/33: dwrite f2 [0,4194304] 0 2026-03-09T00:03:24.971 INFO:tasks.workunit.client.1.vm06.stdout:5/34: stat d5 0 2026-03-09T00:03:24.972 INFO:tasks.workunit.client.1.vm06.stdout:5/35: symlink d5/l8 0 2026-03-09T00:03:24.974 INFO:tasks.workunit.client.1.vm06.stdout:9/53: dwrite d1/db/f8 [0,4194304] 0 2026-03-09T00:03:25.000 INFO:tasks.workunit.client.1.vm06.stdout:5/36: write f2 [3406976,14800] 0 2026-03-09T00:03:25.002 INFO:tasks.workunit.client.1.vm06.stdout:5/37: chown d5 12455 1 2026-03-09T00:03:25.002 INFO:tasks.workunit.client.1.vm06.stdout:5/38: creat d5/f9 x:0 0 0 2026-03-09T00:03:25.002 INFO:tasks.workunit.client.1.vm06.stdout:5/39: rename f2 to d5/fa 0 2026-03-09T00:03:25.002 INFO:tasks.workunit.client.1.vm06.stdout:5/40: write d5/f7 [688776,32135] 0 2026-03-09T00:03:25.196 INFO:tasks.workunit.client.1.vm06.stdout:7/35: dwrite d0/f2 [0,4194304] 0 2026-03-09T00:03:25.196 INFO:tasks.workunit.client.1.vm06.stdout:7/36: rename d0 to d0/db/dc 22 2026-03-09T00:03:25.197 INFO:tasks.workunit.client.1.vm06.stdout:7/37: truncate d0/f5 2212378 0 2026-03-09T00:03:25.197 INFO:tasks.workunit.client.1.vm06.stdout:7/38: truncate d0/fa 6958 0 2026-03-09T00:03:25.197 INFO:tasks.workunit.client.1.vm06.stdout:7/39: write d0/f6 [1529703,46557] 0 2026-03-09T00:03:25.391 INFO:tasks.workunit.client.1.vm06.stdout:2/108: dwrite d7/da/db/de/f11 [0,4194304] 0 2026-03-09T00:03:25.391 INFO:tasks.workunit.client.1.vm06.stdout:2/109: fsync d7/da/d1c/f1f 0 2026-03-09T00:03:25.396 INFO:tasks.workunit.client.1.vm06.stdout:4/46: dread f5 [4194304,4194304] 0 2026-03-09T00:03:25.396 INFO:tasks.workunit.client.1.vm06.stdout:4/47: stat fb 0 2026-03-09T00:03:25.396 INFO:tasks.workunit.client.1.vm06.stdout:4/48: chown l0 16106770 1 2026-03-09T00:03:25.396 INFO:tasks.workunit.client.1.vm06.stdout:4/49: chown l0 28 1 2026-03-09T00:03:25.398 INFO:tasks.workunit.client.1.vm06.stdout:2/110: write d7/da/d1c/f1f [1506740,78172] 0 2026-03-09T00:03:25.402 INFO:tasks.workunit.client.1.vm06.stdout:9/54: dwrite d1/d4/fe [0,4194304] 0 2026-03-09T00:03:25.412 INFO:tasks.workunit.client.1.vm06.stdout:2/111: write d7/da/db/f12 [1684754,52357] 0 2026-03-09T00:03:25.449 INFO:tasks.workunit.client.1.vm06.stdout:2/112: write d7/f17 [468462,56834] 0 2026-03-09T00:03:25.449 INFO:tasks.workunit.client.1.vm06.stdout:2/113: write f6 [1000934,90876] 0 2026-03-09T00:03:25.449 INFO:tasks.workunit.client.1.vm06.stdout:2/114: write d7/f19 [826402,27338] 0 2026-03-09T00:03:25.449 INFO:tasks.workunit.client.1.vm06.stdout:2/115: creat d7/d1b/f22 x:0 0 0 2026-03-09T00:03:25.449 INFO:tasks.workunit.client.1.vm06.stdout:2/116: creat d7/da/f23 x:0 0 0 2026-03-09T00:03:25.449 INFO:tasks.workunit.client.1.vm06.stdout:2/117: truncate d7/da/db/f12 4519162 0 2026-03-09T00:03:25.449 INFO:tasks.workunit.client.1.vm06.stdout:9/55: write d1/d4/fe [810078,104029] 0 2026-03-09T00:03:25.449 INFO:tasks.workunit.client.1.vm06.stdout:2/118: symlink d7/da/l24 0 2026-03-09T00:03:25.452 INFO:tasks.workunit.client.1.vm06.stdout:9/56: dread d1/db/f7 [0,4194304] 0 2026-03-09T00:03:25.582 INFO:tasks.workunit.client.1.vm06.stdout:7/40: dwrite d0/f2 [0,4194304] 0 2026-03-09T00:03:25.582 INFO:tasks.workunit.client.1.vm06.stdout:7/41: fdatasync d0/fa 0 2026-03-09T00:03:25.582 INFO:tasks.workunit.client.1.vm06.stdout:7/42: write 
d0/f6 [5144508,35417] 0 2026-03-09T00:03:25.582 INFO:tasks.workunit.client.1.vm06.stdout:7/43: rename d0 to d0/db/dd 22 2026-03-09T00:03:25.582 INFO:tasks.workunit.client.1.vm06.stdout:7/44: chown d0 2 1 2026-03-09T00:03:25.582 INFO:tasks.workunit.client.1.vm06.stdout:7/45: write d0/f6 [763747,42042] 0 2026-03-09T00:03:25.582 INFO:tasks.workunit.client.1.vm06.stdout:7/46: stat d0/f6 0 2026-03-09T00:03:25.582 INFO:tasks.workunit.client.1.vm06.stdout:7/47: stat d0/f5 0 2026-03-09T00:03:25.584 INFO:tasks.workunit.client.1.vm06.stdout:5/41: dread d5/f7 [0,4194304] 0 2026-03-09T00:03:25.584 INFO:tasks.workunit.client.1.vm06.stdout:5/42: write d5/fa [2998749,7286] 0 2026-03-09T00:03:25.590 INFO:tasks.workunit.client.1.vm06.stdout:5/43: symlink d5/lb 0 2026-03-09T00:03:25.590 INFO:tasks.workunit.client.1.vm06.stdout:5/44: write d5/f9 [523973,43664] 0 2026-03-09T00:03:25.590 INFO:tasks.workunit.client.1.vm06.stdout:5/45: stat d5/f7 0 2026-03-09T00:03:25.668 INFO:tasks.workunit.client.1.vm06.stdout:4/50: dwrite f1 [4194304,4194304] 0 2026-03-09T00:03:25.669 INFO:tasks.workunit.client.1.vm06.stdout:4/51: readlink l8 0 2026-03-09T00:03:25.709 INFO:tasks.workunit.client.1.vm06.stdout:2/119: fdatasync d7/f8 0 2026-03-09T00:03:25.709 INFO:tasks.workunit.client.1.vm06.stdout:2/120: write f3 [1998295,50758] 0 2026-03-09T00:03:25.709 INFO:tasks.workunit.client.1.vm06.stdout:2/121: chown c1 524919879 1 2026-03-09T00:03:25.709 INFO:tasks.workunit.client.1.vm06.stdout:7/48: dwrite d0/fa [0,4194304] 0 2026-03-09T00:03:25.709 INFO:tasks.workunit.client.1.vm06.stdout:7/49: creat d0/fe x:0 0 0 2026-03-09T00:03:25.709 INFO:tasks.workunit.client.1.vm06.stdout:7/50: chown d0/f5 2 1 2026-03-09T00:03:25.714 INFO:tasks.workunit.client.1.vm06.stdout:7/51: mkdir d0/df 0 2026-03-09T00:03:25.715 INFO:tasks.workunit.client.1.vm06.stdout:7/52: unlink d0/c9 0 2026-03-09T00:03:25.715 INFO:tasks.workunit.client.1.vm06.stdout:7/53: symlink d0/df/l10 0 2026-03-09T00:03:25.747 INFO:tasks.workunit.client.1.vm06.stdout:6/47: rmdir d4 39 2026-03-09T00:03:25.748 INFO:tasks.workunit.client.1.vm06.stdout:6/48: write d4/f6 [85690,89013] 0 2026-03-09T00:03:25.748 INFO:tasks.workunit.client.1.vm06.stdout:6/49: chown d4/f5 3861 1 2026-03-09T00:03:25.748 INFO:tasks.workunit.client.1.vm06.stdout:6/50: chown d4/ff 15 1 2026-03-09T00:03:25.748 INFO:tasks.workunit.client.1.vm06.stdout:6/51: truncate d4/f6 473494 0 2026-03-09T00:03:25.751 INFO:tasks.workunit.client.1.vm06.stdout:8/39: truncate f7 2348974 0 2026-03-09T00:03:25.752 INFO:tasks.workunit.client.1.vm06.stdout:5/46: dwrite d5/f7 [0,4194304] 0 2026-03-09T00:03:25.756 INFO:tasks.workunit.client.1.vm06.stdout:5/47: mknod d5/cc 0 2026-03-09T00:03:25.822 INFO:tasks.workunit.client.1.vm06.stdout:4/52: truncate f5 3506802 0 2026-03-09T00:03:25.853 INFO:tasks.workunit.client.1.vm06.stdout:2/122: truncate f3 1808508 0 2026-03-09T00:03:25.862 INFO:tasks.workunit.client.1.vm06.stdout:9/57: dwrite d1/db/f7 [4194304,4194304] 0 2026-03-09T00:03:25.870 INFO:tasks.workunit.client.1.vm06.stdout:9/58: mknod d1/c13 0 2026-03-09T00:03:25.870 INFO:tasks.workunit.client.1.vm06.stdout:5/48: getdents d5 0 2026-03-09T00:03:25.871 INFO:tasks.workunit.client.1.vm06.stdout:5/49: mknod d5/cd 0 2026-03-09T00:03:25.872 INFO:tasks.workunit.client.1.vm06.stdout:9/59: read d1/db/d9/f10 [192901,31819] 0 2026-03-09T00:03:25.873 INFO:tasks.workunit.client.1.vm06.stdout:9/60: mkdir d1/db/d14 0 2026-03-09T00:03:25.888 INFO:tasks.workunit.client.1.vm06.stdout:2/123: dwrite d7/d1b/f22 [0,4194304] 0 2026-03-09T00:03:25.888 
INFO:tasks.workunit.client.1.vm06.stdout:2/124: fdatasync f2 0 2026-03-09T00:03:25.888 INFO:tasks.workunit.client.1.vm06.stdout:2/125: chown c1 1841 1 2026-03-09T00:03:25.890 INFO:tasks.workunit.client.1.vm06.stdout:2/126: mkdir d7/d1a/d25 0 2026-03-09T00:03:25.892 INFO:tasks.workunit.client.1.vm06.stdout:2/127: rename d7/f19 to d7/f26 0 2026-03-09T00:03:25.894 INFO:tasks.workunit.client.1.vm06.stdout:2/128: symlink d7/da/l27 0 2026-03-09T00:03:25.898 INFO:tasks.workunit.client.1.vm06.stdout:0/66: rmdir d3 39 2026-03-09T00:03:25.900 INFO:tasks.workunit.client.1.vm06.stdout:0/67: chown d3/c12 7948 1 2026-03-09T00:03:25.902 INFO:tasks.workunit.client.1.vm06.stdout:0/68: rmdir d3/d18 39 2026-03-09T00:03:25.913 INFO:tasks.workunit.client.1.vm06.stdout:2/129: dread d7/da/db/de/f11 [0,4194304] 0 2026-03-09T00:03:25.921 INFO:tasks.workunit.client.1.vm06.stdout:2/130: symlink d7/da/d1c/l28 0 2026-03-09T00:03:25.921 INFO:tasks.workunit.client.1.vm06.stdout:2/131: read d7/da/db/f12 [2502270,38082] 0 2026-03-09T00:03:25.939 INFO:tasks.workunit.client.1.vm06.stdout:0/69: dwrite d3/f7 [0,4194304] 0 2026-03-09T00:03:25.939 INFO:tasks.workunit.client.1.vm06.stdout:0/70: write f1 [1739872,8611] 0 2026-03-09T00:03:25.943 INFO:tasks.workunit.client.1.vm06.stdout:0/71: chown d3/f10 1981480510 1 2026-03-09T00:03:25.945 INFO:tasks.workunit.client.1.vm06.stdout:0/72: creat d3/f1a x:0 0 0 2026-03-09T00:03:25.950 INFO:tasks.workunit.client.1.vm06.stdout:4/53: fsync f5 0 2026-03-09T00:03:25.950 INFO:tasks.workunit.client.1.vm06.stdout:0/73: creat d3/f1b x:0 0 0 2026-03-09T00:03:25.952 INFO:tasks.workunit.client.1.vm06.stdout:4/54: mknod cc 0 2026-03-09T00:03:25.988 INFO:tasks.workunit.client.1.vm06.stdout:3/11: sync 2026-03-09T00:03:25.988 INFO:tasks.workunit.client.1.vm06.stdout:1/24: sync 2026-03-09T00:03:25.988 INFO:tasks.workunit.client.1.vm06.stdout:1/25: write d6/f9 [1307698,97742] 0 2026-03-09T00:03:25.988 INFO:tasks.workunit.client.1.vm06.stdout:1/26: fsync f0 0 2026-03-09T00:03:25.988 INFO:tasks.workunit.client.1.vm06.stdout:1/27: chown d6/fa 3 1 2026-03-09T00:03:25.988 INFO:tasks.workunit.client.1.vm06.stdout:1/28: creat d6/fb x:0 0 0 2026-03-09T00:03:25.991 INFO:tasks.workunit.client.1.vm06.stdout:3/12: creat f3 x:0 0 0 2026-03-09T00:03:25.996 INFO:tasks.workunit.client.1.vm06.stdout:3/13: chown l1 127325 1 2026-03-09T00:03:25.996 INFO:tasks.workunit.client.1.vm06.stdout:3/14: creat f4 x:0 0 0 2026-03-09T00:03:25.996 INFO:tasks.workunit.client.1.vm06.stdout:3/15: write f4 [107824,109981] 0 2026-03-09T00:03:25.996 INFO:tasks.workunit.client.1.vm06.stdout:7/54: sync 2026-03-09T00:03:25.996 INFO:tasks.workunit.client.1.vm06.stdout:7/55: write d0/f5 [3059186,15259] 0 2026-03-09T00:03:25.996 INFO:tasks.workunit.client.1.vm06.stdout:6/52: sync 2026-03-09T00:03:25.996 INFO:tasks.workunit.client.1.vm06.stdout:8/40: sync 2026-03-09T00:03:25.996 INFO:tasks.workunit.client.1.vm06.stdout:8/41: dread - f9 zero size 2026-03-09T00:03:25.996 INFO:tasks.workunit.client.1.vm06.stdout:8/42: write f3 [3275442,72143] 0 2026-03-09T00:03:25.996 INFO:tasks.workunit.client.1.vm06.stdout:9/61: truncate d1/db/f8 2571347 0 2026-03-09T00:03:25.996 INFO:tasks.workunit.client.1.vm06.stdout:1/29: symlink d6/lc 0 2026-03-09T00:03:25.996 INFO:tasks.workunit.client.1.vm06.stdout:1/30: write d6/fa [574972,56924] 0 2026-03-09T00:03:25.996 INFO:tasks.workunit.client.1.vm06.stdout:1/31: creat d6/fd x:0 0 0 2026-03-09T00:03:25.996 INFO:tasks.workunit.client.1.vm06.stdout:1/32: truncate d6/f7 713537 0 2026-03-09T00:03:25.996 
INFO:tasks.workunit.client.1.vm06.stdout:1/33: truncate f3 77426 0 2026-03-09T00:03:25.996 INFO:tasks.workunit.client.1.vm06.stdout:1/34: fdatasync f0 0 2026-03-09T00:03:25.996 INFO:tasks.workunit.client.1.vm06.stdout:1/35: chown d6/fb 27 1 2026-03-09T00:03:25.996 INFO:tasks.workunit.client.1.vm06.stdout:1/36: rename d6 to d6/de 22 2026-03-09T00:03:25.997 INFO:tasks.workunit.client.1.vm06.stdout:7/56: write d0/f5 [1668320,37776] 0 2026-03-09T00:03:26.007 INFO:tasks.workunit.client.1.vm06.stdout:6/53: unlink c2 0 2026-03-09T00:03:26.007 INFO:tasks.workunit.client.1.vm06.stdout:6/54: dread - d4/fa zero size 2026-03-09T00:03:26.009 INFO:tasks.workunit.client.1.vm06.stdout:5/50: getdents d5 0 2026-03-09T00:03:26.010 INFO:tasks.workunit.client.1.vm06.stdout:8/43: unlink l4 0 2026-03-09T00:03:26.023 INFO:tasks.workunit.client.1.vm06.stdout:5/51: rmdir d5 39 2026-03-09T00:03:26.023 INFO:tasks.workunit.client.1.vm06.stdout:5/52: creat d5/fe x:0 0 0 2026-03-09T00:03:26.023 INFO:tasks.workunit.client.1.vm06.stdout:5/53: creat d5/ff x:0 0 0 2026-03-09T00:03:26.027 INFO:tasks.workunit.client.1.vm06.stdout:5/54: mknod d5/c10 0 2026-03-09T00:03:26.037 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:25 vm06.local ceph-mon[58395]: pgmap v128: 65 pgs: 65 active+clean; 220 MiB data, 1.5 GiB used, 118 GiB / 120 GiB avail; 2.1 MiB/s rd, 11 MiB/s wr, 698 op/s 2026-03-09T00:03:26.044 INFO:tasks.workunit.client.1.vm06.stdout:4/55: dwrite f5 [0,4194304] 0 2026-03-09T00:03:26.050 INFO:tasks.workunit.client.1.vm06.stdout:4/56: read f7 [2983191,78239] 0 2026-03-09T00:03:26.050 INFO:tasks.workunit.client.1.vm06.stdout:4/57: chown l2 6852374 1 2026-03-09T00:03:26.050 INFO:tasks.workunit.client.1.vm06.stdout:4/58: write fb [306227,26393] 0 2026-03-09T00:03:26.050 INFO:tasks.workunit.client.1.vm06.stdout:4/59: chown ca 12555829 1 2026-03-09T00:03:26.067 INFO:tasks.workunit.client.1.vm06.stdout:4/60: dread f7 [0,4194304] 0 2026-03-09T00:03:26.073 INFO:tasks.workunit.client.1.vm06.stdout:3/16: dwrite f3 [0,4194304] 0 2026-03-09T00:03:26.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:25 vm03.local ceph-mon[52346]: pgmap v128: 65 pgs: 65 active+clean; 220 MiB data, 1.5 GiB used, 118 GiB / 120 GiB avail; 2.1 MiB/s rd, 11 MiB/s wr, 698 op/s 2026-03-09T00:03:26.091 INFO:tasks.workunit.client.1.vm06.stdout:2/132: rmdir d7/da 39 2026-03-09T00:03:26.097 INFO:tasks.workunit.client.1.vm06.stdout:0/74: dwrite d3/f11 [0,4194304] 0 2026-03-09T00:03:26.100 INFO:tasks.workunit.client.1.vm06.stdout:2/133: rename d7/da/db/f12 to d7/da/d1c/f29 0 2026-03-09T00:03:26.102 INFO:tasks.workunit.client.1.vm06.stdout:2/134: symlink d7/d1a/l2a 0 2026-03-09T00:03:26.103 INFO:tasks.workunit.client.1.vm06.stdout:2/135: rename d7/da/db/de/c1d to d7/d1b/c2b 0 2026-03-09T00:03:26.113 INFO:tasks.workunit.client.1.vm06.stdout:2/136: chown d7/da 231052781 1 2026-03-09T00:03:26.113 INFO:tasks.workunit.client.1.vm06.stdout:2/137: creat d7/da/d1c/f2c x:0 0 0 2026-03-09T00:03:26.113 INFO:tasks.workunit.client.1.vm06.stdout:2/138: write d7/f17 [873393,37526] 0 2026-03-09T00:03:26.113 INFO:tasks.workunit.client.1.vm06.stdout:2/139: rmdir d7 39 2026-03-09T00:03:26.113 INFO:tasks.workunit.client.1.vm06.stdout:2/140: rename d7/da/d1c/f2c to d7/da/db/f2d 0 2026-03-09T00:03:26.113 INFO:tasks.workunit.client.1.vm06.stdout:2/141: fsync f3 0 2026-03-09T00:03:26.115 INFO:tasks.workunit.client.1.vm06.stdout:6/55: dwrite f1 [0,4194304] 0 2026-03-09T00:03:26.118 INFO:tasks.workunit.client.1.vm06.stdout:6/56: mknod d4/c10 0 2026-03-09T00:03:26.125 
INFO:tasks.workunit.client.1.vm06.stdout:6/57: truncate d4/ff 589706 0 2026-03-09T00:03:26.125 INFO:tasks.workunit.client.1.vm06.stdout:6/58: truncate d4/f6 1043131 0 2026-03-09T00:03:26.125 INFO:tasks.workunit.client.1.vm06.stdout:6/59: dread - d4/fa zero size 2026-03-09T00:03:26.156 INFO:tasks.workunit.client.1.vm06.stdout:8/44: dwrite f3 [0,4194304] 0 2026-03-09T00:03:26.156 INFO:tasks.workunit.client.1.vm06.stdout:8/45: truncate f9 928214 0 2026-03-09T00:03:26.158 INFO:tasks.workunit.client.1.vm06.stdout:5/55: dwrite d5/fa [4194304,4194304] 0 2026-03-09T00:03:26.159 INFO:tasks.workunit.client.1.vm06.stdout:8/46: creat db/dd/fe x:0 0 0 2026-03-09T00:03:26.164 INFO:tasks.workunit.client.1.vm06.stdout:8/47: chown db/dd 0 1 2026-03-09T00:03:26.164 INFO:tasks.workunit.client.1.vm06.stdout:5/56: symlink d5/l11 0 2026-03-09T00:03:26.164 INFO:tasks.workunit.client.1.vm06.stdout:8/48: symlink db/lf 0 2026-03-09T00:03:26.164 INFO:tasks.workunit.client.1.vm06.stdout:5/57: link c4 d5/c12 0 2026-03-09T00:03:26.164 INFO:tasks.workunit.client.1.vm06.stdout:5/58: chown d5/f7 5 1 2026-03-09T00:03:26.164 INFO:tasks.workunit.client.1.vm06.stdout:8/49: mknod db/dd/c10 0 2026-03-09T00:03:26.164 INFO:tasks.workunit.client.1.vm06.stdout:8/50: mknod db/dd/c11 0 2026-03-09T00:03:26.164 INFO:tasks.workunit.client.1.vm06.stdout:8/51: symlink db/l12 0 2026-03-09T00:03:26.164 INFO:tasks.workunit.client.1.vm06.stdout:8/52: creat db/dd/f13 x:0 0 0 2026-03-09T00:03:26.173 INFO:tasks.workunit.client.1.vm06.stdout:8/53: dread f7 [0,4194304] 0 2026-03-09T00:03:26.199 INFO:tasks.workunit.client.1.vm06.stdout:3/17: dwrite f4 [0,4194304] 0 2026-03-09T00:03:26.206 INFO:tasks.workunit.client.1.vm06.stdout:3/18: link l1 l5 0 2026-03-09T00:03:26.208 INFO:tasks.workunit.client.1.vm06.stdout:3/19: rename f4 to f6 0 2026-03-09T00:03:26.211 INFO:tasks.workunit.client.1.vm06.stdout:6/60: dwrite d4/f5 [0,4194304] 0 2026-03-09T00:03:26.214 INFO:tasks.workunit.client.1.vm06.stdout:4/61: dwrite f7 [0,4194304] 0 2026-03-09T00:03:26.219 INFO:tasks.workunit.client.1.vm06.stdout:1/37: rmdir d6 39 2026-03-09T00:03:26.219 INFO:tasks.workunit.client.1.vm06.stdout:3/20: dread f6 [0,4194304] 0 2026-03-09T00:03:26.219 INFO:tasks.workunit.client.1.vm06.stdout:3/21: chown l1 22499 1 2026-03-09T00:03:26.219 INFO:tasks.workunit.client.1.vm06.stdout:3/22: creat f7 x:0 0 0 2026-03-09T00:03:26.220 INFO:tasks.workunit.client.1.vm06.stdout:4/62: mknod cd 0 2026-03-09T00:03:26.220 INFO:tasks.workunit.client.1.vm06.stdout:4/63: fdatasync f5 0 2026-03-09T00:03:26.224 INFO:tasks.workunit.client.1.vm06.stdout:2/142: dwrite f3 [0,4194304] 0 2026-03-09T00:03:26.226 INFO:tasks.workunit.client.1.vm06.stdout:1/38: getdents d6 0 2026-03-09T00:03:26.228 INFO:tasks.workunit.client.1.vm06.stdout:2/143: dread d7/da/d1c/f29 [0,4194304] 0 2026-03-09T00:03:26.228 INFO:tasks.workunit.client.1.vm06.stdout:2/144: write d7/da/f18 [699408,126762] 0 2026-03-09T00:03:26.232 INFO:tasks.workunit.client.1.vm06.stdout:1/39: creat d6/ff x:0 0 0 2026-03-09T00:03:26.233 INFO:tasks.workunit.client.1.vm06.stdout:1/40: link l5 d6/l10 0 2026-03-09T00:03:26.233 INFO:tasks.workunit.client.1.vm06.stdout:1/41: write d6/fb [78424,7312] 0 2026-03-09T00:03:26.233 INFO:tasks.workunit.client.1.vm06.stdout:2/145: dread f3 [0,4194304] 0 2026-03-09T00:03:26.234 INFO:tasks.workunit.client.1.vm06.stdout:2/146: symlink d7/da/l2e 0 2026-03-09T00:03:26.240 INFO:tasks.workunit.client.1.vm06.stdout:2/147: chown d7/da/db/c16 1796 1 2026-03-09T00:03:26.240 INFO:tasks.workunit.client.1.vm06.stdout:2/148: dread 
- d7/da/db/f2d zero size 2026-03-09T00:03:26.240 INFO:tasks.workunit.client.1.vm06.stdout:1/42: dread d6/f7 [0,4194304] 0 2026-03-09T00:03:26.240 INFO:tasks.workunit.client.1.vm06.stdout:7/57: dwrite d0/f5 [0,4194304] 0 2026-03-09T00:03:26.240 INFO:tasks.workunit.client.1.vm06.stdout:7/58: chown d0 0 1 2026-03-09T00:03:26.242 INFO:tasks.workunit.client.1.vm06.stdout:1/43: rename c4 to d6/c11 0 2026-03-09T00:03:26.245 INFO:tasks.workunit.client.1.vm06.stdout:3/23: write f3 [4430891,62689] 0 2026-03-09T00:03:26.252 INFO:tasks.workunit.client.1.vm06.stdout:0/75: truncate d3/f11 2468983 0 2026-03-09T00:03:26.257 INFO:tasks.workunit.client.1.vm06.stdout:7/59: truncate d0/fa 3636382 0 2026-03-09T00:03:26.258 INFO:tasks.workunit.client.1.vm06.stdout:1/44: symlink d6/l12 0 2026-03-09T00:03:26.264 INFO:tasks.workunit.client.1.vm06.stdout:4/64: dwrite f5 [0,4194304] 0 2026-03-09T00:03:26.266 INFO:tasks.workunit.client.1.vm06.stdout:0/76: creat d3/f1c x:0 0 0 2026-03-09T00:03:26.266 INFO:tasks.workunit.client.1.vm06.stdout:1/45: mknod d6/c13 0 2026-03-09T00:03:26.267 INFO:tasks.workunit.client.1.vm06.stdout:2/149: unlink d7/da/db/f2d 0 2026-03-09T00:03:26.270 INFO:tasks.workunit.client.1.vm06.stdout:5/59: dwrite d5/f7 [4194304,4194304] 0 2026-03-09T00:03:26.282 INFO:tasks.workunit.client.1.vm06.stdout:2/150: dread f3 [0,4194304] 0 2026-03-09T00:03:26.282 INFO:tasks.workunit.client.1.vm06.stdout:2/151: fdatasync f3 0 2026-03-09T00:03:26.282 INFO:tasks.workunit.client.1.vm06.stdout:2/152: write d7/f26 [1736672,85472] 0 2026-03-09T00:03:26.284 INFO:tasks.workunit.client.1.vm06.stdout:5/60: symlink d5/l13 0 2026-03-09T00:03:26.285 INFO:tasks.workunit.client.1.vm06.stdout:0/77: mkdir d3/d1d 0 2026-03-09T00:03:26.349 INFO:tasks.workunit.client.1.vm06.stdout:8/54: getdents db/dd 0 2026-03-09T00:03:26.354 INFO:tasks.workunit.client.1.vm06.stdout:8/55: link db/lc db/dd/l14 0 2026-03-09T00:03:26.357 INFO:tasks.workunit.client.1.vm06.stdout:8/56: link db/lf db/dd/l15 0 2026-03-09T00:03:26.359 INFO:tasks.workunit.client.1.vm06.stdout:7/60: dwrite d0/fe [0,4194304] 0 2026-03-09T00:03:26.360 INFO:tasks.workunit.client.1.vm06.stdout:4/65: dwrite f1 [4194304,4194304] 0 2026-03-09T00:03:26.368 INFO:tasks.workunit.client.1.vm06.stdout:1/46: dwrite d6/fa [0,4194304] 0 2026-03-09T00:03:26.368 INFO:tasks.workunit.client.1.vm06.stdout:1/47: write d6/fd [862673,26042] 0 2026-03-09T00:03:26.369 INFO:tasks.workunit.client.1.vm06.stdout:7/61: write d0/f5 [3678618,124349] 0 2026-03-09T00:03:26.376 INFO:tasks.workunit.client.1.vm06.stdout:4/66: creat fe x:0 0 0 2026-03-09T00:03:26.376 INFO:tasks.workunit.client.1.vm06.stdout:4/67: stat ca 0 2026-03-09T00:03:26.380 INFO:tasks.workunit.client.1.vm06.stdout:4/68: mkdir df 0 2026-03-09T00:03:26.382 INFO:tasks.workunit.client.1.vm06.stdout:4/69: rmdir df 0 2026-03-09T00:03:26.387 INFO:tasks.workunit.client.1.vm06.stdout:2/153: dwrite d7/f17 [0,4194304] 0 2026-03-09T00:03:26.393 INFO:tasks.workunit.client.1.vm06.stdout:2/154: dread d7/da/db/de/f11 [0,4194304] 0 2026-03-09T00:03:26.393 INFO:tasks.workunit.client.1.vm06.stdout:2/155: chown d7/da/db/de 161136 1 2026-03-09T00:03:26.412 INFO:tasks.workunit.client.1.vm06.stdout:8/57: dwrite f7 [0,4194304] 0 2026-03-09T00:03:26.412 INFO:tasks.workunit.client.1.vm06.stdout:8/58: truncate db/dd/fe 823988 0 2026-03-09T00:03:26.412 INFO:tasks.workunit.client.1.vm06.stdout:8/59: readlink db/l12 0 2026-03-09T00:03:26.412 INFO:tasks.workunit.client.1.vm06.stdout:8/60: dread - db/dd/f13 zero size 2026-03-09T00:03:26.413 
INFO:tasks.workunit.client.1.vm06.stdout:2/156: dread f2 [0,4194304] 0 2026-03-09T00:03:26.413 INFO:tasks.workunit.client.1.vm06.stdout:2/157: chown d7/da/d1c/l21 15540 1 2026-03-09T00:03:26.413 INFO:tasks.workunit.client.1.vm06.stdout:0/78: dwrite d3/f17 [0,4194304] 0 2026-03-09T00:03:26.413 INFO:tasks.workunit.client.1.vm06.stdout:8/61: dread f9 [0,4194304] 0 2026-03-09T00:03:26.413 INFO:tasks.workunit.client.1.vm06.stdout:2/158: fsync d7/f26 0 2026-03-09T00:03:26.415 INFO:tasks.workunit.client.1.vm06.stdout:0/79: stat d3/l15 0 2026-03-09T00:03:26.415 INFO:tasks.workunit.client.1.vm06.stdout:0/80: truncate d3/f17 4503089 0 2026-03-09T00:03:26.419 INFO:tasks.workunit.client.1.vm06.stdout:8/62: creat db/f16 x:0 0 0 2026-03-09T00:03:26.419 INFO:tasks.workunit.client.1.vm06.stdout:8/63: read - db/f16 zero size 2026-03-09T00:03:26.419 INFO:tasks.workunit.client.1.vm06.stdout:8/64: creat db/f17 x:0 0 0 2026-03-09T00:03:26.419 INFO:tasks.workunit.client.1.vm06.stdout:8/65: chown l8 3818 1 2026-03-09T00:03:26.420 INFO:tasks.workunit.client.1.vm06.stdout:2/159: rename d7/da/f23 to d7/d1a/f2f 0 2026-03-09T00:03:26.420 INFO:tasks.workunit.client.1.vm06.stdout:2/160: chown d7/da/f18 3491 1 2026-03-09T00:03:26.421 INFO:tasks.workunit.client.1.vm06.stdout:0/81: creat d3/f1e x:0 0 0 2026-03-09T00:03:26.421 INFO:tasks.workunit.client.1.vm06.stdout:0/82: dread - d3/f1c zero size 2026-03-09T00:03:26.422 INFO:tasks.workunit.client.1.vm06.stdout:4/70: dwrite fb [0,4194304] 0 2026-03-09T00:03:26.425 INFO:tasks.workunit.client.1.vm06.stdout:8/66: symlink db/dd/l18 0 2026-03-09T00:03:26.425 INFO:tasks.workunit.client.1.vm06.stdout:2/161: creat d7/d1a/f30 x:0 0 0 2026-03-09T00:03:26.427 INFO:tasks.workunit.client.1.vm06.stdout:2/162: read f6 [505324,46075] 0 2026-03-09T00:03:26.431 INFO:tasks.workunit.client.1.vm06.stdout:4/71: write f5 [3029632,119521] 0 2026-03-09T00:03:26.431 INFO:tasks.workunit.client.1.vm06.stdout:4/72: dread - fe zero size 2026-03-09T00:03:26.440 INFO:tasks.workunit.client.1.vm06.stdout:8/67: symlink db/dd/l19 0 2026-03-09T00:03:26.441 INFO:tasks.workunit.client.1.vm06.stdout:9/62: sync 2026-03-09T00:03:26.442 INFO:tasks.workunit.client.1.vm06.stdout:9/63: mknod d1/d3/d12/c15 0 2026-03-09T00:03:26.446 INFO:tasks.workunit.client.1.vm06.stdout:5/61: getdents d5 0 2026-03-09T00:03:26.446 INFO:tasks.workunit.client.1.vm06.stdout:5/62: truncate d5/ff 161524 0 2026-03-09T00:03:26.446 INFO:tasks.workunit.client.1.vm06.stdout:9/64: read d1/db/f7 [153866,109505] 0 2026-03-09T00:03:26.447 INFO:tasks.workunit.client.1.vm06.stdout:5/63: creat d5/f14 x:0 0 0 2026-03-09T00:03:26.455 INFO:tasks.workunit.client.1.vm06.stdout:5/64: dread d5/f9 [0,4194304] 0 2026-03-09T00:03:26.458 INFO:tasks.workunit.client.1.vm06.stdout:5/65: rename d5/fa to d5/f15 0 2026-03-09T00:03:26.458 INFO:tasks.workunit.client.1.vm06.stdout:5/66: truncate d5/ff 993127 0 2026-03-09T00:03:26.458 INFO:tasks.workunit.client.1.vm06.stdout:5/67: chown d5/f7 9819 1 2026-03-09T00:03:26.466 INFO:tasks.workunit.client.1.vm06.stdout:7/62: dread d0/fa [0,4194304] 0 2026-03-09T00:03:26.478 INFO:tasks.workunit.client.1.vm06.stdout:6/61: sync 2026-03-09T00:03:26.482 INFO:tasks.workunit.client.1.vm06.stdout:6/62: mkdir d4/d11 0 2026-03-09T00:03:26.482 INFO:tasks.workunit.client.1.vm06.stdout:6/63: rmdir d4/d11 0 2026-03-09T00:03:26.485 INFO:tasks.workunit.client.1.vm06.stdout:6/64: write f1 [456219,31858] 0 2026-03-09T00:03:26.485 INFO:tasks.workunit.client.1.vm06.stdout:6/65: write f1 [4257489,83919] 0 2026-03-09T00:03:26.486 
INFO:tasks.workunit.client.1.vm06.stdout:6/66: readlink d4/le 0 2026-03-09T00:03:26.489 INFO:tasks.workunit.client.1.vm06.stdout:6/67: creat d4/f12 x:0 0 0 2026-03-09T00:03:26.492 INFO:tasks.workunit.client.1.vm06.stdout:5/68: dread d5/f7 [0,4194304] 0 2026-03-09T00:03:26.493 INFO:tasks.workunit.client.1.vm06.stdout:3/24: sync 2026-03-09T00:03:26.493 INFO:tasks.workunit.client.1.vm06.stdout:0/83: getdents d3 0 2026-03-09T00:03:26.499 INFO:tasks.workunit.client.1.vm06.stdout:0/84: rename d3/d1d to d3/d18/d1f 0 2026-03-09T00:03:26.501 INFO:tasks.workunit.client.1.vm06.stdout:0/85: mknod d3/d18/c20 0 2026-03-09T00:03:26.503 INFO:tasks.workunit.client.1.vm06.stdout:0/86: mknod d3/d18/c21 0 2026-03-09T00:03:26.503 INFO:tasks.workunit.client.1.vm06.stdout:0/87: dread - d3/f1b zero size 2026-03-09T00:03:26.503 INFO:tasks.workunit.client.1.vm06.stdout:0/88: dread - d3/f1b zero size 2026-03-09T00:03:26.504 INFO:tasks.workunit.client.1.vm06.stdout:0/89: link f1 d3/d18/f22 0 2026-03-09T00:03:26.505 INFO:tasks.workunit.client.1.vm06.stdout:0/90: mkdir d3/d18/d1f/d23 0 2026-03-09T00:03:26.505 INFO:tasks.workunit.client.1.vm06.stdout:0/91: chown d3/d18/c20 0 1 2026-03-09T00:03:26.519 INFO:tasks.workunit.client.1.vm06.stdout:8/68: dwrite db/dd/f13 [0,4194304] 0 2026-03-09T00:03:26.521 INFO:tasks.workunit.client.1.vm06.stdout:8/69: rename l6 to db/l1a 0 2026-03-09T00:03:26.522 INFO:tasks.workunit.client.1.vm06.stdout:8/70: mknod db/c1b 0 2026-03-09T00:03:26.527 INFO:tasks.workunit.client.1.vm06.stdout:8/71: dread f9 [0,4194304] 0 2026-03-09T00:03:26.527 INFO:tasks.workunit.client.1.vm06.stdout:8/72: write db/f16 [66020,96732] 0 2026-03-09T00:03:26.527 INFO:tasks.workunit.client.1.vm06.stdout:8/73: fsync f7 0 2026-03-09T00:03:26.527 INFO:tasks.workunit.client.1.vm06.stdout:8/74: creat db/dd/f1c x:0 0 0 2026-03-09T00:03:26.527 INFO:tasks.workunit.client.1.vm06.stdout:8/75: fdatasync db/dd/f1c 0 2026-03-09T00:03:26.527 INFO:tasks.workunit.client.1.vm06.stdout:8/76: creat db/f1d x:0 0 0 2026-03-09T00:03:26.531 INFO:tasks.workunit.client.1.vm06.stdout:5/69: read d5/f15 [5589863,115076] 0 2026-03-09T00:03:26.531 INFO:tasks.workunit.client.1.vm06.stdout:5/70: creat d5/f16 x:0 0 0 2026-03-09T00:03:26.531 INFO:tasks.workunit.client.1.vm06.stdout:5/71: chown c4 226349 1 2026-03-09T00:03:26.531 INFO:tasks.workunit.client.1.vm06.stdout:5/72: write d5/f15 [2777187,55423] 0 2026-03-09T00:03:26.537 INFO:tasks.workunit.client.1.vm06.stdout:5/73: link d5/c6 d5/c17 0 2026-03-09T00:03:26.537 INFO:tasks.workunit.client.1.vm06.stdout:5/74: stat c3 0 2026-03-09T00:03:26.539 INFO:tasks.workunit.client.1.vm06.stdout:5/75: rename d5/l13 to d5/l18 0 2026-03-09T00:03:26.557 INFO:tasks.workunit.client.1.vm06.stdout:9/65: dwrite d1/db/d9/f10 [0,4194304] 0 2026-03-09T00:03:26.572 INFO:tasks.workunit.client.1.vm06.stdout:1/48: sync 2026-03-09T00:03:26.576 INFO:tasks.workunit.client.1.vm06.stdout:1/49: mknod d6/c14 0 2026-03-09T00:03:26.586 INFO:tasks.workunit.client.1.vm06.stdout:1/50: unlink c1 0 2026-03-09T00:03:26.586 INFO:tasks.workunit.client.1.vm06.stdout:1/51: link d6/c14 d6/c15 0 2026-03-09T00:03:26.587 INFO:tasks.workunit.client.1.vm06.stdout:2/163: dwrite f6 [0,4194304] 0 2026-03-09T00:03:26.596 INFO:tasks.workunit.client.1.vm06.stdout:0/92: dwrite d3/f10 [0,4194304] 0 2026-03-09T00:03:26.631 INFO:tasks.workunit.client.1.vm06.stdout:8/77: dwrite db/dd/f1c [0,4194304] 0 2026-03-09T00:03:26.631 INFO:tasks.workunit.client.1.vm06.stdout:3/25: dwrite f7 [0,4194304] 0 2026-03-09T00:03:26.634 
INFO:tasks.workunit.client.1.vm06.stdout:8/78: unlink db/lf 0 2026-03-09T00:03:26.634 INFO:tasks.workunit.client.1.vm06.stdout:8/79: dread - db/f17 zero size 2026-03-09T00:03:26.634 INFO:tasks.workunit.client.1.vm06.stdout:8/80: write db/f16 [259154,25751] 0 2026-03-09T00:03:26.637 INFO:tasks.workunit.client.1.vm06.stdout:3/26: link f3 f8 0 2026-03-09T00:03:26.637 INFO:tasks.workunit.client.1.vm06.stdout:3/27: creat f9 x:0 0 0 2026-03-09T00:03:26.637 INFO:tasks.workunit.client.1.vm06.stdout:3/28: write f9 [585836,60841] 0 2026-03-09T00:03:26.637 INFO:tasks.workunit.client.1.vm06.stdout:3/29: fdatasync f6 0 2026-03-09T00:03:26.637 INFO:tasks.workunit.client.1.vm06.stdout:3/30: creat fa x:0 0 0 2026-03-09T00:03:26.638 INFO:tasks.workunit.client.1.vm06.stdout:8/81: mkdir db/d1e 0 2026-03-09T00:03:26.649 INFO:tasks.workunit.client.1.vm06.stdout:8/82: creat db/dd/f1f x:0 0 0 2026-03-09T00:03:26.649 INFO:tasks.workunit.client.1.vm06.stdout:8/83: creat db/d1e/f20 x:0 0 0 2026-03-09T00:03:26.651 INFO:tasks.workunit.client.1.vm06.stdout:6/68: dwrite f1 [4194304,4194304] 0 2026-03-09T00:03:26.652 INFO:tasks.workunit.client.1.vm06.stdout:8/84: write db/dd/f13 [2155356,66972] 0 2026-03-09T00:03:26.654 INFO:tasks.workunit.client.1.vm06.stdout:6/69: mknod d4/c13 0 2026-03-09T00:03:26.658 INFO:tasks.workunit.client.1.vm06.stdout:8/85: link c2 db/c21 0 2026-03-09T00:03:26.664 INFO:tasks.workunit.client.1.vm06.stdout:8/86: chown db/dd 24 1 2026-03-09T00:03:26.664 INFO:tasks.workunit.client.1.vm06.stdout:6/70: mknod d4/c14 0 2026-03-09T00:03:26.664 INFO:tasks.workunit.client.1.vm06.stdout:6/71: fdatasync d4/f12 0 2026-03-09T00:03:26.664 INFO:tasks.workunit.client.1.vm06.stdout:6/72: write d4/f6 [1191452,104058] 0 2026-03-09T00:03:26.664 INFO:tasks.workunit.client.1.vm06.stdout:6/73: write d4/f12 [338421,123955] 0 2026-03-09T00:03:26.667 INFO:tasks.workunit.client.1.vm06.stdout:8/87: write f3 [1879642,122925] 0 2026-03-09T00:03:26.675 INFO:tasks.workunit.client.1.vm06.stdout:8/88: mknod db/dd/c22 0 2026-03-09T00:03:26.675 INFO:tasks.workunit.client.1.vm06.stdout:8/89: creat db/d1e/f23 x:0 0 0 2026-03-09T00:03:26.675 INFO:tasks.workunit.client.1.vm06.stdout:8/90: write f5 [434271,90753] 0 2026-03-09T00:03:26.675 INFO:tasks.workunit.client.1.vm06.stdout:8/91: truncate db/d1e/f20 490928 0 2026-03-09T00:03:26.678 INFO:tasks.workunit.client.1.vm06.stdout:8/92: unlink db/lc 0 2026-03-09T00:03:26.680 INFO:tasks.workunit.client.1.vm06.stdout:8/93: mkdir db/dd/d24 0 2026-03-09T00:03:26.694 INFO:tasks.workunit.client.1.vm06.stdout:9/66: dwrite d1/d4/fe [0,4194304] 0 2026-03-09T00:03:26.694 INFO:tasks.workunit.client.1.vm06.stdout:2/164: dwrite d7/d1a/f30 [0,4194304] 0 2026-03-09T00:03:26.694 INFO:tasks.workunit.client.1.vm06.stdout:2/165: chown d7/da/d1c/f29 220731 1 2026-03-09T00:03:26.704 INFO:tasks.workunit.client.1.vm06.stdout:0/93: dwrite d3/d18/f22 [0,4194304] 0 2026-03-09T00:03:26.705 INFO:tasks.workunit.client.1.vm06.stdout:2/166: truncate f3 3760187 0 2026-03-09T00:03:26.705 INFO:tasks.workunit.client.1.vm06.stdout:2/167: chown d7/f17 3330463 1 2026-03-09T00:03:26.706 INFO:tasks.workunit.client.1.vm06.stdout:5/76: rmdir d5 39 2026-03-09T00:03:26.707 INFO:tasks.workunit.client.1.vm06.stdout:9/67: getdents d1/db/d9 0 2026-03-09T00:03:26.708 INFO:tasks.workunit.client.1.vm06.stdout:0/94: rmdir d3/d18 39 2026-03-09T00:03:26.710 INFO:tasks.workunit.client.1.vm06.stdout:2/168: dread d7/f8 [0,4194304] 0 2026-03-09T00:03:26.713 INFO:tasks.workunit.client.1.vm06.stdout:5/77: creat d5/f19 x:0 0 0 
2026-03-09T00:03:26.713 INFO:tasks.workunit.client.1.vm06.stdout:5/78: dread - d5/fe zero size 2026-03-09T00:03:26.714 INFO:tasks.workunit.client.1.vm06.stdout:2/169: write f2 [1627614,89756] 0 2026-03-09T00:03:26.717 INFO:tasks.workunit.client.1.vm06.stdout:4/73: dwrite f5 [0,4194304] 0 2026-03-09T00:03:26.723 INFO:tasks.workunit.client.1.vm06.stdout:4/74: stat f1 0 2026-03-09T00:03:26.727 INFO:tasks.workunit.client.1.vm06.stdout:0/95: mknod d3/d18/d1f/c24 0 2026-03-09T00:03:26.727 INFO:tasks.workunit.client.1.vm06.stdout:5/79: link d5/l11 d5/l1a 0 2026-03-09T00:03:26.727 INFO:tasks.workunit.client.1.vm06.stdout:5/80: chown d5/f14 7215849 1 2026-03-09T00:03:26.728 INFO:tasks.workunit.client.1.vm06.stdout:2/170: truncate d7/d1b/f22 2639390 0 2026-03-09T00:03:26.728 INFO:tasks.workunit.client.1.vm06.stdout:2/171: write d7/d1a/f2f [303709,25584] 0 2026-03-09T00:03:26.728 INFO:tasks.workunit.client.1.vm06.stdout:2/172: dread - d7/da/f20 zero size 2026-03-09T00:03:26.750 INFO:tasks.workunit.client.1.vm06.stdout:0/96: creat d3/d18/f25 x:0 0 0 2026-03-09T00:03:26.751 INFO:tasks.workunit.client.1.vm06.stdout:2/173: mkdir d7/d1b/d31 0 2026-03-09T00:03:26.753 INFO:tasks.workunit.client.1.vm06.stdout:2/174: creat d7/da/db/de/f32 x:0 0 0 2026-03-09T00:03:26.755 INFO:tasks.workunit.client.1.vm06.stdout:3/31: dwrite fa [0,4194304] 0 2026-03-09T00:03:26.766 INFO:tasks.workunit.client.1.vm06.stdout:2/175: creat d7/d1a/d25/f33 x:0 0 0 2026-03-09T00:03:26.766 INFO:tasks.workunit.client.1.vm06.stdout:2/176: dread - d7/da/f20 zero size 2026-03-09T00:03:26.766 INFO:tasks.workunit.client.1.vm06.stdout:3/32: write f9 [422162,48981] 0 2026-03-09T00:03:26.766 INFO:tasks.workunit.client.1.vm06.stdout:2/177: mknod d7/c34 0 2026-03-09T00:03:26.766 INFO:tasks.workunit.client.1.vm06.stdout:2/178: rename d7/da/d1c/l21 to d7/da/db/l35 0 2026-03-09T00:03:26.766 INFO:tasks.workunit.client.1.vm06.stdout:1/52: rmdir d6 39 2026-03-09T00:03:26.768 INFO:tasks.workunit.client.1.vm06.stdout:3/33: dread f3 [0,4194304] 0 2026-03-09T00:03:26.768 INFO:tasks.workunit.client.1.vm06.stdout:2/179: mknod d7/d1b/d31/c36 0 2026-03-09T00:03:26.768 INFO:tasks.workunit.client.1.vm06.stdout:2/180: write d7/da/d1c/f1f [660843,75494] 0 2026-03-09T00:03:26.768 INFO:tasks.workunit.client.1.vm06.stdout:2/181: chown d7/da/l24 89 1 2026-03-09T00:03:26.779 INFO:tasks.workunit.client.1.vm06.stdout:9/68: dwrite d1/d4/ff [0,4194304] 0 2026-03-09T00:03:26.779 INFO:tasks.workunit.client.1.vm06.stdout:9/69: chown d1/d3/d12 12 1 2026-03-09T00:03:26.783 INFO:tasks.workunit.client.1.vm06.stdout:1/53: write d6/fa [481886,15079] 0 2026-03-09T00:03:26.814 INFO:tasks.workunit.client.1.vm06.stdout:4/75: dwrite fe [0,4194304] 0 2026-03-09T00:03:26.814 INFO:tasks.workunit.client.1.vm06.stdout:4/76: readlink l8 0 2026-03-09T00:03:26.818 INFO:tasks.workunit.client.1.vm06.stdout:4/77: read fe [811705,47799] 0 2026-03-09T00:03:26.825 INFO:tasks.workunit.client.1.vm06.stdout:4/78: readlink l2 0 2026-03-09T00:03:26.825 INFO:tasks.workunit.client.1.vm06.stdout:4/79: creat f10 x:0 0 0 2026-03-09T00:03:26.825 INFO:tasks.workunit.client.1.vm06.stdout:4/80: symlink l11 0 2026-03-09T00:03:26.831 INFO:tasks.workunit.client.1.vm06.stdout:4/81: dread f5 [0,4194304] 0 2026-03-09T00:03:26.851 INFO:tasks.workunit.client.1.vm06.stdout:6/74: rmdir d4 39 2026-03-09T00:03:26.853 INFO:tasks.workunit.client.1.vm06.stdout:6/75: mknod d4/c15 0 2026-03-09T00:03:26.854 INFO:tasks.workunit.client.1.vm06.stdout:5/81: dwrite d5/f7 [0,4194304] 0 2026-03-09T00:03:26.865 
INFO:tasks.workunit.client.1.vm06.stdout:5/82: rename d5/f7 to d5/f1b 0 2026-03-09T00:03:26.870 INFO:tasks.workunit.client.1.vm06.stdout:5/83: mkdir d5/d1c 0 2026-03-09T00:03:26.870 INFO:tasks.workunit.client.1.vm06.stdout:5/84: chown d5 0 1 2026-03-09T00:03:26.870 INFO:tasks.workunit.client.1.vm06.stdout:5/85: truncate d5/fe 796997 0 2026-03-09T00:03:26.870 INFO:tasks.workunit.client.1.vm06.stdout:5/86: chown d5/d1c 354 1 2026-03-09T00:03:26.872 INFO:tasks.workunit.client.1.vm06.stdout:5/87: unlink d5/c10 0 2026-03-09T00:03:26.872 INFO:tasks.workunit.client.1.vm06.stdout:5/88: dread - d5/f14 zero size 2026-03-09T00:03:26.872 INFO:tasks.workunit.client.1.vm06.stdout:5/89: dread d5/ff [0,4194304] 0 2026-03-09T00:03:26.872 INFO:tasks.workunit.client.1.vm06.stdout:5/90: creat d5/f1d x:0 0 0 2026-03-09T00:03:26.872 INFO:tasks.workunit.client.1.vm06.stdout:5/91: write d5/f1d [761505,105218] 0 2026-03-09T00:03:26.872 INFO:tasks.workunit.client.1.vm06.stdout:5/92: fdatasync d5/ff 0 2026-03-09T00:03:26.872 INFO:tasks.workunit.client.1.vm06.stdout:5/93: dread d5/ff [0,4194304] 0 2026-03-09T00:03:26.872 INFO:tasks.workunit.client.1.vm06.stdout:5/94: chown d5/f19 875 1 2026-03-09T00:03:26.876 INFO:tasks.workunit.client.1.vm06.stdout:5/95: unlink c3 0 2026-03-09T00:03:26.897 INFO:tasks.workunit.client.1.vm06.stdout:3/34: dwrite f6 [0,4194304] 0 2026-03-09T00:03:26.901 INFO:tasks.workunit.client.1.vm06.stdout:1/54: dwrite d6/f7 [0,4194304] 0 2026-03-09T00:03:26.901 INFO:tasks.workunit.client.1.vm06.stdout:1/55: read d6/fa [473737,59007] 0 2026-03-09T00:03:26.902 INFO:tasks.workunit.client.1.vm06.stdout:8/94: dwrite db/f16 [0,4194304] 0 2026-03-09T00:03:26.906 INFO:tasks.workunit.client.1.vm06.stdout:3/35: getdents . 0 2026-03-09T00:03:26.906 INFO:tasks.workunit.client.1.vm06.stdout:1/56: mknod d6/c16 0 2026-03-09T00:03:26.910 INFO:tasks.workunit.client.1.vm06.stdout:9/70: dwrite d1/db/f7 [0,4194304] 0 2026-03-09T00:03:26.912 INFO:tasks.workunit.client.1.vm06.stdout:9/71: dread d1/db/d9/fc [0,4194304] 0 2026-03-09T00:03:26.916 INFO:tasks.workunit.client.1.vm06.stdout:9/72: write d1/db/f8 [3603832,113322] 0 2026-03-09T00:03:26.918 INFO:tasks.workunit.client.1.vm06.stdout:8/95: dread db/dd/fe [0,4194304] 0 2026-03-09T00:03:26.924 INFO:tasks.workunit.client.1.vm06.stdout:9/73: write d1/d4/fe [4142944,32877] 0 2026-03-09T00:03:26.924 INFO:tasks.workunit.client.1.vm06.stdout:9/74: fdatasync d1/d3/f11 0 2026-03-09T00:03:26.928 INFO:tasks.workunit.client.1.vm06.stdout:3/36: mknod cb 0 2026-03-09T00:03:26.930 INFO:tasks.workunit.client.1.vm06.stdout:3/37: dread f8 [0,4194304] 0 2026-03-09T00:03:26.930 INFO:tasks.workunit.client.1.vm06.stdout:9/75: truncate d1/db/d9/f10 1981281 0 2026-03-09T00:03:26.932 INFO:tasks.workunit.client.1.vm06.stdout:3/38: dread f9 [0,4194304] 0 2026-03-09T00:03:26.933 INFO:tasks.workunit.client.1.vm06.stdout:7/63: sync 2026-03-09T00:03:26.933 INFO:tasks.workunit.client.1.vm06.stdout:3/39: mknod cc 0 2026-03-09T00:03:26.934 INFO:tasks.workunit.client.1.vm06.stdout:0/97: fsync d3/d18/f25 0 2026-03-09T00:03:26.935 INFO:tasks.workunit.client.1.vm06.stdout:7/64: symlink d0/db/l11 0 2026-03-09T00:03:26.935 INFO:tasks.workunit.client.1.vm06.stdout:7/65: write d0/f5 [2112826,40354] 0 2026-03-09T00:03:26.936 INFO:tasks.workunit.client.1.vm06.stdout:0/98: creat d3/d18/d1f/f26 x:0 0 0 2026-03-09T00:03:26.936 INFO:tasks.workunit.client.1.vm06.stdout:0/99: dread - d3/d18/f14 zero size 2026-03-09T00:03:26.937 INFO:tasks.workunit.client.1.vm06.stdout:7/66: rmdir d0/db 39 2026-03-09T00:03:26.947 
INFO:tasks.workunit.client.1.vm06.stdout:0/100: symlink d3/d18/l27 0
2026-03-09T00:03:26.947 INFO:tasks.workunit.client.1.vm06.stdout:0/101: dread - d3/f1e zero size
2026-03-09T00:03:26.949 INFO:tasks.workunit.client.1.vm06.stdout:7/67: truncate d0/fa 3062307 0
2026-03-09T00:03:26.951 INFO:tasks.workunit.client.1.vm06.stdout:0/102: mkdir d3/d18/d28 0
2026-03-09T00:03:26.952 INFO:tasks.workunit.client.1.vm06.stdout:0/103: chown f1 3771 1
2026-03-09T00:03:26.952 INFO:tasks.workunit.client.1.vm06.stdout:0/104: dread - d3/f1c zero size
2026-03-09T00:03:26.952 INFO:tasks.workunit.client.1.vm06.stdout:0/105: readlink d3/l15 0
2026-03-09T00:03:26.952 INFO:tasks.workunit.client.1.vm06.stdout:0/106: truncate d3/f1b 655720 0
2026-03-09T00:03:26.952 INFO:tasks.workunit.client.1.vm06.stdout:0/107: fsync d3/d18/f22 0
2026-03-09T00:03:26.956 INFO:tasks.workunit.client.1.vm06.stdout:0/108: dread d3/f17 [0,4194304] 0
2026-03-09T00:03:26.956 INFO:tasks.workunit.client.1.vm06.stdout:7/68: mknod d0/db/c12 0
2026-03-09T00:03:26.964 INFO:tasks.workunit.client.1.vm06.stdout:0/109: write d3/f7 [1982655,9698] 0
2026-03-09T00:03:26.964 INFO:tasks.workunit.client.1.vm06.stdout:0/110: chown d3/d18/l27 44752 1
2026-03-09T00:03:26.965 INFO:tasks.workunit.client.1.vm06.stdout:7/69: creat d0/df/f13 x:0 0 0
2026-03-09T00:03:26.965 INFO:tasks.workunit.client.1.vm06.stdout:7/70: fdatasync d0/f6 0
2026-03-09T00:03:26.967 INFO:tasks.workunit.client.1.vm06.stdout:4/82: dwrite f1 [8388608,4194304] 0
2026-03-09T00:03:26.977 INFO:tasks.workunit.client.1.vm06.stdout:0/111: rmdir d3 39
2026-03-09T00:03:26.977 INFO:tasks.workunit.client.1.vm06.stdout:0/112: dread d3/f17 [0,4194304] 0
2026-03-09T00:03:26.977 INFO:tasks.workunit.client.1.vm06.stdout:0/113: read - d3/f1c zero size
2026-03-09T00:03:26.977 INFO:tasks.workunit.client.1.vm06.stdout:0/114: write d3/d18/f14 [793888,76854] 0
2026-03-09T00:03:26.977 INFO:tasks.workunit.client.1.vm06.stdout:0/115: chown d3/f1b 74 1
2026-03-09T00:03:26.977 INFO:tasks.workunit.client.1.vm06.stdout:4/83: unlink c4 0
2026-03-09T00:03:26.977 INFO:tasks.workunit.client.1.vm06.stdout:4/84: write fe [2696866,19294] 0
2026-03-09T00:03:27.023 INFO:tasks.workunit.client.1.vm06.stdout:1/57: dwrite d6/fa [0,4194304] 0
2026-03-09T00:03:27.027 INFO:tasks.workunit.client.1.vm06.stdout:5/96: dwrite d5/f1b [4194304,4194304] 0
2026-03-09T00:03:27.027 INFO:tasks.workunit.client.1.vm06.stdout:6/76: dwrite d4/f5 [4194304,4194304] 0
2026-03-09T00:03:27.027 INFO:tasks.workunit.client.1.vm06.stdout:5/97: dread d5/ff [0,4194304] 0
2026-03-09T00:03:27.027 INFO:tasks.workunit.client.1.vm06.stdout:5/98: truncate d5/f14 868851 0
2026-03-09T00:03:27.028 INFO:tasks.workunit.client.1.vm06.stdout:5/99: chown d5/f16 149 1
2026-03-09T00:03:27.028 INFO:tasks.workunit.client.1.vm06.stdout:5/100: fsync d5/f15 0
2026-03-09T00:03:27.029 INFO:tasks.workunit.client.1.vm06.stdout:6/77: dread d4/f6 [0,4194304] 0
2026-03-09T00:03:27.034 INFO:tasks.workunit.client.1.vm06.stdout:5/101: rename d5/lb to d5/l1e 0
2026-03-09T00:03:27.035 INFO:tasks.workunit.client.1.vm06.stdout:5/102: dread d5/fe [0,4194304] 0
2026-03-09T00:03:27.036 INFO:tasks.workunit.client.1.vm06.stdout:6/78: mkdir d4/d16 0
2026-03-09T00:03:27.037 INFO:tasks.workunit.client.1.vm06.stdout:6/79: mknod d4/d16/c17 0
2026-03-09T00:03:27.037 INFO:tasks.workunit.client.1.vm06.stdout:6/80: rmdir d4/d16 39
2026-03-09T00:03:27.037 INFO:tasks.workunit.client.1.vm06.stdout:6/81: chown d4/f5 785 1
2026-03-09T00:03:27.038 INFO:tasks.workunit.client.1.vm06.stdout:6/82: mknod d4/d16/c18 0
2026-03-09T00:03:27.038 INFO:tasks.workunit.client.1.vm06.stdout:6/83: fdatasync d4/f12 0
2026-03-09T00:03:27.038 INFO:tasks.workunit.client.1.vm06.stdout:6/84: write d4/ff [1218930,38263] 0
2026-03-09T00:03:27.038 INFO:tasks.workunit.client.1.vm06.stdout:6/85: fdatasync f1 0
2026-03-09T00:03:27.041 INFO:tasks.workunit.client.1.vm06.stdout:5/103: write d5/f1b [755731,45257] 0
2026-03-09T00:03:27.051 INFO:tasks.workunit.client.1.vm06.stdout:3/40: dwrite f7 [0,4194304] 0
2026-03-09T00:03:27.051 INFO:tasks.workunit.client.1.vm06.stdout:3/41: chown l5 239428727 1
2026-03-09T00:03:27.054 INFO:tasks.workunit.client.1.vm06.stdout:0/116: dwrite d3/d18/f22 [0,4194304] 0
2026-03-09T00:03:27.062 INFO:tasks.workunit.client.1.vm06.stdout:5/104: unlink d5/cc 0
2026-03-09T00:03:27.062 INFO:tasks.workunit.client.1.vm06.stdout:5/105: write d5/f15 [350597,123174] 0
2026-03-09T00:03:27.062 INFO:tasks.workunit.client.1.vm06.stdout:3/42: mknod cd 0
2026-03-09T00:03:27.064 INFO:tasks.workunit.client.1.vm06.stdout:0/117: creat d3/f29 x:0 0 0
2026-03-09T00:03:27.064 INFO:tasks.workunit.client.1.vm06.stdout:0/118: write d3/d18/d1f/f26 [803333,16415] 0
2026-03-09T00:03:27.064 INFO:tasks.workunit.client.1.vm06.stdout:0/119: chown d3/f1a 0 1
2026-03-09T00:03:27.064 INFO:tasks.workunit.client.1.vm06.stdout:0/120: chown d3/d18/d1f/d23 205614 1
2026-03-09T00:03:27.065 INFO:tasks.workunit.client.1.vm06.stdout:3/43: mknod ce 0
2026-03-09T00:03:27.065 INFO:tasks.workunit.client.1.vm06.stdout:3/44: stat c0 0
2026-03-09T00:03:27.078 INFO:tasks.workunit.client.1.vm06.stdout:8/96: dwrite db/f17 [0,4194304] 0
2026-03-09T00:03:27.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:26 vm03.local ceph-mon[52346]: pgmap v129: 65 pgs: 65 active+clean; 268 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 4.4 MiB/s rd, 15 MiB/s wr, 709 op/s
2026-03-09T00:03:27.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:26 vm03.local ceph-mon[52346]: pgmap v130: 65 pgs: 65 active+clean; 301 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 6.2 MiB/s rd, 16 MiB/s wr, 592 op/s
2026-03-09T00:03:27.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:26 vm03.local ceph-mon[52346]: Upgrade: Updating mgr.vm06.rzcvhn
2026-03-09T00:03:27.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:26 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:27.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:26 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.rzcvhn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T00:03:27.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:26 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T00:03:27.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:26 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:03:27.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:26 vm03.local ceph-mon[52346]: Deploying daemon mgr.vm06.rzcvhn on vm06
2026-03-09T00:03:27.119 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:26 vm06.local ceph-mon[58395]: pgmap v129: 65 pgs: 65 active+clean; 268 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 4.4 MiB/s rd, 15 MiB/s wr, 709 op/s
2026-03-09T00:03:27.119 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:26 vm06.local ceph-mon[58395]: pgmap v130: 65 pgs: 65 active+clean; 301 MiB data, 1.9 GiB used, 118 GiB / 120 GiB avail; 6.2 MiB/s rd, 16 MiB/s wr, 592 op/s
2026-03-09T00:03:27.119 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:26 vm06.local ceph-mon[58395]: Upgrade: Updating mgr.vm06.rzcvhn
2026-03-09T00:03:27.119 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:26 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:27.119 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:26 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.rzcvhn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T00:03:27.119 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:26 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T00:03:27.119 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:26 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:03:27.119 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:26 vm06.local ceph-mon[58395]: Deploying daemon mgr.vm06.rzcvhn on vm06
2026-03-09T00:03:27.130 INFO:tasks.workunit.client.1.vm06.stdout:3/45: dwrite fa [0,4194304] 0
2026-03-09T00:03:27.130 INFO:tasks.workunit.client.1.vm06.stdout:9/76: dwrite d1/db/f7 [0,4194304] 0
2026-03-09T00:03:27.131 INFO:tasks.workunit.client.1.vm06.stdout:4/85: dwrite fb [0,4194304] 0
2026-03-09T00:03:27.135 INFO:tasks.workunit.client.1.vm06.stdout:3/46: creat ff x:0 0 0
2026-03-09T00:03:27.135 INFO:tasks.workunit.client.1.vm06.stdout:3/47: truncate ff 601252 0
2026-03-09T00:03:27.135 INFO:tasks.workunit.client.1.vm06.stdout:3/48: creat f10 x:0 0 0
2026-03-09T00:03:27.145 INFO:tasks.workunit.client.1.vm06.stdout:3/49: write f7 [3357451,62455] 0
2026-03-09T00:03:27.150 INFO:tasks.workunit.client.1.vm06.stdout:3/50: write f6 [582258,52218] 0
2026-03-09T00:03:27.150 INFO:tasks.workunit.client.1.vm06.stdout:3/51: write ff [1302154,90560] 0
2026-03-09T00:03:27.154 INFO:tasks.workunit.client.1.vm06.stdout:7/71: dwrite d0/f5 [0,4194304] 0
2026-03-09T00:03:27.154 INFO:tasks.workunit.client.1.vm06.stdout:7/72: creat d0/f14 x:0 0 0
2026-03-09T00:03:27.155 INFO:tasks.workunit.client.1.vm06.stdout:3/52: mkdir d11 0
2026-03-09T00:03:27.156 INFO:tasks.workunit.client.1.vm06.stdout:0/121: dwrite d3/f1b [0,4194304] 0
2026-03-09T00:03:27.162 INFO:tasks.workunit.client.1.vm06.stdout:1/58: getdents d6 0
2026-03-09T00:03:27.162 INFO:tasks.workunit.client.1.vm06.stdout:1/59: readlink l2 0
2026-03-09T00:03:27.162 INFO:tasks.workunit.client.1.vm06.stdout:3/53: unlink cd 0
2026-03-09T00:03:27.165 INFO:tasks.workunit.client.1.vm06.stdout:8/97: truncate db/f16 1550497 0
2026-03-09T00:03:27.165 INFO:tasks.workunit.client.1.vm06.stdout:1/60: symlink d6/l17 0
2026-03-09T00:03:27.165 INFO:tasks.workunit.client.1.vm06.stdout:1/61: dread - d6/ff zero size
2026-03-09T00:03:27.167 INFO:tasks.workunit.client.1.vm06.stdout:3/54: creat d11/f12 x:0 0 0
2026-03-09T00:03:27.167 INFO:tasks.workunit.client.1.vm06.stdout:3/55: write f6 [3556385,44149] 0
2026-03-09T00:03:27.167 INFO:tasks.workunit.client.1.vm06.stdout:3/56: dread - f10 zero size
2026-03-09T00:03:27.171 INFO:tasks.workunit.client.1.vm06.stdout:1/62: write d6/fa [3385652,77779] 0
2026-03-09T00:03:27.173 INFO:tasks.workunit.client.1.vm06.stdout:0/122: read d3/f1b [3561328,92964] 0
2026-03-09T00:03:27.173 INFO:tasks.workunit.client.1.vm06.stdout:0/123: chown d3/c12 2984724 1
2026-03-09T00:03:27.173 INFO:tasks.workunit.client.1.vm06.stdout:9/77: truncate d1/d4/fe 1472449 0
2026-03-09T00:03:27.174 INFO:tasks.workunit.client.1.vm06.stdout:8/98: dread db/dd/fe [0,4194304] 0
2026-03-09T00:03:27.174 INFO:tasks.workunit.client.1.vm06.stdout:8/99: creat db/d1e/f25 x:0 0 0
2026-03-09T00:03:27.174 INFO:tasks.workunit.client.1.vm06.stdout:9/78: read d1/d4/f6 [1769529,74061] 0
2026-03-09T00:03:27.174 INFO:tasks.workunit.client.1.vm06.stdout:9/79: stat d1/d3/d12/c15 0
2026-03-09T00:03:27.175 INFO:tasks.workunit.client.1.vm06.stdout:3/57: rename f6 to d11/f13 0
2026-03-09T00:03:27.177 INFO:tasks.workunit.client.1.vm06.stdout:1/63: symlink d6/l18 0
2026-03-09T00:03:27.177 INFO:tasks.workunit.client.1.vm06.stdout:1/64: readlink d6/l17 0
2026-03-09T00:03:27.179 INFO:tasks.workunit.client.1.vm06.stdout:0/124: link d3/c8 d3/d18/d28/c2a 0
2026-03-09T00:03:27.181 INFO:tasks.workunit.client.1.vm06.stdout:8/100: chown db/dd/c10 1007 1
2026-03-09T00:03:27.181 INFO:tasks.workunit.client.1.vm06.stdout:8/101: chown f3 6271 1
2026-03-09T00:03:27.182 INFO:tasks.workunit.client.1.vm06.stdout:8/102: fdatasync db/f16 0
2026-03-09T00:03:27.183 INFO:tasks.workunit.client.1.vm06.stdout:9/80: link d1/db/d9/fc d1/f16 0
2026-03-09T00:03:27.184 INFO:tasks.workunit.client.1.vm06.stdout:1/65: rename d6/fd to d6/f19 0
2026-03-09T00:03:27.198 INFO:tasks.workunit.client.1.vm06.stdout:0/125: link d3/fa d3/d18/d28/f2b 0
2026-03-09T00:03:27.226 INFO:tasks.workunit.client.1.vm06.stdout:9/81: dread d1/db/d9/f10 [0,4194304] 0
2026-03-09T00:03:27.226 INFO:tasks.workunit.client.1.vm06.stdout:9/82: readlink d1/d3/ld 0
2026-03-09T00:03:27.228 INFO:tasks.workunit.client.1.vm06.stdout:0/126: mkdir d3/d18/d2c 0
2026-03-09T00:03:27.228 INFO:tasks.workunit.client.1.vm06.stdout:0/127: readlink d3/l15 0
2026-03-09T00:03:27.234 INFO:tasks.workunit.client.1.vm06.stdout:1/66: dwrite d6/f9 [0,4194304] 0
2026-03-09T00:03:27.236 INFO:tasks.workunit.client.1.vm06.stdout:9/83: mknod d1/db/d9/c17 0
2026-03-09T00:03:27.236 INFO:tasks.workunit.client.1.vm06.stdout:7/73: fsync d0/df/f13 0
2026-03-09T00:03:27.242 INFO:tasks.workunit.client.1.vm06.stdout:0/128: dread d3/d18/d1f/f26 [0,4194304] 0
2026-03-09T00:03:27.245 INFO:tasks.workunit.client.1.vm06.stdout:7/74: unlink d0/db/l11 0
2026-03-09T00:03:27.245 INFO:tasks.workunit.client.1.vm06.stdout:7/75: stat d0 0
2026-03-09T00:03:27.245 INFO:tasks.workunit.client.1.vm06.stdout:7/76: creat d0/f15 x:0 0 0
2026-03-09T00:03:27.245 INFO:tasks.workunit.client.1.vm06.stdout:7/77: chown d0/f2 992 1
2026-03-09T00:03:27.245 INFO:tasks.workunit.client.1.vm06.stdout:7/78: rename d0 to d0/df/d16 22
2026-03-09T00:03:27.245 INFO:tasks.workunit.client.1.vm06.stdout:7/79: write d0/f15 [13549,27417] 0
2026-03-09T00:03:27.248 INFO:tasks.workunit.client.1.vm06.stdout:0/129: mkdir d3/d18/d2c/d2d 0
2026-03-09T00:03:27.252 INFO:tasks.workunit.client.1.vm06.stdout:7/80: fdatasync d0/f15 0
2026-03-09T00:03:27.252 INFO:tasks.workunit.client.1.vm06.stdout:7/81: chown d0/fa 13 1
2026-03-09T00:03:27.253 INFO:tasks.workunit.client.1.vm06.stdout:7/82: mkdir d0/df/d17 0
2026-03-09T00:03:27.253 INFO:tasks.workunit.client.1.vm06.stdout:7/83: stat d0/df 0
2026-03-09T00:03:27.255 INFO:tasks.workunit.client.1.vm06.stdout:7/84: creat d0/db/f18 x:0 0 0
2026-03-09T00:03:27.256 INFO:tasks.workunit.client.1.vm06.stdout:7/85: read d0/f15 [8703,34821] 0
2026-03-09T00:03:27.265 INFO:tasks.workunit.client.1.vm06.stdout:8/103: dwrite db/dd/fe [0,4194304] 0
2026-03-09T00:03:27.268 INFO:tasks.workunit.client.1.vm06.stdout:8/104: symlink db/l26 0
2026-03-09T00:03:27.319 INFO:tasks.workunit.client.1.vm06.stdout:7/86: dwrite d0/f5 [0,4194304] 0
2026-03-09T00:03:27.319 INFO:tasks.workunit.client.1.vm06.stdout:7/87: write d0/f7 [1000658,60284] 0
2026-03-09T00:03:27.321 INFO:tasks.workunit.client.1.vm06.stdout:7/88: mknod d0/c19 0
2026-03-09T00:03:27.321 INFO:tasks.workunit.client.1.vm06.stdout:7/89: readlink d0/df/l10 0
2026-03-09T00:03:27.322 INFO:tasks.workunit.client.1.vm06.stdout:0/130: dwrite d3/d18/d28/f2b [0,4194304] 0
2026-03-09T00:03:27.322 INFO:tasks.workunit.client.1.vm06.stdout:0/131: truncate d3/f29 801146 0
2026-03-09T00:03:27.326 INFO:tasks.workunit.client.1.vm06.stdout:7/90: write d0/f5 [2865891,57325] 0
2026-03-09T00:03:27.327 INFO:tasks.workunit.client.1.vm06.stdout:7/91: fsync d0/fe 0
2026-03-09T00:03:27.327 INFO:tasks.workunit.client.1.vm06.stdout:7/92: stat d0/f15 0
2026-03-09T00:03:27.327 INFO:tasks.workunit.client.1.vm06.stdout:0/132: rename d3/l15 to d3/d18/d1f/l2e 0
2026-03-09T00:03:27.329 INFO:tasks.workunit.client.1.vm06.stdout:7/93: unlink d0/f15 0
2026-03-09T00:03:27.329 INFO:tasks.workunit.client.1.vm06.stdout:7/94: stat d0/df/d17 0
2026-03-09T00:03:27.329 INFO:tasks.workunit.client.1.vm06.stdout:7/95: readlink d0/df/l10 0
2026-03-09T00:03:27.331 INFO:tasks.workunit.client.1.vm06.stdout:2/182: sync
2026-03-09T00:03:27.331 INFO:tasks.workunit.client.1.vm06.stdout:2/183: chown d7/d1b/f22 1986208056 1
2026-03-09T00:03:27.331 INFO:tasks.workunit.client.1.vm06.stdout:2/184: creat d7/d1b/f37 x:0 0 0
2026-03-09T00:03:27.332 INFO:tasks.workunit.client.1.vm06.stdout:3/58: dwrite f9 [0,4194304] 0
2026-03-09T00:03:27.332 INFO:tasks.workunit.client.1.vm06.stdout:3/59: chown ce 616 1
2026-03-09T00:03:27.342 INFO:tasks.workunit.client.1.vm06.stdout:7/96: dread d0/f5 [0,4194304] 0
2026-03-09T00:03:27.342 INFO:tasks.workunit.client.1.vm06.stdout:7/97: fsync d0/f2 0
2026-03-09T00:03:27.344 INFO:tasks.workunit.client.1.vm06.stdout:7/98: truncate d0/fe 791182 0
2026-03-09T00:03:27.345 INFO:tasks.workunit.client.1.vm06.stdout:7/99: mkdir d0/df/d1a 0
2026-03-09T00:03:27.346 INFO:tasks.workunit.client.1.vm06.stdout:7/100: mknod d0/df/d17/c1b 0
2026-03-09T00:03:27.346 INFO:tasks.workunit.client.1.vm06.stdout:7/101: chown d0/db/f18 22 1
2026-03-09T00:03:27.350 INFO:tasks.workunit.client.1.vm06.stdout:7/102: dread d0/f5 [0,4194304] 0
2026-03-09T00:03:27.354 INFO:tasks.workunit.client.1.vm06.stdout:5/106: getdents d5 0
2026-03-09T00:03:27.354 INFO:tasks.workunit.client.1.vm06.stdout:5/107: dread - d5/f16 zero size
2026-03-09T00:03:27.354 INFO:tasks.workunit.client.1.vm06.stdout:5/108: write d5/f1b [2156817,52140] 0
2026-03-09T00:03:27.362 INFO:tasks.workunit.client.1.vm06.stdout:3/60: write fa [625456,103929] 0
2026-03-09T00:03:27.364 INFO:tasks.workunit.client.1.vm06.stdout:3/61: mknod d11/c14 0
2026-03-09T00:03:27.366 INFO:tasks.workunit.client.1.vm06.stdout:3/62: symlink d11/l15 0
2026-03-09T00:03:27.370 INFO:tasks.workunit.client.1.vm06.stdout:3/63: read fa [1892070,1905] 0
2026-03-09T00:03:27.370 INFO:tasks.workunit.client.1.vm06.stdout:3/64: creat d11/f16 x:0 0 0
2026-03-09T00:03:27.370 INFO:tasks.workunit.client.1.vm06.stdout:3/65: write fa [2361410,43415] 0
2026-03-09T00:03:27.399 INFO:tasks.workunit.client.1.vm06.stdout:3/66: dwrite d11/f13 [0,4194304] 0
2026-03-09T00:03:27.402 INFO:tasks.workunit.client.1.vm06.stdout:3/67: dread f8 [0,4194304] 0
2026-03-09T00:03:27.402 INFO:tasks.workunit.client.1.vm06.stdout:3/68: chown f10 45 1
2026-03-09T00:03:27.410 INFO:tasks.workunit.client.1.vm06.stdout:1/67: dwrite f0 [0,4194304] 0
2026-03-09T00:03:27.420 INFO:tasks.workunit.client.1.vm06.stdout:4/86: truncate fb 2668021 0
2026-03-09T00:03:27.421 INFO:tasks.workunit.client.1.vm06.stdout:4/87: rename cc to c12 0
2026-03-09T00:03:27.452 INFO:tasks.workunit.client.1.vm06.stdout:2/185: dwrite d7/d1b/f22 [0,4194304] 0
2026-03-09T00:03:27.452 INFO:tasks.workunit.client.1.vm06.stdout:2/186: write d7/da/f18 [1328042,43504] 0
2026-03-09T00:03:27.454 INFO:tasks.workunit.client.1.vm06.stdout:2/187: symlink d7/d1b/d31/l38 0
2026-03-09T00:03:27.456 INFO:tasks.workunit.client.1.vm06.stdout:2/188: dread d7/f26 [0,4194304] 0
2026-03-09T00:03:27.461 INFO:tasks.workunit.client.1.vm06.stdout:3/69: dwrite f10 [0,4194304] 0
2026-03-09T00:03:27.461 INFO:tasks.workunit.client.1.vm06.stdout:3/70: chown d11/l15 2 1
2026-03-09T00:03:27.463 INFO:tasks.workunit.client.1.vm06.stdout:7/103: dwrite d0/f2 [4194304,4194304] 0
2026-03-09T00:03:27.463 INFO:tasks.workunit.client.1.vm06.stdout:7/104: chown d0/df/l10 34 1
2026-03-09T00:03:27.465 INFO:tasks.workunit.client.1.vm06.stdout:1/68: dwrite d6/ff [0,4194304] 0
2026-03-09T00:03:27.474 INFO:tasks.workunit.client.1.vm06.stdout:3/71: symlink d11/l17 0
2026-03-09T00:03:27.474 INFO:tasks.workunit.client.1.vm06.stdout:3/72: stat f8 0
2026-03-09T00:03:27.474 INFO:tasks.workunit.client.1.vm06.stdout:3/73: creat d11/f18 x:0 0 0
2026-03-09T00:03:27.474 INFO:tasks.workunit.client.1.vm06.stdout:0/133: getdents d3/d18/d28 0
2026-03-09T00:03:27.478 INFO:tasks.workunit.client.1.vm06.stdout:1/69: creat d6/f1a x:0 0 0
2026-03-09T00:03:27.484 INFO:tasks.workunit.client.1.vm06.stdout:2/189: dwrite d7/f8 [4194304,4194304] 0
2026-03-09T00:03:27.488 INFO:tasks.workunit.client.1.vm06.stdout:1/70: dread d6/f9 [0,4194304] 0
2026-03-09T00:03:27.488 INFO:tasks.workunit.client.1.vm06.stdout:1/71: creat d6/f1b x:0 0 0
2026-03-09T00:03:27.488 INFO:tasks.workunit.client.1.vm06.stdout:1/72: truncate d6/fb 136117 0
2026-03-09T00:03:27.495 INFO:tasks.workunit.client.1.vm06.stdout:3/74: rename l5 to d11/l19 0
2026-03-09T00:03:27.497 INFO:tasks.workunit.client.1.vm06.stdout:0/134: truncate d3/f1b 3258202 0
2026-03-09T00:03:27.503 INFO:tasks.workunit.client.1.vm06.stdout:1/73: mknod d6/c1c 0
2026-03-09T00:03:27.506 INFO:tasks.workunit.client.1.vm06.stdout:0/135: write d3/d18/d28/f2b [3937232,118949] 0
2026-03-09T00:03:27.512 INFO:tasks.workunit.client.1.vm06.stdout:8/105: getdents db 0
2026-03-09T00:03:27.512 INFO:tasks.workunit.client.1.vm06.stdout:8/106: write f9 [760401,130777] 0
2026-03-09T00:03:27.512 INFO:tasks.workunit.client.1.vm06.stdout:3/75: unlink ce 0
2026-03-09T00:03:27.513 INFO:tasks.workunit.client.1.vm06.stdout:0/136: mknod d3/d18/d2c/c2f 0
2026-03-09T00:03:27.513 INFO:tasks.workunit.client.1.vm06.stdout:0/137: write d3/d18/d28/f2b [4641772,456] 0
2026-03-09T00:03:27.515 INFO:tasks.workunit.client.1.vm06.stdout:8/107: chown db/dd/l14 401627 1
2026-03-09T00:03:27.517 INFO:tasks.workunit.client.1.vm06.stdout:2/190: dwrite d7/d1a/d25/f33 [0,4194304] 0
2026-03-09T00:03:27.518 INFO:tasks.workunit.client.1.vm06.stdout:0/138: dread f1 [0,4194304] 0
2026-03-09T00:03:27.518 INFO:tasks.workunit.client.1.vm06.stdout:0/139: fdatasync d3/f1c 0
2026-03-09T00:03:27.526 INFO:tasks.workunit.client.1.vm06.stdout:8/108: creat db/dd/f27 x:0 0 0
2026-03-09T00:03:27.532 INFO:tasks.workunit.client.1.vm06.stdout:0/140: mkdir d3/d18/d30 0
2026-03-09T00:03:27.544 INFO:tasks.workunit.client.1.vm06.stdout:8/109: creat db/f28 x:0 0 0
2026-03-09T00:03:27.544 INFO:tasks.workunit.client.1.vm06.stdout:8/110: mknod db/dd/d24/c29 0
2026-03-09T00:03:27.544 INFO:tasks.workunit.client.1.vm06.stdout:8/111: truncate db/dd/f13 1163206 0
2026-03-09T00:03:27.545 INFO:tasks.workunit.client.1.vm06.stdout:8/112: dread - db/d1e/f23 zero size
2026-03-09T00:03:27.545 INFO:tasks.workunit.client.1.vm06.stdout:8/113: fdatasync db/f16 0
2026-03-09T00:03:27.551 INFO:tasks.workunit.client.1.vm06.stdout:2/191: dwrite d7/da/d1c/f29 [0,4194304] 0
2026-03-09T00:03:27.555 INFO:tasks.workunit.client.1.vm06.stdout:2/192: mkdir d7/d1a/d39 0
2026-03-09T00:03:27.557 INFO:tasks.workunit.client.1.vm06.stdout:2/193: creat d7/f3a x:0 0 0
2026-03-09T00:03:27.559 INFO:tasks.workunit.client.1.vm06.stdout:2/194: rmdir d7/d1b/d31 39
2026-03-09T00:03:27.571 INFO:tasks.workunit.client.1.vm06.stdout:2/195: creat d7/d1b/f3b x:0 0 0
2026-03-09T00:03:27.571 INFO:tasks.workunit.client.1.vm06.stdout:2/196: mkdir d7/d1a/d3c 0
2026-03-09T00:03:27.571 INFO:tasks.workunit.client.1.vm06.stdout:2/197: mknod d7/d1a/d39/c3d 0
2026-03-09T00:03:27.571 INFO:tasks.workunit.client.1.vm06.stdout:2/198: fdatasync d7/da/d1c/f29 0
2026-03-09T00:03:27.576 INFO:tasks.workunit.client.1.vm06.stdout:8/114: dwrite db/f17 [4194304,4194304] 0
2026-03-09T00:03:27.576 INFO:tasks.workunit.client.1.vm06.stdout:0/141: write d3/f19 [957400,91937] 0
2026-03-09T00:03:27.579 INFO:tasks.workunit.client.1.vm06.stdout:8/115: readlink db/dd/l18 0
2026-03-09T00:03:27.579 INFO:tasks.workunit.client.1.vm06.stdout:8/116: fsync db/d1e/f23 0
2026-03-09T00:03:27.579 INFO:tasks.workunit.client.1.vm06.stdout:8/117: write db/d1e/f20 [638661,116593] 0
2026-03-09T00:03:27.579 INFO:tasks.workunit.client.1.vm06.stdout:8/118: fdatasync f3 0
2026-03-09T00:03:27.583 INFO:tasks.workunit.client.1.vm06.stdout:8/119: link db/d1e/f23 db/d1e/f2a 0
2026-03-09T00:03:27.637 INFO:tasks.workunit.client.1.vm06.stdout:6/86: sync
2026-03-09T00:03:27.637 INFO:tasks.workunit.client.1.vm06.stdout:4/88: getdents . 0
2026-03-09T00:03:27.640 INFO:tasks.workunit.client.1.vm06.stdout:6/87: creat d4/f19 x:0 0 0
2026-03-09T00:03:27.640 INFO:tasks.workunit.client.1.vm06.stdout:4/89: dread f5 [0,4194304] 0
2026-03-09T00:03:27.641 INFO:tasks.workunit.client.1.vm06.stdout:6/88: truncate f1 319922 0
2026-03-09T00:03:27.642 INFO:tasks.workunit.client.1.vm06.stdout:4/90: creat f13 x:0 0 0
2026-03-09T00:03:27.642 INFO:tasks.workunit.client.1.vm06.stdout:4/91: stat f1 0
2026-03-09T00:03:27.651 INFO:tasks.workunit.client.1.vm06.stdout:0/142: dwrite d3/f19 [0,4194304] 0
2026-03-09T00:03:27.672 INFO:tasks.workunit.client.1.vm06.stdout:8/120: dwrite db/dd/f1f [0,4194304] 0
2026-03-09T00:03:27.672 INFO:tasks.workunit.client.1.vm06.stdout:8/121: chown db/f1d 11 1
2026-03-09T00:03:27.692 INFO:tasks.workunit.client.1.vm06.stdout:3/76: getdents d11 0
2026-03-09T00:03:27.692 INFO:tasks.workunit.client.1.vm06.stdout:3/77: chown d11/c14 44007 1
2026-03-09T00:03:27.692 INFO:tasks.workunit.client.1.vm06.stdout:3/78: fdatasync f9 0
2026-03-09T00:03:27.692 INFO:tasks.workunit.client.1.vm06.stdout:3/79: chown cb 7 1
2026-03-09T00:03:27.693 INFO:tasks.workunit.client.1.vm06.stdout:3/80: rmdir d11 39
2026-03-09T00:03:27.693 INFO:tasks.workunit.client.1.vm06.stdout:3/81: read - d11/f12 zero size
2026-03-09T00:03:27.694 INFO:tasks.workunit.client.1.vm06.stdout:0/143: fsync d3/f1b 0
2026-03-09T00:03:27.696 INFO:tasks.workunit.client.1.vm06.stdout:3/82: link fa d11/f1a 0
2026-03-09T00:03:27.699 INFO:tasks.workunit.client.1.vm06.stdout:3/83: write d11/f12 [856230,72843] 0
2026-03-09T00:03:27.699 INFO:tasks.workunit.client.1.vm06.stdout:3/84: chown f7 0 1
2026-03-09T00:03:27.699 INFO:tasks.workunit.client.1.vm06.stdout:3/85: fdatasync f8 0
2026-03-09T00:03:27.699 INFO:tasks.workunit.client.1.vm06.stdout:9/84: sync
2026-03-09T00:03:27.699 INFO:tasks.workunit.client.1.vm06.stdout:0/144: dread d3/d18/f22 [0,4194304] 0
2026-03-09T00:03:27.699 INFO:tasks.workunit.client.1.vm06.stdout:0/145: write d3/f1a [289027,72293] 0
2026-03-09T00:03:27.699 INFO:tasks.workunit.client.1.vm06.stdout:5/109: sync
2026-03-09T00:03:27.699 INFO:tasks.workunit.client.1.vm06.stdout:9/85: dread d1/db/d9/f10 [0,4194304] 0
2026-03-09T00:03:27.699 INFO:tasks.workunit.client.1.vm06.stdout:0/146: mkdir d3/d18/d2c/d2d/d31 0
2026-03-09T00:03:27.703 INFO:tasks.workunit.client.1.vm06.stdout:3/86: dread d11/f13 [0,4194304] 0
2026-03-09T00:03:27.703 INFO:tasks.workunit.client.1.vm06.stdout:3/87: dread - d11/f18 zero size
2026-03-09T00:03:27.703 INFO:tasks.workunit.client.1.vm06.stdout:5/110: creat d5/f1f x:0 0 0
2026-03-09T00:03:27.703 INFO:tasks.workunit.client.1.vm06.stdout:5/111: chown d5/cd 2261 1
2026-03-09T00:03:27.703 INFO:tasks.workunit.client.1.vm06.stdout:5/112: fdatasync d5/f9 0
2026-03-09T00:03:27.703 INFO:tasks.workunit.client.1.vm06.stdout:5/113: truncate d5/f16 135239 0
2026-03-09T00:03:27.707 INFO:tasks.workunit.client.1.vm06.stdout:8/122: dwrite db/f17 [4194304,4194304] 0
2026-03-09T00:03:27.707 INFO:tasks.workunit.client.1.vm06.stdout:8/123: chown db/l26 31530577 1
2026-03-09T00:03:27.713 INFO:tasks.workunit.client.1.vm06.stdout:5/114: creat d5/d1c/f20 x:0 0 0
2026-03-09T00:03:27.713 INFO:tasks.workunit.client.1.vm06.stdout:5/115: stat d5/f15 0
2026-03-09T00:03:27.713 INFO:tasks.workunit.client.1.vm06.stdout:5/116: write d5/f16 [1019315,90887] 0
2026-03-09T00:03:27.713 INFO:tasks.workunit.client.1.vm06.stdout:1/74: write f3 [210339,57011] 0
2026-03-09T00:03:27.719 INFO:tasks.workunit.client.1.vm06.stdout:7/105: sync
2026-03-09T00:03:27.719 INFO:tasks.workunit.client.1.vm06.stdout:7/106: chown d0/f14 1 1
2026-03-09T00:03:27.720 INFO:tasks.workunit.client.1.vm06.stdout:8/124: truncate f5 1260428 0
2026-03-09T00:03:27.721 INFO:tasks.workunit.client.1.vm06.stdout:5/117: unlink d5/f9 0
2026-03-09T00:03:27.723 INFO:tasks.workunit.client.1.vm06.stdout:1/75: rename f3 to d6/f1d 0
2026-03-09T00:03:27.741 INFO:tasks.workunit.client.1.vm06.stdout:1/76: fsync d6/f1b 0
2026-03-09T00:03:27.742 INFO:tasks.workunit.client.1.vm06.stdout:7/107: symlink d0/l1c 0
2026-03-09T00:03:27.742 INFO:tasks.workunit.client.1.vm06.stdout:7/108: write d0/f6 [4068471,66757] 0
2026-03-09T00:03:27.742 INFO:tasks.workunit.client.1.vm06.stdout:7/109: fsync d0/f2 0
2026-03-09T00:03:27.742 INFO:tasks.workunit.client.1.vm06.stdout:8/125: symlink db/d1e/l2b 0
2026-03-09T00:03:27.742 INFO:tasks.workunit.client.1.vm06.stdout:0/147: fsync d3/f1a 0
2026-03-09T00:03:27.742 INFO:tasks.workunit.client.1.vm06.stdout:8/126: unlink db/d1e/f2a 0
2026-03-09T00:03:27.742 INFO:tasks.workunit.client.1.vm06.stdout:0/148: symlink d3/d18/d2c/l32 0
2026-03-09T00:03:27.743 INFO:tasks.workunit.client.1.vm06.stdout:0/149: chown d3/d18/d1f/f26 859232 1
2026-03-09T00:03:27.743 INFO:tasks.workunit.client.1.vm06.stdout:8/127: creat db/dd/f2c x:0 0 0
2026-03-09T00:03:27.743 INFO:tasks.workunit.client.1.vm06.stdout:8/128: write db/f16 [1838858,104118] 0
2026-03-09T00:03:27.743 INFO:tasks.workunit.client.1.vm06.stdout:8/129: truncate f9 869387 0
2026-03-09T00:03:27.745 INFO:tasks.workunit.client.1.vm06.stdout:8/130: link db/d1e/f23 db/f2d 0
2026-03-09T00:03:27.753 INFO:tasks.workunit.client.1.vm06.stdout:8/131: dread db/f17 [4194304,4194304] 0
2026-03-09T00:03:27.757 INFO:tasks.workunit.client.1.vm06.stdout:2/199: sync
2026-03-09T00:03:27.760 INFO:tasks.workunit.client.1.vm06.stdout:8/132: write db/f17 [8845,23335] 0
2026-03-09T00:03:27.760 INFO:tasks.workunit.client.1.vm06.stdout:8/133: dread - db/f28 zero size
2026-03-09T00:03:27.760 INFO:tasks.workunit.client.1.vm06.stdout:8/134: chown db/dd/c10 23 1
2026-03-09T00:03:27.771 INFO:tasks.workunit.client.1.vm06.stdout:3/88: dwrite d11/f18 [0,4194304] 0
2026-03-09T00:03:27.785 INFO:tasks.workunit.client.1.vm06.stdout:3/89: read ff [485095,36648] 0
2026-03-09T00:03:27.785 INFO:tasks.workunit.client.1.vm06.stdout:3/90: fsync f10 0
2026-03-09T00:03:27.788 INFO:tasks.workunit.client.1.vm06.stdout:3/91: creat d11/f1b x:0 0 0
2026-03-09T00:03:27.791 INFO:tasks.workunit.client.1.vm06.stdout:3/92: dread d11/f18 [0,4194304] 0
2026-03-09T00:03:27.792 INFO:tasks.workunit.client.1.vm06.stdout:3/93: write d11/f12 [1099651,29596] 0
2026-03-09T00:03:27.800 INFO:tasks.workunit.client.1.vm06.stdout:9/86: dwrite d1/db/f7 [0,4194304] 0
2026-03-09T00:03:27.805 INFO:tasks.workunit.client.1.vm06.stdout:9/87: creat d1/db/d14/f18 x:0 0 0
2026-03-09T00:03:27.829 INFO:tasks.workunit.client.1.vm06.stdout:5/118: dwrite d5/f14 [0,4194304] 0
2026-03-09T00:03:27.829 INFO:tasks.workunit.client.1.vm06.stdout:6/89: getdents d4 0
2026-03-09T00:03:27.829 INFO:tasks.workunit.client.1.vm06.stdout:5/119: write d5/f15 [8824726,46303] 0
2026-03-09T00:03:27.829 INFO:tasks.workunit.client.1.vm06.stdout:5/120: dread - d5/f19 zero size
2026-03-09T00:03:27.837 INFO:tasks.workunit.client.1.vm06.stdout:6/90: unlink d4/c7 0
2026-03-09T00:03:27.840 INFO:tasks.workunit.client.1.vm06.stdout:6/91: rename d4 to d4/d1a 22
2026-03-09T00:03:27.841 INFO:tasks.workunit.client.1.vm06.stdout:4/92: sync
2026-03-09T00:03:27.847 INFO:tasks.workunit.client.1.vm06.stdout:5/121: mkdir d5/d1c/d21 0
2026-03-09T00:03:27.848 INFO:tasks.workunit.client.1.vm06.stdout:5/122: creat d5/d1c/f22 x:0 0 0
2026-03-09T00:03:27.848 INFO:tasks.workunit.client.1.vm06.stdout:5/123: readlink d5/l11 0
2026-03-09T00:03:27.850 INFO:tasks.workunit.client.1.vm06.stdout:4/93: rename f13 to f14 0
2026-03-09T00:03:27.850 INFO:tasks.workunit.client.1.vm06.stdout:4/94: creat f15 x:0 0 0
2026-03-09T00:03:27.852 INFO:tasks.workunit.client.1.vm06.stdout:5/124: mkdir d5/d1c/d23 0
2026-03-09T00:03:27.854 INFO:tasks.workunit.client.1.vm06.stdout:4/95: rename f5 to f16 0
2026-03-09T00:03:27.855 INFO:tasks.workunit.client.1.vm06.stdout:5/125: symlink d5/l24 0
2026-03-09T00:03:27.856 INFO:tasks.workunit.client.1.vm06.stdout:4/96: mkdir d17 0
2026-03-09T00:03:27.857 INFO:tasks.workunit.client.1.vm06.stdout:4/97: readlink l2 0
2026-03-09T00:03:27.857 INFO:tasks.workunit.client.1.vm06.stdout:5/126: mknod d5/d1c/c25 0
2026-03-09T00:03:27.859 INFO:tasks.workunit.client.1.vm06.stdout:4/98: symlink d17/l18 0
2026-03-09T00:03:27.861 INFO:tasks.workunit.client.1.vm06.stdout:4/99: creat d17/f19 x:0 0 0
2026-03-09T00:03:27.866 INFO:tasks.workunit.client.1.vm06.stdout:8/135: truncate db/dd/f1f 1618767 0
2026-03-09T00:03:27.868 INFO:tasks.workunit.client.1.vm06.stdout:8/136: unlink db/dd/l14 0
2026-03-09T00:03:27.870 INFO:tasks.workunit.client.1.vm06.stdout:8/137: dread f3 [0,4194304] 0
2026-03-09T00:03:27.870 INFO:tasks.workunit.client.1.vm06.stdout:8/138: write db/dd/f27 [122935,7376] 0
2026-03-09T00:03:27.870 INFO:tasks.workunit.client.1.vm06.stdout:8/139: stat db/l26 0
2026-03-09T00:03:27.872 INFO:tasks.workunit.client.1.vm06.stdout:8/140: truncate f3 597425 0
2026-03-09T00:03:27.872 INFO:tasks.workunit.client.1.vm06.stdout:2/200: dwrite f3 [4194304,4194304] 0
2026-03-09T00:03:27.872 INFO:tasks.workunit.client.1.vm06.stdout:2/201: truncate d7/d1a/f2f 927020 0
2026-03-09T00:03:27.892 INFO:tasks.workunit.client.1.vm06.stdout:1/77: getdents d6 0
2026-03-09T00:03:27.897 INFO:tasks.workunit.client.1.vm06.stdout:3/94: dwrite f10 [4194304,4194304] 0
2026-03-09T00:03:27.898 INFO:tasks.workunit.client.1.vm06.stdout:3/95: creat d11/f1c x:0 0 0
2026-03-09T00:03:27.898 INFO:tasks.workunit.client.1.vm06.stdout:3/96: creat d11/f1d x:0 0 0
2026-03-09T00:03:27.904 INFO:tasks.workunit.client.1.vm06.stdout:0/150: truncate d3/f1a 168043 0
2026-03-09T00:03:27.905 INFO:tasks.workunit.client.1.vm06.stdout:0/151: write d3/d18/d1f/f26 [415301,30240] 0
2026-03-09T00:03:27.905 INFO:tasks.workunit.client.1.vm06.stdout:0/152: readlink d3/lb 0
2026-03-09T00:03:27.907 INFO:tasks.workunit.client.1.vm06.stdout:6/92: dwrite d4/ff [0,4194304] 0
2026-03-09T00:03:27.915 INFO:tasks.workunit.client.1.vm06.stdout:0/153: mknod d3/d18/d30/c33 0
2026-03-09T00:03:27.917 INFO:tasks.workunit.client.1.vm06.stdout:9/88: dwrite d1/d4/ff [0,4194304] 0
2026-03-09T00:03:27.917 INFO:tasks.workunit.client.1.vm06.stdout:9/89: fdatasync d1/d3/f11 0
2026-03-09T00:03:27.917 INFO:tasks.workunit.client.1.vm06.stdout:8/141: write db/dd/f1c [2075122,70520] 0
2026-03-09T00:03:27.920 INFO:tasks.workunit.client.1.vm06.stdout:0/154: creat d3/d18/d1f/f34 x:0 0 0
2026-03-09T00:03:27.922 INFO:tasks.workunit.client.1.vm06.stdout:9/90: creat d1/db/d9/f19 x:0 0 0
2026-03-09T00:03:27.926 INFO:tasks.workunit.client.1.vm06.stdout:7/110: truncate d0/f6 1106343 0
2026-03-09T00:03:27.940 INFO:tasks.workunit.client.1.vm06.stdout:7/111: mknod d0/df/c1d 0
2026-03-09T00:03:27.943 INFO:tasks.workunit.client.1.vm06.stdout:7/112: mknod d0/df/d1a/c1e 0
2026-03-09T00:03:27.943 INFO:tasks.workunit.client.1.vm06.stdout:7/113: write d0/fe [1448164,79712] 0
2026-03-09T00:03:27.946 INFO:tasks.workunit.client.1.vm06.stdout:7/114: creat d0/df/d17/f1f x:0 0 0
2026-03-09T00:03:27.947 INFO:tasks.workunit.client.1.vm06.stdout:9/91: write d1/d4/ff [2895211,29644] 0
2026-03-09T00:03:27.947 INFO:tasks.workunit.client.1.vm06.stdout:9/92: readlink d1/d3/ld 0
2026-03-09T00:03:27.947 INFO:tasks.workunit.client.1.vm06.stdout:9/93: chown d1/db/f7 33 1
2026-03-09T00:03:27.948 INFO:tasks.workunit.client.1.vm06.stdout:7/115: mknod d0/db/c20 0
2026-03-09T00:03:27.948 INFO:tasks.workunit.client.1.vm06.stdout:7/116: read - d0/f14 zero size
2026-03-09T00:03:27.948 INFO:tasks.workunit.client.1.vm06.stdout:7/117: readlink d0/df/l10 0
2026-03-09T00:03:27.948 INFO:tasks.workunit.client.1.vm06.stdout:7/118: stat d0/l1c 0
2026-03-09T00:03:27.969 INFO:tasks.workunit.client.1.vm06.stdout:6/93: dwrite d4/f6 [0,4194304] 0
2026-03-09T00:03:27.969 INFO:tasks.workunit.client.1.vm06.stdout:8/142: dwrite db/d1e/f20 [0,4194304] 0
2026-03-09T00:03:27.969 INFO:tasks.workunit.client.1.vm06.stdout:8/143: creat db/d1e/f2e x:0 0 0
2026-03-09T00:03:27.974 INFO:tasks.workunit.client.1.vm06.stdout:6/94: mknod d4/d16/c1b 0
2026-03-09T00:03:27.974 INFO:tasks.workunit.client.1.vm06.stdout:6/95: write f1 [1159025,109476] 0
2026-03-09T00:03:27.980 INFO:tasks.workunit.client.1.vm06.stdout:6/96: creat d4/d16/f1c x:0 0 0
2026-03-09T00:03:27.985 INFO:tasks.workunit.client.1.vm06.stdout:6/97: creat d4/f1d x:0 0 0
2026-03-09T00:03:28.014 INFO:tasks.workunit.client.1.vm06.stdout:5/127: getdents d5 0
2026-03-09T00:03:28.015 INFO:tasks.workunit.client.1.vm06.stdout:5/128: mknod d5/d1c/c26 0
2026-03-09T00:03:28.017 INFO:tasks.workunit.client.1.vm06.stdout:5/129: unlink d5/d1c/f20 0
2026-03-09T00:03:28.017 INFO:tasks.workunit.client.1.vm06.stdout:5/130: dread - d5/d1c/f22 zero size
2026-03-09T00:03:28.019 INFO:tasks.workunit.client.1.vm06.stdout:4/100: rmdir d17 39
2026-03-09T00:03:28.024 INFO:tasks.workunit.client.1.vm06.stdout:3/97: dwrite f9 [0,4194304] 0
2026-03-09T00:03:28.028 INFO:tasks.workunit.client.1.vm06.stdout:9/94: dwrite d1/db/f7 [0,4194304] 0
2026-03-09T00:03:28.028 INFO:tasks.workunit.client.1.vm06.stdout:9/95: read d1/d4/ff [1091778,94597] 0
2026-03-09T00:03:28.028 INFO:tasks.workunit.client.1.vm06.stdout:9/96: chown d1/db/d9/fc 161685 1
2026-03-09T00:03:28.028 INFO:tasks.workunit.client.1.vm06.stdout:9/97: creat d1/db/d14/f1a x:0 0 0
2026-03-09T00:03:28.028 INFO:tasks.workunit.client.1.vm06.stdout:9/98: chown d1/d3/ld 523377 1
2026-03-09T00:03:28.034 INFO:tasks.workunit.client.1.vm06.stdout:5/131: dread d5/f1b [4194304,4194304] 0
2026-03-09T00:03:28.035 INFO:tasks.workunit.client.1.vm06.stdout:3/98: rename ff to d11/f1e 0
2026-03-09T00:03:28.043 INFO:tasks.workunit.client.1.vm06.stdout:9/99: rename d1/c13 to d1/d4/c1b 0
2026-03-09T00:03:28.043 INFO:tasks.workunit.client.1.vm06.stdout:9/100: creat d1/f1c x:0 0 0
2026-03-09T00:03:28.046 INFO:tasks.workunit.client.1.vm06.stdout:9/101: symlink d1/db/l1d 0
2026-03-09T00:03:28.046 INFO:tasks.workunit.client.1.vm06.stdout:9/102: dread - d1/db/d9/f19 zero size
2026-03-09T00:03:28.046 INFO:tasks.workunit.client.1.vm06.stdout:9/103: mknod d1/db/d9/c1e 0
2026-03-09T00:03:28.046 INFO:tasks.workunit.client.1.vm06.stdout:4/101: write f1 [10543910,117840] 0
2026-03-09T00:03:28.049 INFO:tasks.workunit.client.1.vm06.stdout:4/102: truncate f7 225816 0
2026-03-09T00:03:28.078 INFO:tasks.workunit.client.1.vm06.stdout:0/155: rmdir d3 39
2026-03-09T00:03:28.079 INFO:tasks.workunit.client.1.vm06.stdout:0/156: mkdir d3/d18/d30/d35 0
2026-03-09T00:03:28.080 INFO:tasks.workunit.client.1.vm06.stdout:0/157: mknod d3/d18/d2c/d2d/c36 0
2026-03-09T00:03:28.081 INFO:tasks.workunit.client.1.vm06.stdout:0/158: mknod d3/d18/d2c/c37 0
2026-03-09T00:03:28.082 INFO:tasks.workunit.client.1.vm06.stdout:0/159: symlink d3/d18/d30/l38 0
2026-03-09T00:03:28.083 INFO:tasks.workunit.client.1.vm06.stdout:0/160: mkdir d3/d18/d1f/d39 0
2026-03-09T00:03:28.085 INFO:tasks.workunit.client.1.vm06.stdout:0/161: rmdir d3/d18/d30/d35 0
2026-03-09T00:03:28.085 INFO:tasks.workunit.client.1.vm06.stdout:0/162: stat d3/d18/d1f/d23 0
2026-03-09T00:03:28.090 INFO:tasks.workunit.client.1.vm06.stdout:0/163: write d3/f10 [1166018,124687] 0
2026-03-09T00:03:28.090 INFO:tasks.workunit.client.1.vm06.stdout:0/164: readlink d3/d18/d1f/l2e 0
2026-03-09T00:03:28.092 INFO:tasks.workunit.client.1.vm06.stdout:0/165: mknod d3/d18/d2c/c3a 0
2026-03-09T00:03:28.093 INFO:tasks.workunit.client.1.vm06.stdout:0/166: dread d3/f29 [0,4194304] 0
2026-03-09T00:03:28.100 INFO:tasks.workunit.client.1.vm06.stdout:5/132: dwrite d5/ff [0,4194304] 0
2026-03-09T00:03:28.103 INFO:tasks.workunit.client.1.vm06.stdout:4/103: dwrite fb [0,4194304] 0
2026-03-09T00:03:28.103 INFO:tasks.workunit.client.1.vm06.stdout:4/104: write d17/f19 [516433,75066] 0
2026-03-09T00:03:28.103 INFO:tasks.workunit.client.1.vm06.stdout:4/105: write fb [4471707,104629] 0
2026-03-09T00:03:28.113 INFO:tasks.workunit.client.1.vm06.stdout:8/144: truncate db/d1e/f20 3356842 0
2026-03-09T00:03:28.113 INFO:tasks.workunit.client.1.vm06.stdout:6/98: getdents d4/d16 0
2026-03-09T00:03:28.113 INFO:tasks.workunit.client.1.vm06.stdout:6/99: fdatasync d4/f19 0
2026-03-09T00:03:28.113 INFO:tasks.workunit.client.1.vm06.stdout:6/100: truncate d4/f1d 150855 0
2026-03-09T00:03:28.113 INFO:tasks.workunit.client.1.vm06.stdout:6/101: readlink d4/le 0
2026-03-09T00:03:28.117 INFO:tasks.workunit.client.1.vm06.stdout:8/145: truncate db/dd/f13 316393 0
2026-03-09T00:03:28.117 INFO:tasks.workunit.client.1.vm06.stdout:8/146: write db/d1e/f23 [310542,72292] 0
2026-03-09T00:03:28.117 INFO:tasks.workunit.client.1.vm06.stdout:6/102: truncate d4/fb 395712 0
2026-03-09T00:03:28.119 INFO:tasks.workunit.client.1.vm06.stdout:6/103: truncate d4/ff 155822 0
2026-03-09T00:03:28.122 INFO:tasks.workunit.client.1.vm06.stdout:6/104: symlink d4/d16/l1e 0
2026-03-09T00:03:28.144 INFO:tasks.workunit.client.1.vm06.stdout:9/104: rmdir d1/db/d14 39
2026-03-09T00:03:28.146 INFO:tasks.workunit.client.1.vm06.stdout:9/105: link d1/db/f8 d1/d3/f1f 0
2026-03-09T00:03:28.154 INFO:tasks.workunit.client.1.vm06.stdout:9/106: write d1/db/f8 [236075,120424] 0
2026-03-09T00:03:28.164 INFO:tasks.workunit.client.1.vm06.stdout:9/107: unlink d1/db/d9/fc 0
2026-03-09T00:03:28.164 INFO:tasks.workunit.client.1.vm06.stdout:9/108: readlink d1/d3/ld 0
2026-03-09T00:03:28.165 INFO:tasks.workunit.client.1.vm06.stdout:9/109: creat d1/db/d14/f20 x:0 0 0
2026-03-09T00:03:28.165 INFO:tasks.workunit.client.1.vm06.stdout:9/110: chown d1/db/d14 17564 1
2026-03-09T00:03:28.166 INFO:tasks.workunit.client.1.vm06.stdout:9/111: truncate d1/db/d9/f10 1298004 0
2026-03-09T00:03:28.170 INFO:tasks.workunit.client.1.vm06.stdout:4/106: dwrite f16 [0,4194304] 0
2026-03-09T00:03:28.171 INFO:tasks.workunit.client.1.vm06.stdout:5/133: dwrite d5/fe [0,4194304] 0
2026-03-09T00:03:28.172 INFO:tasks.workunit.client.1.vm06.stdout:4/107: rmdir d17 39
2026-03-09T00:03:28.174 INFO:tasks.workunit.client.1.vm06.stdout:4/108: mknod d17/c1a 0
2026-03-09T00:03:28.175 INFO:tasks.workunit.client.1.vm06.stdout:4/109: symlink d17/l1b 0
2026-03-09T00:03:28.187 INFO:tasks.workunit.client.1.vm06.stdout:8/147: dread db/d1e/f23 [0,4194304] 0
2026-03-09T00:03:28.187 INFO:tasks.workunit.client.1.vm06.stdout:8/148: readlink db/l1a 0
2026-03-09T00:03:28.187 INFO:tasks.workunit.client.1.vm06.stdout:8/149: chown db/dd/c10 877 1
2026-03-09T00:03:28.189 INFO:tasks.workunit.client.1.vm06.stdout:8/150: unlink c2 0
2026-03-09T00:03:28.196 INFO:tasks.workunit.client.1.vm06.stdout:8/151: link db/dd/l19 db/l2f 0
2026-03-09T00:03:28.209 INFO:tasks.workunit.client.1.vm06.stdout:8/152: read - db/f1d zero size
2026-03-09T00:03:28.210 INFO:tasks.workunit.client.1.vm06.stdout:8/153: mknod db/dd/d24/c30 0
2026-03-09T00:03:28.210 INFO:tasks.workunit.client.1.vm06.stdout:8/154: dread - db/d1e/f25 zero size
2026-03-09T00:03:28.210 INFO:tasks.workunit.client.1.vm06.stdout:8/155: write db/f17 [8397108,13287] 0
2026-03-09T00:03:28.210 INFO:tasks.workunit.client.1.vm06.stdout:8/156: creat db/f31 x:0 0 0
2026-03-09T00:03:28.210 INFO:tasks.workunit.client.1.vm06.stdout:8/157: chown db/dd/d24 1357682 1
2026-03-09T00:03:28.210 INFO:tasks.workunit.client.1.vm06.stdout:8/158: dread - db/f28 zero size
2026-03-09T00:03:28.210 INFO:tasks.workunit.client.1.vm06.stdout:8/159: rmdir db/dd/d24 39
2026-03-09T00:03:28.210 INFO:tasks.workunit.client.1.vm06.stdout:8/160: mknod db/dd/d24/c32 0
2026-03-09T00:03:28.210 INFO:tasks.workunit.client.1.vm06.stdout:8/161: creat db/dd/d24/f33 x:0 0 0
2026-03-09T00:03:28.212 INFO:tasks.workunit.client.1.vm06.stdout:6/105: dwrite d4/d16/f1c [0,4194304] 0
2026-03-09T00:03:28.212 INFO:tasks.workunit.client.1.vm06.stdout:6/106: dread - d4/fa zero size
2026-03-09T00:03:28.213 INFO:tasks.workunit.client.1.vm06.stdout:0/167: rename d3/d18/d30 to d3/d18/d1f/d39/d3b 0
2026-03-09T00:03:28.213 INFO:tasks.workunit.client.1.vm06.stdout:0/168: chown d3/f29 1 1
2026-03-09T00:03:28.229 INFO:tasks.workunit.client.1.vm06.stdout:8/162: dread db/f17 [4194304,4194304] 0
2026-03-09T00:03:28.231 INFO:tasks.workunit.client.1.vm06.stdout:6/107: unlink d4/d16/l1e 0
2026-03-09T00:03:28.231 INFO:tasks.workunit.client.1.vm06.stdout:8/163: dread db/dd/f27 [0,4194304] 0
2026-03-09T00:03:28.231 INFO:tasks.workunit.client.1.vm06.stdout:8/164: fdatasync db/f17 0
2026-03-09T00:03:28.231 INFO:tasks.workunit.client.1.vm06.stdout:8/165: write f9 [1512562,6618] 0
2026-03-09T00:03:28.231 INFO:tasks.workunit.client.1.vm06.stdout:8/166: creat db/d1e/f34 x:0 0 0
2026-03-09T00:03:28.232 INFO:tasks.workunit.client.1.vm06.stdout:6/108: unlink d4/f19 0
2026-03-09T00:03:28.238 INFO:tasks.workunit.client.1.vm06.stdout:6/109: mknod d4/d16/c1f 0
2026-03-09T00:03:28.244 INFO:tasks.workunit.client.1.vm06.stdout:6/110: write d4/f5 [533935,68325] 0
2026-03-09T00:03:28.244 INFO:tasks.workunit.client.1.vm06.stdout:6/111: readlink d4/le 0
2026-03-09T00:03:28.244 INFO:tasks.workunit.client.1.vm06.stdout:5/134: dwrite d5/f16 [0,4194304] 0
2026-03-09T00:03:28.245 INFO:tasks.workunit.client.1.vm06.stdout:4/110: dwrite f14 [0,4194304] 0
2026-03-09T00:03:28.252 INFO:tasks.workunit.client.1.vm06.stdout:1/78: sync
2026-03-09T00:03:28.252 INFO:tasks.workunit.client.1.vm06.stdout:1/79: dread - d6/f1a zero size
2026-03-09T00:03:28.252 INFO:tasks.workunit.client.1.vm06.stdout:2/202: sync
2026-03-09T00:03:28.252 INFO:tasks.workunit.client.1.vm06.stdout:2/203: chown d7/d1b/f37 405559271 1
2026-03-09T00:03:28.252 INFO:tasks.workunit.client.1.vm06.stdout:2/204: fsync d7/d1b/f22 0
2026-03-09T00:03:28.252 INFO:tasks.workunit.client.1.vm06.stdout:5/135: dread d5/f1d [0,4194304] 0
2026-03-09T00:03:28.252 INFO:tasks.workunit.client.1.vm06.stdout:2/205: readlink d7/d1a/l2a 0
2026-03-09T00:03:28.252 INFO:tasks.workunit.client.1.vm06.stdout:7/119: sync
2026-03-09T00:03:28.259 INFO:tasks.workunit.client.1.vm06.stdout:0/169: dwrite d3/d18/f25 [0,4194304] 0
2026-03-09T00:03:28.259 INFO:tasks.workunit.client.1.vm06.stdout:6/112: dread d4/f6 [0,4194304] 0
2026-03-09T00:03:28.264 INFO:tasks.workunit.client.1.vm06.stdout:6/113: dread f1 [0,4194304] 0
2026-03-09T00:03:28.269 INFO:tasks.workunit.client.1.vm06.stdout:0/170: dread d3/f29 [0,4194304] 0
2026-03-09T00:03:28.269 INFO:tasks.workunit.client.1.vm06.stdout:0/171: dread - d3/f1c zero size
2026-03-09T00:03:28.269 INFO:tasks.workunit.client.1.vm06.stdout:0/172: chown d3/d18/d2c 4 1
2026-03-09T00:03:28.271 INFO:tasks.workunit.client.1.vm06.stdout:1/80: symlink d6/l1e 0
2026-03-09T00:03:28.277 INFO:tasks.workunit.client.1.vm06.stdout:5/136: rmdir d5 39
2026-03-09T00:03:28.277 INFO:tasks.workunit.client.1.vm06.stdout:5/137: dread - d5/f19 zero size
2026-03-09T00:03:28.277 INFO:tasks.workunit.client.1.vm06.stdout:2/206: mknod d7/da/d1c/c3e 0
2026-03-09T00:03:28.278 INFO:tasks.workunit.client.1.vm06.stdout:6/114: rename d4/c13 to d4/c20 0
2026-03-09T00:03:28.279 INFO:tasks.workunit.client.1.vm06.stdout:0/173: mkdir d3/d18/d3c 0
2026-03-09T00:03:28.279 INFO:tasks.workunit.client.1.vm06.stdout:0/174: write d3/d18/f22 [4550698,39536] 0
2026-03-09T00:03:28.280 INFO:tasks.workunit.client.1.vm06.stdout:1/81: mknod d6/c1f 0
2026-03-09T00:03:28.280 INFO:tasks.workunit.client.1.vm06.stdout:1/82: stat d6/ff 0
2026-03-09T00:03:28.280 INFO:tasks.workunit.client.1.vm06.stdout:1/83: dread - d6/f1a zero size
2026-03-09T00:03:28.280 INFO:tasks.workunit.client.1.vm06.stdout:1/84: fsync d6/f19 0
2026-03-09T00:03:28.280 INFO:tasks.workunit.client.1.vm06.stdout:1/85: fsync d6/f1b 0
2026-03-09T00:03:28.280 INFO:tasks.workunit.client.1.vm06.stdout:1/86: fdatasync d6/f9 0
2026-03-09T00:03:28.280 INFO:tasks.workunit.client.1.vm06.stdout:1/87: fsync f0 0
2026-03-09T00:03:28.282 INFO:tasks.workunit.client.1.vm06.stdout:5/138: mknod d5/c27 0
2026-03-09T00:03:28.282 INFO:tasks.workunit.client.1.vm06.stdout:5/139: readlink d5/l24 0
2026-03-09T00:03:28.282 INFO:tasks.workunit.client.1.vm06.stdout:5/140: dread - d5/f1f zero size
2026-03-09T00:03:28.282 INFO:tasks.workunit.client.1.vm06.stdout:5/141: stat d5/d1c/c25 0
2026-03-09T00:03:28.286 INFO:tasks.workunit.client.1.vm06.stdout:9/112: rename d1/db to d1/d3/d12/d21 0
2026-03-09T00:03:28.287 INFO:tasks.workunit.client.1.vm06.stdout:2/207: getdents d7/d1a/d39 0
2026-03-09T00:03:28.288 INFO:tasks.workunit.client.1.vm06.stdout:7/120: dwrite d0/f14 [0,4194304] 0
2026-03-09T00:03:28.288 INFO:tasks.workunit.client.1.vm06.stdout:7/121: creat d0/df/d17/f21 x:0 0 0
2026-03-09T00:03:28.290 INFO:tasks.workunit.client.1.vm06.stdout:6/115: creat d4/d16/f21 x:0 0 0
2026-03-09T00:03:28.291 INFO:tasks.workunit.client.1.vm06.stdout:6/116: dread d4/ff [0,4194304] 0
2026-03-09T00:03:28.291 INFO:tasks.workunit.client.1.vm06.stdout:6/117: creat d4/f22 x:0 0 0
2026-03-09T00:03:28.299 INFO:tasks.workunit.client.1.vm06.stdout:0/175: rmdir d3/d18/d1f/d23 0
2026-03-09T00:03:28.299 INFO:tasks.workunit.client.1.vm06.stdout:7/122: dread d0/f14 [0,4194304] 0
2026-03-09T00:03:28.300 INFO:tasks.workunit.client.1.vm06.stdout:1/88: truncate f0 1065161 0
2026-03-09T00:03:28.300 INFO:tasks.workunit.client.1.vm06.stdout:1/89: stat d6/f7 0
2026-03-09T00:03:28.302 INFO:tasks.workunit.client.1.vm06.stdout:7/123: dread d0/fa [0,4194304] 0
2026-03-09T00:03:28.302 INFO:tasks.workunit.client.1.vm06.stdout:7/124: write d0/df/d17/f1f [242088,79345] 0
2026-03-09T00:03:28.304 INFO:tasks.workunit.client.1.vm06.stdout:0/176: fdatasync f1 0
2026-03-09T00:03:28.314 INFO:tasks.workunit.client.1.vm06.stdout:5/142: mkdir d5/d1c/d21/d28 0
2026-03-09T00:03:28.318 INFO:tasks.workunit.client.1.vm06.stdout:5/143: dread d5/f14 [0,4194304] 0
2026-03-09T00:03:28.318 INFO:tasks.workunit.client.1.vm06.stdout:5/144: read - d5/d1c/f22 zero size
2026-03-09T00:03:28.318 INFO:tasks.workunit.client.1.vm06.stdout:5/145: fsync d5/ff 0
2026-03-09T00:03:28.323 INFO:tasks.workunit.client.1.vm06.stdout:9/113: read d1/f16 [61273,89621] 0
2026-03-09T00:03:28.323 INFO:tasks.workunit.client.1.vm06.stdout:2/208: mknod d7/d1a/d39/c3f 0
2026-03-09T00:03:28.323 INFO:tasks.workunit.client.1.vm06.stdout:4/111: getdents d17 0
2026-03-09T00:03:28.332 INFO:tasks.workunit.client.1.vm06.stdout:7/125: mkdir d0/df/d1a/d22 0
2026-03-09T00:03:28.341 INFO:tasks.workunit.client.1.vm06.stdout:7/126: creat d0/db/f23 x:0 0 0
2026-03-09T00:03:28.341 INFO:tasks.workunit.client.1.vm06.stdout:7/127: write d0/db/f23 [604094,101401] 0
2026-03-09T00:03:28.341 INFO:tasks.workunit.client.1.vm06.stdout:8/167: getdents db 0
2026-03-09T00:03:28.342 INFO:tasks.workunit.client.1.vm06.stdout:5/146: creat d5/d1c/d21/f29 x:0 0 0
2026-03-09T00:03:28.342 INFO:tasks.workunit.client.1.vm06.stdout:5/147: chown d5/f16 434652473 1
2026-03-09T00:03:28.342 INFO:tasks.workunit.client.1.vm06.stdout:2/209: mknod d7/c40 0
2026-03-09T00:03:28.342 INFO:tasks.workunit.client.1.vm06.stdout:4/112: symlink d17/l1c 0
2026-03-09T00:03:28.342 INFO:tasks.workunit.client.1.vm06.stdout:2/210: write d7/da/f18 [1288218,106124] 0
2026-03-09T00:03:28.342 INFO:tasks.workunit.client.1.vm06.stdout:2/211: stat d7/da/db/de/f32 0
2026-03-09T00:03:28.342 INFO:tasks.workunit.client.1.vm06.stdout:6/118: link d4/f22 d4/f23 0
2026-03-09T00:03:28.342 INFO:tasks.workunit.client.1.vm06.stdout:6/119: read d4/fc [383279,78486] 0
2026-03-09T00:03:28.342 INFO:tasks.workunit.client.1.vm06.stdout:7/128: link d0/db/c20 d0/df/d1a/d22/c24 0
2026-03-09T00:03:28.342 INFO:tasks.workunit.client.1.vm06.stdout:7/129: chown d0/df/d17/f21 919 1
2026-03-09T00:03:28.342 INFO:tasks.workunit.client.1.vm06.stdout:7/130: chown d0/f2 857 1
2026-03-09T00:03:28.343 INFO:tasks.workunit.client.1.vm06.stdout:3/99: sync
2026-03-09T00:03:28.343 INFO:tasks.workunit.client.1.vm06.stdout:3/100: readlink d11/l15 0
2026-03-09T00:03:28.343 INFO:tasks.workunit.client.1.vm06.stdout:3/101: fdatasync d11/f18 0
2026-03-09T00:03:28.346 INFO:tasks.workunit.client.1.vm06.stdout:4/113: dread fe [0,4194304] 0
2026-03-09T00:03:28.352 INFO:tasks.workunit.client.1.vm06.stdout:8/168: read - db/f31 zero size
2026-03-09T00:03:28.357 INFO:tasks.workunit.client.1.vm06.stdout:3/102: write f10 [5444313,130031] 0
2026-03-09T00:03:28.357 INFO:tasks.workunit.client.1.vm06.stdout:3/103: creat d11/f1f x:0 0 0
2026-03-09T00:03:28.357 INFO:tasks.workunit.client.1.vm06.stdout:5/148: mkdir d5/d1c/d21/d2a 0
2026-03-09T00:03:28.357 INFO:tasks.workunit.client.1.vm06.stdout:2/212: symlink d7/l41 0
2026-03-09T00:03:28.360 INFO:tasks.workunit.client.1.vm06.stdout:5/149: dread d5/f1b [0,4194304] 0
2026-03-09T00:03:28.364 INFO:tasks.workunit.client.1.vm06.stdout:3/104: dread d11/f13 [0,4194304] 0
2026-03-09T00:03:28.364 INFO:tasks.workunit.client.1.vm06.stdout:0/177: dwrite d3/f29 [0,4194304] 0
2026-03-09T00:03:28.365 INFO:tasks.workunit.client.1.vm06.stdout:1/90: dwrite d6/fb [0,4194304] 0
2026-03-09T00:03:28.367 INFO:tasks.workunit.client.1.vm06.stdout:4/114: creat d17/f1d x:0 0 0
2026-03-09T00:03:28.367 INFO:tasks.workunit.client.1.vm06.stdout:4/115: creat d17/f1e x:0 0 0
2026-03-09T00:03:28.367 INFO:tasks.workunit.client.1.vm06.stdout:9/114: dread d1/d3/f1f [0,4194304] 0
2026-03-09T00:03:28.371 INFO:tasks.workunit.client.1.vm06.stdout:8/169: chown db/dd/d24/c32 1227362000 1
2026-03-09T00:03:28.378 INFO:tasks.workunit.client.1.vm06.stdout:5/150: mkdir d5/d1c/d2b 0
2026-03-09T00:03:28.383 INFO:tasks.workunit.client.1.vm06.stdout:5/151: dread d5/f1b [4194304,4194304] 0
2026-03-09T00:03:28.383 INFO:tasks.workunit.client.1.vm06.stdout:5/152: readlink d5/l1a 0
2026-03-09T00:03:28.386 INFO:tasks.workunit.client.1.vm06.stdout:1/91: creat d6/f20 x:0 0 0
2026-03-09T00:03:28.386 INFO:tasks.workunit.client.1.vm06.stdout:1/92: fsync d6/fb 0
2026-03-09T00:03:28.386 INFO:tasks.workunit.client.1.vm06.stdout:6/120: getdents d4 0
2026-03-09T00:03:28.388 INFO:tasks.workunit.client.1.vm06.stdout:0/178: dread d3/f19 [0,4194304] 0
2026-03-09T00:03:28.453 INFO:tasks.workunit.client.1.vm06.stdout:9/115: mknod d1/d3/d12/d21/c22 0
2026-03-09T00:03:28.453 INFO:tasks.workunit.client.1.vm06.stdout:9/116: readlink d1/d3/d12/d21/l1d 0
2026-03-09T00:03:28.453 INFO:tasks.workunit.client.1.vm06.stdout:9/117: creat d1/d3/f23 x:0 0 0
2026-03-09T00:03:28.453 INFO:tasks.workunit.client.1.vm06.stdout:8/170: readlink db/l26 0
2026-03-09T00:03:28.456 INFO:tasks.workunit.client.1.vm06.stdout:6/121: mknod d4/c24 0
2026-03-09T00:03:28.456 INFO:tasks.workunit.client.1.vm06.stdout:5/153: symlink d5/d1c/d23/l2c 0
2026-03-09T00:03:28.456 INFO:tasks.workunit.client.1.vm06.stdout:5/154: creat d5/d1c/f2d x:0 0 0
2026-03-09T00:03:28.478 INFO:tasks.workunit.client.1.vm06.stdout:5/155: unlink d5/d1c/d21/f29 0
2026-03-09T00:03:28.478 INFO:tasks.workunit.client.1.vm06.stdout:5/156: write d5/f15 [6872197,54557] 0
2026-03-09T00:03:28.479 INFO:tasks.workunit.client.1.vm06.stdout:8/171: link db/dd/c22 db/dd/d24/c35 0
2026-03-09T00:03:28.489 INFO:tasks.workunit.client.1.vm06.stdout:8/172: mkdir db/dd/d24/d36 0
2026-03-09T00:03:28.489 INFO:tasks.workunit.client.1.vm06.stdout:8/173: write f3 [213316,45332] 0
2026-03-09T00:03:28.492 INFO:tasks.workunit.client.1.vm06.stdout:8/174: truncate db/d1e/f2e 188512 0
2026-03-09T00:03:28.569 INFO:tasks.workunit.client.1.vm06.stdout:0/179: dwrite d3/f1a [0,4194304] 0
2026-03-09T00:03:28.570 INFO:tasks.workunit.client.1.vm06.stdout:0/180: creat d3/d18/d1f/d39/f3d x:0 0 0
2026-03-09T00:03:28.572 INFO:tasks.workunit.client.1.vm06.stdout:0/181: truncate d3/d18/f22 766631 0
2026-03-09T00:03:28.572 INFO:tasks.workunit.client.1.vm06.stdout:0/182: write d3/f1c [849263,128435] 0
2026-03-09T00:03:28.687 INFO:tasks.workunit.client.1.vm06.stdout:8/175: dread db/f16 [0,4194304] 0
2026-03-09T00:03:28.738 INFO:tasks.workunit.client.1.vm06.stdout:8/176: dread db/d1e/f2e [0,4194304] 0
2026-03-09T00:03:28.739 INFO:tasks.workunit.client.1.vm06.stdout:8/177: symlink db/d1e/l37 0
2026-03-09T00:03:28.762 INFO:tasks.workunit.client.1.vm06.stdout:1/93: getdents d6 0
2026-03-09T00:03:28.763 INFO:tasks.workunit.client.1.vm06.stdout:7/131: sync
2026-03-09T00:03:28.763 INFO:tasks.workunit.client.1.vm06.stdout:2/213: sync
2026-03-09T00:03:28.763 INFO:tasks.workunit.client.1.vm06.stdout:7/132: write d0/df/d17/f21 [602835,79194] 0
2026-03-09T00:03:28.763 INFO:tasks.workunit.client.1.vm06.stdout:3/105: sync
2026-03-09T00:03:28.763 INFO:tasks.workunit.client.1.vm06.stdout:3/106: write d11/f16 [1024553,51223] 0
2026-03-09T00:03:28.763 INFO:tasks.workunit.client.1.vm06.stdout:1/94: mkdir d6/d21 0
2026-03-09T00:03:28.763 INFO:tasks.workunit.client.1.vm06.stdout:7/133: creat d0/df/d1a/f25 x:0 0 0
2026-03-09T00:03:28.763 INFO:tasks.workunit.client.1.vm06.stdout:7/134: chown d0/f5 1632 1
2026-03-09T00:03:28.763 INFO:tasks.workunit.client.1.vm06.stdout:7/135: creat d0/df/d17/f26 x:0 0 0
2026-03-09T00:03:28.765 INFO:tasks.workunit.client.1.vm06.stdout:1/95: unlink d6/c1c 0
2026-03-09T00:03:28.765 INFO:tasks.workunit.client.1.vm06.stdout:1/96: fsync d6/f20 0
2026-03-09T00:03:28.765 INFO:tasks.workunit.client.1.vm06.stdout:1/97: write d6/f1a [624844,120233] 0
2026-03-09T00:03:28.765 INFO:tasks.workunit.client.1.vm06.stdout:1/98: stat d6/c11 0
2026-03-09T00:03:28.766 INFO:tasks.workunit.client.1.vm06.stdout:7/136: mkdir d0/df/d1a/d27 0
2026-03-09T00:03:28.766 INFO:tasks.workunit.client.1.vm06.stdout:7/137: fdatasync d0/df/d17/f26 0
2026-03-09T00:03:28.769 INFO:tasks.workunit.client.1.vm06.stdout:3/107: getdents d11 0
2026-03-09T00:03:28.769 INFO:tasks.workunit.client.1.vm06.stdout:3/108: creat d11/f20 x:0 0 0
2026-03-09T00:03:28.769 INFO:tasks.workunit.client.1.vm06.stdout:3/109: chown d11/f1b 5088 1
2026-03-09T00:03:28.769 INFO:tasks.workunit.client.1.vm06.stdout:3/110: rename d11/l19 to d11/l21 0
2026-03-09T00:03:28.769 INFO:tasks.workunit.client.1.vm06.stdout:3/111: write d11/f20 [242765,123805] 0
2026-03-09T00:03:28.770 INFO:tasks.workunit.client.1.vm06.stdout:3/112: write fa [756882,17058] 0
2026-03-09T00:03:28.772 INFO:tasks.workunit.client.1.vm06.stdout:3/113: symlink d11/l22 0
2026-03-09T00:03:28.780 INFO:tasks.workunit.client.1.vm06.stdout:5/157: dwrite d5/f1f [0,4194304] 0
2026-03-09T00:03:28.780 INFO:tasks.workunit.client.1.vm06.stdout:5/158: truncate d5/f19 1015176 0
2026-03-09T00:03:28.796 INFO:tasks.workunit.client.1.vm06.stdout:7/138: write d0/f14 [3654138,118628] 0
2026-03-09T00:03:28.798 INFO:tasks.workunit.client.1.vm06.stdout:7/139: getdents d0/db 0
2026-03-09T00:03:28.798 INFO:tasks.workunit.client.1.vm06.stdout:7/140: chown d0/db 135 1
2026-03-09T00:03:28.798 INFO:tasks.workunit.client.1.vm06.stdout:7/141: chown d0/f6 0 1
2026-03-09T00:03:28.856 INFO:tasks.workunit.client.1.vm06.stdout:8/178: dread db/dd/f13 [0,4194304] 0
2026-03-09T00:03:28.857 INFO:tasks.workunit.client.1.vm06.stdout:8/179: readlink db/l1a 0
2026-03-09T00:03:28.857 INFO:tasks.workunit.client.1.vm06.stdout:8/180: stat db/l26 0
2026-03-09T00:03:28.857 INFO:tasks.workunit.client.1.vm06.stdout:8/181: mkdir db/dd/d24/d36/d38 0
2026-03-09T00:03:28.881 INFO:tasks.workunit.client.1.vm06.stdout:7/142: dread d0/df/d17/f1f [0,4194304] 0
2026-03-09T00:03:28.882 INFO:tasks.workunit.client.1.vm06.stdout:7/143: chown d0/f7 1694 1
2026-03-09T00:03:28.882 INFO:tasks.workunit.client.1.vm06.stdout:2/214: dread d7/da/d1c/f1f [4194304,4194304] 0
2026-03-09T00:03:28.884 INFO:tasks.workunit.client.1.vm06.stdout:7/144: creat d0/df/d1a/d22/f28 x:0 0 0
2026-03-09T00:03:28.891 INFO:tasks.workunit.client.1.vm06.stdout:2/215: read d7/f8 [8110705,47347] 0
2026-03-09T00:03:29.106 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:29 vm06.local ceph-mon[58395]: pgmap v131: 65 pgs: 65 active+clean; 408 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 15 MiB/s rd, 31 MiB/s wr, 486 op/s
2026-03-09T00:03:29.106 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:29 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:29.106 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:29 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:29.106 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:29 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:03:29.131 INFO:tasks.workunit.client.1.vm06.stdout:4/116: getdents d17 0
2026-03-09T00:03:29.134 INFO:tasks.workunit.client.1.vm06.stdout:4/117: chown l8 2559 1
2026-03-09T00:03:29.147 INFO:tasks.workunit.client.1.vm06.stdout:4/118: creat d17/f1f x:0 0 0
2026-03-09T00:03:29.147 INFO:tasks.workunit.client.1.vm06.stdout:4/119: creat d17/f20 x:0 0 0
2026-03-09T00:03:29.150 INFO:tasks.workunit.client.1.vm06.stdout:4/120: truncate fb 744839 0
2026-03-09T00:03:29.150 INFO:tasks.workunit.client.1.vm06.stdout:4/121: chown f16 44 1
2026-03-09T00:03:29.150 INFO:tasks.workunit.client.1.vm06.stdout:4/122: truncate d17/f1d 546942 0
2026-03-09T00:03:29.161 INFO:tasks.workunit.client.1.vm06.stdout:4/123: dread f7 [0,4194304] 0
2026-03-09T00:03:29.162 INFO:tasks.workunit.client.1.vm06.stdout:4/124: mkdir d17/d21 0
2026-03-09T00:03:29.178 INFO:tasks.workunit.client.1.vm06.stdout:6/122: write d4/f6 [1568242,68260] 0
2026-03-09T00:03:29.283 INFO:tasks.workunit.client.1.vm06.stdout:3/114: dwrite d11/f1c [0,4194304] 0
2026-03-09T00:03:29.284 INFO:tasks.workunit.client.1.vm06.stdout:3/115: rmdir d11 39
2026-03-09T00:03:29.292 INFO:tasks.workunit.client.1.vm06.stdout:9/118: dread d1/d3/d12/d21/f8 [0,4194304] 0
2026-03-09T00:03:29.300 INFO:tasks.workunit.client.1.vm06.stdout:3/116: read d11/f18 [3019983,88129] 0
2026-03-09T00:03:29.301 INFO:tasks.workunit.client.1.vm06.stdout:3/117: truncate d11/f13 1518891 0
2026-03-09T00:03:29.350 INFO:tasks.workunit.client.1.vm06.stdout:1/99: dwrite d6/f1d [0,4194304] 0
2026-03-09T00:03:29.493 INFO:tasks.workunit.client.1.vm06.stdout:1/100: dwrite d6/fb [0,4194304] 0
2026-03-09T00:03:29.494 INFO:tasks.workunit.client.1.vm06.stdout:1/101: chown d6/c16 1063202463 1
2026-03-09T00:03:29.494 INFO:tasks.workunit.client.1.vm06.stdout:1/102: write d6/f19 [1323264,82758] 0
2026-03-09T00:03:29.494 INFO:tasks.workunit.client.1.vm06.stdout:1/103: write d6/fa [4011286,123943] 0
2026-03-09T00:03:29.494 INFO:tasks.workunit.client.1.vm06.stdout:1/104: write d6/f20 [672287,112652] 0
2026-03-09T00:03:29.499 INFO:tasks.workunit.client.1.vm06.stdout:1/105: write d6/f1d [1606226,95499] 0
2026-03-09T00:03:29.511 INFO:tasks.workunit.client.1.vm06.stdout:1/106: dread d6/fb [0,4194304] 0
2026-03-09T00:03:29.511 INFO:tasks.workunit.client.1.vm06.stdout:1/107: readlink d6/l17 0
2026-03-09T00:03:29.511 INFO:tasks.workunit.client.1.vm06.stdout:1/108: stat d6/c8 0
2026-03-09T00:03:29.511 INFO:tasks.workunit.client.1.vm06.stdout:1/109: write d6/f1a [1080473,19613] 0
2026-03-09T00:03:29.511 INFO:tasks.workunit.client.1.vm06.stdout:1/110: readlink d6/l10 0
2026-03-09T00:03:29.512 INFO:tasks.workunit.client.1.vm06.stdout:1/111: creat d6/f22 x:0 0 0
2026-03-09T00:03:29.512 INFO:tasks.workunit.client.1.vm06.stdout:1/112: fdatasync d6/f22 0
2026-03-09T00:03:29.512 INFO:tasks.workunit.client.1.vm06.stdout:1/113: write d6/f20 [1686469,100646] 0
2026-03-09T00:03:29.512 INFO:tasks.workunit.client.1.vm06.stdout:1/114: symlink d6/l23 0
2026-03-09T00:03:29.512 INFO:tasks.workunit.client.1.vm06.stdout:1/115: fsync d6/ff 0
2026-03-09T00:03:29.514 INFO:tasks.workunit.client.1.vm06.stdout:1/116: symlink d6/d21/l24 0
2026-03-09T00:03:29.514 INFO:tasks.workunit.client.1.vm06.stdout:1/117: truncate d6/fa 5118423 0
2026-03-09T00:03:29.514 INFO:tasks.workunit.client.1.vm06.stdout:1/118: stat d6/c8 0
2026-03-09T00:03:29.514 INFO:tasks.workunit.client.1.vm06.stdout:1/119: rename d6/f22 to d6/f25 0
2026-03-09T00:03:29.515 INFO:tasks.workunit.client.1.vm06.stdout:1/120: unlink d6/c1f 0
2026-03-09T00:03:29.560 INFO:tasks.workunit.client.1.vm06.stdout:2/216: dread d7/f8 [0,4194304] 0
2026-03-09T00:03:29.560 INFO:tasks.workunit.client.1.vm06.stdout:2/217: symlink d7/d1a/d3c/l42 0
2026-03-09T00:03:29.560 INFO:tasks.workunit.client.1.vm06.stdout:2/218: stat d7/d1b/f37 0
2026-03-09T00:03:29.560 INFO:tasks.workunit.client.1.vm06.stdout:2/219: getdents d7/d1a/d39 0
2026-03-09T00:03:29.560 INFO:tasks.workunit.client.1.vm06.stdout:8/182: dread db/f2d [0,4194304] 0
2026-03-09T00:03:29.560 INFO:tasks.workunit.client.1.vm06.stdout:8/183: dwrite - open db/dd/f1f failed 14
2026-03-09T00:03:29.561 INFO:tasks.workunit.client.1.vm06.stdout:8/184: symlink db/d1e/l39 0
2026-03-09T00:03:29.561 INFO:tasks.workunit.client.1.vm06.stdout:8/185: fdatasync db/dd/f1c 0
2026-03-09T00:03:29.562 INFO:tasks.workunit.client.1.vm06.stdout:8/186: creat db/dd/d24/d36/d38/f3a x:0 0 0
2026-03-09T00:03:29.563 INFO:tasks.workunit.client.1.vm06.stdout:8/187: symlink db/d1e/l3b 0
2026-03-09T00:03:29.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:29 vm03.local ceph-mon[52346]: pgmap v131: 65 pgs: 65 active+clean; 408 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 15 MiB/s rd, 31 MiB/s wr, 486 op/s
2026-03-09T00:03:29.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:29 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:29.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:29 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:29.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:29 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:03:29.590 INFO:tasks.workunit.client.1.vm06.stdout:7/145: dwrite d0/df/d17/f1f [0,4194304] 0
2026-03-09T00:03:29.654 INFO:tasks.workunit.client.1.vm06.stdout:2/220: read d7/d1b/f22 [1557960,36850] 0
2026-03-09T00:03:29.654 INFO:tasks.workunit.client.1.vm06.stdout:2/221: mknod d7/da/d1c/c43 0
2026-03-09T00:03:29.654 INFO:tasks.workunit.client.1.vm06.stdout:2/222: read - d7/da/f20 zero size
2026-03-09T00:03:29.655 INFO:tasks.workunit.client.1.vm06.stdout:2/223: truncate d7/f17 1462216 0
2026-03-09T00:03:29.655 INFO:tasks.workunit.client.1.vm06.stdout:2/224: fsync d7/da/d1c/f1f 0
2026-03-09T00:03:29.694 INFO:tasks.workunit.client.1.vm06.stdout:2/225: dread d7/da/db/de/f11 [0,4194304] 0
2026-03-09T00:03:29.786 INFO:tasks.workunit.client.1.vm06.stdout:9/119: rename d1/d3/d12/d21/d9/f19 to d1/d4/f24 0
2026-03-09T00:03:29.789 INFO:tasks.workunit.client.1.vm06.stdout:1/121: rename d6/l17 to d6/d21/l26 0
2026-03-09T00:03:29.791 INFO:tasks.workunit.client.1.vm06.stdout:1/122: stat d6/c13 0
2026-03-09T00:03:29.791 INFO:tasks.workunit.client.1.vm06.stdout:9/120: mkdir d1/d3/d12/d21/d14/d25 0
2026-03-09T00:03:29.791 INFO:tasks.workunit.client.1.vm06.stdout:1/123: symlink d6/d21/l27 0
2026-03-09T00:03:29.791 INFO:tasks.workunit.client.1.vm06.stdout:1/124: creat d6/f28 x:0 0 0
2026-03-09T00:03:29.794 INFO:tasks.workunit.client.1.vm06.stdout:9/121: write d1/d3/d12/d21/d9/f10 [59394,66922] 0
2026-03-09T00:03:29.794 INFO:tasks.workunit.client.1.vm06.stdout:9/122: write d1/d3/f1f [2590326,18643] 0
2026-03-09T00:03:29.794 INFO:tasks.workunit.client.1.vm06.stdout:9/123: dread - d1/d3/d12/d21/d14/f1a zero size
2026-03-09T00:03:29.794 INFO:tasks.workunit.client.1.vm06.stdout:9/124: fsync d1/d3/f11 0
2026-03-09T00:03:29.801 INFO:tasks.workunit.client.1.vm06.stdout:6/123: dwrite f1 [0,4194304] 0
2026-03-09T00:03:29.805 INFO:tasks.workunit.client.1.vm06.stdout:6/124: mknod d4/c25 0
2026-03-09T00:03:29.805 INFO:tasks.workunit.client.1.vm06.stdout:6/125: fdatasync d4/f23 0
2026-03-09T00:03:29.805 INFO:tasks.workunit.client.1.vm06.stdout:6/126: truncate d4/d16/f21 380138 0
2026-03-09T00:03:29.805 INFO:tasks.workunit.client.1.vm06.stdout:6/127: write d4/f5 [3586079,70353] 0
2026-03-09T00:03:29.844 INFO:tasks.workunit.client.1.vm06.stdout:6/128: dwrite d4/f6 [0,4194304] 0
2026-03-09T00:03:29.844 INFO:tasks.workunit.client.1.vm06.stdout:6/129: chown d4/fb 18 1
2026-03-09T00:03:29.846 INFO:tasks.workunit.client.1.vm06.stdout:6/130: getdents d4/d16 0
2026-03-09T00:03:29.847 INFO:tasks.workunit.client.1.vm06.stdout:6/131: write d4/f12 [290533,92041] 0
2026-03-09T00:03:29.849 INFO:tasks.workunit.client.1.vm06.stdout:6/132: creat d4/f26 x:0 0 0
2026-03-09T00:03:29.857 INFO:tasks.workunit.client.1.vm06.stdout:6/133: dread d4/f6 [0,4194304] 0
2026-03-09T00:03:29.857 INFO:tasks.workunit.client.1.vm06.stdout:6/134: read d4/f12 [407263,72757] 0
2026-03-09T00:03:29.876 INFO:tasks.workunit.client.1.vm06.stdout:7/146: dread d0/fe [0,4194304] 0
2026-03-09T00:03:29.876 INFO:tasks.workunit.client.1.vm06.stdout:7/147: fsync d0/db/f18 0
2026-03-09T00:03:29.904 INFO:tasks.workunit.client.1.vm06.stdout:8/188: dread db/f17 [0,4194304] 0
2026-03-09T00:03:29.904 INFO:tasks.workunit.client.1.vm06.stdout:8/189: dread - db/f31 zero size
2026-03-09T00:03:29.904 INFO:tasks.workunit.client.1.vm06.stdout:8/190: fsync db/d1e/f2e 0
2026-03-09T00:03:29.905 INFO:tasks.workunit.client.1.vm06.stdout:8/191: rename db/dd/c22 to db/dd/d24/d36/d38/c3c 0
2026-03-09T00:03:29.906 INFO:tasks.workunit.client.1.vm06.stdout:8/192: creat db/dd/d24/d36/f3d x:0 0 0
2026-03-09T00:03:29.947 INFO:tasks.workunit.client.1.vm06.stdout:6/135: dwrite d4/f23 [0,4194304] 0
2026-03-09T00:03:29.947 INFO:tasks.workunit.client.1.vm06.stdout:6/136: readlink l0 0
2026-03-09T00:03:29.990 INFO:tasks.workunit.client.1.vm06.stdout:7/148: dwrite d0/f2 [4194304,4194304] 0
2026-03-09T00:03:29.991 INFO:tasks.workunit.client.1.vm06.stdout:7/149: rename d0/df/d17/f26 to d0/df/f29 0
2026-03-09T00:03:29.992 INFO:tasks.workunit.client.1.vm06.stdout:7/150: mknod d0/df/d1a/d22/c2a 0
2026-03-09T00:03:29.992 INFO:tasks.workunit.client.1.vm06.stdout:7/151: mknod d0/c2b 0
2026-03-09T00:03:29.992 INFO:tasks.workunit.client.1.vm06.stdout:7/152: fsync d0/f5 0
2026-03-09T00:03:29.992 INFO:tasks.workunit.client.1.vm06.stdout:7/153: read d0/f5 [2547646,71780] 0
2026-03-09T00:03:29.994 INFO:tasks.workunit.client.1.vm06.stdout:7/154: unlink d0/db/f18 0
2026-03-09T00:03:29.994 INFO:tasks.workunit.client.1.vm06.stdout:7/155: fsync d0/f5 0
2026-03-09T00:03:29.994 INFO:tasks.workunit.client.1.vm06.stdout:7/156: fdatasync d0/f7 0
2026-03-09T00:03:29.994 INFO:tasks.workunit.client.1.vm06.stdout:7/157: fdatasync d0/fa 0
2026-03-09T00:03:29.994 INFO:tasks.workunit.client.1.vm06.stdout:7/158: stat d0/df/d1a/d27 0
2026-03-09T00:03:30.033 INFO:tasks.workunit.client.1.vm06.stdout:7/159: dread d0/fe [0,4194304] 0
2026-03-09T00:03:30.033
INFO:tasks.workunit.client.1.vm06.stdout:7/160: dread - d0/df/d1a/d22/f28 zero size 2026-03-09T00:03:30.051 INFO:tasks.workunit.client.1.vm06.stdout:5/159: dwrite d5/f1b [8388608,4194304] 0 2026-03-09T00:03:30.053 INFO:tasks.workunit.client.1.vm06.stdout:5/160: mknod d5/d1c/d21/d28/c2e 0 2026-03-09T00:03:30.053 INFO:tasks.workunit.client.1.vm06.stdout:5/161: chown d5/f16 75 1 2026-03-09T00:03:30.053 INFO:tasks.workunit.client.1.vm06.stdout:5/162: dread - d5/d1c/f22 zero size 2026-03-09T00:03:30.053 INFO:tasks.workunit.client.1.vm06.stdout:5/163: readlink d5/l11 0 2026-03-09T00:03:30.053 INFO:tasks.workunit.client.1.vm06.stdout:5/164: creat d5/d1c/f2f x:0 0 0 2026-03-09T00:03:30.053 INFO:tasks.workunit.client.1.vm06.stdout:5/165: stat d5/d1c/c25 0 2026-03-09T00:03:30.054 INFO:tasks.workunit.client.1.vm06.stdout:5/166: rename d5/l18 to d5/d1c/l30 0 2026-03-09T00:03:30.055 INFO:tasks.workunit.client.1.vm06.stdout:5/167: rename d5/d1c/f2f to d5/d1c/d21/d2a/f31 0 2026-03-09T00:03:30.055 INFO:tasks.workunit.client.1.vm06.stdout:5/168: creat d5/d1c/d21/f32 x:0 0 0 2026-03-09T00:03:30.057 INFO:tasks.workunit.client.1.vm06.stdout:5/169: creat d5/d1c/d21/d28/f33 x:0 0 0 2026-03-09T00:03:30.057 INFO:tasks.workunit.client.1.vm06.stdout:5/170: write d5/d1c/f2d [851305,27256] 0 2026-03-09T00:03:30.057 INFO:tasks.workunit.client.1.vm06.stdout:5/171: getdents d5/d1c/d2b 0 2026-03-09T00:03:30.057 INFO:tasks.workunit.client.1.vm06.stdout:5/172: fsync d5/d1c/f22 0 2026-03-09T00:03:30.057 INFO:tasks.workunit.client.1.vm06.stdout:5/173: write d5/f15 [750787,120211] 0 2026-03-09T00:03:30.059 INFO:tasks.workunit.client.1.vm06.stdout:5/174: rmdir d5/d1c/d23 39 2026-03-09T00:03:30.121 INFO:tasks.workunit.client.1.vm06.stdout:4/125: dwrite d17/f1e [0,4194304] 0 2026-03-09T00:03:30.121 INFO:tasks.workunit.client.1.vm06.stdout:4/126: write f15 [219884,29391] 0 2026-03-09T00:03:30.121 INFO:tasks.workunit.client.1.vm06.stdout:4/127: write f10 [683446,101198] 0 2026-03-09T00:03:30.123 INFO:tasks.workunit.client.1.vm06.stdout:4/128: unlink fb 0 2026-03-09T00:03:30.141 INFO:tasks.workunit.client.1.vm06.stdout:5/175: dwrite d5/d1c/d21/d28/f33 [0,4194304] 0 2026-03-09T00:03:30.141 INFO:tasks.workunit.client.1.vm06.stdout:5/176: truncate d5/d1c/f22 996644 0 2026-03-09T00:03:30.142 INFO:tasks.workunit.client.1.vm06.stdout:5/177: rename d5/d1c/d2b to d5/d1c/d23/d34 0 2026-03-09T00:03:30.143 INFO:tasks.workunit.client.1.vm06.stdout:5/178: truncate d5/f16 1490092 0 2026-03-09T00:03:30.143 INFO:tasks.workunit.client.1.vm06.stdout:3/118: dwrite d11/f1c [0,4194304] 0 2026-03-09T00:03:30.144 INFO:tasks.workunit.client.1.vm06.stdout:5/179: read d5/f1d [542770,44131] 0 2026-03-09T00:03:30.144 INFO:tasks.workunit.client.1.vm06.stdout:5/180: getdents d5/d1c/d23/d34 0 2026-03-09T00:03:30.146 INFO:tasks.workunit.client.1.vm06.stdout:3/119: symlink d11/l23 0 2026-03-09T00:03:30.147 INFO:tasks.workunit.client.1.vm06.stdout:5/181: mkdir d5/d1c/d21/d28/d35 0 2026-03-09T00:03:30.148 INFO:tasks.workunit.client.1.vm06.stdout:3/120: creat d11/f24 x:0 0 0 2026-03-09T00:03:30.148 INFO:tasks.workunit.client.1.vm06.stdout:3/121: truncate d11/f20 1326615 0 2026-03-09T00:03:30.155 INFO:tasks.workunit.client.1.vm06.stdout:5/182: write d5/f19 [496741,31306] 0 2026-03-09T00:03:30.155 INFO:tasks.workunit.client.1.vm06.stdout:5/183: readlink d5/l11 0 2026-03-09T00:03:30.155 INFO:tasks.workunit.client.1.vm06.stdout:4/129: dwrite fe [0,4194304] 0 2026-03-09T00:03:30.159 INFO:tasks.workunit.client.1.vm06.stdout:4/130: mkdir d17/d21/d22 0 
2026-03-09T00:03:30.161 INFO:tasks.workunit.client.1.vm06.stdout:5/184: unlink d5/f15 0
2026-03-09T00:03:30.161 INFO:tasks.workunit.client.1.vm06.stdout:5/185: fdatasync d5/d1c/d21/f32 0
2026-03-09T00:03:30.189 INFO:tasks.workunit.client.1.vm06.stdout:7/161: rmdir d0/df/d1a/d22 39
2026-03-09T00:03:30.189 INFO:tasks.workunit.client.1.vm06.stdout:7/162: truncate d0/df/f13 509536 0
2026-03-09T00:03:30.189 INFO:tasks.workunit.client.1.vm06.stdout:7/163: chown d0/c4 19682 1
2026-03-09T00:03:30.189 INFO:tasks.workunit.client.1.vm06.stdout:7/164: creat d0/df/d1a/d22/f2c x:0 0 0
2026-03-09T00:03:30.189 INFO:tasks.workunit.client.1.vm06.stdout:7/165: chown d0/db 3753 1
2026-03-09T00:03:30.280 INFO:tasks.workunit.client.1.vm06.stdout:5/186: rmdir d5/d1c/d21 39
2026-03-09T00:03:30.281 INFO:tasks.workunit.client.1.vm06.stdout:5/187: stat d5/cd 0
2026-03-09T00:03:30.283 INFO:tasks.workunit.client.1.vm06.stdout:5/188: creat d5/f36 x:0 0 0
2026-03-09T00:03:30.286 INFO:tasks.workunit.client.1.vm06.stdout:5/189: mknod d5/d1c/c37 0
2026-03-09T00:03:30.286 INFO:tasks.workunit.client.1.vm06.stdout:7/166: rmdir d0/df/d1a 39
2026-03-09T00:03:30.286 INFO:tasks.workunit.client.1.vm06.stdout:7/167: dread - d0/df/d1a/d22/f28 zero size
2026-03-09T00:03:30.286 INFO:tasks.workunit.client.1.vm06.stdout:7/168: readlink d0/df/l10 0
2026-03-09T00:03:30.290 INFO:tasks.workunit.client.1.vm06.stdout:5/190: symlink d5/l38 0
2026-03-09T00:03:30.300 INFO:tasks.workunit.client.1.vm06.stdout:7/169: truncate d0/f5 2556403 0
2026-03-09T00:03:30.300 INFO:tasks.workunit.client.1.vm06.stdout:7/170: write d0/db/f23 [1120537,71184] 0
2026-03-09T00:03:30.300 INFO:tasks.workunit.client.1.vm06.stdout:7/171: dread - d0/df/f29 zero size
2026-03-09T00:03:30.381 INFO:tasks.workunit.client.1.vm06.stdout:6/137: dwrite d4/f5 [8388608,4194304] 0
2026-03-09T00:03:30.381 INFO:tasks.workunit.client.1.vm06.stdout:6/138: chown l0 20 1
2026-03-09T00:03:30.382 INFO:tasks.workunit.client.1.vm06.stdout:6/139: truncate d4/f23 2009633 0
2026-03-09T00:03:30.430 INFO:tasks.workunit.client.1.vm06.stdout:2/226: dwrite d7/d1a/d25/f33 [0,4194304] 0
2026-03-09T00:03:30.430 INFO:tasks.workunit.client.1.vm06.stdout:2/227: readlink d7/da/l24 0
2026-03-09T00:03:30.430 INFO:tasks.workunit.client.1.vm06.stdout:2/228: chown d7/da/db/l35 5583838 1
2026-03-09T00:03:30.432 INFO:tasks.workunit.client.1.vm06.stdout:2/229: truncate f6 4039020 0
2026-03-09T00:03:30.440 INFO:tasks.workunit.client.1.vm06.stdout:2/230: dread d7/da/d1c/f29 [0,4194304] 0
2026-03-09T00:03:30.442 INFO:tasks.workunit.client.1.vm06.stdout:2/231: symlink d7/da/db/l44 0
2026-03-09T00:03:30.444 INFO:tasks.workunit.client.1.vm06.stdout:2/232: mkdir d7/d1a/d3c/d45 0
2026-03-09T00:03:30.444 INFO:tasks.workunit.client.1.vm06.stdout:2/233: getdents d7/d1a/d39 0
2026-03-09T00:03:30.444 INFO:tasks.workunit.client.1.vm06.stdout:2/234: read - d7/da/db/de/f32 zero size
2026-03-09T00:03:30.444 INFO:tasks.workunit.client.1.vm06.stdout:2/235: creat d7/d1b/f46 x:0 0 0
2026-03-09T00:03:30.444 INFO:tasks.workunit.client.1.vm06.stdout:2/236: fsync d7/f17 0
2026-03-09T00:03:30.444 INFO:tasks.workunit.client.1.vm06.stdout:2/237: write d7/d1a/f2f [1605858,43456] 0
2026-03-09T00:03:30.445 INFO:tasks.workunit.client.1.vm06.stdout:2/238: rmdir d7/d1a/d3c/d45 0
2026-03-09T00:03:30.446 INFO:tasks.workunit.client.1.vm06.stdout:2/239: mkdir d7/da/d1c/d47 0
2026-03-09T00:03:30.447 INFO:tasks.workunit.client.1.vm06.stdout:2/240: rename d7/d1a/f2f to d7/f48 0
2026-03-09T00:03:30.448 INFO:tasks.workunit.client.1.vm06.stdout:2/241: dread d7/da/d1c/f29 [4194304,4194304] 0
2026-03-09T00:03:30.461 INFO:tasks.workunit.client.1.vm06.stdout:6/140: dwrite d4/f12 [0,4194304] 0
2026-03-09T00:03:30.469 INFO:tasks.workunit.client.1.vm06.stdout:6/141: dread d4/fc [0,4194304] 0
2026-03-09T00:03:30.469 INFO:tasks.workunit.client.1.vm06.stdout:5/191: dwrite d5/d1c/f22 [0,4194304] 0
2026-03-09T00:03:30.470 INFO:tasks.workunit.client.1.vm06.stdout:6/142: mkdir d4/d27 0
2026-03-09T00:03:30.471 INFO:tasks.workunit.client.1.vm06.stdout:6/143: write d4/fb [703843,75863] 0
2026-03-09T00:03:30.473 INFO:tasks.workunit.client.1.vm06.stdout:5/192: rename d5/f1b to d5/d1c/d21/f39 0
2026-03-09T00:03:30.474 INFO:tasks.workunit.client.1.vm06.stdout:5/193: symlink d5/d1c/l3a 0
2026-03-09T00:03:30.501 INFO:tasks.workunit.client.1.vm06.stdout:3/122: dwrite d11/f16 [0,4194304] 0
2026-03-09T00:03:30.502 INFO:tasks.workunit.client.1.vm06.stdout:3/123: mknod d11/c25 0
2026-03-09T00:03:30.503 INFO:tasks.workunit.client.1.vm06.stdout:3/124: mknod d11/c26 0
2026-03-09T00:03:30.503 INFO:tasks.workunit.client.1.vm06.stdout:3/125: creat d11/f27 x:0 0 0
2026-03-09T00:03:30.535 INFO:tasks.workunit.client.1.vm06.stdout:2/242: dwrite d7/d1a/f30 [0,4194304] 0
2026-03-09T00:03:30.541 INFO:tasks.workunit.client.1.vm06.stdout:5/194: dwrite d5/f14 [4194304,4194304] 0
2026-03-09T00:03:30.541 INFO:tasks.workunit.client.1.vm06.stdout:5/195: fsync d5/f14 0
2026-03-09T00:03:30.549 INFO:tasks.workunit.client.1.vm06.stdout:5/196: read d5/d1c/d21/f39 [10529340,21898] 0
2026-03-09T00:03:30.549 INFO:tasks.workunit.client.1.vm06.stdout:5/197: chown d5/d1c/d21 50 1
2026-03-09T00:03:30.549 INFO:tasks.workunit.client.1.vm06.stdout:5/198: write d5/f36 [52380,112587] 0
2026-03-09T00:03:30.549 INFO:tasks.workunit.client.1.vm06.stdout:5/199: write d5/f1d [1752924,130747] 0
2026-03-09T00:03:30.553 INFO:tasks.workunit.client.1.vm06.stdout:3/126: dwrite f3 [4194304,4194304] 0
2026-03-09T00:03:30.553 INFO:tasks.workunit.client.1.vm06.stdout:3/127: dread - d11/f27 zero size
2026-03-09T00:03:30.553 INFO:tasks.workunit.client.1.vm06.stdout:3/128: write d11/f27 [486263,9885] 0
2026-03-09T00:03:30.553 INFO:tasks.workunit.client.1.vm06.stdout:3/129: chown d11/f13 767 1
2026-03-09T00:03:30.554 INFO:tasks.workunit.client.1.vm06.stdout:5/200: creat d5/d1c/d21/d28/f3b x:0 0 0
2026-03-09T00:03:30.555 INFO:tasks.workunit.client.1.vm06.stdout:5/201: creat d5/d1c/d21/f3c x:0 0 0
2026-03-09T00:03:30.570 INFO:tasks.workunit.client.1.vm06.stdout:6/144: dwrite f1 [0,4194304] 0
2026-03-09T00:03:30.570 INFO:tasks.workunit.client.1.vm06.stdout:6/145: chown d4/d16/c1b 46071151 1
2026-03-09T00:03:30.571 INFO:tasks.workunit.client.1.vm06.stdout:6/146: symlink d4/d16/l28 0
2026-03-09T00:03:30.588 INFO:tasks.workunit.client.1.vm06.stdout:5/202: dwrite d5/d1c/d21/d28/f3b [0,4194304] 0
2026-03-09T00:03:30.592 INFO:tasks.workunit.client.1.vm06.stdout:5/203: write d5/fe [2498764,106303] 0
2026-03-09T00:03:30.592 INFO:tasks.workunit.client.1.vm06.stdout:5/204: creat d5/f3d x:0 0 0
2026-03-09T00:03:30.594 INFO:tasks.workunit.client.1.vm06.stdout:5/205: rename d5/d1c/c25 to d5/d1c/d23/d34/c3e 0
2026-03-09T00:03:30.594 INFO:tasks.workunit.client.1.vm06.stdout:5/206: creat d5/d1c/d21/d2a/f3f x:0 0 0
2026-03-09T00:03:30.596 INFO:tasks.workunit.client.1.vm06.stdout:5/207: creat d5/d1c/d21/d28/d35/f40 x:0 0 0
2026-03-09T00:03:30.597 INFO:tasks.workunit.client.1.vm06.stdout:5/208: mknod d5/d1c/c41 0
2026-03-09T00:03:30.597 INFO:tasks.workunit.client.1.vm06.stdout:5/209: creat d5/d1c/d23/f42 x:0 0 0
2026-03-09T00:03:30.597 INFO:tasks.workunit.client.1.vm06.stdout:5/210: chown d5/d1c/d21/d2a/f31 23234533 1
2026-03-09T00:03:30.633 INFO:tasks.workunit.client.1.vm06.stdout:2/243: getdents d7/d1a/d3c 0
2026-03-09T00:03:30.633 INFO:tasks.workunit.client.1.vm06.stdout:2/244: write d7/d1b/f22 [3089300,64497] 0
2026-03-09T00:03:30.648 INFO:tasks.workunit.client.1.vm06.stdout:2/245: write d7/f17 [1348454,60088] 0
2026-03-09T00:03:30.648 INFO:tasks.workunit.client.1.vm06.stdout:2/246: readlink d7/d1a/d3c/l42 0
2026-03-09T00:03:30.649 INFO:tasks.workunit.client.1.vm06.stdout:2/247: unlink d7/d1a/d39/c3f 0
2026-03-09T00:03:30.649 INFO:tasks.workunit.client.1.vm06.stdout:2/248: fdatasync d7/da/db/de/f32 0
2026-03-09T00:03:30.649 INFO:tasks.workunit.client.1.vm06.stdout:2/249: creat d7/da/db/de/f49 x:0 0 0
2026-03-09T00:03:30.650 INFO:tasks.workunit.client.1.vm06.stdout:6/147: dwrite d4/f23 [0,4194304] 0
2026-03-09T00:03:30.650 INFO:tasks.workunit.client.1.vm06.stdout:6/148: fsync d4/d16/f1c 0
2026-03-09T00:03:30.650 INFO:tasks.workunit.client.1.vm06.stdout:2/250: truncate f3 2558766 0
2026-03-09T00:03:30.652 INFO:tasks.workunit.client.1.vm06.stdout:6/149: rename d4/d16/c1b to d4/d16/c29 0
2026-03-09T00:03:30.653 INFO:tasks.workunit.client.1.vm06.stdout:2/251: mknod d7/da/db/c4a 0
2026-03-09T00:03:30.653 INFO:tasks.workunit.client.1.vm06.stdout:2/252: write d7/d1b/f3b [443712,112305] 0
2026-03-09T00:03:30.653 INFO:tasks.workunit.client.1.vm06.stdout:6/150: truncate d4/d16/f1c 470265 0
2026-03-09T00:03:30.653 INFO:tasks.workunit.client.1.vm06.stdout:6/151: chown d4/d16/c1f 5 1
2026-03-09T00:03:30.655 INFO:tasks.workunit.client.1.vm06.stdout:2/253: write d7/da/d1c/f29 [3501199,35734] 0
2026-03-09T00:03:30.655 INFO:tasks.workunit.client.1.vm06.stdout:2/254: stat d7/d1b/d31/l38 0
2026-03-09T00:03:30.655 INFO:tasks.workunit.client.1.vm06.stdout:2/255: chown d7/d1a/d25/f33 192453 1
2026-03-09T00:03:30.656 INFO:tasks.workunit.client.1.vm06.stdout:2/256: truncate d7/da/db/de/f11 3726411 0
2026-03-09T00:03:30.658 INFO:tasks.workunit.client.1.vm06.stdout:2/257: mkdir d7/d4b 0
2026-03-09T00:03:30.660 INFO:tasks.workunit.client.1.vm06.stdout:2/258: dread d7/da/d1c/f1f [4194304,4194304] 0
2026-03-09T00:03:30.661 INFO:tasks.workunit.client.1.vm06.stdout:3/130: dread d11/f27 [0,4194304] 0
2026-03-09T00:03:30.661 INFO:tasks.workunit.client.1.vm06.stdout:3/131: stat d11/f24 0
2026-03-09T00:03:30.661 INFO:tasks.workunit.client.1.vm06.stdout:3/132: readlink d11/l21 0
2026-03-09T00:03:30.663 INFO:tasks.workunit.client.1.vm06.stdout:3/133: mkdir d11/d28 0
2026-03-09T00:03:30.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:30 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:03:30.679 INFO:tasks.workunit.client.1.vm06.stdout:5/211: truncate d5/d1c/d21/f39 7773287 0
2026-03-09T00:03:30.681 INFO:tasks.workunit.client.1.vm06.stdout:5/212: rename d5/f1f to d5/f43 0
2026-03-09T00:03:30.681 INFO:tasks.workunit.client.1.vm06.stdout:5/213: mkdir d5/d44 0
2026-03-09T00:03:30.682 INFO:tasks.workunit.client.1.vm06.stdout:5/214: rename d5/d1c/d21/d28/c2e to d5/d1c/d23/c45 0
2026-03-09T00:03:30.686 INFO:tasks.workunit.client.1.vm06.stdout:5/215: unlink d5/f16 0
2026-03-09T00:03:30.686 INFO:tasks.workunit.client.1.vm06.stdout:5/216: creat d5/d1c/d21/d28/d35/f46 x:0 0 0
2026-03-09T00:03:30.699 INFO:tasks.workunit.client.1.vm06.stdout:1/125: dread d6/f1a [0,4194304] 0
2026-03-09T00:03:30.699 INFO:tasks.workunit.client.1.vm06.stdout:1/126: write d6/f1b [903592,55596] 0
2026-03-09T00:03:30.699 INFO:tasks.workunit.client.1.vm06.stdout:2/259: fsync d7/f48 0
2026-03-09T00:03:30.702 INFO:tasks.workunit.client.1.vm06.stdout:1/127: unlink d6/f9 0
2026-03-09T00:03:30.705 INFO:tasks.workunit.client.1.vm06.stdout:7/172: truncate d0/f5 1072147 0
2026-03-09T00:03:30.705 INFO:tasks.workunit.client.1.vm06.stdout:7/173: write d0/df/d17/f1f [5051171,110755] 0
2026-03-09T00:03:30.712 INFO:tasks.workunit.client.1.vm06.stdout:1/128: mknod d6/c29 0
2026-03-09T00:03:30.712 INFO:tasks.workunit.client.1.vm06.stdout:7/174: link d0/df/d1a/d22/f28 d0/df/d17/f2d 0
2026-03-09T00:03:30.712 INFO:tasks.workunit.client.1.vm06.stdout:7/175: readlink d0/df/l10 0
2026-03-09T00:03:30.713 INFO:tasks.workunit.client.1.vm06.stdout:1/129: creat d6/d21/f2a x:0 0 0
2026-03-09T00:03:30.716 INFO:tasks.workunit.client.1.vm06.stdout:7/176: unlink d0/db/c12 0
2026-03-09T00:03:30.716 INFO:tasks.workunit.client.1.vm06.stdout:7/177: write d0/df/f29 [654786,57536] 0
2026-03-09T00:03:30.717 INFO:tasks.workunit.client.1.vm06.stdout:2/260: write f6 [3185219,121596] 0
2026-03-09T00:03:30.717 INFO:tasks.workunit.client.1.vm06.stdout:1/130: symlink d6/l2b 0
2026-03-09T00:03:30.717 INFO:tasks.workunit.client.1.vm06.stdout:1/131: readlink d6/l1e 0
2026-03-09T00:03:30.717 INFO:tasks.workunit.client.1.vm06.stdout:2/261: chown d7/d1b/f37 688625 1
2026-03-09T00:03:30.717 INFO:tasks.workunit.client.1.vm06.stdout:7/178: dread d0/df/f13 [0,4194304] 0
2026-03-09T00:03:30.721 INFO:tasks.workunit.client.1.vm06.stdout:7/179: symlink d0/df/d1a/d22/l2e 0
2026-03-09T00:03:30.726 INFO:tasks.workunit.client.1.vm06.stdout:1/132: write d6/ff [2919867,115964] 0
2026-03-09T00:03:30.729 INFO:tasks.workunit.client.1.vm06.stdout:7/180: write d0/f2 [1465175,44130] 0
2026-03-09T00:03:30.748 INFO:tasks.workunit.client.1.vm06.stdout:7/181: symlink d0/df/d1a/l2f 0
2026-03-09T00:03:30.750 INFO:tasks.workunit.client.1.vm06.stdout:3/134: dwrite d11/f1b [0,4194304] 0
2026-03-09T00:03:30.762 INFO:tasks.workunit.client.1.vm06.stdout:3/135: rename d11/f1c to d11/d28/f29 0
2026-03-09T00:03:30.792 INFO:tasks.workunit.client.0.vm03.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp
2026-03-09T00:03:30.792 INFO:tasks.workunit.client.0.vm03.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress
2026-03-09T00:03:30.792 INFO:tasks.workunit.client.0.vm03.stderr:+ make
2026-03-09T00:03:30.796 INFO:tasks.workunit.client.1.vm06.stdout:1/133: dwrite d6/f25 [0,4194304] 0
2026-03-09T00:03:30.798 INFO:tasks.workunit.client.1.vm06.stdout:6/152: dread d4/d16/f1c [0,4194304] 0
2026-03-09T00:03:30.799 INFO:tasks.workunit.client.1.vm06.stdout:7/182: dread d0/db/f23 [0,4194304] 0
2026-03-09T00:03:30.799 INFO:tasks.workunit.client.1.vm06.stdout:7/183: write d0/df/d17/f2d [631548,80004] 0
2026-03-09T00:03:30.804 INFO:tasks.workunit.client.1.vm06.stdout:1/134: truncate d6/fa 3848935 0
2026-03-09T00:03:30.804 INFO:tasks.workunit.client.1.vm06.stdout:1/135: write d6/f1d [435760,103269] 0
2026-03-09T00:03:30.804 INFO:tasks.workunit.client.1.vm06.stdout:1/136: chown l2 94037 1
2026-03-09T00:03:30.804 INFO:tasks.workunit.client.1.vm06.stdout:1/137: write d6/f1b [1190216,130375] 0
2026-03-09T00:03:30.805 INFO:tasks.workunit.client.1.vm06.stdout:7/184: write d0/fe [415666,103397] 0
2026-03-09T00:03:30.805 INFO:tasks.workunit.client.1.vm06.stdout:7/185: chown d0/df/d17/f21 358038 1
2026-03-09T00:03:30.805 INFO:tasks.workunit.client.1.vm06.stdout:7/186: chown d0/c4 16331911 1
2026-03-09T00:03:30.808 INFO:tasks.workunit.client.1.vm06.stdout:1/138: symlink d6/d21/l2c 0
2026-03-09T00:03:30.811 INFO:tasks.workunit.client.1.vm06.stdout:7/187: mknod d0/c30 0
2026-03-09T00:03:30.824 INFO:tasks.workunit.client.1.vm06.stdout:2/262: dwrite d7/d1a/d25/f33 [0,4194304] 0
2026-03-09T00:03:30.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:30 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:03:30.843 INFO:tasks.workunit.client.1.vm06.stdout:5/217: getdents d5/d1c/d21/d28 0
2026-03-09T00:03:30.844 INFO:tasks.workunit.client.1.vm06.stdout:5/218: fsync d5/d1c/d21/f32 0
2026-03-09T00:03:30.844 INFO:tasks.workunit.client.1.vm06.stdout:5/219: dread - d5/d1c/d23/f42 zero size
2026-03-09T00:03:30.844 INFO:tasks.workunit.client.1.vm06.stdout:5/220: truncate d5/d1c/d21/d2a/f3f 167065 0
2026-03-09T00:03:30.850 INFO:tasks.workunit.client.1.vm06.stdout:5/221: write d5/f43 [3669841,98133] 0
2026-03-09T00:03:30.850 INFO:tasks.workunit.client.1.vm06.stdout:5/222: write d5/d1c/d21/d28/d35/f40 [263112,114921] 0
2026-03-09T00:03:30.869 INFO:tasks.workunit.client.1.vm06.stdout:5/223: mkdir d5/d1c/d23/d34/d47 0
2026-03-09T00:03:30.870 INFO:tasks.workunit.client.1.vm06.stdout:7/188: getdents d0/df/d1a/d22 0
2026-03-09T00:03:30.871 INFO:tasks.workunit.client.1.vm06.stdout:7/189: rmdir d0/df/d1a/d22 39
2026-03-09T00:03:30.871 INFO:tasks.workunit.client.1.vm06.stdout:7/190: write d0/f14 [5047050,110186] 0
2026-03-09T00:03:30.881 INFO:tasks.workunit.client.1.vm06.stdout:3/136: truncate f8 2637624 0
2026-03-09T00:03:30.883 INFO:tasks.workunit.client.1.vm06.stdout:3/137: mknod d11/d28/c2a 0
2026-03-09T00:03:30.888 INFO:tasks.workunit.client.1.vm06.stdout:3/138: write d11/f27 [145332,47418] 0
2026-03-09T00:03:30.890 INFO:tasks.workunit.client.1.vm06.stdout:3/139: creat d11/d28/f2b x:0 0 0
2026-03-09T00:03:30.890 INFO:tasks.workunit.client.1.vm06.stdout:3/140: creat d11/d28/f2c x:0 0 0
2026-03-09T00:03:30.890 INFO:tasks.workunit.client.1.vm06.stdout:3/141: chown c0 198644 1
2026-03-09T00:03:30.892 INFO:tasks.workunit.client.1.vm06.stdout:5/224: read d5/d1c/d21/d2a/f3f [128937,130080] 0
2026-03-09T00:03:30.898 INFO:tasks.workunit.client.1.vm06.stdout:5/225: write d5/d1c/f22 [2684447,94715] 0
2026-03-09T00:03:30.898 INFO:tasks.workunit.client.1.vm06.stdout:5/226: truncate d5/f3d 878358 0
2026-03-09T00:03:30.898 INFO:tasks.workunit.client.1.vm06.stdout:5/227: mknod d5/d44/c48 0
2026-03-09T00:03:30.899 INFO:tasks.workunit.client.1.vm06.stdout:5/228: read d5/f1d [1177812,96931] 0
2026-03-09T00:03:30.904 INFO:tasks.workunit.client.1.vm06.stdout:7/191: read d0/df/d17/f1f [1064629,16648] 0
2026-03-09T00:03:30.904 INFO:tasks.workunit.client.1.vm06.stdout:7/192: dread - d0/df/d1a/d22/f2c zero size
2026-03-09T00:03:30.909 INFO:tasks.workunit.client.1.vm06.stdout:5/229: dread d5/d1c/d21/d2a/f3f [0,4194304] 0
2026-03-09T00:03:30.909 INFO:tasks.workunit.client.1.vm06.stdout:5/230: mkdir d5/d1c/d21/d28/d35/d49 0
2026-03-09T00:03:30.912 INFO:tasks.workunit.client.1.vm06.stdout:7/193: fdatasync d0/f14 0
2026-03-09T00:03:30.912 INFO:tasks.workunit.client.1.vm06.stdout:7/194: write d0/f6 [702190,114072] 0
2026-03-09T00:03:30.912 INFO:tasks.workunit.client.1.vm06.stdout:7/195: fsync d0/df/d1a/d22/f2c 0
2026-03-09T00:03:30.923 INFO:tasks.workunit.client.1.vm06.stdout:1/139: getdents d6/d21 0
2026-03-09T00:03:30.924 INFO:tasks.workunit.client.1.vm06.stdout:1/140: stat d6/l10 0
2026-03-09T00:03:30.924 INFO:tasks.workunit.client.1.vm06.stdout:6/153: truncate f1 502541 0
2026-03-09T00:03:30.927 INFO:tasks.workunit.client.1.vm06.stdout:1/141: write d6/fb [3177660,34353] 0
2026-03-09T00:03:30.927 INFO:tasks.workunit.client.1.vm06.stdout:1/142: write d6/d21/f2a [743853,25882] 0
2026-03-09T00:03:30.927 INFO:tasks.workunit.client.1.vm06.stdout:6/154: unlink d4/d16/c18 0
2026-03-09T00:03:30.929 INFO:tasks.workunit.client.1.vm06.stdout:1/143: mkdir d6/d21/d2d 0
2026-03-09T00:03:30.930 INFO:tasks.workunit.client.1.vm06.stdout:6/155: unlink d4/f1d 0
2026-03-09T00:03:30.930 INFO:tasks.workunit.client.1.vm06.stdout:6/156: fsync d4/f12 0
2026-03-09T00:03:30.930 INFO:tasks.workunit.client.1.vm06.stdout:1/144: creat d6/d21/f2e x:0 0 0
2026-03-09T00:03:30.931 INFO:tasks.workunit.client.1.vm06.stdout:6/157: dread d4/d16/f1c [0,4194304] 0
2026-03-09T00:03:30.932 INFO:tasks.workunit.client.1.vm06.stdout:6/158: read d4/d16/f1c [166903,69741] 0
2026-03-09T00:03:30.932 INFO:tasks.workunit.client.1.vm06.stdout:1/145: rename d6/c13 to d6/d21/d2d/c2f 0
2026-03-09T00:03:30.933 INFO:tasks.workunit.client.1.vm06.stdout:9/125: dwrite d1/d3/d12/d21/f8 [0,4194304] 0
2026-03-09T00:03:30.937 INFO:tasks.workunit.client.1.vm06.stdout:6/159: rename d4/f6 to d4/f2a 0
2026-03-09T00:03:30.937 INFO:tasks.workunit.client.1.vm06.stdout:6/160: chown d4/f22 42079056 1
2026-03-09T00:03:30.937 INFO:tasks.workunit.client.1.vm06.stdout:6/161: stat d4/fb 0
2026-03-09T00:03:30.938 INFO:tasks.workunit.client.1.vm06.stdout:1/146: mknod d6/d21/c30 0
2026-03-09T00:03:30.939 INFO:tasks.workunit.client.1.vm06.stdout:6/162: mknod d4/d16/c2b 0
2026-03-09T00:03:30.966 INFO:tasks.workunit.client.1.vm06.stdout:5/231: dread d5/d1c/d21/d28/d35/f40 [0,4194304] 0
2026-03-09T00:03:30.966 INFO:tasks.workunit.client.1.vm06.stdout:5/232: creat d5/d44/f4a x:0 0 0
2026-03-09T00:03:30.966 INFO:tasks.workunit.client.1.vm06.stdout:5/233: dread - d5/d1c/d21/f3c zero size
2026-03-09T00:03:30.966 INFO:tasks.workunit.client.1.vm06.stdout:5/234: chown d5/d1c/d21/d28/d35 169220 1
2026-03-09T00:03:30.990 INFO:tasks.workunit.client.1.vm06.stdout:5/235: dread d5/f43 [0,4194304] 0
2026-03-09T00:03:31.030 INFO:tasks.workunit.client.1.vm06.stdout:6/163: truncate d4/d16/f21 330982 0
2026-03-09T00:03:31.032 INFO:tasks.workunit.client.1.vm06.stdout:6/164: link l0 d4/d27/l2c 0
2026-03-09T00:03:31.032 INFO:tasks.workunit.client.1.vm06.stdout:6/165: chown d4/d16/c17 23836023 1
2026-03-09T00:03:31.033 INFO:tasks.workunit.client.1.vm06.stdout:6/166: rmdir d4/d27 39
2026-03-09T00:03:31.033 INFO:tasks.workunit.client.1.vm06.stdout:6/167: creat d4/f2d x:0 0 0
2026-03-09T00:03:31.033 INFO:tasks.workunit.client.1.vm06.stdout:3/142: dwrite d11/f1a [0,4194304] 0
2026-03-09T00:03:31.036 INFO:tasks.workunit.client.1.vm06.stdout:3/143: mknod d11/d28/c2d 0
2026-03-09T00:03:31.036 INFO:tasks.workunit.client.1.vm06.stdout:6/168: creat d4/d27/f2e x:0 0 0
2026-03-09T00:03:31.041 INFO:tasks.workunit.client.1.vm06.stdout:5/236: dwrite d5/d44/f4a [0,4194304] 0
2026-03-09T00:03:31.041 INFO:tasks.workunit.client.1.vm06.stdout:5/237: write d5/f1d [32734,126669] 0
2026-03-09T00:03:31.041 INFO:tasks.workunit.client.1.vm06.stdout:5/238: write d5/fe [347191,45354] 0
2026-03-09T00:03:31.056 INFO:tasks.workunit.client.1.vm06.stdout:5/239: write d5/ff [4114881,94922] 0
2026-03-09T00:03:31.057 INFO:tasks.workunit.client.1.vm06.stdout:9/126: truncate d1/d3/d12/d21/f8 208428 0
2026-03-09T00:03:31.057 INFO:tasks.workunit.client.1.vm06.stdout:9/127: chown d1/d3/d12/d21 1681633643 1
2026-03-09T00:03:31.058 INFO:tasks.workunit.client.1.vm06.stdout:5/240: mkdir d5/d44/d4b 0
2026-03-09T00:03:31.058 INFO:tasks.workunit.client.1.vm06.stdout:5/241: creat d5/d1c/d23/f4c x:0 0 0
2026-03-09T00:03:31.060 INFO:tasks.workunit.client.1.vm06.stdout:5/242: dread d5/d1c/f2d [0,4194304] 0
2026-03-09T00:03:31.060 INFO:tasks.workunit.client.1.vm06.stdout:5/243: mknod d5/d44/d4b/c4d 0
2026-03-09T00:03:31.060 INFO:tasks.workunit.client.1.vm06.stdout:5/244: creat d5/d1c/d21/d28/d35/f4e x:0 0 0
2026-03-09T00:03:31.061 INFO:tasks.workunit.client.1.vm06.stdout:1/147: dwrite d6/f1a [0,4194304] 0
2026-03-09T00:03:31.066 INFO:tasks.workunit.client.1.vm06.stdout:1/148: creat d6/d21/d2d/f31 x:0 0 0
2026-03-09T00:03:31.070 INFO:tasks.workunit.client.1.vm06.stdout:1/149: fsync d6/f1b 0
2026-03-09T00:03:31.092 INFO:tasks.workunit.client.1.vm06.stdout:1/150: dread d6/fa [0,4194304] 0
2026-03-09T00:03:31.111 INFO:tasks.workunit.client.1.vm06.stdout:6/169: dwrite d4/fc [0,4194304] 0
2026-03-09T00:03:31.114 INFO:tasks.workunit.client.1.vm06.stdout:6/170: link d4/le d4/d27/l2f 0
2026-03-09T00:03:31.140 INFO:tasks.workunit.client.1.vm06.stdout:9/128: dwrite d1/d4/f24 [0,4194304] 0
2026-03-09T00:03:31.143 INFO:tasks.workunit.client.1.vm06.stdout:9/129: mknod d1/d3/d12/d21/d9/c26 0
2026-03-09T00:03:31.144 INFO:tasks.workunit.client.1.vm06.stdout:8/193: dwrite f7 [0,4194304] 0
2026-03-09T00:03:31.146 INFO:tasks.workunit.client.0.vm03.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress
2026-03-09T00:03:31.154 INFO:tasks.workunit.client.1.vm06.stdout:8/194: mknod db/dd/c3e 0
2026-03-09T00:03:31.155 INFO:tasks.workunit.client.1.vm06.stdout:8/195: write db/f28 [493605,98938] 0
2026-03-09T00:03:31.155 INFO:tasks.workunit.client.1.vm06.stdout:8/196: chown db/dd/f13 26037 1
2026-03-09T00:03:31.156 INFO:tasks.workunit.client.1.vm06.stdout:8/197: creat db/f3f x:0 0 0
2026-03-09T00:03:31.157 INFO:tasks.workunit.client.1.vm06.stdout:8/198: fsync db/dd/fe 0
2026-03-09T00:03:31.157 INFO:tasks.workunit.client.1.vm06.stdout:8/199: stat db/dd/c3e 0
2026-03-09T00:03:31.157 INFO:tasks.workunit.client.1.vm06.stdout:5/245: dwrite d5/d1c/d21/d28/f3b [0,4194304] 0
2026-03-09T00:03:31.157 INFO:tasks.workunit.client.1.vm06.stdout:5/246: fdatasync d5/d1c/d21/f3c 0
2026-03-09T00:03:31.157 INFO:tasks.workunit.client.1.vm06.stdout:5/247: chown d5/d1c/d23/l2c 193001549 1
2026-03-09T00:03:31.173 INFO:tasks.workunit.client.1.vm06.stdout:5/248: dread d5/f19 [0,4194304] 0
2026-03-09T00:03:31.173 INFO:tasks.workunit.client.1.vm06.stdout:1/151: dwrite d6/f20 [0,4194304] 0
2026-03-09T00:03:31.173 INFO:tasks.workunit.client.1.vm06.stdout:1/152: write d6/d21/f2a [1045197,97619] 0
2026-03-09T00:03:31.175 INFO:tasks.workunit.client.1.vm06.stdout:5/249: link d5/ff d5/d1c/d23/f4f 0
2026-03-09T00:03:31.175 INFO:tasks.workunit.client.1.vm06.stdout:5/250: write d5/d1c/d21/d2a/f3f [665232,117340] 0
2026-03-09T00:03:31.175 INFO:tasks.workunit.client.1.vm06.stdout:5/251: chown d5/d1c/d21/d28/d35/f46 234384779 1
2026-03-09T00:03:31.176 INFO:tasks.workunit.client.1.vm06.stdout:5/252: write d5/d1c/f2d [533421,128229] 0
2026-03-09T00:03:31.183 INFO:tasks.workunit.client.1.vm06.stdout:1/153: truncate d6/f7 3811775 0
2026-03-09T00:03:31.195 INFO:tasks.workunit.client.1.vm06.stdout:5/253: mknod d5/c50 0
2026-03-09T00:03:31.195 INFO:tasks.workunit.client.1.vm06.stdout:6/171: getdents d4 0
2026-03-09T00:03:31.195 INFO:tasks.workunit.client.1.vm06.stdout:3/144: getdents d11/d28 0
2026-03-09T00:03:31.195 INFO:tasks.workunit.client.1.vm06.stdout:1/154: symlink d6/d21/d2d/l32 0
2026-03-09T00:03:31.195 INFO:tasks.workunit.client.1.vm06.stdout:5/254: mkdir d5/d1c/d23/d51 0
2026-03-09T00:03:31.195 INFO:tasks.workunit.client.1.vm06.stdout:5/255: creat d5/d1c/d21/d28/d35/f52 x:0 0 0
2026-03-09T00:03:31.198 INFO:tasks.workunit.client.1.vm06.stdout:3/145: read f8 [1519083,18610] 0
2026-03-09T00:03:31.208 INFO:tasks.workunit.client.1.vm06.stdout:6/172: mknod d4/c30 0
2026-03-09T00:03:31.211 INFO:tasks.workunit.client.1.vm06.stdout:8/200: dwrite db/dd/d24/d36/d38/f3a [0,4194304] 0
2026-03-09T00:03:31.214 INFO:tasks.workunit.client.1.vm06.stdout:1/155: symlink d6/d21/d2d/l33 0
2026-03-09T00:03:31.214 INFO:tasks.workunit.client.1.vm06.stdout:1/156: creat d6/f34 x:0 0 0
2026-03-09T00:03:31.214 INFO:tasks.workunit.client.1.vm06.stdout:1/157: dread - d6/f28 zero size
2026-03-09T00:03:31.214 INFO:tasks.workunit.client.1.vm06.stdout:9/130: dwrite d1/d3/f23 [0,4194304] 0
2026-03-09T00:03:31.214 INFO:tasks.workunit.client.1.vm06.stdout:9/131: read - d1/d3/d12/d21/d14/f20 zero size
2026-03-09T00:03:31.217 INFO:tasks.workunit.client.1.vm06.stdout:5/256: symlink d5/d1c/d21/d2a/l53 0
2026-03-09T00:03:31.217 INFO:tasks.workunit.client.1.vm06.stdout:5/257: stat d5/l8 0
2026-03-09T00:03:31.223 INFO:tasks.workunit.client.1.vm06.stdout:6/173: dread - d4/d27/f2e zero size
2026-03-09T00:03:31.227 INFO:tasks.workunit.client.1.vm06.stdout:3/146: dwrite d11/f13 [0,4194304] 0
2026-03-09T00:03:31.227 INFO:tasks.workunit.client.1.vm06.stdout:3/147: chown l1 2 1
2026-03-09T00:03:31.246 INFO:tasks.workunit.client.1.vm06.stdout:9/132: getdents d1/d4 0
2026-03-09T00:03:31.254 INFO:tasks.workunit.client.1.vm06.stdout:3/148: mkdir d11/d28/d2e 0
2026-03-09T00:03:31.254 INFO:tasks.workunit.client.1.vm06.stdout:3/149: readlink l1 0
2026-03-09T00:03:31.254 INFO:tasks.workunit.client.1.vm06.stdout:3/150: stat d11/d28/d2e 0
2026-03-09T00:03:31.257 INFO:tasks.workunit.client.1.vm06.stdout:9/133: read d1/d3/d12/d21/f8 [27638,130941] 0
2026-03-09T00:03:31.260 INFO:tasks.workunit.client.1.vm06.stdout:3/151: mkdir d11/d28/d2e/d2f 0
2026-03-09T00:03:31.264 INFO:tasks.workunit.client.1.vm06.stdout:9/134: mknod d1/d3/d12/d21/c27 0
2026-03-09T00:03:31.283 INFO:tasks.workunit.client.1.vm06.stdout:3/152: link d11/l15 d11/d28/l30 0
2026-03-09T00:03:31.291 INFO:tasks.workunit.client.1.vm06.stdout:3/153: read - d11/d28/f2c zero size
2026-03-09T00:03:31.292 INFO:tasks.workunit.client.1.vm06.stdout:0/183: sync
2026-03-09T00:03:31.292 INFO:tasks.workunit.client.1.vm06.stdout:0/184: read d3/f10 [1834749,106894] 0
2026-03-09T00:03:31.292 INFO:tasks.workunit.client.1.vm06.stdout:0/185: fdatasync d3/f29 0
2026-03-09T00:03:31.292 INFO:tasks.workunit.client.1.vm06.stdout:4/131: sync
2026-03-09T00:03:31.292 INFO:tasks.workunit.client.1.vm06.stdout:4/132: write f10 [545069,126123] 0
2026-03-09T00:03:31.292 INFO:tasks.workunit.client.1.vm06.stdout:4/133: fsync f1 0
2026-03-09T00:03:31.292 INFO:tasks.workunit.client.1.vm06.stdout:4/134: read f10 [132204,47984] 0
2026-03-09T00:03:31.292 INFO:tasks.workunit.client.1.vm06.stdout:4/135: readlink d17/l1c 0
2026-03-09T00:03:31.292 INFO:tasks.workunit.client.1.vm06.stdout:0/186: mknod d3/d18/d1f/d39/d3b/c3e 0
2026-03-09T00:03:31.292 INFO:tasks.workunit.client.1.vm06.stdout:4/136: mknod d17/d21/d22/c23 0
2026-03-09T00:03:31.292 INFO:tasks.workunit.client.1.vm06.stdout:0/187: symlink d3/d18/d3c/l3f 0 2026-03-09T00:03:31.302 INFO:tasks.workunit.client.1.vm06.stdout:4/137: mkdir d17/d24 0 2026-03-09T00:03:31.302 INFO:tasks.workunit.client.1.vm06.stdout:4/138: chown f14 1 1 2026-03-09T00:03:31.302 INFO:tasks.workunit.client.1.vm06.stdout:3/154: write d11/f1b [1592546,21425] 0 2026-03-09T00:03:31.302 INFO:tasks.workunit.client.1.vm06.stdout:1/158: write d6/f7 [2445004,111082] 0 2026-03-09T00:03:31.303 INFO:tasks.workunit.client.1.vm06.stdout:3/155: read f8 [2236263,39996] 0 2026-03-09T00:03:31.317 INFO:tasks.workunit.client.1.vm06.stdout:1/159: mknod d6/c35 0 2026-03-09T00:03:31.317 INFO:tasks.workunit.client.1.vm06.stdout:0/188: creat d3/d18/d2c/d2d/f40 x:0 0 0 2026-03-09T00:03:31.317 INFO:tasks.workunit.client.1.vm06.stdout:1/160: stat d6/l23 0 2026-03-09T00:03:31.317 INFO:tasks.workunit.client.1.vm06.stdout:4/139: getdents d17/d21 0 2026-03-09T00:03:31.317 INFO:tasks.workunit.client.1.vm06.stdout:0/189: mknod d3/d18/d3c/c41 0 2026-03-09T00:03:31.317 INFO:tasks.workunit.client.1.vm06.stdout:0/190: truncate f1 1702767 0 2026-03-09T00:03:31.318 INFO:tasks.workunit.client.1.vm06.stdout:4/140: rename c12 to d17/d21/d22/c25 0 2026-03-09T00:03:31.318 INFO:tasks.workunit.client.1.vm06.stdout:0/191: symlink d3/d18/d2c/d2d/d31/l42 0 2026-03-09T00:03:31.319 INFO:tasks.workunit.client.1.vm06.stdout:4/141: unlink d17/l18 0 2026-03-09T00:03:31.320 INFO:tasks.workunit.client.1.vm06.stdout:0/192: mknod d3/d18/d1f/d39/d3b/c43 0 2026-03-09T00:03:31.320 INFO:tasks.workunit.client.1.vm06.stdout:0/193: chown d3/d18/d1f/d39 236 1 2026-03-09T00:03:31.320 INFO:tasks.workunit.client.1.vm06.stdout:4/142: truncate f16 2816131 0 2026-03-09T00:03:31.323 INFO:tasks.workunit.client.1.vm06.stdout:5/258: dwrite d5/f1d [0,4194304] 0 2026-03-09T00:03:31.323 INFO:tasks.workunit.client.1.vm06.stdout:5/259: write d5/f3d [1124472,68691] 0 2026-03-09T00:03:31.323 INFO:tasks.workunit.client.1.vm06.stdout:5/260: truncate d5/d1c/d21/d28/d35/f4e 806915 0 2026-03-09T00:03:31.323 INFO:tasks.workunit.client.1.vm06.stdout:5/261: creat d5/d1c/d23/f54 x:0 0 0 2026-03-09T00:03:31.327 INFO:tasks.workunit.client.1.vm06.stdout:8/201: dwrite db/f28 [0,4194304] 0 2026-03-09T00:03:31.327 INFO:tasks.workunit.client.1.vm06.stdout:8/202: stat db/f3f 0 2026-03-09T00:03:31.347 INFO:tasks.workunit.client.1.vm06.stdout:6/174: dwrite d4/f5 [4194304,4194304] 0 2026-03-09T00:03:31.350 INFO:tasks.workunit.client.1.vm06.stdout:5/262: symlink d5/d1c/d23/l55 0 2026-03-09T00:03:31.350 INFO:tasks.workunit.client.1.vm06.stdout:5/263: chown d5/l11 981 1 2026-03-09T00:03:31.350 INFO:tasks.workunit.client.1.vm06.stdout:5/264: fsync d5/d1c/d21/d28/d35/f4e 0 2026-03-09T00:03:31.350 INFO:tasks.workunit.client.1.vm06.stdout:5/265: chown d5/c27 114358039 1 2026-03-09T00:03:31.350 INFO:tasks.workunit.client.1.vm06.stdout:5/266: readlink d5/d1c/d21/d2a/l53 0 2026-03-09T00:03:31.362 INFO:tasks.workunit.client.1.vm06.stdout:7/196: sync 2026-03-09T00:03:31.363 INFO:tasks.workunit.client.1.vm06.stdout:2/263: sync 2026-03-09T00:03:31.363 INFO:tasks.workunit.client.1.vm06.stdout:6/175: rmdir d4/d27 39 2026-03-09T00:03:31.368 INFO:tasks.workunit.client.1.vm06.stdout:2/264: creat d7/f4c x:0 0 0 2026-03-09T00:03:31.369 INFO:tasks.workunit.client.1.vm06.stdout:6/176: creat d4/d27/f31 x:0 0 0 2026-03-09T00:03:31.383 INFO:tasks.workunit.client.1.vm06.stdout:6/177: creat d4/d16/f32 x:0 0 0 2026-03-09T00:03:31.393 INFO:tasks.workunit.client.1.vm06.stdout:6/178: write d4/f5 
[10439759,99306] 0 2026-03-09T00:03:31.393 INFO:tasks.workunit.client.1.vm06.stdout:6/179: write d4/d16/f1c [999912,27592] 0 2026-03-09T00:03:31.393 INFO:tasks.workunit.client.0.vm03.stderr:++ readlink -f fsstress 2026-03-09T00:03:31.394 INFO:tasks.workunit.client.0.vm03.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-09T00:03:31.394 INFO:tasks.workunit.client.0.vm03.stdout:~/cephtest/mnt.0/client.0/tmp 2026-03-09T00:03:31.394 INFO:tasks.workunit.client.0.vm03.stderr:+ BIN=/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress 2026-03-09T00:03:31.394 INFO:tasks.workunit.client.0.vm03.stderr:+ popd 2026-03-09T00:03:31.394 INFO:tasks.workunit.client.0.vm03.stderr:+ popd 2026-03-09T00:03:31.394 INFO:tasks.workunit.client.0.vm03.stderr:++ mktemp -d -p . 2026-03-09T00:03:31.395 INFO:tasks.workunit.client.0.vm03.stderr:+ T=./tmp.Tq6zp6bsRi 2026-03-09T00:03:31.395 INFO:tasks.workunit.client.0.vm03.stderr:+ /home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.Tq6zp6bsRi -l 1 -n 1000 -p 10 -v 2026-03-09T00:03:31.396 INFO:tasks.workunit.client.0.vm03.stdout:seed = 1772625011 2026-03-09T00:03:31.398 INFO:tasks.workunit.client.0.vm03.stdout:0/0: read - no filename 2026-03-09T00:03:31.398 INFO:tasks.workunit.client.0.vm03.stdout:0/1: dwrite - no filename 2026-03-09T00:03:31.398 INFO:tasks.workunit.client.0.vm03.stdout:0/2: dread - no filename 2026-03-09T00:03:31.398 INFO:tasks.workunit.client.0.vm03.stdout:0/3: dread - no filename 2026-03-09T00:03:31.398 INFO:tasks.workunit.client.0.vm03.stdout:0/4: chown . 1042450 1 2026-03-09T00:03:31.399 INFO:tasks.workunit.client.0.vm03.stdout:0/5: creat f0 x:0 0 0 2026-03-09T00:03:31.401 INFO:tasks.workunit.client.0.vm03.stdout:1/0: write - no filename 2026-03-09T00:03:31.401 INFO:tasks.workunit.client.0.vm03.stdout:1/1: readlink - no filename 2026-03-09T00:03:31.402 INFO:tasks.workunit.client.0.vm03.stdout:1/2: creat f0 x:0 0 0 2026-03-09T00:03:31.405 INFO:tasks.workunit.client.1.vm06.stdout:4/143: write f1 [7466385,5501] 0 2026-03-09T00:03:31.411 INFO:tasks.workunit.client.1.vm06.stdout:4/144: chown ca 10561 1 2026-03-09T00:03:31.412 INFO:tasks.workunit.client.0.vm03.stdout:2/0: unlink - no file 2026-03-09T00:03:31.412 INFO:tasks.workunit.client.1.vm06.stdout:4/145: dread f7 [0,4194304] 0 2026-03-09T00:03:31.412 INFO:tasks.workunit.client.1.vm06.stdout:4/146: truncate fe 4882984 0 2026-03-09T00:03:31.412 INFO:tasks.workunit.client.1.vm06.stdout:4/147: readlink l8 0 2026-03-09T00:03:31.414 INFO:tasks.workunit.client.1.vm06.stdout:4/148: symlink d17/d21/d22/l26 0 2026-03-09T00:03:31.423 INFO:tasks.workunit.client.1.vm06.stdout:4/149: symlink d17/d21/l27 0 2026-03-09T00:03:31.423 INFO:tasks.workunit.client.1.vm06.stdout:4/150: stat d17/f19 0 2026-03-09T00:03:31.423 INFO:tasks.workunit.client.1.vm06.stdout:5/267: dread d5/f3d [0,4194304] 0 2026-03-09T00:03:31.434 INFO:tasks.workunit.client.1.vm06.stdout:9/135: dwrite d1/d3/d12/d21/d14/f18 [0,4194304] 0 2026-03-09T00:03:31.434 INFO:tasks.workunit.client.1.vm06.stdout:9/136: readlink d1/d3/ld 0 2026-03-09T00:03:31.434 INFO:tasks.workunit.client.1.vm06.stdout:9/137: chown d1/d4/f6 120788 1 2026-03-09T00:03:31.434 INFO:tasks.workunit.client.1.vm06.stdout:9/138: chown d1/d3/f1f 3 1 2026-03-09T00:03:31.434 INFO:tasks.workunit.client.1.vm06.stdout:9/139: chown d1/d3/d12/d21/d9/c26 3942835 1 2026-03-09T00:03:31.436 INFO:tasks.workunit.client.1.vm06.stdout:9/140: creat 
d1/d3/d12/f28 x:0 0 0 2026-03-09T00:03:31.446 INFO:tasks.workunit.client.1.vm06.stdout:0/194: dwrite d3/fa [0,4194304] 0 2026-03-09T00:03:31.461 INFO:tasks.workunit.client.1.vm06.stdout:0/195: chown d3/d18/d1f/d39/d3b 3 1 2026-03-09T00:03:31.461 INFO:tasks.workunit.client.1.vm06.stdout:0/196: mkdir d3/d18/d1f/d44 0 2026-03-09T00:03:31.462 INFO:tasks.workunit.client.1.vm06.stdout:0/197: mkdir d3/d18/d28/d45 0 2026-03-09T00:03:31.462 INFO:tasks.workunit.client.1.vm06.stdout:3/156: getdents d11/d28/d2e 0 2026-03-09T00:03:31.469 INFO:tasks.workunit.client.1.vm06.stdout:0/198: dread d3/f29 [0,4194304] 0 2026-03-09T00:03:31.469 INFO:tasks.workunit.client.1.vm06.stdout:3/157: write f10 [699179,30237] 0 2026-03-09T00:03:31.470 INFO:tasks.workunit.client.1.vm06.stdout:3/158: getdents d11/d28 0 2026-03-09T00:03:31.470 INFO:tasks.workunit.client.1.vm06.stdout:3/159: unlink d11/l22 0 2026-03-09T00:03:31.470 INFO:tasks.workunit.client.1.vm06.stdout:3/160: symlink d11/d28/d2e/l31 0 2026-03-09T00:03:31.470 INFO:tasks.workunit.client.1.vm06.stdout:3/161: write d11/f1e [1910561,7909] 0 2026-03-09T00:03:31.470 INFO:tasks.workunit.client.1.vm06.stdout:3/162: write fa [4459138,105775] 0 2026-03-09T00:03:31.589 INFO:tasks.workunit.client.1.vm06.stdout:2/265: dwrite d7/da/d1c/f1f [8388608,4194304] 0 2026-03-09T00:03:31.698 INFO:tasks.workunit.client.1.vm06.stdout:1/161: dwrite d6/f1a [0,4194304] 0 2026-03-09T00:03:31.699 INFO:tasks.workunit.client.1.vm06.stdout:1/162: rename d6/c15 to d6/c36 0 2026-03-09T00:03:31.702 INFO:tasks.workunit.client.1.vm06.stdout:1/163: mkdir d6/d21/d2d/d37 0 2026-03-09T00:03:31.804 INFO:tasks.workunit.client.1.vm06.stdout:5/268: dwrite d5/d44/f4a [0,4194304] 0 2026-03-09T00:03:31.806 INFO:tasks.workunit.client.1.vm06.stdout:5/269: creat d5/d1c/d21/d28/f56 x:0 0 0 2026-03-09T00:03:31.862 INFO:tasks.workunit.client.1.vm06.stdout:5/270: read d5/d1c/d21/d28/f3b [1890068,2801] 0 2026-03-09T00:03:31.863 INFO:tasks.workunit.client.1.vm06.stdout:5/271: creat d5/d1c/d21/d28/f57 x:0 0 0 2026-03-09T00:03:31.864 INFO:tasks.workunit.client.1.vm06.stdout:5/272: rmdir d5/d1c/d21/d2a 39 2026-03-09T00:03:31.864 INFO:tasks.workunit.client.1.vm06.stdout:5/273: write d5/f36 [527036,93608] 0 2026-03-09T00:03:31.864 INFO:tasks.workunit.client.1.vm06.stdout:5/274: fdatasync d5/d1c/d21/d28/d35/f40 0 2026-03-09T00:03:31.864 INFO:tasks.workunit.client.1.vm06.stdout:5/275: truncate d5/d1c/d21/d28/d35/f4e 1233349 0 2026-03-09T00:03:31.944 INFO:tasks.workunit.client.1.vm06.stdout:7/197: dwrite d0/df/d1a/d22/f2c [0,4194304] 0 2026-03-09T00:03:31.947 INFO:tasks.workunit.client.1.vm06.stdout:7/198: rmdir d0 39 2026-03-09T00:03:31.947 INFO:tasks.workunit.client.1.vm06.stdout:7/199: mkdir d0/db/d31 0 2026-03-09T00:03:31.950 INFO:tasks.workunit.client.1.vm06.stdout:7/200: unlink d0/f5 0 2026-03-09T00:03:32.008 INFO:tasks.workunit.client.0.vm03.stdout:0/6: dwrite f0 [0,4194304] 0 2026-03-09T00:03:32.009 INFO:tasks.workunit.client.1.vm06.stdout:7/201: dread d0/df/d17/f2d [0,4194304] 0 2026-03-09T00:03:32.014 INFO:tasks.workunit.client.0.vm03.stdout:0/7: write f0 [2023128,76186] 0 2026-03-09T00:03:32.022 INFO:tasks.workunit.client.1.vm06.stdout:4/151: fdatasync f1 0 2026-03-09T00:03:32.024 INFO:tasks.workunit.client.1.vm06.stdout:4/152: rename d17/d21/d22/l26 to d17/l28 0 2026-03-09T00:03:32.025 INFO:tasks.workunit.client.1.vm06.stdout:4/153: creat d17/d24/f29 x:0 0 0 2026-03-09T00:03:32.035 INFO:tasks.workunit.client.1.vm06.stdout:4/154: write f14 [823613,19464] 0 2026-03-09T00:03:32.184 
INFO:tasks.workunit.client.0.vm03.stdout:4/0: write - no filename 2026-03-09T00:03:32.184 INFO:tasks.workunit.client.0.vm03.stdout:3/0: dwrite - no filename 2026-03-09T00:03:32.184 INFO:tasks.workunit.client.0.vm03.stdout:4/1: read - no filename 2026-03-09T00:03:32.184 INFO:tasks.workunit.client.0.vm03.stdout:4/2: write - no filename 2026-03-09T00:03:32.184 INFO:tasks.workunit.client.0.vm03.stdout:6/0: write - no filename 2026-03-09T00:03:32.184 INFO:tasks.workunit.client.0.vm03.stdout:6/1: stat - no entries 2026-03-09T00:03:32.184 INFO:tasks.workunit.client.0.vm03.stdout:6/2: write - no filename 2026-03-09T00:03:32.184 INFO:tasks.workunit.client.0.vm03.stdout:7/0: fdatasync - no filename 2026-03-09T00:03:32.184 INFO:tasks.workunit.client.0.vm03.stdout:7/1: stat - no entries 2026-03-09T00:03:32.184 INFO:tasks.workunit.client.0.vm03.stdout:5/0: chown . 1611 1 2026-03-09T00:03:32.184 INFO:tasks.workunit.client.0.vm03.stdout:5/1: fsync - no filename 2026-03-09T00:03:32.184 INFO:tasks.workunit.client.0.vm03.stdout:5/2: getdents . 0 2026-03-09T00:03:32.184 INFO:tasks.workunit.client.0.vm03.stdout:5/3: dwrite - no filename 2026-03-09T00:03:32.184 INFO:tasks.workunit.client.0.vm03.stdout:5/4: write - no filename 2026-03-09T00:03:32.184 INFO:tasks.workunit.client.0.vm03.stdout:8/0: fsync - no filename 2026-03-09T00:03:32.184 INFO:tasks.workunit.client.0.vm03.stdout:8/1: getdents . 0 2026-03-09T00:03:32.184 INFO:tasks.workunit.client.0.vm03.stdout:8/2: chown . 83 1 2026-03-09T00:03:32.184 INFO:tasks.workunit.client.0.vm03.stdout:8/3: dread - no filename 2026-03-09T00:03:32.185 INFO:tasks.workunit.client.0.vm03.stdout:8/4: dwrite - no filename 2026-03-09T00:03:32.185 INFO:tasks.workunit.client.0.vm03.stdout:8/5: write - no filename 2026-03-09T00:03:32.185 INFO:tasks.workunit.client.0.vm03.stdout:8/6: getdents . 0 2026-03-09T00:03:32.185 INFO:tasks.workunit.client.1.vm06.stdout:3/163: dwrite d11/f12 [0,4194304] 0 2026-03-09T00:03:32.185 INFO:tasks.workunit.client.1.vm06.stdout:3/164: chown d11/l23 278060 1 2026-03-09T00:03:32.187 INFO:tasks.workunit.client.1.vm06.stdout:7/202: dwrite d0/df/d1a/f25 [0,4194304] 0 2026-03-09T00:03:32.187 INFO:tasks.workunit.client.1.vm06.stdout:0/199: dwrite d3/d18/d28/f2b [0,4194304] 0 2026-03-09T00:03:32.187 INFO:tasks.workunit.client.1.vm06.stdout:0/200: creat d3/d18/d2c/d2d/f46 x:0 0 0 2026-03-09T00:03:32.188 INFO:tasks.workunit.client.0.vm03.stdout:0/8: dread f0 [0,4194304] 0 2026-03-09T00:03:32.188 INFO:tasks.workunit.client.0.vm03.stdout:0/9: creat f1 x:0 0 0 2026-03-09T00:03:32.190 INFO:tasks.workunit.client.1.vm06.stdout:6/180: fsync d4/d27/f31 0 2026-03-09T00:03:32.190 INFO:tasks.workunit.client.0.vm03.stdout:0/10: dread f0 [0,4194304] 0 2026-03-09T00:03:32.197 INFO:tasks.workunit.client.0.vm03.stdout:1/3: getdents . 
0 2026-03-09T00:03:32.199 INFO:tasks.workunit.client.1.vm06.stdout:6/181: dread d4/f5 [4194304,4194304] 0 2026-03-09T00:03:32.202 INFO:tasks.workunit.client.1.vm06.stdout:2/266: dwrite d7/f26 [0,4194304] 0 2026-03-09T00:03:32.205 INFO:tasks.workunit.client.1.vm06.stdout:9/141: dwrite d1/d3/f11 [0,4194304] 0 2026-03-09T00:03:32.208 INFO:tasks.workunit.client.1.vm06.stdout:3/165: creat d11/d28/d2e/f32 x:0 0 0 2026-03-09T00:03:32.211 INFO:tasks.workunit.client.1.vm06.stdout:1/164: dwrite d6/f7 [0,4194304] 0 2026-03-09T00:03:32.216 INFO:tasks.workunit.client.1.vm06.stdout:7/203: creat d0/db/d31/f32 x:0 0 0 2026-03-09T00:03:32.219 INFO:tasks.workunit.client.0.vm03.stdout:3/1: creat f0 x:0 0 0 2026-03-09T00:03:32.224 INFO:tasks.workunit.client.0.vm03.stdout:3/2: creat f1 x:0 0 0 2026-03-09T00:03:32.225 INFO:tasks.workunit.client.0.vm03.stdout:3/3: write f0 [229631,25204] 0 2026-03-09T00:03:32.225 INFO:tasks.workunit.client.1.vm06.stdout:0/201: link d3/f7 d3/d18/d1f/d39/d3b/f47 0 2026-03-09T00:03:32.225 INFO:tasks.workunit.client.1.vm06.stdout:6/182: dread - d4/fa zero size 2026-03-09T00:03:32.225 INFO:tasks.workunit.client.0.vm03.stdout:4/3: creat f0 x:0 0 0 2026-03-09T00:03:32.225 INFO:tasks.workunit.client.0.vm03.stdout:6/3: symlink l0 0 2026-03-09T00:03:32.225 INFO:tasks.workunit.client.0.vm03.stdout:6/4: dwrite - no filename 2026-03-09T00:03:32.225 INFO:tasks.workunit.client.0.vm03.stdout:6/5: truncate - no filename 2026-03-09T00:03:32.225 INFO:tasks.workunit.client.0.vm03.stdout:6/6: stat l0 0 2026-03-09T00:03:32.234 INFO:tasks.workunit.client.0.vm03.stdout:7/2: symlink l0 0 2026-03-09T00:03:32.234 INFO:tasks.workunit.client.0.vm03.stdout:7/3: dwrite - no filename 2026-03-09T00:03:32.240 INFO:tasks.workunit.client.0.vm03.stdout:5/5: creat f0 x:0 0 0 2026-03-09T00:03:32.240 INFO:tasks.workunit.client.0.vm03.stdout:5/6: write f0 [316591,83628] 0 2026-03-09T00:03:32.240 INFO:tasks.workunit.client.0.vm03.stdout:5/7: read f0 [322679,95586] 0 2026-03-09T00:03:32.240 INFO:tasks.workunit.client.0.vm03.stdout:5/8: creat f1 x:0 0 0 2026-03-09T00:03:32.245 INFO:tasks.workunit.client.1.vm06.stdout:2/267: creat d7/d1a/d3c/f4d x:0 0 0 2026-03-09T00:03:32.245 INFO:tasks.workunit.client.0.vm03.stdout:8/7: mknod c0 0 2026-03-09T00:03:32.245 INFO:tasks.workunit.client.0.vm03.stdout:8/8: fdatasync - no filename 2026-03-09T00:03:32.245 INFO:tasks.workunit.client.0.vm03.stdout:8/9: truncate - no filename 2026-03-09T00:03:32.245 INFO:tasks.workunit.client.0.vm03.stdout:8/10: read - no filename 2026-03-09T00:03:32.251 INFO:tasks.workunit.client.0.vm03.stdout:9/0: creat f0 x:0 0 0 2026-03-09T00:03:32.251 INFO:tasks.workunit.client.0.vm03.stdout:9/1: dread - f0 zero size 2026-03-09T00:03:32.253 INFO:tasks.workunit.client.1.vm06.stdout:2/268: write d7/da/d1c/f1f [11190786,75035] 0 2026-03-09T00:03:32.262 INFO:tasks.workunit.client.1.vm06.stdout:3/166: mknod d11/d28/c33 0 2026-03-09T00:03:32.268 INFO:tasks.workunit.client.0.vm03.stdout:1/4: creat f1 x:0 0 0 2026-03-09T00:03:32.273 INFO:tasks.workunit.client.1.vm06.stdout:1/165: symlink d6/l38 0 2026-03-09T00:03:32.273 INFO:tasks.workunit.client.1.vm06.stdout:1/166: write d6/d21/d2d/f31 [632332,3202] 0 2026-03-09T00:03:32.274 INFO:tasks.workunit.client.1.vm06.stdout:4/155: dwrite d17/f1e [4194304,4194304] 0 2026-03-09T00:03:32.274 INFO:tasks.workunit.client.0.vm03.stdout:6/7: creat f1 x:0 0 0 2026-03-09T00:03:32.284 INFO:tasks.workunit.client.1.vm06.stdout:6/183: dread - d4/f2d zero size 2026-03-09T00:03:32.285 INFO:tasks.workunit.client.1.vm06.stdout:2/269: mkdir 
d7/da/d4e 0 2026-03-09T00:03:32.287 INFO:tasks.workunit.client.0.vm03.stdout:7/4: symlink l1 0 2026-03-09T00:03:32.287 INFO:tasks.workunit.client.0.vm03.stdout:8/11: rename c0 to c1 0 2026-03-09T00:03:32.288 INFO:tasks.workunit.client.0.vm03.stdout:1/5: link f1 f2 0 2026-03-09T00:03:32.291 INFO:tasks.workunit.client.1.vm06.stdout:3/167: symlink d11/d28/l34 0 2026-03-09T00:03:32.291 INFO:tasks.workunit.client.1.vm06.stdout:3/168: readlink d11/l15 0 2026-03-09T00:03:32.292 INFO:tasks.workunit.client.1.vm06.stdout:9/142: dwrite d1/d4/ff [0,4194304] 0 2026-03-09T00:03:32.297 INFO:tasks.workunit.client.1.vm06.stdout:0/202: dwrite d3/d18/f25 [0,4194304] 0 2026-03-09T00:03:32.299 INFO:tasks.workunit.client.0.vm03.stdout:6/8: link f1 f2 0 2026-03-09T00:03:32.300 INFO:tasks.workunit.client.1.vm06.stdout:1/167: symlink d6/d21/d2d/l39 0 2026-03-09T00:03:32.300 INFO:tasks.workunit.client.1.vm06.stdout:1/168: getdents d6/d21/d2d/d37 0 2026-03-09T00:03:32.300 INFO:tasks.workunit.client.1.vm06.stdout:3/169: write d11/d28/f29 [2763924,80220] 0 2026-03-09T00:03:32.300 INFO:tasks.workunit.client.1.vm06.stdout:3/170: chown d11/f1a 64 1 2026-03-09T00:03:32.307 INFO:tasks.workunit.client.1.vm06.stdout:0/203: write d3/f1a [2681618,111612] 0 2026-03-09T00:03:32.307 INFO:tasks.workunit.client.1.vm06.stdout:0/204: write d3/d18/d1f/d39/d3b/f47 [2664077,13602] 0 2026-03-09T00:03:32.316 INFO:tasks.workunit.client.1.vm06.stdout:1/169: truncate d6/fb 4152708 0 2026-03-09T00:03:32.320 INFO:tasks.workunit.client.1.vm06.stdout:3/171: mknod d11/d28/c35 0 2026-03-09T00:03:32.320 INFO:tasks.workunit.client.1.vm06.stdout:4/156: dread f14 [0,4194304] 0 2026-03-09T00:03:32.320 INFO:tasks.workunit.client.0.vm03.stdout:0/11: dwrite f1 [0,4194304] 0 2026-03-09T00:03:32.320 INFO:tasks.workunit.client.0.vm03.stdout:0/12: readlink - no filename 2026-03-09T00:03:32.327 INFO:tasks.workunit.client.1.vm06.stdout:0/205: creat d3/d18/d28/d45/f48 x:0 0 0 2026-03-09T00:03:32.331 INFO:tasks.workunit.client.1.vm06.stdout:1/170: mknod d6/d21/c3a 0 2026-03-09T00:03:32.335 INFO:tasks.workunit.client.1.vm06.stdout:3/172: unlink d11/l15 0 2026-03-09T00:03:32.335 INFO:tasks.workunit.client.1.vm06.stdout:3/173: readlink d11/l21 0 2026-03-09T00:03:32.337 INFO:tasks.workunit.client.1.vm06.stdout:4/157: link f15 d17/d21/d22/f2a 0 2026-03-09T00:03:32.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:32 vm03.local ceph-mon[52346]: pgmap v132: 65 pgs: 65 active+clean; 447 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 21 MiB/s rd, 37 MiB/s wr, 450 op/s 2026-03-09T00:03:32.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:32 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:03:32.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:32 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' 2026-03-09T00:03:32.341 INFO:tasks.workunit.client.1.vm06.stdout:1/171: mkdir d6/d21/d2d/d3b 0 2026-03-09T00:03:32.341 INFO:tasks.workunit.client.1.vm06.stdout:1/172: write d6/fa [1694757,71716] 0 2026-03-09T00:03:32.346 INFO:tasks.workunit.client.1.vm06.stdout:4/158: link d17/l1b d17/d24/l2b 0 2026-03-09T00:03:32.346 INFO:tasks.workunit.client.1.vm06.stdout:4/159: chown cd 2 1 2026-03-09T00:03:32.346 INFO:tasks.workunit.client.1.vm06.stdout:4/160: chown d17/d21/d22/c23 9620 1 2026-03-09T00:03:32.346 INFO:tasks.workunit.client.1.vm06.stdout:4/161: creat d17/d24/f2c x:0 0 0 2026-03-09T00:03:32.352 INFO:tasks.workunit.client.0.vm03.stdout:3/4: dwrite f1 
2026-03-09T00:03:32.352 INFO:tasks.workunit.client.0.vm03.stdout:3/5: chown f1 1787 1
2026-03-09T00:03:32.353 INFO:tasks.workunit.client.1.vm06.stdout:9/143: dwrite d1/d4/f6 [4194304,4194304] 0
2026-03-09T00:03:32.359 INFO:tasks.workunit.client.1.vm06.stdout:4/162: rename l11 to d17/l2d 0
2026-03-09T00:03:32.368 INFO:tasks.workunit.client.1.vm06.stdout:7/204: rmdir d0/db/d31 39
2026-03-09T00:03:32.368 INFO:tasks.workunit.client.1.vm06.stdout:7/205: chown d0/df/d1a/c1e 83632 1
2026-03-09T00:03:32.369 INFO:tasks.workunit.client.0.vm03.stdout:4/4: getdents . 0
2026-03-09T00:03:32.369 INFO:tasks.workunit.client.0.vm03.stdout:4/5: chown f0 3186 1
2026-03-09T00:03:32.372 INFO:tasks.workunit.client.0.vm03.stdout:9/2: dwrite f0 [0,4194304] 0
2026-03-09T00:03:32.374 INFO:tasks.workunit.client.1.vm06.stdout:9/144: symlink d1/d3/l29 0
2026-03-09T00:03:32.380 INFO:tasks.workunit.client.0.vm03.stdout:9/3: dread f0 [0,4194304] 0
2026-03-09T00:03:32.385 INFO:tasks.workunit.client.1.vm06.stdout:7/206: rmdir d0/db 39
2026-03-09T00:03:32.385 INFO:tasks.workunit.client.1.vm06.stdout:7/207: symlink d0/df/d1a/l33 0
2026-03-09T00:03:32.386 INFO:tasks.workunit.client.1.vm06.stdout:7/208: chown d0/df/c1d 4129689 1
2026-03-09T00:03:32.387 INFO:tasks.workunit.client.1.vm06.stdout:9/145: link d1/d3/d12/d21/d14/f1a d1/f2a 0
2026-03-09T00:03:32.387 INFO:tasks.workunit.client.1.vm06.stdout:7/209: creat d0/db/d31/f34 x:0 0 0
2026-03-09T00:03:32.388 INFO:tasks.workunit.client.1.vm06.stdout:7/210: write d0/f2 [9414905,86582] 0
2026-03-09T00:03:32.388 INFO:tasks.workunit.client.1.vm06.stdout:9/146: mkdir d1/d3/d2b 0
2026-03-09T00:03:32.390 INFO:tasks.workunit.client.1.vm06.stdout:7/211: mkdir d0/df/d1a/d35 0
2026-03-09T00:03:32.402 INFO:tasks.workunit.client.1.vm06.stdout:9/147: creat d1/d3/d12/d21/f2c x:0 0 0
2026-03-09T00:03:32.402 INFO:tasks.workunit.client.1.vm06.stdout:9/148: rmdir d1/d3/d12/d21/d14 39
2026-03-09T00:03:32.402 INFO:tasks.workunit.client.1.vm06.stdout:9/149: creat d1/d4/f2d x:0 0 0
2026-03-09T00:03:32.402 INFO:tasks.workunit.client.1.vm06.stdout:9/150: symlink d1/d3/d2b/l2e 0
2026-03-09T00:03:32.402 INFO:tasks.workunit.client.1.vm06.stdout:9/151: stat d1/d3/d12/d21/d9 0
2026-03-09T00:03:32.402 INFO:tasks.workunit.client.1.vm06.stdout:9/152: mkdir d1/d4/d2f 0
2026-03-09T00:03:32.402 INFO:tasks.workunit.client.1.vm06.stdout:9/153: fsync d1/d3/f11 0
2026-03-09T00:03:32.402 INFO:tasks.workunit.client.1.vm06.stdout:9/154: symlink d1/d3/d12/l30 0
2026-03-09T00:03:32.402 INFO:tasks.workunit.client.1.vm06.stdout:9/155: chown d1 524470188 1
2026-03-09T00:03:32.402 INFO:tasks.workunit.client.1.vm06.stdout:9/156: write d1/d3/d12/f28 [697072,58618] 0
2026-03-09T00:03:32.402 INFO:tasks.workunit.client.1.vm06.stdout:9/157: creat d1/d3/d12/d21/d14/d25/f31 x:0 0 0
2026-03-09T00:03:32.402 INFO:tasks.workunit.client.1.vm06.stdout:9/158: write d1/d3/d12/d21/f2c [128274,71657] 0
2026-03-09T00:03:32.402 INFO:tasks.workunit.client.1.vm06.stdout:9/159: truncate d1/d3/d12/d21/d9/f10 1272266 0
2026-03-09T00:03:32.402 INFO:tasks.workunit.client.1.vm06.stdout:9/160: creat d1/d3/d12/d21/d14/d25/f32 x:0 0 0
2026-03-09T00:03:32.402 INFO:tasks.workunit.client.1.vm06.stdout:9/161: chown d1/d3/f11 1 1
2026-03-09T00:03:32.455 INFO:tasks.workunit.client.1.vm06.stdout:0/206: dwrite d3/f1a [4194304,4194304] 0
2026-03-09T00:03:32.458 INFO:tasks.workunit.client.1.vm06.stdout:0/207: dread d3/f11 [0,4194304] 0
2026-03-09T00:03:32.458 INFO:tasks.workunit.client.1.vm06.stdout:0/208: write d3/d18/d28/d45/f48 [732761,108556] 0
2026-03-09T00:03:32.460 INFO:tasks.workunit.client.1.vm06.stdout:0/209: mkdir d3/d18/d1f/d39/d49 0
2026-03-09T00:03:32.460 INFO:tasks.workunit.client.1.vm06.stdout:0/210: dread d3/d18/d28/f2b [4194304,4194304] 0
2026-03-09T00:03:32.462 INFO:tasks.workunit.client.1.vm06.stdout:0/211: creat d3/d18/d1f/f4a x:0 0 0
2026-03-09T00:03:32.462 INFO:tasks.workunit.client.1.vm06.stdout:0/212: getdents d3/d18/d1f/d44 0
2026-03-09T00:03:32.463 INFO:tasks.workunit.client.1.vm06.stdout:9/162: dwrite d1/d3/d12/d21/d14/d25/f32 [0,4194304] 0
2026-03-09T00:03:32.463 INFO:tasks.workunit.client.1.vm06.stdout:2/270: dwrite d7/da/d1c/f1f [8388608,4194304] 0
2026-03-09T00:03:32.463 INFO:tasks.workunit.client.0.vm03.stdout:6/9: dwrite f1 [0,4194304] 0
2026-03-09T00:03:32.464 INFO:tasks.workunit.client.0.vm03.stdout:6/10: rmdir - no directory
2026-03-09T00:03:32.466 INFO:tasks.workunit.client.1.vm06.stdout:8/203: sync
2026-03-09T00:03:32.467 INFO:tasks.workunit.client.1.vm06.stdout:8/204: dread - db/dd/d24/d36/f3d zero size
2026-03-09T00:03:32.470 INFO:tasks.workunit.client.1.vm06.stdout:0/213: getdents d3/d18/d2c 0
2026-03-09T00:03:32.472 INFO:tasks.workunit.client.1.vm06.stdout:7/212: dwrite d0/df/d1a/d22/f2c [4194304,4194304] 0
2026-03-09T00:03:32.476 INFO:tasks.workunit.client.1.vm06.stdout:7/213: dread d0/f7 [0,4194304] 0
2026-03-09T00:03:32.476 INFO:tasks.workunit.client.1.vm06.stdout:0/214: dread d3/f1c [0,4194304] 0
2026-03-09T00:03:32.477 INFO:tasks.workunit.client.1.vm06.stdout:9/163: creat d1/d3/d2b/f33 x:0 0 0
2026-03-09T00:03:32.480 INFO:tasks.workunit.client.1.vm06.stdout:3/174: rmdir d11 39
2026-03-09T00:03:32.480 INFO:tasks.workunit.client.1.vm06.stdout:3/175: write f8 [851446,39897] 0
2026-03-09T00:03:32.480 INFO:tasks.workunit.client.1.vm06.stdout:3/176: chown d11 1111992 1
2026-03-09T00:03:32.480 INFO:tasks.workunit.client.1.vm06.stdout:3/177: readlink d11/l21 0
2026-03-09T00:03:32.480 INFO:tasks.workunit.client.1.vm06.stdout:3/178: fdatasync d11/f1f 0
2026-03-09T00:03:32.480 INFO:tasks.workunit.client.1.vm06.stdout:3/179: chown c0 3283 1
2026-03-09T00:03:32.480 INFO:tasks.workunit.client.1.vm06.stdout:3/180: chown d11/d28/d2e/l31 65 1
2026-03-09T00:03:32.483 INFO:tasks.workunit.client.1.vm06.stdout:1/173: getdents d6/d21/d2d 0
2026-03-09T00:03:32.487 INFO:tasks.workunit.client.1.vm06.stdout:4/163: getdents d17 0
2026-03-09T00:03:32.491 INFO:tasks.workunit.client.1.vm06.stdout:8/205: truncate db/f17 7246661 0
2026-03-09T00:03:32.494 INFO:tasks.workunit.client.1.vm06.stdout:7/214: creat d0/f36 x:0 0 0
2026-03-09T00:03:32.495 INFO:tasks.workunit.client.1.vm06.stdout:7/215: read d0/fe [69131,32800] 0
2026-03-09T00:03:32.495 INFO:tasks.workunit.client.1.vm06.stdout:7/216: fdatasync d0/fa 0
2026-03-09T00:03:32.495 INFO:tasks.workunit.client.1.vm06.stdout:7/217: write d0/df/d1a/d22/f28 [1238764,49857] 0
2026-03-09T00:03:32.496 INFO:tasks.workunit.client.1.vm06.stdout:5/276: sync
2026-03-09T00:03:32.496 INFO:tasks.workunit.client.1.vm06.stdout:5/277: truncate d5/d1c/d23/f54 468427 0
2026-03-09T00:03:32.500 INFO:tasks.workunit.client.1.vm06.stdout:5/278: dread d5/d1c/d21/d28/d35/f40 [0,4194304] 0
2026-03-09T00:03:32.501 INFO:tasks.workunit.client.1.vm06.stdout:5/279: dread d5/f3d [0,4194304] 0
2026-03-09T00:03:32.505 INFO:tasks.workunit.client.1.vm06.stdout:0/215: rename d3/f17 to d3/d18/d1f/d39/d49/f4b 0
2026-03-09T00:03:32.507 INFO:tasks.workunit.client.1.vm06.stdout:9/164: mknod d1/d3/d2b/c34 0
2026-03-09T00:03:32.509 INFO:tasks.workunit.client.1.vm06.stdout:9/165: dread d1/d3/d12/d21/f8 [0,4194304] 0
2026-03-09T00:03:32.509 INFO:tasks.workunit.client.1.vm06.stdout:9/166: chown d1/d3/d12/d21/d9/c1e 3 1
2026-03-09T00:03:32.509 INFO:tasks.workunit.client.1.vm06.stdout:9/167: creat d1/d3/f35 x:0 0 0
2026-03-09T00:03:32.510 INFO:tasks.workunit.client.1.vm06.stdout:3/181: unlink fa 0
2026-03-09T00:03:32.510 INFO:tasks.workunit.client.1.vm06.stdout:3/182: truncate d11/f1f 86089 0
2026-03-09T00:03:32.515 INFO:tasks.workunit.client.1.vm06.stdout:2/271: rmdir d7/d1a/d3c 39
2026-03-09T00:03:32.516 INFO:tasks.workunit.client.1.vm06.stdout:2/272: write d7/d1b/f3b [859256,8073] 0
2026-03-09T00:03:32.519 INFO:tasks.workunit.client.1.vm06.stdout:1/174: unlink d6/c16 0
2026-03-09T00:03:32.521 INFO:tasks.workunit.client.1.vm06.stdout:4/164: mknod d17/d21/c2e 0
2026-03-09T00:03:32.522 INFO:tasks.workunit.client.1.vm06.stdout:4/165: dread d17/f1d [0,4194304] 0
2026-03-09T00:03:32.522 INFO:tasks.workunit.client.1.vm06.stdout:4/166: write d17/d24/f2c [643574,97691] 0
2026-03-09T00:03:32.531 INFO:tasks.workunit.client.1.vm06.stdout:5/280: symlink d5/d1c/l58 0
2026-03-09T00:03:32.536 INFO:tasks.workunit.client.1.vm06.stdout:0/216: unlink d3/d18/d2c/c2f 0
2026-03-09T00:03:32.543 INFO:tasks.workunit.client.1.vm06.stdout:5/281: read d5/f14 [1285967,94970] 0
2026-03-09T00:03:32.543 INFO:tasks.workunit.client.1.vm06.stdout:5/282: read - d5/d1c/d21/d28/d35/f52 zero size
2026-03-09T00:03:32.543 INFO:tasks.workunit.client.1.vm06.stdout:5/283: creat d5/d1c/d21/d28/f59 x:0 0 0
2026-03-09T00:03:32.543 INFO:tasks.workunit.client.1.vm06.stdout:5/284: readlink d5/d1c/l30 0
2026-03-09T00:03:32.550 INFO:tasks.workunit.client.1.vm06.stdout:1/175: creat d6/d21/d2d/f3c x:0 0 0
2026-03-09T00:03:32.550 INFO:tasks.workunit.client.1.vm06.stdout:1/176: creat d6/d21/f3d x:0 0 0
2026-03-09T00:03:32.550 INFO:tasks.workunit.client.1.vm06.stdout:1/177: write d6/d21/f2a [1840805,52530] 0
2026-03-09T00:03:32.556 INFO:tasks.workunit.client.1.vm06.stdout:9/168: dwrite d1/d3/d12/f28 [0,4194304] 0
2026-03-09T00:03:32.559 INFO:tasks.workunit.client.1.vm06.stdout:0/217: rename d3/d18/d28/c2a to d3/d18/d1f/c4c 0
2026-03-09T00:03:32.559 INFO:tasks.workunit.client.1.vm06.stdout:0/218: fdatasync d3/f19 0
2026-03-09T00:03:32.559 INFO:tasks.workunit.client.1.vm06.stdout:3/183: mkdir d11/d28/d2e/d2f/d36 0
2026-03-09T00:03:32.560 INFO:tasks.workunit.client.1.vm06.stdout:3/184: write f10 [3340653,32470] 0
2026-03-09T00:03:32.566 INFO:tasks.workunit.client.1.vm06.stdout:2/273: rmdir d7/d1a 39
2026-03-09T00:03:32.566 INFO:tasks.workunit.client.1.vm06.stdout:5/285: mkdir d5/d1c/d23/d5a 0
2026-03-09T00:03:32.566 INFO:tasks.workunit.client.1.vm06.stdout:5/286: dread - d5/d1c/d21/f32 zero size
2026-03-09T00:03:32.566 INFO:tasks.workunit.client.1.vm06.stdout:5/287: creat d5/d1c/d23/f5b x:0 0 0
2026-03-09T00:03:32.568 INFO:tasks.workunit.client.1.vm06.stdout:1/178: symlink d6/d21/d2d/d3b/l3e 0
2026-03-09T00:03:32.568 INFO:tasks.workunit.client.1.vm06.stdout:1/179: write d6/f28 [936196,97649] 0
2026-03-09T00:03:32.570 INFO:tasks.workunit.client.1.vm06.stdout:9/169: dread - d1/f1c zero size
2026-03-09T00:03:32.572 INFO:tasks.workunit.client.1.vm06.stdout:0/219: creat d3/d18/d2c/f4d x:0 0 0
2026-03-09T00:03:32.572 INFO:tasks.workunit.client.1.vm06.stdout:0/220: readlink d3/d18/d1f/l2e 0
2026-03-09T00:03:32.572 INFO:tasks.workunit.client.1.vm06.stdout:0/221: chown d3/f10 837345 1
2026-03-09T00:03:32.572 INFO:tasks.workunit.client.1.vm06.stdout:0/222: creat d3/d18/d2c/f4e x:0 0 0
2026-03-09T00:03:32.577 INFO:tasks.workunit.client.1.vm06.stdout:3/185: creat d11/d28/d2e/f37 x:0 0 0
2026-03-09T00:03:32.581 INFO:tasks.workunit.client.1.vm06.stdout:5/288: mknod d5/d1c/d21/d28/d35/c5c 0
2026-03-09T00:03:32.582 INFO:tasks.workunit.client.1.vm06.stdout:1/180: link d6/l18 d6/l3f 0
2026-03-09T00:03:32.582 INFO:tasks.workunit.client.1.vm06.stdout:1/181: write d6/d21/f3d [756911,1954] 0
2026-03-09T00:03:32.582 INFO:tasks.workunit.client.1.vm06.stdout:1/182: write d6/d21/d2d/f31 [1669406,65071] 0
2026-03-09T00:03:32.582 INFO:tasks.workunit.client.1.vm06.stdout:1/183: chown d6/d21/d2d/l39 30 1
2026-03-09T00:03:32.583 INFO:tasks.workunit.client.1.vm06.stdout:9/170: symlink d1/d4/l36 0
2026-03-09T00:03:32.585 INFO:tasks.workunit.client.1.vm06.stdout:0/223: creat d3/d18/d2c/d2d/d31/f4f x:0 0 0
2026-03-09T00:03:32.585 INFO:tasks.workunit.client.1.vm06.stdout:0/224: write d3/d18/d2c/d2d/d31/f4f [377061,1265] 0
2026-03-09T00:03:32.585 INFO:tasks.workunit.client.1.vm06.stdout:0/225: chown d3/d18/d1f/f26 43 1
2026-03-09T00:03:32.585 INFO:tasks.workunit.client.1.vm06.stdout:0/226: write d3/d18/d2c/d2d/f46 [698144,84444] 0
2026-03-09T00:03:32.586 INFO:tasks.workunit.client.1.vm06.stdout:3/186: dread - d11/d28/f2c zero size
2026-03-09T00:03:32.594 INFO:tasks.workunit.client.1.vm06.stdout:5/289: link d5/d1c/d23/l2c d5/d1c/d21/d28/d35/l5d 0
2026-03-09T00:03:32.597 INFO:tasks.workunit.client.1.vm06.stdout:0/227: getdents d3/d18/d1f/d39/d3b 0
2026-03-09T00:03:32.598 INFO:tasks.workunit.client.1.vm06.stdout:0/228: chown d3/d18/c21 23342437 1
2026-03-09T00:03:32.600 INFO:tasks.workunit.client.1.vm06.stdout:6/184: sync
2026-03-09T00:03:32.600 INFO:tasks.workunit.client.1.vm06.stdout:6/185: chown d4/d27/l2f 9139298 1
2026-03-09T00:03:32.603 INFO:tasks.workunit.client.1.vm06.stdout:8/206: rmdir db 39
2026-03-09T00:03:32.614 INFO:tasks.workunit.client.1.vm06.stdout:5/290: mkdir d5/d1c/d21/d28/d5e 0
2026-03-09T00:03:32.614 INFO:tasks.workunit.client.1.vm06.stdout:5/291: rename d5/c12 to d5/d1c/d21/d28/d35/c5f 0
2026-03-09T00:03:32.614 INFO:tasks.workunit.client.1.vm06.stdout:5/292: write d5/d1c/d21/f39 [8102736,11942] 0
2026-03-09T00:03:32.620 INFO:tasks.workunit.client.1.vm06.stdout:3/187: dread d11/d28/f29 [0,4194304] 0
2026-03-09T00:03:32.620 INFO:tasks.workunit.client.1.vm06.stdout:8/207: rename db/dd/f1f to db/dd/f40 0
2026-03-09T00:03:32.631 INFO:tasks.workunit.client.1.vm06.stdout:8/208: symlink db/dd/d24/d36/l41 0
2026-03-09T00:03:32.631 INFO:tasks.workunit.client.1.vm06.stdout:8/209: dread - db/dd/f2c zero size
2026-03-09T00:03:32.631 INFO:tasks.workunit.client.1.vm06.stdout:3/188: truncate d11/f16 3817111 0
2026-03-09T00:03:32.632 INFO:tasks.workunit.client.1.vm06.stdout:8/210: mknod db/d1e/c42 0
2026-03-09T00:03:32.632 INFO:tasks.workunit.client.1.vm06.stdout:3/189: chown d11/c25 2 1
2026-03-09T00:03:32.635 INFO:tasks.workunit.client.1.vm06.stdout:8/211: mknod db/dd/d24/c43 0
2026-03-09T00:03:32.635 INFO:tasks.workunit.client.1.vm06.stdout:3/190: fdatasync d11/f12 0
2026-03-09T00:03:32.635 INFO:tasks.workunit.client.1.vm06.stdout:8/212: chown f9 14477565 1
2026-03-09T00:03:32.639 INFO:tasks.workunit.client.1.vm06.stdout:8/213: dread db/d1e/f23 [0,4194304] 0
2026-03-09T00:03:32.646 INFO:tasks.workunit.client.1.vm06.stdout:8/214: mkdir db/dd/d24/d36/d44 0
2026-03-09T00:03:32.657 INFO:tasks.workunit.client.1.vm06.stdout:6/186: write d4/fb [3286510,48976] 0
2026-03-09T00:03:32.657 INFO:tasks.workunit.client.1.vm06.stdout:6/187: fsync d4/d27/f31 0
2026-03-09T00:03:32.663 INFO:tasks.workunit.client.1.vm06.stdout:2/274: dwrite d7/d1b/f22 [0,4194304] 0
2026-03-09T00:03:32.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:32 vm06.local ceph-mon[58395]: pgmap v132: 65 pgs: 65 active+clean; 447 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 21 MiB/s rd, 37 MiB/s wr, 450 op/s
2026-03-09T00:03:32.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:32 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:32.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:32 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:32.747 INFO:tasks.workunit.client.1.vm06.stdout:3/191: dwrite d11/f18 [0,4194304] 0
2026-03-09T00:03:32.747 INFO:tasks.workunit.client.1.vm06.stdout:4/167: dwrite f16 [0,4194304] 0
2026-03-09T00:03:32.747 INFO:tasks.workunit.client.1.vm06.stdout:9/171: dwrite d1/d3/d2b/f33 [0,4194304] 0
2026-03-09T00:03:32.748 INFO:tasks.workunit.client.1.vm06.stdout:4/168: dread d17/f1d [0,4194304] 0
2026-03-09T00:03:32.751 INFO:tasks.workunit.client.1.vm06.stdout:2/275: dwrite d7/da/d1c/f1f [4194304,4194304] 0
2026-03-09T00:03:32.751 INFO:tasks.workunit.client.1.vm06.stdout:0/229: dwrite d3/d18/d1f/d39/f3d [0,4194304] 0
2026-03-09T00:03:32.752 INFO:tasks.workunit.client.1.vm06.stdout:3/192: creat d11/d28/d2e/f38 x:0 0 0
2026-03-09T00:03:32.754 INFO:tasks.workunit.client.1.vm06.stdout:5/293: dwrite d5/d1c/d21/d28/d35/f4e [0,4194304] 0
2026-03-09T00:03:32.754 INFO:tasks.workunit.client.1.vm06.stdout:2/276: dread f3 [0,4194304] 0
2026-03-09T00:03:32.754 INFO:tasks.workunit.client.1.vm06.stdout:2/277: chown d7/d1a/d39/c3d 149 1
2026-03-09T00:03:32.755 INFO:tasks.workunit.client.1.vm06.stdout:5/294: dread d5/f36 [0,4194304] 0
2026-03-09T00:03:32.757 INFO:tasks.workunit.client.1.vm06.stdout:8/215: dwrite db/d1e/f20 [0,4194304] 0
2026-03-09T00:03:32.757 INFO:tasks.workunit.client.1.vm06.stdout:5/295: write d5/f3d [808590,62119] 0
2026-03-09T00:03:32.757 INFO:tasks.workunit.client.1.vm06.stdout:8/216: chown db/d1e/c42 56774 1
2026-03-09T00:03:32.758 INFO:tasks.workunit.client.1.vm06.stdout:1/184: dwrite d6/f19 [0,4194304] 0
2026-03-09T00:03:32.758 INFO:tasks.workunit.client.1.vm06.stdout:1/185: stat d6/l12 0
2026-03-09T00:03:32.763 INFO:tasks.workunit.client.1.vm06.stdout:1/186: write f0 [688467,59669] 0
2026-03-09T00:03:32.776 INFO:tasks.workunit.client.1.vm06.stdout:6/188: dread d4/fc [0,4194304] 0
2026-03-09T00:03:32.781 INFO:tasks.workunit.client.1.vm06.stdout:9/172: creat d1/d3/d12/d21/d14/f37 x:0 0 0
2026-03-09T00:03:32.782 INFO:tasks.workunit.client.1.vm06.stdout:9/173: dread - d1/d3/d12/d21/d14/f20 zero size
2026-03-09T00:03:32.783 INFO:tasks.workunit.client.1.vm06.stdout:4/169: link d17/f19 d17/d21/f2f 0
2026-03-09T00:03:32.788 INFO:tasks.workunit.client.1.vm06.stdout:0/230: creat d3/d18/d1f/d39/d49/f50 x:0 0 0
2026-03-09T00:03:32.788 INFO:tasks.workunit.client.1.vm06.stdout:0/231: fdatasync d3/f10 0
2026-03-09T00:03:32.788 INFO:tasks.workunit.client.1.vm06.stdout:3/193: chown d11/f1a 262 1
2026-03-09T00:03:32.793 INFO:tasks.workunit.client.1.vm06.stdout:2/278: unlink d7/da/d1c/l28 0
2026-03-09T00:03:32.793 INFO:tasks.workunit.client.1.vm06.stdout:2/279: getdents d7/da/d4e 0
2026-03-09T00:03:32.793 INFO:tasks.workunit.client.1.vm06.stdout:5/296: rmdir d5/d1c/d23/d5a 0
2026-03-09T00:03:32.793 INFO:tasks.workunit.client.1.vm06.stdout:5/297: truncate d5/d1c/d21/f32 574507 0
2026-03-09T00:03:32.793 INFO:tasks.workunit.client.1.vm06.stdout:4/170: dread d17/f1e [4194304,4194304] 0
2026-03-09T00:03:32.795 INFO:tasks.workunit.client.1.vm06.stdout:6/189: getdents d4 0
2026-03-09T00:03:32.795 INFO:tasks.workunit.client.1.vm06.stdout:6/190: write d4/f2d [465492,2472] 0
2026-03-09T00:03:32.795 INFO:tasks.workunit.client.1.vm06.stdout:6/191: creat d4/d16/f33 x:0 0 0
2026-03-09T00:03:32.797 INFO:tasks.workunit.client.1.vm06.stdout:3/194: unlink d11/d28/f2c 0
2026-03-09T00:03:32.798 INFO:tasks.workunit.client.1.vm06.stdout:2/280: link d7/c40 d7/da/d4e/c4f 0
2026-03-09T00:03:32.798 INFO:tasks.workunit.client.1.vm06.stdout:2/281: truncate d7/da/f18 1582403 0
2026-03-09T00:03:32.799 INFO:tasks.workunit.client.1.vm06.stdout:4/171: mknod d17/d21/c30 0
2026-03-09T00:03:32.799 INFO:tasks.workunit.client.1.vm06.stdout:4/172: creat d17/d24/f31 x:0 0 0
2026-03-09T00:03:32.800 INFO:tasks.workunit.client.1.vm06.stdout:5/298: rename d5/d1c/d21/f39 to d5/d1c/d23/d51/f60 0
2026-03-09T00:03:32.801 INFO:tasks.workunit.client.1.vm06.stdout:5/299: truncate d5/d1c/d21/d28/f56 67832 0
2026-03-09T00:03:32.801 INFO:tasks.workunit.client.1.vm06.stdout:5/300: truncate d5/d1c/d21/d28/f56 1019755 0
2026-03-09T00:03:32.810 INFO:tasks.workunit.client.1.vm06.stdout:0/232: link d3/f11 d3/f51 0
2026-03-09T00:03:32.811 INFO:tasks.workunit.client.1.vm06.stdout:0/233: write d3/fa [4518436,83461] 0
2026-03-09T00:03:32.811 INFO:tasks.workunit.client.1.vm06.stdout:0/234: creat d3/d18/d28/d45/f52 x:0 0 0
2026-03-09T00:03:32.812 INFO:tasks.workunit.client.1.vm06.stdout:5/301: dread d5/d1c/d21/d28/d35/f4e [0,4194304] 0
2026-03-09T00:03:32.816 INFO:tasks.workunit.client.1.vm06.stdout:6/192: rename f1 to d4/d16/f34 0
2026-03-09T00:03:32.816 INFO:tasks.workunit.client.1.vm06.stdout:6/193: write d4/d27/f2e [821820,108832] 0
2026-03-09T00:03:32.819 INFO:tasks.workunit.client.1.vm06.stdout:6/194: dread d4/f5 [8388608,4194304] 0
2026-03-09T00:03:32.824 INFO:tasks.workunit.client.1.vm06.stdout:2/282: getdents d7/da/db/de 0
2026-03-09T00:03:32.824 INFO:tasks.workunit.client.1.vm06.stdout:2/283: fdatasync d7/d1a/d25/f33 0
2026-03-09T00:03:32.824 INFO:tasks.workunit.client.1.vm06.stdout:8/217: dwrite db/dd/f2c [0,4194304] 0
2026-03-09T00:03:32.837 INFO:tasks.workunit.client.1.vm06.stdout:0/235: getdents d3/d18/d28 0
2026-03-09T00:03:32.838 INFO:tasks.workunit.client.1.vm06.stdout:5/302: creat d5/d1c/d23/d34/d47/f61 x:0 0 0
2026-03-09T00:03:32.838 INFO:tasks.workunit.client.1.vm06.stdout:5/303: chown d5/f19 17619 1
2026-03-09T00:03:32.840 INFO:tasks.workunit.client.1.vm06.stdout:6/195: mkdir d4/d27/d35 0
2026-03-09T00:03:32.840 INFO:tasks.workunit.client.1.vm06.stdout:6/196: truncate d4/d16/f32 391124 0
2026-03-09T00:03:32.840 INFO:tasks.workunit.client.1.vm06.stdout:6/197: write d4/f5 [2650900,66143] 0
2026-03-09T00:03:32.842 INFO:tasks.workunit.client.1.vm06.stdout:2/284: creat d7/d4b/f50 x:0 0 0
2026-03-09T00:03:32.842 INFO:tasks.workunit.client.1.vm06.stdout:2/285: getdents d7/da/db/de 0
2026-03-09T00:03:32.849 INFO:tasks.workunit.client.1.vm06.stdout:0/236: symlink d3/d18/d1f/d39/l53 0
2026-03-09T00:03:32.862 INFO:tasks.workunit.client.1.vm06.stdout:2/286: dread d7/f8 [4194304,4194304] 0
2026-03-09T00:03:32.862 INFO:tasks.workunit.client.1.vm06.stdout:2/287: fdatasync d7/f17 0
2026-03-09T00:03:32.862 INFO:tasks.workunit.client.1.vm06.stdout:4/173: dwrite d17/f1f [0,4194304] 0
2026-03-09T00:03:32.868 INFO:tasks.workunit.client.1.vm06.stdout:4/174: rmdir d17/d21/d22 39
2026-03-09T00:03:32.868 INFO:tasks.workunit.client.1.vm06.stdout:4/175: chown f15 48558586 1
2026-03-09T00:03:32.873 INFO:tasks.workunit.client.1.vm06.stdout:4/176: mkdir d17/d21/d32 0
2026-03-09T00:03:32.912 INFO:tasks.workunit.client.1.vm06.stdout:4/177: dwrite fe [0,4194304] 0
2026-03-09T00:03:32.914 INFO:tasks.workunit.client.1.vm06.stdout:4/178: symlink d17/d21/d22/l33 0
2026-03-09T00:03:32.916 INFO:tasks.workunit.client.1.vm06.stdout:4/179: mknod d17/d21/d32/c34 0
2026-03-09T00:03:32.916 INFO:tasks.workunit.client.1.vm06.stdout:4/180: chown f1 15691 1
2026-03-09T00:03:32.916 INFO:tasks.workunit.client.1.vm06.stdout:4/181: truncate d17/d21/f2f 816724 0
2026-03-09T00:03:32.918 INFO:tasks.workunit.client.1.vm06.stdout:4/182: rename f7 to d17/f35 0
2026-03-09T00:03:32.924 INFO:tasks.workunit.client.1.vm06.stdout:4/183: dread f1 [8388608,4194304] 0
2026-03-09T00:03:32.924 INFO:tasks.workunit.client.1.vm06.stdout:4/184: creat d17/d24/f36 x:0 0 0
2026-03-09T00:03:32.924 INFO:tasks.workunit.client.1.vm06.stdout:4/185: readlink d17/l1c 0
2026-03-09T00:03:32.926 INFO:tasks.workunit.client.1.vm06.stdout:4/186: truncate f10 640216 0
2026-03-09T00:03:32.953 INFO:tasks.workunit.client.1.vm06.stdout:2/288: truncate d7/d1b/f22 467088 0
2026-03-09T00:03:32.953 INFO:tasks.workunit.client.1.vm06.stdout:2/289: dread - d7/da/db/de/f32 zero size
2026-03-09T00:03:32.953 INFO:tasks.workunit.client.1.vm06.stdout:2/290: readlink d7/da/l15 0
2026-03-09T00:03:32.959 INFO:tasks.workunit.client.1.vm06.stdout:7/218: sync
2026-03-09T00:03:32.960 INFO:tasks.workunit.client.1.vm06.stdout:9/174: getdents d1/d3/d12/d21/d14 0
2026-03-09T00:03:32.961 INFO:tasks.workunit.client.1.vm06.stdout:7/219: creat d0/df/d1a/d27/f37 x:0 0 0
2026-03-09T00:03:32.962 INFO:tasks.workunit.client.1.vm06.stdout:9/175: unlink d1/d3/d12/d21/c27 0
2026-03-09T00:03:32.964 INFO:tasks.workunit.client.1.vm06.stdout:7/220: link d0/df/d1a/f25 d0/df/d17/f38 0
2026-03-09T00:03:32.969 INFO:tasks.workunit.client.1.vm06.stdout:7/221: mkdir d0/d39 0
2026-03-09T00:03:32.979 INFO:tasks.workunit.client.1.vm06.stdout:5/304: write d5/d1c/d21/d28/d35/f4e [4453800,80264] 0
2026-03-09T00:03:32.979 INFO:tasks.workunit.client.1.vm06.stdout:5/305: creat d5/d1c/f62 x:0 0 0
2026-03-09T00:03:32.979 INFO:tasks.workunit.client.1.vm06.stdout:5/306: fdatasync d5/d1c/d21/d28/d35/f52 0
2026-03-09T00:03:32.984 INFO:tasks.workunit.client.1.vm06.stdout:5/307: truncate d5/d1c/f2d 249272 0
2026-03-09T00:03:32.994 INFO:tasks.workunit.client.1.vm06.stdout:5/308: chown d5/d1c/d21/f3c 3 1
2026-03-09T00:03:32.994 INFO:tasks.workunit.client.1.vm06.stdout:5/309: creat d5/d1c/d21/d28/f63 x:0 0 0
2026-03-09T00:03:32.994 INFO:tasks.workunit.client.1.vm06.stdout:5/310: truncate d5/d1c/d23/f5b 122894 0
2026-03-09T00:03:32.994 INFO:tasks.workunit.client.1.vm06.stdout:5/311: write d5/d1c/d21/f3c [370340,123736] 0
2026-03-09T00:03:32.995 INFO:tasks.workunit.client.1.vm06.stdout:0/237: getdents d3 0
2026-03-09T00:03:32.995 INFO:tasks.workunit.client.1.vm06.stdout:0/238: truncate d3/d18/d1f/d39/d49/f50 809960 0
2026-03-09T00:03:33.032 INFO:tasks.workunit.client.1.vm06.stdout:4/187: getdents d17/d21 0
2026-03-09T00:03:33.032 INFO:tasks.workunit.client.1.vm06.stdout:4/188: fdatasync d17/d21/f2f 0
2026-03-09T00:03:33.033 INFO:tasks.workunit.client.1.vm06.stdout:4/189: write d17/f1d [289501,1904] 0
2026-03-09T00:03:33.033 INFO:tasks.workunit.client.1.vm06.stdout:4/190: chown d17/l28 119867 1
2026-03-09T00:03:33.033 INFO:tasks.workunit.client.1.vm06.stdout:4/191: write f15 [314926,46360] 0
2026-03-09T00:03:33.034 INFO:tasks.workunit.client.1.vm06.stdout:4/192: symlink d17/l37 0
2026-03-09T00:03:33.035 INFO:tasks.workunit.client.1.vm06.stdout:4/193: creat d17/d21/f38 x:0 0 0
2026-03-09T00:03:33.035 INFO:tasks.workunit.client.1.vm06.stdout:4/194: chown d17/d21/f2f 534 1
2026-03-09T00:03:33.035 INFO:tasks.workunit.client.1.vm06.stdout:4/195: fdatasync d17/f19 0
2026-03-09T00:03:33.035 INFO:tasks.workunit.client.1.vm06.stdout:4/196: creat d17/d24/f39 x:0 0 0
2026-03-09T00:03:33.043 INFO:tasks.workunit.client.1.vm06.stdout:7/222: dwrite d0/df/d1a/f25 [4194304,4194304] 0
2026-03-09T00:03:33.043 INFO:tasks.workunit.client.1.vm06.stdout:7/223: dread - d0/db/d31/f34 zero size
2026-03-09T00:03:33.043 INFO:tasks.workunit.client.1.vm06.stdout:7/224: fdatasync d0/df/d17/f2d 0
2026-03-09T00:03:33.054 INFO:tasks.workunit.client.1.vm06.stdout:9/176: dwrite d1/d3/d12/f28 [0,4194304] 0
2026-03-09T00:03:33.055 INFO:tasks.workunit.client.0.vm03.stdout:5/9: rename f1 to f2 0
2026-03-09T00:03:33.056 INFO:tasks.workunit.client.0.vm03.stdout:7/5: mkdir d2 0
2026-03-09T00:03:33.056 INFO:tasks.workunit.client.0.vm03.stdout:7/6: chown d2 2653197 1
2026-03-09T00:03:33.056 INFO:tasks.workunit.client.0.vm03.stdout:7/7: fsync - no filename
2026-03-09T00:03:33.057 INFO:tasks.workunit.client.0.vm03.stdout:8/12: rename c1 to c2 0
2026-03-09T00:03:33.057 INFO:tasks.workunit.client.0.vm03.stdout:8/13: truncate - no filename
2026-03-09T00:03:33.057 INFO:tasks.workunit.client.0.vm03.stdout:8/14: write - no filename
2026-03-09T00:03:33.057 INFO:tasks.workunit.client.0.vm03.stdout:8/15: truncate - no filename
2026-03-09T00:03:33.057 INFO:tasks.workunit.client.0.vm03.stdout:8/16: write - no filename
2026-03-09T00:03:33.058 INFO:tasks.workunit.client.1.vm06.stdout:9/177: truncate d1/d3/d12/d21/d14/f18 879141 0
2026-03-09T00:03:33.058 INFO:tasks.workunit.client.1.vm06.stdout:9/178: stat d1/d4/fe 0
2026-03-09T00:03:33.058 INFO:tasks.workunit.client.0.vm03.stdout:1/6: mkdir d3 0
2026-03-09T00:03:33.058 INFO:tasks.workunit.client.0.vm03.stdout:1/7: write f2 [465539,54348] 0
2026-03-09T00:03:33.059 INFO:tasks.workunit.client.0.vm03.stdout:0/13: mkdir d2 0
2026-03-09T00:03:33.060 INFO:tasks.workunit.client.0.vm03.stdout:3/6: mkdir d2 0
2026-03-09T00:03:33.061 INFO:tasks.workunit.client.0.vm03.stdout:4/6: symlink l1 0
2026-03-09T00:03:33.061 INFO:tasks.workunit.client.0.vm03.stdout:4/7: dread - f0 zero size
2026-03-09T00:03:33.061 INFO:tasks.workunit.client.0.vm03.stdout:4/8: readlink l1 0
2026-03-09T00:03:33.061 INFO:tasks.workunit.client.0.vm03.stdout:4/9: fsync f0 0
2026-03-09T00:03:33.061 INFO:tasks.workunit.client.0.vm03.stdout:4/10: rmdir - no directory
2026-03-09T00:03:33.061 INFO:tasks.workunit.client.0.vm03.stdout:4/11: read - f0 zero size
2026-03-09T00:03:33.064 INFO:tasks.workunit.client.0.vm03.stdout:6/11: unlink l0 0
2026-03-09T00:03:33.065 INFO:tasks.workunit.client.0.vm03.stdout:5/10: truncate f0 265154 0
2026-03-09T00:03:33.066 INFO:tasks.workunit.client.0.vm03.stdout:7/8: creat d2/f3 x:0 0 0
2026-03-09T00:03:33.066 INFO:tasks.workunit.client.0.vm03.stdout:8/17: stat c2 0
2026-03-09T00:03:33.066 INFO:tasks.workunit.client.0.vm03.stdout:8/18: dread - no filename
2026-03-09T00:03:33.066 INFO:tasks.workunit.client.0.vm03.stdout:8/19: truncate - no filename
2026-03-09T00:03:33.066 INFO:tasks.workunit.client.0.vm03.stdout:8/20: chown c2 32438229 1
2026-03-09T00:03:33.067 INFO:tasks.workunit.client.0.vm03.stdout:1/8: rmdir d3 0
2026-03-09T00:03:33.068 INFO:tasks.workunit.client.0.vm03.stdout:0/14: symlink d2/l3 0
2026-03-09T00:03:33.080 INFO:tasks.workunit.client.0.vm03.stdout:3/7: mknod d2/c3 0
2026-03-09T00:03:33.082 INFO:tasks.workunit.client.0.vm03.stdout:4/12: creat f2 x:0 0 0
2026-03-09T00:03:33.082 INFO:tasks.workunit.client.0.vm03.stdout:4/13: truncate f2 987878 0
2026-03-09T00:03:33.083 INFO:tasks.workunit.client.0.vm03.stdout:9/4: getdents . 0
2026-03-09T00:03:33.084 INFO:tasks.workunit.client.0.vm03.stdout:6/12: rename f1 to f3 0
2026-03-09T00:03:33.085 INFO:tasks.workunit.client.0.vm03.stdout:7/9: mkdir d2/d4 0
2026-03-09T00:03:33.085 INFO:tasks.workunit.client.0.vm03.stdout:8/21: stat c2 0
2026-03-09T00:03:33.086 INFO:tasks.workunit.client.0.vm03.stdout:1/9: mkdir d4 0
2026-03-09T00:03:33.087 INFO:tasks.workunit.client.0.vm03.stdout:4/14: rename f0 to f3 0
2026-03-09T00:03:33.087 INFO:tasks.workunit.client.0.vm03.stdout:4/15: fdatasync f2 0
2026-03-09T00:03:33.091 INFO:tasks.workunit.client.0.vm03.stdout:7/10: symlink d2/l5 0
2026-03-09T00:03:33.093 INFO:tasks.workunit.client.0.vm03.stdout:8/22: creat f3 x:0 0 0
2026-03-09T00:03:33.094 INFO:tasks.workunit.client.0.vm03.stdout:8/23: truncate f3 288971 0
2026-03-09T00:03:33.094 INFO:tasks.workunit.client.1.vm06.stdout:2/291: dwrite d7/d1a/f30 [0,4194304] 0
2026-03-09T00:03:33.094 INFO:tasks.workunit.client.1.vm06.stdout:2/292: dread - d7/da/db/de/f49 zero size
2026-03-09T00:03:33.096 INFO:tasks.workunit.client.0.vm03.stdout:7/11: link l1 d2/l6 0
2026-03-09T00:03:33.096 INFO:tasks.workunit.client.0.vm03.stdout:7/12: dread - d2/f3 zero size
2026-03-09T00:03:33.096 INFO:tasks.workunit.client.0.vm03.stdout:7/13: write d2/f3 [867772,126366] 0
2026-03-09T00:03:33.097 INFO:tasks.workunit.client.0.vm03.stdout:7/14: unlink l0 0
2026-03-09T00:03:33.104 INFO:tasks.workunit.client.0.vm03.stdout:4/16: dread f2 [0,4194304] 0
2026-03-09T00:03:33.109 INFO:tasks.workunit.client.0.vm03.stdout:9/5: write f0 [2961931,15896] 0
2026-03-09T00:03:33.117 INFO:tasks.workunit.client.1.vm06.stdout:4/197: rename f10 to d17/d24/f3a 0
2026-03-09T00:03:33.123 INFO:tasks.workunit.client.1.vm06.stdout:4/198: mkdir d17/d24/d3b 0
2026-03-09T00:03:33.123 INFO:tasks.workunit.client.1.vm06.stdout:7/225: rename d0/db to d0/df/d1a/d3a 0
2026-03-09T00:03:33.124 INFO:tasks.workunit.client.1.vm06.stdout:2/293: rename d7/da/l24 to d7/da/d1c/d47/l51 0
2026-03-09T00:03:33.124 INFO:tasks.workunit.client.1.vm06.stdout:2/294: chown d7/f3a 122059077 1
2026-03-09T00:03:33.124 INFO:tasks.workunit.client.1.vm06.stdout:2/295: write d7/da/d1c/f29 [2667333,10816] 0
2026-03-09T00:03:33.134 INFO:tasks.workunit.client.1.vm06.stdout:8/218: sync
2026-03-09T00:03:33.141 INFO:tasks.workunit.client.1.vm06.stdout:3/195: sync
2026-03-09T00:03:33.142 INFO:tasks.workunit.client.1.vm06.stdout:6/198: sync
2026-03-09T00:03:33.142 INFO:tasks.workunit.client.1.vm06.stdout:3/196: symlink d11/d28/d2e/d2f/l39 0
2026-03-09T00:03:33.142 INFO:tasks.workunit.client.1.vm06.stdout:1/187: sync
2026-03-09T00:03:33.142 INFO:tasks.workunit.client.1.vm06.stdout:1/188: mknod d6/d21/c40 0
2026-03-09T00:03:33.142 INFO:tasks.workunit.client.1.vm06.stdout:1/189: creat d6/f41 x:0 0 0
2026-03-09T00:03:33.142 INFO:tasks.workunit.client.1.vm06.stdout:1/190: mkdir d6/d21/d2d/d3b/d42 0
2026-03-09T00:03:33.142 INFO:tasks.workunit.client.1.vm06.stdout:1/191: read d6/f1d [90640,52628] 0
2026-03-09T00:03:33.142 INFO:tasks.workunit.client.1.vm06.stdout:1/192: write d6/d21/f3d [873619,103325] 0
2026-03-09T00:03:33.142 INFO:tasks.workunit.client.1.vm06.stdout:1/193: mkdir d6/d21/d2d/d3b/d42/d43 0
2026-03-09T00:03:33.142 INFO:tasks.workunit.client.1.vm06.stdout:1/194: truncate d6/ff 2553168 0
2026-03-09T00:03:33.175 INFO:tasks.workunit.client.1.vm06.stdout:7/226: dread d0/df/f29 [0,4194304] 0
2026-03-09T00:03:33.175 INFO:tasks.workunit.client.1.vm06.stdout:7/227: write d0/f14 [3109564,45764] 0
2026-03-09T00:03:33.177 INFO:tasks.workunit.client.1.vm06.stdout:7/228: unlink d0/df/d1a/l2f 0
2026-03-09T00:03:33.200 INFO:tasks.workunit.client.0.vm03.stdout:0/15: dwrite f1 [0,4194304] 0
2026-03-09T00:03:33.201 INFO:tasks.workunit.client.0.vm03.stdout:0/16: link d2/l3 d2/l4 0
2026-03-09T00:03:33.204 INFO:tasks.workunit.client.1.vm06.stdout:1/195: read d6/d21/f2a [983981,57336] 0
2026-03-09T00:03:33.211 INFO:tasks.workunit.client.1.vm06.stdout:7/229: dread d0/df/d1a/d22/f2c [4194304,4194304] 0
2026-03-09T00:03:33.211 INFO:tasks.workunit.client.1.vm06.stdout:7/230: fdatasync d0/df/d17/f21 0
2026-03-09T00:03:33.216 INFO:tasks.workunit.client.1.vm06.stdout:7/231: read d0/df/d1a/f25 [5562875,104526] 0
2026-03-09T00:03:33.216 INFO:tasks.workunit.client.1.vm06.stdout:8/219: dread f5 [0,4194304] 0
2026-03-09T00:03:33.217 INFO:tasks.workunit.client.1.vm06.stdout:7/232: mknod d0/d39/c3b 0
2026-03-09T00:03:33.217 INFO:tasks.workunit.client.1.vm06.stdout:7/233: readlink d0/l1c 0
2026-03-09T00:03:33.217 INFO:tasks.workunit.client.1.vm06.stdout:7/234: write d0/df/d17/f21 [791414,25487] 0
2026-03-09T00:03:33.218 INFO:tasks.workunit.client.1.vm06.stdout:7/235: unlink d0/df/f29 0
2026-03-09T00:03:33.218 INFO:tasks.workunit.client.1.vm06.stdout:7/236: chown d0/c4 12776688 1
2026-03-09T00:03:33.218 INFO:tasks.workunit.client.0.vm03.stdout:2/1: sync
2026-03-09T00:03:33.222 INFO:tasks.workunit.client.0.vm03.stdout:2/2: write - no filename
2026-03-09T00:03:33.223 INFO:tasks.workunit.client.0.vm03.stdout:2/3: link - no file
2026-03-09T00:03:33.223 INFO:tasks.workunit.client.0.vm03.stdout:2/4: truncate - no filename
2026-03-09T00:03:33.223 INFO:tasks.workunit.client.0.vm03.stdout:2/5: mknod c0 0
2026-03-09T00:03:33.223 INFO:tasks.workunit.client.0.vm03.stdout:2/6: dread - no filename
2026-03-09T00:03:33.223 INFO:tasks.workunit.client.0.vm03.stdout:2/7: write - no filename
2026-03-09T00:03:33.223 INFO:tasks.workunit.client.0.vm03.stdout:2/8: write - no filename
2026-03-09T00:03:33.223 INFO:tasks.workunit.client.0.vm03.stdout:2/9: chown c0 377 1
2026-03-09T00:03:33.223 INFO:tasks.workunit.client.0.vm03.stdout:2/10: read - no filename
2026-03-09T00:03:33.230 INFO:tasks.workunit.client.0.vm03.stdout:1/10: dread f2 [0,4194304] 0
2026-03-09T00:03:33.230 INFO:tasks.workunit.client.0.vm03.stdout:1/11: readlink - no filename
2026-03-09T00:03:33.230 INFO:tasks.workunit.client.0.vm03.stdout:1/12: dread f2 [0,4194304] 0
2026-03-09T00:03:33.230 INFO:tasks.workunit.client.0.vm03.stdout:1/13: fsync f2 0
2026-03-09T00:03:33.230 INFO:tasks.workunit.client.0.vm03.stdout:2/11: symlink l1 0
2026-03-09T00:03:33.232 INFO:tasks.workunit.client.0.vm03.stdout:1/14: mknod d4/c5 0
2026-03-09T00:03:33.232 INFO:tasks.workunit.client.0.vm03.stdout:2/12: creat f2 x:0 0 0
2026-03-09T00:03:33.232 INFO:tasks.workunit.client.0.vm03.stdout:2/13: stat l1 0
2026-03-09T00:03:33.234 INFO:tasks.workunit.client.0.vm03.stdout:1/15: mkdir d4/d6 0
2026-03-09T00:03:33.234 INFO:tasks.workunit.client.0.vm03.stdout:1/16: chown f1 13665211 1
2026-03-09T00:03:33.236 INFO:tasks.workunit.client.0.vm03.stdout:1/17: read f1 [354405,129231] 0
2026-03-09T00:03:33.237 INFO:tasks.workunit.client.0.vm03.stdout:1/18: symlink d4/l7 0
2026-03-09T00:03:33.237 INFO:tasks.workunit.client.0.vm03.stdout:1/19: read f2 [271274,70432] 0
2026-03-09T00:03:33.237 INFO:tasks.workunit.client.0.vm03.stdout:1/20: creat d4/d6/f8 x:0 0 0
2026-03-09T00:03:33.238 INFO:tasks.workunit.client.0.vm03.stdout:1/21: creat d4/f9 x:0 0 0
2026-03-09T00:03:33.243 INFO:tasks.workunit.client.0.vm03.stdout:1/22: dread f1 [0,4194304] 0
2026-03-09T00:03:33.253 INFO:tasks.workunit.client.0.vm03.stdout:1/23: dread f2 [0,4194304] 0
2026-03-09T00:03:33.254 INFO:tasks.workunit.client.0.vm03.stdout:1/24: dread f1 [0,4194304] 0
2026-03-09T00:03:33.254 INFO:tasks.workunit.client.0.vm03.stdout:1/25: write d4/f9 [974535,118147] 0
2026-03-09T00:03:33.254 INFO:tasks.workunit.client.0.vm03.stdout:1/26: dread f1 [0,4194304] 0
2026-03-09T00:03:33.254 INFO:tasks.workunit.client.0.vm03.stdout:1/27: dread f2 [0,4194304] 0
2026-03-09T00:03:33.254 INFO:tasks.workunit.client.0.vm03.stdout:1/28: link f1 d4/d6/fa 0
2026-03-09T00:03:33.254 INFO:tasks.workunit.client.0.vm03.stdout:1/29: dread f2 [0,4194304] 0
2026-03-09T00:03:33.254 INFO:tasks.workunit.client.0.vm03.stdout:1/30: chown d4/d6 5782541 1
2026-03-09T00:03:33.254 INFO:tasks.workunit.client.0.vm03.stdout:1/31: creat d4/fb x:0 0 0
2026-03-09T00:03:33.254 INFO:tasks.workunit.client.0.vm03.stdout:1/32: symlink d4/lc 0
2026-03-09T00:03:33.254 INFO:tasks.workunit.client.0.vm03.stdout:1/33: mknod d4/cd 0
2026-03-09T00:03:33.257 INFO:tasks.workunit.client.1.vm06.stdout:0/239: dwrite d3/fa [0,4194304] 0
2026-03-09T00:03:33.262 INFO:tasks.workunit.client.0.vm03.stdout:4/17: dwrite f3 [0,4194304] 0
2026-03-09T00:03:33.262 INFO:tasks.workunit.client.1.vm06.stdout:0/240: read d3/f1c [155793,74873] 0
2026-03-09T00:03:33.273 INFO:tasks.workunit.client.1.vm06.stdout:0/241: symlink d3/d18/d28/d45/l54 0
2026-03-09T00:03:33.294 INFO:tasks.workunit.client.1.vm06.stdout:0/242: rename d3/d18/f22 to d3/d18/d1f/d39/d3b/f55 0
2026-03-09T00:03:33.294 INFO:tasks.workunit.client.1.vm06.stdout:0/243: chown d3/d18/d2c/d2d/d31 15318 1
2026-03-09T00:03:33.294 INFO:tasks.workunit.client.1.vm06.stdout:0/244: truncate d3/d18/d2c/d2d/f46 25055 0
2026-03-09T00:03:33.294 INFO:tasks.workunit.client.1.vm06.stdout:0/245: link d3/fa d3/d18/d28/f56 0
2026-03-09T00:03:33.294 INFO:tasks.workunit.client.1.vm06.stdout:0/246: chown d3/d18/d1f/d39/d3b/l38 65232 1
2026-03-09T00:03:33.294 INFO:tasks.workunit.client.1.vm06.stdout:0/247: creat d3/d18/d1f/d39/d3b/f57 x:0 0 0
2026-03-09T00:03:33.295 INFO:tasks.workunit.client.0.vm03.stdout:3/8: dwrite f1 [0,4194304] 0
2026-03-09T00:03:33.296 INFO:tasks.workunit.client.0.vm03.stdout:3/9: mknod d2/c4 0
2026-03-09T00:03:33.340 INFO:tasks.workunit.client.1.vm06.stdout:2/296: dwrite d7/d1b/f3b [0,4194304] 0
2026-03-09T00:03:33.340 INFO:tasks.workunit.client.1.vm06.stdout:2/297: write f6 [1867615,103522] 0
2026-03-09T00:03:33.342 INFO:tasks.workunit.client.1.vm06.stdout:2/298: link d7/da/db/de/c10 d7/da/d4e/c52 0
2026-03-09T00:03:33.342 INFO:tasks.workunit.client.1.vm06.stdout:2/299: write d7/f3a [612172,67711] 0
2026-03-09T00:03:33.354 INFO:tasks.workunit.client.0.vm03.stdout:9/6: write f0 [4155595,70919] 0
2026-03-09T00:03:33.359 INFO:tasks.workunit.client.0.vm03.stdout:6/13: truncate f2 1164208 0
2026-03-09T00:03:33.363 INFO:tasks.workunit.client.0.vm03.stdout:5/11: unlink f0 0
2026-03-09T00:03:33.363 INFO:tasks.workunit.client.0.vm03.stdout:0/17: dwrite f1 [0,4194304] 0
2026-03-09T00:03:33.363 INFO:tasks.workunit.client.0.vm03.stdout:0/18: chown d2 688 1
2026-03-09T00:03:33.364 INFO:tasks.workunit.client.0.vm03.stdout:5/12: rename f2 to f3 0
2026-03-09T00:03:33.366 INFO:tasks.workunit.client.0.vm03.stdout:0/19: creat d2/f5 x:0 0 0
2026-03-09T00:03:33.367 INFO:tasks.workunit.client.0.vm03.stdout:5/13: symlink l4 0
2026-03-09T00:03:33.367 INFO:tasks.workunit.client.1.vm06.stdout:2/300: dread d7/f8 [8388608,4194304] 0
2026-03-09T00:03:33.369 INFO:tasks.workunit.client.1.vm06.stdout:9/179: dwrite d1/d3/f35 [0,4194304] 0
2026-03-09T00:03:33.375 INFO:tasks.workunit.client.0.vm03.stdout:0/20: unlink d2/l4 0
2026-03-09T00:03:33.376 INFO:tasks.workunit.client.0.vm03.stdout:0/21: symlink d2/l6 0
2026-03-09T00:03:33.376 INFO:tasks.workunit.client.0.vm03.stdout:0/22: creat d2/f7 x:0 0 0
2026-03-09T00:03:33.378 INFO:tasks.workunit.client.1.vm06.stdout:0/248: write d3/d18/d1f/d39/f3d [2749265,73914] 0
2026-03-09T00:03:33.385 INFO:tasks.workunit.client.0.vm03.stdout:0/23: truncate f0 1986732 0
2026-03-09T00:03:33.385 INFO:tasks.workunit.client.0.vm03.stdout:0/24: mknod d2/c8 0
2026-03-09T00:03:33.385 INFO:tasks.workunit.client.1.vm06.stdout:9/180: rename d1/d3/d12/d21/c22 to d1/d3/d12/d21/d14/d25/c38 0
2026-03-09T00:03:33.385 INFO:tasks.workunit.client.1.vm06.stdout:9/181: write d1/d3/d12/d21/f8 [449655,108038] 0
2026-03-09T00:03:33.385 INFO:tasks.workunit.client.1.vm06.stdout:9/182: creat d1/d4/f39 x:0 0 0
2026-03-09T00:03:33.385 INFO:tasks.workunit.client.1.vm06.stdout:0/249: creat d3/d18/d1f/d44/f58 x:0 0 0
2026-03-09T00:03:33.389 INFO:tasks.workunit.client.1.vm06.stdout:9/183: write d1/d3/f35 [878128,107788] 0
2026-03-09T00:03:33.389 INFO:tasks.workunit.client.1.vm06.stdout:9/184: creat d1/d3/d12/f3a x:0 0 0
2026-03-09T00:03:33.397 INFO:tasks.workunit.client.1.vm06.stdout:6/199: dwrite d4/fc [0,4194304] 0
2026-03-09T00:03:33.397 INFO:tasks.workunit.client.1.vm06.stdout:6/200: fsync d4/fa 0
2026-03-09T00:03:33.397 INFO:tasks.workunit.client.0.vm03.stdout:7/15: truncate d2/f3 269722 0
2026-03-09T00:03:33.398 INFO:tasks.workunit.client.1.vm06.stdout:6/201: link d4/f12 d4/f36 0
2026-03-09T00:03:33.398 INFO:tasks.workunit.client.1.vm06.stdout:6/202: mknod d4/d27/c37 0
2026-03-09T00:03:33.399 INFO:tasks.workunit.client.1.vm06.stdout:6/203: creat d4/f38 x:0 0 0
2026-03-09T00:03:33.408 INFO:tasks.workunit.client.1.vm06.stdout:8/220: dwrite db/f17 [0,4194304] 0
2026-03-09T00:03:33.408 INFO:tasks.workunit.client.1.vm06.stdout:8/221: getdents db/dd/d24/d36/d44 0
2026-03-09T00:03:33.418 INFO:tasks.workunit.client.0.vm03.stdout:8/24: dwrite f3 [0,4194304] 0
2026-03-09T00:03:33.423 INFO:tasks.workunit.client.0.vm03.stdout:8/25: symlink l4 0
2026-03-09T00:03:33.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:33 vm06.local ceph-mon[58395]: pgmap v133: 65 pgs: 65 active+clean; 535 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 31 MiB/s rd, 53 MiB/s wr, 509 op/s
2026-03-09T00:03:33.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:33 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:33.428 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:33 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:33.428 INFO:tasks.workunit.client.1.vm06.stdout:3/197: dwrite f7 [0,4194304] 0
2026-03-09T00:03:33.428 INFO:tasks.workunit.client.0.vm03.stdout:8/26: symlink l5 0
2026-03-09T00:03:33.428 INFO:tasks.workunit.client.0.vm03.stdout:8/27: fsync f3 0
2026-03-09T00:03:33.430 INFO:tasks.workunit.client.1.vm06.stdout:3/198: creat d11/d28/f3a x:0 0 0
2026-03-09T00:03:33.431 INFO:tasks.workunit.client.1.vm06.stdout:3/199: readlink d11/d28/l30 0
2026-03-09T00:03:33.434 INFO:tasks.workunit.client.1.vm06.stdout:3/200: link d11/d28/l30 d11/d28/d2e/l3b 0
2026-03-09T00:03:33.435 INFO:tasks.workunit.client.1.vm06.stdout:3/201: unlink d11/f13 0
2026-03-09T00:03:33.439 INFO:tasks.workunit.client.1.vm06.stdout:3/202: dread d11/f16 [0,4194304] 0
2026-03-09T00:03:33.442 INFO:tasks.workunit.client.1.vm06.stdout:3/203: truncate d11/f1e 330850 0
2026-03-09T00:03:33.442 INFO:tasks.workunit.client.1.vm06.stdout:3/204: truncate d11/f12 658824 0
2026-03-09T00:03:33.452 INFO:tasks.workunit.client.0.vm03.stdout:1/34: dwrite d4/d6/f8 [0,4194304] 0
2026-03-09T00:03:33.455 INFO:tasks.workunit.client.0.vm03.stdout:1/35: unlink d4/cd 0
2026-03-09T00:03:33.459 INFO:tasks.workunit.client.0.vm03.stdout:2/14: dwrite f2 [0,4194304] 0
2026-03-09T00:03:33.464 INFO:tasks.workunit.client.0.vm03.stdout:1/36: dread d4/d6/fa [0,4194304] 0
2026-03-09T00:03:33.464 INFO:tasks.workunit.client.0.vm03.stdout:2/15: mknod c3 0
2026-03-09T00:03:33.466 INFO:tasks.workunit.client.0.vm03.stdout:1/37: symlink d4/d6/le 0
2026-03-09T00:03:33.466 INFO:tasks.workunit.client.0.vm03.stdout:1/38: chown f0 15819 1
2026-03-09T00:03:33.468 INFO:tasks.workunit.client.0.vm03.stdout:1/39: link d4/d6/fa d4/d6/ff 0
2026-03-09T00:03:33.469 INFO:tasks.workunit.client.0.vm03.stdout:1/40: dread f2 [0,4194304] 0
2026-03-09T00:03:33.470 INFO:tasks.workunit.client.0.vm03.stdout:1/41: symlink d4/l10 0
2026-03-09T00:03:33.471 INFO:tasks.workunit.client.0.vm03.stdout:2/16: dread f2 [0,4194304] 0
2026-03-09T00:03:33.473 INFO:tasks.workunit.client.0.vm03.stdout:1/42: symlink d4/d6/l11 0
2026-03-09T00:03:33.481 INFO:tasks.workunit.client.0.vm03.stdout:5/14: dwrite f3 [0,4194304] 0
2026-03-09T00:03:33.481 INFO:tasks.workunit.client.0.vm03.stdout:5/15: rmdir - no directory
2026-03-09T00:03:33.485 INFO:tasks.workunit.client.0.vm03.stdout:5/16: symlink l5 0
2026-03-09T00:03:33.490 INFO:tasks.workunit.client.0.vm03.stdout:5/17: read f3 [1127172,15421] 0
2026-03-09T00:03:33.497 INFO:tasks.workunit.client.1.vm06.stdout:1/196: dwrite d6/f7 [0,4194304] 0
2026-03-09T00:03:33.499 INFO:tasks.workunit.client.1.vm06.stdout:1/197: rename d6/f1a to d6/d21/d2d/f44 0
2026-03-09T00:03:33.501 INFO:tasks.workunit.client.1.vm06.stdout:7/237: dwrite d0/f2 [4194304,4194304] 0
2026-03-09T00:03:33.501 INFO:tasks.workunit.client.1.vm06.stdout:7/238: chown d0/fe 106258 1
2026-03-09T00:03:33.503 INFO:tasks.workunit.client.1.vm06.stdout:1/198: unlink d6/d21/d2d/d3b/l3e 0
2026-03-09T00:03:33.504 INFO:tasks.workunit.client.1.vm06.stdout:7/239: dread d0/df/d1a/d22/f28 [0,4194304] 0
2026-03-09T00:03:33.511 INFO:tasks.workunit.client.1.vm06.stdout:1/199: unlink d6/d21/f2a 0
2026-03-09T00:03:33.522 INFO:tasks.workunit.client.1.vm06.stdout:8/222: dwrite db/d1e/f34 [0,4194304] 0
2026-03-09T00:03:33.523 INFO:tasks.workunit.client.1.vm06.stdout:8/223: write db/d1e/f2e [150942,93466] 0
2026-03-09T00:03:33.523 INFO:tasks.workunit.client.1.vm06.stdout:8/224: chown db/f31 0 1
2026-03-09T00:03:33.523 INFO:tasks.workunit.client.1.vm06.stdout:8/225: readlink db/l12 0
2026-03-09T00:03:33.524 INFO:tasks.workunit.client.1.vm06.stdout:8/226: rmdir db/dd/d24/d36/d44 0
2026-03-09T00:03:33.524 INFO:tasks.workunit.client.1.vm06.stdout:8/227: fsync db/dd/f2c 0
2026-03-09T00:03:33.526 INFO:tasks.workunit.client.1.vm06.stdout:8/228: creat db/dd/d24/d36/f45 x:0 0 0
2026-03-09T00:03:33.526 INFO:tasks.workunit.client.1.vm06.stdout:8/229: write db/f1d [375339,51957] 0
2026-03-09T00:03:33.527 INFO:tasks.workunit.client.1.vm06.stdout:8/230: mkdir db/d1e/d46 0
2026-03-09T00:03:33.527 INFO:tasks.workunit.client.1.vm06.stdout:8/231: truncate db/f2d 591050 0
2026-03-09T00:03:33.527 INFO:tasks.workunit.client.1.vm06.stdout:8/232: write db/dd/f27 [706290,106956] 0
2026-03-09T00:03:33.528 INFO:tasks.workunit.client.1.vm06.stdout:8/233: write db/f31 [31699,86808] 0
2026-03-09T00:03:33.531 INFO:tasks.workunit.client.1.vm06.stdout:5/312: sync
2026-03-09T00:03:33.531 INFO:tasks.workunit.client.1.vm06.stdout:5/313: chown c4 59 1
2026-03-09T00:03:33.532 INFO:tasks.workunit.client.1.vm06.stdout:5/314: symlink d5/l64 0
2026-03-09T00:03:33.532 INFO:tasks.workunit.client.1.vm06.stdout:5/315: write d5/d1c/d21/d28/f57 [529024,35244] 0
2026-03-09T00:03:33.532 INFO:tasks.workunit.client.1.vm06.stdout:5/316: mknod d5/d1c/d21/d28/d35/c65 0
2026-03-09T00:03:33.537 INFO:tasks.workunit.client.1.vm06.stdout:5/317: write d5/f1d [1036457,78729] 0
2026-03-09T00:03:33.537 INFO:tasks.workunit.client.1.vm06.stdout:5/318: chown d5/d1c/d23/f4c 154343 1
2026-03-09T00:03:33.538 INFO:tasks.workunit.client.1.vm06.stdout:5/319: mkdir d5/d1c/d21/d28/d5e/d66 0
2026-03-09T00:03:33.538 INFO:tasks.workunit.client.1.vm06.stdout:5/320: fdatasync d5/d1c/d21/d28/f63 0
2026-03-09T00:03:33.557 INFO:tasks.workunit.client.0.vm03.stdout:3/10: truncate f1 3584396 0
2026-03-09T00:03:33.601 INFO:tasks.workunit.client.0.vm03.stdout:3/11: fdatasync f0 0
2026-03-09T00:03:33.602 INFO:tasks.workunit.client.0.vm03.stdout:3/12: getdents d2 0
2026-03-09T00:03:33.602 INFO:tasks.workunit.client.0.vm03.stdout:3/13: creat d2/f5 x:0 0 0
2026-03-09T00:03:33.602 INFO:tasks.workunit.client.1.vm06.stdout:2/301: dwrite f3 [0,4194304] 0
2026-03-09T00:03:33.602 INFO:tasks.workunit.client.1.vm06.stdout:2/302: write f6 [1251937,32878] 0
2026-03-09T00:03:33.602 INFO:tasks.workunit.client.1.vm06.stdout:2/303: fdatasync f3 0
2026-03-09T00:03:33.602 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:33 vm03.local ceph-mon[52346]: pgmap v133: 65 pgs: 65 active+clean; 535 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 31 MiB/s rd, 53 MiB/s wr, 509 op/s
2026-03-09T00:03:33.602 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:33 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:33.602 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:33 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:33.768 INFO:tasks.workunit.client.1.vm06.stdout:8/234: dwrite db/dd/f40 [0,4194304] 0
2026-03-09T00:03:33.775 INFO:tasks.workunit.client.0.vm03.stdout:7/16: truncate d2/f3 990258 0
2026-03-09T00:03:33.777 INFO:tasks.workunit.client.0.vm03.stdout:7/17: rename l1 to d2/l7 0
2026-03-09T00:03:33.779 INFO:tasks.workunit.client.1.vm06.stdout:3/205: dwrite f9 [0,4194304] 0
2026-03-09T00:03:33.798 INFO:tasks.workunit.client.0.vm03.stdout:0/25: getdents d2 0
2026-03-09T00:03:33.798 INFO:tasks.workunit.client.0.vm03.stdout:0/26: rename d2/f5 to d2/f9 0
2026-03-09T00:03:33.799 INFO:tasks.workunit.client.0.vm03.stdout:0/27: rmdir d2 39
2026-03-09T00:03:33.799 INFO:tasks.workunit.client.0.vm03.stdout:8/28: getdents . 0
2026-03-09T00:03:33.799 INFO:tasks.workunit.client.1.vm06.stdout:3/206: truncate d11/f1d 925836 0
2026-03-09T00:03:33.799 INFO:tasks.workunit.client.1.vm06.stdout:0/250: getdents d3/d18/d1f/d44 0
2026-03-09T00:03:33.799 INFO:tasks.workunit.client.1.vm06.stdout:0/251: readlink d3/d18/l27 0
2026-03-09T00:03:33.799 INFO:tasks.workunit.client.1.vm06.stdout:0/252: write d3/d18/d1f/f4a [999364,21183] 0
2026-03-09T00:03:33.799 INFO:tasks.workunit.client.1.vm06.stdout:0/253: creat d3/d18/f59 x:0 0 0
2026-03-09T00:03:33.799 INFO:tasks.workunit.client.1.vm06.stdout:0/254: write d3/d18/d2c/f4d [590092,34955] 0
2026-03-09T00:03:33.799 INFO:tasks.workunit.client.1.vm06.stdout:3/207: rename d11/f1f to d11/f3c 0
2026-03-09T00:03:33.799 INFO:tasks.workunit.client.1.vm06.stdout:6/204: rmdir d4/d27 39
2026-03-09T00:03:33.799 INFO:tasks.workunit.client.1.vm06.stdout:6/205: rename d4/fa to d4/d27/d35/f39 0
2026-03-09T00:03:33.799 INFO:tasks.workunit.client.1.vm06.stdout:3/208: mknod d11/d28/c3d 0
2026-03-09T00:03:33.813 INFO:tasks.workunit.client.0.vm03.stdout:5/18: getdents . 0
2026-03-09T00:03:33.814 INFO:tasks.workunit.client.0.vm03.stdout:5/19: creat f6 x:0 0 0
2026-03-09T00:03:33.825 INFO:tasks.workunit.client.0.vm03.stdout:3/14: dwrite f0 [0,4194304] 0
2026-03-09T00:03:33.851 INFO:tasks.workunit.client.0.vm03.stdout:2/17: dwrite f2 [0,4194304] 0
2026-03-09T00:03:33.851 INFO:tasks.workunit.client.0.vm03.stdout:2/18: creat f4 x:0 0 0
2026-03-09T00:03:33.851 INFO:tasks.workunit.client.1.vm06.stdout:4/199: sync
2026-03-09T00:03:33.866 INFO:tasks.workunit.client.1.vm06.stdout:5/321: dwrite d5/d44/f4a [4194304,4194304] 0
2026-03-09T00:03:33.867 INFO:tasks.workunit.client.1.vm06.stdout:5/322: dread - d5/d1c/d21/d28/f63 zero size
2026-03-09T00:03:33.867 INFO:tasks.workunit.client.1.vm06.stdout:5/323: mknod d5/d1c/d21/d28/c67 0
2026-03-09T00:03:33.867 INFO:tasks.workunit.client.1.vm06.stdout:5/324: fsync d5/f14 0
2026-03-09T00:03:33.867 INFO:tasks.workunit.client.1.vm06.stdout:5/325: fdatasync d5/d1c/d21/d28/f56 0
2026-03-09T00:03:33.887 INFO:tasks.workunit.client.0.vm03.stdout:5/20: dread f3 [0,4194304] 0
2026-03-09T00:03:33.889 INFO:tasks.workunit.client.0.vm03.stdout:5/21: dread f3 [0,4194304] 0
2026-03-09T00:03:33.890 INFO:tasks.workunit.client.0.vm03.stdout:5/22: mknod c7 0
2026-03-09T00:03:33.890 INFO:tasks.workunit.client.0.vm03.stdout:5/23: rmdir - no directory
2026-03-09T00:03:33.890 INFO:tasks.workunit.client.0.vm03.stdout:5/24: creat f8 x:0 0 0
2026-03-09T00:03:33.895 INFO:tasks.workunit.client.0.vm03.stdout:8/29: dread f3 [0,4194304] 0
2026-03-09T00:03:33.919 INFO:tasks.workunit.client.1.vm06.stdout:2/304: dwrite d7/da/f18 [0,4194304] 0
2026-03-09T00:03:33.923 INFO:tasks.workunit.client.1.vm06.stdout:2/305: dread d7/da/d1c/f1f [4194304,4194304] 0
2026-03-09T00:03:33.923 INFO:tasks.workunit.client.1.vm06.stdout:4/200: write d17/d21/d22/f2a [134340,44646] 0
2026-03-09T00:03:33.925 INFO:tasks.workunit.client.0.vm03.stdout:6/14: dwrite f3 [0,4194304] 0
2026-03-09T00:03:33.932 INFO:tasks.workunit.client.0.vm03.stdout:6/15: mknod c4 0
2026-03-09T00:03:33.932 INFO:tasks.workunit.client.0.vm03.stdout:6/16: symlink l5 0
2026-03-09T00:03:33.932 INFO:tasks.workunit.client.0.vm03.stdout:6/17: readlink l5 0
2026-03-09T00:03:33.933 INFO:tasks.workunit.client.1.vm06.stdout:8/235: dwrite db/f3f [0,4194304] 0
2026-03-09T00:03:33.933 INFO:tasks.workunit.client.1.vm06.stdout:8/236: chown db/dd/f27 952 1
2026-03-09T00:03:33.935 INFO:tasks.workunit.client.0.vm03.stdout:9/7: dwrite f0 [0,4194304] 0
2026-03-09T00:03:33.935 INFO:tasks.workunit.client.0.vm03.stdout:9/8: unlink f0 0
2026-03-09T00:03:33.937 INFO:tasks.workunit.client.0.vm03.stdout:9/9: symlink l1 0
2026-03-09T00:03:33.937 INFO:tasks.workunit.client.0.vm03.stdout:9/10: readlink l1 0
2026-03-09T00:03:33.937 INFO:tasks.workunit.client.0.vm03.stdout:9/11: dread - no filename
2026-03-09T00:03:33.937 INFO:tasks.workunit.client.0.vm03.stdout:9/12: readlink l1 0
2026-03-09T00:03:33.937 INFO:tasks.workunit.client.0.vm03.stdout:9/13: dread - no filename
2026-03-09T00:03:33.938 INFO:tasks.workunit.client.1.vm06.stdout:1/200: dwrite f0 [0,4194304] 0
2026-03-09T00:03:33.938 INFO:tasks.workunit.client.1.vm06.stdout:1/201: read d6/ff [1759484,16412] 0
2026-03-09T00:03:33.939 INFO:tasks.workunit.client.0.vm03.stdout:9/14: creat f2 x:0 0 0
2026-03-09T00:03:33.940 INFO:tasks.workunit.client.0.vm03.stdout:9/15: symlink l3 0
2026-03-09T00:03:33.940 INFO:tasks.workunit.client.0.vm03.stdout:9/16: chown l3 14189 1
2026-03-09T00:03:33.942 INFO:tasks.workunit.client.0.vm03.stdout:6/18: dread f3 [0,4194304] 0
2026-03-09T00:03:33.942 INFO:tasks.workunit.client.0.vm03.stdout:9/17: unlink f2 0
2026-03-09T00:03:33.959 INFO:tasks.workunit.client.0.vm03.stdout:9/18: write - no filename
2026-03-09T00:03:33.960 INFO:tasks.workunit.client.1.vm06.stdout:1/202: dread d6/ff [0,4194304] 0
2026-03-09T00:03:33.960 INFO:tasks.workunit.client.1.vm06.stdout:1/203: dread - d6/f34 zero size
2026-03-09T00:03:33.960 INFO:tasks.workunit.client.1.vm06.stdout:1/204: creat d6/d21/d2d/d3b/d42/d43/f45 x:0 0 0
2026-03-09T00:03:33.960 INFO:tasks.workunit.client.1.vm06.stdout:1/205: fdatasync d6/d21/d2d/d3b/d42/d43/f45 0
2026-03-09T00:03:33.960 INFO:tasks.workunit.client.1.vm06.stdout:1/206: rename d6/c14 to d6/d21/d2d/d3b/c46 0
2026-03-09T00:03:33.960 INFO:tasks.workunit.client.1.vm06.stdout:1/207: chown d6/d21/c40 15883998 1
2026-03-09T00:03:33.960 INFO:tasks.workunit.client.1.vm06.stdout:1/208: write d6/f25 [4684280,28924] 0
2026-03-09T00:03:33.960 INFO:tasks.workunit.client.0.vm03.stdout:9/19: unlink l3 0
2026-03-09T00:03:33.960 INFO:tasks.workunit.client.0.vm03.stdout:9/20: write - no filename
2026-03-09T00:03:33.960 INFO:tasks.workunit.client.0.vm03.stdout:9/21: rename l1 to l4 0
2026-03-09T00:03:33.960 INFO:tasks.workunit.client.0.vm03.stdout:9/22: creat f5 x:0 0 0
2026-03-09T00:03:33.961 INFO:tasks.workunit.client.1.vm06.stdout:1/209: dread d6/fa [0,4194304] 0
2026-03-09T00:03:33.962 INFO:tasks.workunit.client.1.vm06.stdout:1/210: read - d6/f41 zero size
2026-03-09T00:03:33.967 INFO:tasks.workunit.client.1.vm06.stdout:9/185: dwrite d1/d3/d12/d21/f8 [0,4194304] 0
2026-03-09T00:03:33.968 INFO:tasks.workunit.client.1.vm06.stdout:3/209: dwrite d11/f3c [0,4194304] 0
2026-03-09T00:03:33.988 INFO:tasks.workunit.client.0.vm03.stdout:8/30: dwrite f3 [0,4194304] 0
2026-03-09T00:03:33.990 INFO:tasks.workunit.client.0.vm03.stdout:8/31: creat f6 x:0 0 0
2026-03-09T00:03:33.990 INFO:tasks.workunit.client.0.vm03.stdout:8/32: chown l5 0 1
2026-03-09T00:03:33.990 INFO:tasks.workunit.client.0.vm03.stdout:8/33: mkdir d7 0
2026-03-09T00:03:34.031 INFO:tasks.workunit.client.1.vm06.stdout:1/211: dread d6/d21/d2d/f44 [0,4194304] 0
2026-03-09T00:03:34.031 INFO:tasks.workunit.client.1.vm06.stdout:1/212: chown d6/f28 54112 1
2026-03-09T00:03:34.036 INFO:tasks.workunit.client.1.vm06.stdout:1/213: write d6/f19 [3977263,34487] 0
2026-03-09T00:03:34.037 INFO:tasks.workunit.client.1.vm06.stdout:1/214: unlink d6/l18 0
2026-03-09T00:03:34.037 INFO:tasks.workunit.client.1.vm06.stdout:1/215: read d6/f28 [947269,26827] 0
2026-03-09T00:03:34.037 INFO:tasks.workunit.client.1.vm06.stdout:1/216: chown d6/l10 6597 1
2026-03-09T00:03:34.038 INFO:tasks.workunit.client.1.vm06.stdout:1/217: creat d6/d21/d2d/d37/f47 x:0 0 0
2026-03-09T00:03:34.038 INFO:tasks.workunit.client.1.vm06.stdout:4/201: dwrite d17/f35 [0,4194304] 0
2026-03-09T00:03:34.039 INFO:tasks.workunit.client.1.vm06.stdout:1/218: getdents d6/d21/d2d 0
2026-03-09T00:03:34.039 INFO:tasks.workunit.client.1.vm06.stdout:1/219: write d6/f34 [660349,46079] 0
2026-03-09T00:03:34.040 INFO:tasks.workunit.client.1.vm06.stdout:6/206: dwrite d4/d16/f1c [0,4194304] 0
2026-03-09T00:03:34.053 INFO:tasks.workunit.client.1.vm06.stdout:6/207: write d4/fc [830941,119321] 0
2026-03-09T00:03:34.053 INFO:tasks.workunit.client.1.vm06.stdout:3/210: dwrite d11/f1a [0,4194304] 0
2026-03-09T00:03:34.055 INFO:tasks.workunit.client.1.vm06.stdout:6/208: symlink d4/d27/l3a 0
2026-03-09T00:03:34.055 INFO:tasks.workunit.client.1.vm06.stdout:6/209: write d4/d27/d35/f39 [170248,67823] 0
2026-03-09T00:03:34.055 INFO:tasks.workunit.client.1.vm06.stdout:6/210: dread - d4/f26 zero size
2026-03-09T00:03:34.084 INFO:tasks.workunit.client.0.vm03.stdout:2/19: getdents . 0
2026-03-09T00:03:34.084 INFO:tasks.workunit.client.0.vm03.stdout:2/20: rmdir - no directory
2026-03-09T00:03:34.084 INFO:tasks.workunit.client.0.vm03.stdout:2/21: stat f4 0
2026-03-09T00:03:34.084 INFO:tasks.workunit.client.0.vm03.stdout:2/22: chown c0 4358 1
2026-03-09T00:03:34.085 INFO:tasks.workunit.client.1.vm06.stdout:9/186: dwrite d1/d3/d12/d21/d9/f10 [0,4194304] 0
2026-03-09T00:03:34.112 INFO:tasks.workunit.client.0.vm03.stdout:2/23: write f2 [3509939,21966] 0
2026-03-09T00:03:34.112 INFO:tasks.workunit.client.0.vm03.stdout:2/24: rmdir - no directory
2026-03-09T00:03:34.176 INFO:tasks.workunit.client.0.vm03.stdout:7/18: dwrite d2/f3 [0,4194304] 0
2026-03-09T00:03:34.177 INFO:tasks.workunit.client.0.vm03.stdout:7/19: chown d2/d4 234274 1
2026-03-09T00:03:34.177 INFO:tasks.workunit.client.1.vm06.stdout:6/211: dwrite d4/d16/f34 [0,4194304] 0
2026-03-09T00:03:34.178 INFO:tasks.workunit.client.1.vm06.stdout:3/211: dwrite d11/f1e [0,4194304] 0
2026-03-09T00:03:34.178 INFO:tasks.workunit.client.1.vm06.stdout:9/187: dwrite d1/d3/d12/d21/d14/f20 [0,4194304] 0
2026-03-09T00:03:34.178 INFO:tasks.workunit.client.1.vm06.stdout:3/212: fsync d11/d28/d2e/f38 0
2026-03-09T00:03:34.183 INFO:tasks.workunit.client.0.vm03.stdout:7/20: write d2/f3 [857891,105633] 0
2026-03-09T00:03:34.189 INFO:tasks.workunit.client.0.vm03.stdout:3/15: read f1 [458914,45607] 0
2026-03-09T00:03:34.189 INFO:tasks.workunit.client.0.vm03.stdout:3/16: creat d2/f6 x:0 0 0
2026-03-09T00:03:34.190 INFO:tasks.workunit.client.0.vm03.stdout:3/17: creat d2/f7 x:0 0 0
2026-03-09T00:03:34.191 INFO:tasks.workunit.client.0.vm03.stdout:3/18: creat d2/f8 x:0 0 0
2026-03-09T00:03:34.194 INFO:tasks.workunit.client.0.vm03.stdout:2/25: dwrite f4 [0,4194304] 0
2026-03-09T00:03:34.194 INFO:tasks.workunit.client.0.vm03.stdout:2/26: getdents . 0
0 2026-03-09T00:03:34.207 INFO:tasks.workunit.client.0.vm03.stdout:6/19: truncate f2 1143519 0 2026-03-09T00:03:34.209 INFO:tasks.workunit.client.0.vm03.stdout:4/18: sync 2026-03-09T00:03:34.209 INFO:tasks.workunit.client.0.vm03.stdout:0/28: sync 2026-03-09T00:03:34.209 INFO:tasks.workunit.client.0.vm03.stdout:1/43: sync 2026-03-09T00:03:34.209 INFO:tasks.workunit.client.0.vm03.stdout:1/44: creat d4/f12 x:0 0 0 2026-03-09T00:03:34.210 INFO:tasks.workunit.client.0.vm03.stdout:9/23: rename l4 to l6 0 2026-03-09T00:03:34.210 INFO:tasks.workunit.client.0.vm03.stdout:9/24: truncate f5 459064 0 2026-03-09T00:03:34.210 INFO:tasks.workunit.client.0.vm03.stdout:9/25: chown l6 17623258 1 2026-03-09T00:03:34.213 INFO:tasks.workunit.client.0.vm03.stdout:6/20: unlink l5 0 2026-03-09T00:03:34.213 INFO:tasks.workunit.client.0.vm03.stdout:6/21: rmdir - no directory 2026-03-09T00:03:34.215 INFO:tasks.workunit.client.0.vm03.stdout:4/19: rename f2 to f4 0 2026-03-09T00:03:34.220 INFO:tasks.workunit.client.0.vm03.stdout:8/34: fsync f6 0 2026-03-09T00:03:34.220 INFO:tasks.workunit.client.0.vm03.stdout:8/35: dread - f6 zero size 2026-03-09T00:03:34.222 INFO:tasks.workunit.client.0.vm03.stdout:8/36: mknod d7/c8 0 2026-03-09T00:03:34.222 INFO:tasks.workunit.client.0.vm03.stdout:8/37: readlink l5 0 2026-03-09T00:03:34.222 INFO:tasks.workunit.client.0.vm03.stdout:9/26: write f5 [289773,63124] 0 2026-03-09T00:03:34.223 INFO:tasks.workunit.client.1.vm06.stdout:6/212: dread d4/d16/f34 [0,4194304] 0 2026-03-09T00:03:34.223 INFO:tasks.workunit.client.1.vm06.stdout:1/220: rmdir d6/d21/d2d/d37 39 2026-03-09T00:03:34.226 INFO:tasks.workunit.client.0.vm03.stdout:4/20: read f3 [3225776,53830] 0 2026-03-09T00:03:34.229 INFO:tasks.workunit.client.0.vm03.stdout:8/38: dread f3 [0,4194304] 0 2026-03-09T00:03:34.240 INFO:tasks.workunit.client.1.vm06.stdout:1/221: readlink d6/d21/l2c 0 2026-03-09T00:03:34.240 INFO:tasks.workunit.client.1.vm06.stdout:1/222: symlink d6/d21/d2d/d3b/l48 0 2026-03-09T00:03:34.240 INFO:tasks.workunit.client.1.vm06.stdout:1/223: mknod d6/d21/c49 0 2026-03-09T00:03:34.240 INFO:tasks.workunit.client.1.vm06.stdout:1/224: chown d6/d21/d2d/l32 51 1 2026-03-09T00:03:34.240 INFO:tasks.workunit.client.1.vm06.stdout:1/225: chown d6/d21/f3d 122512 1 2026-03-09T00:03:34.240 INFO:tasks.workunit.client.1.vm06.stdout:1/226: chown f0 3 1 2026-03-09T00:03:34.240 INFO:tasks.workunit.client.1.vm06.stdout:1/227: truncate d6/d21/d2d/f31 41217 0 2026-03-09T00:03:34.240 INFO:tasks.workunit.client.1.vm06.stdout:1/228: fdatasync d6/d21/d2d/d37/f47 0 2026-03-09T00:03:34.240 INFO:tasks.workunit.client.1.vm06.stdout:1/229: creat d6/d21/d2d/d3b/d42/d43/f4a x:0 0 0 2026-03-09T00:03:34.241 INFO:tasks.workunit.client.0.vm03.stdout:8/39: dread f3 [0,4194304] 0 2026-03-09T00:03:34.241 INFO:tasks.workunit.client.0.vm03.stdout:4/21: symlink l5 0 2026-03-09T00:03:34.241 INFO:tasks.workunit.client.0.vm03.stdout:8/40: creat d7/f9 x:0 0 0 2026-03-09T00:03:34.241 INFO:tasks.workunit.client.0.vm03.stdout:4/22: truncate f4 160839 0 2026-03-09T00:03:34.241 INFO:tasks.workunit.client.0.vm03.stdout:4/23: mknod c6 0 2026-03-09T00:03:34.241 INFO:tasks.workunit.client.0.vm03.stdout:4/24: readlink l5 0 2026-03-09T00:03:34.241 INFO:tasks.workunit.client.0.vm03.stdout:4/25: unlink l5 0 2026-03-09T00:03:34.241 INFO:tasks.workunit.client.0.vm03.stdout:4/26: read f3 [652796,1134] 0 2026-03-09T00:03:34.241 INFO:tasks.workunit.client.0.vm03.stdout:4/27: fsync f3 0 2026-03-09T00:03:34.242 INFO:tasks.workunit.client.0.vm03.stdout:4/28: mkdir d7 0 
2026-03-09T00:03:34.243 INFO:tasks.workunit.client.0.vm03.stdout:4/29: creat d7/f8 x:0 0 0
2026-03-09T00:03:34.244 INFO:tasks.workunit.client.0.vm03.stdout:4/30: unlink c6 0
2026-03-09T00:03:34.248 INFO:tasks.workunit.client.0.vm03.stdout:4/31: dread f3 [0,4194304] 0
2026-03-09T00:03:34.248 INFO:tasks.workunit.client.0.vm03.stdout:2/27: dread f2 [0,4194304] 0
2026-03-09T00:03:34.249 INFO:tasks.workunit.client.0.vm03.stdout:4/32: unlink l1 0
2026-03-09T00:03:34.249 INFO:tasks.workunit.client.0.vm03.stdout:4/33: readlink - no filename
2026-03-09T00:03:34.249 INFO:tasks.workunit.client.0.vm03.stdout:2/28: mknod c5 0
2026-03-09T00:03:34.250 INFO:tasks.workunit.client.0.vm03.stdout:4/34: mknod d7/c9 0
2026-03-09T00:03:34.257 INFO:tasks.workunit.client.1.vm06.stdout:1/230: write d6/f7 [100200,82942] 0
2026-03-09T00:03:34.257 INFO:tasks.workunit.client.1.vm06.stdout:1/231: creat d6/d21/d2d/d3b/d42/d43/f4b x:0 0 0
2026-03-09T00:03:34.276 INFO:tasks.workunit.client.0.vm03.stdout:3/19: dwrite d2/f5 [0,4194304] 0
2026-03-09T00:03:34.276 INFO:tasks.workunit.client.0.vm03.stdout:0/29: dwrite d2/f9 [0,4194304] 0
2026-03-09T00:03:34.283 INFO:tasks.workunit.client.0.vm03.stdout:3/20: write f0 [1771932,102259] 0
2026-03-09T00:03:34.283 INFO:tasks.workunit.client.0.vm03.stdout:3/21: truncate d2/f7 414943 0
2026-03-09T00:03:34.286 INFO:tasks.workunit.client.0.vm03.stdout:1/45: dwrite f1 [0,4194304] 0
2026-03-09T00:03:34.288 INFO:tasks.workunit.client.0.vm03.stdout:0/30: mkdir d2/da 0
2026-03-09T00:03:34.289 INFO:tasks.workunit.client.0.vm03.stdout:3/22: rename d2/f7 to d2/f9 0
2026-03-09T00:03:34.291 INFO:tasks.workunit.client.0.vm03.stdout:1/46: dread f1 [0,4194304] 0
2026-03-09T00:03:34.291 INFO:tasks.workunit.client.0.vm03.stdout:1/47: dread - d4/f12 zero size
2026-03-09T00:03:34.291 INFO:tasks.workunit.client.0.vm03.stdout:1/48: dread - d4/fb zero size
2026-03-09T00:03:34.291 INFO:tasks.workunit.client.0.vm03.stdout:1/49: read - f0 zero size
2026-03-09T00:03:34.293 INFO:tasks.workunit.client.0.vm03.stdout:0/31: dread d2/f9 [0,4194304] 0
2026-03-09T00:03:34.294 INFO:tasks.workunit.client.0.vm03.stdout:1/50: read d4/d6/fa [99786,31544] 0
2026-03-09T00:03:34.295 INFO:tasks.workunit.client.0.vm03.stdout:1/51: write d4/d6/ff [993038,60001] 0
2026-03-09T00:03:34.295 INFO:tasks.workunit.client.0.vm03.stdout:0/32: stat d2/l6 0
2026-03-09T00:03:34.298 INFO:tasks.workunit.client.0.vm03.stdout:0/33: creat d2/fb x:0 0 0
2026-03-09T00:03:34.314 INFO:tasks.workunit.client.0.vm03.stdout:1/52: dread d4/d6/fa [0,4194304] 0
2026-03-09T00:03:34.314 INFO:tasks.workunit.client.0.vm03.stdout:1/53: chown d4/l7 3332161 1
2026-03-09T00:03:34.314 INFO:tasks.workunit.client.0.vm03.stdout:1/54: chown f1 1906938608 1
2026-03-09T00:03:34.319 INFO:tasks.workunit.client.0.vm03.stdout:9/27: dwrite f5 [0,4194304] 0
2026-03-09T00:03:34.327 INFO:tasks.workunit.client.0.vm03.stdout:1/55: rmdir d4/d6 39
2026-03-09T00:03:34.329 INFO:tasks.workunit.client.0.vm03.stdout:1/56: symlink d4/l13 0
2026-03-09T00:03:34.331 INFO:tasks.workunit.client.0.vm03.stdout:1/57: dread f2 [0,4194304] 0
2026-03-09T00:03:34.369 INFO:tasks.workunit.client.0.vm03.stdout:5/25: sync
2026-03-09T00:03:34.369 INFO:tasks.workunit.client.0.vm03.stdout:5/26: fdatasync f6 0
2026-03-09T00:03:34.369 INFO:tasks.workunit.client.0.vm03.stdout:1/58: dwrite f0 [0,4194304] 0
2026-03-09T00:03:34.369 INFO:tasks.workunit.client.0.vm03.stdout:5/27: readlink l5 0
2026-03-09T00:03:34.369 INFO:tasks.workunit.client.0.vm03.stdout:5/28: write f6 [722426,95004] 0
2026-03-09T00:03:34.369 INFO:tasks.workunit.client.0.vm03.stdout:5/29: rmdir - no directory
2026-03-09T00:03:34.369 INFO:tasks.workunit.client.0.vm03.stdout:5/30: readlink l5 0
2026-03-09T00:03:34.369 INFO:tasks.workunit.client.0.vm03.stdout:5/31: stat c7 0
2026-03-09T00:03:34.371 INFO:tasks.workunit.client.1.vm06.stdout:9/188: rename d1/d3/d12/d21/f8 to d1/d3/d12/f3b 0
2026-03-09T00:03:34.371 INFO:tasks.workunit.client.1.vm06.stdout:9/189: dread - d1/f2a zero size
2026-03-09T00:03:34.371 INFO:tasks.workunit.client.1.vm06.stdout:9/190: read - d1/d4/f2d zero size
2026-03-09T00:03:34.371 INFO:tasks.workunit.client.1.vm06.stdout:9/191: dread - d1/f2a zero size
2026-03-09T00:03:34.371 INFO:tasks.workunit.client.0.vm03.stdout:5/32: rename l5 to l9 0
2026-03-09T00:03:34.372 INFO:tasks.workunit.client.1.vm06.stdout:9/192: getdents d1 0
2026-03-09T00:03:34.372 INFO:tasks.workunit.client.1.vm06.stdout:9/193: dread - d1/f1c zero size
2026-03-09T00:03:34.374 INFO:tasks.workunit.client.1.vm06.stdout:3/213: rename d11/f18 to d11/d28/d2e/d2f/f3e 0
2026-03-09T00:03:34.375 INFO:tasks.workunit.client.1.vm06.stdout:1/232: mkdir d6/d4c 0
2026-03-09T00:03:34.377 INFO:tasks.workunit.client.1.vm06.stdout:9/194: mknod d1/c3c 0
2026-03-09T00:03:34.379 INFO:tasks.workunit.client.1.vm06.stdout:3/214: mkdir d11/d3f 0
2026-03-09T00:03:34.379 INFO:tasks.workunit.client.1.vm06.stdout:3/215: fdatasync d11/d28/f29 0
2026-03-09T00:03:34.381 INFO:tasks.workunit.client.1.vm06.stdout:3/216: symlink d11/d28/l40 0
2026-03-09T00:03:34.382 INFO:tasks.workunit.client.1.vm06.stdout:3/217: mknod d11/d28/d2e/c41 0
2026-03-09T00:03:34.383 INFO:tasks.workunit.client.1.vm06.stdout:3/218: rmdir d11/d28/d2e/d2f 39
2026-03-09T00:03:34.383 INFO:tasks.workunit.client.1.vm06.stdout:3/219: readlink d11/d28/d2e/d2f/l39 0
2026-03-09T00:03:34.383 INFO:tasks.workunit.client.1.vm06.stdout:3/220: fsync d11/f24 0
2026-03-09T00:03:34.383 INFO:tasks.workunit.client.0.vm03.stdout:8/41: getdents d7 0
2026-03-09T00:03:34.384 INFO:tasks.workunit.client.0.vm03.stdout:8/42: stat f3 0
2026-03-09T00:03:34.384 INFO:tasks.workunit.client.0.vm03.stdout:8/43: dread - f6 zero size
2026-03-09T00:03:34.385 INFO:tasks.workunit.client.1.vm06.stdout:3/221: truncate f8 1544700 0
2026-03-09T00:03:34.385 INFO:tasks.workunit.client.0.vm03.stdout:4/35: write f4 [1024349,95934] 0
2026-03-09T00:03:34.386 INFO:tasks.workunit.client.0.vm03.stdout:8/44: link l4 d7/la 0
2026-03-09T00:03:34.386 INFO:tasks.workunit.client.0.vm03.stdout:4/36: mkdir d7/da 0
2026-03-09T00:03:34.388 INFO:tasks.workunit.client.0.vm03.stdout:8/45: write d7/f9 [902089,73906] 0
2026-03-09T00:03:34.390 INFO:tasks.workunit.client.1.vm06.stdout:3/222: rename d11/d28/f29 to d11/d28/f42 0
2026-03-09T00:03:34.397 INFO:tasks.workunit.client.0.vm03.stdout:8/46: write f3 [3867483,78590] 0
2026-03-09T00:03:34.411 INFO:tasks.workunit.client.0.vm03.stdout:8/47: dread f3 [0,4194304] 0
2026-03-09T00:03:34.412 INFO:tasks.workunit.client.0.vm03.stdout:8/48: unlink c2 0
2026-03-09T00:03:34.412 INFO:tasks.workunit.client.0.vm03.stdout:8/49: write d7/f9 [338361,13057] 0
2026-03-09T00:03:34.415 INFO:tasks.workunit.client.1.vm06.stdout:1/233: dwrite d6/f19 [0,4194304] 0
2026-03-09T00:03:34.415 INFO:tasks.workunit.client.0.vm03.stdout:8/50: mknod d7/cb 0
2026-03-09T00:03:34.415 INFO:tasks.workunit.client.0.vm03.stdout:8/51: write f6 [629011,36268] 0
2026-03-09T00:03:34.415 INFO:tasks.workunit.client.0.vm03.stdout:8/52: write f6 [1302316,20037] 0
2026-03-09T00:03:34.420 INFO:tasks.workunit.client.1.vm06.stdout:1/234: fdatasync d6/f28 0
2026-03-09T00:03:34.479 INFO:tasks.workunit.client.0.vm03.stdout:8/53: dwrite f3 [0,4194304] 0
2026-03-09T00:03:34.479 INFO:tasks.workunit.client.0.vm03.stdout:4/37: dwrite f4 [0,4194304] 0
2026-03-09T00:03:34.479 INFO:tasks.workunit.client.0.vm03.stdout:4/38: fsync f3 0
2026-03-09T00:03:34.480 INFO:tasks.workunit.client.0.vm03.stdout:4/39: rmdir d7/da 0
2026-03-09T00:03:34.480 INFO:tasks.workunit.client.0.vm03.stdout:4/40: dread - d7/f8 zero size
2026-03-09T00:03:34.481 INFO:tasks.workunit.client.0.vm03.stdout:8/54: link l5 d7/lc 0
2026-03-09T00:03:34.486 INFO:tasks.workunit.client.0.vm03.stdout:4/41: dread f4 [0,4194304] 0
2026-03-09T00:03:34.486 INFO:tasks.workunit.client.0.vm03.stdout:4/42: readlink - no filename
2026-03-09T00:03:34.491 INFO:tasks.workunit.client.1.vm06.stdout:9/195: unlink d1/d3/l29 0
2026-03-09T00:03:34.492 INFO:tasks.workunit.client.1.vm06.stdout:9/196: fdatasync d1/d4/f2d 0
2026-03-09T00:03:34.492 INFO:tasks.workunit.client.1.vm06.stdout:9/197: creat d1/d3/d12/d21/d9/f3d x:0 0 0
2026-03-09T00:03:34.492 INFO:tasks.workunit.client.1.vm06.stdout:9/198: creat d1/d3/d12/d21/f3e x:0 0 0
2026-03-09T00:03:34.496 INFO:tasks.workunit.client.1.vm06.stdout:9/199: symlink d1/d3/d12/d21/d14/l3f 0
2026-03-09T00:03:34.498 INFO:tasks.workunit.client.1.vm06.stdout:9/200: truncate d1/f16 148670 0
2026-03-09T00:03:34.499 INFO:tasks.workunit.client.1.vm06.stdout:9/201: link d1/d4/f6 d1/d3/d12/d21/d9/f40 0
2026-03-09T00:03:34.500 INFO:tasks.workunit.client.1.vm06.stdout:9/202: dread - d1/d4/f39 zero size
2026-03-09T00:03:34.500 INFO:tasks.workunit.client.0.vm03.stdout:7/21: sync
2026-03-09T00:03:34.500 INFO:tasks.workunit.client.0.vm03.stdout:6/22: sync
2026-03-09T00:03:34.500 INFO:tasks.workunit.client.0.vm03.stdout:1/59: write d4/d6/fa [4984935,84325] 0
2026-03-09T00:03:34.501 INFO:tasks.workunit.client.0.vm03.stdout:7/22: mknod d2/c8 0
2026-03-09T00:03:34.508 INFO:tasks.workunit.client.0.vm03.stdout:6/23: truncate f3 878241 0
2026-03-09T00:03:34.517 INFO:tasks.workunit.client.1.vm06.stdout:9/203: mknod d1/d3/c41 0
2026-03-09T00:03:34.529 INFO:tasks.workunit.client.1.vm06.stdout:9/204: symlink d1/d4/l42 0
2026-03-09T00:03:34.529 INFO:tasks.workunit.client.1.vm06.stdout:8/237: sync
2026-03-09T00:03:34.529 INFO:tasks.workunit.client.1.vm06.stdout:7/240: sync
2026-03-09T00:03:34.529 INFO:tasks.workunit.client.1.vm06.stdout:8/238: stat db/d1e/l2b 0
2026-03-09T00:03:34.529 INFO:tasks.workunit.client.1.vm06.stdout:7/241: fdatasync d0/df/d1a/d22/f2c 0
2026-03-09T00:03:34.529 INFO:tasks.workunit.client.1.vm06.stdout:7/242: chown d0/df/d1a 61082 1
2026-03-09T00:03:34.529 INFO:tasks.workunit.client.1.vm06.stdout:4/202: sync
2026-03-09T00:03:34.529 INFO:tasks.workunit.client.1.vm06.stdout:2/306: sync
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.1.vm06.stdout:5/326: sync
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.1.vm06.stdout:0/255: sync
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.0.vm03.stdout:3/23: rmdir d2 39
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.0.vm03.stdout:3/24: write d2/f9 [1085308,34002] 0
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.0.vm03.stdout:1/60: symlink d4/l14 0
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.0.vm03.stdout:6/24: link c4 c6 0
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.0.vm03.stdout:1/61: mkdir d4/d15 0
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.0.vm03.stdout:3/25: symlink d2/la 0
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.0.vm03.stdout:3/26: write d2/f9 [1124760,53514] 0
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.0.vm03.stdout:3/27: write d2/f6 [827978,42226] 0
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.0.vm03.stdout:2/29: sync
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.0.vm03.stdout:9/28: sync
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.0.vm03.stdout:9/29: creat f7 x:0 0 0
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.0.vm03.stdout:9/30: chown f5 231161 1
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.0.vm03.stdout:9/31: rmdir - no directory
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.0.vm03.stdout:9/32: creat f8 x:0 0 0
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.0.vm03.stdout:9/33: truncate f8 900939 0
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.0.vm03.stdout:5/33: rename l9 to la 0
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.1.vm06.stdout:7/243: dread d0/df/d1a/d22/f28 [0,4194304] 0
2026-03-09T00:03:34.530 INFO:tasks.workunit.client.1.vm06.stdout:7/244: chown d0/df/d1a/d22/c24 4266562 1
2026-03-09T00:03:34.532 INFO:tasks.workunit.client.0.vm03.stdout:2/30: creat f6 x:0 0 0
2026-03-09T00:03:34.532 INFO:tasks.workunit.client.0.vm03.stdout:2/31: write f6 [890780,80794] 0
2026-03-09T00:03:34.532 INFO:tasks.workunit.client.1.vm06.stdout:9/205: dread d1/d4/f6 [0,4194304] 0
2026-03-09T00:03:34.533 INFO:tasks.workunit.client.1.vm06.stdout:4/203: write d17/d24/f2c [38528,27356] 0
2026-03-09T00:03:34.533 INFO:tasks.workunit.client.1.vm06.stdout:4/204: write d17/d24/f29 [837240,3690] 0
2026-03-09T00:03:34.534 INFO:tasks.workunit.client.1.vm06.stdout:8/239: mkdir db/dd/d24/d36/d38/d47 0
2026-03-09T00:03:34.538 INFO:tasks.workunit.client.1.vm06.stdout:6/213: creat d4/f3b x:0 0 0
2026-03-09T00:03:34.542 INFO:tasks.workunit.client.1.vm06.stdout:2/307: dread d7/da/f18 [0,4194304] 0
2026-03-09T00:03:34.542 INFO:tasks.workunit.client.1.vm06.stdout:0/256: rename d3/d18/d28/f2b to d3/d18/d1f/d44/f5a 0
2026-03-09T00:03:34.542 INFO:tasks.workunit.client.0.vm03.stdout:9/34: mknod c9 0
2026-03-09T00:03:34.543 INFO:tasks.workunit.client.0.vm03.stdout:9/35: rmdir - no directory
2026-03-09T00:03:34.544 INFO:tasks.workunit.client.0.vm03.stdout:5/34: dread f3 [0,4194304] 0
2026-03-09T00:03:34.546 INFO:tasks.workunit.client.0.vm03.stdout:2/32: truncate f2 68990 0
2026-03-09T00:03:34.558 INFO:tasks.workunit.client.0.vm03.stdout:2/33: creat f7 x:0 0 0
2026-03-09T00:03:34.559 INFO:tasks.workunit.client.1.vm06.stdout:9/206: creat d1/d4/d2f/f43 x:0 0 0
2026-03-09T00:03:34.559 INFO:tasks.workunit.client.1.vm06.stdout:4/205: symlink d17/l3c 0
2026-03-09T00:03:34.559 INFO:tasks.workunit.client.1.vm06.stdout:6/214: getdents d4/d27 0
2026-03-09T00:03:34.559 INFO:tasks.workunit.client.1.vm06.stdout:6/215: chown l0 0 1
2026-03-09T00:03:34.559 INFO:tasks.workunit.client.1.vm06.stdout:2/308: creat d7/da/db/de/f53 x:0 0 0
2026-03-09T00:03:34.559 INFO:tasks.workunit.client.0.vm03.stdout:9/36: read f5 [1324560,92015] 0
2026-03-09T00:03:34.559 INFO:tasks.workunit.client.0.vm03.stdout:2/34: mkdir d8 0
2026-03-09T00:03:34.559 INFO:tasks.workunit.client.0.vm03.stdout:9/37: symlink la 0
2026-03-09T00:03:34.559 INFO:tasks.workunit.client.0.vm03.stdout:5/35: rename f6 to fb 0
2026-03-09T00:03:34.559 INFO:tasks.workunit.client.0.vm03.stdout:2/35: creat d8/f9 x:0 0 0
2026-03-09T00:03:34.560 INFO:tasks.workunit.client.0.vm03.stdout:1/62: fsync f2 0
2026-03-09T00:03:34.560 INFO:tasks.workunit.client.0.vm03.stdout:1/63: write d4/fb [168215,18883] 0
2026-03-09T00:03:34.560 INFO:tasks.workunit.client.0.vm03.stdout:1/64: write f2 [5679031,17676] 0
2026-03-09T00:03:34.560 INFO:tasks.workunit.client.0.vm03.stdout:1/65: chown d4/d6/l11 232224 1
2026-03-09T00:03:34.561 INFO:tasks.workunit.client.1.vm06.stdout:6/216: dread d4/ff [0,4194304] 0
2026-03-09T00:03:34.562 INFO:tasks.workunit.client.0.vm03.stdout:8/55: dwrite f6 [0,4194304] 0
2026-03-09T00:03:34.562 INFO:tasks.workunit.client.1.vm06.stdout:0/257: mknod d3/d18/d1f/c5b 0
2026-03-09T00:03:34.565 INFO:tasks.workunit.client.1.vm06.stdout:7/245: rmdir d0/df/d1a/d27 39
2026-03-09T00:03:34.567 INFO:tasks.workunit.client.1.vm06.stdout:9/207: rename d1/d4/f2d to d1/d4/f44 0
2026-03-09T00:03:34.580 INFO:tasks.workunit.client.1.vm06.stdout:9/208: truncate d1/d3/d12/f3b 4635316 0
2026-03-09T00:03:34.580 INFO:tasks.workunit.client.1.vm06.stdout:4/206: unlink f16 0
2026-03-09T00:03:34.586 INFO:tasks.workunit.client.1.vm06.stdout:8/240: mkdir db/dd/d48 0
2026-03-09T00:03:34.586 INFO:tasks.workunit.client.1.vm06.stdout:2/309: mknod d7/d1a/d25/c54 0
2026-03-09T00:03:34.586 INFO:tasks.workunit.client.1.vm06.stdout:9/209: read d1/d3/d12/d21/d9/f10 [3354133,41229] 0
2026-03-09T00:03:34.587 INFO:tasks.workunit.client.1.vm06.stdout:6/217: mknod d4/d16/c3c 0
2026-03-09T00:03:34.587 INFO:tasks.workunit.client.1.vm06.stdout:6/218: creat d4/f3d x:0 0 0
2026-03-09T00:03:34.587 INFO:tasks.workunit.client.1.vm06.stdout:6/219: fsync d4/fb 0
2026-03-09T00:03:34.587 INFO:tasks.workunit.client.1.vm06.stdout:6/220: dread - d4/f26 zero size
2026-03-09T00:03:34.587 INFO:tasks.workunit.client.1.vm06.stdout:6/221: readlink d4/d27/l2c 0
2026-03-09T00:03:34.587 INFO:tasks.workunit.client.1.vm06.stdout:0/258: truncate d3/f1b 2711258 0
2026-03-09T00:03:34.610 INFO:tasks.workunit.client.1.vm06.stdout:7/246: rename d0/fa to d0/df/d1a/d3a/f3c 0
2026-03-09T00:03:34.610 INFO:tasks.workunit.client.1.vm06.stdout:7/247: stat d0/df/d1a/d27 0
2026-03-09T00:03:34.610 INFO:tasks.workunit.client.1.vm06.stdout:8/241: mknod db/dd/d24/d36/d38/d47/c49 0
2026-03-09T00:03:34.610 INFO:tasks.workunit.client.1.vm06.stdout:8/242: write db/f31 [111421,6534] 0
2026-03-09T00:03:34.610 INFO:tasks.workunit.client.0.vm03.stdout:0/34: dwrite d2/f9 [4194304,4194304] 0
2026-03-09T00:03:34.612 INFO:tasks.workunit.client.0.vm03.stdout:3/28: fsync d2/f9 0
2026-03-09T00:03:34.612 INFO:tasks.workunit.client.0.vm03.stdout:7/23: dwrite d2/f3 [0,4194304] 0
2026-03-09T00:03:34.612 INFO:tasks.workunit.client.1.vm06.stdout:2/310: mkdir d7/da/d55 0
2026-03-09T00:03:34.615 INFO:tasks.workunit.client.1.vm06.stdout:7/248: dread d0/df/d17/f1f [0,4194304] 0
2026-03-09T00:03:34.618 INFO:tasks.workunit.client.0.vm03.stdout:1/66: mknod d4/d6/c16 0
2026-03-09T00:03:34.619 INFO:tasks.workunit.client.0.vm03.stdout:9/38: dwrite f5 [0,4194304] 0
2026-03-09T00:03:34.625 INFO:tasks.workunit.client.0.vm03.stdout:0/35: creat d2/da/fc x:0 0 0
2026-03-09T00:03:34.628 INFO:tasks.workunit.client.1.vm06.stdout:8/243: creat db/dd/d48/f4a x:0 0 0
2026-03-09T00:03:34.628 INFO:tasks.workunit.client.1.vm06.stdout:0/259: truncate d3/d18/f14 781207 0
2026-03-09T00:03:34.628 INFO:tasks.workunit.client.1.vm06.stdout:8/244: fsync db/dd/f27 0
2026-03-09T00:03:34.628 INFO:tasks.workunit.client.1.vm06.stdout:8/245: write db/dd/f1c [230346,4984] 0
2026-03-09T00:03:34.631 INFO:tasks.workunit.client.0.vm03.stdout:7/24: mknod d2/c9 0
2026-03-09T00:03:34.631 INFO:tasks.workunit.client.0.vm03.stdout:1/67: creat d4/d15/f17 x:0 0 0
2026-03-09T00:03:34.632 INFO:tasks.workunit.client.0.vm03.stdout:8/56: link f3 d7/fd 0
2026-03-09T00:03:34.635 INFO:tasks.workunit.client.0.vm03.stdout:9/39: rename f7 to fb 0
2026-03-09T00:03:34.651 INFO:tasks.workunit.client.0.vm03.stdout:9/40: dread - fb zero size
2026-03-09T00:03:34.651 INFO:tasks.workunit.client.0.vm03.stdout:9/41: fsync f5 0
2026-03-09T00:03:34.651 INFO:tasks.workunit.client.0.vm03.stdout:9/42: write fb [95889,28093] 0
2026-03-09T00:03:34.652 INFO:tasks.workunit.client.1.vm06.stdout:2/311: rename d7/d4b to d7/d1a/d56 0
2026-03-09T00:03:34.652 INFO:tasks.workunit.client.1.vm06.stdout:8/246: getdents db/dd/d24/d36 0
2026-03-09T00:03:34.652 INFO:tasks.workunit.client.0.vm03.stdout:9/43: write fb [35444,95252] 0
2026-03-09T00:03:34.652 INFO:tasks.workunit.client.0.vm03.stdout:0/36: mkdir d2/da/dd 0
2026-03-09T00:03:34.652 INFO:tasks.workunit.client.0.vm03.stdout:3/29: mkdir d2/db 0
2026-03-09T00:03:34.652 INFO:tasks.workunit.client.0.vm03.stdout:1/68: link d4/fb d4/d15/f18 0
2026-03-09T00:03:34.652 INFO:tasks.workunit.client.0.vm03.stdout:0/37: creat d2/fe x:0 0 0
2026-03-09T00:03:34.652 INFO:tasks.workunit.client.0.vm03.stdout:1/69: mknod d4/d15/c19 0
2026-03-09T00:03:34.652 INFO:tasks.workunit.client.0.vm03.stdout:1/70: chown d4/d6/c16 9739 1
2026-03-09T00:03:34.652 INFO:tasks.workunit.client.0.vm03.stdout:8/57: mknod d7/ce 0
2026-03-09T00:03:34.654 INFO:tasks.workunit.client.1.vm06.stdout:5/327: dwrite d5/d1c/d23/f4f [4194304,4194304] 0
2026-03-09T00:03:34.654 INFO:tasks.workunit.client.1.vm06.stdout:5/328: read - d5/d1c/d21/d28/d35/f46 zero size
2026-03-09T00:03:34.655 INFO:tasks.workunit.client.1.vm06.stdout:2/312: dread d7/f8 [4194304,4194304] 0
2026-03-09T00:03:34.655 INFO:tasks.workunit.client.1.vm06.stdout:2/313: readlink d7/d1a/d3c/l42 0
2026-03-09T00:03:34.655 INFO:tasks.workunit.client.1.vm06.stdout:2/314: chown d7/d1a/d39 95940240 1
2026-03-09T00:03:34.655 INFO:tasks.workunit.client.1.vm06.stdout:2/315: write d7/da/db/de/f53 [460289,78315] 0
2026-03-09T00:03:34.658 INFO:tasks.workunit.client.0.vm03.stdout:0/38: creat d2/ff x:0 0 0
2026-03-09T00:03:34.663 INFO:tasks.workunit.client.1.vm06.stdout:8/247: link db/dd/d24/f33 db/d1e/d46/f4b 0
2026-03-09T00:03:34.663 INFO:tasks.workunit.client.0.vm03.stdout:3/30: link d2/f6 d2/fc 0
2026-03-09T00:03:34.663 INFO:tasks.workunit.client.0.vm03.stdout:3/31: truncate d2/f8 1015241 0
2026-03-09T00:03:34.663 INFO:tasks.workunit.client.0.vm03.stdout:3/32: fsync f1 0
2026-03-09T00:03:34.666 INFO:tasks.workunit.client.1.vm06.stdout:2/316: rename d7/da/d1c/d47 to d7/da/d4e/d57 0
2026-03-09T00:03:34.669 INFO:tasks.workunit.client.1.vm06.stdout:5/329: dread d5/d44/f4a [4194304,4194304] 0
2026-03-09T00:03:34.669 INFO:tasks.workunit.client.1.vm06.stdout:5/330: truncate d5/d1c/d23/f4c 2959 0
2026-03-09T00:03:34.669 INFO:tasks.workunit.client.1.vm06.stdout:5/331: chown d5/d1c/d21/d28/f56 58166 1
2026-03-09T00:03:34.673 INFO:tasks.workunit.client.0.vm03.stdout:1/71: mkdir d4/d15/d1a 0
2026-03-09T00:03:34.673 INFO:tasks.workunit.client.1.vm06.stdout:0/260: dread d3/f11 [0,4194304] 0
2026-03-09T00:03:34.673 INFO:tasks.workunit.client.0.vm03.stdout:1/72: read f2 [5372179,111461] 0
2026-03-09T00:03:34.683 INFO:tasks.workunit.client.1.vm06.stdout:8/248: chown db/dd/l15 4263227 1
2026-03-09T00:03:34.690 INFO:tasks.workunit.client.0.vm03.stdout:7/25: dread d2/f3 [0,4194304] 0
2026-03-09T00:03:34.690 INFO:tasks.workunit.client.0.vm03.stdout:8/58: mkdir d7/df 0
2026-03-09T00:03:34.690 INFO:tasks.workunit.client.1.vm06.stdout:2/317: mknod d7/da/db/c58 0
2026-03-09T00:03:34.690 INFO:tasks.workunit.client.1.vm06.stdout:5/332: rename d5/d1c/d21/d2a to d5/d1c/d68 0
2026-03-09T00:03:34.690 INFO:tasks.workunit.client.1.vm06.stdout:5/333: write d5/d1c/d68/f31 [955564,18404] 0
2026-03-09T00:03:34.692 INFO:tasks.workunit.client.0.vm03.stdout:8/59: dread d7/fd [0,4194304] 0
2026-03-09T00:03:34.693 INFO:tasks.workunit.client.0.vm03.stdout:0/39: link d2/c8 d2/da/c10 0
2026-03-09T00:03:34.696 INFO:tasks.workunit.client.1.vm06.stdout:6/222: dwrite d4/f12 [0,4194304] 0
2026-03-09T00:03:34.696 INFO:tasks.workunit.client.1.vm06.stdout:6/223: write d4/d27/f2e [1874034,60113] 0
2026-03-09T00:03:34.697 INFO:tasks.workunit.client.1.vm06.stdout:0/261: symlink d3/d18/d28/d45/l5c 0
2026-03-09T00:03:34.702 INFO:tasks.workunit.client.0.vm03.stdout:6/25: write f3 [1347948,60369] 0
2026-03-09T00:03:34.702 INFO:tasks.workunit.client.0.vm03.stdout:6/26: chown c6 12 1
2026-03-09T00:03:34.706 INFO:tasks.workunit.client.0.vm03.stdout:2/36: fsync f2 0
2026-03-09T00:03:34.711 INFO:tasks.workunit.client.0.vm03.stdout:5/36: getdents . 0
2026-03-09T00:03:34.711 INFO:tasks.workunit.client.0.vm03.stdout:7/26: unlink d2/l7 0
2026-03-09T00:03:34.711 INFO:tasks.workunit.client.1.vm06.stdout:1/235: sync
2026-03-09T00:03:34.711 INFO:tasks.workunit.client.1.vm06.stdout:3/223: sync
2026-03-09T00:03:34.712 INFO:tasks.workunit.client.0.vm03.stdout:0/40: chown d2/f7 28625375 1
2026-03-09T00:03:34.712 INFO:tasks.workunit.client.0.vm03.stdout:0/41: readlink d2/l6 0
2026-03-09T00:03:34.714 INFO:tasks.workunit.client.1.vm06.stdout:2/318: symlink d7/d1a/d25/l59 0
2026-03-09T00:03:34.716 INFO:tasks.workunit.client.1.vm06.stdout:4/207: dwrite d17/f1f [0,4194304] 0
2026-03-09T00:03:34.716 INFO:tasks.workunit.client.1.vm06.stdout:4/208: dread - d17/d24/f31 zero size
2026-03-09T00:03:34.716 INFO:tasks.workunit.client.1.vm06.stdout:4/209: truncate d17/f1d 647591 0
2026-03-09T00:03:34.716 INFO:tasks.workunit.client.1.vm06.stdout:4/210: write f1 [8674951,98565] 0
2026-03-09T00:03:34.716 INFO:tasks.workunit.client.0.vm03.stdout:5/37: symlink lc 0
2026-03-09T00:03:34.716 INFO:tasks.workunit.client.1.vm06.stdout:7/249: dwrite d0/f14 [0,4194304] 0
2026-03-09T00:03:34.719 INFO:tasks.workunit.client.0.vm03.stdout:4/43: dwrite f4 [4194304,4194304] 0
2026-03-09T00:03:34.723 INFO:tasks.workunit.client.1.vm06.stdout:4/211: write d17/f35 [284742,21654] 0
2026-03-09T00:03:34.723 INFO:tasks.workunit.client.1.vm06.stdout:4/212: chown d17/d24/f2c 346 1
2026-03-09T00:03:34.727 INFO:tasks.workunit.client.0.vm03.stdout:6/27: dread f3 [0,4194304] 0
2026-03-09T00:03:34.727 INFO:tasks.workunit.client.0.vm03.stdout:6/28: write f3 [2057875,73938] 0
2026-03-09T00:03:34.727 INFO:tasks.workunit.client.1.vm06.stdout:3/224: dread d11/f1a [0,4194304] 0
2026-03-09T00:03:34.728 INFO:tasks.workunit.client.0.vm03.stdout:3/33: dwrite f0 [0,4194304] 0
2026-03-09T00:03:34.733 INFO:tasks.workunit.client.0.vm03.stdout:7/27: mkdir d2/d4/da 0
2026-03-09T00:03:34.735 INFO:tasks.workunit.client.0.vm03.stdout:4/44: read f4 [8025145,1551] 0
2026-03-09T00:03:34.746 INFO:tasks.workunit.client.0.vm03.stdout:7/28: read d2/f3 [3798832,35399] 0
2026-03-09T00:03:34.746 INFO:tasks.workunit.client.0.vm03.stdout:7/29: chown d2/c8 285 1
2026-03-09T00:03:34.749 INFO:tasks.workunit.client.1.vm06.stdout:4/213: dread d17/f35 [0,4194304] 0
2026-03-09T00:03:34.776 INFO:tasks.workunit.client.1.vm06.stdout:4/214: write d17/f35 [5117995,41296] 0
2026-03-09T00:03:34.776 INFO:tasks.workunit.client.1.vm06.stdout:1/236: fsync d6/f34 0
2026-03-09T00:03:34.776 INFO:tasks.workunit.client.1.vm06.stdout:8/249: rename db/c21 to db/dd/d24/d36/d38/c4c 0
2026-03-09T00:03:34.776 INFO:tasks.workunit.client.1.vm06.stdout:7/250: mknod d0/df/d1a/d22/c3d 0
2026-03-09T00:03:34.776 INFO:tasks.workunit.client.1.vm06.stdout:7/251: chown d0/d39 1768 1
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.1.vm06.stdout:6/224: rmdir d4 39
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.1.vm06.stdout:6/225: write d4/d16/f21 [693961,78052] 0
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.1.vm06.stdout:1/237: mkdir d6/d21/d2d/d3b/d42/d43/d4d 0
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.1.vm06.stdout:5/334: rename d5/d1c/d21/f32 to d5/d1c/d21/d28/d5e/f69 0
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.0.vm03.stdout:9/44: getdents . 0
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.0.vm03.stdout:0/42: creat d2/da/dd/f11 x:0 0 0
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.0.vm03.stdout:2/37: getdents d8 0
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.0.vm03.stdout:5/38: creat fd x:0 0 0
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.0.vm03.stdout:3/34: creat d2/fd x:0 0 0
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.0.vm03.stdout:3/35: readlink d2/la 0
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.0.vm03.stdout:4/45: unlink d7/c9 0
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.0.vm03.stdout:4/46: write f3 [5057460,98664] 0
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.0.vm03.stdout:4/47: dread - d7/f8 zero size
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.0.vm03.stdout:4/48: readlink - no filename
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.0.vm03.stdout:4/49: creat d7/fb x:0 0 0
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.0.vm03.stdout:4/50: write f3 [4610210,122445] 0
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.0.vm03.stdout:7/30: unlink d2/l6 0
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.0.vm03.stdout:9/45: creat fc x:0 0 0
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.1.vm06.stdout:8/250: getdents db/d1e/d46 0
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.0.vm03.stdout:0/43: mknod d2/da/dd/c12 0
2026-03-09T00:03:34.777 INFO:tasks.workunit.client.0.vm03.stdout:0/44: truncate d2/da/dd/f11 981190 0
2026-03-09T00:03:34.778 INFO:tasks.workunit.client.1.vm06.stdout:6/226: mkdir d4/d27/d3e 0
2026-03-09T00:03:34.778 INFO:tasks.workunit.client.0.vm03.stdout:2/38: mknod d8/ca 0
2026-03-09T00:03:34.779 INFO:tasks.workunit.client.1.vm06.stdout:4/215: dread d17/f1f [0,4194304] 0
2026-03-09T00:03:34.780 INFO:tasks.workunit.client.0.vm03.stdout:0/45: dread f0 [0,4194304] 0
2026-03-09T00:03:34.780 INFO:tasks.workunit.client.1.vm06.stdout:3/225: symlink d11/d28/l43 0
2026-03-09T00:03:34.780 INFO:tasks.workunit.client.1.vm06.stdout:5/335: symlink d5/d1c/d21/d28/d5e/l6a 0
2026-03-09T00:03:34.784 INFO:tasks.workunit.client.1.vm06.stdout:6/227: link d4/d16/c17 d4/d27/c3f 0
2026-03-09T00:03:34.786 INFO:tasks.workunit.client.1.vm06.stdout:3/226: creat d11/d28/d2e/d2f/d36/f44 x:0 0 0
2026-03-09T00:03:34.788 INFO:tasks.workunit.client.1.vm06.stdout:6/228: truncate d4/d16/f1c 4149514 0
2026-03-09T00:03:34.793 INFO:tasks.workunit.client.0.vm03.stdout:9/46: creat fd x:0 0 0
2026-03-09T00:03:34.793 INFO:tasks.workunit.client.0.vm03.stdout:2/39: creat d8/fb x:0 0 0
2026-03-09T00:03:34.793 INFO:tasks.workunit.client.1.vm06.stdout:3/227: symlink d11/d3f/l45 0
2026-03-09T00:03:34.793 INFO:tasks.workunit.client.1.vm06.stdout:3/228: write d11/f24 [1028565,5049] 0
2026-03-09T00:03:34.796 INFO:tasks.workunit.client.0.vm03.stdout:6/29: dread f2 [0,4194304] 0
2026-03-09T00:03:34.797 INFO:tasks.workunit.client.0.vm03.stdout:5/39: rename fd to fe 0
2026-03-09T00:03:34.797 INFO:tasks.workunit.client.0.vm03.stdout:5/40: truncate fe 604981 0
2026-03-09T00:03:34.797 INFO:tasks.workunit.client.0.vm03.stdout:5/41: write fe [1169964,126038] 0
2026-03-09T00:03:34.797 INFO:tasks.workunit.client.0.vm03.stdout:6/30: dread f3 [0,4194304] 0
2026-03-09T00:03:34.798 INFO:tasks.workunit.client.0.vm03.stdout:3/36: link d2/c3 d2/db/ce 0
2026-03-09T00:03:34.798 INFO:tasks.workunit.client.1.vm06.stdout:2/319: dwrite f2 [0,4194304] 0
2026-03-09T00:03:34.798 INFO:tasks.workunit.client.1.vm06.stdout:2/320: write d7/f4c [226182,36309] 0
2026-03-09T00:03:34.798 INFO:tasks.workunit.client.0.vm03.stdout:3/37: dread d2/f8 [0,4194304] 0
2026-03-09T00:03:34.799 INFO:tasks.workunit.client.0.vm03.stdout:0/46: getdents d2 0
2026-03-09T00:03:34.799 INFO:tasks.workunit.client.0.vm03.stdout:0/47: chown d2/l3 434 1
2026-03-09T00:03:34.799 INFO:tasks.workunit.client.0.vm03.stdout:0/48: read f0 [698263,111536] 0
2026-03-09T00:03:34.799 INFO:tasks.workunit.client.0.vm03.stdout:0/49: truncate d2/ff 511062 0
2026-03-09T00:03:34.801 INFO:tasks.workunit.client.0.vm03.stdout:5/42: creat ff x:0 0 0
2026-03-09T00:03:34.803 INFO:tasks.workunit.client.0.vm03.stdout:6/31: rename f3 to f7 0
2026-03-09T00:03:34.804 INFO:tasks.workunit.client.0.vm03.stdout:6/32: dread f2 [0,4194304] 0
2026-03-09T00:03:34.805 INFO:tasks.workunit.client.0.vm03.stdout:3/38: unlink d2/la 0
2026-03-09T00:03:34.807 INFO:tasks.workunit.client.0.vm03.stdout:0/50: rename d2/l3 to d2/da/l13 0
2026-03-09T00:03:34.807 INFO:tasks.workunit.client.0.vm03.stdout:0/51: stat d2 0
2026-03-09T00:03:34.807 INFO:tasks.workunit.client.0.vm03.stdout:5/43: chown f3 50612 1
2026-03-09T00:03:34.808 INFO:tasks.workunit.client.0.vm03.stdout:9/47: link fb fe 0
2026-03-09T00:03:34.809 INFO:tasks.workunit.client.1.vm06.stdout:2/321: write d7/d1b/f22 [389299,116759] 0
2026-03-09T00:03:34.810 INFO:tasks.workunit.client.0.vm03.stdout:5/44: link lc l10 0
2026-03-09T00:03:34.813 INFO:tasks.workunit.client.0.vm03.stdout:9/48: truncate f5 1543944 0
2026-03-09T00:03:34.814 INFO:tasks.workunit.client.0.vm03.stdout:9/49: write fd [452677,58982] 0
2026-03-09T00:03:34.814 INFO:tasks.workunit.client.0.vm03.stdout:5/45: creat f11 x:0 0 0
2026-03-09T00:03:34.814 INFO:tasks.workunit.client.0.vm03.stdout:9/50: mknod cf 0
2026-03-09T00:03:34.814 INFO:tasks.workunit.client.0.vm03.stdout:9/51: truncate fc 928714 0
2026-03-09T00:03:34.814 INFO:tasks.workunit.client.0.vm03.stdout:9/52: chown cf 248 1
2026-03-09T00:03:34.814 INFO:tasks.workunit.client.0.vm03.stdout:9/53: chown f5 3 1
2026-03-09T00:03:34.816 INFO:tasks.workunit.client.1.vm06.stdout:6/229: fdatasync d4/d16/f21 0
2026-03-09T00:03:34.818 INFO:tasks.workunit.client.0.vm03.stdout:9/54: creat f10 x:0 0 0
2026-03-09T00:03:34.819 INFO:tasks.workunit.client.0.vm03.stdout:0/52: write d2/f9 [1380727,9022] 0
2026-03-09T00:03:34.821 INFO:tasks.workunit.client.0.vm03.stdout:5/46: rename f8 to f12 0
2026-03-09T00:03:34.821 INFO:tasks.workunit.client.0.vm03.stdout:5/47: creat f13 x:0 0 0
2026-03-09T00:03:34.822 INFO:tasks.workunit.client.0.vm03.stdout:9/55: dread fe [0,4194304] 0
2026-03-09T00:03:34.856 INFO:tasks.workunit.client.0.vm03.stdout:9/56: rename fe to f11 0
2026-03-09T00:03:34.856 INFO:tasks.workunit.client.1.vm06.stdout:2/322: write f2 [3831388,61556] 0
2026-03-09T00:03:34.856 INFO:tasks.workunit.client.1.vm06.stdout:6/230: read d4/f36 [2488163,113008] 0
2026-03-09T00:03:34.856 INFO:tasks.workunit.client.1.vm06.stdout:2/323: rmdir d7/d1a/d3c 39
2026-03-09T00:03:34.856 INFO:tasks.workunit.client.1.vm06.stdout:6/231: dread d4/f2a [0,4194304] 0
2026-03-09T00:03:34.856 INFO:tasks.workunit.client.1.vm06.stdout:6/232: rename d4/f23 to d4/f40 0
2026-03-09T00:03:34.856 INFO:tasks.workunit.client.1.vm06.stdout:6/233: fdatasync d4/f26 0
2026-03-09T00:03:34.856 INFO:tasks.workunit.client.1.vm06.stdout:6/234: link d4/f5 d4/d27/d3e/f41 0
2026-03-09T00:03:34.856 INFO:tasks.workunit.client.1.vm06.stdout:6/235: dread d4/d16/f34 [0,4194304] 0
2026-03-09T00:03:34.856 INFO:tasks.workunit.client.1.vm06.stdout:6/236: rename d4/d27/d35 to d4/d27/d42 0
2026-03-09T00:03:34.856 INFO:tasks.workunit.client.0.vm03.stdout:8/60: dwrite f6 [4194304,4194304] 0
2026-03-09T00:03:34.856 INFO:tasks.workunit.client.0.vm03.stdout:8/61: chown d7/ce 21612169 1
2026-03-09T00:03:34.876 INFO:tasks.workunit.client.0.vm03.stdout:6/33: dwrite f2 [0,4194304] 0
2026-03-09T00:03:34.889 INFO:tasks.workunit.client.1.vm06.stdout:3/229: dwrite d11/d28/d2e/f37 [0,4194304] 0
2026-03-09T00:03:34.900 INFO:tasks.workunit.client.1.vm06.stdout:9/210: sync
2026-03-09T00:03:34.901 INFO:tasks.workunit.client.1.vm06.stdout:5/336: truncate d5/d1c/d68/f31 699830 0
2026-03-09T00:03:34.902 INFO:tasks.workunit.client.1.vm06.stdout:3/230: mknod d11/d28/d2e/c46 0
2026-03-09T00:03:34.903 INFO:tasks.workunit.client.1.vm06.stdout:9/211: rmdir d1/d3/d2b 39
2026-03-09T00:03:34.905 INFO:tasks.workunit.client.1.vm06.stdout:9/212: rename d1/d3/d12/d21/f3e to d1/f45 0
2026-03-09T00:03:34.917 INFO:tasks.workunit.client.1.vm06.stdout:9/213: dread d1/d3/f1f [0,4194304] 0
2026-03-09T00:03:34.951 INFO:tasks.workunit.client.1.vm06.stdout:6/237: write d4/f40 [561162,118771] 0
2026-03-09T00:03:34.953 INFO:tasks.workunit.client.1.vm06.stdout:6/238: symlink d4/l43 0
2026-03-09T00:03:34.954 INFO:tasks.workunit.client.0.vm03.stdout:3/39: dwrite d2/f9 [0,4194304] 0
2026-03-09T00:03:34.960 INFO:tasks.workunit.client.0.vm03.stdout:2/40: rmdir d8 39
2026-03-09T00:03:34.965 INFO:tasks.workunit.client.0.vm03.stdout:3/40: stat d2/c3 0
2026-03-09T00:03:34.966 INFO:tasks.workunit.client.0.vm03.stdout:2/41: mknod d8/cc 0
2026-03-09T00:03:34.966 INFO:tasks.workunit.client.0.vm03.stdout:2/42: rmdir d8 39
2026-03-09T00:03:34.967 INFO:tasks.workunit.client.0.vm03.stdout:3/41: dread f1 [0,4194304] 0
2026-03-09T00:03:34.971 INFO:tasks.workunit.client.0.vm03.stdout:7/31: dread d2/f3 [0,4194304] 0
2026-03-09T00:03:34.971 INFO:tasks.workunit.client.0.vm03.stdout:2/43: creat d8/fd x:0 0 0
2026-03-09T00:03:34.971 INFO:tasks.workunit.client.0.vm03.stdout:2/44: stat d8/fb 0
2026-03-09T00:03:34.981 INFO:tasks.workunit.client.0.vm03.stdout:7/32: chown d2/c8 112 1
2026-03-09T00:03:34.982 INFO:tasks.workunit.client.0.vm03.stdout:7/33: getdents d2 0
2026-03-09T00:03:34.985 INFO:tasks.workunit.client.0.vm03.stdout:7/34: read d2/f3 [515778,60577] 0
2026-03-09T00:03:34.991 INFO:tasks.workunit.client.1.vm06.stdout:2/324: dwrite f6 [0,4194304] 0
2026-03-09T00:03:34.995 INFO:tasks.workunit.client.1.vm06.stdout:2/325: stat d7/da/db/c13 0
2026-03-09T00:03:34.998 INFO:tasks.workunit.client.0.vm03.stdout:6/34: getdents . 0
2026-03-09T00:03:34.998 INFO:tasks.workunit.client.0.vm03.stdout:6/35: write f2 [4436425,98894] 0
2026-03-09T00:03:34.998 INFO:tasks.workunit.client.0.vm03.stdout:6/36: creat f8 x:0 0 0
2026-03-09T00:03:34.998 INFO:tasks.workunit.client.0.vm03.stdout:6/37: readlink - no filename
2026-03-09T00:03:35.003 INFO:tasks.workunit.client.0.vm03.stdout:6/38: write f2 [2511068,113852] 0
2026-03-09T00:03:35.016 INFO:tasks.workunit.client.0.vm03.stdout:6/39: truncate f8 705216 0
2026-03-09T00:03:35.016 INFO:tasks.workunit.client.0.vm03.stdout:6/40: chown f8 1225 1
2026-03-09T00:03:35.016 INFO:tasks.workunit.client.0.vm03.stdout:6/41: creat f9 x:0 0 0
2026-03-09T00:03:35.045 INFO:tasks.workunit.client.0.vm03.stdout:4/51: dwrite f3 [0,4194304] 0
2026-03-09T00:03:35.045 INFO:tasks.workunit.client.0.vm03.stdout:4/52: creat d7/fc x:0 0 0
2026-03-09T00:03:35.047 INFO:tasks.workunit.client.0.vm03.stdout:8/62: dwrite f3 [0,4194304] 0
2026-03-09T00:03:35.047 INFO:tasks.workunit.client.1.vm06.stdout:9/214: dwrite d1/d4/fe [0,4194304] 0
2026-03-09T00:03:35.048 INFO:tasks.workunit.client.0.vm03.stdout:4/53: rename f3 to d7/fd 0
2026-03-09T00:03:35.071 INFO:tasks.workunit.client.1.vm06.stdout:4/216: dwrite f1 [4194304,4194304] 0
2026-03-09T00:03:35.071 INFO:tasks.workunit.client.1.vm06.stdout:4/217: write d17/f1d [1450311,94200] 0
2026-03-09T00:03:35.072 INFO:tasks.workunit.client.1.vm06.stdout:4/218: creat d17/d21/d32/f3d x:0 0 0
2026-03-09T00:03:35.073 INFO:tasks.workunit.client.1.vm06.stdout:4/219: mknod d17/d21/d22/c3e 0
2026-03-09T00:03:35.073 INFO:tasks.workunit.client.1.vm06.stdout:4/220: dread - d17/d21/f38 zero size
2026-03-09T00:03:35.073 INFO:tasks.workunit.client.1.vm06.stdout:4/221: creat d17/d24/d3b/f3f x:0 0 0
2026-03-09T00:03:35.078 INFO:tasks.workunit.client.1.vm06.stdout:4/222: dread fe [0,4194304] 0
2026-03-09T00:03:35.078 INFO:tasks.workunit.client.1.vm06.stdout:4/223: dread - d17/d21/f38 zero size
2026-03-09T00:03:35.079 INFO:tasks.workunit.client.1.vm06.stdout:4/224: mknod d17/d24/d3b/c40 0
2026-03-09T00:03:35.080 INFO:tasks.workunit.client.1.vm06.stdout:4/225: mknod d17/d24/d3b/c41 0
2026-03-09T00:03:35.080 INFO:tasks.workunit.client.1.vm06.stdout:4/226: write f14 [42669,98921] 0
2026-03-09T00:03:35.082 INFO:tasks.workunit.client.1.vm06.stdout:4/227: mknod d17/d21/d32/c42 0
2026-03-09T00:03:35.082 INFO:tasks.workunit.client.1.vm06.stdout:4/228: dread - d17/d24/f31 zero size
2026-03-09T00:03:35.096 INFO:tasks.workunit.client.0.vm03.stdout:0/53: dwrite d2/f9 [0,4194304] 0
2026-03-09T00:03:35.104 INFO:tasks.workunit.client.0.vm03.stdout:0/54: unlink d2/l6 0
2026-03-09T00:03:35.122 INFO:tasks.workunit.client.0.vm03.stdout:3/42: dwrite d2/f9 [0,4194304] 0
2026-03-09T00:03:35.123 INFO:tasks.workunit.client.0.vm03.stdout:0/55: creat d2/da/dd/f14 x:0 0 0
2026-03-09T00:03:35.123 INFO:tasks.workunit.client.0.vm03.stdout:3/43: mknod d2/cf 0
2026-03-09T00:03:35.123 INFO:tasks.workunit.client.0.vm03.stdout:0/56: symlink d2/l15 0
2026-03-09T00:03:35.123 INFO:tasks.workunit.client.0.vm03.stdout:0/57: fdatasync d2/f9 0
2026-03-09T00:03:35.130 INFO:tasks.workunit.client.1.vm06.stdout:5/337: dwrite d5/f36 [0,4194304] 0
2026-03-09T00:03:35.130 INFO:tasks.workunit.client.0.vm03.stdout:3/44: dread f0 [0,4194304] 0
2026-03-09T00:03:35.138 INFO:tasks.workunit.client.0.vm03.stdout:3/45: creat d2/db/f10 x:0 0 0
2026-03-09T00:03:35.140 INFO:tasks.workunit.client.0.vm03.stdout:3/46: truncate d2/f6 57302 0
2026-03-09T00:03:35.140 INFO:tasks.workunit.client.1.vm06.stdout:5/338: creat d5/f6b x:0 0 0
2026-03-09T00:03:35.140 INFO:tasks.workunit.client.1.vm06.stdout:5/339: readlink d5/l24 0 2026-03-09T00:03:35.140 INFO:tasks.workunit.client.1.vm06.stdout:5/340: creat d5/d44/d4b/f6c x:0 0 0 2026-03-09T00:03:35.146 INFO:tasks.workunit.client.1.vm06.stdout:5/341: dread d5/d1c/f22 [0,4194304] 0 2026-03-09T00:03:35.146 INFO:tasks.workunit.client.1.vm06.stdout:5/342: creat d5/d44/d4b/f6d x:0 0 0 2026-03-09T00:03:35.146 INFO:tasks.workunit.client.1.vm06.stdout:5/343: fsync d5/d1c/d23/d34/d47/f61 0 2026-03-09T00:03:35.146 INFO:tasks.workunit.client.1.vm06.stdout:5/344: readlink d5/d1c/d23/l55 0 2026-03-09T00:03:35.159 INFO:tasks.workunit.client.0.vm03.stdout:2/45: dwrite d8/fd [0,4194304] 0 2026-03-09T00:03:35.159 INFO:tasks.workunit.client.0.vm03.stdout:2/46: write f2 [228998,35092] 0 2026-03-09T00:03:35.164 INFO:tasks.workunit.client.1.vm06.stdout:5/345: dread d5/d1c/d23/f5b [0,4194304] 0 2026-03-09T00:03:35.165 INFO:tasks.workunit.client.1.vm06.stdout:4/229: dwrite fe [0,4194304] 0 2026-03-09T00:03:35.166 INFO:tasks.workunit.client.1.vm06.stdout:4/230: symlink d17/d21/d22/l43 0 2026-03-09T00:03:35.166 INFO:tasks.workunit.client.1.vm06.stdout:4/231: symlink d17/d24/l44 0 2026-03-09T00:03:35.166 INFO:tasks.workunit.client.1.vm06.stdout:4/232: symlink d17/d24/d3b/l45 0 2026-03-09T00:03:35.170 INFO:tasks.workunit.client.0.vm03.stdout:2/47: dread d8/fd [0,4194304] 0 2026-03-09T00:03:35.170 INFO:tasks.workunit.client.0.vm03.stdout:2/48: write f6 [1233501,118470] 0 2026-03-09T00:03:35.171 INFO:tasks.workunit.client.0.vm03.stdout:2/49: symlink d8/le 0 2026-03-09T00:03:35.174 INFO:tasks.workunit.client.1.vm06.stdout:4/233: write d17/f1e [1089349,77864] 0 2026-03-09T00:03:35.178 INFO:tasks.workunit.client.0.vm03.stdout:6/42: dwrite f8 [0,4194304] 0 2026-03-09T00:03:35.178 INFO:tasks.workunit.client.0.vm03.stdout:6/43: readlink - no filename 2026-03-09T00:03:35.178 INFO:tasks.workunit.client.0.vm03.stdout:6/44: dread - f9 zero size 2026-03-09T00:03:35.179 INFO:tasks.workunit.client.0.vm03.stdout:6/45: mknod ca 0 2026-03-09T00:03:35.179 INFO:tasks.workunit.client.0.vm03.stdout:6/46: rmdir - no directory 2026-03-09T00:03:35.182 INFO:tasks.workunit.client.0.vm03.stdout:6/47: link f8 fb 0 2026-03-09T00:03:35.182 INFO:tasks.workunit.client.0.vm03.stdout:6/48: creat fc x:0 0 0 2026-03-09T00:03:35.182 INFO:tasks.workunit.client.0.vm03.stdout:6/49: rmdir - no directory 2026-03-09T00:03:35.182 INFO:tasks.workunit.client.0.vm03.stdout:8/63: dwrite d7/f9 [0,4194304] 0 2026-03-09T00:03:35.201 INFO:tasks.workunit.client.0.vm03.stdout:8/64: write d7/f9 [1908684,48475] 0 2026-03-09T00:03:35.202 INFO:tasks.workunit.client.0.vm03.stdout:8/65: creat d7/f10 x:0 0 0 2026-03-09T00:03:35.202 INFO:tasks.workunit.client.0.vm03.stdout:8/66: creat d7/f11 x:0 0 0 2026-03-09T00:03:35.203 INFO:tasks.workunit.client.0.vm03.stdout:8/67: symlink d7/l12 0 2026-03-09T00:03:35.217 INFO:tasks.workunit.client.1.vm06.stdout:6/239: rmdir d4/d27/d42 39 2026-03-09T00:03:35.218 INFO:tasks.workunit.client.1.vm06.stdout:6/240: creat d4/d27/d3e/f44 x:0 0 0 2026-03-09T00:03:35.258 INFO:tasks.workunit.client.0.vm03.stdout:2/50: dwrite d8/fb [0,4194304] 0 2026-03-09T00:03:35.259 INFO:tasks.workunit.client.0.vm03.stdout:2/51: mknod d8/cf 0 2026-03-09T00:03:35.259 INFO:tasks.workunit.client.0.vm03.stdout:2/52: fsync d8/fb 0 2026-03-09T00:03:35.259 INFO:tasks.workunit.client.1.vm06.stdout:2/326: dwrite d7/d1b/f22 [0,4194304] 0 2026-03-09T00:03:35.259 INFO:tasks.workunit.client.1.vm06.stdout:2/327: fsync f6 0 2026-03-09T00:03:35.265 
INFO:tasks.workunit.client.1.vm06.stdout:3/231: rmdir d11 39 2026-03-09T00:03:35.266 INFO:tasks.workunit.client.1.vm06.stdout:3/232: link d11/d28/d2e/f38 d11/d28/d2e/f47 0 2026-03-09T00:03:35.266 INFO:tasks.workunit.client.1.vm06.stdout:0/262: sync 2026-03-09T00:03:35.267 INFO:tasks.workunit.client.0.vm03.stdout:1/73: sync 2026-03-09T00:03:35.269 INFO:tasks.workunit.client.0.vm03.stdout:2/53: dread d8/fb [0,4194304] 0 2026-03-09T00:03:35.270 INFO:tasks.workunit.client.1.vm06.stdout:0/263: rename d3/d18/d28/f56 to d3/d18/d2c/d2d/d31/f5d 0 2026-03-09T00:03:35.270 INFO:tasks.workunit.client.1.vm06.stdout:3/233: dread d11/f27 [0,4194304] 0 2026-03-09T00:03:35.271 INFO:tasks.workunit.client.0.vm03.stdout:2/54: link f4 d8/f10 0 2026-03-09T00:03:35.271 INFO:tasks.workunit.client.1.vm06.stdout:2/328: write d7/da/d1c/f1f [4618995,89445] 0 2026-03-09T00:03:35.271 INFO:tasks.workunit.client.1.vm06.stdout:3/234: creat d11/f48 x:0 0 0 2026-03-09T00:03:35.271 INFO:tasks.workunit.client.1.vm06.stdout:3/235: write f10 [5252052,56620] 0 2026-03-09T00:03:35.273 INFO:tasks.workunit.client.1.vm06.stdout:2/329: unlink d7/da/db/c4a 0 2026-03-09T00:03:35.275 INFO:tasks.workunit.client.0.vm03.stdout:2/55: rename f4 to d8/f11 0 2026-03-09T00:03:35.276 INFO:tasks.workunit.client.1.vm06.stdout:0/264: write d3/d18/d1f/d44/f5a [1935459,82795] 0 2026-03-09T00:03:35.276 INFO:tasks.workunit.client.1.vm06.stdout:3/236: rename d11/f16 to d11/d28/d2e/d2f/f49 0 2026-03-09T00:03:35.276 INFO:tasks.workunit.client.1.vm06.stdout:3/237: truncate d11/d28/f3a 709235 0 2026-03-09T00:03:35.279 INFO:tasks.workunit.client.0.vm03.stdout:2/56: creat d8/f12 x:0 0 0 2026-03-09T00:03:35.280 INFO:tasks.workunit.client.1.vm06.stdout:0/265: creat d3/d18/d1f/f5e x:0 0 0 2026-03-09T00:03:35.280 INFO:tasks.workunit.client.1.vm06.stdout:0/266: dread - d3/f1e zero size 2026-03-09T00:03:35.285 INFO:tasks.workunit.client.0.vm03.stdout:2/57: creat d8/f13 x:0 0 0 2026-03-09T00:03:35.293 INFO:tasks.workunit.client.0.vm03.stdout:2/58: readlink l1 0 2026-03-09T00:03:35.293 INFO:tasks.workunit.client.0.vm03.stdout:2/59: symlink d8/l14 0 2026-03-09T00:03:35.294 INFO:tasks.workunit.client.1.vm06.stdout:3/238: rename f10 to d11/d28/d2e/d2f/d36/f4a 0 2026-03-09T00:03:35.294 INFO:tasks.workunit.client.1.vm06.stdout:2/330: getdents d7/da/d1c 0 2026-03-09T00:03:35.294 INFO:tasks.workunit.client.1.vm06.stdout:2/331: chown d7/d1b/f3b 0 1 2026-03-09T00:03:35.294 INFO:tasks.workunit.client.1.vm06.stdout:0/267: truncate d3/f10 3239019 0 2026-03-09T00:03:35.294 INFO:tasks.workunit.client.1.vm06.stdout:0/268: write d3/f1c [686252,59486] 0 2026-03-09T00:03:35.294 INFO:tasks.workunit.client.1.vm06.stdout:0/269: chown d3/d18/d1f/f34 0 1 2026-03-09T00:03:35.294 INFO:tasks.workunit.client.1.vm06.stdout:3/239: mknod d11/d3f/c4b 0 2026-03-09T00:03:35.294 INFO:tasks.workunit.client.1.vm06.stdout:1/238: sync 2026-03-09T00:03:35.294 INFO:tasks.workunit.client.1.vm06.stdout:8/251: sync 2026-03-09T00:03:35.294 INFO:tasks.workunit.client.1.vm06.stdout:7/252: sync 2026-03-09T00:03:35.299 INFO:tasks.workunit.client.0.vm03.stdout:5/48: sync 2026-03-09T00:03:35.299 INFO:tasks.workunit.client.0.vm03.stdout:3/47: fsync d2/db/f10 0 2026-03-09T00:03:35.300 INFO:tasks.workunit.client.1.vm06.stdout:1/239: creat d6/d21/d2d/d3b/d42/f4e x:0 0 0 2026-03-09T00:03:35.300 INFO:tasks.workunit.client.1.vm06.stdout:1/240: fdatasync d6/d21/d2d/d3b/d42/d43/f45 0 2026-03-09T00:03:35.300 INFO:tasks.workunit.client.0.vm03.stdout:0/58: rmdir d2 39 2026-03-09T00:03:35.300 
INFO:tasks.workunit.client.0.vm03.stdout:0/59: write d2/da/dd/f14 [936668,129042] 0 2026-03-09T00:03:35.302 INFO:tasks.workunit.client.1.vm06.stdout:7/253: truncate d0/df/f13 171093 0 2026-03-09T00:03:35.304 INFO:tasks.workunit.client.1.vm06.stdout:5/346: getdents d5/d44/d4b 0 2026-03-09T00:03:35.316 INFO:tasks.workunit.client.0.vm03.stdout:5/49: rename f13 to f14 0 2026-03-09T00:03:35.317 INFO:tasks.workunit.client.1.vm06.stdout:1/241: symlink d6/d4c/l4f 0 2026-03-09T00:03:35.317 INFO:tasks.workunit.client.1.vm06.stdout:1/242: fdatasync d6/d21/d2d/d3b/d42/f4e 0 2026-03-09T00:03:35.318 INFO:tasks.workunit.client.1.vm06.stdout:5/347: symlink d5/d1c/d21/d28/d5e/l6e 0 2026-03-09T00:03:35.318 INFO:tasks.workunit.client.1.vm06.stdout:5/348: fdatasync d5/d1c/d23/f4f 0 2026-03-09T00:03:35.318 INFO:tasks.workunit.client.1.vm06.stdout:5/349: symlink d5/d1c/d68/l6f 0 2026-03-09T00:03:35.318 INFO:tasks.workunit.client.1.vm06.stdout:5/350: read d5/d1c/d21/d28/f3b [1695251,24623] 0 2026-03-09T00:03:35.319 INFO:tasks.workunit.client.1.vm06.stdout:5/351: creat d5/d44/d4b/f70 x:0 0 0 2026-03-09T00:03:35.320 INFO:tasks.workunit.client.0.vm03.stdout:7/35: sync 2026-03-09T00:03:35.321 INFO:tasks.workunit.client.1.vm06.stdout:5/352: unlink d5/c27 0 2026-03-09T00:03:35.321 INFO:tasks.workunit.client.0.vm03.stdout:4/54: sync 2026-03-09T00:03:35.323 INFO:tasks.workunit.client.1.vm06.stdout:5/353: mknod d5/d1c/d21/d28/d5e/c71 0 2026-03-09T00:03:35.323 INFO:tasks.workunit.client.1.vm06.stdout:5/354: write d5/fe [648457,13038] 0 2026-03-09T00:03:35.323 INFO:tasks.workunit.client.0.vm03.stdout:7/36: creat d2/d4/fb x:0 0 0 2026-03-09T00:03:35.323 INFO:tasks.workunit.client.1.vm06.stdout:5/355: mknod d5/d1c/d21/d28/d35/c72 0 2026-03-09T00:03:35.323 INFO:tasks.workunit.client.1.vm06.stdout:5/356: creat d5/d1c/d21/f73 x:0 0 0 2026-03-09T00:03:35.324 INFO:tasks.workunit.client.0.vm03.stdout:7/37: creat d2/fc x:0 0 0 2026-03-09T00:03:35.325 INFO:tasks.workunit.client.0.vm03.stdout:4/55: dread f4 [4194304,4194304] 0 2026-03-09T00:03:35.327 INFO:tasks.workunit.client.1.vm06.stdout:4/234: dwrite fe [4194304,4194304] 0 2026-03-09T00:03:35.328 INFO:tasks.workunit.client.1.vm06.stdout:4/235: write d17/f19 [115139,92248] 0 2026-03-09T00:03:35.333 INFO:tasks.workunit.client.0.vm03.stdout:7/38: dread d2/f3 [0,4194304] 0 2026-03-09T00:03:35.347 INFO:tasks.workunit.client.0.vm03.stdout:7/39: chown d2/l5 2 1 2026-03-09T00:03:35.347 INFO:tasks.workunit.client.1.vm06.stdout:4/236: symlink d17/d21/d22/l46 0 2026-03-09T00:03:35.347 INFO:tasks.workunit.client.1.vm06.stdout:4/237: creat d17/d21/d22/f47 x:0 0 0 2026-03-09T00:03:35.348 INFO:tasks.workunit.client.0.vm03.stdout:7/40: mknod d2/d4/cd 0 2026-03-09T00:03:35.348 INFO:tasks.workunit.client.0.vm03.stdout:7/41: write d2/fc [424094,56489] 0 2026-03-09T00:03:35.348 INFO:tasks.workunit.client.0.vm03.stdout:7/42: dread d2/f3 [0,4194304] 0 2026-03-09T00:03:35.384 INFO:tasks.workunit.client.0.vm03.stdout:6/50: dwrite f9 [0,4194304] 0 2026-03-09T00:03:35.385 INFO:tasks.workunit.client.0.vm03.stdout:6/51: creat fd x:0 0 0 2026-03-09T00:03:35.385 INFO:tasks.workunit.client.0.vm03.stdout:9/57: dwrite f5 [0,4194304] 0 2026-03-09T00:03:35.392 INFO:tasks.workunit.client.0.vm03.stdout:6/52: unlink fd 0 2026-03-09T00:03:35.392 INFO:tasks.workunit.client.0.vm03.stdout:6/53: fsync f9 0 2026-03-09T00:03:35.393 INFO:tasks.workunit.client.0.vm03.stdout:9/58: symlink l12 0 2026-03-09T00:03:35.393 INFO:tasks.workunit.client.0.vm03.stdout:9/59: truncate f10 1001477 0 2026-03-09T00:03:35.393 
INFO:tasks.workunit.client.0.vm03.stdout:9/60: fdatasync f5 0 2026-03-09T00:03:35.396 INFO:tasks.workunit.client.0.vm03.stdout:6/54: mknod ce 0 2026-03-09T00:03:35.396 INFO:tasks.workunit.client.0.vm03.stdout:6/55: fdatasync f8 0 2026-03-09T00:03:35.396 INFO:tasks.workunit.client.0.vm03.stdout:6/56: chown f9 231 1 2026-03-09T00:03:35.396 INFO:tasks.workunit.client.0.vm03.stdout:6/57: stat f7 0 2026-03-09T00:03:35.398 INFO:tasks.workunit.client.0.vm03.stdout:6/58: mknod cf 0 2026-03-09T00:03:35.398 INFO:tasks.workunit.client.0.vm03.stdout:6/59: fsync fc 0 2026-03-09T00:03:35.399 INFO:tasks.workunit.client.0.vm03.stdout:9/61: symlink l13 0 2026-03-09T00:03:35.403 INFO:tasks.workunit.client.0.vm03.stdout:6/60: link fc f10 0 2026-03-09T00:03:35.403 INFO:tasks.workunit.client.0.vm03.stdout:6/61: write f2 [2548733,79598] 0 2026-03-09T00:03:35.404 INFO:tasks.workunit.client.0.vm03.stdout:6/62: symlink l11 0 2026-03-09T00:03:35.404 INFO:tasks.workunit.client.0.vm03.stdout:6/63: write f2 [4878593,66555] 0 2026-03-09T00:03:35.404 INFO:tasks.workunit.client.0.vm03.stdout:6/64: creat f12 x:0 0 0 2026-03-09T00:03:35.404 INFO:tasks.workunit.client.0.vm03.stdout:6/65: dread - f10 zero size 2026-03-09T00:03:35.404 INFO:tasks.workunit.client.0.vm03.stdout:6/66: stat f9 0 2026-03-09T00:03:35.404 INFO:tasks.workunit.client.0.vm03.stdout:6/67: mkdir d13 0 2026-03-09T00:03:35.411 INFO:tasks.workunit.client.1.vm06.stdout:6/241: dwrite d4/d27/d3e/f44 [0,4194304] 0 2026-03-09T00:03:35.413 INFO:tasks.workunit.client.1.vm06.stdout:6/242: mkdir d4/d27/d3e/d45 0 2026-03-09T00:03:35.413 INFO:tasks.workunit.client.1.vm06.stdout:6/243: chown d4/f26 1860 1 2026-03-09T00:03:35.413 INFO:tasks.workunit.client.1.vm06.stdout:6/244: write d4/f3d [694249,41857] 0 2026-03-09T00:03:35.413 INFO:tasks.workunit.client.1.vm06.stdout:6/245: stat d4/d16/f33 0 2026-03-09T00:03:35.413 INFO:tasks.workunit.client.1.vm06.stdout:6/246: mkdir d4/d16/d46 0 2026-03-09T00:03:35.415 INFO:tasks.workunit.client.0.vm03.stdout:0/60: dwrite d2/ff [0,4194304] 0 2026-03-09T00:03:35.416 INFO:tasks.workunit.client.1.vm06.stdout:6/247: truncate d4/d27/d3e/f41 6121949 0 2026-03-09T00:03:35.416 INFO:tasks.workunit.client.1.vm06.stdout:6/248: fdatasync d4/d16/f32 0 2026-03-09T00:03:35.422 INFO:tasks.workunit.client.0.vm03.stdout:0/61: rename d2/c8 to d2/c16 0 2026-03-09T00:03:35.430 INFO:tasks.workunit.client.0.vm03.stdout:0/62: mknod d2/da/dd/c17 0 2026-03-09T00:03:35.430 INFO:tasks.workunit.client.0.vm03.stdout:0/63: unlink d2/f9 0 2026-03-09T00:03:35.430 INFO:tasks.workunit.client.0.vm03.stdout:0/64: mknod d2/da/dd/c18 0 2026-03-09T00:03:35.430 INFO:tasks.workunit.client.0.vm03.stdout:0/65: getdents d2/da 0 2026-03-09T00:03:35.430 INFO:tasks.workunit.client.0.vm03.stdout:0/66: write d2/da/fc [425652,46713] 0 2026-03-09T00:03:35.430 INFO:tasks.workunit.client.0.vm03.stdout:0/67: chown d2/da/dd/c18 289 1 2026-03-09T00:03:35.430 INFO:tasks.workunit.client.0.vm03.stdout:0/68: chown f1 23109 1 2026-03-09T00:03:35.437 INFO:tasks.workunit.client.1.vm06.stdout:0/270: dwrite d3/d18/d2c/f4d [0,4194304] 0 2026-03-09T00:03:35.444 INFO:tasks.workunit.client.1.vm06.stdout:0/271: chown d3/f1e 14071 1 2026-03-09T00:03:35.445 INFO:tasks.workunit.client.1.vm06.stdout:0/272: mknod d3/d18/d1f/d44/c5f 0 2026-03-09T00:03:35.445 INFO:tasks.workunit.client.1.vm06.stdout:0/273: mkdir d3/d18/d1f/d39/d49/d60 0 2026-03-09T00:03:35.445 INFO:tasks.workunit.client.1.vm06.stdout:0/274: readlink d3/d18/d28/d45/l54 0 2026-03-09T00:03:35.445 INFO:tasks.workunit.client.1.vm06.stdout:0/275: 
rmdir d3/d18/d28/d45 39 2026-03-09T00:03:35.445 INFO:tasks.workunit.client.1.vm06.stdout:0/276: fdatasync d3/f19 0 2026-03-09T00:03:35.479 INFO:tasks.workunit.client.0.vm03.stdout:1/74: dwrite d4/fb [0,4194304] 0 2026-03-09T00:03:35.485 INFO:tasks.workunit.client.0.vm03.stdout:1/75: dread d4/d6/f8 [0,4194304] 0 2026-03-09T00:03:35.524 INFO:tasks.workunit.client.1.vm06.stdout:5/357: dwrite d5/d1c/d21/d28/f63 [0,4194304] 0 2026-03-09T00:03:35.528 INFO:tasks.workunit.client.1.vm06.stdout:5/358: symlink d5/d44/l74 0 2026-03-09T00:03:35.528 INFO:tasks.workunit.client.1.vm06.stdout:5/359: dread - d5/d44/d4b/f6d zero size 2026-03-09T00:03:35.528 INFO:tasks.workunit.client.1.vm06.stdout:5/360: stat d5/f6b 0 2026-03-09T00:03:35.528 INFO:tasks.workunit.client.1.vm06.stdout:5/361: write d5/d1c/d21/d28/f56 [1061677,37604] 0 2026-03-09T00:03:35.531 INFO:tasks.workunit.client.1.vm06.stdout:7/254: write d0/df/f13 [719092,124543] 0 2026-03-09T00:03:35.531 INFO:tasks.workunit.client.1.vm06.stdout:7/255: stat d0/f14 0 2026-03-09T00:03:35.531 INFO:tasks.workunit.client.1.vm06.stdout:7/256: fdatasync d0/f6 0 2026-03-09T00:03:35.531 INFO:tasks.workunit.client.1.vm06.stdout:7/257: creat d0/d39/f3e x:0 0 0 2026-03-09T00:03:35.532 INFO:tasks.workunit.client.1.vm06.stdout:7/258: mkdir d0/df/d1a/d3f 0 2026-03-09T00:03:35.552 INFO:tasks.workunit.client.0.vm03.stdout:6/68: dwrite fc [0,4194304] 0 2026-03-09T00:03:35.552 INFO:tasks.workunit.client.0.vm03.stdout:6/69: stat f12 0 2026-03-09T00:03:35.552 INFO:tasks.workunit.client.0.vm03.stdout:6/70: truncate fb 1862489 0 2026-03-09T00:03:35.553 INFO:tasks.workunit.client.0.vm03.stdout:6/71: creat d13/f14 x:0 0 0 2026-03-09T00:03:35.553 INFO:tasks.workunit.client.0.vm03.stdout:6/72: truncate f12 323064 0 2026-03-09T00:03:35.558 INFO:tasks.workunit.client.0.vm03.stdout:0/69: dwrite d2/da/dd/f14 [0,4194304] 0 2026-03-09T00:03:35.560 INFO:tasks.workunit.client.0.vm03.stdout:0/70: symlink d2/da/l19 0 2026-03-09T00:03:35.562 INFO:tasks.workunit.client.1.vm06.stdout:9/215: sync 2026-03-09T00:03:35.565 INFO:tasks.workunit.client.1.vm06.stdout:2/332: dwrite d7/f8 [12582912,4194304] 0 2026-03-09T00:03:35.565 INFO:tasks.workunit.client.1.vm06.stdout:9/216: mknod d1/d3/d12/c46 0 2026-03-09T00:03:35.566 INFO:tasks.workunit.client.1.vm06.stdout:9/217: fdatasync d1/d4/f39 0 2026-03-09T00:03:35.566 INFO:tasks.workunit.client.1.vm06.stdout:3/240: dwrite f8 [0,4194304] 0 2026-03-09T00:03:35.568 INFO:tasks.workunit.client.1.vm06.stdout:9/218: creat d1/d3/d12/d21/d14/f47 x:0 0 0 2026-03-09T00:03:35.568 INFO:tasks.workunit.client.1.vm06.stdout:2/333: mkdir d7/d1b/d5a 0 2026-03-09T00:03:35.581 INFO:tasks.workunit.client.1.vm06.stdout:2/334: creat d7/da/d55/f5b x:0 0 0 2026-03-09T00:03:35.585 INFO:tasks.workunit.client.1.vm06.stdout:2/335: rename d7/da/f20 to d7/d1b/f5c 0 2026-03-09T00:03:35.585 INFO:tasks.workunit.client.1.vm06.stdout:2/336: write f2 [936440,75601] 0 2026-03-09T00:03:35.587 INFO:tasks.workunit.client.0.vm03.stdout:5/50: dwrite f12 [0,4194304] 0 2026-03-09T00:03:35.587 INFO:tasks.workunit.client.0.vm03.stdout:5/51: dread - f11 zero size 2026-03-09T00:03:35.592 INFO:tasks.workunit.client.1.vm06.stdout:0/277: dwrite d3/d18/d1f/f5e [0,4194304] 0 2026-03-09T00:03:35.592 INFO:tasks.workunit.client.1.vm06.stdout:0/278: write d3/d18/d1f/d39/d49/f50 [868604,95867] 0 2026-03-09T00:03:35.592 INFO:tasks.workunit.client.1.vm06.stdout:0/279: stat d3/d18/d1f/d39/f3d 0 2026-03-09T00:03:35.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:35 vm06.local ceph-mon[58395]: pgmap v134: 
2026-03-09T00:03:35.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:35 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:35.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:35 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:35.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:35 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:03:35.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:35 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:03:35.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:35 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:35.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:35 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:03:35.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:35 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr fail", "who": "vm03.yvcons"}]: dispatch
2026-03-09T00:03:35.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:35 vm06.local ceph-mon[58395]: osdmap e40: 6 total, 6 up, 6 in
2026-03-09T00:03:35.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:35 vm06.local ceph-mon[58395]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "mgr fail", "who": "vm03.yvcons"}]': finished
2026-03-09T00:03:35.617 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:35 vm06.local ceph-mon[58395]: mgrmap e20: vm06.rzcvhn(active, starting, since 0.022425s)
2026-03-09T00:03:35.642 INFO:tasks.workunit.client.1.vm06.stdout:6/249: dwrite d4/f3b [0,4194304] 0
2026-03-09T00:03:35.643 INFO:tasks.workunit.client.1.vm06.stdout:6/250: unlink d4/f3b 0
2026-03-09T00:03:35.644 INFO:tasks.workunit.client.1.vm06.stdout:6/251: mknod d4/d16/c47 0
2026-03-09T00:03:35.644 INFO:tasks.workunit.client.1.vm06.stdout:6/252: mknod d4/d27/c48 0
2026-03-09T00:03:35.644 INFO:tasks.workunit.client.1.vm06.stdout:6/253: stat d4/f3d 0
2026-03-09T00:03:35.645 INFO:tasks.workunit.client.1.vm06.stdout:6/254: symlink d4/d27/d3e/d45/l49 0
2026-03-09T00:03:35.645 INFO:tasks.workunit.client.1.vm06.stdout:6/255: getdents d4/d27/d3e 0
2026-03-09T00:03:35.646 INFO:tasks.workunit.client.1.vm06.stdout:6/256: symlink d4/d16/d46/l4a 0
2026-03-09T00:03:35.660 INFO:tasks.workunit.client.0.vm03.stdout:7/43: dwrite d2/fc [0,4194304] 0
2026-03-09T00:03:35.660 INFO:tasks.workunit.client.0.vm03.stdout:7/44: fsync d2/fc 0
2026-03-09T00:03:35.660 INFO:tasks.workunit.client.0.vm03.stdout:7/45: truncate d2/d4/fb 814979 0
2026-03-09T00:03:35.660 INFO:tasks.workunit.client.0.vm03.stdout:7/46: write d2/f3 [3352936,79772] 0
2026-03-09T00:03:35.660 INFO:tasks.workunit.client.0.vm03.stdout:7/47: mknod d2/ce 0
2026-03-09T00:03:35.660 INFO:tasks.workunit.client.0.vm03.stdout:0/71: dread d2/da/fc [0,4194304] 0
2026-03-09T00:03:35.660 INFO:tasks.workunit.client.0.vm03.stdout:7/48: write d2/d4/fb [2770,41532] 0
2026-03-09T00:03:35.660 INFO:tasks.workunit.client.0.vm03.stdout:7/49: rename d2/d4 to d2/d4/df 22
2026-03-09T00:03:35.660 INFO:tasks.workunit.client.0.vm03.stdout:0/72: mkdir d2/da/d1a 0
2026-03-09T00:03:35.660 INFO:tasks.workunit.client.0.vm03.stdout:7/50: rename d2/c9 to d2/d4/da/c10 0
2026-03-09T00:03:35.660 INFO:tasks.workunit.client.0.vm03.stdout:0/73: creat d2/da/f1b x:0 0 0
2026-03-09T00:03:35.660 INFO:tasks.workunit.client.0.vm03.stdout:7/51: creat d2/d4/da/f11 x:0 0 0
2026-03-09T00:03:35.665 INFO:tasks.workunit.client.1.vm06.stdout:6/257: dread d4/d16/f1c [0,4194304] 0
2026-03-09T00:03:35.683 INFO:tasks.workunit.client.0.vm03.stdout:1/76: dwrite d4/d6/fa [4194304,4194304] 0
2026-03-09T00:03:35.686 INFO:tasks.workunit.client.0.vm03.stdout:1/77: creat d4/d15/d1a/f1b x:0 0 0
2026-03-09T00:03:35.708 INFO:tasks.workunit.client.1.vm06.stdout:6/258: dread d4/f22 [0,4194304] 0
2026-03-09T00:03:35.708 INFO:tasks.workunit.client.1.vm06.stdout:2/337: dwrite d7/d1b/f5c [0,4194304] 0
2026-03-09T00:03:35.708 INFO:tasks.workunit.client.1.vm06.stdout:2/338: truncate d7/da/db/de/f49 247966 0
2026-03-09T00:03:35.709 INFO:tasks.workunit.client.1.vm06.stdout:6/259: mkdir d4/d27/d42/d4b 0
2026-03-09T00:03:35.709 INFO:tasks.workunit.client.1.vm06.stdout:6/260: getdents d4/d27/d3e 0
2026-03-09T00:03:35.710 INFO:tasks.workunit.client.1.vm06.stdout:7/259: dwrite d0/df/d17/f1f [4194304,4194304] 0
2026-03-09T00:03:35.710 INFO:tasks.workunit.client.1.vm06.stdout:7/260: write d0/df/d1a/d3a/d31/f32 [855546,59192] 0
2026-03-09T00:03:35.714 INFO:tasks.workunit.client.1.vm06.stdout:6/261: symlink d4/d27/l4c 0
2026-03-09T00:03:35.715 INFO:tasks.workunit.client.1.vm06.stdout:7/261: mkdir d0/df/d1a/d3a/d31/d40 0
2026-03-09T00:03:35.717 INFO:tasks.workunit.client.1.vm06.stdout:2/339: dread d7/f48 [0,4194304] 0
2026-03-09T00:03:35.717 INFO:tasks.workunit.client.1.vm06.stdout:2/340: creat d7/f5d x:0 0 0
2026-03-09T00:03:35.717 INFO:tasks.workunit.client.1.vm06.stdout:2/341: chown d7/da/db/c16 126398690 1
2026-03-09T00:03:35.717 INFO:tasks.workunit.client.1.vm06.stdout:2/342: chown d7/f3a 321063 1
2026-03-09T00:03:35.727 INFO:tasks.workunit.client.1.vm06.stdout:3/241: dwrite d11/f3c [0,4194304] 0
2026-03-09T00:03:35.729 INFO:tasks.workunit.client.1.vm06.stdout:3/242: chown d11/c14 376034 1
2026-03-09T00:03:35.729 INFO:tasks.workunit.client.1.vm06.stdout:3/243: write d11/d28/d2e/d2f/d36/f4a [8728518,120386] 0
2026-03-09T00:03:35.731 INFO:tasks.workunit.client.1.vm06.stdout:3/244: creat d11/d3f/f4c x:0 0 0
2026-03-09T00:03:35.731 INFO:tasks.workunit.client.1.vm06.stdout:3/245: dread - d11/d28/d2e/f47 zero size
2026-03-09T00:03:35.734 INFO:tasks.workunit.client.1.vm06.stdout:7/262: write d0/df/d17/f1f [4720985,72012] 0
2026-03-09T00:03:35.736 INFO:tasks.workunit.client.1.vm06.stdout:7/263: truncate d0/fe 1340308 0
2026-03-09T00:03:35.741 INFO:tasks.workunit.client.1.vm06.stdout:7/264: creat d0/df/d1a/d3a/d31/d40/f41 x:0 0 0
2026-03-09T00:03:35.750 INFO:tasks.workunit.client.1.vm06.stdout:0/280: dwrite d3/d18/d1f/d39/d3b/f55 [0,4194304] 0
2026-03-09T00:03:35.751 INFO:tasks.workunit.client.1.vm06.stdout:0/281: write d3/f10 [3860130,62290] 0
2026-03-09T00:03:35.751 INFO:tasks.workunit.client.1.vm06.stdout:0/282: stat d3/d18/d1f/f5e 0
2026-03-09T00:03:35.754 INFO:tasks.workunit.client.1.vm06.stdout:0/283: unlink d3/d18/d1f/d44/c5f 0
2026-03-09T00:03:35.765 INFO:tasks.workunit.client.0.vm03.stdout:6/73: rmdir d13 39
2026-03-09T00:03:35.770 INFO:tasks.workunit.client.0.vm03.stdout:6/74: write f10 [3161205,58070] 0
2026-03-09T00:03:35.776 INFO:tasks.workunit.client.1.vm06.stdout:5/362: dwrite d5/d1c/d21/d28/f56 [0,4194304] 0
2026-03-09T00:03:35.778 INFO:tasks.workunit.client.0.vm03.stdout:2/60: sync
2026-03-09T00:03:35.778 INFO:tasks.workunit.client.1.vm06.stdout:0/284: symlink d3/d18/d1f/d39/d3b/l61 0
2026-03-09T00:03:35.779 INFO:tasks.workunit.client.0.vm03.stdout:6/75: rename fc to d13/f15 0
2026-03-09T00:03:35.779 INFO:tasks.workunit.client.0.vm03.stdout:6/76: fdatasync d13/f14 0
2026-03-09T00:03:35.787 INFO:tasks.workunit.client.0.vm03.stdout:0/74: dwrite d2/da/fc [0,4194304] 0
2026-03-09T00:03:35.789 INFO:tasks.workunit.client.1.vm06.stdout:0/285: truncate d3/d18/d1f/d44/f5a 4079848 0
2026-03-09T00:03:35.789 INFO:tasks.workunit.client.1.vm06.stdout:5/363: creat d5/d1c/f75 x:0 0 0
2026-03-09T00:03:35.790 INFO:tasks.workunit.client.1.vm06.stdout:0/286: getdents d3/d18/d3c 0
2026-03-09T00:03:35.791 INFO:tasks.workunit.client.1.vm06.stdout:0/287: truncate d3/f7 3189217 0
2026-03-09T00:03:35.791 INFO:tasks.workunit.client.1.vm06.stdout:0/288: symlink d3/d18/d28/d45/l62 0
2026-03-09T00:03:35.802 INFO:tasks.workunit.client.0.vm03.stdout:0/75: write d2/da/fc [3586809,127992] 0
2026-03-09T00:03:35.802 INFO:tasks.workunit.client.0.vm03.stdout:0/76: creat d2/da/d1a/f1c x:0 0 0
2026-03-09T00:03:35.803 INFO:tasks.workunit.client.1.vm06.stdout:5/364: dread d5/d1c/d21/d28/f63 [0,4194304] 0
2026-03-09T00:03:35.804 INFO:tasks.workunit.client.1.vm06.stdout:5/365: mknod d5/d1c/c76 0
2026-03-09T00:03:35.805 INFO:tasks.workunit.client.1.vm06.stdout:5/366: dread - d5/d1c/f62 zero size
2026-03-09T00:03:35.806 INFO:tasks.workunit.client.1.vm06.stdout:5/367: link d5/d1c/d23/c45 d5/d1c/d21/d28/d35/d49/c77 0
2026-03-09T00:03:35.806 INFO:tasks.workunit.client.1.vm06.stdout:5/368: write d5/d1c/f62 [353808,9047] 0
2026-03-09T00:03:35.806 INFO:tasks.workunit.client.1.vm06.stdout:5/369: mkdir d5/d1c/d21/d28/d5e/d66/d78 0
2026-03-09T00:03:35.807 INFO:tasks.workunit.client.1.vm06.stdout:5/370: symlink d5/d1c/d21/l79 0
2026-03-09T00:03:35.807 INFO:tasks.workunit.client.1.vm06.stdout:5/371: chown d5/d1c/d23/f5b 31115905 1
2026-03-09T00:03:35.807 INFO:tasks.workunit.client.1.vm06.stdout:5/372: link d5/d1c/d21/f3c d5/d1c/d23/d51/f7a 0
2026-03-09T00:03:35.808 INFO:tasks.workunit.client.1.vm06.stdout:5/373: mknod d5/d1c/d21/d28/d5e/d66/d78/c7b 0
2026-03-09T00:03:35.808 INFO:tasks.workunit.client.1.vm06.stdout:5/374: chown d5/d1c/d21/d28/d35/f52 15 1
2026-03-09T00:03:35.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:35 vm03.local ceph-mon[52346]: pgmap v134: 65 pgs: 65 active+clean; 639 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 42 MiB/s rd, 64 MiB/s wr, 500 op/s
2026-03-09T00:03:35.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:35 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:35.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:35 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:35.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:35 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:03:35.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:35 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:03:35.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:35 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons'
2026-03-09T00:03:35.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:35 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:03:35.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:35 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr fail", "who": "vm03.yvcons"}]: dispatch
2026-03-09T00:03:35.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:35 vm03.local ceph-mon[52346]: osdmap e40: 6 total, 6 up, 6 in
2026-03-09T00:03:35.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:35 vm03.local ceph-mon[52346]: from='mgr.14223 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd='[{"prefix": "mgr fail", "who": "vm03.yvcons"}]': finished
2026-03-09T00:03:35.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:35 vm03.local ceph-mon[52346]: mgrmap e20: vm06.rzcvhn(active, starting, since 0.022425s)
2026-03-09T00:03:35.850 INFO:tasks.workunit.client.0.vm03.stdout:8/68: sync
2026-03-09T00:03:35.852 INFO:tasks.workunit.client.0.vm03.stdout:8/69: symlink d7/l13 0
2026-03-09T00:03:35.852 INFO:tasks.workunit.client.0.vm03.stdout:8/70: mknod d7/df/c14 0
2026-03-09T00:03:35.852 INFO:tasks.workunit.client.0.vm03.stdout:8/71: write f3 [4504211,44156] 0
2026-03-09T00:03:35.863 INFO:tasks.workunit.client.1.vm06.stdout:6/262: dwrite d4/f36 [0,4194304] 0
2026-03-09T00:03:35.865 INFO:tasks.workunit.client.1.vm06.stdout:6/263: creat d4/d27/d3e/d45/f4d x:0 0 0
2026-03-09T00:03:35.865 INFO:tasks.workunit.client.0.vm03.stdout:1/78: dwrite d4/d15/f17 [0,4194304] 0
2026-03-09T00:03:35.865 INFO:tasks.workunit.client.0.vm03.stdout:1/79: read - d4/f12 zero size
2026-03-09T00:03:35.868 INFO:tasks.workunit.client.0.vm03.stdout:7/52: link d2/d4/da/c10 d2/c12 0
2026-03-09T00:03:35.868 INFO:tasks.workunit.client.0.vm03.stdout:7/53: creat d2/d4/f13 x:0 0 0
2026-03-09T00:03:35.868 INFO:tasks.workunit.client.0.vm03.stdout:7/54: read - d2/d4/f13 zero size
2026-03-09T00:03:35.874 INFO:tasks.workunit.client.0.vm03.stdout:7/55: symlink d2/d4/l14 0
2026-03-09T00:03:35.875 INFO:tasks.workunit.client.0.vm03.stdout:7/56: mkdir d2/d4/d15 0
2026-03-09T00:03:35.875 INFO:tasks.workunit.client.0.vm03.stdout:7/57: dread - d2/d4/f13 zero size
2026-03-09T00:03:35.875 INFO:tasks.workunit.client.0.vm03.stdout:7/58: creat d2/d4/da/f16 x:0 0 0
2026-03-09T00:03:35.875 INFO:tasks.workunit.client.0.vm03.stdout:7/59: chown d2/d4/f13 64843475 1
2026-03-09T00:03:35.891 INFO:tasks.workunit.client.1.vm06.stdout:9/219: dwrite d1/f16 [0,4194304] 0
2026-03-09T00:03:35.891 INFO:tasks.workunit.client.1.vm06.stdout:9/220: dread - d1/f2a zero size
2026-03-09T00:03:35.935 INFO:tasks.workunit.client.1.vm06.stdout:0/289: dwrite d3/d18/d2c/f4e [0,4194304] 0
2026-03-09T00:03:35.941 INFO:tasks.workunit.client.1.vm06.stdout:2/343: dwrite d7/da/db/de/f49 [0,4194304] 0
2026-03-09T00:03:35.944 INFO:tasks.workunit.client.0.vm03.stdout:2/61: dwrite d8/f9 [0,4194304] 0
2026-03-09T00:03:35.945 INFO:tasks.workunit.client.0.vm03.stdout:2/62: creat d8/f15 x:0 0 0
2026-03-09T00:03:35.945 INFO:tasks.workunit.client.0.vm03.stdout:2/63: chown d8/f9 378 1
2026-03-09T00:03:35.951 INFO:tasks.workunit.client.0.vm03.stdout:2/64: mknod d8/c16 0
2026-03-09T00:03:35.951 INFO:tasks.workunit.client.0.vm03.stdout:2/65: read - d8/f13 zero size
2026-03-09T00:03:35.952 INFO:tasks.workunit.client.0.vm03.stdout:7/60: dwrite d2/d4/f13 [0,4194304] 0
2026-03-09T00:03:35.954 INFO:tasks.workunit.client.1.vm06.stdout:8/252: sync
2026-03-09T00:03:35.958 INFO:tasks.workunit.client.1.vm06.stdout:7/265: write d0/fe [1798507,25942] 0
2026-03-09T00:03:35.959 INFO:tasks.workunit.client.1.vm06.stdout:3/246: rmdir d11/d3f 39
2026-03-09T00:03:35.960 INFO:tasks.workunit.client.1.vm06.stdout:6/264: dwrite d4/fc [0,4194304] 0
2026-03-09T00:03:35.960 INFO:tasks.workunit.client.1.vm06.stdout:6/265: chown d4/c30 818419710 1
2026-03-09T00:03:35.960 INFO:tasks.workunit.client.1.vm06.stdout:6/266: stat d4/f22 0
2026-03-09T00:03:35.968 INFO:tasks.workunit.client.0.vm03.stdout:6/77: rename d13/f15 to d13/f16 0
2026-03-09T00:03:35.972 INFO:tasks.workunit.client.0.vm03.stdout:8/72: fdatasync f3 0
2026-03-09T00:03:35.981 INFO:tasks.workunit.client.0.vm03.stdout:8/73: readlink l5 0
2026-03-09T00:03:35.982 INFO:tasks.workunit.client.1.vm06.stdout:8/253: mkdir db/dd/d24/d36/d38/d4d 0
2026-03-09T00:03:35.993 INFO:tasks.workunit.client.0.vm03.stdout:2/66: mkdir d8/d17 0
2026-03-09T00:03:35.993 INFO:tasks.workunit.client.1.vm06.stdout:1/243: sync
2026-03-09T00:03:35.993 INFO:tasks.workunit.client.1.vm06.stdout:1/244: dread - d6/d21/f2e zero size
2026-03-09T00:03:35.994 INFO:tasks.workunit.client.1.vm06.stdout:4/238: sync
2026-03-09T00:03:35.994 INFO:tasks.workunit.client.1.vm06.stdout:4/239: write d17/f1d [592035,120139] 0
2026-03-09T00:03:35.995 INFO:tasks.workunit.client.1.vm06.stdout:7/266: rename d0/d39/c3b to d0/df/d1a/d3a/d31/c42 0
2026-03-09T00:03:35.999 INFO:tasks.workunit.client.0.vm03.stdout:1/80: dwrite d4/d6/fa [0,4194304] 0
2026-03-09T00:03:36.006 INFO:tasks.workunit.client.1.vm06.stdout:4/240: write fe [7952974,2043] 0
2026-03-09T00:03:36.007 INFO:tasks.workunit.client.0.vm03.stdout:1/81: dread d4/f9 [0,4194304] 0
2026-03-09T00:03:36.008 INFO:tasks.workunit.client.0.vm03.stdout:1/82: truncate d4/d15/d1a/f1b 265103 0
2026-03-09T00:03:36.013 INFO:tasks.workunit.client.1.vm06.stdout:6/267: truncate d4/d27/d3e/f44 2012556 0
2026-03-09T00:03:36.015 INFO:tasks.workunit.client.0.vm03.stdout:2/67: creat d8/f18 x:0 0 0
2026-03-09T00:03:36.020 INFO:tasks.workunit.client.0.vm03.stdout:2/68: write d8/f18 [556247,90168] 0
2026-03-09T00:03:36.021 INFO:tasks.workunit.client.0.vm03.stdout:2/69: write d8/f13 [505552,113976] 0
2026-03-09T00:03:36.021 INFO:tasks.workunit.client.0.vm03.stdout:1/83: dread f1 [4194304,4194304] 0
2026-03-09T00:03:36.022 INFO:tasks.workunit.client.1.vm06.stdout:1/245: rmdir d6/d21/d2d/d37 39
2026-03-09T00:03:36.028 INFO:tasks.workunit.client.1.vm06.stdout:8/254: rename db/d1e/f23 to db/dd/d48/f4e 0
2026-03-09T00:03:36.028 INFO:tasks.workunit.client.1.vm06.stdout:8/255: fdatasync f5 0
2026-03-09T00:03:36.029 INFO:tasks.workunit.client.0.vm03.stdout:1/84: mknod d4/d6/c1c 0
2026-03-09T00:03:36.029 INFO:tasks.workunit.client.0.vm03.stdout:1/85: creat d4/d15/d1a/f1d x:0 0 0
2026-03-09T00:03:36.029 INFO:tasks.workunit.client.0.vm03.stdout:1/86: creat d4/f1e x:0 0 0
2026-03-09T00:03:36.031 INFO:tasks.workunit.client.1.vm06.stdout:4/241: stat d17/d24/l2b 0
2026-03-09T00:03:36.040 INFO:tasks.workunit.client.0.vm03.stdout:1/87: mknod d4/d15/c1f 0
2026-03-09T00:03:36.041 INFO:tasks.workunit.client.1.vm06.stdout:3/247: mkdir d11/d28/d4d 0
2026-03-09T00:03:36.041 INFO:tasks.workunit.client.1.vm06.stdout:3/248: write d11/f48 [1038299,119802] 0
2026-03-09T00:03:36.045 INFO:tasks.workunit.client.0.vm03.stdout:7/61: dwrite d2/d4/da/f16 [0,4194304] 0
2026-03-09T00:03:36.046 INFO:tasks.workunit.client.1.vm06.stdout:7/267: rename d0/f6 to d0/df/d1a/d27/f43 0
2026-03-09T00:03:36.046 INFO:tasks.workunit.client.1.vm06.stdout:7/268: fsync d0/f7 0
2026-03-09T00:03:36.046 INFO:tasks.workunit.client.1.vm06.stdout:7/269: write d0/df/d1a/d27/f43 [1774807,10037] 0
2026-03-09T00:03:36.046 INFO:tasks.workunit.client.1.vm06.stdout:7/270: chown d0/f14 1812827347 1
2026-03-09T00:03:36.046 INFO:tasks.workunit.client.1.vm06.stdout:7/271: readlink d0/l1c 0
2026-03-09T00:03:36.046 INFO:tasks.workunit.client.1.vm06.stdout:7/272: chown d0/df/d17/f1f 1006978 1
2026-03-09T00:03:36.046 INFO:tasks.workunit.client.0.vm03.stdout:6/78: dwrite f12 [0,4194304] 0
2026-03-09T00:03:36.047 INFO:tasks.workunit.client.0.vm03.stdout:3/48: sync
2026-03-09T00:03:36.048 INFO:tasks.workunit.client.1.vm06.stdout:8/256: creat db/d1e/f4f x:0 0 0
2026-03-09T00:03:36.048 INFO:tasks.workunit.client.1.vm06.stdout:8/257: chown db/dd/d24/d36/f45 1 1
2026-03-09T00:03:36.049 INFO:tasks.workunit.client.0.vm03.stdout:9/62: sync
2026-03-09T00:03:36.049 INFO:tasks.workunit.client.0.vm03.stdout:2/70: write d8/f10 [2937947,83298] 0
2026-03-09T00:03:36.049 INFO:tasks.workunit.client.0.vm03.stdout:4/56: sync
2026-03-09T00:03:36.049 INFO:tasks.workunit.client.0.vm03.stdout:2/71: write d8/f12 [657920,659] 0
2026-03-09T00:03:36.049 INFO:tasks.workunit.client.0.vm03.stdout:4/57: write d7/fc [913162,14316] 0
2026-03-09T00:03:36.049 INFO:tasks.workunit.client.0.vm03.stdout:4/58: creat d7/fe x:0 0 0
2026-03-09T00:03:36.049 INFO:tasks.workunit.client.0.vm03.stdout:4/59: creat d7/ff x:0 0 0
2026-03-09T00:03:36.049 INFO:tasks.workunit.client.0.vm03.stdout:4/60: write d7/fb [299028,117486] 0
2026-03-09T00:03:36.049 INFO:tasks.workunit.client.0.vm03.stdout:4/61: write d7/fd [5785763,115200] 0
2026-03-09T00:03:36.049 INFO:tasks.workunit.client.0.vm03.stdout:4/62: write d7/fd [5813128,125684] 0
2026-03-09T00:03:36.053 INFO:tasks.workunit.client.0.vm03.stdout:1/88: creat d4/d6/f20 x:0 0 0
2026-03-09T00:03:36.065 INFO:tasks.workunit.client.1.vm06.stdout:4/242: symlink d17/d21/d22/l48 0
2026-03-09T00:03:36.065 INFO:tasks.workunit.client.1.vm06.stdout:1/246: mknod d6/d4c/c50 0
2026-03-09T00:03:36.067 INFO:tasks.workunit.client.1.vm06.stdout:7/273: rename d0/f36 to d0/df/d1a/f44 0
2026-03-09T00:03:36.080 INFO:tasks.workunit.client.0.vm03.stdout:6/79: creat d13/f17 x:0 0 0
2026-03-09T00:03:36.080 INFO:tasks.workunit.client.0.vm03.stdout:6/80: creat d13/f18 x:0 0 0
2026-03-09T00:03:36.082 INFO:tasks.workunit.client.0.vm03.stdout:1/89: dread d4/d6/f8 [0,4194304] 0
2026-03-09T00:03:36.086 INFO:tasks.workunit.client.0.vm03.stdout:2/72: unlink d8/f13 0
2026-03-09T00:03:36.086 INFO:tasks.workunit.client.0.vm03.stdout:2/73: write d8/fb [4705887,43700] 0
2026-03-09T00:03:36.093 INFO:tasks.workunit.client.0.vm03.stdout:3/49: rename f0 to d2/f11 0
2026-03-09T00:03:36.093 INFO:tasks.workunit.client.0.vm03.stdout:3/50: write d2/f8 [1729633,76069] 0
2026-03-09T00:03:36.093 INFO:tasks.workunit.client.0.vm03.stdout:3/51: chown f1 73 1
2026-03-09T00:03:36.095 INFO:tasks.workunit.client.0.vm03.stdout:9/63: mknod c14 0
2026-03-09T00:03:36.096 INFO:tasks.workunit.client.0.vm03.stdout:9/64: mkdir d15 0
2026-03-09T00:03:36.098 INFO:tasks.workunit.client.0.vm03.stdout:9/65: dread f8 [0,4194304] 0
2026-03-09T00:03:36.098 INFO:tasks.workunit.client.0.vm03.stdout:9/66: mknod d15/c16 0
2026-03-09T00:03:36.098 INFO:tasks.workunit.client.0.vm03.stdout:9/67: fsync f10 0
2026-03-09T00:03:36.098 INFO:tasks.workunit.client.0.vm03.stdout:9/68: creat d15/f17 x:0 0 0
2026-03-09T00:03:36.098 INFO:tasks.workunit.client.0.vm03.stdout:9/69: creat d15/f18 x:0 0 0
2026-03-09T00:03:36.111 INFO:tasks.workunit.client.1.vm06.stdout:6/268: dwrite d4/f2d [0,4194304] 0
2026-03-09T00:03:36.116 INFO:tasks.workunit.client.1.vm06.stdout:0/290: truncate d3/d18/d2c/f4e 1535743 0
2026-03-09T00:03:36.116 INFO:tasks.workunit.client.1.vm06.stdout:0/291: fsync d3/d18/d1f/f26 0
2026-03-09T00:03:36.118 INFO:tasks.workunit.client.1.vm06.stdout:0/292: getdents d3/d18/d1f/d39 0
2026-03-09T00:03:36.118 INFO:tasks.workunit.client.1.vm06.stdout:0/293: creat d3/d18/d2c/d2d/d31/f63 x:0 0 0
2026-03-09T00:03:36.118 INFO:tasks.workunit.client.1.vm06.stdout:0/294: dread - d3/d18/d28/d45/f52 zero size
2026-03-09T00:03:36.120 INFO:tasks.workunit.client.1.vm06.stdout:0/295: unlink d3/d18/d1f/d39/d3b/f47 0
2026-03-09T00:03:36.120 INFO:tasks.workunit.client.1.vm06.stdout:0/296: fsync d3/f1b 0
2026-03-09T00:03:36.120 INFO:tasks.workunit.client.1.vm06.stdout:0/297: chown d3/d18 1653 1
2026-03-09T00:03:36.120 INFO:tasks.workunit.client.1.vm06.stdout:0/298: write d3/d18/d1f/d39/d3b/f57 [544390,58999] 0
2026-03-09T00:03:36.120 INFO:tasks.workunit.client.1.vm06.stdout:0/299: read d3/d18/d1f/f4a [641688,21883] 0
2026-03-09T00:03:36.120 INFO:tasks.workunit.client.1.vm06.stdout:0/300: creat d3/d18/d1f/d39/d49/f64 x:0 0 0
2026-03-09T00:03:36.120 INFO:tasks.workunit.client.1.vm06.stdout:0/301: readlink d3/lb 0
2026-03-09T00:03:36.122 INFO:tasks.workunit.client.1.vm06.stdout:0/302: rename d3/d18/d1f/d39/l53 to d3/d18/d2c/d2d/d31/l65 0
2026-03-09T00:03:36.122 INFO:tasks.workunit.client.1.vm06.stdout:0/303: truncate d3/d18/d1f/d44/f5a 3592177 0
2026-03-09T00:03:36.129 INFO:tasks.workunit.client.1.vm06.stdout:2/344: dwrite f6 [4194304,4194304] 0
2026-03-09T00:03:36.129 INFO:tasks.workunit.client.1.vm06.stdout:2/345: fdatasync d7/d1a/d3c/f4d 0
2026-03-09T00:03:36.135 INFO:tasks.workunit.client.1.vm06.stdout:2/346: creat d7/d1b/d5a/f5e x:0 0 0
2026-03-09T00:03:36.147 INFO:tasks.workunit.client.1.vm06.stdout:2/347: creat d7/da/d1c/f5f x:0 0 0
2026-03-09T00:03:36.147 INFO:tasks.workunit.client.1.vm06.stdout:2/348: creat d7/da/db/de/f60 x:0 0 0
2026-03-09T00:03:36.147 INFO:tasks.workunit.client.1.vm06.stdout:2/349: unlink d7/da/d1c/f1f 0
2026-03-09T00:03:36.149 INFO:tasks.workunit.client.0.vm03.stdout:8/74: dwrite d7/f10 [0,4194304] 0
2026-03-09T00:03:36.149 INFO:tasks.workunit.client.0.vm03.stdout:8/75: stat d7/ce 0
2026-03-09T00:03:36.150 INFO:tasks.workunit.client.0.vm03.stdout:8/76: chown d7/lc 334 1
2026-03-09T00:03:36.157 INFO:tasks.workunit.client.0.vm03.stdout:8/77: dread f6 [4194304,4194304] 0
2026-03-09T00:03:36.189 INFO:tasks.workunit.client.1.vm06.stdout:3/249: dwrite f9 [0,4194304] 0
2026-03-09T00:03:36.190 INFO:tasks.workunit.client.0.vm03.stdout:5/52: sync
2026-03-09T00:03:36.190 INFO:tasks.workunit.client.0.vm03.stdout:5/53: dread - ff zero size
2026-03-09T00:03:36.193 INFO:tasks.workunit.client.0.vm03.stdout:5/54: dread fe [0,4194304] 0
2026-03-09T00:03:36.193 INFO:tasks.workunit.client.1.vm06.stdout:3/250: getdents d11 0
2026-03-09T00:03:36.194 INFO:tasks.workunit.client.0.vm03.stdout:5/55: creat f15 x:0 0 0
2026-03-09T00:03:36.196 INFO:tasks.workunit.client.1.vm06.stdout:3/251: rmdir d11/d28/d2e/d2f 39
2026-03-09T00:03:36.197 INFO:tasks.workunit.client.1.vm06.stdout:4/243: dwrite d17/f19 [0,4194304] 0
2026-03-09T00:03:36.200 INFO:tasks.workunit.client.1.vm06.stdout:3/252: creat d11/d28/d2e/d2f/d36/f4e x:0 0 0
2026-03-09T00:03:36.201 INFO:tasks.workunit.client.0.vm03.stdout:5/56: mknod c16 0
2026-03-09T00:03:36.201 INFO:tasks.workunit.client.0.vm03.stdout:5/57: stat f15 0
2026-03-09T00:03:36.201 INFO:tasks.workunit.client.0.vm03.stdout:5/58: dread - f14 zero size
2026-03-09T00:03:36.203 INFO:tasks.workunit.client.0.vm03.stdout:5/59: chown l4 236485 1
2026-03-09T00:03:36.203 INFO:tasks.workunit.client.0.vm03.stdout:5/60: readlink l4 0
2026-03-09T00:03:36.204 INFO:tasks.workunit.client.0.vm03.stdout:5/61: creat f17 x:0 0 0
2026-03-09T00:03:36.204 INFO:tasks.workunit.client.0.vm03.stdout:5/62: truncate f15 160211 0
2026-03-09T00:03:36.204 INFO:tasks.workunit.client.0.vm03.stdout:5/63: read f3 [3423214,1757] 0
2026-03-09T00:03:36.207 INFO:tasks.workunit.client.0.vm03.stdout:1/90: dwrite d4/d6/f20 [0,4194304] 0
2026-03-09T00:03:36.208 INFO:tasks.workunit.client.1.vm06.stdout:8/258: dwrite db/d1e/f2e [0,4194304] 0
2026-03-09T00:03:36.208 INFO:tasks.workunit.client.1.vm06.stdout:8/259: truncate db/f1d 1006995 0
2026-03-09T00:03:36.208 INFO:tasks.workunit.client.1.vm06.stdout:8/260: creat db/d1e/f50 x:0 0 0
2026-03-09T00:03:36.208 INFO:tasks.workunit.client.1.vm06.stdout:8/261: chown db/dd/d24/d36/d38/d47/c49 0 1
2026-03-09T00:03:36.214 INFO:tasks.workunit.client.0.vm03.stdout:2/74: dread d8/f18 [0,4194304] 0
2026-03-09T00:03:36.214 INFO:tasks.workunit.client.0.vm03.stdout:2/75: chown c3 1 1
2026-03-09T00:03:36.215 INFO:tasks.workunit.client.0.vm03.stdout:2/76: dread d8/f18 [0,4194304] 0
2026-03-09T00:03:36.215 INFO:tasks.workunit.client.0.vm03.stdout:2/77: readlink d8/l14 0
2026-03-09T00:03:36.215 INFO:tasks.workunit.client.1.vm06.stdout:2/350: dread d7/f26 [0,4194304] 0
2026-03-09T00:03:36.215 INFO:tasks.workunit.client.1.vm06.stdout:6/269: dwrite d4/f3d [0,4194304] 0
2026-03-09T00:03:36.219 INFO:tasks.workunit.client.1.vm06.stdout:8/262: write db/d1e/f2e [4015348,96780] 0
2026-03-09T00:03:36.219 INFO:tasks.workunit.client.1.vm06.stdout:8/263: chown db/dd/d24/d36/f45 7 1
2026-03-09T00:03:36.219 INFO:tasks.workunit.client.1.vm06.stdout:8/264: creat db/d1e/f51 x:0 0 0
2026-03-09T00:03:36.220 INFO:tasks.workunit.client.1.vm06.stdout:8/265: dread - db/d1e/f50 zero size
2026-03-09T00:03:36.220 INFO:tasks.workunit.client.1.vm06.stdout:8/266: stat db/dd/d24/d36/f45 0
2026-03-09T00:03:36.220 INFO:tasks.workunit.client.1.vm06.stdout:8/267: creat db/d1e/f52 x:0 0 0
2026-03-09T00:03:36.220 INFO:tasks.workunit.client.1.vm06.stdout:3/253: creat d11/d28/f4f x:0 0 0
2026-03-09T00:03:36.221 INFO:tasks.workunit.client.1.vm06.stdout:1/247: dwrite d6/f19 [0,4194304] 0
2026-03-09T00:03:36.221 INFO:tasks.workunit.client.1.vm06.stdout:1/248: read - d6/d21/d2d/d3b/d42/d43/f4b zero size
2026-03-09T00:03:36.221 INFO:tasks.workunit.client.1.vm06.stdout:1/249: write d6/d21/d2d/f44 [4810051,105445] 0
2026-03-09T00:03:36.228 INFO:tasks.workunit.client.0.vm03.stdout:2/78: symlink d8/l19 0
2026-03-09T00:03:36.233 INFO:tasks.workunit.client.1.vm06.stdout:2/351: mknod d7/d1b/d5a/c61 0
2026-03-09T00:03:36.233 INFO:tasks.workunit.client.1.vm06.stdout:2/352: write d7/d1a/d25/f33 [4687512,70464] 0
2026-03-09T00:03:36.244 INFO:tasks.workunit.client.1.vm06.stdout:3/254: creat d11/d28/d2e/d2f/f50 x:0 0 0
2026-03-09T00:03:36.248 INFO:tasks.workunit.client.1.vm06.stdout:8/268: mkdir db/d53 0
2026-03-09T00:03:36.248 INFO:tasks.workunit.client.1.vm06.stdout:8/269: readlink db/dd/l19 0
2026-03-09T00:03:36.248 INFO:tasks.workunit.client.1.vm06.stdout:8/270: stat db/dd/d24/c43 0
2026-03-09T00:03:36.248 INFO:tasks.workunit.client.1.vm06.stdout:8/271: fdatasync db/f16 0
2026-03-09T00:03:36.249 INFO:tasks.workunit.client.1.vm06.stdout:3/255: getdents d11/d28/d2e 0
2026-03-09T00:03:36.256 INFO:tasks.workunit.client.1.vm06.stdout:7/274: dwrite d0/df/d1a/f44 [0,4194304] 0
2026-03-09T00:03:36.259 INFO:tasks.workunit.client.1.vm06.stdout:8/272: creat db/dd/d24/d36/f54 x:0 0 0
2026-03-09T00:03:36.264 INFO:tasks.workunit.client.1.vm06.stdout:8/273: creat db/f55 x:0 0 0
2026-03-09T00:03:36.265 INFO:tasks.workunit.client.1.vm06.stdout:7/275: rmdir d0/df/d1a/d3a/d31/d40 39
2026-03-09T00:03:36.265 INFO:tasks.workunit.client.1.vm06.stdout:7/276: write d0/df/d1a/d3a/f23 [1368826,5488] 0
2026-03-09T00:03:36.267 INFO:tasks.workunit.client.1.vm06.stdout:7/277: dread d0/df/f13 [0,4194304] 0
2026-03-09T00:03:36.269 INFO:tasks.workunit.client.1.vm06.stdout:8/274: write db/d1e/f20 [3856437,16481] 0
2026-03-09T00:03:36.271 INFO:tasks.workunit.client.1.vm06.stdout:7/278: dread d0/df/d1a/d3a/f3c [0,4194304] 0
2026-03-09T00:03:36.278 INFO:tasks.workunit.client.1.vm06.stdout:8/275: dread db/f16 [0,4194304] 0
2026-03-09T00:03:36.279 INFO:tasks.workunit.client.0.vm03.stdout:1/91: dwrite d4/d15/d1a/f1d [0,4194304] 0
2026-03-09T00:03:36.282 INFO:tasks.workunit.client.1.vm06.stdout:2/353: dwrite d7/da/db/de/f49 [0,4194304] 0
2026-03-09T00:03:36.289 INFO:tasks.workunit.client.1.vm06.stdout:1/250: getdents d6/d4c 0
2026-03-09T00:03:36.306 INFO:tasks.workunit.client.1.vm06.stdout:8/276: unlink db/dd/c10 0
2026-03-09T00:03:36.310 INFO:tasks.workunit.client.0.vm03.stdout:7/62: truncate d2/d4/da/f16 878855 0
2026-03-09T00:03:36.310 INFO:tasks.workunit.client.0.vm03.stdout:4/63: getdents d7 0
2026-03-09T00:03:36.310 INFO:tasks.workunit.client.0.vm03.stdout:9/70: rmdir d15 39
2026-03-09T00:03:36.310 INFO:tasks.workunit.client.0.vm03.stdout:9/71: stat d15 0
2026-03-09T00:03:36.310 INFO:tasks.workunit.client.0.vm03.stdout:9/72: write f5 [881886,130209] 0
2026-03-09T00:03:36.310 INFO:tasks.workunit.client.0.vm03.stdout:9/73: truncate d15/f18 839585 0
2026-03-09T00:03:36.310 INFO:tasks.workunit.client.0.vm03.stdout:9/74: chown d15 4 1
2026-03-09T00:03:36.310 INFO:tasks.workunit.client.0.vm03.stdout:9/75: write f8 [1002455,42565] 0
2026-03-09T00:03:36.310 INFO:tasks.workunit.client.0.vm03.stdout:9/76: chown l12 95353 1
2026-03-09T00:03:36.310 INFO:tasks.workunit.client.1.vm06.stdout:4/244: rename d17/d21/d22 to d17/d24/d49 0
2026-03-09T00:03:36.311 INFO:tasks.workunit.client.0.vm03.stdout:1/92: mkdir d4/d21 0
2026-03-09T00:03:36.312 INFO:tasks.workunit.client.1.vm06.stdout:1/251: mkdir d6/d4c/d51 0
2026-03-09T00:03:36.314 INFO:tasks.workunit.client.1.vm06.stdout:4/245: unlink d17/f1e 0
2026-03-09T00:03:36.314 INFO:tasks.workunit.client.1.vm06.stdout:4/246: write d17/d24/d49/f47 [731321,26920] 0
2026-03-09T00:03:36.314 INFO:tasks.workunit.client.1.vm06.stdout:4/247: creat d17/d24/d3b/f4a x:0 0 0
2026-03-09T00:03:36.318 INFO:tasks.workunit.client.0.vm03.stdout:7/63: getdents d2 0
2026-03-09T00:03:36.325 INFO:tasks.workunit.client.1.vm06.stdout:6/270: rename d4/d27/d42/f39 to d4/d27/f4e 0
2026-03-09T00:03:36.356 INFO:tasks.workunit.client.1.vm06.stdout:6/271: symlink d4/d27/d42/d4b/l4f 0
2026-03-09T00:03:36.356 INFO:tasks.workunit.client.1.vm06.stdout:6/272: chown d4/d27/d3e/d45/f4d 22 1
2026-03-09T00:03:36.362 INFO:tasks.workunit.client.1.vm06.stdout:1/252: rename d6/l1e to d6/d21/d2d/d3b/d42/l52 0
2026-03-09T00:03:36.393 INFO:tasks.workunit.client.1.vm06.stdout:6/273: creat d4/d27/d42/d4b/f50 x:0 0 0
2026-03-09T00:03:36.393 INFO:tasks.workunit.client.1.vm06.stdout:6/274: write d4/f26 [358241,4165] 0
2026-03-09T00:03:36.393 INFO:tasks.workunit.client.1.vm06.stdout:1/253: unlink d6/d21/l27 0
2026-03-09T00:03:36.393 INFO:tasks.workunit.client.1.vm06.stdout:1/254: symlink d6/d21/d2d/d3b/d42/d43/d4d/l53 0
2026-03-09T00:03:36.393 INFO:tasks.workunit.client.1.vm06.stdout:1/255: creat d6/f54 x:0 0 0
2026-03-09T00:03:36.393 INFO:tasks.workunit.client.1.vm06.stdout:1/256: rename d6/f54 to d6/d21/f55 0
2026-03-09T00:03:36.393 INFO:tasks.workunit.client.1.vm06.stdout:1/257: dread - d6/d21/d2d/d3b/d42/d43/f45 zero size
2026-03-09T00:03:36.404 INFO:tasks.workunit.client.1.vm06.stdout:7/279: dwrite d0/df/d1a/d3a/d31/d40/f41 [0,4194304] 0
2026-03-09T00:03:36.409 INFO:tasks.workunit.client.1.vm06.stdout:7/280: getdents d0/df/d1a 0
2026-03-09T00:03:36.420 INFO:tasks.workunit.client.1.vm06.stdout:8/277: read f3 [2200364,127052] 0
2026-03-09T00:03:36.421 INFO:tasks.workunit.client.1.vm06.stdout:7/281: symlink d0/l45 0
2026-03-09T00:03:36.421 INFO:tasks.workunit.client.1.vm06.stdout:7/282: truncate d0/df/d1a/d3a/d31/f34 603139 0
2026-03-09T00:03:36.421 INFO:tasks.workunit.client.1.vm06.stdout:8/278: mknod db/d1e/c56 0
2026-03-09T00:03:36.421 INFO:tasks.workunit.client.1.vm06.stdout:8/279: creat db/d1e/f57 x:0 0 0
2026-03-09T00:03:36.421 INFO:tasks.workunit.client.1.vm06.stdout:8/280: link db/d1e/f34 db/d1e/f58 0
2026-03-09T00:03:36.421 INFO:tasks.workunit.client.1.vm06.stdout:8/281: mknod db/dd/d24/d36/d38/c59 0
2026-03-09T00:03:36.453 INFO:tasks.workunit.client.1.vm06.stdout:6/275: dread d4/d16/f32 [0,4194304] 0
2026-03-09T00:03:36.453 INFO:tasks.workunit.client.1.vm06.stdout:6/276: chown d4/f2a 22 1
2026-03-09T00:03:36.453 INFO:tasks.workunit.client.0.vm03.stdout:4/64: dwrite f4 [4194304,4194304] 0
2026-03-09T00:03:36.456 INFO:tasks.workunit.client.0.vm03.stdout:1/93: dwrite d4/d6/f20 [0,4194304] 0
2026-03-09T00:03:36.456 INFO:tasks.workunit.client.0.vm03.stdout:1/94: fsync f2 0
2026-03-09T00:03:36.458 INFO:tasks.workunit.client.1.vm06.stdout:6/277: rename d4/d27/d3e/d45/l49 to d4/d27/l51 0
2026-03-09T00:03:36.460 INFO:tasks.workunit.client.1.vm06.stdout:6/278: unlink d4/d16/f1c 0
2026-03-09T00:03:36.461 INFO:tasks.workunit.client.0.vm03.stdout:9/77: dwrite f11 [0,4194304] 0
2026-03-09T00:03:36.463 INFO:tasks.workunit.client.1.vm06.stdout:6/279: mkdir d4/d27/d42/d52 0
2026-03-09T00:03:36.463 INFO:tasks.workunit.client.1.vm06.stdout:6/280: fdatasync d4/d27/d42/d4b/f50 0
2026-03-09T00:03:36.465 INFO:tasks.workunit.client.0.vm03.stdout:1/95: write f2 [623479,125353] 0
2026-03-09T00:03:36.470 INFO:tasks.workunit.client.0.vm03.stdout:9/78: symlink d15/l19 0
2026-03-09T00:03:36.476 INFO:tasks.workunit.client.0.vm03.stdout:5/64: fsync f15 0
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: Active manager daemon vm06.rzcvhn restarted
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: Activating manager daemon vm06.rzcvhn
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/crt"}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/key"}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: osdmap e41: 6 total, 6 up, 6 in
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: mgrmap e21: vm06.rzcvhn(active, starting, since 0.093046s)
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.sejksk"}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.ralade"}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.ixduim"}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vlrwtl"}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mgr metadata", "who": "vm06.rzcvhn", "id": "vm06.rzcvhn"}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-09T00:03:36.476 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:36 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-09T00:03:36.477 INFO:tasks.workunit.client.0.vm03.stdout:1/96: symlink d4/d15/l22 0
2026-03-09T00:03:36.478 INFO:tasks.workunit.client.1.vm06.stdout:2/354: dwrite d7/f8 [8388608,4194304] 0
2026-03-09T00:03:36.480 INFO:tasks.workunit.client.0.vm03.stdout:2/79: rmdir d8 39
2026-03-09T00:03:36.481 INFO:tasks.workunit.client.0.vm03.stdout:0/77: sync
2026-03-09T00:03:36.483 INFO:tasks.workunit.client.0.vm03.stdout:9/79: chown l6 269 1
2026-03-09T00:03:36.483 INFO:tasks.workunit.client.0.vm03.stdout:9/80: chown d15/c16 413 1
2026-03-09T00:03:36.485 INFO:tasks.workunit.client.1.vm06.stdout:2/355: creat d7/d1b/d5a/f62 x:0 0 0
2026-03-09T00:03:36.489 INFO:tasks.workunit.client.0.vm03.stdout:2/80: creat d8/d17/f1a x:0 0 0
2026-03-09T00:03:36.489 INFO:tasks.workunit.client.0.vm03.stdout:0/78: dread - d2/fe zero size
2026-03-09T00:03:36.491 INFO:tasks.workunit.client.0.vm03.stdout:0/79: dread f0 [0,4194304] 0
2026-03-09T00:03:36.492 INFO:tasks.workunit.client.1.vm06.stdout:2/356: dread d7/da/db/de/f11 [0,4194304] 0
2026-03-09T00:03:36.493 INFO:tasks.workunit.client.0.vm03.stdout:2/81: dread d8/f11 [0,4194304] 0
2026-03-09T00:03:36.494 INFO:tasks.workunit.client.1.vm06.stdout:2/357: chown d7/d1a/d3c/l42 0 1
2026-03-09T00:03:36.494 INFO:tasks.workunit.client.0.vm03.stdout:5/65: getdents . 0
2026-03-09T00:03:36.494 INFO:tasks.workunit.client.0.vm03.stdout:5/66: write f14 [816866,77644] 0
2026-03-09T00:03:36.499 INFO:tasks.workunit.client.0.vm03.stdout:2/82: dread d8/f10 [0,4194304] 0
2026-03-09T00:03:36.505 INFO:tasks.workunit.client.0.vm03.stdout:0/80: symlink d2/da/dd/l1d 0
2026-03-09T00:03:36.506 INFO:tasks.workunit.client.1.vm06.stdout:2/358: rmdir d7/da/d1c 39
2026-03-09T00:03:36.513 INFO:tasks.workunit.client.0.vm03.stdout:0/81: rename d2/da/fc to d2/f1e 0
2026-03-09T00:03:36.516 INFO:tasks.workunit.client.0.vm03.stdout:7/64: dwrite d2/d4/da/f11 [0,4194304] 0
2026-03-09T00:03:36.525 INFO:tasks.workunit.client.0.vm03.stdout:0/82: unlink d2/da/l19 0
2026-03-09T00:03:36.525 INFO:tasks.workunit.client.0.vm03.stdout:7/65: dread d2/d4/fb [0,4194304] 0
2026-03-09T00:03:36.525 INFO:tasks.workunit.client.0.vm03.stdout:7/66: chown d2/d4/d15 0 1
2026-03-09T00:03:36.529 INFO:tasks.workunit.client.1.vm06.stdout:1/258: dwrite d6/d21/f3d [0,4194304] 0
2026-03-09T00:03:36.536 INFO:tasks.workunit.client.1.vm06.stdout:4/248: dwrite d17/d24/f36 [0,4194304] 0
2026-03-09T00:03:36.536 INFO:tasks.workunit.client.1.vm06.stdout:4/249: rename d17/f1f to d17/d21/f4b 0
2026-03-09T00:03:36.543 INFO:tasks.workunit.client.0.vm03.stdout:0/83: mkdir d2/d1f 0
2026-03-09T00:03:36.547 INFO:tasks.workunit.client.0.vm03.stdout:0/84: unlink d2/l15 0
2026-03-09T00:03:36.548 INFO:tasks.workunit.client.0.vm03.stdout:0/85: mknod d2/da/d1a/c20 0
2026-03-09T00:03:36.548 INFO:tasks.workunit.client.0.vm03.stdout:0/86: readlink d2/da/dd/l1d 0
2026-03-09T00:03:36.550 INFO:tasks.workunit.client.0.vm03.stdout:0/87: mknod d2/d1f/c21 0
2026-03-09T00:03:36.554 INFO:tasks.workunit.client.1.vm06.stdout:8/282: dwrite db/dd/f27 [0,4194304] 0
2026-03-09T00:03:36.558 INFO:tasks.workunit.client.1.vm06.stdout:8/283: dread db/dd/f13 [0,4194304] 0
2026-03-09T00:03:36.559 INFO:tasks.workunit.client.1.vm06.stdout:4/250: mkdir d17/d21/d4c 0
2026-03-09T00:03:36.573 INFO:tasks.workunit.client.1.vm06.stdout:8/284: mknod db/dd/d24/d36/d38/d47/c5a 0
2026-03-09T00:03:36.573 INFO:tasks.workunit.client.1.vm06.stdout:8/285: rename f9 to db/dd/d24/d36/d38/f5b 0
2026-03-09T00:03:36.573 INFO:tasks.workunit.client.1.vm06.stdout:8/286: mkdir db/d53/d5c 0
2026-03-09T00:03:36.573 INFO:tasks.workunit.client.1.vm06.stdout:8/287: creat db/d1e/d46/f5d x:0 0 0
2026-03-09T00:03:36.573 INFO:tasks.workunit.client.1.vm06.stdout:8/288: read - db/d1e/f50 zero size
2026-03-09T00:03:36.573 INFO:tasks.workunit.client.1.vm06.stdout:8/289: readlink db/dd/l15 0
2026-03-09T00:03:36.573 INFO:tasks.workunit.client.1.vm06.stdout:8/290: rmdir db/dd/d24/d36/d38 39
2026-03-09T00:03:36.606 INFO:tasks.workunit.client.1.vm06.stdout:7/283: dwrite d0/df/d1a/d3a/d31/f32 [0,4194304] 0
2026-03-09T00:03:36.612 INFO:tasks.workunit.client.1.vm06.stdout:7/284: write d0/df/d17/f38 [3261866,65171] 0
2026-03-09T00:03:36.633 INFO:tasks.workunit.client.0.vm03.stdout:4/65: dwrite d7/fe [0,4194304] 0
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: Active manager daemon vm06.rzcvhn restarted
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: Activating manager daemon vm06.rzcvhn
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/crt"}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/key"}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: osdmap e41: 6 total, 6 up, 6 in
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: mgrmap e21: vm06.rzcvhn(active, starting, since 0.093046s)
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.sejksk"}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.ralade"}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.ixduim"}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vlrwtl"}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mgr metadata", "who": "vm06.rzcvhn", "id": "vm06.rzcvhn"}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-09T00:03:36.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:36 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-09T00:03:36.687 INFO:tasks.workunit.client.1.vm06.stdout:6/281: dwrite d4/f22 [0,4194304] 0
2026-03-09T00:03:36.689 INFO:tasks.workunit.client.1.vm06.stdout:6/282: mkdir d4/d16/d53 0
2026-03-09T00:03:36.693 INFO:tasks.workunit.client.1.vm06.stdout:6/283: dread d4/fc [0,4194304] 0
2026-03-09T00:03:36.699 INFO:tasks.workunit.client.1.vm06.stdout:6/284: chown d4/d27/l2f 12 1
2026-03-09T00:03:36.699 INFO:tasks.workunit.client.1.vm06.stdout:6/285: rename d4/d27/d3e/d45 to d4/d27/d3e/d45/d54 22
2026-03-09T00:03:36.701 INFO:tasks.workunit.client.1.vm06.stdout:2/359: dwrite d7/da/d1c/f5f [0,4194304] 0
2026-03-09T00:03:36.701 INFO:tasks.workunit.client.1.vm06.stdout:8/291: dwrite db/f28 [0,4194304] 0
2026-03-09T00:03:36.701 INFO:tasks.workunit.client.1.vm06.stdout:8/292: readlink db/d1e/l37 0
2026-03-09T00:03:36.701 INFO:tasks.workunit.client.1.vm06.stdout:8/293: read db/d1e/f2e [799350,125902] 0
2026-03-09T00:03:36.706 INFO:tasks.workunit.client.1.vm06.stdout:2/360: mkdir d7/da/d63 0
2026-03-09T00:03:36.706 INFO:tasks.workunit.client.1.vm06.stdout:2/361: fdatasync d7/d1b/f46 0
2026-03-09T00:03:36.709 INFO:tasks.workunit.client.1.vm06.stdout:8/294: read db/dd/f40 [3080603,42181] 0
2026-03-09T00:03:36.710 INFO:tasks.workunit.client.1.vm06.stdout:8/295: fsync db/f31 0
2026-03-09T00:03:36.716 INFO:tasks.workunit.client.0.vm03.stdout:7/67: dwrite d2/d4/fb [0,4194304] 0
2026-03-09T00:03:36.719 INFO:tasks.workunit.client.0.vm03.stdout:7/68: mknod d2/d4/c17 0
2026-03-09T00:03:36.719 INFO:tasks.workunit.client.0.vm03.stdout:7/69: readlink d2/l5 0
2026-03-09T00:03:36.719 INFO:tasks.workunit.client.0.vm03.stdout:7/70: chown d2/d4/da/f16 3586 1
2026-03-09T00:03:36.719 INFO:tasks.workunit.client.0.vm03.stdout:7/71: chown d2/l5 946 1
2026-03-09T00:03:36.719 INFO:tasks.workunit.client.0.vm03.stdout:7/72: getdents d2/d4/d15 0
2026-03-09T00:03:36.720 INFO:tasks.workunit.client.0.vm03.stdout:7/73: mknod d2/d4/da/c18 0
2026-03-09T00:03:36.720 INFO:tasks.workunit.client.0.vm03.stdout:7/74: getdents d2/d4/d15 0
2026-03-09T00:03:36.720 INFO:tasks.workunit.client.0.vm03.stdout:7/75: write d2/d4/da/f16 [1447007,31566] 0
2026-03-09T00:03:36.737 INFO:tasks.workunit.client.1.vm06.stdout:7/285: dwrite d0/df/d1a/f44 [4194304,4194304] 0
2026-03-09T00:03:36.738 INFO:tasks.workunit.client.1.vm06.stdout:7/286: dread d0/df/d17/f2d [0,4194304] 0
2026-03-09T00:03:36.739 INFO:tasks.workunit.client.0.vm03.stdout:4/66: dwrite d7/f8 [0,4194304] 0
2026-03-09T00:03:36.739 INFO:tasks.workunit.client.0.vm03.stdout:4/67: chown f4 17722 1
2026-03-09T00:03:36.744 INFO:tasks.workunit.client.0.vm03.stdout:0/88: dwrite d2/ff [0,4194304] 0
2026-03-09T00:03:36.746 INFO:tasks.workunit.client.1.vm06.stdout:1/259: dwrite d6/d21/f2e [0,4194304] 0
2026-03-09T00:03:36.746 INFO:tasks.workunit.client.1.vm06.stdout:1/260: write d6/f7 [3029484,70991] 0
2026-03-09T00:03:36.749 INFO:tasks.workunit.client.1.vm06.stdout:7/287: symlink d0/df/d1a/d3a/d31/l46 0
2026-03-09T00:03:36.751 INFO:tasks.workunit.client.1.vm06.stdout:1/261: write d6/d21/d2d/f44 [841667,47256] 0
2026-03-09T00:03:36.764 INFO:tasks.workunit.client.1.vm06.stdout:7/288: rename d0/df/d1a/c1e to d0/df/d1a/d27/c47 0
2026-03-09T00:03:36.771 INFO:tasks.workunit.client.1.vm06.stdout:0/304: truncate d3/d18/d2c/d2d/d31/f5d 1345476 0
2026-03-09T00:03:36.771 INFO:tasks.workunit.client.1.vm06.stdout:0/305: dread - d3/d18/d1f/f34 zero size
2026-03-09T00:03:36.771 INFO:tasks.workunit.client.1.vm06.stdout:0/306: stat d3/d18/d1f/d39/d3b/l61 0
2026-03-09T00:03:36.771 INFO:tasks.workunit.client.1.vm06.stdout:0/307: chown d3/d18/d28/d45/f48 1252 1
2026-03-09T00:03:36.771 INFO:tasks.workunit.client.1.vm06.stdout:0/308: fsync d3/d18/d1f/d39/d49/f4b 0
2026-03-09T00:03:36.771 INFO:tasks.workunit.client.1.vm06.stdout:0/309: creat d3/d18/d1f/d39/d3b/f66 x:0 0 0
2026-03-09T00:03:36.776 INFO:tasks.workunit.client.0.vm03.stdout:9/81: getdents d15 0
2026-03-09T00:03:36.779 INFO:tasks.workunit.client.0.vm03.stdout:1/97: rmdir d4/d15 39
2026-03-09T00:03:36.779 INFO:tasks.workunit.client.0.vm03.stdout:1/98: readlink d4/d6/l11 0
2026-03-09T00:03:36.783 INFO:tasks.workunit.client.1.vm06.stdout:2/362: unlink d7/f48 0
2026-03-09T00:03:36.784 INFO:tasks.workunit.client.1.vm06.stdout:6/286: dwrite d4/d16/f32 [0,4194304] 0
2026-03-09T00:03:36.784 INFO:tasks.workunit.client.1.vm06.stdout:6/287: chown d4/d27/l4c 1 1
2026-03-09T00:03:36.790 INFO:tasks.workunit.client.1.vm06.stdout:2/363: write d7/d1b/f5c [2346883,128194] 0
2026-03-09T00:03:36.790 INFO:tasks.workunit.client.1.vm06.stdout:2/364: fsync f3 0
2026-03-09T00:03:36.791 INFO:tasks.workunit.client.1.vm06.stdout:2/365: fdatasync f3 0
2026-03-09T00:03:36.793 INFO:tasks.workunit.client.1.vm06.stdout:7/289: unlink d0/c4 0
2026-03-09T00:03:36.793 INFO:tasks.workunit.client.1.vm06.stdout:7/290: fdatasync d0/df/d1a/f44 0
2026-03-09T00:03:36.794 INFO:tasks.workunit.client.0.vm03.stdout:3/52: sync
2026-03-09T00:03:36.805 INFO:tasks.workunit.client.0.vm03.stdout:7/76: truncate d2/d4/da/f11 3063985 0
2026-03-09T00:03:36.805 INFO:tasks.workunit.client.0.vm03.stdout:6/81: sync
2026-03-09T00:03:36.805 INFO:tasks.workunit.client.0.vm03.stdout:8/78: sync
2026-03-09T00:03:36.805 INFO:tasks.workunit.client.0.vm03.stdout:8/79: fdatasync d7/f11 0
2026-03-09T00:03:36.805 INFO:tasks.workunit.client.0.vm03.stdout:1/99: getdents d4/d15/d1a 0
2026-03-09T00:03:36.805 INFO:tasks.workunit.client.0.vm03.stdout:1/100: write f1 [1916281,94304] 0
2026-03-09T00:03:36.805 INFO:tasks.workunit.client.1.vm06.stdout:1/262: mknod d6/c56 0
2026-03-09T00:03:36.806 INFO:tasks.workunit.client.0.vm03.stdout:7/77: creat d2/d4/d15/f19 x:0 0 0
2026-03-09T00:03:36.806 INFO:tasks.workunit.client.0.vm03.stdout:7/78: stat d2/d4/da 0
2026-03-09T00:03:36.810 INFO:tasks.workunit.client.0.vm03.stdout:8/80: dread d7/f10 [0,4194304] 0
2026-03-09T00:03:36.817 INFO:tasks.workunit.client.1.vm06.stdout:2/366: symlink d7/d1b/l64 0
2026-03-09T00:03:36.818 INFO:tasks.workunit.client.0.vm03.stdout:6/82: getdents d13 0
2026-03-09T00:03:36.818 INFO:tasks.workunit.client.0.vm03.stdout:8/81: rmdir d7 39
2026-03-09T00:03:36.820 INFO:tasks.workunit.client.1.vm06.stdout:1/263: rmdir d6/d21/d2d/d37 39
2026-03-09T00:03:36.823 INFO:tasks.workunit.client.0.vm03.stdout:0/89: dwrite d2/fe [0,4194304] 0
2026-03-09T00:03:36.840 INFO:tasks.workunit.client.0.vm03.stdout:0/90: chown d2/da/l13 0 1
2026-03-09T00:03:36.840 INFO:tasks.workunit.client.0.vm03.stdout:0/91: fsync d2/da/f1b 0
2026-03-09T00:03:36.841 INFO:tasks.workunit.client.0.vm03.stdout:8/82: creat d7/f15 x:0 0 0
2026-03-09T00:03:36.841 INFO:tasks.workunit.client.0.vm03.stdout:8/83: write d7/f15 [608699,83615] 0
2026-03-09T00:03:36.841 INFO:tasks.workunit.client.0.vm03.stdout:0/92: creat d2/f22 x:0 0 0
2026-03-09T00:03:36.841 INFO:tasks.workunit.client.1.vm06.stdout:2/367: mknod d7/d1a/d3c/c65 0
2026-03-09T00:03:36.841 INFO:tasks.workunit.client.1.vm06.stdout:1/264: mknod d6/d21/d2d/d3b/d42/c57 0
2026-03-09T00:03:36.841 INFO:tasks.workunit.client.1.vm06.stdout:6/288: truncate d4/f40 2916453 0
2026-03-09T00:03:36.841 INFO:tasks.workunit.client.1.vm06.stdout:1/265: dread d6/f28 [0,4194304] 0
2026-03-09T00:03:36.844 INFO:tasks.workunit.client.0.vm03.stdout:8/84: mknod d7/df/c16 0
2026-03-09T00:03:36.850 INFO:tasks.workunit.client.1.vm06.stdout:6/289: link d4/f2a d4/d27/d3e/f55 0
2026-03-09T00:03:36.850 INFO:tasks.workunit.client.1.vm06.stdout:6/290: fdatasync d4/f38 0
2026-03-09T00:03:36.853 INFO:tasks.workunit.client.0.vm03.stdout:8/85: truncate f3 4496902 0
2026-03-09T00:03:36.856 INFO:tasks.workunit.client.1.vm06.stdout:0/310: dwrite d3/d18/d1f/f5e [4194304,4194304] 0
2026-03-09T00:03:36.865 INFO:tasks.workunit.client.0.vm03.stdout:8/86: symlink d7/df/l17 0
2026-03-09T00:03:36.865 INFO:tasks.workunit.client.1.vm06.stdout:0/311: symlink d3/d18/l67 0
2026-03-09T00:03:36.867 INFO:tasks.workunit.client.1.vm06.stdout:0/312: creat d3/d18/f68 x:0 0 0
2026-03-09T00:03:36.867 INFO:tasks.workunit.client.1.vm06.stdout:0/313: readlink d3/d18/l27 0
2026-03-09T00:03:36.869 INFO:tasks.workunit.client.1.vm06.stdout:0/314: mkdir d3/d18/d1f/d39/d69 0
2026-03-09T00:03:36.870 INFO:tasks.workunit.client.1.vm06.stdout:0/315: mkdir d3/d18/d1f/d44/d6a 0
2026-03-09T00:03:36.870 INFO:tasks.workunit.client.1.vm06.stdout:0/316: chown d3/d18/f25 197776 1
2026-03-09T00:03:36.871 INFO:tasks.workunit.client.1.vm06.stdout:0/317: creat d3/d18/d2c/f6b x:0 0 0
2026-03-09T00:03:36.926 INFO:tasks.workunit.client.0.vm03.stdout:5/67: dwrite fb [0,4194304] 0
2026-03-09T00:03:36.929 INFO:tasks.workunit.client.0.vm03.stdout:5/68: unlink l10 0
2026-03-09T00:03:36.929 INFO:tasks.workunit.client.0.vm03.stdout:5/69: stat c16 0
2026-03-09T00:03:36.946 INFO:tasks.workunit.client.0.vm03.stdout:9/82: dwrite f10 [0,4194304] 0
2026-03-09T00:03:36.949 INFO:tasks.workunit.client.0.vm03.stdout:9/83: link l12 d15/l1a 0
2026-03-09T00:03:36.949 INFO:tasks.workunit.client.0.vm03.stdout:9/84: creat d15/f1b x:0 0 0
2026-03-09T00:03:36.950 INFO:tasks.workunit.client.0.vm03.stdout:9/85: truncate f8 753975 0
2026-03-09T00:03:36.950 INFO:tasks.workunit.client.0.vm03.stdout:9/86: truncate d15/f17 1025352 0
INFO:tasks.workunit.client.0.vm03.stdout:9/87: fdatasync f11 0 2026-03-09T00:03:36.951 INFO:tasks.workunit.client.0.vm03.stdout:9/88: dread fd [0,4194304] 0 2026-03-09T00:03:36.952 INFO:tasks.workunit.client.0.vm03.stdout:9/89: mkdir d15/d1c 0 2026-03-09T00:03:36.963 INFO:tasks.workunit.client.0.vm03.stdout:3/53: dwrite d2/f8 [0,4194304] 0 2026-03-09T00:03:36.963 INFO:tasks.workunit.client.0.vm03.stdout:3/54: write d2/fd [183463,72859] 0 2026-03-09T00:03:36.963 INFO:tasks.workunit.client.0.vm03.stdout:3/55: stat d2/f6 0 2026-03-09T00:03:36.963 INFO:tasks.workunit.client.0.vm03.stdout:4/68: dwrite f4 [8388608,4194304] 0 2026-03-09T00:03:36.963 INFO:tasks.workunit.client.0.vm03.stdout:3/56: read d2/fc [6648,101409] 0 2026-03-09T00:03:36.963 INFO:tasks.workunit.client.0.vm03.stdout:3/57: stat d2/f6 0 2026-03-09T00:03:36.967 INFO:tasks.workunit.client.0.vm03.stdout:4/69: creat d7/f10 x:0 0 0 2026-03-09T00:03:36.969 INFO:tasks.workunit.client.0.vm03.stdout:4/70: symlink d7/l11 0 2026-03-09T00:03:36.970 INFO:tasks.workunit.client.0.vm03.stdout:3/58: symlink d2/l12 0 2026-03-09T00:03:37.006 INFO:tasks.workunit.client.1.vm06.stdout:6/291: dwrite d4/d27/f31 [0,4194304] 0 2026-03-09T00:03:37.006 INFO:tasks.workunit.client.1.vm06.stdout:6/292: getdents d4/d27/d42/d52 0 2026-03-09T00:03:37.006 INFO:tasks.workunit.client.1.vm06.stdout:6/293: truncate d4/d27/d3e/f44 2433373 0 2026-03-09T00:03:37.006 INFO:tasks.workunit.client.0.vm03.stdout:6/83: dwrite f12 [4194304,4194304] 0 2026-03-09T00:03:37.006 INFO:tasks.workunit.client.0.vm03.stdout:6/84: chown fb 61 1 2026-03-09T00:03:37.006 INFO:tasks.workunit.client.0.vm03.stdout:9/90: dwrite fd [0,4194304] 0 2026-03-09T00:03:37.014 INFO:tasks.workunit.client.0.vm03.stdout:1/101: dwrite f2 [0,4194304] 0 2026-03-09T00:03:37.017 INFO:tasks.workunit.client.0.vm03.stdout:1/102: dread d4/d6/f8 [0,4194304] 0 2026-03-09T00:03:37.024 INFO:tasks.workunit.client.0.vm03.stdout:6/85: mknod d13/c19 0 2026-03-09T00:03:37.028 INFO:tasks.workunit.client.0.vm03.stdout:6/86: readlink l11 0 2026-03-09T00:03:37.030 INFO:tasks.workunit.client.1.vm06.stdout:6/294: getdents d4/d16 0 2026-03-09T00:03:37.033 INFO:tasks.workunit.client.1.vm06.stdout:6/295: dread d4/f2d [0,4194304] 0 2026-03-09T00:03:37.046 INFO:tasks.workunit.client.0.vm03.stdout:0/93: dwrite d2/f1e [0,4194304] 0 2026-03-09T00:03:37.047 INFO:tasks.workunit.client.1.vm06.stdout:2/368: dwrite f6 [0,4194304] 0 2026-03-09T00:03:37.048 INFO:tasks.workunit.client.0.vm03.stdout:0/94: rename d2/da/dd/c17 to d2/da/c23 0 2026-03-09T00:03:37.048 INFO:tasks.workunit.client.0.vm03.stdout:9/91: dwrite fc [0,4194304] 0 2026-03-09T00:03:37.049 INFO:tasks.workunit.client.0.vm03.stdout:0/95: truncate d2/fb 630782 0 2026-03-09T00:03:37.049 INFO:tasks.workunit.client.0.vm03.stdout:0/96: write f0 [280270,55792] 0 2026-03-09T00:03:37.053 INFO:tasks.workunit.client.0.vm03.stdout:7/79: write d2/fc [1695169,51088] 0 2026-03-09T00:03:37.053 INFO:tasks.workunit.client.0.vm03.stdout:7/80: chown d2 1253800574 1 2026-03-09T00:03:37.055 INFO:tasks.workunit.client.0.vm03.stdout:4/71: write d7/fc [598116,62258] 0 2026-03-09T00:03:37.055 INFO:tasks.workunit.client.0.vm03.stdout:4/72: creat d7/f12 x:0 0 0 2026-03-09T00:03:37.055 INFO:tasks.workunit.client.0.vm03.stdout:4/73: readlink d7/l11 0 2026-03-09T00:03:37.058 INFO:tasks.workunit.client.1.vm06.stdout:2/369: mkdir d7/d1a/d25/d66 0 2026-03-09T00:03:37.074 INFO:tasks.workunit.client.0.vm03.stdout:0/97: write d2/ff [3481213,111698] 0 2026-03-09T00:03:37.077 
INFO:tasks.workunit.client.0.vm03.stdout:4/74: symlink d7/l13 0 2026-03-09T00:03:37.077 INFO:tasks.workunit.client.0.vm03.stdout:4/75: chown f4 14 1 2026-03-09T00:03:37.083 INFO:tasks.workunit.client.0.vm03.stdout:0/98: link d2/f1e d2/da/dd/f24 0 2026-03-09T00:03:37.083 INFO:tasks.workunit.client.0.vm03.stdout:0/99: creat d2/da/d1a/f25 x:0 0 0 2026-03-09T00:03:37.084 INFO:tasks.workunit.client.0.vm03.stdout:4/76: rename d7/l13 to d7/l14 0 2026-03-09T00:03:37.084 INFO:tasks.workunit.client.0.vm03.stdout:4/77: fdatasync f4 0 2026-03-09T00:03:37.084 INFO:tasks.workunit.client.0.vm03.stdout:4/78: creat d7/f15 x:0 0 0 2026-03-09T00:03:37.089 INFO:tasks.workunit.client.0.vm03.stdout:0/100: symlink d2/d1f/l26 0 2026-03-09T00:03:37.096 INFO:tasks.workunit.client.0.vm03.stdout:4/79: link d7/fd d7/f16 0 2026-03-09T00:03:37.104 INFO:tasks.workunit.client.0.vm03.stdout:6/87: dwrite fb [0,4194304] 0 2026-03-09T00:03:37.108 INFO:tasks.workunit.client.0.vm03.stdout:0/101: write d2/f1e [1610257,71924] 0 2026-03-09T00:03:37.108 INFO:tasks.workunit.client.0.vm03.stdout:0/102: read - d2/da/d1a/f25 zero size 2026-03-09T00:03:37.126 INFO:tasks.workunit.client.1.vm06.stdout:2/370: dwrite d7/d1a/d56/f50 [0,4194304] 0 2026-03-09T00:03:37.152 INFO:tasks.workunit.client.1.vm06.stdout:9/221: sync 2026-03-09T00:03:37.152 INFO:tasks.workunit.client.1.vm06.stdout:5/375: sync 2026-03-09T00:03:37.152 INFO:tasks.workunit.client.1.vm06.stdout:5/376: readlink d5/l8 0 2026-03-09T00:03:37.152 INFO:tasks.workunit.client.1.vm06.stdout:5/377: dread - d5/d1c/d21/f73 zero size 2026-03-09T00:03:37.152 INFO:tasks.workunit.client.1.vm06.stdout:5/378: write d5/d1c/d21/d28/f57 [672189,39267] 0 2026-03-09T00:03:37.152 INFO:tasks.workunit.client.1.vm06.stdout:3/256: sync 2026-03-09T00:03:37.153 INFO:tasks.workunit.client.0.vm03.stdout:4/80: dwrite d7/f10 [0,4194304] 0 2026-03-09T00:03:37.153 INFO:tasks.workunit.client.0.vm03.stdout:7/81: dwrite d2/d4/fb [0,4194304] 0 2026-03-09T00:03:37.156 INFO:tasks.workunit.client.0.vm03.stdout:7/82: dread d2/f3 [0,4194304] 0 2026-03-09T00:03:37.156 INFO:tasks.workunit.client.0.vm03.stdout:7/83: creat d2/d4/d15/f1a x:0 0 0 2026-03-09T00:03:37.162 INFO:tasks.workunit.client.0.vm03.stdout:2/83: sync 2026-03-09T00:03:37.162 INFO:tasks.workunit.client.0.vm03.stdout:2/84: truncate d8/f18 1148957 0 2026-03-09T00:03:37.162 INFO:tasks.workunit.client.0.vm03.stdout:2/85: write f6 [1343240,27394] 0 2026-03-09T00:03:37.164 INFO:tasks.workunit.client.1.vm06.stdout:9/222: mkdir d1/d3/d12/d48 0 2026-03-09T00:03:37.165 INFO:tasks.workunit.client.1.vm06.stdout:8/296: sync 2026-03-09T00:03:37.165 INFO:tasks.workunit.client.1.vm06.stdout:4/251: sync 2026-03-09T00:03:37.166 INFO:tasks.workunit.client.1.vm06.stdout:3/257: rename d11/c14 to d11/d3f/c51 0 2026-03-09T00:03:37.176 INFO:tasks.workunit.client.0.vm03.stdout:4/81: unlink d7/f16 0 2026-03-09T00:03:37.181 INFO:tasks.workunit.client.0.vm03.stdout:4/82: write d7/fe [4422521,28338] 0 2026-03-09T00:03:37.182 INFO:tasks.workunit.client.1.vm06.stdout:9/223: mkdir d1/d3/d12/d49 0 2026-03-09T00:03:37.182 INFO:tasks.workunit.client.1.vm06.stdout:9/224: stat d1/f16 0 2026-03-09T00:03:37.182 INFO:tasks.workunit.client.1.vm06.stdout:9/225: creat d1/d3/d12/d21/d14/d25/f4a x:0 0 0 2026-03-09T00:03:37.182 INFO:tasks.workunit.client.1.vm06.stdout:5/379: getdents d5/d1c/d21/d28/d35 0 2026-03-09T00:03:37.186 INFO:tasks.workunit.client.1.vm06.stdout:6/296: fsync d4/d27/d3e/f55 0 2026-03-09T00:03:37.187 INFO:tasks.workunit.client.1.vm06.stdout:6/297: dread - d4/f38 zero size 
2026-03-09T00:03:37.191 INFO:tasks.workunit.client.1.vm06.stdout:4/252: link d17/d21/d32/c34 d17/d21/c4d 0 2026-03-09T00:03:37.192 INFO:tasks.workunit.client.1.vm06.stdout:0/318: rmdir d3/d18/d1f/d44 39 2026-03-09T00:03:37.193 INFO:tasks.workunit.client.1.vm06.stdout:8/297: rename db/dd/d24/d36/d38/c4c to db/dd/d24/d36/d38/d47/c5e 0 2026-03-09T00:03:37.200 INFO:tasks.workunit.client.1.vm06.stdout:9/226: symlink d1/d4/l4b 0 2026-03-09T00:03:37.200 INFO:tasks.workunit.client.1.vm06.stdout:9/227: fsync d1/d3/d12/d21/d9/f10 0 2026-03-09T00:03:37.201 INFO:tasks.workunit.client.0.vm03.stdout:9/92: write f8 [1077754,130680] 0 2026-03-09T00:03:37.201 INFO:tasks.workunit.client.0.vm03.stdout:9/93: dread - d15/f1b zero size 2026-03-09T00:03:37.204 INFO:tasks.workunit.client.0.vm03.stdout:7/84: dwrite d2/d4/d15/f1a [0,4194304] 0 2026-03-09T00:03:37.208 INFO:tasks.workunit.client.1.vm06.stdout:3/258: dwrite d11/d28/d2e/d2f/f49 [0,4194304] 0 2026-03-09T00:03:37.208 INFO:tasks.workunit.client.1.vm06.stdout:3/259: dread - d11/d3f/f4c zero size 2026-03-09T00:03:37.209 INFO:tasks.workunit.client.0.vm03.stdout:3/59: getdents d2 0 2026-03-09T00:03:37.211 INFO:tasks.workunit.client.1.vm06.stdout:5/380: creat d5/d1c/d21/d28/d5e/d66/d78/f7c x:0 0 0 2026-03-09T00:03:37.214 INFO:tasks.workunit.client.0.vm03.stdout:3/60: dread d2/f8 [0,4194304] 0 2026-03-09T00:03:37.214 INFO:tasks.workunit.client.0.vm03.stdout:3/61: stat d2/db/f10 0 2026-03-09T00:03:37.226 INFO:tasks.workunit.client.1.vm06.stdout:6/298: symlink d4/d27/d3e/l56 0 2026-03-09T00:03:37.226 INFO:tasks.workunit.client.1.vm06.stdout:6/299: readlink d4/d27/d3e/l56 0 2026-03-09T00:03:37.226 INFO:tasks.workunit.client.0.vm03.stdout:9/94: symlink d15/l1d 0 2026-03-09T00:03:37.226 INFO:tasks.workunit.client.1.vm06.stdout:0/319: symlink d3/d18/l6c 0 2026-03-09T00:03:37.227 INFO:tasks.workunit.client.0.vm03.stdout:6/88: getdents d13 0 2026-03-09T00:03:37.228 INFO:tasks.workunit.client.1.vm06.stdout:0/320: dread d3/d18/d2c/d2d/d31/f4f [0,4194304] 0 2026-03-09T00:03:37.234 INFO:tasks.workunit.client.0.vm03.stdout:2/86: mkdir d8/d1b 0 2026-03-09T00:03:37.241 INFO:tasks.workunit.client.0.vm03.stdout:2/87: write d8/f15 [21826,67576] 0 2026-03-09T00:03:37.242 INFO:tasks.workunit.client.0.vm03.stdout:4/83: dread d7/fd [4194304,4194304] 0 2026-03-09T00:03:37.242 INFO:tasks.workunit.client.1.vm06.stdout:1/266: sync 2026-03-09T00:03:37.242 INFO:tasks.workunit.client.1.vm06.stdout:7/291: sync 2026-03-09T00:03:37.242 INFO:tasks.workunit.client.1.vm06.stdout:4/253: rename d17/d21/l27 to d17/d24/d49/l4e 0 2026-03-09T00:03:37.242 INFO:tasks.workunit.client.1.vm06.stdout:8/298: rmdir db/dd/d24/d36/d38/d47 39 2026-03-09T00:03:37.242 INFO:tasks.workunit.client.1.vm06.stdout:8/299: read db/d1e/f58 [3384871,108120] 0 2026-03-09T00:03:37.242 INFO:tasks.workunit.client.1.vm06.stdout:8/300: write db/dd/f40 [3674477,92572] 0 2026-03-09T00:03:37.242 INFO:tasks.workunit.client.1.vm06.stdout:8/301: creat db/d1e/f5f x:0 0 0 2026-03-09T00:03:37.244 INFO:tasks.workunit.client.1.vm06.stdout:5/381: creat d5/d1c/d21/d28/d35/d49/f7d x:0 0 0 2026-03-09T00:03:37.248 INFO:tasks.workunit.client.0.vm03.stdout:2/88: write d8/fb [2181740,125686] 0 2026-03-09T00:03:37.248 INFO:tasks.workunit.client.0.vm03.stdout:2/89: creat d8/d17/f1c x:0 0 0 2026-03-09T00:03:37.248 INFO:tasks.workunit.client.0.vm03.stdout:2/90: creat d8/d17/f1d x:0 0 0 2026-03-09T00:03:37.250 INFO:tasks.workunit.client.0.vm03.stdout:4/84: mknod d7/c17 0 2026-03-09T00:03:37.251 INFO:tasks.workunit.client.1.vm06.stdout:0/321: symlink 
d3/d18/d28/d45/l6d 0 2026-03-09T00:03:37.254 INFO:tasks.workunit.client.0.vm03.stdout:0/103: rmdir d2/da 39 2026-03-09T00:03:37.255 INFO:tasks.workunit.client.1.vm06.stdout:7/292: getdents d0/df/d1a/d3a/d31 0 2026-03-09T00:03:37.255 INFO:tasks.workunit.client.1.vm06.stdout:7/293: creat d0/d39/f48 x:0 0 0 2026-03-09T00:03:37.264 INFO:tasks.workunit.client.1.vm06.stdout:9/228: rename d1/d3/d12/f3b to d1/d3/d12/d21/d9/f4c 0 2026-03-09T00:03:37.264 INFO:tasks.workunit.client.1.vm06.stdout:9/229: write d1/d3/d12/d21/d9/f10 [4406516,44507] 0 2026-03-09T00:03:37.265 INFO:tasks.workunit.client.0.vm03.stdout:4/85: mknod d7/c18 0 2026-03-09T00:03:37.265 INFO:tasks.workunit.client.0.vm03.stdout:2/91: dread d8/f11 [0,4194304] 0 2026-03-09T00:03:37.269 INFO:tasks.workunit.client.0.vm03.stdout:2/92: write d8/f18 [80050,40643] 0 2026-03-09T00:03:37.272 INFO:tasks.workunit.client.1.vm06.stdout:2/371: sync 2026-03-09T00:03:37.273 INFO:tasks.workunit.client.1.vm06.stdout:4/254: symlink d17/d24/l4f 0 2026-03-09T00:03:37.282 INFO:tasks.workunit.client.0.vm03.stdout:0/104: mknod d2/da/c27 0 2026-03-09T00:03:37.284 INFO:tasks.workunit.client.0.vm03.stdout:3/62: dwrite d2/f11 [0,4194304] 0 2026-03-09T00:03:37.293 INFO:tasks.workunit.client.0.vm03.stdout:2/93: rename d8/f10 to d8/d1b/f1e 0 2026-03-09T00:03:37.296 INFO:tasks.workunit.client.0.vm03.stdout:2/94: write d8/d17/f1d [696333,13951] 0 2026-03-09T00:03:37.298 INFO:tasks.workunit.client.1.vm06.stdout:8/302: rename db/d1e/f57 to db/d1e/f60 0 2026-03-09T00:03:37.298 INFO:tasks.workunit.client.0.vm03.stdout:0/105: mknod d2/d1f/c28 0 2026-03-09T00:03:37.299 INFO:tasks.workunit.client.1.vm06.stdout:5/382: link d5/d1c/d23/f42 d5/d1c/d21/d28/d5e/d66/d78/f7e 0 2026-03-09T00:03:37.300 INFO:tasks.workunit.client.1.vm06.stdout:0/322: creat d3/d18/d1f/d39/f6e x:0 0 0 2026-03-09T00:03:37.301 INFO:tasks.workunit.client.1.vm06.stdout:1/267: symlink d6/l58 0 2026-03-09T00:03:37.301 INFO:tasks.workunit.client.0.vm03.stdout:2/95: creat d8/d1b/f1f x:0 0 0 2026-03-09T00:03:37.301 INFO:tasks.workunit.client.0.vm03.stdout:2/96: creat d8/d17/f20 x:0 0 0 2026-03-09T00:03:37.307 INFO:tasks.workunit.client.0.vm03.stdout:0/106: mknod d2/da/d1a/c29 0 2026-03-09T00:03:37.324 INFO:tasks.workunit.client.1.vm06.stdout:7/294: mknod d0/c49 0 2026-03-09T00:03:37.324 INFO:tasks.workunit.client.1.vm06.stdout:2/372: mkdir d7/da/d55/d67 0 2026-03-09T00:03:37.324 INFO:tasks.workunit.client.1.vm06.stdout:7/295: dread d0/df/d1a/d22/f28 [0,4194304] 0 2026-03-09T00:03:37.324 INFO:tasks.workunit.client.1.vm06.stdout:7/296: chown d0/df/d1a/d22/c2a 86247 1 2026-03-09T00:03:37.324 INFO:tasks.workunit.client.1.vm06.stdout:1/268: creat d6/d21/d2d/d3b/d42/d43/d4d/f59 x:0 0 0 2026-03-09T00:03:37.324 INFO:tasks.workunit.client.1.vm06.stdout:2/373: chown d7/f8 239 1 2026-03-09T00:03:37.324 INFO:tasks.workunit.client.1.vm06.stdout:2/374: chown d7/da/l15 0 1 2026-03-09T00:03:37.324 INFO:tasks.workunit.client.1.vm06.stdout:7/297: mknod d0/df/d1a/d3a/d31/d40/c4a 0 2026-03-09T00:03:37.324 INFO:tasks.workunit.client.1.vm06.stdout:8/303: mknod db/c61 0 2026-03-09T00:03:37.324 INFO:tasks.workunit.client.1.vm06.stdout:1/269: mknod d6/d21/d2d/c5a 0 2026-03-09T00:03:37.324 INFO:tasks.workunit.client.1.vm06.stdout:7/298: creat d0/df/d1a/d27/f4b x:0 0 0 2026-03-09T00:03:37.324 INFO:tasks.workunit.client.1.vm06.stdout:7/299: getdents d0/df/d1a/d35 0 2026-03-09T00:03:37.324 INFO:tasks.workunit.client.1.vm06.stdout:8/304: mkdir db/dd/d24/d36/d38/d47/d62 0 2026-03-09T00:03:37.324 INFO:tasks.workunit.client.1.vm06.stdout:2/375: 
symlink d7/l68 0 2026-03-09T00:03:37.324 INFO:tasks.workunit.client.1.vm06.stdout:2/376: chown d7/d1b/d31 13709 1 2026-03-09T00:03:37.324 INFO:tasks.workunit.client.1.vm06.stdout:2/377: readlink d7/d1a/l2a 0 2026-03-09T00:03:37.391 INFO:tasks.workunit.client.0.vm03.stdout:9/95: read fc [2608571,95801] 0 2026-03-09T00:03:37.391 INFO:tasks.workunit.client.0.vm03.stdout:9/96: write d15/f1b [997203,53235] 0 2026-03-09T00:03:37.399 INFO:tasks.workunit.client.0.vm03.stdout:6/89: dwrite f8 [0,4194304] 0 2026-03-09T00:03:37.399 INFO:tasks.workunit.client.1.vm06.stdout:3/260: dwrite d11/d28/f3a [0,4194304] 0 2026-03-09T00:03:37.400 INFO:tasks.workunit.client.0.vm03.stdout:6/90: write f7 [4389971,106458] 0 2026-03-09T00:03:37.401 INFO:tasks.workunit.client.1.vm06.stdout:6/300: dwrite d4/f36 [4194304,4194304] 0 2026-03-09T00:03:37.402 INFO:tasks.workunit.client.0.vm03.stdout:7/85: dwrite d2/d4/da/f16 [0,4194304] 0 2026-03-09T00:03:37.403 INFO:tasks.workunit.client.0.vm03.stdout:7/86: mknod d2/c1b 0 2026-03-09T00:03:37.403 INFO:tasks.workunit.client.0.vm03.stdout:7/87: stat d2/d4/da/f16 0 2026-03-09T00:03:37.403 INFO:tasks.workunit.client.0.vm03.stdout:7/88: fdatasync d2/d4/d15/f19 0 2026-03-09T00:03:37.403 INFO:tasks.workunit.client.0.vm03.stdout:7/89: dread - d2/d4/d15/f19 zero size 2026-03-09T00:03:37.404 INFO:tasks.workunit.client.0.vm03.stdout:7/90: mknod d2/d4/c1c 0 2026-03-09T00:03:37.404 INFO:tasks.workunit.client.0.vm03.stdout:7/91: write d2/d4/d15/f19 [925249,68555] 0 2026-03-09T00:03:37.411 INFO:tasks.workunit.client.0.vm03.stdout:7/92: mknod d2/c1d 0 2026-03-09T00:03:37.416 INFO:tasks.workunit.client.0.vm03.stdout:7/93: truncate d2/d4/fb 3957516 0 2026-03-09T00:03:37.416 INFO:tasks.workunit.client.0.vm03.stdout:7/94: chown d2/c8 21917 1 2026-03-09T00:03:37.431 INFO:tasks.workunit.client.1.vm06.stdout:5/383: dwrite d5/d1c/d68/f3f [0,4194304] 0 2026-03-09T00:03:37.431 INFO:tasks.workunit.client.1.vm06.stdout:5/384: write d5/d1c/d21/d28/f3b [3765786,95044] 0 2026-03-09T00:03:37.431 INFO:tasks.workunit.client.1.vm06.stdout:5/385: dread - d5/d44/d4b/f6c zero size 2026-03-09T00:03:37.431 INFO:tasks.workunit.client.1.vm06.stdout:5/386: chown d5/d1c/d21/d28/d35 1492 1 2026-03-09T00:03:37.436 INFO:tasks.workunit.client.0.vm03.stdout:3/63: dwrite d2/f11 [0,4194304] 0 2026-03-09T00:03:37.506 INFO:tasks.workunit.client.1.vm06.stdout:8/305: dwrite db/dd/f40 [0,4194304] 0 2026-03-09T00:03:37.507 INFO:tasks.workunit.client.1.vm06.stdout:8/306: fdatasync db/d1e/f20 0 2026-03-09T00:03:37.509 INFO:tasks.workunit.client.1.vm06.stdout:8/307: mkdir db/dd/d24/d63 0 2026-03-09T00:03:37.509 INFO:tasks.workunit.client.1.vm06.stdout:8/308: chown db/d1e/l39 209699826 1 2026-03-09T00:03:37.509 INFO:tasks.workunit.client.1.vm06.stdout:8/309: dread - db/d1e/f52 zero size 2026-03-09T00:03:37.509 INFO:tasks.workunit.client.1.vm06.stdout:8/310: write db/d1e/f25 [123254,96091] 0 2026-03-09T00:03:37.512 INFO:tasks.workunit.client.1.vm06.stdout:8/311: creat db/dd/f64 x:0 0 0 2026-03-09T00:03:37.512 INFO:tasks.workunit.client.1.vm06.stdout:8/312: fdatasync db/f1d 0 2026-03-09T00:03:37.513 INFO:tasks.workunit.client.1.vm06.stdout:8/313: readlink db/l26 0 2026-03-09T00:03:37.515 INFO:tasks.workunit.client.1.vm06.stdout:8/314: creat db/dd/d24/d36/d38/d4d/f65 x:0 0 0 2026-03-09T00:03:37.520 INFO:tasks.workunit.client.1.vm06.stdout:8/315: rmdir db/dd/d24/d36/d38/d47/d62 0 2026-03-09T00:03:37.529 INFO:tasks.workunit.client.1.vm06.stdout:7/300: dwrite d0/f2 [4194304,4194304] 0 2026-03-09T00:03:37.529 
INFO:tasks.workunit.client.1.vm06.stdout:7/301: chown d0/c49 19554 1 2026-03-09T00:03:37.533 INFO:tasks.workunit.client.0.vm03.stdout:4/86: dwrite d7/fd [4194304,4194304] 0 2026-03-09T00:03:37.533 INFO:tasks.workunit.client.0.vm03.stdout:4/87: fdatasync d7/fc 0 2026-03-09T00:03:37.534 INFO:tasks.workunit.client.0.vm03.stdout:6/91: dwrite f10 [0,4194304] 0 2026-03-09T00:03:37.534 INFO:tasks.workunit.client.0.vm03.stdout:6/92: dread - d13/f17 zero size 2026-03-09T00:03:37.540 INFO:tasks.workunit.client.0.vm03.stdout:9/97: getdents d15 0 2026-03-09T00:03:37.547 INFO:tasks.workunit.client.0.vm03.stdout:7/95: truncate d2/d4/d15/f1a 2342668 0 2026-03-09T00:03:37.551 INFO:tasks.workunit.client.0.vm03.stdout:9/98: dread f11 [0,4194304] 0 2026-03-09T00:03:37.554 INFO:tasks.workunit.client.0.vm03.stdout:9/99: write f11 [790654,48359] 0 2026-03-09T00:03:37.564 INFO:tasks.workunit.client.0.vm03.stdout:3/64: dwrite d2/f8 [0,4194304] 0 2026-03-09T00:03:37.564 INFO:tasks.workunit.client.0.vm03.stdout:3/65: stat d2/f9 0 2026-03-09T00:03:37.564 INFO:tasks.workunit.client.0.vm03.stdout:3/66: fdatasync f1 0 2026-03-09T00:03:37.566 INFO:tasks.workunit.client.0.vm03.stdout:4/88: dwrite d7/fc [0,4194304] 0 2026-03-09T00:03:37.572 INFO:tasks.workunit.client.1.vm06.stdout:5/387: dwrite d5/d1c/d21/d28/d35/d49/f7d [0,4194304] 0 2026-03-09T00:03:37.581 INFO:tasks.workunit.client.0.vm03.stdout:6/93: dread f8 [0,4194304] 0 2026-03-09T00:03:37.585 INFO:tasks.workunit.client.1.vm06.stdout:2/378: dwrite d7/da/d1c/f29 [4194304,4194304] 0 2026-03-09T00:03:37.585 INFO:tasks.workunit.client.0.vm03.stdout:9/100: rmdir d15 39 2026-03-09T00:03:37.587 INFO:tasks.workunit.client.0.vm03.stdout:2/97: rmdir d8/d17 39 2026-03-09T00:03:37.587 INFO:tasks.workunit.client.0.vm03.stdout:2/98: write d8/d17/f1a [539404,3970] 0 2026-03-09T00:03:37.594 INFO:tasks.workunit.client.1.vm06.stdout:1/270: rmdir d6/d21/d2d/d3b/d42/d43 39 2026-03-09T00:03:37.595 INFO:tasks.workunit.client.1.vm06.stdout:7/302: rename d0/df/d1a/d3a/d31 to d0/df/d1a/d27/d4c 0 2026-03-09T00:03:37.595 INFO:tasks.workunit.client.0.vm03.stdout:8/87: truncate f3 1372663 0 2026-03-09T00:03:37.601 INFO:tasks.workunit.client.0.vm03.stdout:7/96: getdents d2/d4/da 0 2026-03-09T00:03:37.609 INFO:tasks.workunit.client.0.vm03.stdout:7/97: chown d2/l5 25833343 1 2026-03-09T00:03:37.609 INFO:tasks.workunit.client.1.vm06.stdout:1/271: creat d6/d21/d2d/f5b x:0 0 0 2026-03-09T00:03:37.609 INFO:tasks.workunit.client.1.vm06.stdout:2/379: write d7/f8 [109803,128893] 0 2026-03-09T00:03:37.609 INFO:tasks.workunit.client.0.vm03.stdout:8/88: dread f6 [4194304,4194304] 0 2026-03-09T00:03:37.616 INFO:tasks.workunit.client.1.vm06.stdout:6/301: dwrite d4/f5 [4194304,4194304] 0 2026-03-09T00:03:37.617 INFO:tasks.workunit.client.0.vm03.stdout:7/98: write d2/d4/da/f16 [928258,85595] 0 2026-03-09T00:03:37.617 INFO:tasks.workunit.client.0.vm03.stdout:7/99: write d2/f3 [4437682,51146] 0 2026-03-09T00:03:37.625 INFO:tasks.workunit.client.0.vm03.stdout:6/94: dwrite d13/f17 [0,4194304] 0 2026-03-09T00:03:37.630 INFO:tasks.workunit.client.0.vm03.stdout:3/67: dwrite d2/f11 [0,4194304] 0 2026-03-09T00:03:37.633 INFO:tasks.workunit.client.0.vm03.stdout:6/95: dread f2 [0,4194304] 0 2026-03-09T00:03:37.634 INFO:tasks.workunit.client.1.vm06.stdout:8/316: rmdir db 39 2026-03-09T00:03:37.644 INFO:tasks.workunit.client.1.vm06.stdout:5/388: rmdir d5/d1c 39 2026-03-09T00:03:37.646 INFO:tasks.workunit.client.1.vm06.stdout:1/272: link d6/d21/f2e d6/d21/d2d/d3b/d42/d43/d4d/f5c 0 2026-03-09T00:03:37.655 
INFO:tasks.workunit.client.0.vm03.stdout:8/89: creat d7/f18 x:0 0 0 2026-03-09T00:03:37.656 INFO:tasks.workunit.client.0.vm03.stdout:5/70: sync 2026-03-09T00:03:37.656 INFO:tasks.workunit.client.0.vm03.stdout:1/103: sync 2026-03-09T00:03:37.657 INFO:tasks.workunit.client.1.vm06.stdout:2/380: mknod d7/d1a/d3c/c69 0 2026-03-09T00:03:37.658 INFO:tasks.workunit.client.0.vm03.stdout:4/89: rmdir d7 39 2026-03-09T00:03:37.661 INFO:tasks.workunit.client.1.vm06.stdout:7/303: getdents d0 0 2026-03-09T00:03:37.666 INFO:tasks.workunit.client.1.vm06.stdout:2/381: dread d7/da/d1c/f5f [0,4194304] 0 2026-03-09T00:03:37.667 INFO:tasks.workunit.client.0.vm03.stdout:7/100: mkdir d2/d4/d1e 0 2026-03-09T00:03:37.682 INFO:tasks.workunit.client.1.vm06.stdout:8/317: dwrite db/d1e/f50 [0,4194304] 0 2026-03-09T00:03:37.687 INFO:tasks.workunit.client.0.vm03.stdout:3/68: rmdir d2 39 2026-03-09T00:03:37.687 INFO:tasks.workunit.client.0.vm03.stdout:9/101: truncate f11 3744640 0 2026-03-09T00:03:37.687 INFO:tasks.workunit.client.0.vm03.stdout:6/96: creat d13/f1a x:0 0 0 2026-03-09T00:03:37.688 INFO:tasks.workunit.client.1.vm06.stdout:5/389: mknod d5/d1c/c7f 0 2026-03-09T00:03:37.688 INFO:tasks.workunit.client.1.vm06.stdout:7/304: rmdir d0/df/d1a/d3a 39 2026-03-09T00:03:37.688 INFO:tasks.workunit.client.1.vm06.stdout:7/305: chown d0/df/d1a/d27/d4c 1874172 1 2026-03-09T00:03:37.688 INFO:tasks.workunit.client.1.vm06.stdout:2/382: fsync d7/f26 0 2026-03-09T00:03:37.690 INFO:tasks.workunit.client.0.vm03.stdout:5/71: link fb f18 0 2026-03-09T00:03:37.693 INFO:tasks.workunit.client.0.vm03.stdout:1/104: mkdir d4/d15/d1a/d23 0 2026-03-09T00:03:37.693 INFO:tasks.workunit.client.0.vm03.stdout:1/105: stat d4/lc 0 2026-03-09T00:03:37.697 INFO:tasks.workunit.client.0.vm03.stdout:4/90: getdents d7 0 2026-03-09T00:03:37.698 INFO:tasks.workunit.client.0.vm03.stdout:3/69: unlink d2/cf 0 2026-03-09T00:03:37.699 INFO:tasks.workunit.client.1.vm06.stdout:8/318: symlink db/d53/l66 0 2026-03-09T00:03:37.701 INFO:tasks.workunit.client.0.vm03.stdout:9/102: symlink d15/d1c/l1e 0 2026-03-09T00:03:37.701 INFO:tasks.workunit.client.0.vm03.stdout:6/97: mknod d13/c1b 0 2026-03-09T00:03:37.701 INFO:tasks.workunit.client.0.vm03.stdout:6/98: creat d13/f1c x:0 0 0 2026-03-09T00:03:37.701 INFO:tasks.workunit.client.0.vm03.stdout:6/99: readlink l11 0 2026-03-09T00:03:37.701 INFO:tasks.workunit.client.0.vm03.stdout:6/100: chown cf 819953 1 2026-03-09T00:03:37.701 INFO:tasks.workunit.client.0.vm03.stdout:6/101: creat d13/f1d x:0 0 0 2026-03-09T00:03:37.701 INFO:tasks.workunit.client.0.vm03.stdout:6/102: dread - d13/f1a zero size 2026-03-09T00:03:37.704 INFO:tasks.workunit.client.1.vm06.stdout:6/302: rmdir d4 39 2026-03-09T00:03:37.705 INFO:tasks.workunit.client.0.vm03.stdout:3/70: write d2/f5 [301988,74948] 0 2026-03-09T00:03:37.705 INFO:tasks.workunit.client.0.vm03.stdout:3/71: creat d2/db/f13 x:0 0 0 2026-03-09T00:03:37.707 INFO:tasks.workunit.client.0.vm03.stdout:4/91: dread d7/f10 [0,4194304] 0 2026-03-09T00:03:37.707 INFO:tasks.workunit.client.0.vm03.stdout:5/72: unlink lc 0 2026-03-09T00:03:37.709 INFO:tasks.workunit.client.1.vm06.stdout:5/390: rmdir d5/d1c/d23/d51 39 2026-03-09T00:03:37.709 INFO:tasks.workunit.client.1.vm06.stdout:5/391: fsync d5/f43 0 2026-03-09T00:03:37.710 INFO:tasks.workunit.client.0.vm03.stdout:7/101: dwrite d2/d4/da/f11 [0,4194304] 0 2026-03-09T00:03:37.712 INFO:tasks.workunit.client.0.vm03.stdout:1/106: mknod d4/d21/c24 0 2026-03-09T00:03:37.714 INFO:tasks.workunit.client.0.vm03.stdout:1/107: dread f2 [4194304,4194304] 0 
2026-03-09T00:03:37.723 INFO:tasks.workunit.client.1.vm06.stdout:8/319: truncate db/d1e/f20 1055852 0 2026-03-09T00:03:37.724 INFO:tasks.workunit.client.1.vm06.stdout:7/306: rmdir d0/df/d1a/d27 39 2026-03-09T00:03:37.725 INFO:tasks.workunit.client.0.vm03.stdout:6/103: dread d13/f17 [0,4194304] 0 2026-03-09T00:03:37.725 INFO:tasks.workunit.client.0.vm03.stdout:6/104: write d13/f18 [101550,97763] 0 2026-03-09T00:03:37.729 INFO:tasks.workunit.client.0.vm03.stdout:4/92: read f4 [5519266,127273] 0 2026-03-09T00:03:37.730 INFO:tasks.workunit.client.0.vm03.stdout:6/105: dread f10 [0,4194304] 0 2026-03-09T00:03:37.731 INFO:tasks.workunit.client.1.vm06.stdout:5/392: dread d5/d1c/d21/d28/f57 [0,4194304] 0 2026-03-09T00:03:37.731 INFO:tasks.workunit.client.0.vm03.stdout:1/108: dread d4/d6/f8 [0,4194304] 0 2026-03-09T00:03:37.731 INFO:tasks.workunit.client.0.vm03.stdout:1/109: fsync d4/d15/d1a/f1b 0 2026-03-09T00:03:37.738 INFO:tasks.workunit.client.0.vm03.stdout:3/72: link d2/db/f10 d2/db/f14 0 2026-03-09T00:03:37.738 INFO:tasks.workunit.client.0.vm03.stdout:3/73: creat d2/db/f15 x:0 0 0 2026-03-09T00:03:37.747 INFO:tasks.workunit.client.0.vm03.stdout:8/90: rmdir d7 39 2026-03-09T00:03:37.752 INFO:tasks.workunit.client.1.vm06.stdout:8/320: creat db/dd/f67 x:0 0 0 2026-03-09T00:03:37.755 INFO:tasks.workunit.client.1.vm06.stdout:7/307: mknod d0/df/d1a/d22/c4d 0 2026-03-09T00:03:37.760 INFO:tasks.workunit.client.0.vm03.stdout:7/102: rename d2/d4/da to d2/d1f 0 2026-03-09T00:03:37.763 INFO:tasks.workunit.client.0.vm03.stdout:7/103: write d2/d1f/f16 [1431883,104426] 0 2026-03-09T00:03:37.763 INFO:tasks.workunit.client.0.vm03.stdout:7/104: getdents d2/d1f 0 2026-03-09T00:03:37.763 INFO:tasks.workunit.client.1.vm06.stdout:5/393: mknod d5/d44/d4b/c80 0 2026-03-09T00:03:37.763 INFO:tasks.workunit.client.1.vm06.stdout:5/394: stat d5/d1c/d21/d28/d5e/d66/d78/c7b 0 2026-03-09T00:03:37.763 INFO:tasks.workunit.client.1.vm06.stdout:5/395: write d5/d1c/d21/d28/d5e/d66/d78/f7c [337233,38972] 0 2026-03-09T00:03:37.763 INFO:tasks.workunit.client.1.vm06.stdout:5/396: write d5/d1c/f22 [573897,102127] 0 2026-03-09T00:03:37.764 INFO:tasks.workunit.client.1.vm06.stdout:5/397: fsync d5/d1c/f62 0 2026-03-09T00:03:37.764 INFO:tasks.workunit.client.1.vm06.stdout:5/398: creat d5/d44/f81 x:0 0 0 2026-03-09T00:03:37.764 INFO:tasks.workunit.client.0.vm03.stdout:4/93: creat d7/f19 x:0 0 0 2026-03-09T00:03:37.765 INFO:tasks.workunit.client.1.vm06.stdout:8/321: creat db/dd/d48/f68 x:0 0 0 2026-03-09T00:03:37.765 INFO:tasks.workunit.client.1.vm06.stdout:8/322: write db/d1e/f60 [408326,57721] 0 2026-03-09T00:03:37.766 INFO:tasks.workunit.client.0.vm03.stdout:6/106: mkdir d13/d1e 0 2026-03-09T00:03:37.766 INFO:tasks.workunit.client.0.vm03.stdout:6/107: readlink l11 0 2026-03-09T00:03:37.771 INFO:tasks.workunit.client.1.vm06.stdout:7/308: truncate d0/df/d1a/f25 1610352 0 2026-03-09T00:03:37.771 INFO:tasks.workunit.client.1.vm06.stdout:7/309: dread - d0/df/d1a/d27/f4b zero size 2026-03-09T00:03:37.774 INFO:tasks.workunit.client.1.vm06.stdout:5/399: creat d5/d1c/d23/f82 x:0 0 0 2026-03-09T00:03:37.776 INFO:tasks.workunit.client.0.vm03.stdout:8/91: rename d7/l12 to d7/df/l19 0 2026-03-09T00:03:37.776 INFO:tasks.workunit.client.0.vm03.stdout:8/92: write d7/f18 [924231,24179] 0 2026-03-09T00:03:37.776 INFO:tasks.workunit.client.0.vm03.stdout:8/93: dread - d7/f11 zero size 2026-03-09T00:03:37.778 INFO:tasks.workunit.client.1.vm06.stdout:7/310: mkdir d0/df/d1a/d3a/d4e 0 2026-03-09T00:03:37.778 INFO:tasks.workunit.client.1.vm06.stdout:7/311: chown 
d0/d39/f48 31088408 1 2026-03-09T00:03:37.778 INFO:tasks.workunit.client.1.vm06.stdout:7/312: write d0/df/d17/f1f [2068997,83573] 0 2026-03-09T00:03:37.778 INFO:tasks.workunit.client.0.vm03.stdout:4/94: symlink d7/l1a 0 2026-03-09T00:03:37.783 INFO:tasks.workunit.client.1.vm06.stdout:5/400: creat d5/d1c/d21/d28/d35/d49/f83 x:0 0 0 2026-03-09T00:03:37.784 INFO:tasks.workunit.client.1.vm06.stdout:6/303: dwrite d4/d16/f32 [0,4194304] 0 2026-03-09T00:03:37.794 INFO:tasks.workunit.client.0.vm03.stdout:7/105: rename d2/l5 to d2/d1f/l20 0 2026-03-09T00:03:37.798 INFO:tasks.workunit.client.0.vm03.stdout:8/94: mkdir d7/df/d1a 0 2026-03-09T00:03:37.800 INFO:tasks.workunit.client.1.vm06.stdout:7/313: creat d0/f4f x:0 0 0 2026-03-09T00:03:37.802 INFO:tasks.workunit.client.0.vm03.stdout:4/95: unlink d7/f19 0 2026-03-09T00:03:37.803 INFO:tasks.workunit.client.1.vm06.stdout:5/401: mkdir d5/d44/d84 0 2026-03-09T00:03:37.804 INFO:tasks.workunit.client.0.vm03.stdout:7/106: symlink d2/d4/d1e/l21 0 2026-03-09T00:03:37.806 INFO:tasks.workunit.client.0.vm03.stdout:8/95: link d7/la d7/df/d1a/l1b 0 2026-03-09T00:03:37.806 INFO:tasks.workunit.client.0.vm03.stdout:8/96: write d7/f11 [883565,91619] 0 2026-03-09T00:03:37.806 INFO:tasks.workunit.client.0.vm03.stdout:8/97: chown d7/cb 229 1 2026-03-09T00:03:37.809 INFO:tasks.workunit.client.1.vm06.stdout:7/314: rename d0/df/d1a/d27/d4c/f34 to d0/df/d1a/f50 0 2026-03-09T00:03:37.810 INFO:tasks.workunit.client.0.vm03.stdout:7/107: truncate d2/d4/fb 3540294 0 2026-03-09T00:03:37.810 INFO:tasks.workunit.client.0.vm03.stdout:7/108: fsync d2/d1f/f11 0 2026-03-09T00:03:37.810 INFO:tasks.workunit.client.0.vm03.stdout:7/109: creat d2/d4/f22 x:0 0 0 2026-03-09T00:03:37.812 INFO:tasks.workunit.client.1.vm06.stdout:5/402: symlink d5/d1c/d21/d28/d5e/d66/l85 0 2026-03-09T00:03:37.813 INFO:tasks.workunit.client.0.vm03.stdout:8/98: creat d7/df/d1a/f1c x:0 0 0 2026-03-09T00:03:37.814 INFO:tasks.workunit.client.1.vm06.stdout:7/315: mkdir d0/df/d1a/d27/d4c/d40/d51 0 2026-03-09T00:03:37.817 INFO:tasks.workunit.client.1.vm06.stdout:5/403: creat d5/d1c/d21/d28/d35/f86 x:0 0 0 2026-03-09T00:03:37.819 INFO:tasks.workunit.client.1.vm06.stdout:7/316: rmdir d0/df 39 2026-03-09T00:03:37.819 INFO:tasks.workunit.client.1.vm06.stdout:7/317: dread - d0/d39/f3e zero size 2026-03-09T00:03:37.821 INFO:tasks.workunit.client.0.vm03.stdout:7/110: unlink d2/c1b 0 2026-03-09T00:03:37.822 INFO:tasks.workunit.client.1.vm06.stdout:5/404: write d5/f14 [5185124,49728] 0 2026-03-09T00:03:37.824 INFO:tasks.workunit.client.1.vm06.stdout:7/318: mkdir d0/df/d1a/d27/d4c/d52 0 2026-03-09T00:03:37.824 INFO:tasks.workunit.client.1.vm06.stdout:7/319: dread - d0/d39/f3e zero size 2026-03-09T00:03:37.825 INFO:tasks.workunit.client.0.vm03.stdout:8/99: truncate f3 631462 0 2026-03-09T00:03:37.825 INFO:tasks.workunit.client.0.vm03.stdout:8/100: chown d7/ce 315657747 1 2026-03-09T00:03:37.825 INFO:tasks.workunit.client.0.vm03.stdout:8/101: symlink d7/df/d1a/l1d 2 2026-03-09T00:03:37.830 INFO:tasks.workunit.client.1.vm06.stdout:5/405: creat d5/d1c/d23/d34/d47/f87 x:0 0 0 2026-03-09T00:03:37.830 INFO:tasks.workunit.client.1.vm06.stdout:5/406: truncate d5/d1c/d21/d28/f63 4934788 0 2026-03-09T00:03:37.830 INFO:tasks.workunit.client.0.vm03.stdout:4/96: write d7/fc [2161459,67547] 0 2026-03-09T00:03:37.830 INFO:tasks.workunit.client.0.vm03.stdout:4/97: write d7/f15 [63791,44806] 0 2026-03-09T00:03:37.833 INFO:tasks.workunit.client.0.vm03.stdout:1/110: dwrite f1 [0,4194304] 0 2026-03-09T00:03:37.837 
INFO:tasks.workunit.client.1.vm06.stdout:5/407: symlink d5/l88 0 2026-03-09T00:03:37.837 INFO:tasks.workunit.client.1.vm06.stdout:5/408: fdatasync d5/d1c/d21/f3c 0 2026-03-09T00:03:37.837 INFO:tasks.workunit.client.1.vm06.stdout:5/409: chown d5/d1c/d21/d28/f3b 65 1 2026-03-09T00:03:37.837 INFO:tasks.workunit.client.1.vm06.stdout:5/410: write d5/d1c/d21/f73 [505812,9530] 0 2026-03-09T00:03:37.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:37 vm03.local ceph-mon[52346]: Manager daemon vm06.rzcvhn is now available 2026-03-09T00:03:37.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:37 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:37.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:37 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:37.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:37 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:37.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:37 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:37.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:37 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:37.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:37 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:03:37.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:37 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:03:37.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:37 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.rzcvhn/mirror_snapshot_schedule"}]: dispatch 2026-03-09T00:03:37.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:37 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.rzcvhn/trash_purge_schedule"}]: dispatch 2026-03-09T00:03:37.839 INFO:tasks.workunit.client.1.vm06.stdout:9/230: sync 2026-03-09T00:03:37.849 INFO:tasks.workunit.client.1.vm06.stdout:2/383: dwrite d7/f3a [0,4194304] 0 2026-03-09T00:03:37.849 INFO:tasks.workunit.client.1.vm06.stdout:2/384: truncate d7/d1b/d5a/f5e 961754 0 2026-03-09T00:03:37.853 INFO:tasks.workunit.client.0.vm03.stdout:4/98: rename d7/c18 to d7/c1b 0 2026-03-09T00:03:37.853 INFO:tasks.workunit.client.0.vm03.stdout:4/99: write d7/fc [4531314,88284] 0 2026-03-09T00:03:37.854 INFO:tasks.workunit.client.0.vm03.stdout:3/74: dwrite d2/db/f14 [0,4194304] 0 2026-03-09T00:03:37.854 INFO:tasks.workunit.client.0.vm03.stdout:1/111: symlink d4/d15/l25 0 2026-03-09T00:03:37.856 INFO:tasks.workunit.client.1.vm06.stdout:5/411: symlink d5/d1c/d23/d51/l89 0 2026-03-09T00:03:37.858 INFO:tasks.workunit.client.1.vm06.stdout:4/255: sync 2026-03-09T00:03:37.862 INFO:tasks.workunit.client.1.vm06.stdout:0/323: sync 2026-03-09T00:03:37.862 INFO:tasks.workunit.client.1.vm06.stdout:0/324: creat d3/d18/d1f/d39/d3b/f6f x:0 0 0 
2026-03-09T00:03:37.862 INFO:tasks.workunit.client.1.vm06.stdout:0/325: readlink d3/d18/d28/d45/l62 0 2026-03-09T00:03:37.862 INFO:tasks.workunit.client.1.vm06.stdout:3/261: sync 2026-03-09T00:03:37.862 INFO:tasks.workunit.client.1.vm06.stdout:3/262: chown f8 238 1 2026-03-09T00:03:37.862 INFO:tasks.workunit.client.1.vm06.stdout:2/385: rename d7/da/db/c13 to d7/d1a/d39/c6a 0 2026-03-09T00:03:37.865 INFO:tasks.workunit.client.0.vm03.stdout:3/75: creat d2/f16 x:0 0 0 2026-03-09T00:03:37.865 INFO:tasks.workunit.client.0.vm03.stdout:1/112: creat d4/d21/f26 x:0 0 0 2026-03-09T00:03:37.865 INFO:tasks.workunit.client.0.vm03.stdout:1/113: creat d4/d6/f27 x:0 0 0 2026-03-09T00:03:37.865 INFO:tasks.workunit.client.0.vm03.stdout:1/114: creat d4/d21/f28 x:0 0 0 2026-03-09T00:03:37.865 INFO:tasks.workunit.client.1.vm06.stdout:4/256: mkdir d17/d21/d4c/d50 0 2026-03-09T00:03:37.866 INFO:tasks.workunit.client.1.vm06.stdout:3/263: write d11/d28/d2e/d2f/f49 [135738,25459] 0 2026-03-09T00:03:37.866 INFO:tasks.workunit.client.1.vm06.stdout:3/264: fdatasync d11/f48 0 2026-03-09T00:03:37.867 INFO:tasks.workunit.client.0.vm03.stdout:3/76: unlink d2/f11 0 2026-03-09T00:03:37.867 INFO:tasks.workunit.client.0.vm03.stdout:3/77: creat d2/db/f17 x:0 0 0 2026-03-09T00:03:37.867 INFO:tasks.workunit.client.0.vm03.stdout:3/78: chown d2/fc 6851 1 2026-03-09T00:03:37.867 INFO:tasks.workunit.client.0.vm03.stdout:3/79: readlink d2/l12 0 2026-03-09T00:03:37.870 INFO:tasks.workunit.client.1.vm06.stdout:0/326: dread d3/d18/d1f/d39/d3b/f55 [0,4194304] 0 2026-03-09T00:03:37.875 INFO:tasks.workunit.client.0.vm03.stdout:9/103: dwrite f5 [0,4194304] 0 2026-03-09T00:03:37.876 INFO:tasks.workunit.client.0.vm03.stdout:9/104: write fc [3677794,6466] 0 2026-03-09T00:03:37.876 INFO:tasks.workunit.client.0.vm03.stdout:9/105: chown d15/d1c 29196 1 2026-03-09T00:03:37.876 INFO:tasks.workunit.client.0.vm03.stdout:9/106: write fc [3091291,51232] 0 2026-03-09T00:03:37.880 INFO:tasks.workunit.client.1.vm06.stdout:4/257: rename cd to d17/c51 0 2026-03-09T00:03:37.889 INFO:tasks.workunit.client.0.vm03.stdout:3/80: mknod d2/db/c18 0 2026-03-09T00:03:37.889 INFO:tasks.workunit.client.0.vm03.stdout:3/81: dread - d2/db/f13 zero size 2026-03-09T00:03:37.889 INFO:tasks.workunit.client.0.vm03.stdout:3/82: chown d2/f8 21075991 1 2026-03-09T00:03:37.890 INFO:tasks.workunit.client.1.vm06.stdout:6/304: dwrite d4/f2d [0,4194304] 0 2026-03-09T00:03:37.890 INFO:tasks.workunit.client.1.vm06.stdout:2/386: write d7/d1b/f22 [1574380,128044] 0 2026-03-09T00:03:37.890 INFO:tasks.workunit.client.1.vm06.stdout:3/265: mknod d11/d3f/c52 0 2026-03-09T00:03:37.890 INFO:tasks.workunit.client.1.vm06.stdout:0/327: unlink d3/d18/d2c/c37 0 2026-03-09T00:03:37.890 INFO:tasks.workunit.client.1.vm06.stdout:6/305: mkdir d4/d27/d3e/d57 0 2026-03-09T00:03:37.890 INFO:tasks.workunit.client.1.vm06.stdout:6/306: chown d4/c30 371 1 2026-03-09T00:03:37.890 INFO:tasks.workunit.client.1.vm06.stdout:2/387: stat d7/f4c 0 2026-03-09T00:03:37.895 INFO:tasks.workunit.client.1.vm06.stdout:0/328: link d3/d18/d2c/f6b d3/d18/d28/f70 0 2026-03-09T00:03:37.895 INFO:tasks.workunit.client.1.vm06.stdout:2/388: creat d7/d1a/d25/d66/f6b x:0 0 0 2026-03-09T00:03:37.895 INFO:tasks.workunit.client.0.vm03.stdout:7/111: dread d2/d1f/f16 [0,4194304] 0 2026-03-09T00:03:37.896 INFO:tasks.workunit.client.1.vm06.stdout:0/329: unlink d3/d18/d1f/d39/f3d 0 2026-03-09T00:03:37.896 INFO:tasks.workunit.client.1.vm06.stdout:0/330: write d3/d18/d2c/d2d/f46 [745069,6141] 0 2026-03-09T00:03:37.896 
INFO:tasks.workunit.client.1.vm06.stdout:0/331: stat d3/lb 0 2026-03-09T00:03:37.899 INFO:tasks.workunit.client.1.vm06.stdout:2/389: creat d7/d1b/f6c x:0 0 0 2026-03-09T00:03:37.912 INFO:tasks.workunit.client.0.vm03.stdout:3/83: symlink d2/l19 0 2026-03-09T00:03:37.914 INFO:tasks.workunit.client.0.vm03.stdout:7/112: read d2/d4/d15/f1a [1802499,43394] 0 2026-03-09T00:03:37.928 INFO:tasks.workunit.client.0.vm03.stdout:7/113: chown d2/ce 22430059 1 2026-03-09T00:03:37.928 INFO:tasks.workunit.client.0.vm03.stdout:7/114: write d2/d4/d15/f1a [202756,35902] 0 2026-03-09T00:03:37.929 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:37 vm06.local ceph-mon[58395]: Manager daemon vm06.rzcvhn is now available 2026-03-09T00:03:37.929 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:37 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:37.929 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:37 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:37.929 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:37 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:37.929 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:37 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:37.929 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:37 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:37.929 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:37 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:03:37.929 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:37 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:03:37.929 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:37 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.rzcvhn/mirror_snapshot_schedule"}]: dispatch 2026-03-09T00:03:37.929 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:37 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm06.rzcvhn/trash_purge_schedule"}]: dispatch 2026-03-09T00:03:37.929 INFO:tasks.workunit.client.0.vm03.stdout:7/115: mkdir d2/d4/d1e/d23 0 2026-03-09T00:03:37.929 INFO:tasks.workunit.client.0.vm03.stdout:7/116: fsync d2/d4/d15/f19 0 2026-03-09T00:03:37.962 INFO:tasks.workunit.client.1.vm06.stdout:9/231: dwrite d1/d4/ff [0,4194304] 0 2026-03-09T00:03:37.963 INFO:tasks.workunit.client.1.vm06.stdout:9/232: chown d1/d3/d12/d21/d14 11989818 1 2026-03-09T00:03:37.963 INFO:tasks.workunit.client.1.vm06.stdout:9/233: dread d1/d3/d12/d21/f2c [0,4194304] 0 2026-03-09T00:03:37.969 INFO:tasks.workunit.client.1.vm06.stdout:9/234: creat d1/d3/d12/f4d x:0 0 0 2026-03-09T00:03:37.985 INFO:tasks.workunit.client.0.vm03.stdout:4/100: dwrite d7/f12 [0,4194304] 0 2026-03-09T00:03:37.993 INFO:tasks.workunit.client.0.vm03.stdout:4/101: dread d7/f8 [0,4194304] 0 2026-03-09T00:03:37.996 
INFO:tasks.workunit.client.0.vm03.stdout:4/102: chown d7/fd 50275707 1 2026-03-09T00:03:37.999 INFO:tasks.workunit.client.1.vm06.stdout:5/412: dwrite d5/d1c/d23/f82 [0,4194304] 0 2026-03-09T00:03:38.001 INFO:tasks.workunit.client.1.vm06.stdout:5/413: chown d5/d1c/d21/d28/d35 146720394 1 2026-03-09T00:03:38.001 INFO:tasks.workunit.client.1.vm06.stdout:5/414: chown d5/l8 154288873 1 2026-03-09T00:03:38.003 INFO:tasks.workunit.client.1.vm06.stdout:0/332: dread d3/d18/d2c/d2d/f46 [0,4194304] 0 2026-03-09T00:03:38.020 INFO:tasks.workunit.client.1.vm06.stdout:0/333: creat d3/d18/d1f/d39/d69/f71 x:0 0 0 2026-03-09T00:03:38.038 INFO:tasks.workunit.client.1.vm06.stdout:2/390: dwrite d7/d1b/f6c [0,4194304] 0 2026-03-09T00:03:38.038 INFO:tasks.workunit.client.1.vm06.stdout:2/391: chown d7/da/c14 10608294 1 2026-03-09T00:03:38.038 INFO:tasks.workunit.client.1.vm06.stdout:2/392: getdents d7/da/d55/d67 0 2026-03-09T00:03:38.054 INFO:tasks.workunit.client.1.vm06.stdout:4/258: dwrite d17/f35 [4194304,4194304] 0 2026-03-09T00:03:38.056 INFO:tasks.workunit.client.0.vm03.stdout:9/107: truncate fb 4542770 0 2026-03-09T00:03:38.056 INFO:tasks.workunit.client.0.vm03.stdout:2/99: sync 2026-03-09T00:03:38.056 INFO:tasks.workunit.client.0.vm03.stdout:9/108: chown f5 105662287 1 2026-03-09T00:03:38.056 INFO:tasks.workunit.client.0.vm03.stdout:0/107: sync 2026-03-09T00:03:38.061 INFO:tasks.workunit.client.1.vm06.stdout:1/273: sync 2026-03-09T00:03:38.070 INFO:tasks.workunit.client.0.vm03.stdout:5/73: getdents . 0 2026-03-09T00:03:38.070 INFO:tasks.workunit.client.1.vm06.stdout:8/323: write db/d1e/f20 [519094,40544] 0 2026-03-09T00:03:38.070 INFO:tasks.workunit.client.1.vm06.stdout:8/324: read f7 [4079328,48966] 0 2026-03-09T00:03:38.075 INFO:tasks.workunit.client.1.vm06.stdout:6/307: dwrite d4/f40 [0,4194304] 0 2026-03-09T00:03:38.075 INFO:tasks.workunit.client.1.vm06.stdout:6/308: truncate d4/f3d 4803420 0 2026-03-09T00:03:38.076 INFO:tasks.workunit.client.1.vm06.stdout:3/266: dwrite d11/f20 [0,4194304] 0 2026-03-09T00:03:38.084 INFO:tasks.workunit.client.1.vm06.stdout:6/309: dread d4/f22 [0,4194304] 0 2026-03-09T00:03:38.086 INFO:tasks.workunit.client.1.vm06.stdout:4/259: rename d17/d24/d3b/c40 to d17/d21/c52 0 2026-03-09T00:03:38.087 INFO:tasks.workunit.client.0.vm03.stdout:9/109: truncate d15/f18 321773 0 2026-03-09T00:03:38.087 INFO:tasks.workunit.client.0.vm03.stdout:9/110: chown f8 0 1 2026-03-09T00:03:38.094 INFO:tasks.workunit.client.1.vm06.stdout:1/274: chown d6/lc 32482840 1 2026-03-09T00:03:38.094 INFO:tasks.workunit.client.1.vm06.stdout:1/275: stat d6/d21/f2e 0 2026-03-09T00:03:38.095 INFO:tasks.workunit.client.0.vm03.stdout:2/100: truncate d8/fb 785807 0 2026-03-09T00:03:38.098 INFO:tasks.workunit.client.0.vm03.stdout:9/111: write d15/f1b [738532,124810] 0 2026-03-09T00:03:38.110 INFO:tasks.workunit.client.0.vm03.stdout:0/108: link d2/da/dd/c12 d2/d1f/c2a 0 2026-03-09T00:03:38.110 INFO:tasks.workunit.client.0.vm03.stdout:5/74: mknod c19 0 2026-03-09T00:03:38.110 INFO:tasks.workunit.client.0.vm03.stdout:5/75: chown ff 18011245 1 2026-03-09T00:03:38.110 INFO:tasks.workunit.client.1.vm06.stdout:5/415: dwrite d5/d1c/d21/d28/d35/f40 [0,4194304] 0 2026-03-09T00:03:38.110 INFO:tasks.workunit.client.1.vm06.stdout:3/267: rmdir d11/d28/d2e 39 2026-03-09T00:03:38.110 INFO:tasks.workunit.client.1.vm06.stdout:6/310: symlink d4/d27/d42/d4b/l58 0 2026-03-09T00:03:38.112 INFO:tasks.workunit.client.1.vm06.stdout:1/276: stat d6/ff 0 2026-03-09T00:03:38.113 INFO:tasks.workunit.client.0.vm03.stdout:9/112: creat d15/f1f 
x:0 0 0 2026-03-09T00:03:38.115 INFO:tasks.workunit.client.0.vm03.stdout:5/76: creat f1a x:0 0 0 2026-03-09T00:03:38.121 INFO:tasks.workunit.client.0.vm03.stdout:5/77: mknod c1b 0 2026-03-09T00:03:38.124 INFO:tasks.workunit.client.0.vm03.stdout:4/103: rmdir d7 39 2026-03-09T00:03:38.124 INFO:tasks.workunit.client.0.vm03.stdout:4/104: dread - d7/ff zero size 2026-03-09T00:03:38.124 INFO:tasks.workunit.client.0.vm03.stdout:5/78: mkdir d1c 0 2026-03-09T00:03:38.124 INFO:tasks.workunit.client.1.vm06.stdout:5/416: getdents d5/d1c/d23/d51 0 2026-03-09T00:03:38.124 INFO:tasks.workunit.client.1.vm06.stdout:5/417: write d5/d1c/d21/d28/f3b [2824513,39300] 0 2026-03-09T00:03:38.124 INFO:tasks.workunit.client.1.vm06.stdout:5/418: write d5/d1c/d21/d28/d35/f4e [5239345,24801] 0 2026-03-09T00:03:38.126 INFO:tasks.workunit.client.1.vm06.stdout:0/334: dwrite d3/d18/d2c/d2d/f40 [0,4194304] 0 2026-03-09T00:03:38.126 INFO:tasks.workunit.client.1.vm06.stdout:0/335: creat d3/d18/d1f/d39/d49/f72 x:0 0 0 2026-03-09T00:03:38.127 INFO:tasks.workunit.client.0.vm03.stdout:8/102: dwrite d7/f11 [0,4194304] 0 2026-03-09T00:03:38.136 INFO:tasks.workunit.client.0.vm03.stdout:4/105: rename d7/f8 to d7/f1c 0 2026-03-09T00:03:38.138 INFO:tasks.workunit.client.0.vm03.stdout:4/106: chown d7/f12 112364 1 2026-03-09T00:03:38.138 INFO:tasks.workunit.client.0.vm03.stdout:4/107: readlink d7/l1a 0 2026-03-09T00:03:38.138 INFO:tasks.workunit.client.0.vm03.stdout:4/108: stat d7/f15 0 2026-03-09T00:03:38.139 INFO:tasks.workunit.client.1.vm06.stdout:1/277: fdatasync d6/f25 0 2026-03-09T00:03:38.139 INFO:tasks.workunit.client.0.vm03.stdout:5/79: mknod d1c/c1d 0 2026-03-09T00:03:38.139 INFO:tasks.workunit.client.0.vm03.stdout:5/80: write f11 [607211,55301] 0 2026-03-09T00:03:38.139 INFO:tasks.workunit.client.0.vm03.stdout:5/81: dread - f17 zero size 2026-03-09T00:03:38.144 INFO:tasks.workunit.client.0.vm03.stdout:4/109: dread d7/fe [0,4194304] 0 2026-03-09T00:03:38.147 INFO:tasks.workunit.client.0.vm03.stdout:4/110: dread d7/f1c [0,4194304] 0 2026-03-09T00:03:38.149 INFO:tasks.workunit.client.0.vm03.stdout:8/103: mkdir d7/df/d1e 0 2026-03-09T00:03:38.150 INFO:tasks.workunit.client.0.vm03.stdout:8/104: chown d7/df/l17 128153830 1 2026-03-09T00:03:38.151 INFO:tasks.workunit.client.0.vm03.stdout:4/111: creat d7/f1d x:0 0 0 2026-03-09T00:03:38.151 INFO:tasks.workunit.client.0.vm03.stdout:4/112: creat d7/f1e x:0 0 0 2026-03-09T00:03:38.159 INFO:tasks.workunit.client.1.vm06.stdout:5/419: creat d5/d1c/d21/d28/d5e/d66/f8a x:0 0 0 2026-03-09T00:03:38.159 INFO:tasks.workunit.client.1.vm06.stdout:5/420: write d5/d44/f81 [664359,34987] 0 2026-03-09T00:03:38.159 INFO:tasks.workunit.client.1.vm06.stdout:5/421: truncate d5/d1c/d23/d34/d47/f61 950763 0 2026-03-09T00:03:38.159 INFO:tasks.workunit.client.1.vm06.stdout:5/422: write d5/d1c/d21/d28/f59 [994269,96324] 0 2026-03-09T00:03:38.163 INFO:tasks.workunit.client.1.vm06.stdout:0/336: mkdir d3/d18/d1f/d44/d6a/d73 0 2026-03-09T00:03:38.163 INFO:tasks.workunit.client.1.vm06.stdout:0/337: truncate d3/d18/f59 951240 0 2026-03-09T00:03:38.163 INFO:tasks.workunit.client.1.vm06.stdout:0/338: fdatasync d3/f19 0 2026-03-09T00:03:38.163 INFO:tasks.workunit.client.1.vm06.stdout:0/339: write d3/f10 [4590635,32077] 0 2026-03-09T00:03:38.166 INFO:tasks.workunit.client.1.vm06.stdout:1/278: dread d6/f1d [0,4194304] 0 2026-03-09T00:03:38.166 INFO:tasks.workunit.client.1.vm06.stdout:1/279: chown d6/lc 19 1 2026-03-09T00:03:38.168 INFO:tasks.workunit.client.1.vm06.stdout:3/268: rename d11/d28/f2b to d11/d28/d2e/d2f/f53 0 
2026-03-09T00:03:38.173 INFO:tasks.workunit.client.1.vm06.stdout:0/340: getdents d3/d18/d1f/d44 0 2026-03-09T00:03:38.173 INFO:tasks.workunit.client.1.vm06.stdout:0/341: readlink d3/d18/l67 0 2026-03-09T00:03:38.174 INFO:tasks.workunit.client.1.vm06.stdout:1/280: chown d6/f41 2673373 1 2026-03-09T00:03:38.174 INFO:tasks.workunit.client.1.vm06.stdout:1/281: chown d6/d21/d2d/c5a 534 1 2026-03-09T00:03:38.174 INFO:tasks.workunit.client.1.vm06.stdout:1/282: creat d6/d21/d2d/f5d x:0 0 0 2026-03-09T00:03:38.174 INFO:tasks.workunit.client.1.vm06.stdout:1/283: chown d6/d4c/d51 1017005 1 2026-03-09T00:03:38.175 INFO:tasks.workunit.client.1.vm06.stdout:1/284: read d6/ff [281701,26023] 0 2026-03-09T00:03:38.175 INFO:tasks.workunit.client.1.vm06.stdout:1/285: chown d6/d21/c40 66 1 2026-03-09T00:03:38.175 INFO:tasks.workunit.client.1.vm06.stdout:1/286: stat d6/d21/d2d/l33 0 2026-03-09T00:03:38.175 INFO:tasks.workunit.client.1.vm06.stdout:3/269: rename d11/f1b to d11/d3f/f54 0 2026-03-09T00:03:38.175 INFO:tasks.workunit.client.1.vm06.stdout:1/287: symlink d6/d21/d2d/d3b/l5e 0 2026-03-09T00:03:38.182 INFO:tasks.workunit.client.1.vm06.stdout:5/423: rename d5/d1c/d21/d28/d5e/d66/d78/f7e to d5/d44/f8b 0 2026-03-09T00:03:38.206 INFO:tasks.workunit.client.1.vm06.stdout:5/424: creat d5/d1c/d68/f8c x:0 0 0 2026-03-09T00:03:38.211 INFO:tasks.workunit.client.1.vm06.stdout:3/270: dread d11/d28/f3a [0,4194304] 0 2026-03-09T00:03:38.211 INFO:tasks.workunit.client.1.vm06.stdout:3/271: write d11/d28/d2e/d2f/f53 [264191,70106] 0 2026-03-09T00:03:38.211 INFO:tasks.workunit.client.1.vm06.stdout:3/272: creat d11/d28/d2e/d2f/d36/f55 x:0 0 0 2026-03-09T00:03:38.214 INFO:tasks.workunit.client.1.vm06.stdout:3/273: rename d11/c25 to d11/d28/c56 0 2026-03-09T00:03:38.217 INFO:tasks.workunit.client.1.vm06.stdout:3/274: mkdir d11/d28/d57 0 2026-03-09T00:03:38.221 INFO:tasks.workunit.client.1.vm06.stdout:3/275: creat d11/d28/d2e/d2f/d36/f58 x:0 0 0 2026-03-09T00:03:38.246 INFO:tasks.workunit.client.0.vm03.stdout:9/113: dwrite f8 [0,4194304] 0 2026-03-09T00:03:38.256 INFO:tasks.workunit.client.1.vm06.stdout:4/260: dwrite d17/d24/f39 [0,4194304] 0 2026-03-09T00:03:38.256 INFO:tasks.workunit.client.1.vm06.stdout:7/320: dwrite d0/df/d17/f1f [4194304,4194304] 0 2026-03-09T00:03:38.257 INFO:tasks.workunit.client.1.vm06.stdout:9/235: dwrite d1/d3/d12/f3a [0,4194304] 0 2026-03-09T00:03:38.260 INFO:tasks.workunit.client.1.vm06.stdout:4/261: symlink d17/d21/d4c/d50/l53 0 2026-03-09T00:03:38.264 INFO:tasks.workunit.client.1.vm06.stdout:7/321: mkdir d0/df/d1a/d3f/d53 0 2026-03-09T00:03:38.268 INFO:tasks.workunit.client.1.vm06.stdout:2/393: dread f2 [0,4194304] 0 2026-03-09T00:03:38.271 INFO:tasks.workunit.client.1.vm06.stdout:4/262: mkdir d17/d24/d3b/d54 0 2026-03-09T00:03:38.277 INFO:tasks.workunit.client.1.vm06.stdout:7/322: symlink d0/df/d1a/d3a/l54 0 2026-03-09T00:03:38.277 INFO:tasks.workunit.client.1.vm06.stdout:7/323: mkdir d0/d55 0 2026-03-09T00:03:38.278 INFO:tasks.workunit.client.1.vm06.stdout:2/394: unlink d7/c34 0 2026-03-09T00:03:38.279 INFO:tasks.workunit.client.1.vm06.stdout:7/324: rename d0/d39/f48 to d0/d39/f56 0 2026-03-09T00:03:38.280 INFO:tasks.workunit.client.1.vm06.stdout:7/325: creat d0/df/d1a/d27/d4c/d52/f57 x:0 0 0 2026-03-09T00:03:38.286 INFO:tasks.workunit.client.0.vm03.stdout:4/113: dwrite d7/f12 [0,4194304] 0 2026-03-09T00:03:38.286 INFO:tasks.workunit.client.1.vm06.stdout:3/276: dwrite d11/d3f/f54 [4194304,4194304] 0 2026-03-09T00:03:38.286 INFO:tasks.workunit.client.1.vm06.stdout:3/277: chown f3 2332 1 
2026-03-09T00:03:38.288 INFO:tasks.workunit.client.1.vm06.stdout:3/278: rename d11/d28/d2e/d2f/d36/f58 to d11/d28/d2e/d2f/d36/f59 0 2026-03-09T00:03:38.288 INFO:tasks.workunit.client.1.vm06.stdout:3/279: readlink l1 0 2026-03-09T00:03:38.291 INFO:tasks.workunit.client.0.vm03.stdout:4/114: dread d7/fd [4194304,4194304] 0 2026-03-09T00:03:38.292 INFO:tasks.workunit.client.1.vm06.stdout:5/425: dwrite d5/d1c/d23/f42 [0,4194304] 0 2026-03-09T00:03:38.292 INFO:tasks.workunit.client.0.vm03.stdout:4/115: rename d7/fb to d7/f1f 0 2026-03-09T00:03:38.300 INFO:tasks.workunit.client.1.vm06.stdout:3/280: creat d11/f5a x:0 0 0 2026-03-09T00:03:38.310 INFO:tasks.workunit.client.1.vm06.stdout:3/281: mkdir d11/d28/d2e/d2f/d5b 0 2026-03-09T00:03:38.310 INFO:tasks.workunit.client.1.vm06.stdout:5/426: mknod d5/d44/d84/c8d 0 2026-03-09T00:03:38.310 INFO:tasks.workunit.client.1.vm06.stdout:5/427: creat d5/f8e x:0 0 0 2026-03-09T00:03:38.311 INFO:tasks.workunit.client.1.vm06.stdout:7/326: dread d0/df/d17/f1f [0,4194304] 0 2026-03-09T00:03:38.314 INFO:tasks.workunit.client.1.vm06.stdout:3/282: link d11/d28/c2a d11/c5c 0 2026-03-09T00:03:38.314 INFO:tasks.workunit.client.1.vm06.stdout:5/428: rename d5/d1c/c26 to d5/d1c/d21/d28/d5e/d66/c8f 0 2026-03-09T00:03:38.322 INFO:tasks.workunit.client.1.vm06.stdout:7/327: rename d0/df/d1a/d22/c2a to d0/df/d1a/d27/d4c/c58 0 2026-03-09T00:03:38.322 INFO:tasks.workunit.client.1.vm06.stdout:3/283: link d11/d28/d2e/c41 d11/d28/d2e/c5d 0 2026-03-09T00:03:38.323 INFO:tasks.workunit.client.1.vm06.stdout:7/328: mknod d0/df/d1a/d27/d4c/d52/c59 0 2026-03-09T00:03:38.323 INFO:tasks.workunit.client.1.vm06.stdout:3/284: creat d11/d28/f5e x:0 0 0 2026-03-09T00:03:38.325 INFO:tasks.workunit.client.1.vm06.stdout:7/329: creat d0/df/d1a/d27/d4c/d40/f5a x:0 0 0 2026-03-09T00:03:38.325 INFO:tasks.workunit.client.1.vm06.stdout:7/330: truncate d0/df/f13 974017 0 2026-03-09T00:03:38.326 INFO:tasks.workunit.client.1.vm06.stdout:7/331: mkdir d0/df/d1a/d27/d4c/d40/d5b 0 2026-03-09T00:03:38.334 INFO:tasks.workunit.client.1.vm06.stdout:3/285: write d11/d3f/f54 [6008502,130553] 0 2026-03-09T00:03:38.334 INFO:tasks.workunit.client.1.vm06.stdout:3/286: dread - d11/d28/f5e zero size 2026-03-09T00:03:38.334 INFO:tasks.workunit.client.1.vm06.stdout:3/287: fdatasync d11/d28/f4f 0 2026-03-09T00:03:38.334 INFO:tasks.workunit.client.1.vm06.stdout:3/288: mkdir d11/d28/d2e/d2f/d5b/d5f 0 2026-03-09T00:03:38.344 INFO:tasks.workunit.client.1.vm06.stdout:3/289: dread d11/f1a [0,4194304] 0 2026-03-09T00:03:38.352 INFO:tasks.workunit.client.1.vm06.stdout:3/290: creat d11/d28/d2e/d2f/d5b/d5f/f60 x:0 0 0 2026-03-09T00:03:38.352 INFO:tasks.workunit.client.1.vm06.stdout:3/291: chown d11/d28/d2e/d2f/d36/f55 1 1 2026-03-09T00:03:38.356 INFO:tasks.workunit.client.1.vm06.stdout:0/342: dwrite d3/d18/d1f/d44/f58 [0,4194304] 0 2026-03-09T00:03:38.356 INFO:tasks.workunit.client.1.vm06.stdout:0/343: stat d3/f19 0 2026-03-09T00:03:38.357 INFO:tasks.workunit.client.1.vm06.stdout:0/344: truncate d3/d18/d1f/f5e 2460029 0 2026-03-09T00:03:38.358 INFO:tasks.workunit.client.1.vm06.stdout:0/345: mkdir d3/d18/d2c/d2d/d74 0 2026-03-09T00:03:38.386 INFO:tasks.workunit.client.0.vm03.stdout:7/117: dwrite d2/d1f/f16 [0,4194304] 0 2026-03-09T00:03:38.389 INFO:tasks.workunit.client.0.vm03.stdout:7/118: rename d2/d4/d1e/d23 to d2/d4/d15/d24 0 2026-03-09T00:03:38.389 INFO:tasks.workunit.client.0.vm03.stdout:7/119: readlink d2/d1f/l20 0 2026-03-09T00:03:38.389 INFO:tasks.workunit.client.1.vm06.stdout:5/429: dwrite d5/d1c/d23/f42 [0,4194304] 0 
2026-03-09T00:03:38.395 INFO:tasks.workunit.client.0.vm03.stdout:4/116: dwrite d7/fd [0,4194304] 0
2026-03-09T00:03:38.395 INFO:tasks.workunit.client.0.vm03.stdout:4/117: chown d7/f10 18 1
2026-03-09T00:03:38.410 INFO:tasks.workunit.client.1.vm06.stdout:2/395: dwrite d7/f26 [0,4194304] 0
2026-03-09T00:03:38.410 INFO:tasks.workunit.client.1.vm06.stdout:2/396: read - d7/d1a/d25/d66/f6b zero size
2026-03-09T00:03:38.411 INFO:tasks.workunit.client.1.vm06.stdout:2/397: rmdir d7/d1a/d56 39
2026-03-09T00:03:38.411 INFO:tasks.workunit.client.1.vm06.stdout:2/398: creat d7/da/d55/f6d x:0 0 0
2026-03-09T00:03:38.412 INFO:tasks.workunit.client.1.vm06.stdout:2/399: rename d7/da/d1c/f29 to d7/da/db/f6e 0
2026-03-09T00:03:38.413 INFO:tasks.workunit.client.1.vm06.stdout:2/400: symlink d7/d1a/d56/l6f 0
2026-03-09T00:03:38.413 INFO:tasks.workunit.client.1.vm06.stdout:2/401: dread - d7/d1b/f46 zero size
2026-03-09T00:03:38.413 INFO:tasks.workunit.client.1.vm06.stdout:2/402: creat d7/da/d1c/f70 x:0 0 0
2026-03-09T00:03:38.415 INFO:tasks.workunit.client.1.vm06.stdout:2/403: rename d7/da/d55/d67 to d7/d1b/d71 0
2026-03-09T00:03:38.424 INFO:tasks.workunit.client.1.vm06.stdout:2/404: write d7/d1a/f30 [82292,111283] 0
2026-03-09T00:03:38.450 INFO:tasks.workunit.client.1.vm06.stdout:3/292: dwrite d11/d28/d2e/d2f/f50 [0,4194304] 0
2026-03-09T00:03:38.450 INFO:tasks.workunit.client.1.vm06.stdout:3/293: chown d11/d28/d2e/f37 3368774 1
2026-03-09T00:03:38.450 INFO:tasks.workunit.client.1.vm06.stdout:3/294: chown c0 53 1
2026-03-09T00:03:38.487 INFO:tasks.workunit.client.0.vm03.stdout:5/82: dwrite f12 [0,4194304] 0
2026-03-09T00:03:38.491 INFO:tasks.workunit.client.0.vm03.stdout:5/83: creat d1c/f1e x:0 0 0
2026-03-09T00:03:38.492 INFO:tasks.workunit.client.0.vm03.stdout:5/84: write f14 [683471,65484] 0
2026-03-09T00:03:38.494 INFO:tasks.workunit.client.0.vm03.stdout:5/85: rename f17 to d1c/f1f 0
2026-03-09T00:03:38.496 INFO:tasks.workunit.client.0.vm03.stdout:5/86: mkdir d1c/d20 0
2026-03-09T00:03:38.498 INFO:tasks.workunit.client.0.vm03.stdout:5/87: symlink d1c/d20/l21 0
2026-03-09T00:03:38.498 INFO:tasks.workunit.client.0.vm03.stdout:5/88: chown d1c/d20/l21 893456134 1
2026-03-09T00:03:38.498 INFO:tasks.workunit.client.0.vm03.stdout:5/89: write f14 [97092,106799] 0
2026-03-09T00:03:38.498 INFO:tasks.workunit.client.0.vm03.stdout:5/90: readlink l4 0
2026-03-09T00:03:38.499 INFO:tasks.workunit.client.0.vm03.stdout:5/91: truncate f3 2255088 0
2026-03-09T00:03:38.512 INFO:tasks.workunit.client.0.vm03.stdout:7/120: dwrite d2/d4/d15/f19 [0,4194304] 0
2026-03-09T00:03:38.514 INFO:tasks.workunit.client.0.vm03.stdout:8/105: dwrite f3 [0,4194304] 0
2026-03-09T00:03:38.514 INFO:tasks.workunit.client.0.vm03.stdout:8/106: fsync d7/f9 0
2026-03-09T00:03:38.520 INFO:tasks.workunit.client.0.vm03.stdout:7/121: symlink d2/d4/l25 0
2026-03-09T00:03:38.527 INFO:tasks.workunit.client.0.vm03.stdout:8/107: link d7/lc d7/df/d1e/l1f 0
2026-03-09T00:03:38.527 INFO:tasks.workunit.client.0.vm03.stdout:8/108: fsync f3 0
2026-03-09T00:03:38.533 INFO:tasks.workunit.client.0.vm03.stdout:7/122: symlink d2/d4/l26 0
2026-03-09T00:03:38.535 INFO:tasks.workunit.client.0.vm03.stdout:8/109: mknod d7/df/c20 0
2026-03-09T00:03:38.537 INFO:tasks.workunit.client.0.vm03.stdout:7/123: write d2/d4/fb [1785061,57919] 0
2026-03-09T00:03:38.538 INFO:tasks.workunit.client.0.vm03.stdout:8/110: rmdir d7/df/d1a 39
2026-03-09T00:03:38.539 INFO:tasks.workunit.client.0.vm03.stdout:8/111: symlink d7/df/d1e/l21 0
2026-03-09T00:03:38.542 INFO:tasks.workunit.client.0.vm03.stdout:7/124: write d2/d1f/f11 [2678402,69870] 0
2026-03-09T00:03:38.542 INFO:tasks.workunit.client.0.vm03.stdout:8/112: dread d7/f18 [0,4194304] 0
2026-03-09T00:03:38.544 INFO:tasks.workunit.client.0.vm03.stdout:8/113: symlink d7/df/d1e/l22 0
2026-03-09T00:03:38.546 INFO:tasks.workunit.client.0.vm03.stdout:7/125: rename d2/d4/l14 to d2/d4/d15/l27 0
2026-03-09T00:03:38.548 INFO:tasks.workunit.client.1.vm06.stdout:5/430: dwrite d5/d1c/d21/d28/d35/d49/f83 [0,4194304] 0
2026-03-09T00:03:38.548 INFO:tasks.workunit.client.1.vm06.stdout:5/431: write d5/d1c/d23/d51/f60 [1981413,73353] 0
2026-03-09T00:03:38.549 INFO:tasks.workunit.client.0.vm03.stdout:7/126: creat d2/d1f/f28 x:0 0 0
2026-03-09T00:03:38.549 INFO:tasks.workunit.client.0.vm03.stdout:7/127: read d2/d4/d15/f1a [2094248,25217] 0
2026-03-09T00:03:38.549 INFO:tasks.workunit.client.0.vm03.stdout:7/128: creat d2/d4/d15/f29 x:0 0 0
2026-03-09T00:03:38.552 INFO:tasks.workunit.client.0.vm03.stdout:8/114: mknod d7/c23 0
2026-03-09T00:03:38.555 INFO:tasks.workunit.client.1.vm06.stdout:2/405: dwrite d7/d1b/f6c [0,4194304] 0
2026-03-09T00:03:38.557 INFO:tasks.workunit.client.0.vm03.stdout:5/92: dwrite f11 [0,4194304] 0
2026-03-09T00:03:38.557 INFO:tasks.workunit.client.0.vm03.stdout:5/93: write f1a [236911,39949] 0
2026-03-09T00:03:38.561 INFO:tasks.workunit.client.1.vm06.stdout:7/332: dwrite d0/df/d1a/d22/f2c [8388608,4194304] 0
2026-03-09T00:03:38.562 INFO:tasks.workunit.client.1.vm06.stdout:3/295: dwrite d11/d28/d2e/d2f/d36/f4e [0,4194304] 0
2026-03-09T00:03:38.582 INFO:tasks.workunit.client.1.vm06.stdout:7/333: write d0/df/d1a/d22/f2c [9366955,128699] 0
2026-03-09T00:03:38.582 INFO:tasks.workunit.client.1.vm06.stdout:7/334: chown d0/df/d1a/d27/d4c/d40/f5a 3022577 1
2026-03-09T00:03:38.582 INFO:tasks.workunit.client.1.vm06.stdout:7/335: write d0/df/f13 [1192823,102552] 0
2026-03-09T00:03:38.585 INFO:tasks.workunit.client.0.vm03.stdout:1/115: rmdir d4/d6 39
2026-03-09T00:03:38.585 INFO:tasks.workunit.client.0.vm03.stdout:1/116: chown d4/f9 15 1
2026-03-09T00:03:38.593 INFO:tasks.workunit.client.1.vm06.stdout:7/336: dread d0/df/d17/f21 [0,4194304] 0
2026-03-09T00:03:38.602 INFO:tasks.workunit.client.1.vm06.stdout:5/432: creat d5/d1c/d23/d51/f90 x:0 0 0
2026-03-09T00:03:38.602 INFO:tasks.workunit.client.1.vm06.stdout:3/296: dwrite d11/f12 [0,4194304] 0
2026-03-09T00:03:38.602 INFO:tasks.workunit.client.1.vm06.stdout:3/297: dread - d11/d28/d2e/f38 zero size
2026-03-09T00:03:38.604 INFO:tasks.workunit.client.0.vm03.stdout:5/94: mknod d1c/d20/c22 0
2026-03-09T00:03:38.618 INFO:tasks.workunit.client.1.vm06.stdout:2/406: symlink d7/d1a/l72 0
2026-03-09T00:03:38.620 INFO:tasks.workunit.client.0.vm03.stdout:1/117: mknod d4/d6/c29 0
2026-03-09T00:03:38.620 INFO:tasks.workunit.client.0.vm03.stdout:5/95: dread f12 [0,4194304] 0
2026-03-09T00:03:38.625 INFO:tasks.workunit.client.1.vm06.stdout:7/337: creat d0/df/f5c x:0 0 0
2026-03-09T00:03:38.638 INFO:tasks.workunit.client.0.vm03.stdout:8/115: dwrite d7/fd [0,4194304] 0
2026-03-09T00:03:38.645 INFO:tasks.workunit.client.1.vm06.stdout:3/298: symlink d11/d28/d4d/l61 0
2026-03-09T00:03:38.651 INFO:tasks.workunit.client.1.vm06.stdout:3/299: stat d11/f27 0
2026-03-09T00:03:38.651 INFO:tasks.workunit.client.1.vm06.stdout:2/407: symlink d7/d1b/d5a/l73 0
2026-03-09T00:03:38.651 INFO:tasks.workunit.client.1.vm06.stdout:2/408: truncate f3 448168 0
2026-03-09T00:03:38.651 INFO:tasks.workunit.client.0.vm03.stdout:6/108: sync
2026-03-09T00:03:38.651 INFO:tasks.workunit.client.0.vm03.stdout:6/109: creat d13/f1f x:0 0 0
2026-03-09T00:03:38.652 INFO:tasks.workunit.client.0.vm03.stdout:6/110: write f7 [3197320,123998] 0
2026-03-09T00:03:38.653 INFO:tasks.workunit.client.1.vm06.stdout:2/409: creat d7/da/db/f74 x:0 0 0
2026-03-09T00:03:38.659 INFO:tasks.workunit.client.1.vm06.stdout:2/410: symlink d7/d1a/d3c/l75 0
2026-03-09T00:03:38.659 INFO:tasks.workunit.client.1.vm06.stdout:2/411: chown d7/d1a/f30 26730 1
2026-03-09T00:03:38.667 INFO:tasks.workunit.client.0.vm03.stdout:6/111: mknod d13/d1e/c20 0
2026-03-09T00:03:38.673 INFO:tasks.workunit.client.1.vm06.stdout:3/300: fsync d11/d3f/f54 0
2026-03-09T00:03:38.673 INFO:tasks.workunit.client.1.vm06.stdout:3/301: chown d11/d28/d2e/l3b 118814891 1
2026-03-09T00:03:38.673 INFO:tasks.workunit.client.1.vm06.stdout:3/302: write d11/d28/d2e/d2f/f49 [389411,46435] 0
2026-03-09T00:03:38.673 INFO:tasks.workunit.client.1.vm06.stdout:3/303: link d11/d28/d2e/d2f/d36/f4a d11/d28/d2e/f62 0
2026-03-09T00:03:38.675 INFO:tasks.workunit.client.1.vm06.stdout:3/304: mknod d11/c63 0
2026-03-09T00:03:38.675 INFO:tasks.workunit.client.1.vm06.stdout:3/305: creat d11/d28/d2e/d2f/f64 x:0 0 0
2026-03-09T00:03:38.675 INFO:tasks.workunit.client.1.vm06.stdout:7/338: dwrite d0/df/d1a/f44 [8388608,4194304] 0
2026-03-09T00:03:38.676 INFO:tasks.workunit.client.0.vm03.stdout:1/118: dwrite d4/d21/f26 [0,4194304] 0
2026-03-09T00:03:38.684 INFO:tasks.workunit.client.0.vm03.stdout:1/119: dread d4/d15/f17 [0,4194304] 0
2026-03-09T00:03:38.684 INFO:tasks.workunit.client.1.vm06.stdout:1/288: rmdir d6/d21 39
2026-03-09T00:03:38.689 INFO:tasks.workunit.client.1.vm06.stdout:3/306: rename d11/d28/d2e/f37 to d11/d28/d2e/f65 0
2026-03-09T00:03:38.689 INFO:tasks.workunit.client.1.vm06.stdout:3/307: chown d11/d28/d2e/f65 10 1
2026-03-09T00:03:38.689 INFO:tasks.workunit.client.1.vm06.stdout:3/308: creat d11/f66 x:0 0 0
2026-03-09T00:03:38.693 INFO:tasks.workunit.client.1.vm06.stdout:7/339: creat d0/df/d1a/d3a/f5d x:0 0 0
2026-03-09T00:03:38.695 INFO:tasks.workunit.client.0.vm03.stdout:9/114: truncate f8 1822815 0
2026-03-09T00:03:38.695 INFO:tasks.workunit.client.0.vm03.stdout:9/115: chown d15/f1f 510994 1
2026-03-09T00:03:38.696 INFO:tasks.workunit.client.0.vm03.stdout:1/120: truncate d4/d15/f17 1510369 0
2026-03-09T00:03:38.701 INFO:tasks.workunit.client.0.vm03.stdout:9/116: rename d15/l1d to d15/d1c/l20 0
2026-03-09T00:03:38.710 INFO:tasks.workunit.client.0.vm03.stdout:1/121: link d4/d15/f18 d4/d6/f2a 0
2026-03-09T00:03:38.710 INFO:tasks.workunit.client.0.vm03.stdout:1/122: readlink d4/l7 0
2026-03-09T00:03:38.711 INFO:tasks.workunit.client.1.vm06.stdout:3/309: mkdir d11/d67 0
2026-03-09T00:03:38.711 INFO:tasks.workunit.client.1.vm06.stdout:3/310: getdents d11/d28/d57 0
2026-03-09T00:03:38.715 INFO:tasks.workunit.client.0.vm03.stdout:9/117: mkdir d15/d1c/d21 0
2026-03-09T00:03:38.715 INFO:tasks.workunit.client.1.vm06.stdout:7/340: mkdir d0/df/d1a/d3a/d4e/d5e 0
2026-03-09T00:03:38.716 INFO:tasks.workunit.client.0.vm03.stdout:1/123: creat d4/d15/d1a/f2b x:0 0 0
2026-03-09T00:03:38.719 INFO:tasks.workunit.client.0.vm03.stdout:9/118: mknod d15/c22 0
2026-03-09T00:03:38.721 INFO:tasks.workunit.client.1.vm06.stdout:1/289: unlink d6/d21/d2d/d37/f47 0
2026-03-09T00:03:38.724 INFO:tasks.workunit.client.0.vm03.stdout:1/124: link d4/f1e d4/d21/f2c 0
2026-03-09T00:03:38.724 INFO:tasks.workunit.client.0.vm03.stdout:1/125: chown d4/d15/l22 40 1
2026-03-09T00:03:38.725 INFO:tasks.workunit.client.1.vm06.stdout:7/341: symlink d0/df/d1a/d27/d4c/d40/l5f 0
2026-03-09T00:03:38.725 INFO:tasks.workunit.client.1.vm06.stdout:7/342: write d0/d39/f56 [217348,129556] 0
2026-03-09T00:03:38.726 INFO:tasks.workunit.client.1.vm06.stdout:3/311: dread f3 [0,4194304] 0
2026-03-09T00:03:38.727 INFO:tasks.workunit.client.1.vm06.stdout:5/433: getdents d5 0
2026-03-09T00:03:38.727 INFO:tasks.workunit.client.1.vm06.stdout:5/434: creat d5/d44/d4b/f91 x:0 0 0
2026-03-09T00:03:38.727 INFO:tasks.workunit.client.1.vm06.stdout:5/435: stat d5/d1c/d21/d28/d35/c72 0
2026-03-09T00:03:38.735 INFO:tasks.workunit.client.0.vm03.stdout:1/126: mknod d4/d15/c2d 0
2026-03-09T00:03:38.738 INFO:tasks.workunit.client.1.vm06.stdout:9/236: dwrite d1/d3/d12/f3a [4194304,4194304] 0
2026-03-09T00:03:38.738 INFO:tasks.workunit.client.1.vm06.stdout:3/312: rename d11/d3f/c52 to d11/d28/d2e/c68 0
2026-03-09T00:03:38.742 INFO:tasks.workunit.client.1.vm06.stdout:3/313: fdatasync f8 0
2026-03-09T00:03:38.743 INFO:tasks.workunit.client.0.vm03.stdout:1/127: unlink d4/d6/c16 0
2026-03-09T00:03:38.743 INFO:tasks.workunit.client.1.vm06.stdout:5/436: getdents d5/d44/d84 0
2026-03-09T00:03:38.744 INFO:tasks.workunit.client.1.vm06.stdout:5/437: dread d5/d1c/d21/d28/d35/f4e [0,4194304] 0
2026-03-09T00:03:38.746 INFO:tasks.workunit.client.1.vm06.stdout:9/237: link d1/d3/d12/d21/d14/d25/f4a d1/d3/d12/d21/d14/d25/f4e 0
2026-03-09T00:03:38.753 INFO:tasks.workunit.client.1.vm06.stdout:9/238: dread d1/d3/d12/d21/f2c [0,4194304] 0
2026-03-09T00:03:38.753 INFO:tasks.workunit.client.1.vm06.stdout:3/314: symlink d11/l69 0
2026-03-09T00:03:38.765 INFO:tasks.workunit.client.1.vm06.stdout:9/239: mkdir d1/d3/d4f 0
2026-03-09T00:03:38.765 INFO:tasks.workunit.client.1.vm06.stdout:3/315: creat d11/d3f/f6a x:0 0 0
2026-03-09T00:03:38.769 INFO:tasks.workunit.client.1.vm06.stdout:9/240: truncate d1/d3/d12/d21/d9/f4c 3966413 0
2026-03-09T00:03:38.790 INFO:tasks.workunit.client.1.vm06.stdout:3/316: truncate d11/d28/d2e/d2f/f3e 2885092 0
2026-03-09T00:03:38.790 INFO:tasks.workunit.client.1.vm06.stdout:3/317: creat d11/d28/f6b x:0 0 0
2026-03-09T00:03:38.790 INFO:tasks.workunit.client.1.vm06.stdout:3/318: write d11/f24 [1842615,100214] 0
2026-03-09T00:03:38.790 INFO:tasks.workunit.client.1.vm06.stdout:3/319: stat d11/d28/d2e/d2f/f3e 0
2026-03-09T00:03:38.790 INFO:tasks.workunit.client.1.vm06.stdout:3/320: creat d11/d28/d2e/d2f/d5b/f6c x:0 0 0
2026-03-09T00:03:38.790 INFO:tasks.workunit.client.1.vm06.stdout:3/321: symlink d11/d28/d2e/l6d 0
2026-03-09T00:03:38.790 INFO:tasks.workunit.client.1.vm06.stdout:3/322: unlink d11/d28/d2e/d2f/f50 0
2026-03-09T00:03:38.790 INFO:tasks.workunit.client.1.vm06.stdout:3/323: chown d11/f66 6587 1
2026-03-09T00:03:38.791 INFO:tasks.workunit.client.1.vm06.stdout:3/324: write d11/d3f/f4c [305623,63341] 0
2026-03-09T00:03:38.791 INFO:tasks.workunit.client.1.vm06.stdout:3/325: creat d11/d28/d4d/f6e x:0 0 0
2026-03-09T00:03:38.791 INFO:tasks.workunit.client.1.vm06.stdout:3/326: mkdir d11/d28/d6f 0
2026-03-09T00:03:38.791 INFO:tasks.workunit.client.1.vm06.stdout:3/327: symlink d11/d28/d2e/d2f/l70 0
2026-03-09T00:03:38.792 INFO:tasks.workunit.client.0.vm03.stdout:9/119: dwrite fc [0,4194304] 0
2026-03-09T00:03:38.797 INFO:tasks.workunit.client.0.vm03.stdout:5/96: dread f3 [0,4194304] 0
2026-03-09T00:03:38.797 INFO:tasks.workunit.client.1.vm06.stdout:5/438: dwrite d5/d1c/d68/f3f [0,4194304] 0
2026-03-09T00:03:38.797 INFO:tasks.workunit.client.1.vm06.stdout:5/439: chown d5/l1e 739877 1
2026-03-09T00:03:38.797 INFO:tasks.workunit.client.1.vm06.stdout:9/241: write d1/d3/d12/d21/d14/d25/f32 [1722049,8467] 0
2026-03-09T00:03:38.798 INFO:tasks.workunit.client.0.vm03.stdout:5/97: rename c16 to d1c/d20/c23 0
2026-03-09T00:03:38.798 INFO:tasks.workunit.client.0.vm03.stdout:5/98: chown d1c/f1f 699 1
2026-03-09T00:03:38.800 INFO:tasks.workunit.client.1.vm06.stdout:7/343: dwrite d0/df/d1a/d27/d4c/f32 [0,4194304] 0
2026-03-09T00:03:38.800 INFO:tasks.workunit.client.0.vm03.stdout:5/99: rename l4 to d1c/d20/l24 0
2026-03-09T00:03:38.802 INFO:tasks.workunit.client.0.vm03.stdout:5/100: dread f14 [0,4194304] 0
2026-03-09T00:03:38.803 INFO:tasks.workunit.client.0.vm03.stdout:5/101: creat d1c/d20/f25 x:0 0 0
2026-03-09T00:03:38.804 INFO:tasks.workunit.client.1.vm06.stdout:7/344: creat d0/df/d1a/d27/f60 x:0 0 0
2026-03-09T00:03:38.805 INFO:tasks.workunit.client.1.vm06.stdout:5/440: rename d5/d1c/d21/d28/d35 to d5/d44/d4b/d92 0
2026-03-09T00:03:38.805 INFO:tasks.workunit.client.1.vm06.stdout:7/345: creat d0/df/d1a/d35/f61 x:0 0 0
2026-03-09T00:03:38.805 INFO:tasks.workunit.client.1.vm06.stdout:7/346: stat d0/df/d1a/d35 0
2026-03-09T00:03:38.806 INFO:tasks.workunit.client.0.vm03.stdout:7/129: truncate d2/d1f/f11 964159 0
2026-03-09T00:03:38.807 INFO:tasks.workunit.client.1.vm06.stdout:5/441: mknod d5/d1c/c93 0
2026-03-09T00:03:38.807 INFO:tasks.workunit.client.1.vm06.stdout:5/442: stat d5/d1c/d21/d28/d5e/d66/d78/c7b 0
2026-03-09T00:03:38.808 INFO:tasks.workunit.client.0.vm03.stdout:7/130: link d2/d4/d15/f1a d2/f2a 0
2026-03-09T00:03:38.808 INFO:tasks.workunit.client.0.vm03.stdout:7/131: readlink d2/d4/d15/l27 0
2026-03-09T00:03:38.808 INFO:tasks.workunit.client.0.vm03.stdout:7/132: stat d2/d4/fb 0
2026-03-09T00:03:38.809 INFO:tasks.workunit.client.1.vm06.stdout:4/263: sync
2026-03-09T00:03:38.809 INFO:tasks.workunit.client.1.vm06.stdout:8/325: sync
2026-03-09T00:03:38.809 INFO:tasks.workunit.client.1.vm06.stdout:6/311: sync
2026-03-09T00:03:38.809 INFO:tasks.workunit.client.1.vm06.stdout:6/312: chown d4/d16/c2b 8 1
2026-03-09T00:03:38.809 INFO:tasks.workunit.client.1.vm06.stdout:0/346: sync
2026-03-09T00:03:38.816 INFO:tasks.workunit.client.1.vm06.stdout:4/264: rename d17/d24/d49/l4e to d17/d24/d49/l55 0
2026-03-09T00:03:38.816 INFO:tasks.workunit.client.1.vm06.stdout:4/265: write d17/d24/d49/f47 [1470939,102984] 0
2026-03-09T00:03:38.816 INFO:tasks.workunit.client.1.vm06.stdout:4/266: write d17/f20 [981320,35636] 0
2026-03-09T00:03:38.816 INFO:tasks.workunit.client.1.vm06.stdout:0/347: write f1 [2948314,31289] 0
2026-03-09T00:03:38.818 INFO:tasks.workunit.client.1.vm06.stdout:7/347: dread d0/df/f13 [0,4194304] 0
2026-03-09T00:03:38.818 INFO:tasks.workunit.client.1.vm06.stdout:7/348: write d0/df/d1a/d3a/f5d [633379,45999] 0
2026-03-09T00:03:38.835 INFO:tasks.workunit.client.0.vm03.stdout:6/112: getdents d13/d1e 0
2026-03-09T00:03:38.837 INFO:tasks.workunit.client.1.vm06.stdout:8/326: rename db/dd/d24/d36/f3d to db/d1e/d46/f69 0
2026-03-09T00:03:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:38 vm03.local ceph-mon[52346]: Migrating agent root cert to cert store
2026-03-09T00:03:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:38 vm03.local ceph-mon[52346]: Migrating agent root key to cert store
2026-03-09T00:03:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:38 vm03.local ceph-mon[52346]: Checking for cert/key for grafana.vm03
2026-03-09T00:03:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:38 vm03.local ceph-mon[52346]: Migrating grafana.vm03 cert to cert store
2026-03-09T00:03:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:38 vm03.local ceph-mon[52346]: Migrating grafana.vm03 key to cert store
2026-03-09T00:03:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:38 vm03.local ceph-mon[52346]: Deploying cephadm binary to vm03
2026-03-09T00:03:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:38 vm03.local ceph-mon[52346]: Deploying cephadm binary to vm06
2026-03-09T00:03:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:38 vm03.local ceph-mon[52346]: mgrmap e22: vm06.rzcvhn(active, since 1.72892s)
2026-03-09T00:03:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:38 vm03.local ceph-mon[52346]: pgmap v3: 65 pgs: 65 active+clean; 1.1 GiB data, 4.5 GiB used, 115 GiB / 120 GiB avail
2026-03-09T00:03:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:38 vm03.local ceph-mon[52346]: pgmap v4: 65 pgs: 65 active+clean; 1.1 GiB data, 4.5 GiB used, 115 GiB / 120 GiB avail
2026-03-09T00:03:38.839 INFO:tasks.workunit.client.1.vm06.stdout:6/313: link d4/d27/d42/d4b/l4f d4/d16/d46/l59 0
2026-03-09T00:03:38.839 INFO:tasks.workunit.client.1.vm06.stdout:6/314: chown d4/l43 13367 1
2026-03-09T00:03:38.840 INFO:tasks.workunit.client.0.vm03.stdout:6/113: rename f7 to d13/d1e/f21 0
2026-03-09T00:03:38.844 INFO:tasks.workunit.client.0.vm03.stdout:0/109: sync
2026-03-09T00:03:38.844 INFO:tasks.workunit.client.0.vm03.stdout:2/101: sync
2026-03-09T00:03:38.844 INFO:tasks.workunit.client.0.vm03.stdout:4/118: sync
2026-03-09T00:03:38.844 INFO:tasks.workunit.client.0.vm03.stdout:3/84: sync
2026-03-09T00:03:38.844 INFO:tasks.workunit.client.0.vm03.stdout:3/85: creat d2/db/f1a x:0 0 0
2026-03-09T00:03:38.847 INFO:tasks.workunit.client.0.vm03.stdout:0/110: dread d2/da/dd/f11 [0,4194304] 0
2026-03-09T00:03:38.847 INFO:tasks.workunit.client.0.vm03.stdout:0/111: write d2/da/f1b [350330,49748] 0
2026-03-09T00:03:38.850 INFO:tasks.workunit.client.0.vm03.stdout:4/119: mkdir d7/d20 0
2026-03-09T00:03:38.853 INFO:tasks.workunit.client.1.vm06.stdout:0/348: mknod d3/d18/d2c/d2d/d31/c75 0
2026-03-09T00:03:38.853 INFO:tasks.workunit.client.1.vm06.stdout:0/349: write d3/d18/d1f/d39/d69/f71 [533686,128448] 0
2026-03-09T00:03:38.854 INFO:tasks.workunit.client.0.vm03.stdout:4/120: read d7/f1c [3281868,73436] 0
2026-03-09T00:03:38.857 INFO:tasks.workunit.client.0.vm03.stdout:3/86: unlink d2/fd 0
2026-03-09T00:03:38.858 INFO:tasks.workunit.client.1.vm06.stdout:2/412: sync
2026-03-09T00:03:38.858 INFO:tasks.workunit.client.0.vm03.stdout:4/121: write d7/fd [3528733,75275] 0
2026-03-09T00:03:38.871 INFO:tasks.workunit.client.0.vm03.stdout:6/114: symlink d13/d1e/l22 0
2026-03-09T00:03:38.893 INFO:tasks.workunit.client.0.vm03.stdout:6/115: chown d13/f1d 35110398 1
2026-03-09T00:03:38.893 INFO:tasks.workunit.client.0.vm03.stdout:6/116: dread - d13/f1c zero size
2026-03-09T00:03:38.894 INFO:tasks.workunit.client.0.vm03.stdout:6/117: mknod d13/d1e/c23 0
2026-03-09T00:03:38.894 INFO:tasks.workunit.client.0.vm03.stdout:6/118: creat d13/f24 x:0 0 0
2026-03-09T00:03:38.894 INFO:tasks.workunit.client.1.vm06.stdout:6/315: mknod d4/d27/d3e/d57/c5a 0
2026-03-09T00:03:38.894 INFO:tasks.workunit.client.1.vm06.stdout:6/316: symlink d4/d16/d53/l5b 0
2026-03-09T00:03:38.900 INFO:tasks.workunit.client.0.vm03.stdout:6/119: read f10 [2640960,68120] 0
2026-03-09T00:03:38.902 INFO:tasks.workunit.client.1.vm06.stdout:6/317: dread d4/f2d [0,4194304] 0
2026-03-09T00:03:38.902 INFO:tasks.workunit.client.1.vm06.stdout:6/318: fsync d4/f5 0
2026-03-09T00:03:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:38 vm06.local ceph-mon[58395]: Migrating agent root cert to cert store
2026-03-09T00:03:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:38 vm06.local ceph-mon[58395]: Migrating agent root key to cert store
2026-03-09T00:03:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:38 vm06.local ceph-mon[58395]: Checking for cert/key for grafana.vm03
2026-03-09T00:03:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:38 vm06.local ceph-mon[58395]: Migrating grafana.vm03 cert to cert store
2026-03-09T00:03:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:38 vm06.local ceph-mon[58395]: Migrating grafana.vm03 key to cert store
2026-03-09T00:03:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:38 vm06.local ceph-mon[58395]: Deploying cephadm binary to vm03
2026-03-09T00:03:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:38 vm06.local ceph-mon[58395]: Deploying cephadm binary to vm06
2026-03-09T00:03:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:38 vm06.local ceph-mon[58395]: mgrmap e22: vm06.rzcvhn(active, since 1.72892s)
2026-03-09T00:03:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:38 vm06.local ceph-mon[58395]: pgmap v3: 65 pgs: 65 active+clean; 1.1 GiB data, 4.5 GiB used, 115 GiB / 120 GiB avail
2026-03-09T00:03:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:38 vm06.local ceph-mon[58395]: pgmap v4: 65 pgs: 65 active+clean; 1.1 GiB data, 4.5 GiB used, 115 GiB / 120 GiB avail
2026-03-09T00:03:38.953 INFO:tasks.workunit.client.1.vm06.stdout:4/267: dwrite fe [8388608,4194304] 0
2026-03-09T00:03:38.961 INFO:tasks.workunit.client.1.vm06.stdout:4/268: dread d17/d24/f3a [0,4194304] 0
2026-03-09T00:03:38.961 INFO:tasks.workunit.client.1.vm06.stdout:4/269: chown d17/d24/f39 9734 1
2026-03-09T00:03:38.961 INFO:tasks.workunit.client.1.vm06.stdout:3/328: dwrite d11/d28/d2e/f38 [0,4194304] 0
2026-03-09T00:03:38.961 INFO:tasks.workunit.client.0.vm03.stdout:9/120: dwrite f5 [4194304,4194304] 0
2026-03-09T00:03:38.961 INFO:tasks.workunit.client.0.vm03.stdout:9/121: creat d15/f23 x:0 0 0
2026-03-09T00:03:38.964 INFO:tasks.workunit.client.1.vm06.stdout:4/270: creat d17/d21/d4c/f56 x:0 0 0
2026-03-09T00:03:38.969 INFO:tasks.workunit.client.0.vm03.stdout:6/120: write fb [3948899,63169] 0
2026-03-09T00:03:38.992 INFO:tasks.workunit.client.0.vm03.stdout:6/121: dread - d13/f1f zero size
2026-03-09T00:03:38.992 INFO:tasks.workunit.client.0.vm03.stdout:0/112: write d2/da/dd/f14 [795407,130803] 0
2026-03-09T00:03:38.995 INFO:tasks.workunit.client.1.vm06.stdout:5/443: dwrite d5/d1c/d21/d28/d5e/d66/f8a [0,4194304] 0
2026-03-09T00:03:38.995 INFO:tasks.workunit.client.0.vm03.stdout:1/128: dread d4/d15/f17 [0,4194304] 0
2026-03-09T00:03:38.995 INFO:tasks.workunit.client.0.vm03.stdout:1/129: fsync d4/d21/f26 0
2026-03-09T00:03:38.995 INFO:tasks.workunit.client.0.vm03.stdout:1/130: write d4/d15/d1a/f2b [76210,100103] 0
2026-03-09T00:03:38.995 INFO:tasks.workunit.client.0.vm03.stdout:1/131: readlink d4/d6/le 0
2026-03-09T00:03:38.995 INFO:tasks.workunit.client.0.vm03.stdout:1/132: chown d4/f9 221233642 1
2026-03-09T00:03:38.995 INFO:tasks.workunit.client.0.vm03.stdout:1/133: fsync f1 0
2026-03-09T00:03:38.995 INFO:tasks.workunit.client.0.vm03.stdout:1/134: creat d4/d15/d1a/f2e x:0 0 0
2026-03-09T00:03:39.004 INFO:tasks.workunit.client.0.vm03.stdout:6/122: unlink d13/f18 0
2026-03-09T00:03:39.017 INFO:tasks.workunit.client.0.vm03.stdout:0/113: symlink d2/l2b 0
2026-03-09T00:03:39.031 INFO:tasks.workunit.client.0.vm03.stdout:1/135: dread f0 [0,4194304] 0
2026-03-09T00:03:39.031 INFO:tasks.workunit.client.0.vm03.stdout:1/136: chown d4/f9 69259399 1
2026-03-09T00:03:39.039 INFO:tasks.workunit.client.0.vm03.stdout:7/133: dwrite d2/d1f/f16 [0,4194304] 0
2026-03-09T00:03:39.040 INFO:tasks.workunit.client.0.vm03.stdout:7/134: symlink d2/d4/d1e/l2b 0
2026-03-09T00:03:39.041 INFO:tasks.workunit.client.0.vm03.stdout:7/135: link d2/d1f/c10 d2/d4/d15/d24/c2c 0
2026-03-09T00:03:39.055 INFO:tasks.workunit.client.0.vm03.stdout:4/122: dwrite d7/ff [0,4194304] 0
2026-03-09T00:03:39.057 INFO:tasks.workunit.client.0.vm03.stdout:5/102: dwrite f18 [0,4194304] 0
2026-03-09T00:03:39.057 INFO:tasks.workunit.client.0.vm03.stdout:5/103: fdatasync ff 0
2026-03-09T00:03:39.058 INFO:tasks.workunit.client.0.vm03.stdout:4/123: dread d7/fe [0,4194304] 0
2026-03-09T00:03:39.063 INFO:tasks.workunit.client.0.vm03.stdout:7/136: dread d2/d4/d15/f1a [0,4194304] 0
2026-03-09T00:03:39.076 INFO:tasks.workunit.client.0.vm03.stdout:7/137: dread d2/f2a [0,4194304] 0
2026-03-09T00:03:39.076 INFO:tasks.workunit.client.0.vm03.stdout:5/104: mknod d1c/c26 0
2026-03-09T00:03:39.076 INFO:tasks.workunit.client.0.vm03.stdout:5/105: fdatasync f1a 0
2026-03-09T00:03:39.076 INFO:tasks.workunit.client.0.vm03.stdout:5/106: dread f3 [0,4194304] 0
2026-03-09T00:03:39.081 INFO:tasks.workunit.client.0.vm03.stdout:4/124: rename d7/f10 to d7/d20/f21 0
2026-03-09T00:03:39.081 INFO:tasks.workunit.client.0.vm03.stdout:1/137: dread d4/f9 [0,4194304] 0
2026-03-09T00:03:39.081 INFO:tasks.workunit.client.1.vm06.stdout:7/349: dwrite d0/df/d1a/f50 [0,4194304] 0
2026-03-09T00:03:39.084 INFO:tasks.workunit.client.0.vm03.stdout:5/107: link d1c/d20/l21 d1c/l27 0
2026-03-09T00:03:39.089 INFO:tasks.workunit.client.0.vm03.stdout:5/108: write f3 [1276692,64016] 0
2026-03-09T00:03:39.089 INFO:tasks.workunit.client.1.vm06.stdout:2/413: dwrite f2 [0,4194304] 0
2026-03-09T00:03:39.089 INFO:tasks.workunit.client.1.vm06.stdout:2/414: dread d7/d1b/d5a/f5e [0,4194304] 0
2026-03-09T00:03:39.089 INFO:tasks.workunit.client.0.vm03.stdout:2/102: dwrite d8/d17/f1a [0,4194304] 0
2026-03-09T00:03:39.089 INFO:tasks.workunit.client.0.vm03.stdout:7/138: dread d2/d1f/f16 [0,4194304] 0
2026-03-09T00:03:39.114 INFO:tasks.workunit.client.1.vm06.stdout:6/319: dwrite d4/f12 [0,4194304] 0
2026-03-09T00:03:39.114 INFO:tasks.workunit.client.0.vm03.stdout:9/122: dwrite d15/f18 [0,4194304] 0
2026-03-09T00:03:39.114 INFO:tasks.workunit.client.0.vm03.stdout:9/123: chown d15/c22 257 1
2026-03-09T00:03:39.114 INFO:tasks.workunit.client.0.vm03.stdout:9/124: readlink l13 0
2026-03-09T00:03:39.126 INFO:tasks.workunit.client.1.vm06.stdout:5/444: dwrite d5/d1c/d68/f31 [0,4194304] 0
2026-03-09T00:03:39.126 INFO:tasks.workunit.client.0.vm03.stdout:0/114: dwrite d2/f1e [0,4194304] 0
2026-03-09T00:03:39.133 INFO:tasks.workunit.client.0.vm03.stdout:6/123: dwrite d13/f1f [0,4194304] 0
2026-03-09T00:03:39.134 INFO:tasks.workunit.client.0.vm03.stdout:6/124: write d13/f16 [1383219,103592] 0
2026-03-09T00:03:39.145 INFO:tasks.workunit.client.0.vm03.stdout:5/109: dwrite d1c/f1f [0,4194304] 0
2026-03-09T00:03:39.145 INFO:tasks.workunit.client.0.vm03.stdout:5/110: truncate d1c/f1e 857997 0
2026-03-09T00:03:39.207 INFO:tasks.workunit.client.1.vm06.stdout:9/242: write d1/d3/f1f [2425080,16243] 0
2026-03-09T00:03:39.217 INFO:tasks.workunit.client.1.vm06.stdout:3/329: getdents d11/d28 0
2026-03-09T00:03:39.229 INFO:tasks.workunit.client.0.vm03.stdout:8/116: sync
2026-03-09T00:03:39.239 INFO:tasks.workunit.client.1.vm06.stdout:7/350: mkdir d0/df/d1a/d35/d62 0
2026-03-09T00:03:39.240 INFO:tasks.workunit.client.0.vm03.stdout:7/139: fsync d2/d4/d15/f1a 0
2026-03-09T00:03:39.240 INFO:tasks.workunit.client.0.vm03.stdout:7/140: fdatasync d2/d1f/f28 0
2026-03-09T00:03:39.240 INFO:tasks.workunit.client.0.vm03.stdout:4/125: rmdir d7/d20 39
2026-03-09T00:03:39.240 INFO:tasks.workunit.client.0.vm03.stdout:4/126: creat d7/f22 x:0 0 0
2026-03-09T00:03:39.248 INFO:tasks.workunit.client.1.vm06.stdout:2/415: dread - d7/f5d zero size
2026-03-09T00:03:39.249 INFO:tasks.workunit.client.0.vm03.stdout:1/138: rename d4/d15/d1a/d23 to d4/d15/d1a/d2f 0
2026-03-09T00:03:39.259 INFO:tasks.workunit.client.1.vm06.stdout:6/320: creat d4/d27/d3e/d57/f5c x:0 0 0
2026-03-09T00:03:39.260 INFO:tasks.workunit.client.1.vm06.stdout:8/327: fsync db/d1e/d46/f69 0
2026-03-09T00:03:39.267 INFO:tasks.workunit.client.0.vm03.stdout:2/103: creat d8/f21 x:0 0 0
2026-03-09T00:03:39.267 INFO:tasks.workunit.client.0.vm03.stdout:2/104: creat d8/d1b/f22 x:0 0 0
2026-03-09T00:03:39.267 INFO:tasks.workunit.client.0.vm03.stdout:8/117: dwrite d7/df/d1a/f1c [0,4194304] 0
2026-03-09T00:03:39.270 INFO:tasks.workunit.client.0.vm03.stdout:2/105: dread d8/f15 [0,4194304] 0
2026-03-09T00:03:39.271 INFO:tasks.workunit.client.1.vm06.stdout:9/243: dwrite d1/d3/d12/d21/d14/f47 [0,4194304] 0
2026-03-09T00:03:39.272 INFO:tasks.workunit.client.1.vm06.stdout:0/350: getdents d3/d18/d2c/d2d/d31 0
2026-03-09T00:03:39.279 INFO:tasks.workunit.client.0.vm03.stdout:3/87: getdents d2/db 0
2026-03-09T00:03:39.288 INFO:tasks.workunit.client.1.vm06.stdout:4/271: rmdir d17/d21 39
2026-03-09T00:03:39.289 INFO:tasks.workunit.client.0.vm03.stdout:9/125: unlink d15/d1c/l1e 0
2026-03-09T00:03:39.295 INFO:tasks.workunit.client.0.vm03.stdout:0/115: creat d2/d1f/f2c x:0 0 0
2026-03-09T00:03:39.296 INFO:tasks.workunit.client.1.vm06.stdout:1/290: sync
2026-03-09T00:03:39.296 INFO:tasks.workunit.client.1.vm06.stdout:1/291: read - d6/d21/d2d/f5b zero size
2026-03-09T00:03:39.297 INFO:tasks.workunit.client.0.vm03.stdout:9/126: read f5 [2336184,14651] 0
2026-03-09T00:03:39.300 INFO:tasks.workunit.client.1.vm06.stdout:5/445: unlink d5/d44/d4b/d92/d49/f7d 0
2026-03-09T00:03:39.304 INFO:tasks.workunit.client.1.vm06.stdout:1/292: dread d6/ff [0,4194304] 0
2026-03-09T00:03:39.348 INFO:tasks.workunit.client.1.vm06.stdout:3/330: link d11/f27 d11/d3f/f71 0
2026-03-09T00:03:39.349 INFO:tasks.workunit.client.0.vm03.stdout:5/111: mknod d1c/d20/c28 0
2026-03-09T00:03:39.360 INFO:tasks.workunit.client.1.vm06.stdout:7/351: rename d0/df/d1a/d27/d4c/d52/f57 to d0/df/d1a/d3a/d4e/f63 0
2026-03-09T00:03:39.360 INFO:tasks.workunit.client.1.vm06.stdout:7/352: rename d0/df/d1a/d3a to d0/df/d1a/d3a/d4e/d64 22
2026-03-09T00:03:39.368 INFO:tasks.workunit.client.1.vm06.stdout:3/331: write d11/f12 [1860951,118344] 0
2026-03-09T00:03:39.368 INFO:tasks.workunit.client.1.vm06.stdout:6/321: mkdir d4/d27/d42/d52/d5d 0
2026-03-09T00:03:39.369 INFO:tasks.workunit.client.1.vm06.stdout:8/328: symlink db/d53/d5c/l6a 0
2026-03-09T00:03:39.374 INFO:tasks.workunit.client.0.vm03.stdout:2/106: dwrite f2 [0,4194304] 0
2026-03-09T00:03:39.382 INFO:tasks.workunit.client.1.vm06.stdout:9/244: mkdir d1/d3/d50 0
2026-03-09T00:03:39.385 INFO:tasks.workunit.client.1.vm06.stdout:1/293: dwrite d6/f28 [0,4194304] 0
2026-03-09T00:03:39.390 INFO:tasks.workunit.client.0.vm03.stdout:0/116: dwrite d2/da/dd/f14 [4194304,4194304] 0
2026-03-09T00:03:39.398 INFO:tasks.workunit.client.1.vm06.stdout:0/351: symlink d3/d18/d28/d45/l76 0
2026-03-09T00:03:39.399 INFO:tasks.workunit.client.1.vm06.stdout:4/272: creat d17/d21/d4c/f57 x:0 0 0
2026-03-09T00:03:39.406 INFO:tasks.workunit.client.1.vm06.stdout:6/322: creat d4/d16/f5e x:0 0 0
2026-03-09T00:03:39.406 INFO:tasks.workunit.client.0.vm03.stdout:7/141: rmdir d2/d4/d1e 39
2026-03-09T00:03:39.414 INFO:tasks.workunit.client.0.vm03.stdout:4/127: truncate f4 8839807 0
2026-03-09T00:03:39.416 INFO:tasks.workunit.client.0.vm03.stdout:4/128: write d7/fc [2907906,80326] 0
2026-03-09T00:03:39.416 INFO:tasks.workunit.client.0.vm03.stdout:4/129: chown d7/fd 440143 1
2026-03-09T00:03:39.416 INFO:tasks.workunit.client.0.vm03.stdout:1/139: mknod d4/d6/c30 0
2026-03-09T00:03:39.416 INFO:tasks.workunit.client.0.vm03.stdout:1/140: write d4/d6/f27 [102906,28837] 0
2026-03-09T00:03:39.419 INFO:tasks.workunit.client.1.vm06.stdout:6/323: write d4/f5 [5575790,55185] 0
2026-03-09T00:03:39.424 INFO:tasks.workunit.client.1.vm06.stdout:0/352: unlink d3/d18/d1f/f34 0
2026-03-09T00:03:39.426 INFO:tasks.workunit.client.1.vm06.stdout:4/273: rename d17/d21/d32/f3d to d17/d24/d3b/d54/f58 0
2026-03-09T00:03:39.428 INFO:tasks.workunit.client.1.vm06.stdout:4/274: dread d17/d24/f2c [0,4194304] 0
2026-03-09T00:03:39.431 INFO:tasks.workunit.client.0.vm03.stdout:8/118: chown d7/c8 0 1
2026-03-09T00:03:39.431 INFO:tasks.workunit.client.0.vm03.stdout:8/119: write d7/f9 [885317,90443] 0
2026-03-09T00:03:39.431 INFO:tasks.workunit.client.0.vm03.stdout:3/88: rename d2/db/ce to d2/c1b 0
2026-03-09T00:03:39.432 INFO:tasks.workunit.client.0.vm03.stdout:9/127: symlink d15/l24 0
2026-03-09T00:03:39.439 INFO:tasks.workunit.client.0.vm03.stdout:2/107: link d8/f9 d8/d1b/f23 0
2026-03-09T00:03:39.439 INFO:tasks.workunit.client.0.vm03.stdout:2/108: chown f7 3 1
2026-03-09T00:03:39.443 INFO:tasks.workunit.client.0.vm03.stdout:0/117: creat d2/da/f2d x:0 0 0
2026-03-09T00:03:39.451 INFO:tasks.workunit.client.0.vm03.stdout:0/118: fdatasync d2/da/d1a/f25 0
2026-03-09T00:03:39.451 INFO:tasks.workunit.client.0.vm03.stdout:0/119: chown d2/da/d1a 1413472 1
2026-03-09T00:03:39.451 INFO:tasks.workunit.client.0.vm03.stdout:0/120: truncate d2/f22 248864 0
2026-03-09T00:03:39.452 INFO:tasks.workunit.client.0.vm03.stdout:1/141: unlink d4/d15/d1a/f2e 0
2026-03-09T00:03:39.452 INFO:tasks.workunit.client.0.vm03.stdout:0/121: rename d2/da/dd/l1d to d2/d1f/l2e 0
2026-03-09T00:03:39.452 INFO:tasks.workunit.client.1.vm06.stdout:9/245: dwrite d1/d3/d12/d21/d14/d25/f31 [0,4194304] 0
2026-03-09T00:03:39.455 INFO:tasks.workunit.client.0.vm03.stdout:3/89: read d2/db/f10 [1031982,27506] 0
2026-03-09T00:03:39.455 INFO:tasks.workunit.client.0.vm03.stdout:3/90: creat d2/f1c x:0 0 0
2026-03-09T00:03:39.455 INFO:tasks.workunit.client.0.vm03.stdout:3/91: write f1 [2641573,67080] 0
2026-03-09T00:03:39.455 INFO:tasks.workunit.client.0.vm03.stdout:3/92: dread - d2/f16 zero size
2026-03-09T00:03:39.457 INFO:tasks.workunit.client.0.vm03.stdout:0/122: symlink d2/da/d1a/l2f 0
2026-03-09T00:03:39.457 INFO:tasks.workunit.client.0.vm03.stdout:1/142: getdents d4/d21 0
2026-03-09T00:03:39.457 INFO:tasks.workunit.client.0.vm03.stdout:1/143: chown d4/d6/c1c 61445 1
2026-03-09T00:03:39.457 INFO:tasks.workunit.client.0.vm03.stdout:1/144: write d4/d15/d1a/f1d [4610533,69149] 0
2026-03-09T00:03:39.465 INFO:tasks.workunit.client.0.vm03.stdout:3/93: write d2/db/f15 [161806,33802] 0
2026-03-09T00:03:39.469 INFO:tasks.workunit.client.0.vm03.stdout:3/94: chown d2/fc 26328 1
2026-03-09T00:03:39.469 INFO:tasks.workunit.client.0.vm03.stdout:3/95: chown d2/c3 86 1
2026-03-09T00:03:39.469 INFO:tasks.workunit.client.0.vm03.stdout:3/96: creat d2/f1d x:0 0 0
2026-03-09T00:03:39.469 INFO:tasks.workunit.client.0.vm03.stdout:3/97: write d2/db/f15 [242455,44025] 0
2026-03-09T00:03:39.469 INFO:tasks.workunit.client.1.vm06.stdout:9/246: symlink d1/d4/l51 0
2026-03-09T00:03:39.474 INFO:tasks.workunit.client.0.vm03.stdout:5/112: dwrite f3 [0,4194304] 0
2026-03-09T00:03:39.474 INFO:tasks.workunit.client.0.vm03.stdout:5/113: creat d1c/f29 x:0 0 0
2026-03-09T00:03:39.475 INFO:tasks.workunit.client.1.vm06.stdout:9/247: write d1/d3/d12/d21/d14/f20 [1578983,122123] 0
2026-03-09T00:03:39.492 INFO:tasks.workunit.client.0.vm03.stdout:3/98: symlink d2/db/l1e 0
2026-03-09T00:03:39.493 INFO:tasks.workunit.client.1.vm06.stdout:9/248: mkdir d1/d3/d4f/d52 0
2026-03-09T00:03:39.504 INFO:tasks.workunit.client.0.vm03.stdout:5/114: mkdir d1c/d20/d2a 0
2026-03-09T00:03:39.508 INFO:tasks.workunit.client.0.vm03.stdout:5/115: fdatasync f14 0
2026-03-09T00:03:39.508 INFO:tasks.workunit.client.0.vm03.stdout:5/116: truncate ff 166597 0
2026-03-09T00:03:39.508 INFO:tasks.workunit.client.0.vm03.stdout:5/117: mknod d1c/d20/d2a/c2b 0
2026-03-09T00:03:39.540 INFO:tasks.workunit.client.1.vm06.stdout:3/332: dwrite d11/d28/f4f [0,4194304] 0
2026-03-09T00:03:39.544 INFO:tasks.workunit.client.1.vm06.stdout:3/333: dread d11/d28/d2e/d2f/d36/f4a [4194304,4194304] 0
2026-03-09T00:03:39.544 INFO:tasks.workunit.client.1.vm06.stdout:3/334: truncate d11/f27 846285 0
2026-03-09T00:03:39.546 INFO:tasks.workunit.client.1.vm06.stdout:3/335: mknod d11/d28/d2e/d2f/d5b/d5f/c72 0
2026-03-09T00:03:39.546 INFO:tasks.workunit.client.1.vm06.stdout:3/336: write d11/d28/d2e/f47 [4624720,109343] 0
2026-03-09T00:03:39.546 INFO:tasks.workunit.client.1.vm06.stdout:3/337: chown d11/d28/d2e/c46 320 1
2026-03-09T00:03:39.549 INFO:tasks.workunit.client.1.vm06.stdout:3/338: unlink d11/d3f/f6a 0
2026-03-09T00:03:39.549 INFO:tasks.workunit.client.1.vm06.stdout:3/339: write d11/d28/d2e/f62 [1557926,41589] 0
2026-03-09T00:03:39.612 INFO:tasks.workunit.client.0.vm03.stdout:9/128: dwrite f11 [0,4194304] 0
2026-03-09T00:03:39.612 INFO:tasks.workunit.client.0.vm03.stdout:9/129: creat d15/d1c/d21/f25 x:0 0 0
2026-03-09T00:03:39.614 INFO:tasks.workunit.client.0.vm03.stdout:9/130: creat d15/f26 x:0 0 0
2026-03-09T00:03:39.621 INFO:tasks.workunit.client.0.vm03.stdout:9/131: dread f11 [0,4194304] 0
2026-03-09T00:03:39.641 INFO:tasks.workunit.client.0.vm03.stdout:0/123: dwrite d2/da/dd/f11 [0,4194304] 0
2026-03-09T00:03:39.641 INFO:tasks.workunit.client.1.vm06.stdout:4/275: dwrite d17/d21/d4c/f56 [0,4194304] 0
2026-03-09T00:03:39.641 INFO:tasks.workunit.client.1.vm06.stdout:4/276: chown d17/c1a 1 1
2026-03-09T00:03:39.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:39 vm06.local ceph-mon[58395]: mgrmap e23: vm06.rzcvhn(active, since 2s)
2026-03-09T00:03:39.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:39 vm06.local ceph-mon[58395]: [09/Mar/2026:00:03:38] ENGINE Bus STARTING
2026-03-09T00:03:39.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:39 vm06.local ceph-mon[58395]: [09/Mar/2026:00:03:39] ENGINE Serving on https://192.168.123.106:7150
2026-03-09T00:03:39.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:39 vm06.local ceph-mon[58395]: [09/Mar/2026:00:03:39] ENGINE Client ('192.168.123.106', 52096) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
2026-03-09T00:03:39.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:39 vm06.local ceph-mon[58395]: [09/Mar/2026:00:03:39] ENGINE Serving on http://192.168.123.106:8765
2026-03-09T00:03:39.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:39 vm06.local ceph-mon[58395]: [09/Mar/2026:00:03:39] ENGINE Bus STARTED
2026-03-09T00:03:39.648 INFO:tasks.workunit.client.1.vm06.stdout:3/340: dwrite d11/f24 [0,4194304] 0
2026-03-09T00:03:39.648 INFO:tasks.workunit.client.1.vm06.stdout:3/341: chown d11/d3f/c4b 8743403 1
2026-03-09T00:03:39.648 INFO:tasks.workunit.client.1.vm06.stdout:9/249: dwrite d1/d4/f39 [0,4194304] 0
2026-03-09T00:03:39.651 INFO:tasks.workunit.client.0.vm03.stdout:2/109: dwrite f6 [0,4194304] 0
2026-03-09T00:03:39.654 INFO:tasks.workunit.client.1.vm06.stdout:3/342: link d11/d28/d2e/d2f/d36/f59 d11/d67/f73 0
2026-03-09T00:03:39.657 INFO:tasks.workunit.client.0.vm03.stdout:2/110: dread d8/f9 [0,4194304] 0
2026-03-09T00:03:39.659 INFO:tasks.workunit.client.1.vm06.stdout:0/353: dwrite d3/d18/d1f/f26 [0,4194304] 0
2026-03-09T00:03:39.660 INFO:tasks.workunit.client.0.vm03.stdout:2/111: mkdir d8/d1b/d24 0
2026-03-09T00:03:39.661 INFO:tasks.workunit.client.0.vm03.stdout:2/112: getdents d8 0
2026-03-09T00:03:39.664 INFO:tasks.workunit.client.1.vm06.stdout:3/343: truncate d11/d3f/f4c 335952 0
2026-03-09T00:03:39.664 INFO:tasks.workunit.client.1.vm06.stdout:3/344: creat d11/d28/d2e/d2f/f74 x:0 0 0
2026-03-09T00:03:39.664 INFO:tasks.workunit.client.1.vm06.stdout:3/345: fdatasync d11/f12 0
2026-03-09T00:03:39.664 INFO:tasks.workunit.client.1.vm06.stdout:3/346: chown d11/d28/c33 2800 1
2026-03-09T00:03:39.664 INFO:tasks.workunit.client.1.vm06.stdout:3/347: write d11/d28/d4d/f6e [289563,68602] 0
2026-03-09T00:03:39.668 INFO:tasks.workunit.client.1.vm06.stdout:0/354: creat d3/d18/d2c/d2d/d74/f77 x:0 0 0
2026-03-09T00:03:39.669 INFO:tasks.workunit.client.0.vm03.stdout:7/142: dwrite d2/d4/f13 [0,4194304] 0
2026-03-09T00:03:39.669 INFO:tasks.workunit.client.0.vm03.stdout:7/143: chown d2/d4/d15 1300796 1
2026-03-09T00:03:39.669 INFO:tasks.workunit.client.0.vm03.stdout:5/118: dwrite f18 [0,4194304] 0
2026-03-09T00:03:39.669 INFO:tasks.workunit.client.1.vm06.stdout:0/355: symlink d3/d18/d3c/l78 0
2026-03-09T00:03:39.679 INFO:tasks.workunit.client.0.vm03.stdout:8/120: dwrite d7/fd [0,4194304] 0
2026-03-09T00:03:39.679 INFO:tasks.workunit.client.0.vm03.stdout:8/121: fsync d7/f18 0
2026-03-09T00:03:39.680 INFO:tasks.workunit.client.0.vm03.stdout:8/122: creat d7/df/d1e/f24 x:0 0 0
2026-03-09T00:03:39.682 INFO:tasks.workunit.client.1.vm06.stdout:0/356: dread d3/d18/d1f/d44/f58 [0,4194304] 0
2026-03-09T00:03:39.685 INFO:tasks.workunit.client.0.vm03.stdout:8/123: creat d7/f25 x:0 0 0
2026-03-09T00:03:39.688 INFO:tasks.workunit.client.1.vm06.stdout:0/357: write d3/d18/d1f/f5e [727634,37673] 0
2026-03-09T00:03:39.697 INFO:tasks.workunit.client.0.vm03.stdout:7/144: read d2/fc [2194386,82021] 0
2026-03-09T00:03:39.697 INFO:tasks.workunit.client.0.vm03.stdout:7/145: chown d2/d4/d1e/l21 4859 1
2026-03-09T00:03:39.697 INFO:tasks.workunit.client.0.vm03.stdout:7/146: fdatasync d2/d4/d15/f1a 0
2026-03-09T00:03:39.700 INFO:tasks.workunit.client.1.vm06.stdout:2/416: symlink d7/da/db/l76 0
2026-03-09T00:03:39.702 INFO:tasks.workunit.client.1.vm06.stdout:2/417: link d7/d1b/d31/l38 d7/d1a/d25/l77 0
2026-03-09T00:03:39.702 INFO:tasks.workunit.client.1.vm06.stdout:2/418: truncate d7/da/db/de/f49 4510223 0
2026-03-09T00:03:39.703 INFO:tasks.workunit.client.0.vm03.stdout:8/124: write d7/f11 [2032765,24110] 0
2026-03-09T00:03:39.704 INFO:tasks.workunit.client.0.vm03.stdout:8/125: write d7/f25 [437,60377] 0
2026-03-09T00:03:39.712 INFO:tasks.workunit.client.1.vm06.stdout:2/419: write f6 [2570785,69852] 0
2026-03-09T00:03:39.717 INFO:tasks.workunit.client.0.vm03.stdout:8/126: read d7/fd [3935225,125262] 0
2026-03-09T00:03:39.717 INFO:tasks.workunit.client.0.vm03.stdout:8/127: fsync f3 0
2026-03-09T00:03:39.721 INFO:tasks.workunit.client.1.vm06.stdout:2/420: mkdir d7/d1a/d56/d78 0
2026-03-09T00:03:39.721 INFO:tasks.workunit.client.1.vm06.stdout:2/421: chown d7/d1a/d3c/c65 117415 1
2026-03-09T00:03:39.721 INFO:tasks.workunit.client.0.vm03.stdout:8/128: mkdir d7/df/d1e/d26 0
2026-03-09T00:03:39.729 INFO:tasks.workunit.client.0.vm03.stdout:8/129: dread d7/f18 [0,4194304] 0
2026-03-09T00:03:39.746 INFO:tasks.workunit.client.0.vm03.stdout:6/125: sync
2026-03-09T00:03:39.746 INFO:tasks.workunit.client.0.vm03.stdout:6/126: truncate f8 4329171 0
2026-03-09T00:03:39.746 INFO:tasks.workunit.client.0.vm03.stdout:6/127: dread - d13/f1a zero size
2026-03-09T00:03:39.746 INFO:tasks.workunit.client.0.vm03.stdout:6/128: mknod d13/d1e/c25 0
2026-03-09T00:03:39.749 INFO:tasks.workunit.client.0.vm03.stdout:6/129: unlink f9 0
2026-03-09T00:03:39.749 INFO:tasks.workunit.client.0.vm03.stdout:6/130: write f8 [5375300,67900] 0
2026-03-09T00:03:39.801 INFO:tasks.workunit.client.1.vm06.stdout:1/294: symlink d6/l5f 0
2026-03-09T00:03:39.820 INFO:tasks.workunit.client.1.vm06.stdout:1/295: unlink d6/f20 0
2026-03-09T00:03:39.822 INFO:tasks.workunit.client.1.vm06.stdout:1/296: rmdir d6/d21/d2d 39
2026-03-09T00:03:39.826 INFO:tasks.workunit.client.1.vm06.stdout:1/297: write d6/f28 [124997,79799] 0
2026-03-09T00:03:39.826 INFO:tasks.workunit.client.1.vm06.stdout:1/298: dread - d6/d21/d2d/d3b/d42/d43/f4b zero size
2026-03-09T00:03:39.830 INFO:tasks.workunit.client.1.vm06.stdout:1/299: readlink d6/l23 0
2026-03-09T00:03:39.861 INFO:tasks.workunit.client.1.vm06.stdout:0/358: rmdir d3/d18/d2c 39
2026-03-09T00:03:39.862 INFO:tasks.workunit.client.1.vm06.stdout:0/359: mkdir d3/d18/d79 0
2026-03-09T00:03:39.862 INFO:tasks.workunit.client.1.vm06.stdout:0/360: chown d3/d18/f25 2238 1
2026-03-09T00:03:39.863 INFO:tasks.workunit.client.1.vm06.stdout:0/361: symlink d3/d18/d1f/d44/d6a/l7a 0
2026-03-09T00:03:39.864 INFO:tasks.workunit.client.1.vm06.stdout:0/362: creat d3/d18/d2c/d2d/d31/f7b x:0 0 0
2026-03-09T00:03:39.881 INFO:tasks.workunit.client.1.vm06.stdout:3/348: rmdir d11/d28/d6f 0
2026-03-09T00:03:39.885 INFO:tasks.workunit.client.1.vm06.stdout:3/349: dread d11/d28/d2e/d2f/d36/f4e [0,4194304] 0
2026-03-09T00:03:39.885 INFO:tasks.workunit.client.1.vm06.stdout:1/300: dwrite d6/fb [0,4194304] 0
2026-03-09T00:03:39.885 INFO:tasks.workunit.client.1.vm06.stdout:1/301: fdatasync d6/d21/d2d/f3c 0
2026-03-09T00:03:39.886 INFO:tasks.workunit.client.1.vm06.stdout:1/302: write d6/d21/d2d/f5d [271856,114422] 0
2026-03-09T00:03:39.898 INFO:tasks.workunit.client.1.vm06.stdout:4/277: symlink d17/l59 0
2026-03-09T00:03:39.898 INFO:tasks.workunit.client.1.vm06.stdout:4/278: fsync f1 0
2026-03-09T00:03:39.914 INFO:tasks.workunit.client.1.vm06.stdout:3/350: dwrite f7 [0,4194304] 0
2026-03-09T00:03:39.914 INFO:tasks.workunit.client.1.vm06.stdout:3/351: creat d11/d28/d2e/d2f/d36/f75 x:0 0 0
2026-03-09T00:03:39.917 INFO:tasks.workunit.client.1.vm06.stdout:9/250: rename d1/d3/d12/d21/d14/f18 to d1/d3/d50/f53 0
2026-03-09T00:03:39.917 INFO:tasks.workunit.client.1.vm06.stdout:9/251: readlink d1/d3/d12/l30 0
2026-03-09T00:03:39.917 INFO:tasks.workunit.client.1.vm06.stdout:9/252: chown d1/d4/f39 1984701307 1
2026-03-09T00:03:39.923 INFO:tasks.workunit.client.1.vm06.stdout:3/352: dread d11/f20 [0,4194304] 0
2026-03-09T00:03:39.923 INFO:tasks.workunit.client.1.vm06.stdout:3/353: dread d11/d28/d4d/f6e [0,4194304] 0
2026-03-09T00:03:39.923 INFO:tasks.workunit.client.1.vm06.stdout:3/354: read - d11/f66 zero size
2026-03-09T00:03:39.934 INFO:tasks.workunit.client.1.vm06.stdout:2/422: rename d7/d1a/d56/d78 to d7/d1b/d71/d79 0
2026-03-09T00:03:39.934 INFO:tasks.workunit.client.1.vm06.stdout:3/355: mkdir d11/d28/d76 0
2026-03-09T00:03:39.934 INFO:tasks.workunit.client.1.vm06.stdout:4/279: rename d17/d24/d3b/f3f to d17/d24/d49/f5a 0
2026-03-09T00:03:39.936 INFO:tasks.workunit.client.1.vm06.stdout:2/423: rename d7/da/d55/f6d to d7/da/d4e/d57/f7a 0
2026-03-09T00:03:39.936 INFO:tasks.workunit.client.1.vm06.stdout:4/280: mkdir d17/d5b 0
2026-03-09T00:03:39.939 INFO:tasks.workunit.client.1.vm06.stdout:3/356: read f7 [3664655,21733] 0
2026-03-09T00:03:39.949 INFO:tasks.workunit.client.1.vm06.stdout:3/357: mknod d11/d3f/c77 0
2026-03-09T00:03:39.949 INFO:tasks.workunit.client.1.vm06.stdout:5/446: sync
2026-03-09T00:03:39.949 INFO:tasks.workunit.client.0.vm03.stdout:0/124: unlink f1 0
2026-03-09T00:03:39.949 INFO:tasks.workunit.client.0.vm03.stdout:0/125: truncate d2/da/d1a/f1c 271031 0
2026-03-09T00:03:39.952 INFO:tasks.workunit.client.0.vm03.stdout:1/145: rename d4/d21 to d4/d6/d31 0
2026-03-09T00:03:39.952 INFO:tasks.workunit.client.0.vm03.stdout:1/146: read - d4/f12 zero size
2026-03-09T00:03:39.952 INFO:tasks.workunit.client.0.vm03.stdout:1/147: stat d4/d6/ff 0
2026-03-09T00:03:39.958 INFO:tasks.workunit.client.1.vm06.stdout:5/447: dread d5/d44/d4b/d92/d49/f83 [0,4194304] 0
2026-03-09T00:03:39.959 INFO:tasks.workunit.client.1.vm06.stdout:5/448: chown d5/d1c/l3a 15789 1
2026-03-09T00:03:39.959 INFO:tasks.workunit.client.1.vm06.stdout:5/449: chown d5/d1c/d21/d28/d5e/d66/d78 3349 1
2026-03-09T00:03:39.959 INFO:tasks.workunit.client.1.vm06.stdout:5/450: chown d5/d1c/l30 10 1
2026-03-09T00:03:39.959 INFO:tasks.workunit.client.1.vm06.stdout:3/358: rename d11/f66 to d11/d28/d2e/d2f/f78 0
2026-03-09T00:03:39.959 INFO:tasks.workunit.client.1.vm06.stdout:3/359: write d11/d28/d2e/f32 [894967,55861] 0
2026-03-09T00:03:39.959 INFO:tasks.workunit.client.1.vm06.stdout:3/360: creat d11/d28/d2e/d2f/f79 x:0 0 0
2026-03-09T00:03:39.959 INFO:tasks.workunit.client.0.vm03.stdout:1/148: read f0 [2683646,83840] 0
2026-03-09T00:03:39.959 INFO:tasks.workunit.client.0.vm03.stdout:1/149: write f0 [5169319,7165] 0
2026-03-09T00:03:39.959 INFO:tasks.workunit.client.0.vm03.stdout:1/150: truncate d4/d6/d31/f2c 143559 0
2026-03-09T00:03:39.959 INFO:tasks.workunit.client.0.vm03.stdout:1/151: truncate d4/d15/f17 1896141 0
2026-03-09T00:03:39.959 INFO:tasks.workunit.client.0.vm03.stdout:2/113: rename c3 to d8/d1b/c25 0
2026-03-09T00:03:39.959 INFO:tasks.workunit.client.0.vm03.stdout:0/126: unlink d2/f7 0
2026-03-09T00:03:39.961 INFO:tasks.workunit.client.0.vm03.stdout:1/152: truncate d4/fb 179673 0
2026-03-09T00:03:39.961 INFO:tasks.workunit.client.0.vm03.stdout:1/153: chown d4/d6/d31 315348685 1
2026-03-09T00:03:39.961 INFO:tasks.workunit.client.1.vm06.stdout:5/451: dread d5/d44/d4b/d92/d49/f83 [0,4194304] 0
2026-03-09T00:03:39.961 INFO:tasks.workunit.client.1.vm06.stdout:5/452: readlink d5/l8 0
2026-03-09T00:03:39.961 INFO:tasks.workunit.client.1.vm06.stdout:5/453: write d5/d1c/f62 [695132,33701] 0
2026-03-09T00:03:39.966 INFO:tasks.workunit.client.1.vm06.stdout:2/424: write d7/d1a/d56/f50 [3029887,41718] 0
2026-03-09T00:03:39.966 INFO:tasks.workunit.client.1.vm06.stdout:2/425: dread - d7/d1b/d5a/f62 zero size
2026-03-09T00:03:39.966 INFO:tasks.workunit.client.1.vm06.stdout:3/361: symlink d11/d28/d2e/d2f/l7a 0
2026-03-09T00:03:39.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:39 vm03.local ceph-mon[52346]: mgrmap e23: vm06.rzcvhn(active, since 2s)
2026-03-09T00:03:39.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:39 vm03.local ceph-mon[52346]: [09/Mar/2026:00:03:38] ENGINE Bus STARTING
2026-03-09T00:03:39.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:39 vm03.local ceph-mon[52346]: [09/Mar/2026:00:03:39] ENGINE Serving on https://192.168.123.106:7150
2026-03-09T00:03:39.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:39 vm03.local ceph-mon[52346]: [09/Mar/2026:00:03:39] ENGINE Client ('192.168.123.106', 52096) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
2026-03-09T00:03:39.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:39 vm03.local ceph-mon[52346]: [09/Mar/2026:00:03:39] ENGINE Serving on http://192.168.123.106:8765
2026-03-09T00:03:39.967 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:39 vm03.local ceph-mon[52346]: [09/Mar/2026:00:03:39] ENGINE Bus STARTED
2026-03-09T00:03:39.973 INFO:tasks.workunit.client.1.vm06.stdout:5/454: mknod d5/d44/d4b/d92/d49/c94 0
2026-03-09T00:03:39.976 INFO:tasks.workunit.client.0.vm03.stdout:7/147: rename d2/c1d to d2/d4/d15/c2d 0
2026-03-09T00:03:39.976 INFO:tasks.workunit.client.0.vm03.stdout:7/148: creat d2/d4/f2e x:0 0 0
2026-03-09T00:03:39.981 INFO:tasks.workunit.client.0.vm03.stdout:7/149: write d2/d4/d15/f19 [20724,126051] 0
2026-03-09T00:03:39.983 INFO:tasks.workunit.client.1.vm06.stdout:9/253: dwrite d1/d3/d12/d21/d14/d25/f31 [0,4194304] 0
2026-03-09T00:03:39.984 INFO:tasks.workunit.client.1.vm06.stdout:9/254: creat d1/d4/f54 x:0 0 0
2026-03-09T00:03:39.985 INFO:tasks.workunit.client.1.vm06.stdout:3/362: creat d11/d28/d57/f7b x:0 0 0
2026-03-09T00:03:39.989 INFO:tasks.workunit.client.0.vm03.stdout:1/154: mkdir d4/d6/d31/d32 0
2026-03-09T00:03:39.993 INFO:tasks.workunit.client.1.vm06.stdout:5/455: mkdir d5/d44/d4b/d92/d95 0
2026-03-09T00:03:39.994 INFO:tasks.workunit.client.1.vm06.stdout:4/281: dwrite d17/d24/d3b/f4a [0,4194304] 0
2026-03-09T00:03:39.994 INFO:tasks.workunit.client.1.vm06.stdout:4/282: creat d17/d24/f5c x:0 0 0
2026-03-09T00:03:39.994 INFO:tasks.workunit.client.1.vm06.stdout:2/426: dwrite d7/da/d55/f5b [0,4194304] 0
2026-03-09T00:03:39.997 INFO:tasks.workunit.client.1.vm06.stdout:6/324: sync
2026-03-09T00:03:39.997 INFO:tasks.workunit.client.1.vm06.stdout:8/329: sync
2026-03-09T00:03:39.998 INFO:tasks.workunit.client.1.vm06.stdout:7/353: sync
2026-03-09T00:03:39.998 INFO:tasks.workunit.client.1.vm06.stdout:7/354: dread - d0/f4f zero size
2026-03-09T00:03:39.998 INFO:tasks.workunit.client.1.vm06.stdout:7/355: chown d0/df/d1a/f44 1874843 1
2026-03-09T00:03:39.998 INFO:tasks.workunit.client.1.vm06.stdout:7/356: dread - d0/f4f zero size
2026-03-09T00:03:39.998 INFO:tasks.workunit.client.1.vm06.stdout:7/357: rename d0/df/d1a/d35/d62 to d0/df/d1a/d35/d62/d65 22
2026-03-09T00:03:39.998 INFO:tasks.workunit.client.1.vm06.stdout:1/303: sync
2026-03-09T00:03:39.998 INFO:tasks.workunit.client.1.vm06.stdout:1/304: fsync d6/d21/d2d/d3b/d42/d43/f4b 0
2026-03-09T00:03:39.999 INFO:tasks.workunit.client.1.vm06.stdout:0/363: sync
2026-03-09T00:03:40.004 INFO:tasks.workunit.client.0.vm03.stdout:0/127: rmdir d2/da 39
2026-03-09T00:03:40.004 INFO:tasks.workunit.client.0.vm03.stdout:0/128: dread - d2/da/f2d zero size
2026-03-09T00:03:40.006 INFO:tasks.workunit.client.1.vm06.stdout:9/255: mknod d1/d3/d4f/c55 0
2026-03-09T00:03:40.006 INFO:tasks.workunit.client.1.vm06.stdout:9/256: readlink d1/d3/d2b/l2e 0
2026-03-09T00:03:40.006 INFO:tasks.workunit.client.1.vm06.stdout:9/257: chown d1/d3/d2b 53348 1
2026-03-09T00:03:40.008 INFO:tasks.workunit.client.0.vm03.stdout:0/129: dread d2/f1e [0,4194304] 0
2026-03-09T00:03:40.008 INFO:tasks.workunit.client.0.vm03.stdout:0/130: readlink d2/d1f/l26 0
2026-03-09T00:03:40.015 INFO:tasks.workunit.client.0.vm03.stdout:6/131: rename cf to d13/c26 0
2026-03-09T00:03:40.016 INFO:tasks.workunit.client.1.vm06.stdout:3/363: rmdir d11/d28/d76 0
2026-03-09T00:03:40.018 INFO:tasks.workunit.client.0.vm03.stdout:4/130: sync
2026-03-09T00:03:40.018 INFO:tasks.workunit.client.0.vm03.stdout:8/130: sync
2026-03-09T00:03:40.018 INFO:tasks.workunit.client.0.vm03.stdout:4/131: chown d7/l14 21 1
2026-03-09T00:03:40.018 INFO:tasks.workunit.client.0.vm03.stdout:8/131: stat d7/df/d1e/f24 0
2026-03-09T00:03:40.018 INFO:tasks.workunit.client.0.vm03.stdout:5/119: sync
2026-03-09T00:03:40.018 INFO:tasks.workunit.client.0.vm03.stdout:3/99: sync
2026-03-09T00:03:40.018 INFO:tasks.workunit.client.0.vm03.stdout:9/132: sync
2026-03-09T00:03:40.019 INFO:tasks.workunit.client.1.vm06.stdout:5/456: creat d5/d1c/d21/f96 x:0 0 0
2026-03-09T00:03:40.019 INFO:tasks.workunit.client.1.vm06.stdout:5/457: fsync d5/d1c/d68/f31 0
2026-03-09T00:03:40.021 INFO:tasks.workunit.client.0.vm03.stdout:2/114: rmdir d8/d1b 39
2026-03-09T00:03:40.021 INFO:tasks.workunit.client.0.vm03.stdout:2/115: dread - d8/d1b/f1f zero size
2026-03-09T00:03:40.021 INFO:tasks.workunit.client.0.vm03.stdout:2/116: write d8/d17/f1a [4265881,52143] 0
2026-03-09T00:03:40.021 INFO:tasks.workunit.client.0.vm03.stdout:1/155: creat d4/d6/f33 x:0 0 0
2026-03-09T00:03:40.029 INFO:tasks.workunit.client.0.vm03.stdout:7/150: rename d2/d4/l26 to d2/l2f 0
2026-03-09T00:03:40.029 INFO:tasks.workunit.client.0.vm03.stdout:7/151: dread - d2/d4/d15/f29 zero size
2026-03-09T00:03:40.029 INFO:tasks.workunit.client.0.vm03.stdout:7/152: write d2/d4/d15/f29 [715440,81765] 0
2026-03-09T00:03:40.030 INFO:tasks.workunit.client.0.vm03.stdout:6/132: creat d13/d1e/f27 x:0 0 0
2026-03-09T00:03:40.031 INFO:tasks.workunit.client.0.vm03.stdout:4/132: mkdir d7/d23 0
2026-03-09T00:03:40.032 INFO:tasks.workunit.client.0.vm03.stdout:3/100: chown d2/db/f1a 35179126 1
2026-03-09T00:03:40.034 INFO:tasks.workunit.client.1.vm06.stdout:2/427: mkdir d7/d1b/d31/d7b 0
2026-03-09T00:03:40.034 INFO:tasks.workunit.client.1.vm06.stdout:2/428: write d7/d1b/f5c [4546022,119347] 0
2026-03-09T00:03:40.036 INFO:tasks.workunit.client.0.vm03.stdout:4/133: read f4 [980599,129274] 0
2026-03-09T00:03:40.041 INFO:tasks.workunit.client.0.vm03.stdout:1/156: dread d4/d6/fa [4194304,4194304] 0
2026-03-09T00:03:40.041 INFO:tasks.workunit.client.0.vm03.stdout:1/157: write d4/f12 [505790,52453] 0
2026-03-09T00:03:40.043 INFO:tasks.workunit.client.0.vm03.stdout:5/120: link f14 d1c/d20/d2a/f2c 0
2026-03-09T00:03:40.045 INFO:tasks.workunit.client.1.vm06.stdout:6/325: getdents d4/d27/d42/d4b 0
2026-03-09T00:03:40.045 INFO:tasks.workunit.client.1.vm06.stdout:6/326: write d4/d27/d42/d4b/f50 [527066,128175] 0
2026-03-09T00:03:40.059 INFO:tasks.workunit.client.0.vm03.stdout:9/133: rename d15/c22 to d15/c27 0
2026-03-09T00:03:40.059 INFO:tasks.workunit.client.0.vm03.stdout:9/134: write f8 [2341441,127130] 0
2026-03-09T00:03:40.060 INFO:tasks.workunit.client.1.vm06.stdout:1/305: symlink d6/d21/l60 0
2026-03-09T00:03:40.060 INFO:tasks.workunit.client.1.vm06.stdout:8/330: dwrite db/dd/f1c [0,4194304] 0
2026-03-09T00:03:40.060 INFO:tasks.workunit.client.1.vm06.stdout:8/331: stat db/dd/d24/c32 0
2026-03-09T00:03:40.060 INFO:tasks.workunit.client.1.vm06.stdout:8/332: dread - db/d1e/f5f zero size
2026-03-09T00:03:40.061 INFO:tasks.workunit.client.1.vm06.stdout:0/364: creat d3/d18/d1f/d44/f7c x:0 0 0
2026-03-09T00:03:40.062 INFO:tasks.workunit.client.0.vm03.stdout:0/131: symlink d2/l30 0
2026-03-09T00:03:40.064 INFO:tasks.workunit.client.1.vm06.stdout:0/365: read d3/d18/d1f/d44/f58 [3977992,105309] 0
2026-03-09T00:03:40.064 INFO:tasks.workunit.client.1.vm06.stdout:0/366: write d3/d18/d1f/d44/f5a [2162554,92568] 0
2026-03-09T00:03:40.064 INFO:tasks.workunit.client.1.vm06.stdout:0/367: fdatasync d3/d18/d2c/f4e 0
2026-03-09T00:03:40.064 INFO:tasks.workunit.client.1.vm06.stdout:0/368: stat d3/d18/d2c/d2d/f46 0
2026-03-09T00:03:40.073 INFO:tasks.workunit.client.1.vm06.stdout:9/258: rename d1/d3/d12/d21/d14/f20 to d1/d3/d12/d49/f56 0
2026-03-09T00:03:40.073 INFO:tasks.workunit.client.1.vm06.stdout:9/259: write d1/d3/d12/d21/d9/f3d [1013593,72070] 0
2026-03-09T00:03:40.073 INFO:tasks.workunit.client.1.vm06.stdout:9/260: chown d1/d3/d2b 593033 1
2026-03-09T00:03:40.076 INFO:tasks.workunit.client.0.vm03.stdout:0/132: dread f0 [0,4194304] 0
2026-03-09T00:03:40.076 INFO:tasks.workunit.client.0.vm03.stdout:0/133: write d2/da/f1b [935445,22994] 0
2026-03-09T00:03:40.078 INFO:tasks.workunit.client.0.vm03.stdout:8/132: rename l4 to d7/df/d1e/l27 0
2026-03-09T00:03:40.083 INFO:tasks.workunit.client.1.vm06.stdout:3/364: mknod d11/d28/d4d/c7c 0
2026-03-09T00:03:40.089 INFO:tasks.workunit.client.1.vm06.stdout:7/358: dwrite d0/df/d1a/d27/d4c/f32 [0,4194304] 0
2026-03-09T00:03:40.105 INFO:tasks.workunit.client.0.vm03.stdout:2/117: dwrite d8/f12 [0,4194304] 0
2026-03-09T00:03:40.113 INFO:tasks.workunit.client.1.vm06.stdout:5/458: symlink d5/d1c/d21/d28/l97 0
2026-03-09T00:03:40.113 INFO:tasks.workunit.client.0.vm03.stdout:6/133: creat d13/d1e/f28 x:0 0 0
2026-03-09T00:03:40.116 INFO:tasks.workunit.client.0.vm03.stdout:3/101: truncate d2/fc 50515 0
2026-03-09T00:03:40.117 INFO:tasks.workunit.client.1.vm06.stdout:4/283: rmdir d17/d24 39
2026-03-09T00:03:40.117 INFO:tasks.workunit.client.1.vm06.stdout:4/284: chown d17/l37 15172728 1
2026-03-09T00:03:40.118 INFO:tasks.workunit.client.0.vm03.stdout:4/134: mknod d7/c24 0
2026-03-09T00:03:40.119 INFO:tasks.workunit.client.0.vm03.stdout:4/135: stat d7/l11 0
2026-03-09T00:03:40.121 INFO:tasks.workunit.client.0.vm03.stdout:2/118: dread f2 [0,4194304] 0
2026-03-09T00:03:40.122 INFO:tasks.workunit.client.0.vm03.stdout:2/119: write f6 [4677695,96023] 0
2026-03-09T00:03:40.130 INFO:tasks.workunit.client.1.vm06.stdout:6/327: dwrite d4/d27/d3e/f41 [4194304,4194304] 0
2026-03-09T00:03:40.132 INFO:tasks.workunit.client.0.vm03.stdout:8/133: dwrite f6 [8388608,4194304] 0
2026-03-09T00:03:40.152 INFO:tasks.workunit.client.0.vm03.stdout:1/158: rename d4/d6/f27 to d4/d6/f34 0
2026-03-09T00:03:40.155 INFO:tasks.workunit.client.1.vm06.stdout:2/429: truncate d7/f26 3576088 0
2026-03-09T00:03:40.172 INFO:tasks.workunit.client.1.vm06.stdout:2/430: chown d7/da/d4e/d57 50090 1
2026-03-09T00:03:40.172 INFO:tasks.workunit.client.1.vm06.stdout:2/431: chown d7/d1a/d25/d66 5832744 1
2026-03-09T00:03:40.172 INFO:tasks.workunit.client.0.vm03.stdout:5/121: mknod d1c/c2d 0
2026-03-09T00:03:40.173 INFO:tasks.workunit.client.0.vm03.stdout:9/135: mkdir d15/d1c/d28 0
2026-03-09T00:03:40.173 INFO:tasks.workunit.client.1.vm06.stdout:1/306: link d6/d21/c49 d6/d21/d2d/d3b/d42/d43/d4d/c61 0
2026-03-09T00:03:40.173 INFO:tasks.workunit.client.1.vm06.stdout:8/333: symlink db/d53/d5c/l6b 0
2026-03-09T00:03:40.175 INFO:tasks.workunit.client.1.vm06.stdout:7/359: dwrite d0/df/d1a/d27/f43 [0,4194304] 0
2026-03-09T00:03:40.175 INFO:tasks.workunit.client.1.vm06.stdout:7/360: dread - d0/df/d1a/d27/f37 zero size
2026-03-09T00:03:40.175 INFO:tasks.workunit.client.1.vm06.stdout:7/361: readlink d0/l45 0
2026-03-09T00:03:40.193 INFO:tasks.workunit.client.1.vm06.stdout:1/307: dread d6/fb [0,4194304] 0
2026-03-09T00:03:40.193 INFO:tasks.workunit.client.1.vm06.stdout:9/261: rename d1/d4/c1b to d1/d3/d50/c57 0
2026-03-09T00:03:40.197 INFO:tasks.workunit.client.1.vm06.stdout:3/365: link d11/d28/d2e/f65 d11/d28/d2e/d2f/d5b/f7d 0
2026-03-09T00:03:40.197 INFO:tasks.workunit.client.1.vm06.stdout:3/366: chown d11/d28/c35 503143885 1
2026-03-09T00:03:40.198 INFO:tasks.workunit.client.0.vm03.stdout:7/153: rmdir d2 39
2026-03-09T00:03:40.200 INFO:tasks.workunit.client.1.vm06.stdout:5/459: symlink d5/d1c/d21/d28/d5e/l98 0
2026-03-09T00:03:40.207 INFO:tasks.workunit.client.1.vm06.stdout:2/432: mknod d7/d1b/d31/c7c 0
2026-03-09T00:03:40.209 INFO:tasks.workunit.client.1.vm06.stdout:8/334: mknod db/dd/d24/d36/d38/c6c 0
2026-03-09T00:03:40.214 INFO:tasks.workunit.client.0.vm03.stdout:4/136: mkdir d7/d23/d25 0
2026-03-09T00:03:40.220 INFO:tasks.workunit.client.0.vm03.stdout:4/137: chown d7/c24 0 1
2026-03-09T00:03:40.220 INFO:tasks.workunit.client.0.vm03.stdout:4/138: dread - d7/f1e zero size
2026-03-09T00:03:40.220 INFO:tasks.workunit.client.0.vm03.stdout:4/139: truncate d7/f15 257454 0
2026-03-09T00:03:40.220 INFO:tasks.workunit.client.0.vm03.stdout:4/140: write d7/f1d [935968,58893] 0
2026-03-09T00:03:40.220 INFO:tasks.workunit.client.1.vm06.stdout:1/308: mknod d6/d21/d2d/d3b/c62 0
2026-03-09T00:03:40.220 INFO:tasks.workunit.client.1.vm06.stdout:9/262: mkdir d1/d3/d2b/d58 0
2026-03-09T00:03:40.220 INFO:tasks.workunit.client.1.vm06.stdout:3/367: rename d11/d67 to d11/d28/d2e/d7e 0
2026-03-09T00:03:40.220 INFO:tasks.workunit.client.1.vm06.stdout:3/368: write d11/d28/d2e/f47 [4841802,10141] 0
2026-03-09T00:03:40.220 INFO:tasks.workunit.client.1.vm06.stdout:3/369: read d11/d28/d2e/d2f/d36/f4a [5276746,17404] 0
2026-03-09T00:03:40.220 INFO:tasks.workunit.client.1.vm06.stdout:3/370: chown d11/d28/d2e/d2f/d36/f59 2519 1
2026-03-09T00:03:40.223 INFO:tasks.workunit.client.1.vm06.stdout:2/433: truncate d7/da/db/f6e 2908708 0
2026-03-09T00:03:40.224 INFO:tasks.workunit.client.0.vm03.stdout:8/134: mknod d7/df/d1e/c28 0
2026-03-09T00:03:40.224 INFO:tasks.workunit.client.0.vm03.stdout:8/135: fdatasync d7/f11 0
2026-03-09T00:03:40.229 INFO:tasks.workunit.client.1.vm06.stdout:8/335: rmdir db/dd/d24/d36/d38 39
2026-03-09T00:03:40.229 INFO:tasks.workunit.client.1.vm06.stdout:8/336: write db/d1e/f2e [2730459,125589] 0
2026-03-09T00:03:40.239 INFO:tasks.workunit.client.0.vm03.stdout:4/141: dread d7/f12 [0,4194304] 0
2026-03-09T00:03:40.242 INFO:tasks.workunit.client.0.vm03.stdout:2/120: dwrite d8/d17/f20 [0,4194304] 0
2026-03-09T00:03:40.243 INFO:tasks.workunit.client.1.vm06.stdout:0/369: dwrite d3/d18/d1f/d44/f5a [0,4194304] 0
2026-03-09T00:03:40.244 INFO:tasks.workunit.client.0.vm03.stdout:5/122: rename f3 to d1c/f2e 0
2026-03-09T00:03:40.265 INFO:tasks.workunit.client.1.vm06.stdout:9/263: unlink d1/d3/d12/d21/d14/d25/f31 0
2026-03-09T00:03:40.265 INFO:tasks.workunit.client.1.vm06.stdout:9/264: getdents d1/d3/d4f/d52 0 2026-03-09T00:03:40.270 INFO:tasks.workunit.client.1.vm06.stdout:5/460: dwrite d5/d1c/d23/f4c [0,4194304] 0 2026-03-09T00:03:40.270 INFO:tasks.workunit.client.1.vm06.stdout:9/265: dread d1/d3/d12/d49/f56 [0,4194304] 0 2026-03-09T00:03:40.274 INFO:tasks.workunit.client.1.vm06.stdout:3/371: symlink d11/d28/d2e/d2f/d5b/l7f 0 2026-03-09T00:03:40.276 INFO:tasks.workunit.client.1.vm06.stdout:4/285: dwrite d17/d24/f2c [0,4194304] 0 2026-03-09T00:03:40.279 INFO:tasks.workunit.client.1.vm06.stdout:6/328: dwrite d4/d16/f34 [4194304,4194304] 0 2026-03-09T00:03:40.280 INFO:tasks.workunit.client.1.vm06.stdout:6/329: dread d4/d27/d3e/f55 [0,4194304] 0 2026-03-09T00:03:40.280 INFO:tasks.workunit.client.1.vm06.stdout:6/330: chown d4/f2a 2514 1 2026-03-09T00:03:40.280 INFO:tasks.workunit.client.1.vm06.stdout:6/331: write d4/d16/f5e [575798,87782] 0 2026-03-09T00:03:40.285 INFO:tasks.workunit.client.0.vm03.stdout:5/123: dwrite d1c/f2e [0,4194304] 0 2026-03-09T00:03:40.285 INFO:tasks.workunit.client.0.vm03.stdout:5/124: dread - d1c/f29 zero size 2026-03-09T00:03:40.285 INFO:tasks.workunit.client.0.vm03.stdout:5/125: dread - d1c/d20/f25 zero size 2026-03-09T00:03:40.289 INFO:tasks.workunit.client.0.vm03.stdout:5/126: dread f18 [0,4194304] 0 2026-03-09T00:03:40.298 INFO:tasks.workunit.client.0.vm03.stdout:9/136: creat d15/d1c/d28/f29 x:0 0 0 2026-03-09T00:03:40.298 INFO:tasks.workunit.client.0.vm03.stdout:9/137: write d15/f1b [1949077,20615] 0 2026-03-09T00:03:40.300 INFO:tasks.workunit.client.1.vm06.stdout:2/434: rename d7/d1b/f6c to d7/d1b/d31/f7d 0 2026-03-09T00:03:40.300 INFO:tasks.workunit.client.1.vm06.stdout:2/435: chown d7/da/d63 102 1 2026-03-09T00:03:40.301 INFO:tasks.workunit.client.1.vm06.stdout:8/337: getdents db/dd/d24/d36 0 2026-03-09T00:03:40.317 INFO:tasks.workunit.client.1.vm06.stdout:1/309: mkdir d6/d63 0 2026-03-09T00:03:40.321 INFO:tasks.workunit.client.0.vm03.stdout:7/154: creat d2/d1f/f30 x:0 0 0 2026-03-09T00:03:40.325 INFO:tasks.workunit.client.1.vm06.stdout:0/370: mkdir d3/d18/d2c/d2d/d74/d7d 0 2026-03-09T00:03:40.326 INFO:tasks.workunit.client.0.vm03.stdout:3/102: symlink d2/db/l1f 0 2026-03-09T00:03:40.329 INFO:tasks.workunit.client.0.vm03.stdout:8/136: creat d7/df/f29 x:0 0 0 2026-03-09T00:03:40.329 INFO:tasks.workunit.client.0.vm03.stdout:1/159: getdents d4/d6 0 2026-03-09T00:03:40.350 INFO:tasks.workunit.client.0.vm03.stdout:4/142: rename d7/ff to d7/f26 0 2026-03-09T00:03:40.350 INFO:tasks.workunit.client.0.vm03.stdout:4/143: write d7/f1e [498295,102493] 0 2026-03-09T00:03:40.351 INFO:tasks.workunit.client.1.vm06.stdout:8/338: dwrite db/d1e/f52 [0,4194304] 0 2026-03-09T00:03:40.357 INFO:tasks.workunit.client.0.vm03.stdout:4/144: dread d7/f15 [0,4194304] 0 2026-03-09T00:03:40.363 INFO:tasks.workunit.client.0.vm03.stdout:8/137: dwrite d7/fd [0,4194304] 0 2026-03-09T00:03:40.363 INFO:tasks.workunit.client.0.vm03.stdout:8/138: write d7/f25 [222392,80464] 0 2026-03-09T00:03:40.364 INFO:tasks.workunit.client.1.vm06.stdout:5/461: link d5/l1e d5/d44/d4b/d92/d49/l99 0 2026-03-09T00:03:40.372 INFO:tasks.workunit.client.1.vm06.stdout:9/266: symlink d1/d3/d4f/d52/l59 0 2026-03-09T00:03:40.372 INFO:tasks.workunit.client.1.vm06.stdout:9/267: chown d1/d3/d12/d21/d14/d25/f4a 2080 1 2026-03-09T00:03:40.372 INFO:tasks.workunit.client.1.vm06.stdout:9/268: creat d1/d3/d12/f5a x:0 0 0 2026-03-09T00:03:40.372 INFO:tasks.workunit.client.1.vm06.stdout:9/269: readlink d1/d3/d12/d21/l1d 0 
2026-03-09T00:03:40.372 INFO:tasks.workunit.client.1.vm06.stdout:9/270: fsync d1/d3/d50/f53 0 2026-03-09T00:03:40.380 INFO:tasks.workunit.client.0.vm03.stdout:2/121: mkdir d8/d26 0 2026-03-09T00:03:40.380 INFO:tasks.workunit.client.0.vm03.stdout:0/134: sync 2026-03-09T00:03:40.383 INFO:tasks.workunit.client.1.vm06.stdout:4/286: creat d17/d21/f5d x:0 0 0 2026-03-09T00:03:40.386 INFO:tasks.workunit.client.0.vm03.stdout:5/127: unlink d1c/d20/d2a/f2c 0 2026-03-09T00:03:40.386 INFO:tasks.workunit.client.0.vm03.stdout:5/128: chown d1c/d20/l21 3623 1 2026-03-09T00:03:40.395 INFO:tasks.workunit.client.0.vm03.stdout:6/134: sync 2026-03-09T00:03:40.395 INFO:tasks.workunit.client.0.vm03.stdout:6/135: readlink l11 0 2026-03-09T00:03:40.395 INFO:tasks.workunit.client.0.vm03.stdout:6/136: write d13/f1f [5093214,106423] 0 2026-03-09T00:03:40.401 INFO:tasks.workunit.client.0.vm03.stdout:9/138: symlink d15/l2a 0 2026-03-09T00:03:40.408 INFO:tasks.workunit.client.0.vm03.stdout:7/155: mkdir d2/d4/d15/d31 0 2026-03-09T00:03:40.416 INFO:tasks.workunit.client.0.vm03.stdout:5/129: write f11 [2407018,68039] 0 2026-03-09T00:03:40.416 INFO:tasks.workunit.client.0.vm03.stdout:1/160: creat d4/d15/f35 x:0 0 0 2026-03-09T00:03:40.421 INFO:tasks.workunit.client.0.vm03.stdout:7/156: dread d2/d4/d15/f19 [0,4194304] 0 2026-03-09T00:03:40.421 INFO:tasks.workunit.client.0.vm03.stdout:7/157: write d2/d4/f22 [219114,46035] 0 2026-03-09T00:03:40.430 INFO:tasks.workunit.client.0.vm03.stdout:2/122: dwrite f6 [0,4194304] 0 2026-03-09T00:03:40.432 INFO:tasks.workunit.client.0.vm03.stdout:4/145: mkdir d7/d27 0 2026-03-09T00:03:40.437 INFO:tasks.workunit.client.1.vm06.stdout:6/332: creat d4/d16/d53/f5f x:0 0 0 2026-03-09T00:03:40.451 INFO:tasks.workunit.client.0.vm03.stdout:8/139: creat d7/df/d1a/f2a x:0 0 0 2026-03-09T00:03:40.454 INFO:tasks.workunit.client.1.vm06.stdout:1/310: mknod d6/d21/d2d/d37/c64 0 2026-03-09T00:03:40.454 INFO:tasks.workunit.client.1.vm06.stdout:2/436: rmdir d7/da/d4e 39 2026-03-09T00:03:40.464 INFO:tasks.workunit.client.0.vm03.stdout:6/137: symlink d13/l29 0 2026-03-09T00:03:40.464 INFO:tasks.workunit.client.1.vm06.stdout:8/339: mkdir db/d53/d6d 0 2026-03-09T00:03:40.464 INFO:tasks.workunit.client.1.vm06.stdout:0/371: truncate d3/f11 1396632 0 2026-03-09T00:03:40.464 INFO:tasks.workunit.client.1.vm06.stdout:8/340: fsync db/dd/f27 0 2026-03-09T00:03:40.464 INFO:tasks.workunit.client.1.vm06.stdout:5/462: mknod d5/d1c/d23/d51/c9a 0 2026-03-09T00:03:40.464 INFO:tasks.workunit.client.1.vm06.stdout:5/463: dread - d5/d44/d4b/f91 zero size 2026-03-09T00:03:40.467 INFO:tasks.workunit.client.0.vm03.stdout:1/161: dwrite d4/fb [0,4194304] 0 2026-03-09T00:03:40.474 INFO:tasks.workunit.client.0.vm03.stdout:5/130: mknod d1c/d20/c2f 0 2026-03-09T00:03:40.474 INFO:tasks.workunit.client.0.vm03.stdout:5/131: write f15 [454543,116359] 0 2026-03-09T00:03:40.474 INFO:tasks.workunit.client.0.vm03.stdout:5/132: creat d1c/f30 x:0 0 0 2026-03-09T00:03:40.475 INFO:tasks.workunit.client.1.vm06.stdout:7/362: sync 2026-03-09T00:03:40.476 INFO:tasks.workunit.client.0.vm03.stdout:7/158: mkdir d2/d4/d15/d24/d32 0 2026-03-09T00:03:40.487 INFO:tasks.workunit.client.1.vm06.stdout:9/271: unlink d1/d3/d12/d21/d14/l3f 0 2026-03-09T00:03:40.487 INFO:tasks.workunit.client.0.vm03.stdout:2/123: creat d8/d17/f27 x:0 0 0 2026-03-09T00:03:40.487 INFO:tasks.workunit.client.0.vm03.stdout:2/124: write d8/d17/f27 [295537,51600] 0 2026-03-09T00:03:40.494 INFO:tasks.workunit.client.1.vm06.stdout:3/372: getdents d11/d28/d2e/d2f/d5b 0 2026-03-09T00:03:40.494 
INFO:tasks.workunit.client.1.vm06.stdout:3/373: fdatasync d11/f24 0 2026-03-09T00:03:40.494 INFO:tasks.workunit.client.1.vm06.stdout:3/374: chown d11/d28/f5e 522 1 2026-03-09T00:03:40.494 INFO:tasks.workunit.client.1.vm06.stdout:3/375: readlink d11/d28/d2e/l3b 0 2026-03-09T00:03:40.495 INFO:tasks.workunit.client.0.vm03.stdout:3/103: sync 2026-03-09T00:03:40.497 INFO:tasks.workunit.client.1.vm06.stdout:4/287: mkdir d17/d24/d3b/d5e 0 2026-03-09T00:03:40.499 INFO:tasks.workunit.client.0.vm03.stdout:8/140: mkdir d7/df/d1a/d2b 0 2026-03-09T00:03:40.503 INFO:tasks.workunit.client.1.vm06.stdout:6/333: getdents d4/d27/d42 0 2026-03-09T00:03:40.519 INFO:tasks.workunit.client.1.vm06.stdout:1/311: mknod d6/d21/d2d/d3b/d42/d43/c65 0 2026-03-09T00:03:40.519 INFO:tasks.workunit.client.1.vm06.stdout:1/312: truncate d6/d21/d2d/d3b/d42/d43/d4d/f59 620491 0 2026-03-09T00:03:40.520 INFO:tasks.workunit.client.0.vm03.stdout:6/138: symlink d13/l2a 0 2026-03-09T00:03:40.522 INFO:tasks.workunit.client.1.vm06.stdout:2/437: link d7/d1a/d39/c3d d7/da/d55/c7e 0 2026-03-09T00:03:40.522 INFO:tasks.workunit.client.1.vm06.stdout:2/438: write d7/da/db/de/f32 [819196,48897] 0 2026-03-09T00:03:40.523 INFO:tasks.workunit.client.0.vm03.stdout:9/139: unlink la 0 2026-03-09T00:03:40.526 INFO:tasks.workunit.client.0.vm03.stdout:1/162: symlink d4/d6/d31/d32/l36 0 2026-03-09T00:03:40.530 INFO:tasks.workunit.client.1.vm06.stdout:0/372: creat d3/d18/d2c/f7e x:0 0 0 2026-03-09T00:03:40.530 INFO:tasks.workunit.client.1.vm06.stdout:8/341: creat db/dd/d24/f6e x:0 0 0 2026-03-09T00:03:40.530 INFO:tasks.workunit.client.1.vm06.stdout:8/342: fsync db/dd/f67 0 2026-03-09T00:03:40.531 INFO:tasks.workunit.client.1.vm06.stdout:3/376: truncate d11/f1d 255895 0 2026-03-09T00:03:40.531 INFO:tasks.workunit.client.1.vm06.stdout:3/377: write d11/d28/d2e/d2f/d36/f75 [930916,5527] 0 2026-03-09T00:03:40.532 INFO:tasks.workunit.client.0.vm03.stdout:5/133: truncate fe 237826 0 2026-03-09T00:03:40.533 INFO:tasks.workunit.client.1.vm06.stdout:4/288: mkdir d17/d24/d49/d5f 0 2026-03-09T00:03:40.533 INFO:tasks.workunit.client.1.vm06.stdout:4/289: chown d17/d24 2045642 1 2026-03-09T00:03:40.533 INFO:tasks.workunit.client.1.vm06.stdout:4/290: fsync d17/d24/d3b/d54/f58 0 2026-03-09T00:03:40.533 INFO:tasks.workunit.client.1.vm06.stdout:7/363: dwrite d0/df/d1a/d3a/d4e/f63 [0,4194304] 0 2026-03-09T00:03:40.533 INFO:tasks.workunit.client.1.vm06.stdout:7/364: write d0/df/f13 [2159973,52468] 0 2026-03-09T00:03:40.534 INFO:tasks.workunit.client.0.vm03.stdout:7/159: symlink d2/d4/d15/d24/d32/l33 0 2026-03-09T00:03:40.538 INFO:tasks.workunit.client.1.vm06.stdout:2/439: mknod d7/da/d63/c7f 0 2026-03-09T00:03:40.539 INFO:tasks.workunit.client.0.vm03.stdout:2/125: mknod d8/c28 0 2026-03-09T00:03:40.541 INFO:tasks.workunit.client.0.vm03.stdout:3/104: fdatasync d2/db/f17 0 2026-03-09T00:03:40.541 INFO:tasks.workunit.client.1.vm06.stdout:5/464: rmdir d5/d44/d4b/d92/d49 39 2026-03-09T00:03:40.541 INFO:tasks.workunit.client.0.vm03.stdout:9/140: link d15/l1a d15/d1c/d28/l2b 0 2026-03-09T00:03:40.541 INFO:tasks.workunit.client.0.vm03.stdout:9/141: readlink d15/d1c/d28/l2b 0 2026-03-09T00:03:40.541 INFO:tasks.workunit.client.0.vm03.stdout:9/142: creat d15/f2c x:0 0 0 2026-03-09T00:03:40.544 INFO:tasks.workunit.client.0.vm03.stdout:1/163: rmdir d4/d15/d1a/d2f 0 2026-03-09T00:03:40.545 INFO:tasks.workunit.client.1.vm06.stdout:8/343: creat db/d53/d5c/f6f x:0 0 0 2026-03-09T00:03:40.546 INFO:tasks.workunit.client.1.vm06.stdout:3/378: symlink d11/d28/d2e/d2f/d5b/d5f/l80 0 
2026-03-09T00:03:40.546 INFO:tasks.workunit.client.1.vm06.stdout:3/379: creat d11/d28/d2e/d2f/d5b/d5f/f81 x:0 0 0 2026-03-09T00:03:40.547 INFO:tasks.workunit.client.1.vm06.stdout:5/465: dread d5/d44/d4b/d92/d49/f83 [0,4194304] 0 2026-03-09T00:03:40.547 INFO:tasks.workunit.client.0.vm03.stdout:2/126: unlink d8/d1b/f23 0 2026-03-09T00:03:40.547 INFO:tasks.workunit.client.0.vm03.stdout:7/160: creat d2/d4/f34 x:0 0 0 2026-03-09T00:03:40.553 INFO:tasks.workunit.client.1.vm06.stdout:4/291: rename d17/d24/d49/f47 to d17/d21/d4c/d50/f60 0 2026-03-09T00:03:40.554 INFO:tasks.workunit.client.1.vm06.stdout:8/344: read db/d1e/f60 [228257,83005] 0 2026-03-09T00:03:40.554 INFO:tasks.workunit.client.0.vm03.stdout:3/105: creat d2/db/f20 x:0 0 0 2026-03-09T00:03:40.554 INFO:tasks.workunit.client.0.vm03.stdout:1/164: symlink d4/d15/l37 0 2026-03-09T00:03:40.558 INFO:tasks.workunit.client.1.vm06.stdout:1/313: truncate d6/fa 2066161 0 2026-03-09T00:03:40.563 INFO:tasks.workunit.client.1.vm06.stdout:2/440: unlink d7/d1b/d5a/f62 0 2026-03-09T00:03:40.563 INFO:tasks.workunit.client.0.vm03.stdout:7/161: mkdir d2/d1f/d35 0 2026-03-09T00:03:40.563 INFO:tasks.workunit.client.0.vm03.stdout:3/106: creat d2/db/f21 x:0 0 0 2026-03-09T00:03:40.563 INFO:tasks.workunit.client.0.vm03.stdout:3/107: fsync d2/f1c 0 2026-03-09T00:03:40.571 INFO:tasks.workunit.client.1.vm06.stdout:9/272: dwrite d1/f45 [0,4194304] 0 2026-03-09T00:03:40.577 INFO:tasks.workunit.client.0.vm03.stdout:7/162: dread d2/d4/d15/f1a [0,4194304] 0 2026-03-09T00:03:40.577 INFO:tasks.workunit.client.0.vm03.stdout:7/163: write d2/f3 [3997241,27977] 0 2026-03-09T00:03:40.578 INFO:tasks.workunit.client.0.vm03.stdout:1/165: rename d4/l13 to d4/d6/l38 0 2026-03-09T00:03:40.578 INFO:tasks.workunit.client.0.vm03.stdout:1/166: chown d4/d6/d31 57 1 2026-03-09T00:03:40.578 INFO:tasks.workunit.client.1.vm06.stdout:5/466: symlink d5/d1c/d21/d28/l9b 0 2026-03-09T00:03:40.579 INFO:tasks.workunit.client.0.vm03.stdout:7/164: symlink d2/l36 0 2026-03-09T00:03:40.580 INFO:tasks.workunit.client.1.vm06.stdout:8/345: rename db/dd/d24/d36 to db/d53/d70 0 2026-03-09T00:03:40.580 INFO:tasks.workunit.client.1.vm06.stdout:1/314: mknod d6/d21/d2d/d3b/d42/c66 0 2026-03-09T00:03:40.581 INFO:tasks.workunit.client.1.vm06.stdout:2/441: rmdir d7/da/d1c 39 2026-03-09T00:03:40.583 INFO:tasks.workunit.client.0.vm03.stdout:1/167: creat d4/f39 x:0 0 0 2026-03-09T00:03:40.588 INFO:tasks.workunit.client.0.vm03.stdout:1/168: stat d4/d6/f34 0 2026-03-09T00:03:40.588 INFO:tasks.workunit.client.0.vm03.stdout:1/169: write d4/d6/f33 [356541,28436] 0 2026-03-09T00:03:40.589 INFO:tasks.workunit.client.0.vm03.stdout:3/108: dread d2/f8 [0,4194304] 0 2026-03-09T00:03:40.589 INFO:tasks.workunit.client.1.vm06.stdout:5/467: creat d5/d44/d4b/d92/d95/f9c x:0 0 0 2026-03-09T00:03:40.589 INFO:tasks.workunit.client.1.vm06.stdout:5/468: creat d5/d1c/d68/f9d x:0 0 0 2026-03-09T00:03:40.589 INFO:tasks.workunit.client.1.vm06.stdout:1/315: write d6/fb [2532992,114972] 0 2026-03-09T00:03:40.591 INFO:tasks.workunit.client.1.vm06.stdout:9/273: write d1/d4/f6 [3193380,40274] 0 2026-03-09T00:03:40.591 INFO:tasks.workunit.client.1.vm06.stdout:1/316: mkdir d6/d4c/d67 0 2026-03-09T00:03:40.591 INFO:tasks.workunit.client.1.vm06.stdout:1/317: rmdir d6/d4c/d67 0 2026-03-09T00:03:40.595 INFO:tasks.workunit.client.0.vm03.stdout:1/170: read d4/d6/f20 [1732777,32963] 0 2026-03-09T00:03:40.597 INFO:tasks.workunit.client.0.vm03.stdout:1/171: fdatasync d4/d6/f8 0 2026-03-09T00:03:40.599 INFO:tasks.workunit.client.1.vm06.stdout:5/469: dread 
d5/f14 [4194304,4194304] 0 2026-03-09T00:03:40.599 INFO:tasks.workunit.client.1.vm06.stdout:5/470: write d5/d1c/d23/f54 [1090293,95820] 0 2026-03-09T00:03:40.599 INFO:tasks.workunit.client.1.vm06.stdout:5/471: mknod d5/d44/c9e 0 2026-03-09T00:03:40.599 INFO:tasks.workunit.client.1.vm06.stdout:2/442: mknod d7/da/d4e/c80 0 2026-03-09T00:03:40.599 INFO:tasks.workunit.client.1.vm06.stdout:8/346: link f7 db/d53/d70/f71 0 2026-03-09T00:03:40.599 INFO:tasks.workunit.client.1.vm06.stdout:8/347: dread - db/d53/d70/d38/d4d/f65 zero size 2026-03-09T00:03:40.599 INFO:tasks.workunit.client.1.vm06.stdout:9/274: rename d1/d4/l51 to d1/l5b 0 2026-03-09T00:03:40.599 INFO:tasks.workunit.client.1.vm06.stdout:9/275: write d1/d3/d12/d21/d9/f3d [1191814,71944] 0 2026-03-09T00:03:40.599 INFO:tasks.workunit.client.1.vm06.stdout:1/318: symlink d6/d21/d2d/d3b/d42/d43/d4d/l68 0 2026-03-09T00:03:40.603 INFO:tasks.workunit.client.1.vm06.stdout:6/334: dwrite d4/f2d [0,4194304] 0 2026-03-09T00:03:40.603 INFO:tasks.workunit.client.1.vm06.stdout:6/335: chown d4/c24 0 1 2026-03-09T00:03:40.603 INFO:tasks.workunit.client.1.vm06.stdout:9/276: dread d1/d3/d12/f3a [4194304,4194304] 0 2026-03-09T00:03:40.603 INFO:tasks.workunit.client.0.vm03.stdout:8/141: dwrite d7/df/d1a/f2a [0,4194304] 0 2026-03-09T00:03:40.606 INFO:tasks.workunit.client.1.vm06.stdout:2/443: mkdir d7/da/d63/d81 0 2026-03-09T00:03:40.606 INFO:tasks.workunit.client.1.vm06.stdout:1/319: write d6/f1b [1007862,53856] 0 2026-03-09T00:03:40.608 INFO:tasks.workunit.client.1.vm06.stdout:6/336: creat d4/d27/d42/f60 x:0 0 0 2026-03-09T00:03:40.608 INFO:tasks.workunit.client.1.vm06.stdout:9/277: rename d1/d3/d12/f3a to d1/d3/f5c 0 2026-03-09T00:03:40.624 INFO:tasks.workunit.client.1.vm06.stdout:1/320: creat d6/d21/f69 x:0 0 0 2026-03-09T00:03:40.625 INFO:tasks.workunit.client.1.vm06.stdout:6/337: creat d4/d27/f61 x:0 0 0 2026-03-09T00:03:40.625 INFO:tasks.workunit.client.1.vm06.stdout:6/338: chown d4/d16/c2b 25613866 1 2026-03-09T00:03:40.630 INFO:tasks.workunit.client.1.vm06.stdout:9/278: link d1/d3/d2b/f33 d1/d3/d12/d21/f5d 0 2026-03-09T00:03:40.633 INFO:tasks.workunit.client.1.vm06.stdout:6/339: mknod d4/d16/d46/c62 0 2026-03-09T00:03:40.633 INFO:tasks.workunit.client.1.vm06.stdout:6/340: creat d4/d16/f63 x:0 0 0 2026-03-09T00:03:40.639 INFO:tasks.workunit.client.1.vm06.stdout:3/380: write d11/d28/d2e/f65 [1169344,64671] 0 2026-03-09T00:03:40.653 INFO:tasks.workunit.client.0.vm03.stdout:5/134: dwrite ff [0,4194304] 0 2026-03-09T00:03:40.653 INFO:tasks.workunit.client.0.vm03.stdout:5/135: fsync f18 0 2026-03-09T00:03:40.653 INFO:tasks.workunit.client.0.vm03.stdout:5/136: write d1c/f30 [868558,84622] 0 2026-03-09T00:03:40.660 INFO:tasks.workunit.client.0.vm03.stdout:7/165: dread d2/d4/d15/f29 [0,4194304] 0 2026-03-09T00:03:40.670 INFO:tasks.workunit.client.1.vm06.stdout:2/444: read d7/da/db/de/f53 [427117,80857] 0 2026-03-09T00:03:40.670 INFO:tasks.workunit.client.1.vm06.stdout:2/445: write d7/d1a/d56/f50 [1690687,46732] 0 2026-03-09T00:03:40.672 INFO:tasks.workunit.client.0.vm03.stdout:2/127: dwrite d8/d17/f1c [0,4194304] 0 2026-03-09T00:03:40.673 INFO:tasks.workunit.client.0.vm03.stdout:5/137: unlink d1c/d20/l21 0 2026-03-09T00:03:40.675 INFO:tasks.workunit.client.0.vm03.stdout:7/166: mkdir d2/d4/d15/d31/d37 0 2026-03-09T00:03:40.699 INFO:tasks.workunit.client.1.vm06.stdout:2/446: rename d7/da/db/de/lf to d7/da/d63/d81/l82 0 2026-03-09T00:03:40.699 INFO:tasks.workunit.client.1.vm06.stdout:2/447: creat d7/da/d4e/f83 x:0 0 0 2026-03-09T00:03:40.699 
INFO:tasks.workunit.client.0.vm03.stdout:2/128: symlink d8/d1b/l29 0 2026-03-09T00:03:40.699 INFO:tasks.workunit.client.0.vm03.stdout:2/129: chown d8/f12 62165546 1 2026-03-09T00:03:40.699 INFO:tasks.workunit.client.0.vm03.stdout:5/138: rename d1c/c26 to d1c/c31 0 2026-03-09T00:03:40.699 INFO:tasks.workunit.client.0.vm03.stdout:2/130: mkdir d8/d1b/d2a 0 2026-03-09T00:03:40.699 INFO:tasks.workunit.client.0.vm03.stdout:2/131: mknod d8/d26/c2b 0 2026-03-09T00:03:40.699 INFO:tasks.workunit.client.0.vm03.stdout:2/132: write d8/d1b/f1f [1002132,18378] 0 2026-03-09T00:03:40.699 INFO:tasks.workunit.client.0.vm03.stdout:2/133: dread - d8/f21 zero size 2026-03-09T00:03:40.699 INFO:tasks.workunit.client.0.vm03.stdout:2/134: creat d8/d17/f2c x:0 0 0 2026-03-09T00:03:40.703 INFO:tasks.workunit.client.0.vm03.stdout:2/135: creat d8/d1b/d2a/f2d x:0 0 0 2026-03-09T00:03:40.703 INFO:tasks.workunit.client.0.vm03.stdout:2/136: write f2 [5085105,28792] 0 2026-03-09T00:03:40.713 INFO:tasks.workunit.client.0.vm03.stdout:6/139: dwrite d13/f17 [0,4194304] 0 2026-03-09T00:03:40.713 INFO:tasks.workunit.client.1.vm06.stdout:2/448: write d7/d1b/d31/f7d [1705393,118384] 0 2026-03-09T00:03:40.725 INFO:tasks.workunit.client.0.vm03.stdout:6/140: link ce d13/c2b 0 2026-03-09T00:03:40.727 INFO:tasks.workunit.client.0.vm03.stdout:0/135: sync 2026-03-09T00:03:40.727 INFO:tasks.workunit.client.0.vm03.stdout:0/136: read - d2/da/d1a/f25 zero size 2026-03-09T00:03:40.727 INFO:tasks.workunit.client.0.vm03.stdout:4/146: sync 2026-03-09T00:03:40.727 INFO:tasks.workunit.client.0.vm03.stdout:4/147: creat d7/f28 x:0 0 0 2026-03-09T00:03:40.736 INFO:tasks.workunit.client.0.vm03.stdout:0/137: rename d2/d1f/c28 to d2/d1f/c31 0 2026-03-09T00:03:40.754 INFO:tasks.workunit.client.0.vm03.stdout:0/138: truncate d2/f1e 4787967 0 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:0/139: read d2/f1e [530568,65246] 0 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:0/140: fdatasync d2/ff 0 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:4/148: mkdir d7/d20/d29 0 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:6/141: getdents d13/d1e 0 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:6/142: readlink l11 0 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:6/143: creat d13/f2c x:0 0 0 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:0/141: creat d2/f32 x:0 0 0 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:0/142: fdatasync f0 0 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:0/143: chown d2/da/c10 455977021 1 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:4/149: unlink d7/f26 0 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:4/150: dread d7/f12 [0,4194304] 0 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:6/144: creat d13/d1e/f2d x:0 0 0 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:0/144: rmdir d2/da/d1a 39 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:0/145: chown d2/da/d1a/c20 26415 1 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:4/151: unlink d7/fc 0 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:4/152: creat d7/d20/d29/f2a x:0 0 0 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:4/153: fsync d7/fe 0 2026-03-09T00:03:40.755 INFO:tasks.workunit.client.0.vm03.stdout:4/154: write d7/f22 [712663,86796] 0 2026-03-09T00:03:40.755 
INFO:tasks.workunit.client.1.vm06.stdout:6/341: dwrite d4/ff [0,4194304] 0 2026-03-09T00:03:40.756 INFO:tasks.workunit.client.0.vm03.stdout:6/145: symlink d13/d1e/l2e 0 2026-03-09T00:03:40.756 INFO:tasks.workunit.client.0.vm03.stdout:6/146: dread - d13/f14 zero size 2026-03-09T00:03:40.756 INFO:tasks.workunit.client.0.vm03.stdout:6/147: read - d13/f14 zero size 2026-03-09T00:03:40.756 INFO:tasks.workunit.client.0.vm03.stdout:6/148: creat d13/f2f x:0 0 0 2026-03-09T00:03:40.760 INFO:tasks.workunit.client.0.vm03.stdout:0/146: symlink d2/l33 0 2026-03-09T00:03:40.760 INFO:tasks.workunit.client.0.vm03.stdout:0/147: fdatasync d2/fe 0 2026-03-09T00:03:40.762 INFO:tasks.workunit.client.0.vm03.stdout:4/155: dread d7/d20/f21 [0,4194304] 0 2026-03-09T00:03:40.762 INFO:tasks.workunit.client.0.vm03.stdout:0/148: dread d2/fb [0,4194304] 0 2026-03-09T00:03:40.763 INFO:tasks.workunit.client.1.vm06.stdout:8/348: dwrite db/d1e/d46/f69 [0,4194304] 0 2026-03-09T00:03:40.763 INFO:tasks.workunit.client.1.vm06.stdout:8/349: chown db/dd/f67 14682 1 2026-03-09T00:03:40.763 INFO:tasks.workunit.client.1.vm06.stdout:8/350: chown db/d1e/f4f 16 1 2026-03-09T00:03:40.763 INFO:tasks.workunit.client.1.vm06.stdout:8/351: chown db/d1e/l3b 3 1 2026-03-09T00:03:40.763 INFO:tasks.workunit.client.1.vm06.stdout:8/352: truncate db/d1e/f34 4793069 0 2026-03-09T00:03:40.763 INFO:tasks.workunit.client.1.vm06.stdout:8/353: fsync db/f1d 0 2026-03-09T00:03:40.763 INFO:tasks.workunit.client.1.vm06.stdout:8/354: read - db/d53/d5c/f6f zero size 2026-03-09T00:03:40.763 INFO:tasks.workunit.client.1.vm06.stdout:8/355: stat db/dd/d24/d63 0 2026-03-09T00:03:40.763 INFO:tasks.workunit.client.1.vm06.stdout:8/356: dread - db/d1e/d46/f5d zero size 2026-03-09T00:03:40.763 INFO:tasks.workunit.client.1.vm06.stdout:8/357: chown db/dd 1030106700 1 2026-03-09T00:03:40.773 INFO:tasks.workunit.client.1.vm06.stdout:5/472: dwrite d5/d44/d4b/d92/f86 [0,4194304] 0 2026-03-09T00:03:40.773 INFO:tasks.workunit.client.0.vm03.stdout:6/149: write d13/f1f [2333453,21893] 0 2026-03-09T00:03:40.775 INFO:tasks.workunit.client.0.vm03.stdout:0/149: dread d2/ff [0,4194304] 0 2026-03-09T00:03:40.775 INFO:tasks.workunit.client.0.vm03.stdout:0/150: write d2/da/f1b [958691,86853] 0 2026-03-09T00:03:40.775 INFO:tasks.workunit.client.0.vm03.stdout:0/151: fsync d2/da/dd/f24 0 2026-03-09T00:03:40.781 INFO:tasks.workunit.client.1.vm06.stdout:3/381: dread d11/d28/d2e/f38 [0,4194304] 0 2026-03-09T00:03:40.781 INFO:tasks.workunit.client.1.vm06.stdout:3/382: read d11/d28/d2e/d2f/d5b/f7d [622274,86557] 0 2026-03-09T00:03:40.781 INFO:tasks.workunit.client.1.vm06.stdout:3/383: write d11/d28/d2e/d7e/f73 [612221,22351] 0 2026-03-09T00:03:40.799 INFO:tasks.workunit.client.1.vm06.stdout:6/342: symlink d4/d16/d53/l64 0 2026-03-09T00:03:40.801 INFO:tasks.workunit.client.1.vm06.stdout:6/343: dread d4/d16/f21 [0,4194304] 0 2026-03-09T00:03:40.802 INFO:tasks.workunit.client.1.vm06.stdout:6/344: creat d4/d27/d3e/d57/f65 x:0 0 0 2026-03-09T00:03:40.806 INFO:tasks.workunit.client.0.vm03.stdout:6/150: fdatasync d13/d1e/f27 0 2026-03-09T00:03:40.816 INFO:tasks.workunit.client.0.vm03.stdout:6/151: rename d13/d1e/f27 to d13/d1e/f30 0 2026-03-09T00:03:40.830 INFO:tasks.workunit.client.0.vm03.stdout:6/152: truncate fb 6110481 0 2026-03-09T00:03:40.831 INFO:tasks.workunit.client.0.vm03.stdout:6/153: rename d13/f1f to d13/f31 0 2026-03-09T00:03:40.831 INFO:tasks.workunit.client.0.vm03.stdout:6/154: fsync f12 0 2026-03-09T00:03:40.831 INFO:tasks.workunit.client.0.vm03.stdout:6/155: symlink d13/l32 0 
2026-03-09T00:03:40.831 INFO:tasks.workunit.client.1.vm06.stdout:5/473: mknod d5/d44/d4b/d92/d95/c9f 0 2026-03-09T00:03:40.831 INFO:tasks.workunit.client.1.vm06.stdout:6/345: mkdir d4/d66 0 2026-03-09T00:03:40.831 INFO:tasks.workunit.client.1.vm06.stdout:3/384: rmdir d11/d28/d4d 39 2026-03-09T00:03:40.831 INFO:tasks.workunit.client.1.vm06.stdout:5/474: mkdir d5/d44/d4b/d92/d49/da0 0 2026-03-09T00:03:40.831 INFO:tasks.workunit.client.1.vm06.stdout:6/346: mkdir d4/d16/d53/d67 0 2026-03-09T00:03:40.832 INFO:tasks.workunit.client.1.vm06.stdout:6/347: rename d4/d16/f32 to d4/f68 0 2026-03-09T00:03:40.832 INFO:tasks.workunit.client.1.vm06.stdout:6/348: dread - d4/d27/f61 zero size 2026-03-09T00:03:40.832 INFO:tasks.workunit.client.1.vm06.stdout:6/349: mknod d4/d27/d42/d52/d5d/c69 0 2026-03-09T00:03:40.835 INFO:tasks.workunit.client.1.vm06.stdout:5/475: dread d5/d44/d4b/d92/f86 [0,4194304] 0 2026-03-09T00:03:40.841 INFO:tasks.workunit.client.1.vm06.stdout:6/350: dread d4/f3d [0,4194304] 0 2026-03-09T00:03:40.841 INFO:tasks.workunit.client.0.vm03.stdout:6/156: dread f8 [0,4194304] 0 2026-03-09T00:03:40.845 INFO:tasks.workunit.client.1.vm06.stdout:5/476: mkdir d5/d44/d4b/d92/d49/da1 0 2026-03-09T00:03:40.845 INFO:tasks.workunit.client.1.vm06.stdout:6/351: link d4/d16/d53/l64 d4/d27/d3e/l6a 0 2026-03-09T00:03:40.859 INFO:tasks.workunit.client.1.vm06.stdout:5/477: mkdir d5/d1c/d68/da2 0 2026-03-09T00:03:40.867 INFO:tasks.workunit.client.1.vm06.stdout:2/449: dwrite d7/da/f18 [0,4194304] 0 2026-03-09T00:03:40.872 INFO:tasks.workunit.client.1.vm06.stdout:2/450: creat d7/d1a/d25/d66/f84 x:0 0 0 2026-03-09T00:03:40.872 INFO:tasks.workunit.client.1.vm06.stdout:2/451: readlink d7/d1a/l72 0 2026-03-09T00:03:40.872 INFO:tasks.workunit.client.1.vm06.stdout:1/321: dwrite d6/f41 [0,4194304] 0 2026-03-09T00:03:40.872 INFO:tasks.workunit.client.1.vm06.stdout:1/322: creat d6/d63/f6a x:0 0 0 2026-03-09T00:03:40.877 INFO:tasks.workunit.client.1.vm06.stdout:1/323: getdents d6/d21/d2d/d3b/d42/d43 0 2026-03-09T00:03:40.877 INFO:tasks.workunit.client.1.vm06.stdout:6/352: dread d4/f12 [0,4194304] 0 2026-03-09T00:03:40.877 INFO:tasks.workunit.client.1.vm06.stdout:6/353: creat d4/d27/d42/f6b x:0 0 0 2026-03-09T00:03:40.877 INFO:tasks.workunit.client.1.vm06.stdout:6/354: write d4/d16/f63 [403261,103511] 0 2026-03-09T00:03:40.878 INFO:tasks.workunit.client.0.vm03.stdout:4/156: dwrite d7/f12 [0,4194304] 0 2026-03-09T00:03:40.879 INFO:tasks.workunit.client.0.vm03.stdout:8/142: dwrite d7/fd [0,4194304] 0 2026-03-09T00:03:40.879 INFO:tasks.workunit.client.0.vm03.stdout:8/143: stat d7/df/d1a 0 2026-03-09T00:03:40.886 INFO:tasks.workunit.client.1.vm06.stdout:9/279: dwrite d1/d3/d12/d21/d9/f4c [0,4194304] 0 2026-03-09T00:03:40.886 INFO:tasks.workunit.client.1.vm06.stdout:9/280: write d1/d4/f54 [592012,85023] 0 2026-03-09T00:03:40.886 INFO:tasks.workunit.client.1.vm06.stdout:6/355: truncate d4/d27/f31 2277393 0 2026-03-09T00:03:40.894 INFO:tasks.workunit.client.1.vm06.stdout:9/281: rename d1/d3/f35 to d1/d3/d4f/d52/f5e 0 2026-03-09T00:03:40.897 INFO:tasks.workunit.client.1.vm06.stdout:9/282: unlink d1/d3/d12/d21/d14/f47 0 2026-03-09T00:03:40.897 INFO:tasks.workunit.client.1.vm06.stdout:9/283: dread - d1/d3/d12/d21/d14/d25/f4e zero size 2026-03-09T00:03:40.897 INFO:tasks.workunit.client.0.vm03.stdout:8/144: write d7/f15 [235506,57910] 0 2026-03-09T00:03:40.901 INFO:tasks.workunit.client.1.vm06.stdout:9/284: unlink d1/d3/d4f/c55 0 2026-03-09T00:03:40.914 INFO:tasks.workunit.client.1.vm06.stdout:9/285: creat d1/d3/d2b/d58/f5f x:0 0 0 
2026-03-09T00:03:40.914 INFO:tasks.workunit.client.1.vm06.stdout:9/286: link d1/d3/d12/f28 d1/d3/d12/d21/d14/d25/f60 0 2026-03-09T00:03:40.915 INFO:tasks.workunit.client.1.vm06.stdout:3/385: dwrite d11/d28/d2e/d2f/f74 [0,4194304] 0 2026-03-09T00:03:40.926 INFO:tasks.workunit.client.0.vm03.stdout:8/145: dread d7/f9 [0,4194304] 0 2026-03-09T00:03:40.940 INFO:tasks.workunit.client.0.vm03.stdout:8/146: unlink d7/cb 0 2026-03-09T00:03:40.940 INFO:tasks.workunit.client.0.vm03.stdout:8/147: getdents d7/df/d1e 0 2026-03-09T00:03:40.941 INFO:tasks.workunit.client.0.vm03.stdout:8/148: dread d7/f25 [0,4194304] 0 2026-03-09T00:03:40.941 INFO:tasks.workunit.client.0.vm03.stdout:8/149: creat d7/df/f2c x:0 0 0 2026-03-09T00:03:40.944 INFO:tasks.workunit.client.0.vm03.stdout:8/150: mknod d7/c2d 0 2026-03-09T00:03:40.971 INFO:tasks.workunit.client.1.vm06.stdout:9/287: dwrite d1/d3/d4f/d52/f5e [4194304,4194304] 0 2026-03-09T00:03:40.972 INFO:tasks.workunit.client.1.vm06.stdout:9/288: symlink d1/d3/d2b/d58/l61 0 2026-03-09T00:03:40.973 INFO:tasks.workunit.client.1.vm06.stdout:9/289: symlink d1/d3/d12/d21/d14/l62 0 2026-03-09T00:03:40.974 INFO:tasks.workunit.client.0.vm03.stdout:0/152: dwrite d2/f22 [0,4194304] 0 2026-03-09T00:03:40.976 INFO:tasks.workunit.client.0.vm03.stdout:0/153: unlink d2/d1f/c21 0 2026-03-09T00:03:40.991 INFO:tasks.workunit.client.1.vm06.stdout:1/324: write d6/fa [2590220,86020] 0 2026-03-09T00:03:40.991 INFO:tasks.workunit.client.1.vm06.stdout:1/325: write d6/d21/d2d/f5d [1208923,36524] 0 2026-03-09T00:03:40.993 INFO:tasks.workunit.client.1.vm06.stdout:1/326: mknod d6/d4c/c6b 0 2026-03-09T00:03:40.994 INFO:tasks.workunit.client.1.vm06.stdout:1/327: unlink d6/d21/d2d/d3b/c62 0 2026-03-09T00:03:41.005 INFO:tasks.workunit.client.1.vm06.stdout:1/328: readlink d6/l23 0 2026-03-09T00:03:41.006 INFO:tasks.workunit.client.0.vm03.stdout:1/172: truncate d4/d6/f33 47018 0 2026-03-09T00:03:41.006 INFO:tasks.workunit.client.1.vm06.stdout:1/329: rmdir d6/d21/d2d/d3b/d42 39 2026-03-09T00:03:41.006 INFO:tasks.workunit.client.1.vm06.stdout:1/330: rename d6/d21/d2d/f44 to d6/d21/d2d/f6c 0 2026-03-09T00:03:41.007 INFO:tasks.workunit.client.1.vm06.stdout:1/331: mkdir d6/d21/d2d/d37/d6d 0 2026-03-09T00:03:41.007 INFO:tasks.workunit.client.1.vm06.stdout:9/290: getdents d1 0 2026-03-09T00:03:41.007 INFO:tasks.workunit.client.1.vm06.stdout:9/291: rename d1/l5b to d1/d3/d50/l63 0 2026-03-09T00:03:41.007 INFO:tasks.workunit.client.1.vm06.stdout:9/292: readlink d1/d3/ld 0 2026-03-09T00:03:41.007 INFO:tasks.workunit.client.1.vm06.stdout:9/293: write d1/d4/d2f/f43 [103129,71456] 0 2026-03-09T00:03:41.007 INFO:tasks.workunit.client.1.vm06.stdout:9/294: truncate d1/d3/d2b/d58/f5f 415793 0 2026-03-09T00:03:41.007 INFO:tasks.workunit.client.1.vm06.stdout:9/295: truncate d1/d3/d12/d21/f5d 222632 0 2026-03-09T00:03:41.007 INFO:tasks.workunit.client.1.vm06.stdout:9/296: chown d1/d3/d2b/f33 12 1 2026-03-09T00:03:41.007 INFO:tasks.workunit.client.1.vm06.stdout:9/297: mknod d1/d3/d4f/c64 0 2026-03-09T00:03:41.007 INFO:tasks.workunit.client.1.vm06.stdout:9/298: creat d1/d3/d12/d21/d9/f65 x:0 0 0 2026-03-09T00:03:41.014 INFO:tasks.workunit.client.1.vm06.stdout:1/332: dread d6/f1d [0,4194304] 0 2026-03-09T00:03:41.015 INFO:tasks.workunit.client.0.vm03.stdout:2/137: dwrite d8/d17/f27 [0,4194304] 0 2026-03-09T00:03:41.015 INFO:tasks.workunit.client.0.vm03.stdout:5/139: dwrite f12 [0,4194304] 0 2026-03-09T00:03:41.015 INFO:tasks.workunit.client.0.vm03.stdout:5/140: fsync d1c/f1f 0 2026-03-09T00:03:41.016 
INFO:tasks.workunit.client.0.vm03.stdout:5/141: creat d1c/f32 x:0 0 0 2026-03-09T00:03:41.016 INFO:tasks.workunit.client.0.vm03.stdout:5/142: dread fe [0,4194304] 0 2026-03-09T00:03:41.028 INFO:tasks.workunit.client.1.vm06.stdout:6/356: dwrite d4/d27/d42/f60 [0,4194304] 0 2026-03-09T00:03:41.028 INFO:tasks.workunit.client.1.vm06.stdout:6/357: fdatasync d4/d27/d3e/d45/f4d 0 2026-03-09T00:03:41.028 INFO:tasks.workunit.client.1.vm06.stdout:9/299: dread d1/d3/d12/d21/d14/d25/f32 [0,4194304] 0 2026-03-09T00:03:41.030 INFO:tasks.workunit.client.0.vm03.stdout:1/173: rename d4/d6/d31 to d4/d3a 0 2026-03-09T00:03:41.030 INFO:tasks.workunit.client.0.vm03.stdout:1/174: chown d4/d6/ff 111 1 2026-03-09T00:03:41.033 INFO:tasks.workunit.client.1.vm06.stdout:5/478: dwrite d5/d1c/d68/f9d [0,4194304] 0 2026-03-09T00:03:41.037 INFO:tasks.workunit.client.1.vm06.stdout:1/333: write d6/f19 [2589737,66777] 0 2026-03-09T00:03:41.044 INFO:tasks.workunit.client.1.vm06.stdout:6/358: dread d4/d16/f34 [4194304,4194304] 0 2026-03-09T00:03:41.050 INFO:tasks.workunit.client.0.vm03.stdout:4/157: dwrite d7/f22 [0,4194304] 0 2026-03-09T00:03:41.050 INFO:tasks.workunit.client.0.vm03.stdout:4/158: write d7/fd [4161953,172] 0 2026-03-09T00:03:41.051 INFO:tasks.workunit.client.0.vm03.stdout:4/159: write d7/f1c [2605055,125956] 0 2026-03-09T00:03:41.053 INFO:tasks.workunit.client.0.vm03.stdout:1/175: dread d4/f9 [0,4194304] 0 2026-03-09T00:03:41.057 INFO:tasks.workunit.client.1.vm06.stdout:3/386: dwrite d11/d28/d2e/d2f/d5b/d5f/f60 [0,4194304] 0 2026-03-09T00:03:41.066 INFO:tasks.workunit.client.1.vm06.stdout:3/387: read - d11/d28/d2e/d2f/f64 zero size 2026-03-09T00:03:41.066 INFO:tasks.workunit.client.1.vm06.stdout:3/388: dread - d11/d28/d2e/d2f/f64 zero size 2026-03-09T00:03:41.066 INFO:tasks.workunit.client.1.vm06.stdout:2/452: dwrite d7/d1a/d25/d66/f84 [0,4194304] 0 2026-03-09T00:03:41.066 INFO:tasks.workunit.client.1.vm06.stdout:3/389: getdents d11/d28/d2e/d2f/d5b 0 2026-03-09T00:03:41.067 INFO:tasks.workunit.client.0.vm03.stdout:5/143: link fe d1c/d20/f33 0 2026-03-09T00:03:41.067 INFO:tasks.workunit.client.0.vm03.stdout:1/176: mknod d4/c3b 0 2026-03-09T00:03:41.067 INFO:tasks.workunit.client.1.vm06.stdout:6/359: write d4/d16/f34 [5678302,114152] 0 2026-03-09T00:03:41.067 INFO:tasks.workunit.client.1.vm06.stdout:5/479: symlink d5/d44/d4b/d92/d49/la3 0 2026-03-09T00:03:41.075 INFO:tasks.workunit.client.0.vm03.stdout:1/177: rmdir d4/d3a/d32 39 2026-03-09T00:03:41.076 INFO:tasks.workunit.client.1.vm06.stdout:2/453: dread f2 [0,4194304] 0 2026-03-09T00:03:41.076 INFO:tasks.workunit.client.0.vm03.stdout:1/178: creat d4/d15/f3c x:0 0 0 2026-03-09T00:03:41.076 INFO:tasks.workunit.client.0.vm03.stdout:1/179: readlink d4/d15/l22 0 2026-03-09T00:03:41.090 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:40 vm03.local ceph-mon[52346]: pgmap v5: 65 pgs: 65 active+clean; 1.1 GiB data, 4.5 GiB used, 115 GiB / 120 GiB avail 2026-03-09T00:03:41.092 INFO:tasks.workunit.client.0.vm03.stdout:0/154: dwrite d2/da/d1a/f1c [0,4194304] 0 2026-03-09T00:03:41.096 INFO:tasks.workunit.client.0.vm03.stdout:4/160: write d7/d20/f21 [4153679,124215] 0 2026-03-09T00:03:41.122 INFO:tasks.workunit.client.0.vm03.stdout:5/144: dwrite fe [0,4194304] 0 2026-03-09T00:03:41.130 INFO:tasks.workunit.client.0.vm03.stdout:2/138: dwrite d8/d1b/f1f [0,4194304] 0 2026-03-09T00:03:41.131 INFO:tasks.workunit.client.1.vm06.stdout:6/360: creat d4/d27/d42/d52/f6c x:0 0 0 2026-03-09T00:03:41.132 INFO:tasks.workunit.client.1.vm06.stdout:5/480: creat d5/d1c/d68/da2/fa4 x:0 
0 0 2026-03-09T00:03:41.132 INFO:tasks.workunit.client.1.vm06.stdout:5/481: write d5/d44/d4b/f6c [929293,128685] 0 2026-03-09T00:03:41.134 INFO:tasks.workunit.client.0.vm03.stdout:0/155: unlink d2/da/c23 0 2026-03-09T00:03:41.134 INFO:tasks.workunit.client.0.vm03.stdout:0/156: chown d2/da/d1a/f25 11665 1 2026-03-09T00:03:41.135 INFO:tasks.workunit.client.1.vm06.stdout:1/334: rename d6/d4c/l4f to d6/d21/d2d/d37/d6d/l6e 0 2026-03-09T00:03:41.135 INFO:tasks.workunit.client.1.vm06.stdout:1/335: readlink d6/d21/l60 0 2026-03-09T00:03:41.135 INFO:tasks.workunit.client.0.vm03.stdout:4/161: mknod d7/d20/c2b 0 2026-03-09T00:03:41.138 INFO:tasks.workunit.client.1.vm06.stdout:2/454: dwrite d7/da/db/f74 [0,4194304] 0 2026-03-09T00:03:41.140 INFO:tasks.workunit.client.1.vm06.stdout:9/300: dwrite d1/d3/d4f/d52/f5e [0,4194304] 0 2026-03-09T00:03:41.163 INFO:tasks.workunit.client.0.vm03.stdout:6/157: rmdir d13 39 2026-03-09T00:03:41.163 INFO:tasks.workunit.client.0.vm03.stdout:6/158: dread - d13/f2c zero size 2026-03-09T00:03:41.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:40 vm06.local ceph-mon[58395]: pgmap v5: 65 pgs: 65 active+clean; 1.1 GiB data, 4.5 GiB used, 115 GiB / 120 GiB avail 2026-03-09T00:03:41.174 INFO:tasks.workunit.client.1.vm06.stdout:3/390: rmdir d11/d3f 39 2026-03-09T00:03:41.174 INFO:tasks.workunit.client.1.vm06.stdout:3/391: write d11/f5a [500516,28756] 0 2026-03-09T00:03:41.179 INFO:tasks.workunit.client.1.vm06.stdout:3/392: read f9 [1739562,10213] 0 2026-03-09T00:03:41.195 INFO:tasks.workunit.client.1.vm06.stdout:3/393: readlink l1 0 2026-03-09T00:03:41.196 INFO:tasks.workunit.client.0.vm03.stdout:6/159: dread d13/f16 [0,4194304] 0 2026-03-09T00:03:41.196 INFO:tasks.workunit.client.1.vm06.stdout:6/361: unlink d4/d27/f2e 0 2026-03-09T00:03:41.204 INFO:tasks.workunit.client.1.vm06.stdout:1/336: link d6/d21/d2d/c2f d6/d4c/d51/c6f 0 2026-03-09T00:03:41.209 INFO:tasks.workunit.client.1.vm06.stdout:2/455: dwrite d7/da/d1c/f5f [4194304,4194304] 0 2026-03-09T00:03:41.209 INFO:tasks.workunit.client.0.vm03.stdout:2/139: mkdir d8/d1b/d2a/d2e 0 2026-03-09T00:03:41.214 INFO:tasks.workunit.client.1.vm06.stdout:5/482: dwrite d5/f3d [0,4194304] 0 2026-03-09T00:03:41.234 INFO:tasks.workunit.client.1.vm06.stdout:9/301: truncate d1/d3/d12/d49/f56 1263898 0 2026-03-09T00:03:41.237 INFO:tasks.workunit.client.0.vm03.stdout:4/162: creat d7/d27/f2c x:0 0 0 2026-03-09T00:03:41.240 INFO:tasks.workunit.client.1.vm06.stdout:3/394: symlink d11/d28/d4d/l82 0 2026-03-09T00:03:41.241 INFO:tasks.workunit.client.1.vm06.stdout:1/337: symlink d6/d21/d2d/d3b/d42/d43/l70 0 2026-03-09T00:03:41.243 INFO:tasks.workunit.client.0.vm03.stdout:6/160: creat d13/d1e/f33 x:0 0 0 2026-03-09T00:03:41.249 INFO:tasks.workunit.client.1.vm06.stdout:4/292: sync 2026-03-09T00:03:41.249 INFO:tasks.workunit.client.1.vm06.stdout:4/293: fsync d17/d24/f5c 0 2026-03-09T00:03:41.249 INFO:tasks.workunit.client.1.vm06.stdout:4/294: chown d17/d24/f31 360472499 1 2026-03-09T00:03:41.249 INFO:tasks.workunit.client.1.vm06.stdout:7/365: sync 2026-03-09T00:03:41.249 INFO:tasks.workunit.client.1.vm06.stdout:0/373: sync 2026-03-09T00:03:41.249 INFO:tasks.workunit.client.0.vm03.stdout:0/157: getdents d2 0 2026-03-09T00:03:41.249 INFO:tasks.workunit.client.0.vm03.stdout:0/158: dread - d2/f32 zero size 2026-03-09T00:03:41.249 INFO:tasks.workunit.client.0.vm03.stdout:4/163: mknod d7/d20/d29/c2d 0 2026-03-09T00:03:41.251 INFO:tasks.workunit.client.0.vm03.stdout:2/140: dwrite d8/fb [0,4194304] 0 2026-03-09T00:03:41.252 
INFO:tasks.workunit.client.0.vm03.stdout:6/161: creat d13/d1e/f34 x:0 0 0 2026-03-09T00:03:41.256 INFO:tasks.workunit.client.0.vm03.stdout:0/159: write d2/f22 [328839,128917] 0 2026-03-09T00:03:41.258 INFO:tasks.workunit.client.0.vm03.stdout:4/164: symlink d7/d23/d25/l2e 0 2026-03-09T00:03:41.263 INFO:tasks.workunit.client.1.vm06.stdout:5/483: symlink d5/d1c/d21/d28/la5 0 2026-03-09T00:03:41.271 INFO:tasks.workunit.client.0.vm03.stdout:2/141: link d8/d1b/f1f d8/d1b/d24/f2f 0 2026-03-09T00:03:41.271 INFO:tasks.workunit.client.0.vm03.stdout:8/151: rmdir d7/df 39 2026-03-09T00:03:41.283 INFO:tasks.workunit.client.0.vm03.stdout:6/162: mkdir d13/d35 0 2026-03-09T00:03:41.295 INFO:tasks.workunit.client.0.vm03.stdout:0/160: mknod d2/da/c34 0 2026-03-09T00:03:41.298 INFO:tasks.workunit.client.0.vm03.stdout:4/165: creat d7/d20/f2f x:0 0 0 2026-03-09T00:03:41.309 INFO:tasks.workunit.client.1.vm06.stdout:7/366: dwrite d0/df/d17/f38 [0,4194304] 0 2026-03-09T00:03:41.309 INFO:tasks.workunit.client.1.vm06.stdout:7/367: creat d0/df/d1a/d27/f66 x:0 0 0 2026-03-09T00:03:41.315 INFO:tasks.workunit.client.0.vm03.stdout:8/152: dwrite f6 [0,4194304] 0 2026-03-09T00:03:41.315 INFO:tasks.workunit.client.0.vm03.stdout:8/153: chown d7/df/d1e 46 1 2026-03-09T00:03:41.319 INFO:tasks.workunit.client.1.vm06.stdout:3/395: unlink d11/d28/d2e/d2f/d5b/f6c 0 2026-03-09T00:03:41.327 INFO:tasks.workunit.client.1.vm06.stdout:7/368: write d0/df/d1a/f50 [2251858,117671] 0 2026-03-09T00:03:41.327 INFO:tasks.workunit.client.1.vm06.stdout:7/369: stat d0/f4f 0 2026-03-09T00:03:41.328 INFO:tasks.workunit.client.1.vm06.stdout:4/295: creat d17/f61 x:0 0 0 2026-03-09T00:03:41.328 INFO:tasks.workunit.client.0.vm03.stdout:2/142: rename d8/d17/f20 to d8/d1b/f30 0 2026-03-09T00:03:41.328 INFO:tasks.workunit.client.0.vm03.stdout:2/143: chown d8/d1b 252334312 1 2026-03-09T00:03:41.328 INFO:tasks.workunit.client.0.vm03.stdout:2/144: write d8/f15 [972933,43020] 0 2026-03-09T00:03:41.331 INFO:tasks.workunit.client.0.vm03.stdout:6/163: creat d13/d1e/f36 x:0 0 0 2026-03-09T00:03:41.337 INFO:tasks.workunit.client.0.vm03.stdout:6/164: dread f8 [0,4194304] 0 2026-03-09T00:03:41.338 INFO:tasks.workunit.client.1.vm06.stdout:0/374: creat d3/d18/d79/f7f x:0 0 0 2026-03-09T00:03:41.342 INFO:tasks.workunit.client.0.vm03.stdout:4/166: unlink d7/f1e 0 2026-03-09T00:03:41.362 INFO:tasks.workunit.client.1.vm06.stdout:5/484: rmdir d5/d44/d4b/d92/d49/da1 0 2026-03-09T00:03:41.362 INFO:tasks.workunit.client.1.vm06.stdout:5/485: write d5/fe [1534632,39009] 0 2026-03-09T00:03:41.362 INFO:tasks.workunit.client.1.vm06.stdout:5/486: fsync d5/f8e 0 2026-03-09T00:03:41.362 INFO:tasks.workunit.client.1.vm06.stdout:5/487: write d5/d1c/d21/d28/d5e/f69 [853701,120758] 0 2026-03-09T00:03:41.363 INFO:tasks.workunit.client.0.vm03.stdout:9/143: sync 2026-03-09T00:03:41.363 INFO:tasks.workunit.client.0.vm03.stdout:9/144: write f5 [2466309,117286] 0 2026-03-09T00:03:41.364 INFO:tasks.workunit.client.0.vm03.stdout:3/109: sync 2026-03-09T00:03:41.364 INFO:tasks.workunit.client.0.vm03.stdout:6/165: symlink d13/d35/l37 0 2026-03-09T00:03:41.369 INFO:tasks.workunit.client.0.vm03.stdout:0/161: dwrite d2/da/d1a/f25 [0,4194304] 0 2026-03-09T00:03:41.394 INFO:tasks.workunit.client.0.vm03.stdout:4/167: dread f4 [4194304,4194304] 0 2026-03-09T00:03:41.394 INFO:tasks.workunit.client.0.vm03.stdout:4/168: chown d7/d20 25 1 2026-03-09T00:03:41.394 INFO:tasks.workunit.client.0.vm03.stdout:4/169: chown d7/d20/d29/c2d 1670 1 2026-03-09T00:03:41.401 INFO:tasks.workunit.client.0.vm03.stdout:7/167: 
sync 2026-03-09T00:03:41.401 INFO:tasks.workunit.client.0.vm03.stdout:7/168: truncate d2/d1f/f30 679824 0 2026-03-09T00:03:41.401 INFO:tasks.workunit.client.0.vm03.stdout:6/166: symlink d13/l38 0 2026-03-09T00:03:41.401 INFO:tasks.workunit.client.0.vm03.stdout:6/167: truncate d13/f1d 850837 0 2026-03-09T00:03:41.407 INFO:tasks.workunit.client.0.vm03.stdout:7/169: dread d2/f2a [0,4194304] 0 2026-03-09T00:03:41.409 INFO:tasks.workunit.client.1.vm06.stdout:4/296: creat d17/d24/d49/f62 x:0 0 0 2026-03-09T00:03:41.410 INFO:tasks.workunit.client.1.vm06.stdout:8/358: sync 2026-03-09T00:03:41.418 INFO:tasks.workunit.client.0.vm03.stdout:0/162: symlink d2/da/dd/l35 0 2026-03-09T00:03:41.423 INFO:tasks.workunit.client.0.vm03.stdout:5/145: creat d1c/d20/d2a/f34 x:0 0 0 2026-03-09T00:03:41.439 INFO:tasks.workunit.client.0.vm03.stdout:1/180: sync 2026-03-09T00:03:41.439 INFO:tasks.workunit.client.0.vm03.stdout:1/181: readlink d4/l10 0 2026-03-09T00:03:41.439 INFO:tasks.workunit.client.0.vm03.stdout:7/170: mknod d2/d4/d15/d24/c38 0 2026-03-09T00:03:41.439 INFO:tasks.workunit.client.0.vm03.stdout:7/171: fdatasync d2/d4/fb 0 2026-03-09T00:03:41.439 INFO:tasks.workunit.client.0.vm03.stdout:7/172: chown d2/d4/f13 14786 1 2026-03-09T00:03:41.439 INFO:tasks.workunit.client.0.vm03.stdout:0/163: mkdir d2/da/d36 0 2026-03-09T00:03:41.440 INFO:tasks.workunit.client.1.vm06.stdout:5/488: mkdir d5/d1c/d21/d28/d5e/d66/d78/da6 0 2026-03-09T00:03:41.440 INFO:tasks.workunit.client.1.vm06.stdout:5/489: write d5/d1c/d21/d28/f3b [3169442,38061] 0 2026-03-09T00:03:41.440 INFO:tasks.workunit.client.1.vm06.stdout:5/490: symlink d5/d1c/d21/la7 0 2026-03-09T00:03:41.440 INFO:tasks.workunit.client.1.vm06.stdout:5/491: mknod d5/ca8 0 2026-03-09T00:03:41.440 INFO:tasks.workunit.client.0.vm03.stdout:4/170: rmdir d7/d20/d29 39 2026-03-09T00:03:41.452 INFO:tasks.workunit.client.1.vm06.stdout:5/492: unlink d5/d44/d4b/d92/d49/l99 0 2026-03-09T00:03:41.455 INFO:tasks.workunit.client.1.vm06.stdout:9/302: sync 2026-03-09T00:03:41.464 INFO:tasks.workunit.client.1.vm06.stdout:9/303: fdatasync d1/d3/d2b/f33 0 2026-03-09T00:03:41.464 INFO:tasks.workunit.client.1.vm06.stdout:9/304: dread - d1/f2a zero size 2026-03-09T00:03:41.464 INFO:tasks.workunit.client.1.vm06.stdout:9/305: truncate d1/d3/d12/f4d 416282 0 2026-03-09T00:03:41.464 INFO:tasks.workunit.client.1.vm06.stdout:6/362: sync 2026-03-09T00:03:41.464 INFO:tasks.workunit.client.1.vm06.stdout:6/363: creat d4/d16/d53/f6d x:0 0 0 2026-03-09T00:03:41.464 INFO:tasks.workunit.client.1.vm06.stdout:6/364: read - d4/d27/f61 zero size 2026-03-09T00:03:41.464 INFO:tasks.workunit.client.1.vm06.stdout:6/365: write d4/d16/f21 [925637,54899] 0 2026-03-09T00:03:41.464 INFO:tasks.workunit.client.1.vm06.stdout:6/366: fdatasync d4/d27/d3e/f55 0 2026-03-09T00:03:41.464 INFO:tasks.workunit.client.1.vm06.stdout:2/456: sync 2026-03-09T00:03:41.465 INFO:tasks.workunit.client.0.vm03.stdout:1/182: mkdir d4/d3a/d3d 0 2026-03-09T00:03:41.465 INFO:tasks.workunit.client.0.vm03.stdout:5/146: dread ff [0,4194304] 0 2026-03-09T00:03:41.466 INFO:tasks.workunit.client.1.vm06.stdout:3/396: dwrite d11/d28/d2e/d2f/f53 [0,4194304] 0 2026-03-09T00:03:41.469 INFO:tasks.workunit.client.0.vm03.stdout:7/173: getdents d2/d4/d1e 0 2026-03-09T00:03:41.469 INFO:tasks.workunit.client.0.vm03.stdout:7/174: stat d2/d4/f34 0 2026-03-09T00:03:41.477 INFO:tasks.workunit.client.1.vm06.stdout:3/397: write d11/d28/d2e/f38 [4082135,56542] 0 2026-03-09T00:03:41.478 INFO:tasks.workunit.client.1.vm06.stdout:3/398: fdatasync d11/f1a 0 
2026-03-09T00:03:41.478 INFO:tasks.workunit.client.1.vm06.stdout:3/399: fsync d11/d3f/f54 0 2026-03-09T00:03:41.478 INFO:tasks.workunit.client.1.vm06.stdout:3/400: write d11/d28/d2e/d2f/d5b/d5f/f60 [4340319,72603] 0 2026-03-09T00:03:41.484 INFO:tasks.workunit.client.1.vm06.stdout:7/370: dwrite d0/f14 [4194304,4194304] 0 2026-03-09T00:03:41.484 INFO:tasks.workunit.client.1.vm06.stdout:7/371: creat d0/df/d1a/d27/d4c/d40/f67 x:0 0 0 2026-03-09T00:03:41.484 INFO:tasks.workunit.client.1.vm06.stdout:1/338: sync 2026-03-09T00:03:41.485 INFO:tasks.workunit.client.0.vm03.stdout:1/183: truncate d4/f12 246021 0 2026-03-09T00:03:41.487 INFO:tasks.workunit.client.0.vm03.stdout:9/145: dwrite fd [4194304,4194304] 0 2026-03-09T00:03:41.490 INFO:tasks.workunit.client.0.vm03.stdout:7/175: write d2/d1f/f16 [3771898,4286] 0 2026-03-09T00:03:41.505 INFO:tasks.workunit.client.0.vm03.stdout:5/147: symlink d1c/d20/d2a/l35 0 2026-03-09T00:03:41.505 INFO:tasks.workunit.client.0.vm03.stdout:5/148: creat d1c/d20/d2a/f36 x:0 0 0 2026-03-09T00:03:41.511 INFO:tasks.workunit.client.1.vm06.stdout:2/457: mkdir d7/d1b/d85 0 2026-03-09T00:03:41.522 INFO:tasks.workunit.client.1.vm06.stdout:8/359: dwrite db/f1d [0,4194304] 0 2026-03-09T00:03:41.524 INFO:tasks.workunit.client.0.vm03.stdout:6/168: dwrite d13/f24 [0,4194304] 0 2026-03-09T00:03:41.524 INFO:tasks.workunit.client.0.vm03.stdout:6/169: write d13/f1d [1649197,9693] 0 2026-03-09T00:03:41.524 INFO:tasks.workunit.client.0.vm03.stdout:6/170: fsync f2 0 2026-03-09T00:03:41.531 INFO:tasks.workunit.client.0.vm03.stdout:9/146: link c14 d15/d1c/d21/c2d 0 2026-03-09T00:03:41.539 INFO:tasks.workunit.client.0.vm03.stdout:8/154: truncate f6 11902583 0 2026-03-09T00:03:41.547 INFO:tasks.workunit.client.0.vm03.stdout:8/155: write d7/df/d1a/f1c [75949,111412] 0 2026-03-09T00:03:41.556 INFO:tasks.workunit.client.1.vm06.stdout:4/297: dwrite d17/d21/d4c/f57 [0,4194304] 0 2026-03-09T00:03:41.557 INFO:tasks.workunit.client.1.vm06.stdout:4/298: truncate d17/d24/f3a 1568914 0 2026-03-09T00:03:41.561 INFO:tasks.workunit.client.0.vm03.stdout:7/176: mkdir d2/d4/d15/d31/d37/d39 0 2026-03-09T00:03:41.562 INFO:tasks.workunit.client.1.vm06.stdout:3/401: mkdir d11/d28/d2e/d7e/d83 0 2026-03-09T00:03:41.563 INFO:tasks.workunit.client.1.vm06.stdout:3/402: stat cc 0 2026-03-09T00:03:41.563 INFO:tasks.workunit.client.1.vm06.stdout:3/403: write d11/d3f/f71 [628232,13384] 0 2026-03-09T00:03:41.564 INFO:tasks.workunit.client.0.vm03.stdout:5/149: creat d1c/f37 x:0 0 0 2026-03-09T00:03:41.564 INFO:tasks.workunit.client.1.vm06.stdout:9/306: dwrite d1/d3/f1f [0,4194304] 0 2026-03-09T00:03:41.575 INFO:tasks.workunit.client.0.vm03.stdout:0/164: getdents d2/da/dd 0 2026-03-09T00:03:41.582 INFO:tasks.workunit.client.1.vm06.stdout:6/367: getdents d4/d16 0 2026-03-09T00:03:41.584 INFO:tasks.workunit.client.0.vm03.stdout:4/171: dwrite d7/f1d [0,4194304] 0 2026-03-09T00:03:41.584 INFO:tasks.workunit.client.0.vm03.stdout:4/172: write d7/d20/d29/f2a [348149,10008] 0 2026-03-09T00:03:41.585 INFO:tasks.workunit.client.1.vm06.stdout:2/458: dwrite d7/d1b/f46 [0,4194304] 0 2026-03-09T00:03:41.586 INFO:tasks.workunit.client.1.vm06.stdout:2/459: write d7/d1b/f37 [333958,33840] 0 2026-03-09T00:03:41.588 INFO:tasks.workunit.client.0.vm03.stdout:6/171: mknod d13/c39 0 2026-03-09T00:03:41.588 INFO:tasks.workunit.client.0.vm03.stdout:6/172: stat d13/l38 0 2026-03-09T00:03:41.597 INFO:tasks.workunit.client.0.vm03.stdout:5/150: rename c7 to d1c/d20/c38 0 2026-03-09T00:03:41.597 INFO:tasks.workunit.client.0.vm03.stdout:9/147: mknod 
d15/d1c/d28/c2e 0 2026-03-09T00:03:41.597 INFO:tasks.workunit.client.0.vm03.stdout:9/148: creat d15/d1c/d28/f2f x:0 0 0 2026-03-09T00:03:41.600 INFO:tasks.workunit.client.1.vm06.stdout:8/360: truncate db/d1e/f58 3620174 0 2026-03-09T00:03:41.600 INFO:tasks.workunit.client.1.vm06.stdout:8/361: write db/d1e/d46/f5d [407301,51407] 0 2026-03-09T00:03:41.601 INFO:tasks.workunit.client.0.vm03.stdout:1/184: dwrite d4/d6/f33 [0,4194304] 0 2026-03-09T00:03:41.603 INFO:tasks.workunit.client.0.vm03.stdout:0/165: truncate d2/fb 138712 0 2026-03-09T00:03:41.603 INFO:tasks.workunit.client.0.vm03.stdout:0/166: readlink d2/da/d1a/l2f 0 2026-03-09T00:03:41.603 INFO:tasks.workunit.client.0.vm03.stdout:0/167: readlink d2/da/l13 0 2026-03-09T00:03:41.610 INFO:tasks.workunit.client.1.vm06.stdout:4/299: link d17/c51 d17/d24/d3b/d5e/c63 0 2026-03-09T00:03:41.620 INFO:tasks.workunit.client.1.vm06.stdout:3/404: creat d11/d28/d2e/d2f/d5b/f84 x:0 0 0 2026-03-09T00:03:41.620 INFO:tasks.workunit.client.1.vm06.stdout:3/405: chown d11/d28/d2e/f38 1224 1 2026-03-09T00:03:41.621 INFO:tasks.workunit.client.1.vm06.stdout:3/406: write d11/d28/d2e/d2f/d36/f59 [988541,97079] 0 2026-03-09T00:03:41.624 INFO:tasks.workunit.client.1.vm06.stdout:9/307: symlink d1/d3/d2b/d58/l66 0 2026-03-09T00:03:41.624 INFO:tasks.workunit.client.1.vm06.stdout:9/308: dread - d1/d3/d12/d21/d14/d25/f4a zero size 2026-03-09T00:03:41.624 INFO:tasks.workunit.client.1.vm06.stdout:9/309: truncate d1/d4/f44 827956 0 2026-03-09T00:03:41.629 INFO:tasks.workunit.client.0.vm03.stdout:4/173: link d7/d23/d25/l2e d7/d20/l30 0 2026-03-09T00:03:41.636 INFO:tasks.workunit.client.0.vm03.stdout:6/173: creat d13/f3a x:0 0 0 2026-03-09T00:03:41.641 INFO:tasks.workunit.client.1.vm06.stdout:2/460: rmdir d7/d1b/d31/d7b 0 2026-03-09T00:03:41.641 INFO:tasks.workunit.client.1.vm06.stdout:2/461: truncate d7/da/d4e/d57/f7a 623169 0 2026-03-09T00:03:41.643 INFO:tasks.workunit.client.1.vm06.stdout:1/339: rename d6/d21/d2d/d3b/d42/d43 to d6/d4c/d71 0 2026-03-09T00:03:41.655 INFO:tasks.workunit.client.0.vm03.stdout:9/149: mkdir d15/d1c/d28/d30 0 2026-03-09T00:03:41.655 INFO:tasks.workunit.client.1.vm06.stdout:8/362: link db/d1e/f51 db/d53/d70/d38/f72 0 2026-03-09T00:03:41.655 INFO:tasks.workunit.client.1.vm06.stdout:8/363: readlink db/d53/d5c/l6b 0 2026-03-09T00:03:41.655 INFO:tasks.workunit.client.1.vm06.stdout:4/300: creat d17/d5b/f64 x:0 0 0 2026-03-09T00:03:41.655 INFO:tasks.workunit.client.1.vm06.stdout:4/301: write f14 [58515,100978] 0 2026-03-09T00:03:41.661 INFO:tasks.workunit.client.1.vm06.stdout:4/302: dread d17/d21/f4b [0,4194304] 0 2026-03-09T00:03:41.665 INFO:tasks.workunit.client.0.vm03.stdout:0/168: mknod d2/c37 0 2026-03-09T00:03:41.666 INFO:tasks.workunit.client.0.vm03.stdout:0/169: write d2/da/f1b [433010,90760] 0 2026-03-09T00:03:41.671 INFO:tasks.workunit.client.0.vm03.stdout:1/185: dwrite d4/d3a/f28 [0,4194304] 0 2026-03-09T00:03:41.684 INFO:tasks.workunit.client.1.vm06.stdout:4/303: dread f14 [0,4194304] 0 2026-03-09T00:03:41.684 INFO:tasks.workunit.client.1.vm06.stdout:4/304: write d17/d24/f31 [496200,88942] 0 2026-03-09T00:03:41.684 INFO:tasks.workunit.client.1.vm06.stdout:4/305: creat d17/d24/d49/f65 x:0 0 0 2026-03-09T00:03:41.687 INFO:tasks.workunit.client.1.vm06.stdout:3/407: dwrite f7 [0,4194304] 0 2026-03-09T00:03:41.693 INFO:tasks.workunit.client.1.vm06.stdout:3/408: truncate d11/d28/d2e/d2f/f79 924682 0 2026-03-09T00:03:41.699 INFO:tasks.workunit.client.1.vm06.stdout:8/364: symlink db/d53/d6d/l73 0 2026-03-09T00:03:41.700 
INFO:tasks.workunit.client.1.vm06.stdout:9/310: rmdir d1/d4/d2f 39 2026-03-09T00:03:41.701 INFO:tasks.workunit.client.0.vm03.stdout:2/145: sync 2026-03-09T00:03:41.701 INFO:tasks.workunit.client.0.vm03.stdout:3/110: sync 2026-03-09T00:03:41.701 INFO:tasks.workunit.client.0.vm03.stdout:3/111: write d2/db/f1a [883638,82867] 0 2026-03-09T00:03:41.701 INFO:tasks.workunit.client.0.vm03.stdout:3/112: write d2/f1d [791150,3145] 0 2026-03-09T00:03:41.701 INFO:tasks.workunit.client.0.vm03.stdout:3/113: chown d2/db/f1a 1762 1 2026-03-09T00:03:41.701 INFO:tasks.workunit.client.0.vm03.stdout:4/174: rename d7/d20/f2f to d7/d27/f31 0 2026-03-09T00:03:41.701 INFO:tasks.workunit.client.0.vm03.stdout:4/175: write d7/f28 [470231,100977] 0 2026-03-09T00:03:41.702 INFO:tasks.workunit.client.1.vm06.stdout:5/493: dwrite d5/d44/d4b/f6d [0,4194304] 0 2026-03-09T00:03:41.705 INFO:tasks.workunit.client.0.vm03.stdout:0/170: creat d2/da/dd/f38 x:0 0 0 2026-03-09T00:03:41.706 INFO:tasks.workunit.client.1.vm06.stdout:9/311: write d1/d3/f5c [2551624,83720] 0 2026-03-09T00:03:41.706 INFO:tasks.workunit.client.1.vm06.stdout:9/312: write d1/d3/d2b/d58/f5f [526828,68684] 0 2026-03-09T00:03:41.717 INFO:tasks.workunit.client.0.vm03.stdout:6/174: rmdir d13/d1e 39 2026-03-09T00:03:41.717 INFO:tasks.workunit.client.0.vm03.stdout:2/146: link d8/d17/f1d d8/d1b/f31 0 2026-03-09T00:03:41.720 INFO:tasks.workunit.client.0.vm03.stdout:1/186: dread d4/d15/d1a/f1b [0,4194304] 0 2026-03-09T00:03:41.722 INFO:tasks.workunit.client.0.vm03.stdout:1/187: write d4/f9 [553489,118713] 0 2026-03-09T00:03:41.722 INFO:tasks.workunit.client.0.vm03.stdout:1/188: fdatasync d4/f39 0 2026-03-09T00:03:41.729 INFO:tasks.workunit.client.1.vm06.stdout:1/340: dwrite d6/d21/f2e [0,4194304] 0 2026-03-09T00:03:41.730 INFO:tasks.workunit.client.0.vm03.stdout:6/175: unlink d13/l32 0 2026-03-09T00:03:41.730 INFO:tasks.workunit.client.0.vm03.stdout:4/176: mknod d7/d27/c32 0 2026-03-09T00:03:41.737 INFO:tasks.workunit.client.0.vm03.stdout:2/147: unlink d8/f12 0 2026-03-09T00:03:41.749 INFO:tasks.workunit.client.0.vm03.stdout:3/114: link d2/db/l1f d2/db/l22 0 2026-03-09T00:03:41.749 INFO:tasks.workunit.client.0.vm03.stdout:1/189: link d4/d15/l22 d4/d15/d1a/l3e 0 2026-03-09T00:03:41.752 INFO:tasks.workunit.client.0.vm03.stdout:4/177: truncate f4 6307954 0 2026-03-09T00:03:41.758 INFO:tasks.workunit.client.0.vm03.stdout:7/177: dwrite d2/f3 [0,4194304] 0 2026-03-09T00:03:41.758 INFO:tasks.workunit.client.0.vm03.stdout:7/178: read - d2/d1f/f28 zero size 2026-03-09T00:03:41.758 INFO:tasks.workunit.client.0.vm03.stdout:7/179: stat d2/d4 0 2026-03-09T00:03:41.758 INFO:tasks.workunit.client.0.vm03.stdout:7/180: chown d2/d4/d15/d24/d32 15 1 2026-03-09T00:03:41.758 INFO:tasks.workunit.client.0.vm03.stdout:7/181: fdatasync d2/d4/d15/f1a 0 2026-03-09T00:03:41.758 INFO:tasks.workunit.client.0.vm03.stdout:7/182: write d2/d4/f22 [1282116,70564] 0 2026-03-09T00:03:41.758 INFO:tasks.workunit.client.0.vm03.stdout:3/115: symlink d2/l23 0 2026-03-09T00:03:41.758 INFO:tasks.workunit.client.0.vm03.stdout:3/116: dread - d2/db/f21 zero size 2026-03-09T00:03:41.758 INFO:tasks.workunit.client.0.vm03.stdout:7/183: dread d2/d1f/f30 [0,4194304] 0 2026-03-09T00:03:41.758 INFO:tasks.workunit.client.0.vm03.stdout:7/184: stat d2/d1f/d35 0 2026-03-09T00:03:41.758 INFO:tasks.workunit.client.0.vm03.stdout:1/190: rmdir d4/d6 39 2026-03-09T00:03:41.759 INFO:tasks.workunit.client.0.vm03.stdout:4/178: rename d7/fd to d7/d20/f33 0 2026-03-09T00:03:41.759 INFO:tasks.workunit.client.1.vm06.stdout:5/494: rmdir 
d5/d44/d4b/d92/d49 39 2026-03-09T00:03:41.759 INFO:tasks.workunit.client.1.vm06.stdout:5/495: readlink d5/d1c/d21/la7 0 2026-03-09T00:03:41.759 INFO:tasks.workunit.client.1.vm06.stdout:5/496: creat d5/d44/d4b/fa9 x:0 0 0 2026-03-09T00:03:41.762 INFO:tasks.workunit.client.1.vm06.stdout:9/313: symlink d1/d3/d12/d49/l67 0 2026-03-09T00:03:41.762 INFO:tasks.workunit.client.1.vm06.stdout:9/314: creat d1/d3/d12/f68 x:0 0 0 2026-03-09T00:03:41.762 INFO:tasks.workunit.client.1.vm06.stdout:9/315: readlink d1/d3/d12/d49/l67 0 2026-03-09T00:03:41.762 INFO:tasks.workunit.client.1.vm06.stdout:9/316: chown d1/d3/d12 127 1 2026-03-09T00:03:41.762 INFO:tasks.workunit.client.1.vm06.stdout:4/306: rmdir d17/d21 39 2026-03-09T00:03:41.762 INFO:tasks.workunit.client.1.vm06.stdout:8/365: mkdir db/d74 0 2026-03-09T00:03:41.774 INFO:tasks.workunit.client.0.vm03.stdout:7/185: rename d2/d4/d15 to d2/d1f/d3a 0 2026-03-09T00:03:41.778 INFO:tasks.workunit.client.1.vm06.stdout:1/341: rename d6/d21/d2d/d37/c64 to d6/d4c/d71/c72 0 2026-03-09T00:03:41.778 INFO:tasks.workunit.client.0.vm03.stdout:3/117: rename d2/db/f20 to d2/db/f24 0 2026-03-09T00:03:41.779 INFO:tasks.workunit.client.0.vm03.stdout:3/118: chown d2/db/l1f 69701 1 2026-03-09T00:03:41.779 INFO:tasks.workunit.client.0.vm03.stdout:7/186: rename d2/d1f/f16 to d2/d1f/f3b 0 2026-03-09T00:03:41.779 INFO:tasks.workunit.client.0.vm03.stdout:7/187: fdatasync d2/d4/f2e 0 2026-03-09T00:03:41.779 INFO:tasks.workunit.client.0.vm03.stdout:7/188: creat d2/f3c x:0 0 0 2026-03-09T00:03:41.779 INFO:tasks.workunit.client.1.vm06.stdout:9/317: rmdir d1/d3/d12/d48 0 2026-03-09T00:03:41.782 INFO:tasks.workunit.client.1.vm06.stdout:1/342: dread d6/f1b [0,4194304] 0 2026-03-09T00:03:41.783 INFO:tasks.workunit.client.1.vm06.stdout:8/366: rename db/dd/f2c to db/d53/d70/f75 0 2026-03-09T00:03:41.783 INFO:tasks.workunit.client.0.vm03.stdout:7/189: creat d2/d1f/d3a/d31/d37/d39/f3d x:0 0 0 2026-03-09T00:03:41.784 INFO:tasks.workunit.client.1.vm06.stdout:1/343: truncate f0 3584509 0 2026-03-09T00:03:41.787 INFO:tasks.workunit.client.0.vm03.stdout:7/190: dread d2/d1f/d3a/f1a [0,4194304] 0 2026-03-09T00:03:41.793 INFO:tasks.workunit.client.0.vm03.stdout:7/191: creat d2/d1f/d35/f3e x:0 0 0 2026-03-09T00:03:41.793 INFO:tasks.workunit.client.0.vm03.stdout:7/192: dread - d2/d1f/d3a/d31/d37/d39/f3d zero size 2026-03-09T00:03:41.794 INFO:tasks.workunit.client.0.vm03.stdout:7/193: fsync d2/d1f/f11 0 2026-03-09T00:03:41.794 INFO:tasks.workunit.client.0.vm03.stdout:7/194: fdatasync d2/f2a 0 2026-03-09T00:03:41.794 INFO:tasks.workunit.client.0.vm03.stdout:7/195: unlink d2/d1f/d3a/d31/d37/d39/f3d 0 2026-03-09T00:03:41.794 INFO:tasks.workunit.client.0.vm03.stdout:7/196: write d2/d1f/f30 [1457306,81151] 0 2026-03-09T00:03:41.794 INFO:tasks.workunit.client.0.vm03.stdout:7/197: fsync d2/fc 0 2026-03-09T00:03:41.796 INFO:tasks.workunit.client.0.vm03.stdout:7/198: truncate d2/d4/f13 1222779 0 2026-03-09T00:03:41.816 INFO:tasks.workunit.client.0.vm03.stdout:5/151: write d1c/f32 [1012322,23418] 0 2026-03-09T00:03:41.844 INFO:tasks.workunit.client.0.vm03.stdout:6/176: dwrite d13/f2f [0,4194304] 0 2026-03-09T00:03:41.845 INFO:tasks.workunit.client.0.vm03.stdout:6/177: symlink d13/d35/l3b 0 2026-03-09T00:03:41.845 INFO:tasks.workunit.client.0.vm03.stdout:6/178: stat f2 0 2026-03-09T00:03:41.846 INFO:tasks.workunit.client.0.vm03.stdout:6/179: symlink d13/d35/l3c 0 2026-03-09T00:03:41.846 INFO:tasks.workunit.client.0.vm03.stdout:6/180: mknod d13/d35/c3d 0 2026-03-09T00:03:41.847 
INFO:tasks.workunit.client.0.vm03.stdout:6/181: creat d13/d1e/f3e x:0 0 0 2026-03-09T00:03:41.915 INFO:tasks.workunit.client.0.vm03.stdout:1/191: dwrite d4/d3a/f26 [0,4194304] 0 2026-03-09T00:03:41.915 INFO:tasks.workunit.client.0.vm03.stdout:1/192: chown d4/d3a/f26 183165406 1 2026-03-09T00:03:41.915 INFO:tasks.workunit.client.0.vm03.stdout:1/193: chown d4/d6/c30 158256 1 2026-03-09T00:03:41.915 INFO:tasks.workunit.client.0.vm03.stdout:1/194: write d4/d15/d1a/f2b [350164,3593] 0 2026-03-09T00:03:41.916 INFO:tasks.workunit.client.0.vm03.stdout:2/148: dwrite f2 [0,4194304] 0 2026-03-09T00:03:41.917 INFO:tasks.workunit.client.1.vm06.stdout:2/462: dwrite d7/d1a/f30 [4194304,4194304] 0 2026-03-09T00:03:41.920 INFO:tasks.workunit.client.0.vm03.stdout:2/149: creat d8/d1b/f32 x:0 0 0 2026-03-09T00:03:41.923 INFO:tasks.workunit.client.1.vm06.stdout:0/375: dwrite d3/f11 [0,4194304] 0 2026-03-09T00:03:41.927 INFO:tasks.workunit.client.0.vm03.stdout:1/195: dread d4/d6/f2a [0,4194304] 0 2026-03-09T00:03:41.927 INFO:tasks.workunit.client.0.vm03.stdout:1/196: creat d4/d15/f3f x:0 0 0 2026-03-09T00:03:41.927 INFO:tasks.workunit.client.0.vm03.stdout:1/197: truncate d4/f12 777193 0 2026-03-09T00:03:41.943 INFO:tasks.workunit.client.1.vm06.stdout:0/376: rmdir d3/d18/d2c/d2d/d31 39 2026-03-09T00:03:41.943 INFO:tasks.workunit.client.1.vm06.stdout:5/497: dwrite d5/d44/d4b/d92/f40 [0,4194304] 0 2026-03-09T00:03:41.945 INFO:tasks.workunit.client.1.vm06.stdout:0/377: rmdir d3/d18/d1f/d39 39 2026-03-09T00:03:41.945 INFO:tasks.workunit.client.1.vm06.stdout:5/498: truncate d5/d1c/d23/f42 1177324 0 2026-03-09T00:03:41.962 INFO:tasks.workunit.client.1.vm06.stdout:5/499: mkdir d5/d1c/d23/d51/daa 0 2026-03-09T00:03:41.963 INFO:tasks.workunit.client.1.vm06.stdout:5/500: mkdir d5/d1c/d21/d28/d5e/d66/dab 0 2026-03-09T00:03:41.965 INFO:tasks.workunit.client.1.vm06.stdout:5/501: mknod d5/d1c/d23/d34/d47/cac 0 2026-03-09T00:03:41.967 INFO:tasks.workunit.client.1.vm06.stdout:5/502: write d5/d1c/d21/d28/f57 [282162,10182] 0 2026-03-09T00:03:41.968 INFO:tasks.workunit.client.1.vm06.stdout:1/344: dwrite d6/d21/f3d [4194304,4194304] 0 2026-03-09T00:03:41.972 INFO:tasks.workunit.client.0.vm03.stdout:4/179: write f4 [3755207,34594] 0 2026-03-09T00:03:41.972 INFO:tasks.workunit.client.1.vm06.stdout:6/368: dwrite d4/f2d [0,4194304] 0 2026-03-09T00:03:41.975 INFO:tasks.workunit.client.1.vm06.stdout:8/367: dwrite db/d53/d70/d38/d4d/f65 [0,4194304] 0 2026-03-09T00:03:41.989 INFO:tasks.workunit.client.1.vm06.stdout:6/369: creat d4/f6e x:0 0 0 2026-03-09T00:03:41.989 INFO:tasks.workunit.client.1.vm06.stdout:6/370: unlink d4/d16/d53/f6d 0 2026-03-09T00:03:41.989 INFO:tasks.workunit.client.0.vm03.stdout:5/152: dwrite fb [0,4194304] 0 2026-03-09T00:03:41.989 INFO:tasks.workunit.client.0.vm03.stdout:5/153: chown f12 243095410 1 2026-03-09T00:03:41.989 INFO:tasks.workunit.client.0.vm03.stdout:5/154: creat d1c/d20/f39 x:0 0 0 2026-03-09T00:03:41.989 INFO:tasks.workunit.client.0.vm03.stdout:6/182: dwrite d13/f1c [0,4194304] 0 2026-03-09T00:03:41.992 INFO:tasks.workunit.client.1.vm06.stdout:8/368: write db/d1e/d46/f69 [873879,30400] 0 2026-03-09T00:03:41.992 INFO:tasks.workunit.client.1.vm06.stdout:6/371: mknod d4/d27/c6f 0 2026-03-09T00:03:41.993 INFO:tasks.workunit.client.0.vm03.stdout:3/119: getdents d2 0 2026-03-09T00:03:41.994 INFO:tasks.workunit.client.1.vm06.stdout:6/372: dread d4/d16/f21 [0,4194304] 0 2026-03-09T00:03:41.997 INFO:tasks.workunit.client.0.vm03.stdout:5/155: creat d1c/f3a x:0 0 0 2026-03-09T00:03:41.999 
INFO:tasks.workunit.client.0.vm03.stdout:6/183: truncate d13/f31 1098992 0 2026-03-09T00:03:42.004 INFO:tasks.workunit.client.1.vm06.stdout:8/369: rename f5 to db/d53/f76 0 2026-03-09T00:03:42.004 INFO:tasks.workunit.client.0.vm03.stdout:5/156: mkdir d1c/d20/d2a/d3b 0 2026-03-09T00:03:42.004 INFO:tasks.workunit.client.0.vm03.stdout:5/157: creat d1c/d20/d2a/d3b/f3c x:0 0 0 2026-03-09T00:03:42.018 INFO:tasks.workunit.client.1.vm06.stdout:6/373: write d4/ff [2517823,93459] 0 2026-03-09T00:03:42.018 INFO:tasks.workunit.client.1.vm06.stdout:6/374: creat d4/d27/f70 x:0 0 0 2026-03-09T00:03:42.018 INFO:tasks.workunit.client.1.vm06.stdout:6/375: truncate d4/f22 4682274 0 2026-03-09T00:03:42.031 INFO:tasks.workunit.client.0.vm03.stdout:6/184: rename d13/d1e/f33 to d13/d1e/f3f 0 2026-03-09T00:03:42.039 INFO:tasks.workunit.client.0.vm03.stdout:5/158: rename f1a to d1c/d20/d2a/f3d 0 2026-03-09T00:03:42.039 INFO:tasks.workunit.client.0.vm03.stdout:1/198: write d4/f9 [579022,33799] 0 2026-03-09T00:03:42.039 INFO:tasks.workunit.client.0.vm03.stdout:1/199: dread - d4/f39 zero size 2026-03-09T00:03:42.048 INFO:tasks.workunit.client.0.vm03.stdout:5/159: dread fe [0,4194304] 0 2026-03-09T00:03:42.050 INFO:tasks.workunit.client.0.vm03.stdout:9/150: sync 2026-03-09T00:03:42.050 INFO:tasks.workunit.client.0.vm03.stdout:9/151: fdatasync d15/f1f 0 2026-03-09T00:03:42.050 INFO:tasks.workunit.client.0.vm03.stdout:8/156: sync 2026-03-09T00:03:42.054 INFO:tasks.workunit.client.1.vm06.stdout:0/378: dwrite d3/d18/d28/d45/f48 [0,4194304] 0 2026-03-09T00:03:42.058 INFO:tasks.workunit.client.0.vm03.stdout:9/152: write f8 [1044339,39962] 0 2026-03-09T00:03:42.087 INFO:tasks.workunit.client.1.vm06.stdout:6/376: mknod d4/d27/d3e/d57/c71 0 2026-03-09T00:03:42.088 INFO:tasks.workunit.client.0.vm03.stdout:1/200: mknod d4/d3a/d32/c40 0 2026-03-09T00:03:42.088 INFO:tasks.workunit.client.0.vm03.stdout:1/201: chown f2 139715 1 2026-03-09T00:03:42.088 INFO:tasks.workunit.client.0.vm03.stdout:1/202: chown d4/d15/f3c 64 1 2026-03-09T00:03:42.089 INFO:tasks.workunit.client.1.vm06.stdout:6/377: dread d4/d16/f5e [0,4194304] 0 2026-03-09T00:03:42.095 INFO:tasks.workunit.client.0.vm03.stdout:8/157: creat d7/df/d1a/f2e x:0 0 0 2026-03-09T00:03:42.105 INFO:tasks.workunit.client.1.vm06.stdout:2/463: dwrite d7/da/d4e/d57/f7a [0,4194304] 0 2026-03-09T00:03:42.105 INFO:tasks.workunit.client.1.vm06.stdout:2/464: dread - d7/d1a/d3c/f4d zero size 2026-03-09T00:03:42.107 INFO:tasks.workunit.client.1.vm06.stdout:6/378: truncate d4/d27/d3e/f55 3785091 0 2026-03-09T00:03:42.110 INFO:tasks.workunit.client.0.vm03.stdout:4/180: dread f4 [0,4194304] 0 2026-03-09T00:03:42.113 INFO:tasks.workunit.client.0.vm03.stdout:9/153: truncate fb 1576212 0 2026-03-09T00:03:42.115 INFO:tasks.workunit.client.1.vm06.stdout:8/370: dwrite db/dd/f13 [0,4194304] 0 2026-03-09T00:03:42.116 INFO:tasks.workunit.client.0.vm03.stdout:7/199: dwrite d2/d1f/d3a/f29 [0,4194304] 0 2026-03-09T00:03:42.121 INFO:tasks.workunit.client.0.vm03.stdout:7/200: dread d2/d1f/f11 [0,4194304] 0 2026-03-09T00:03:42.128 INFO:tasks.workunit.client.1.vm06.stdout:4/307: dwrite d17/d24/f31 [0,4194304] 0 2026-03-09T00:03:42.138 INFO:tasks.workunit.client.0.vm03.stdout:6/185: mknod d13/d1e/c40 0 2026-03-09T00:03:42.138 INFO:tasks.workunit.client.0.vm03.stdout:1/203: unlink d4/c5 0 2026-03-09T00:03:42.138 INFO:tasks.workunit.client.0.vm03.stdout:1/204: chown d4/d15/l37 14226625 1 2026-03-09T00:03:42.138 INFO:tasks.workunit.client.1.vm06.stdout:2/465: mkdir d7/d1b/d5a/d86 0 2026-03-09T00:03:42.141 
INFO:tasks.workunit.client.1.vm06.stdout:0/379: dwrite d3/d18/d2c/d2d/f46 [0,4194304] 0 2026-03-09T00:03:42.144 INFO:tasks.workunit.client.1.vm06.stdout:6/379: mknod d4/d66/c72 0 2026-03-09T00:03:42.144 INFO:tasks.workunit.client.1.vm06.stdout:8/371: symlink db/d53/d70/l77 0 2026-03-09T00:03:42.144 INFO:tasks.workunit.client.1.vm06.stdout:8/372: fdatasync db/f31 0 2026-03-09T00:03:42.146 INFO:tasks.workunit.client.1.vm06.stdout:7/372: sync 2026-03-09T00:03:42.150 INFO:tasks.workunit.client.1.vm06.stdout:8/373: write db/f28 [4732835,41058] 0 2026-03-09T00:03:42.150 INFO:tasks.workunit.client.0.vm03.stdout:5/160: rmdir d1c/d20 39 2026-03-09T00:03:42.150 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:41 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:42.150 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:41 vm06.local ceph-mon[58395]: mgrmap e24: vm06.rzcvhn(active, since 5s) 2026-03-09T00:03:42.150 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:41 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:42.150 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:41 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:42.150 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:41 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:42.150 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:41 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:42.152 INFO:tasks.workunit.client.0.vm03.stdout:8/158: symlink d7/l2f 0 2026-03-09T00:03:42.153 INFO:tasks.workunit.client.1.vm06.stdout:2/466: mkdir d7/d1a/d25/d66/d87 0 2026-03-09T00:03:42.156 INFO:tasks.workunit.client.1.vm06.stdout:7/373: dread d0/fe [0,4194304] 0 2026-03-09T00:03:42.156 INFO:tasks.workunit.client.1.vm06.stdout:7/374: creat d0/d39/f68 x:0 0 0 2026-03-09T00:03:42.159 INFO:tasks.workunit.client.1.vm06.stdout:7/375: dread d0/f7 [0,4194304] 0 2026-03-09T00:03:42.159 INFO:tasks.workunit.client.1.vm06.stdout:7/376: truncate d0/f4f 402356 0 2026-03-09T00:03:42.164 INFO:tasks.workunit.client.1.vm06.stdout:0/380: mknod d3/d18/d2c/d2d/d31/c80 0 2026-03-09T00:03:42.164 INFO:tasks.workunit.client.1.vm06.stdout:0/381: chown d3/d18/d1f/d39/d49/f72 487433506 1 2026-03-09T00:03:42.164 INFO:tasks.workunit.client.1.vm06.stdout:0/382: fsync d3/d18/d1f/d39/f6e 0 2026-03-09T00:03:42.164 INFO:tasks.workunit.client.1.vm06.stdout:0/383: truncate d3/d18/d2c/d2d/d31/f4f 819157 0 2026-03-09T00:03:42.164 INFO:tasks.workunit.client.1.vm06.stdout:0/384: fdatasync d3/d18/f25 0 2026-03-09T00:03:42.165 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:41 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:42.165 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:41 vm03.local ceph-mon[52346]: mgrmap e24: vm06.rzcvhn(active, since 5s) 2026-03-09T00:03:42.165 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:41 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:42.165 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:41 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:42.165 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:41 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:42.165 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:41 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:42.173 INFO:tasks.workunit.client.0.vm03.stdout:6/186: link d13/d1e/c20 d13/c41 0 2026-03-09T00:03:42.174 INFO:tasks.workunit.client.0.vm03.stdout:6/187: chown d13/f16 3373 1 2026-03-09T00:03:42.174 INFO:tasks.workunit.client.0.vm03.stdout:5/161: write d1c/f30 [1359669,43258] 0 2026-03-09T00:03:42.174 INFO:tasks.workunit.client.0.vm03.stdout:4/181: dwrite d7/f1f [0,4194304] 0 2026-03-09T00:03:42.174 INFO:tasks.workunit.client.0.vm03.stdout:4/182: chown d7/c1b 55357 1 2026-03-09T00:03:42.182 INFO:tasks.workunit.client.1.vm06.stdout:6/380: symlink d4/d27/d42/d4b/l73 0 2026-03-09T00:03:42.182 INFO:tasks.workunit.client.1.vm06.stdout:6/381: creat d4/d27/f74 x:0 0 0 2026-03-09T00:03:42.192 INFO:tasks.workunit.client.1.vm06.stdout:0/385: write d3/d18/d1f/d44/f5a [3909161,32376] 0 2026-03-09T00:03:42.195 INFO:tasks.workunit.client.1.vm06.stdout:2/467: truncate d7/da/db/f6e 2758183 0 2026-03-09T00:03:42.201 INFO:tasks.workunit.client.0.vm03.stdout:1/205: dread d4/d6/f2a [0,4194304] 0 2026-03-09T00:03:42.210 INFO:tasks.workunit.client.1.vm06.stdout:1/345: sync 2026-03-09T00:03:42.210 INFO:tasks.workunit.client.1.vm06.stdout:3/409: sync 2026-03-09T00:03:42.210 INFO:tasks.workunit.client.1.vm06.stdout:3/410: stat d11 0 2026-03-09T00:03:42.210 INFO:tasks.workunit.client.0.vm03.stdout:8/159: write f6 [11783870,78444] 0 2026-03-09T00:03:42.210 INFO:tasks.workunit.client.0.vm03.stdout:8/160: write d7/f9 [3456839,58261] 0 2026-03-09T00:03:42.212 INFO:tasks.workunit.client.0.vm03.stdout:5/162: rename d1c/f32 to d1c/d20/f3e 0 2026-03-09T00:03:42.215 INFO:tasks.workunit.client.1.vm06.stdout:0/386: creat d3/d18/d28/f81 x:0 0 0 2026-03-09T00:03:42.215 INFO:tasks.workunit.client.1.vm06.stdout:2/468: symlink d7/da/d4e/d57/l88 0 2026-03-09T00:03:42.215 INFO:tasks.workunit.client.1.vm06.stdout:2/469: fsync d7/d1a/f30 0 2026-03-09T00:03:42.215 INFO:tasks.workunit.client.1.vm06.stdout:2/470: truncate d7/f4c 441655 0 2026-03-09T00:03:42.218 INFO:tasks.workunit.client.0.vm03.stdout:1/206: getdents d4/d15/d1a 0 2026-03-09T00:03:42.219 INFO:tasks.workunit.client.0.vm03.stdout:8/161: write d7/fd [810733,85590] 0 2026-03-09T00:03:42.230 INFO:tasks.workunit.client.1.vm06.stdout:3/411: symlink d11/d28/d2e/d2f/l85 0 2026-03-09T00:03:42.247 INFO:tasks.workunit.client.0.vm03.stdout:8/162: rmdir d7/df/d1e/d26 0 2026-03-09T00:03:42.252 INFO:tasks.workunit.client.0.vm03.stdout:8/163: write d7/df/d1a/f2a [2281101,36174] 0 2026-03-09T00:03:42.258 INFO:tasks.workunit.client.1.vm06.stdout:6/382: dread d4/f5 [4194304,4194304] 0 2026-03-09T00:03:42.258 INFO:tasks.workunit.client.1.vm06.stdout:6/383: creat d4/d27/d42/f75 x:0 0 0 2026-03-09T00:03:42.258 INFO:tasks.workunit.client.1.vm06.stdout:4/308: dwrite d17/d21/d4c/d50/f60 [0,4194304] 0 2026-03-09T00:03:42.258 INFO:tasks.workunit.client.1.vm06.stdout:6/384: chown d4/fc 31534 1 2026-03-09T00:03:42.258 INFO:tasks.workunit.client.1.vm06.stdout:6/385: chown d4/ff 90014 1 2026-03-09T00:03:42.258 INFO:tasks.workunit.client.1.vm06.stdout:4/309: dread - d17/d21/f5d zero size 2026-03-09T00:03:42.258 INFO:tasks.workunit.client.1.vm06.stdout:4/310: chown d17/d24/d3b 62749566 1 2026-03-09T00:03:42.263 
INFO:tasks.workunit.client.0.vm03.stdout:3/120: dwrite d2/db/f14 [0,4194304] 0 2026-03-09T00:03:42.270 INFO:tasks.workunit.client.1.vm06.stdout:9/318: sync 2026-03-09T00:03:42.271 INFO:tasks.workunit.client.1.vm06.stdout:5/503: sync 2026-03-09T00:03:42.271 INFO:tasks.workunit.client.1.vm06.stdout:9/319: chown d1/d3/d2b/d58 1 1 2026-03-09T00:03:42.271 INFO:tasks.workunit.client.1.vm06.stdout:9/320: write d1/f2a [85442,111092] 0 2026-03-09T00:03:42.271 INFO:tasks.workunit.client.0.vm03.stdout:6/188: dwrite d13/f2c [0,4194304] 0 2026-03-09T00:03:42.276 INFO:tasks.workunit.client.1.vm06.stdout:9/321: read d1/d4/f39 [146996,51053] 0 2026-03-09T00:03:42.283 INFO:tasks.workunit.client.0.vm03.stdout:7/201: dwrite d2/d1f/d35/f3e [0,4194304] 0 2026-03-09T00:03:42.291 INFO:tasks.workunit.client.1.vm06.stdout:8/374: dwrite db/d53/d70/f54 [0,4194304] 0 2026-03-09T00:03:42.291 INFO:tasks.workunit.client.1.vm06.stdout:8/375: chown db/d53/d70/f71 226041 1 2026-03-09T00:03:42.291 INFO:tasks.workunit.client.1.vm06.stdout:8/376: stat db/d1e/l3b 0 2026-03-09T00:03:42.291 INFO:tasks.workunit.client.1.vm06.stdout:8/377: truncate db/d1e/d46/f5d 762758 0 2026-03-09T00:03:42.291 INFO:tasks.workunit.client.0.vm03.stdout:5/163: unlink d1c/f2e 0 2026-03-09T00:03:42.292 INFO:tasks.workunit.client.0.vm03.stdout:5/164: chown d1c/d20/d2a 930860 1 2026-03-09T00:03:42.293 INFO:tasks.workunit.client.0.vm03.stdout:6/189: dread f2 [4194304,4194304] 0 2026-03-09T00:03:42.299 INFO:tasks.workunit.client.1.vm06.stdout:3/412: mknod d11/d28/d4d/c86 0 2026-03-09T00:03:42.304 INFO:tasks.workunit.client.1.vm06.stdout:3/413: read d11/f12 [214557,56867] 0 2026-03-09T00:03:42.305 INFO:tasks.workunit.client.0.vm03.stdout:1/207: read d4/d6/f33 [1744600,48423] 0 2026-03-09T00:03:42.308 INFO:tasks.workunit.client.0.vm03.stdout:4/183: dwrite d7/f15 [0,4194304] 0 2026-03-09T00:03:42.308 INFO:tasks.workunit.client.1.vm06.stdout:0/387: dread d3/f51 [0,4194304] 0 2026-03-09T00:03:42.310 INFO:tasks.workunit.client.0.vm03.stdout:8/164: creat d7/df/f30 x:0 0 0 2026-03-09T00:03:42.310 INFO:tasks.workunit.client.0.vm03.stdout:8/165: truncate d7/df/f2c 142281 0 2026-03-09T00:03:42.310 INFO:tasks.workunit.client.0.vm03.stdout:4/184: write d7/fe [4380738,51831] 0 2026-03-09T00:03:42.313 INFO:tasks.workunit.client.1.vm06.stdout:0/388: dread d3/d18/d1f/d39/d3b/f55 [0,4194304] 0 2026-03-09T00:03:42.314 INFO:tasks.workunit.client.0.vm03.stdout:3/121: creat d2/db/f25 x:0 0 0 2026-03-09T00:03:42.314 INFO:tasks.workunit.client.1.vm06.stdout:9/322: fdatasync d1/f2a 0 2026-03-09T00:03:42.319 INFO:tasks.workunit.client.1.vm06.stdout:4/311: mkdir d17/d21/d4c/d66 0 2026-03-09T00:03:42.328 INFO:tasks.workunit.client.1.vm06.stdout:5/504: mknod d5/d1c/d21/d28/d5e/cad 0 2026-03-09T00:03:42.329 INFO:tasks.workunit.client.1.vm06.stdout:2/471: rmdir d7/da/d55 39 2026-03-09T00:03:42.329 INFO:tasks.workunit.client.1.vm06.stdout:2/472: chown d7/d1a/d25/d66/d87 2800253 1 2026-03-09T00:03:42.336 INFO:tasks.workunit.client.1.vm06.stdout:8/378: mkdir db/d74/d78 0 2026-03-09T00:03:42.338 INFO:tasks.workunit.client.1.vm06.stdout:3/414: truncate d11/d28/d2e/d2f/d5b/f7d 231715 0 2026-03-09T00:03:42.349 INFO:tasks.workunit.client.1.vm06.stdout:3/415: read d11/d28/d2e/d2f/d36/f4a [1629448,9886] 0 2026-03-09T00:03:42.349 INFO:tasks.workunit.client.1.vm06.stdout:3/416: readlink d11/d28/l43 0 2026-03-09T00:03:42.349 INFO:tasks.workunit.client.1.vm06.stdout:3/417: chown d11/d28/d57 6365 1 2026-03-09T00:03:42.350 INFO:tasks.workunit.client.0.vm03.stdout:5/165: creat d1c/d20/d2a/d3b/f3f x:0 
0 0 2026-03-09T00:03:42.350 INFO:tasks.workunit.client.0.vm03.stdout:5/166: dread - d1c/d20/d2a/f34 zero size 2026-03-09T00:03:42.350 INFO:tasks.workunit.client.0.vm03.stdout:1/208: creat d4/d3a/f41 x:0 0 0 2026-03-09T00:03:42.358 INFO:tasks.workunit.client.1.vm06.stdout:9/323: creat d1/d3/d12/d49/f69 x:0 0 0 2026-03-09T00:03:42.364 INFO:tasks.workunit.client.1.vm06.stdout:2/473: mkdir d7/d1a/d89 0 2026-03-09T00:03:42.364 INFO:tasks.workunit.client.1.vm06.stdout:2/474: chown d7/da/d1c/f70 728371 1 2026-03-09T00:03:42.366 INFO:tasks.workunit.client.1.vm06.stdout:4/312: dwrite d17/d24/d49/f62 [0,4194304] 0 2026-03-09T00:03:42.366 INFO:tasks.workunit.client.1.vm06.stdout:4/313: chown d17/d24/d3b/d54 890 1 2026-03-09T00:03:42.366 INFO:tasks.workunit.client.1.vm06.stdout:4/314: write d17/d24/f36 [5208872,105232] 0 2026-03-09T00:03:42.366 INFO:tasks.workunit.client.1.vm06.stdout:4/315: chown d17/d24/d49/l33 0 1 2026-03-09T00:03:42.366 INFO:tasks.workunit.client.1.vm06.stdout:4/316: chown d17/d24/d49/f5a 268920405 1 2026-03-09T00:03:42.369 INFO:tasks.workunit.client.1.vm06.stdout:4/317: dread d17/f1d [0,4194304] 0 2026-03-09T00:03:42.369 INFO:tasks.workunit.client.1.vm06.stdout:5/505: dwrite d5/d1c/d21/f73 [0,4194304] 0 2026-03-09T00:03:42.376 INFO:tasks.workunit.client.1.vm06.stdout:6/386: dwrite d4/fb [0,4194304] 0 2026-03-09T00:03:42.382 INFO:tasks.workunit.client.1.vm06.stdout:6/387: dread d4/d27/f4e [0,4194304] 0 2026-03-09T00:03:42.388 INFO:tasks.workunit.client.0.vm03.stdout:8/166: unlink d7/c2d 0 2026-03-09T00:03:42.389 INFO:tasks.workunit.client.0.vm03.stdout:6/190: dwrite d13/f31 [0,4194304] 0 2026-03-09T00:03:42.389 INFO:tasks.workunit.client.0.vm03.stdout:6/191: readlink d13/l38 0 2026-03-09T00:03:42.389 INFO:tasks.workunit.client.0.vm03.stdout:6/192: fsync d13/f31 0 2026-03-09T00:03:42.389 INFO:tasks.workunit.client.0.vm03.stdout:8/167: creat d7/df/f31 x:0 0 0 2026-03-09T00:03:42.389 INFO:tasks.workunit.client.0.vm03.stdout:8/168: read - d7/df/d1e/f24 zero size 2026-03-09T00:03:42.405 INFO:tasks.workunit.client.0.vm03.stdout:8/169: fdatasync d7/f9 0 2026-03-09T00:03:42.413 INFO:tasks.workunit.client.0.vm03.stdout:2/150: sync 2026-03-09T00:03:42.413 INFO:tasks.workunit.client.0.vm03.stdout:0/171: sync 2026-03-09T00:03:42.413 INFO:tasks.workunit.client.0.vm03.stdout:0/172: write d2/ff [4306911,52363] 0 2026-03-09T00:03:42.423 INFO:tasks.workunit.client.1.vm06.stdout:7/377: sync 2026-03-09T00:03:42.431 INFO:tasks.workunit.client.1.vm06.stdout:8/379: mkdir db/d53/d70/d38/d4d/d79 0 2026-03-09T00:03:42.432 INFO:tasks.workunit.client.0.vm03.stdout:4/185: unlink d7/f12 0 2026-03-09T00:03:42.432 INFO:tasks.workunit.client.0.vm03.stdout:4/186: creat d7/d20/f34 x:0 0 0 2026-03-09T00:03:42.436 INFO:tasks.workunit.client.1.vm06.stdout:3/418: mkdir d11/d28/d2e/d7e/d83/d87 0 2026-03-09T00:03:42.448 INFO:tasks.workunit.client.0.vm03.stdout:7/202: rmdir d2/d1f/d35 39 2026-03-09T00:03:42.450 INFO:tasks.workunit.client.1.vm06.stdout:2/475: creat d7/d1b/d5a/d86/f8a x:0 0 0 2026-03-09T00:03:42.450 INFO:tasks.workunit.client.1.vm06.stdout:2/476: write d7/d1b/d5a/d86/f8a [855432,63216] 0 2026-03-09T00:03:42.450 INFO:tasks.workunit.client.0.vm03.stdout:5/167: getdents d1c/d20/d2a 0 2026-03-09T00:03:42.451 INFO:tasks.workunit.client.1.vm06.stdout:2/477: dread d7/d1b/f37 [0,4194304] 0 2026-03-09T00:03:42.451 INFO:tasks.workunit.client.1.vm06.stdout:2/478: creat d7/d1b/d5a/d86/f8b x:0 0 0 2026-03-09T00:03:42.451 INFO:tasks.workunit.client.1.vm06.stdout:2/479: truncate d7/d1b/f37 570919 0 
2026-03-09T00:03:42.451 INFO:tasks.workunit.client.1.vm06.stdout:2/480: fdatasync d7/d1b/d5a/d86/f8b 0 2026-03-09T00:03:42.459 INFO:tasks.workunit.client.0.vm03.stdout:1/209: link f1 d4/f42 0 2026-03-09T00:03:42.477 INFO:tasks.workunit.client.0.vm03.stdout:2/151: dwrite d8/d17/f2c [0,4194304] 0 2026-03-09T00:03:42.478 INFO:tasks.workunit.client.1.vm06.stdout:4/318: rename d17/l2d to d17/d24/l67 0 2026-03-09T00:03:42.484 INFO:tasks.workunit.client.0.vm03.stdout:6/193: dwrite f10 [0,4194304] 0 2026-03-09T00:03:42.494 INFO:tasks.workunit.client.0.vm03.stdout:0/173: truncate d2/f1e 2496625 0 2026-03-09T00:03:42.495 INFO:tasks.workunit.client.0.vm03.stdout:9/154: sync 2026-03-09T00:03:42.503 INFO:tasks.workunit.client.0.vm03.stdout:0/174: write d2/f1e [2189366,37647] 0 2026-03-09T00:03:42.513 INFO:tasks.workunit.client.1.vm06.stdout:6/388: creat d4/d16/d46/f76 x:0 0 0 2026-03-09T00:03:42.519 INFO:tasks.workunit.client.0.vm03.stdout:4/187: mkdir d7/d20/d35 0 2026-03-09T00:03:42.519 INFO:tasks.workunit.client.0.vm03.stdout:4/188: read - d7/d27/f31 zero size 2026-03-09T00:03:42.519 INFO:tasks.workunit.client.0.vm03.stdout:4/189: chown d7/d23 395418 1 2026-03-09T00:03:42.519 INFO:tasks.workunit.client.0.vm03.stdout:4/190: stat d7/d20/d29/f2a 0 2026-03-09T00:03:42.533 INFO:tasks.workunit.client.1.vm06.stdout:7/378: truncate d0/df/d1a/f50 930580 0 2026-03-09T00:03:42.535 INFO:tasks.workunit.client.1.vm06.stdout:7/379: dread d0/df/d1a/d3a/f5d [0,4194304] 0 2026-03-09T00:03:42.541 INFO:tasks.workunit.client.1.vm06.stdout:8/380: rmdir db/d53/d70/d38 39 2026-03-09T00:03:42.541 INFO:tasks.workunit.client.0.vm03.stdout:3/122: getdents d2/db 0 2026-03-09T00:03:42.541 INFO:tasks.workunit.client.0.vm03.stdout:3/123: write d2/db/f14 [5204609,67492] 0 2026-03-09T00:03:42.544 INFO:tasks.workunit.client.1.vm06.stdout:3/419: link d11/d28/d2e/c68 d11/d28/d4d/c88 0 2026-03-09T00:03:42.545 INFO:tasks.workunit.client.0.vm03.stdout:9/155: dwrite d15/d1c/d21/f25 [0,4194304] 0 2026-03-09T00:03:42.545 INFO:tasks.workunit.client.0.vm03.stdout:5/168: rmdir d1c/d20 39 2026-03-09T00:03:42.545 INFO:tasks.workunit.client.0.vm03.stdout:5/169: fsync d1c/d20/d2a/d3b/f3c 0 2026-03-09T00:03:42.545 INFO:tasks.workunit.client.0.vm03.stdout:7/203: rename d2/d1f/f30 to d2/d1f/d3a/d31/f3f 0 2026-03-09T00:03:42.545 INFO:tasks.workunit.client.1.vm06.stdout:9/324: rmdir d1/d4 39 2026-03-09T00:03:42.545 INFO:tasks.workunit.client.1.vm06.stdout:1/346: sync 2026-03-09T00:03:42.545 INFO:tasks.workunit.client.1.vm06.stdout:0/389: sync 2026-03-09T00:03:42.545 INFO:tasks.workunit.client.1.vm06.stdout:0/390: write d3/d18/d1f/d44/f5a [1356238,110869] 0 2026-03-09T00:03:42.545 INFO:tasks.workunit.client.1.vm06.stdout:0/391: creat d3/d18/f82 x:0 0 0 2026-03-09T00:03:42.546 INFO:tasks.workunit.client.0.vm03.stdout:1/210: mkdir d4/d3a/d43 0 2026-03-09T00:03:42.561 INFO:tasks.workunit.client.1.vm06.stdout:2/481: link d7/da/db/l35 d7/da/d4e/d57/l8c 0 2026-03-09T00:03:42.561 INFO:tasks.workunit.client.1.vm06.stdout:2/482: chown d7/d1b/d5a/d86 161590 1 2026-03-09T00:03:42.561 INFO:tasks.workunit.client.1.vm06.stdout:2/483: creat d7/d1a/d25/d66/f8d x:0 0 0 2026-03-09T00:03:42.561 INFO:tasks.workunit.client.1.vm06.stdout:4/319: mkdir d17/d21/d4c/d66/d68 0 2026-03-09T00:03:42.571 INFO:tasks.workunit.client.1.vm06.stdout:5/506: rename d5/d1c/d68/f31 to d5/fae 0 2026-03-09T00:03:42.574 INFO:tasks.workunit.client.1.vm06.stdout:7/380: symlink d0/d55/l69 0 2026-03-09T00:03:42.579 INFO:tasks.workunit.client.0.vm03.stdout:6/194: mknod d13/d35/c42 0 
2026-03-09T00:03:42.579 INFO:tasks.workunit.client.0.vm03.stdout:6/195: dread - d13/f3a zero size 2026-03-09T00:03:42.580 INFO:tasks.workunit.client.0.vm03.stdout:8/170: getdents d7/df/d1a 0 2026-03-09T00:03:42.582 INFO:tasks.workunit.client.1.vm06.stdout:9/325: mknod d1/d3/d12/d21/d14/d25/c6a 0 2026-03-09T00:03:42.588 INFO:tasks.workunit.client.0.vm03.stdout:4/191: symlink d7/d27/l36 0 2026-03-09T00:03:42.588 INFO:tasks.workunit.client.1.vm06.stdout:1/347: symlink d6/d21/d2d/l73 0 2026-03-09T00:03:42.606 INFO:tasks.workunit.client.0.vm03.stdout:0/175: dwrite d2/da/dd/f38 [0,4194304] 0 2026-03-09T00:03:42.611 INFO:tasks.workunit.client.1.vm06.stdout:6/389: dwrite d4/d16/f21 [0,4194304] 0 2026-03-09T00:03:42.611 INFO:tasks.workunit.client.1.vm06.stdout:6/390: chown d4/f38 1 1 2026-03-09T00:03:42.611 INFO:tasks.workunit.client.1.vm06.stdout:6/391: write d4/d16/f34 [5527916,36150] 0 2026-03-09T00:03:42.611 INFO:tasks.workunit.client.0.vm03.stdout:9/156: mknod d15/d1c/d21/c31 0 2026-03-09T00:03:42.613 INFO:tasks.workunit.client.0.vm03.stdout:5/170: chown d1c/l27 3651 1 2026-03-09T00:03:42.613 INFO:tasks.workunit.client.1.vm06.stdout:4/320: creat d17/d21/d4c/d50/f69 x:0 0 0 2026-03-09T00:03:42.614 INFO:tasks.workunit.client.0.vm03.stdout:1/211: link d4/d15/f35 d4/d15/f44 0 2026-03-09T00:03:42.614 INFO:tasks.workunit.client.0.vm03.stdout:1/212: creat d4/d15/f45 x:0 0 0 2026-03-09T00:03:42.618 INFO:tasks.workunit.client.0.vm03.stdout:2/152: rmdir d8/d17 39 2026-03-09T00:03:42.618 INFO:tasks.workunit.client.1.vm06.stdout:7/381: truncate d0/df/d17/f1f 936623 0 2026-03-09T00:03:42.619 INFO:tasks.workunit.client.0.vm03.stdout:6/196: unlink d13/d1e/f36 0 2026-03-09T00:03:42.623 INFO:tasks.workunit.client.0.vm03.stdout:8/171: link d7/df/f2c d7/df/f32 0 2026-03-09T00:03:42.624 INFO:tasks.workunit.client.1.vm06.stdout:4/321: dread d17/d21/d4c/f56 [0,4194304] 0 2026-03-09T00:03:42.628 INFO:tasks.workunit.client.0.vm03.stdout:4/192: mkdir d7/d20/d29/d37 0 2026-03-09T00:03:42.629 INFO:tasks.workunit.client.0.vm03.stdout:4/193: chown d7/d20/f21 485 1 2026-03-09T00:03:42.630 INFO:tasks.workunit.client.0.vm03.stdout:0/176: truncate d2/f22 1235582 0 2026-03-09T00:03:42.648 INFO:tasks.workunit.client.1.vm06.stdout:6/392: mkdir d4/d16/d46/d77 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.1.vm06.stdout:6/393: write d4/d16/f34 [7804944,80719] 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.1.vm06.stdout:7/382: mknod d0/df/c6a 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.1.vm06.stdout:7/383: stat d0/df/d1a/d27/f37 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.1.vm06.stdout:7/384: write d0/df/d1a/d27/f4b [996692,92693] 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.1.vm06.stdout:7/385: mknod d0/df/d1a/d35/c6b 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.1.vm06.stdout:7/386: rename d0 to d0/d39/d6c 22 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.1.vm06.stdout:7/387: unlink d0/f2 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.1.vm06.stdout:7/388: write d0/df/d1a/d3a/f5d [473312,78509] 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.1.vm06.stdout:6/394: dread d4/f68 [0,4194304] 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.1.vm06.stdout:6/395: dread - d4/d27/d42/f6b zero size 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.0.vm03.stdout:5/171: symlink d1c/d20/d2a/l40 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.0.vm03.stdout:8/172: creat d7/df/d1a/f33 x:0 0 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.0.vm03.stdout:8/173: write d7/f18 
[830549,73192] 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.0.vm03.stdout:4/194: rename d7/d20/d29/d37 to d7/d20/d29/d38 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.0.vm03.stdout:5/172: symlink d1c/d20/l41 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.0.vm03.stdout:1/213: getdents d4 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.0.vm03.stdout:8/174: creat d7/f34 x:0 0 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.0.vm03.stdout:4/195: mknod d7/d27/c39 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.0.vm03.stdout:4/196: getdents d7/d20/d35 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.0.vm03.stdout:8/175: link d7/df/d1a/l1b d7/df/d1a/d2b/l35 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.0.vm03.stdout:8/176: readlink d7/l2f 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.0.vm03.stdout:4/197: mkdir d7/d20/d29/d38/d3a 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.0.vm03.stdout:8/177: creat d7/df/d1e/f36 x:0 0 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.0.vm03.stdout:8/178: creat d7/df/f37 x:0 0 0 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.0.vm03.stdout:8/179: chown d7/df/f31 622974 1 2026-03-09T00:03:42.649 INFO:tasks.workunit.client.0.vm03.stdout:4/198: symlink d7/d27/l3b 0 2026-03-09T00:03:42.653 INFO:tasks.workunit.client.0.vm03.stdout:2/153: write d8/f9 [2927789,76201] 0 2026-03-09T00:03:42.653 INFO:tasks.workunit.client.0.vm03.stdout:2/154: creat d8/d1b/d2a/f33 x:0 0 0 2026-03-09T00:03:42.653 INFO:tasks.workunit.client.0.vm03.stdout:2/155: fdatasync f6 0 2026-03-09T00:03:42.770 INFO:tasks.workunit.client.1.vm06.stdout:8/381: dwrite db/d53/d70/d38/f72 [0,4194304] 0 2026-03-09T00:03:42.772 INFO:tasks.workunit.client.1.vm06.stdout:8/382: rename db/d1e/f20 to db/dd/f7a 0 2026-03-09T00:03:42.791 INFO:tasks.workunit.client.1.vm06.stdout:7/389: write d0/df/d1a/d3a/f23 [82392,21084] 0 2026-03-09T00:03:42.791 INFO:tasks.workunit.client.1.vm06.stdout:7/390: truncate d0/f4f 814565 0 2026-03-09T00:03:42.791 INFO:tasks.workunit.client.1.vm06.stdout:3/420: dwrite d11/d28/d2e/d2f/d36/f4a [0,4194304] 0 2026-03-09T00:03:42.791 INFO:tasks.workunit.client.0.vm03.stdout:1/214: dwrite d4/d15/f17 [0,4194304] 0 2026-03-09T00:03:42.792 INFO:tasks.workunit.client.0.vm03.stdout:1/215: mkdir d4/d3a/d3d/d46 0 2026-03-09T00:03:42.793 INFO:tasks.workunit.client.0.vm03.stdout:1/216: link d4/d15/f3f d4/d3a/d43/f47 0 2026-03-09T00:03:42.793 INFO:tasks.workunit.client.0.vm03.stdout:3/124: dwrite d2/db/f13 [0,4194304] 0 2026-03-09T00:03:42.794 INFO:tasks.workunit.client.1.vm06.stdout:7/391: creat d0/df/d1a/d27/d4c/f6d x:0 0 0 2026-03-09T00:03:42.795 INFO:tasks.workunit.client.1.vm06.stdout:3/421: mkdir d11/d28/d4d/d89 0 2026-03-09T00:03:42.795 INFO:tasks.workunit.client.1.vm06.stdout:3/422: readlink d11/l23 0 2026-03-09T00:03:42.795 INFO:tasks.workunit.client.1.vm06.stdout:1/348: dwrite d6/d4c/d71/d4d/f5c [0,4194304] 0 2026-03-09T00:03:42.795 INFO:tasks.workunit.client.1.vm06.stdout:1/349: write d6/d21/f2e [5233225,86665] 0 2026-03-09T00:03:42.795 INFO:tasks.workunit.client.1.vm06.stdout:1/350: chown d6/d4c/c6b 30573188 1 2026-03-09T00:03:42.795 INFO:tasks.workunit.client.0.vm03.stdout:1/217: write d4/f9 [808912,81614] 0 2026-03-09T00:03:42.798 INFO:tasks.workunit.client.1.vm06.stdout:4/322: dwrite d17/d21/d4c/d50/f60 [0,4194304] 0 2026-03-09T00:03:42.798 INFO:tasks.workunit.client.1.vm06.stdout:7/392: mknod d0/df/c6e 0 2026-03-09T00:03:42.799 INFO:tasks.workunit.client.0.vm03.stdout:3/125: creat d2/db/f26 x:0 0 0 2026-03-09T00:03:42.799 
INFO:tasks.workunit.client.0.vm03.stdout:3/126: creat d2/db/f27 x:0 0 0 2026-03-09T00:03:42.799 INFO:tasks.workunit.client.0.vm03.stdout:3/127: chown d2/f6 1229 1 2026-03-09T00:03:42.799 INFO:tasks.workunit.client.0.vm03.stdout:3/128: chown d2/db/c18 3865604 1 2026-03-09T00:03:42.799 INFO:tasks.workunit.client.0.vm03.stdout:3/129: creat d2/db/f28 x:0 0 0 2026-03-09T00:03:42.799 INFO:tasks.workunit.client.0.vm03.stdout:3/130: readlink d2/l23 0 2026-03-09T00:03:42.803 INFO:tasks.workunit.client.1.vm06.stdout:1/351: creat d6/d21/d2d/f74 x:0 0 0 2026-03-09T00:03:42.804 INFO:tasks.workunit.client.1.vm06.stdout:7/393: creat d0/df/d1a/d3a/d4e/d5e/f6f x:0 0 0 2026-03-09T00:03:42.806 INFO:tasks.workunit.client.1.vm06.stdout:7/394: mkdir d0/df/d1a/d27/d70 0 2026-03-09T00:03:42.806 INFO:tasks.workunit.client.1.vm06.stdout:1/352: rmdir d6/d21/d2d/d3b 39 2026-03-09T00:03:42.818 INFO:tasks.workunit.client.1.vm06.stdout:6/396: dwrite d4/f26 [0,4194304] 0 2026-03-09T00:03:42.823 INFO:tasks.workunit.client.1.vm06.stdout:1/353: unlink d6/f41 0 2026-03-09T00:03:42.826 INFO:tasks.workunit.client.1.vm06.stdout:1/354: creat d6/d63/f75 x:0 0 0 2026-03-09T00:03:42.827 INFO:tasks.workunit.client.1.vm06.stdout:1/355: write d6/f28 [3830167,87943] 0 2026-03-09T00:03:42.827 INFO:tasks.workunit.client.1.vm06.stdout:1/356: truncate d6/d21/f69 753466 0 2026-03-09T00:03:42.827 INFO:tasks.workunit.client.1.vm06.stdout:6/397: mkdir d4/d27/d3e/d78 0 2026-03-09T00:03:42.827 INFO:tasks.workunit.client.1.vm06.stdout:6/398: fdatasync d4/f40 0 2026-03-09T00:03:42.830 INFO:tasks.workunit.client.0.vm03.stdout:9/157: dwrite fb [0,4194304] 0 2026-03-09T00:03:42.830 INFO:tasks.workunit.client.0.vm03.stdout:7/204: dwrite d2/d1f/d3a/f29 [0,4194304] 0 2026-03-09T00:03:42.830 INFO:tasks.workunit.client.0.vm03.stdout:7/205: write d2/d1f/f11 [1480045,62622] 0 2026-03-09T00:03:42.830 INFO:tasks.workunit.client.0.vm03.stdout:4/199: dwrite d7/d27/f2c [0,4194304] 0 2026-03-09T00:03:42.830 INFO:tasks.workunit.client.0.vm03.stdout:4/200: chown d7/d20/d29/d38 257952 1 2026-03-09T00:03:42.830 INFO:tasks.workunit.client.1.vm06.stdout:1/357: rename d6/d21/d2d/d3b/l48 to d6/l76 0 2026-03-09T00:03:42.831 INFO:tasks.workunit.client.1.vm06.stdout:1/358: creat d6/d21/d2d/d37/f77 x:0 0 0 2026-03-09T00:03:42.831 INFO:tasks.workunit.client.1.vm06.stdout:1/359: creat d6/d21/d2d/d37/f78 x:0 0 0 2026-03-09T00:03:42.843 INFO:tasks.workunit.client.0.vm03.stdout:7/206: unlink d2/f3c 0 2026-03-09T00:03:42.849 INFO:tasks.workunit.client.1.vm06.stdout:1/360: rename d6/d4c/d71/d4d to d6/d4c/d79 0 2026-03-09T00:03:42.849 INFO:tasks.workunit.client.0.vm03.stdout:4/201: mknod d7/d20/c3c 0 2026-03-09T00:03:42.849 INFO:tasks.workunit.client.0.vm03.stdout:4/202: dread - d7/d20/f34 zero size 2026-03-09T00:03:42.849 INFO:tasks.workunit.client.0.vm03.stdout:4/203: chown d7/fe 108435 1 2026-03-09T00:03:42.849 INFO:tasks.workunit.client.0.vm03.stdout:4/204: creat d7/d20/f3d x:0 0 0 2026-03-09T00:03:42.849 INFO:tasks.workunit.client.0.vm03.stdout:4/205: write d7/d20/d29/f2a [887105,30569] 0 2026-03-09T00:03:42.849 INFO:tasks.workunit.client.0.vm03.stdout:4/206: chown d7/d20/f34 906 1 2026-03-09T00:03:42.849 INFO:tasks.workunit.client.0.vm03.stdout:4/207: chown d7/f1d 576158 1 2026-03-09T00:03:42.849 INFO:tasks.workunit.client.0.vm03.stdout:7/207: mkdir d2/d1f/d40 0 2026-03-09T00:03:42.855 INFO:tasks.workunit.client.1.vm06.stdout:1/361: rename d6/c56 to d6/d4c/c7a 0 2026-03-09T00:03:42.856 INFO:tasks.workunit.client.0.vm03.stdout:4/208: link d7/f15 d7/d23/d25/f3e 0 
2026-03-09T00:03:42.856 INFO:tasks.workunit.client.0.vm03.stdout:4/209: stat d7/d23/d25/f3e 0 2026-03-09T00:03:42.856 INFO:tasks.workunit.client.0.vm03.stdout:4/210: truncate d7/d20/f3d 1034150 0 2026-03-09T00:03:42.856 INFO:tasks.workunit.client.0.vm03.stdout:4/211: truncate d7/d20/f34 593762 0 2026-03-09T00:03:42.857 INFO:tasks.workunit.client.0.vm03.stdout:7/208: link d2/l36 d2/l41 0 2026-03-09T00:03:42.858 INFO:tasks.workunit.client.0.vm03.stdout:4/212: rename d7/d27/c39 to d7/d20/c3f 0 2026-03-09T00:03:42.858 INFO:tasks.workunit.client.1.vm06.stdout:1/362: rename d6/d21/d2d/f31 to d6/d21/f7b 0 2026-03-09T00:03:42.858 INFO:tasks.workunit.client.1.vm06.stdout:1/363: chown d6/d21/d2d/d37/d6d 142785696 1 2026-03-09T00:03:42.858 INFO:tasks.workunit.client.1.vm06.stdout:1/364: chown d6/d21/f55 57 1 2026-03-09T00:03:42.863 INFO:tasks.workunit.client.0.vm03.stdout:4/213: mknod d7/d27/c40 0 2026-03-09T00:03:42.863 INFO:tasks.workunit.client.1.vm06.stdout:1/365: stat d6/l76 0 2026-03-09T00:03:42.863 INFO:tasks.workunit.client.1.vm06.stdout:1/366: chown d6/d21/d2d 327523 1 2026-03-09T00:03:42.871 INFO:tasks.workunit.client.0.vm03.stdout:4/214: write d7/d23/d25/f3e [4177749,124441] 0 2026-03-09T00:03:42.871 INFO:tasks.workunit.client.0.vm03.stdout:4/215: fdatasync d7/d27/f31 0 2026-03-09T00:03:42.871 INFO:tasks.workunit.client.0.vm03.stdout:4/216: chown d7/d23/d25/f3e 3 1 2026-03-09T00:03:42.871 INFO:tasks.workunit.client.1.vm06.stdout:1/367: creat d6/f7c x:0 0 0 2026-03-09T00:03:42.903 INFO:tasks.workunit.client.1.vm06.stdout:1/368: dread d6/d21/f69 [0,4194304] 0 2026-03-09T00:03:42.903 INFO:tasks.workunit.client.0.vm03.stdout:1/218: dwrite d4/d15/f44 [0,4194304] 0 2026-03-09T00:03:42.904 INFO:tasks.workunit.client.1.vm06.stdout:0/392: dwrite d3/f7 [0,4194304] 0 2026-03-09T00:03:42.905 INFO:tasks.workunit.client.1.vm06.stdout:1/369: truncate f0 1406163 0 2026-03-09T00:03:42.906 INFO:tasks.workunit.client.1.vm06.stdout:0/393: rmdir d3/d18/d1f/d44/d6a 39 2026-03-09T00:03:42.907 INFO:tasks.workunit.client.0.vm03.stdout:1/219: rename d4/d6/fa to d4/d3a/f48 0 2026-03-09T00:03:42.918 INFO:tasks.workunit.client.0.vm03.stdout:1/220: fdatasync d4/d6/f8 0 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.1.vm06.stdout:0/394: creat d3/d18/d1f/d39/f83 x:0 0 0 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.1.vm06.stdout:0/395: mknod d3/d18/d2c/c84 0 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.1.vm06.stdout:0/396: chown d3/d18/d1f/c5b 7369874 1 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.1.vm06.stdout:0/397: read d3/d18/d1f/d39/d49/f50 [591260,56739] 0 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.0.vm03.stdout:9/158: read fd [6567384,88099] 0 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.0.vm03.stdout:9/159: getdents d15/d1c/d28/d30 0 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.0.vm03.stdout:9/160: symlink d15/d1c/d28/d30/l32 0 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.0.vm03.stdout:9/161: creat d15/d1c/d28/f33 x:0 0 0 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.0.vm03.stdout:9/162: creat d15/d1c/d21/f34 x:0 0 0 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.0.vm03.stdout:0/177: dread d2/ff [0,4194304] 0 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.0.vm03.stdout:9/163: unlink l12 0 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.0.vm03.stdout:9/164: fsync d15/d1c/d21/f25 0 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.0.vm03.stdout:0/178: mkdir d2/da/d36/d39 0 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.0.vm03.stdout:0/179: dread - 
d2/d1f/f2c zero size 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.0.vm03.stdout:9/165: mknod d15/d1c/d28/d30/c35 0 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.0.vm03.stdout:9/166: write d15/d1c/d28/f33 [949231,37930] 0 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.0.vm03.stdout:9/167: mkdir d15/d1c/d36 0 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.0.vm03.stdout:9/168: mknod d15/d1c/d28/d30/c37 0 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.0.vm03.stdout:9/169: chown d15/d1c 14330750 1 2026-03-09T00:03:42.919 INFO:tasks.workunit.client.0.vm03.stdout:9/170: rename d15/c27 to d15/d1c/d21/c38 0 2026-03-09T00:03:42.922 INFO:tasks.workunit.client.1.vm06.stdout:0/398: dread d3/f19 [0,4194304] 0 2026-03-09T00:03:42.922 INFO:tasks.workunit.client.1.vm06.stdout:0/399: read d3/fa [124823,76402] 0 2026-03-09T00:03:42.926 INFO:tasks.workunit.client.0.vm03.stdout:5/173: dwrite d1c/d20/f3e [0,4194304] 0 2026-03-09T00:03:42.926 INFO:tasks.workunit.client.0.vm03.stdout:5/174: dread - d1c/f37 zero size 2026-03-09T00:03:42.932 INFO:tasks.workunit.client.1.vm06.stdout:0/400: rename f1 to d3/d18/d2c/d2d/f85 0 2026-03-09T00:03:42.935 INFO:tasks.workunit.client.1.vm06.stdout:0/401: fsync d3/d18/d2c/d2d/d31/f4f 0 2026-03-09T00:03:42.935 INFO:tasks.workunit.client.0.vm03.stdout:5/175: dread - d1c/f29 zero size 2026-03-09T00:03:42.936 INFO:tasks.workunit.client.1.vm06.stdout:1/370: dread d6/d21/d2d/f6c [4194304,4194304] 0 2026-03-09T00:03:42.936 INFO:tasks.workunit.client.1.vm06.stdout:1/371: readlink d6/d21/d2d/l33 0 2026-03-09T00:03:42.937 INFO:tasks.workunit.client.1.vm06.stdout:1/372: stat d6/c36 0 2026-03-09T00:03:42.940 INFO:tasks.workunit.client.1.vm06.stdout:1/373: dread d6/fa [0,4194304] 0 2026-03-09T00:03:42.940 INFO:tasks.workunit.client.1.vm06.stdout:1/374: read - d6/d21/d2d/d37/f77 zero size 2026-03-09T00:03:42.967 INFO:tasks.workunit.client.0.vm03.stdout:3/131: dwrite d2/db/f24 [0,4194304] 0 2026-03-09T00:03:42.968 INFO:tasks.workunit.client.1.vm06.stdout:9/326: dwrite d1/d3/d12/d21/d14/d25/f32 [0,4194304] 0 2026-03-09T00:03:42.968 INFO:tasks.workunit.client.1.vm06.stdout:9/327: chown d1/d4/f44 266 1 2026-03-09T00:03:42.969 INFO:tasks.workunit.client.0.vm03.stdout:3/132: mknod d2/db/c29 0 2026-03-09T00:03:42.975 INFO:tasks.workunit.client.0.vm03.stdout:3/133: rename d2/f1c to d2/f2a 0 2026-03-09T00:03:42.980 INFO:tasks.workunit.client.1.vm06.stdout:9/328: dread d1/d3/d12/d21/d14/d25/f60 [0,4194304] 0 2026-03-09T00:03:42.980 INFO:tasks.workunit.client.1.vm06.stdout:9/329: chown d1/d3/d2b/d58/l66 554189 1 2026-03-09T00:03:42.980 INFO:tasks.workunit.client.1.vm06.stdout:9/330: chown d1/d3/d50/c57 30931810 1 2026-03-09T00:03:42.980 INFO:tasks.workunit.client.1.vm06.stdout:5/507: getdents d5 0 2026-03-09T00:03:42.988 INFO:tasks.workunit.client.1.vm06.stdout:5/508: symlink d5/d44/d84/laf 0 2026-03-09T00:03:42.990 INFO:tasks.workunit.client.1.vm06.stdout:2/484: rmdir d7/d1a/d25/d66 39 2026-03-09T00:03:42.990 INFO:tasks.workunit.client.1.vm06.stdout:2/485: dread - d7/d1a/d3c/f4d zero size 2026-03-09T00:03:42.991 INFO:tasks.workunit.client.1.vm06.stdout:2/486: mkdir d7/d1a/d25/d66/d87/d8e 0 2026-03-09T00:03:42.991 INFO:tasks.workunit.client.1.vm06.stdout:2/487: getdents d7/d1a 0 2026-03-09T00:03:42.992 INFO:tasks.workunit.client.1.vm06.stdout:2/488: symlink d7/d1b/d5a/l8f 0 2026-03-09T00:03:42.992 INFO:tasks.workunit.client.1.vm06.stdout:2/489: getdents d7/d1a/d25/d66 0 2026-03-09T00:03:42.993 INFO:tasks.workunit.client.1.vm06.stdout:2/490: creat d7/d1b/d31/f90 x:0 0 0 
2026-03-09T00:03:42.998 INFO:tasks.workunit.client.1.vm06.stdout:0/402: dwrite d3/d18/d2c/d2d/d31/f7b [0,4194304] 0 2026-03-09T00:03:43.000 INFO:tasks.workunit.client.1.vm06.stdout:0/403: link d3/fa d3/d18/d28/f86 0 2026-03-09T00:03:43.000 INFO:tasks.workunit.client.1.vm06.stdout:0/404: chown d3/d18/d1f/d44/d6a/l7a 1715 1 2026-03-09T00:03:43.002 INFO:tasks.workunit.client.1.vm06.stdout:0/405: dread d3/d18/d1f/f4a [0,4194304] 0 2026-03-09T00:03:43.002 INFO:tasks.workunit.client.1.vm06.stdout:0/406: dread - d3/d18/d1f/d39/f6e zero size 2026-03-09T00:03:43.018 INFO:tasks.workunit.client.0.vm03.stdout:1/221: dwrite d4/d15/d1a/f1d [0,4194304] 0 2026-03-09T00:03:43.042 INFO:tasks.workunit.client.0.vm03.stdout:1/222: dread d4/d3a/f48 [0,4194304] 0 2026-03-09T00:03:43.042 INFO:tasks.workunit.client.0.vm03.stdout:1/223: truncate d4/d3a/d43/f47 852039 0 2026-03-09T00:03:43.042 INFO:tasks.workunit.client.0.vm03.stdout:1/224: creat d4/d3a/d43/f49 x:0 0 0 2026-03-09T00:03:43.043 INFO:tasks.workunit.client.0.vm03.stdout:1/225: rename d4/d6/f34 to d4/d3a/d3d/f4a 0 2026-03-09T00:03:43.044 INFO:tasks.workunit.client.0.vm03.stdout:1/226: rename d4/d6/ff to d4/d3a/d32/f4b 0 2026-03-09T00:03:43.047 INFO:tasks.workunit.client.0.vm03.stdout:1/227: read d4/d3a/d32/f4b [492865,44824] 0 2026-03-09T00:03:43.047 INFO:tasks.workunit.client.0.vm03.stdout:1/228: chown d4/f1e 26651476 1 2026-03-09T00:03:43.047 INFO:tasks.workunit.client.0.vm03.stdout:1/229: creat d4/d3a/d3d/d46/f4c x:0 0 0 2026-03-09T00:03:43.048 INFO:tasks.workunit.client.0.vm03.stdout:1/230: rename d4/d6/f2a to d4/d3a/f4d 0 2026-03-09T00:03:43.048 INFO:tasks.workunit.client.0.vm03.stdout:1/231: readlink d4/lc 0 2026-03-09T00:03:43.057 INFO:tasks.workunit.client.0.vm03.stdout:7/209: dwrite d2/d1f/d3a/f29 [0,4194304] 0 2026-03-09T00:03:43.059 INFO:tasks.workunit.client.0.vm03.stdout:7/210: mkdir d2/d1f/d42 0 2026-03-09T00:03:43.059 INFO:tasks.workunit.client.0.vm03.stdout:7/211: readlink d2/d1f/l20 0 2026-03-09T00:03:43.064 INFO:tasks.workunit.client.0.vm03.stdout:7/212: dread d2/d1f/d35/f3e [0,4194304] 0 2026-03-09T00:03:43.065 INFO:tasks.workunit.client.0.vm03.stdout:7/213: mkdir d2/d1f/d42/d43 0 2026-03-09T00:03:43.065 INFO:tasks.workunit.client.0.vm03.stdout:7/214: rename d2/f2a to d2/d1f/d3a/d31/f44 0 2026-03-09T00:03:43.065 INFO:tasks.workunit.client.0.vm03.stdout:7/215: readlink d2/l2f 0 2026-03-09T00:03:43.093 INFO:tasks.workunit.client.1.vm06.stdout:0/407: dwrite d3/d18/d1f/d39/d49/f50 [0,4194304] 0 2026-03-09T00:03:43.093 INFO:tasks.workunit.client.1.vm06.stdout:0/408: creat d3/d18/d3c/f87 x:0 0 0 2026-03-09T00:03:43.094 INFO:tasks.workunit.client.1.vm06.stdout:0/409: creat d3/d18/d2c/d2d/d31/f88 x:0 0 0 2026-03-09T00:03:43.094 INFO:tasks.workunit.client.1.vm06.stdout:0/410: stat d3/d18/f59 0 2026-03-09T00:03:43.094 INFO:tasks.workunit.client.1.vm06.stdout:0/411: creat d3/d18/d2c/d2d/d31/f89 x:0 0 0 2026-03-09T00:03:43.094 INFO:tasks.workunit.client.1.vm06.stdout:0/412: fdatasync d3/d18/d2c/d2d/d31/f89 0 2026-03-09T00:03:43.094 INFO:tasks.workunit.client.1.vm06.stdout:0/413: chown d3/d18/d2c/d2d 3 1 2026-03-09T00:03:43.094 INFO:tasks.workunit.client.1.vm06.stdout:0/414: write d3/f1a [1400441,129686] 0 2026-03-09T00:03:43.112 INFO:tasks.workunit.client.1.vm06.stdout:2/491: write d7/d1b/d5a/d86/f8a [89598,81125] 0 2026-03-09T00:03:43.112 INFO:tasks.workunit.client.1.vm06.stdout:2/492: readlink d7/d1a/d56/l6f 0 2026-03-09T00:03:43.112 INFO:tasks.workunit.client.1.vm06.stdout:2/493: chown d7/da/d55/f5b 14 1 2026-03-09T00:03:43.122 
INFO:tasks.workunit.client.0.vm03.stdout:0/180: dwrite f0 [0,4194304] 0 2026-03-09T00:03:43.128 INFO:tasks.workunit.client.0.vm03.stdout:0/181: creat d2/da/d1a/f3a x:0 0 0 2026-03-09T00:03:43.128 INFO:tasks.workunit.client.0.vm03.stdout:0/182: creat d2/da/d36/d39/f3b x:0 0 0 2026-03-09T00:03:43.132 INFO:tasks.workunit.client.1.vm06.stdout:5/509: dwrite d5/d1c/d68/f3f [0,4194304] 0 2026-03-09T00:03:43.132 INFO:tasks.workunit.client.1.vm06.stdout:5/510: creat d5/d44/d4b/d92/d95/fb0 x:0 0 0 2026-03-09T00:03:43.134 INFO:tasks.workunit.client.0.vm03.stdout:0/183: mknod d2/da/d1a/c3c 0 2026-03-09T00:03:43.134 INFO:tasks.workunit.client.0.vm03.stdout:0/184: fdatasync d2/f32 0 2026-03-09T00:03:43.134 INFO:tasks.workunit.client.0.vm03.stdout:0/185: fdatasync d2/da/dd/f38 0 2026-03-09T00:03:43.141 INFO:tasks.workunit.client.0.vm03.stdout:7/216: dwrite d2/d1f/f11 [0,4194304] 0 2026-03-09T00:03:43.142 INFO:tasks.workunit.client.0.vm03.stdout:0/186: mknod d2/d1f/c3d 0 2026-03-09T00:03:43.151 INFO:tasks.workunit.client.0.vm03.stdout:8/180: truncate d7/f18 907084 0 2026-03-09T00:03:43.154 INFO:tasks.workunit.client.0.vm03.stdout:8/181: dread - d7/df/f30 zero size 2026-03-09T00:03:43.155 INFO:tasks.workunit.client.1.vm06.stdout:3/423: getdents d11/d28/d4d 0 2026-03-09T00:03:43.156 INFO:tasks.workunit.client.0.vm03.stdout:7/217: symlink d2/d1f/d42/d43/l45 0 2026-03-09T00:03:43.174 INFO:tasks.workunit.client.1.vm06.stdout:3/424: symlink d11/d28/d2e/d7e/d83/l8a 0 2026-03-09T00:03:43.178 INFO:tasks.workunit.client.1.vm06.stdout:3/425: dread d11/d28/f42 [0,4194304] 0 2026-03-09T00:03:43.180 INFO:tasks.workunit.client.0.vm03.stdout:8/182: getdents d7 0 2026-03-09T00:03:43.180 INFO:tasks.workunit.client.0.vm03.stdout:8/183: mkdir d7/df/d1e/d38 0 2026-03-09T00:03:43.183 INFO:tasks.workunit.client.1.vm06.stdout:3/426: unlink d11/d28/d2e/d2f/f53 0 2026-03-09T00:03:43.183 INFO:tasks.workunit.client.1.vm06.stdout:3/427: read d11/d3f/f71 [210351,33095] 0 2026-03-09T00:03:43.183 INFO:tasks.workunit.client.1.vm06.stdout:5/511: dwrite d5/ff [0,4194304] 0 2026-03-09T00:03:43.185 INFO:tasks.workunit.client.1.vm06.stdout:3/428: symlink d11/d28/d2e/d2f/d5b/d5f/l8b 0 2026-03-09T00:03:43.189 INFO:tasks.workunit.client.1.vm06.stdout:5/512: mkdir d5/db1 0 2026-03-09T00:03:43.191 INFO:tasks.workunit.client.1.vm06.stdout:3/429: symlink d11/d28/d2e/d2f/d5b/l8c 0 2026-03-09T00:03:43.201 INFO:tasks.workunit.client.1.vm06.stdout:5/513: write d5/d1c/d21/d28/f57 [584535,82548] 0 2026-03-09T00:03:43.201 INFO:tasks.workunit.client.1.vm06.stdout:5/514: write d5/d1c/f75 [760835,17223] 0 2026-03-09T00:03:43.201 INFO:tasks.workunit.client.1.vm06.stdout:5/515: write d5/f43 [2979509,51892] 0 2026-03-09T00:03:43.201 INFO:tasks.workunit.client.1.vm06.stdout:3/430: mkdir d11/d3f/d8d 0 2026-03-09T00:03:43.213 INFO:tasks.workunit.client.0.vm03.stdout:0/187: dwrite d2/da/dd/f38 [0,4194304] 0 2026-03-09T00:03:43.216 INFO:tasks.workunit.client.1.vm06.stdout:5/516: creat d5/d1c/d23/d34/fb2 x:0 0 0 2026-03-09T00:03:43.216 INFO:tasks.workunit.client.1.vm06.stdout:5/517: stat d5/d1c/d21/d28/d5e/d66/f8a 0 2026-03-09T00:03:43.216 INFO:tasks.workunit.client.1.vm06.stdout:5/518: write d5/d1c/d23/f42 [1784206,115557] 0 2026-03-09T00:03:43.216 INFO:tasks.workunit.client.1.vm06.stdout:5/519: write d5/d1c/f75 [1560114,120918] 0 2026-03-09T00:03:43.221 INFO:tasks.workunit.client.0.vm03.stdout:0/188: symlink d2/da/d36/l3e 0 2026-03-09T00:03:43.221 INFO:tasks.workunit.client.0.vm03.stdout:0/189: write d2/da/dd/f14 [2119710,96998] 0 2026-03-09T00:03:43.221 
INFO:tasks.workunit.client.1.vm06.stdout:5/520: dread d5/d44/d4b/f6d [0,4194304] 0 2026-03-09T00:03:43.221 INFO:tasks.workunit.client.1.vm06.stdout:5/521: read d5/fe [3657174,106624] 0 2026-03-09T00:03:43.221 INFO:tasks.workunit.client.1.vm06.stdout:5/522: stat d5/d44/d4b/d92/d49/la3 0 2026-03-09T00:03:43.222 INFO:tasks.workunit.client.1.vm06.stdout:5/523: creat d5/d1c/d23/fb3 x:0 0 0 2026-03-09T00:03:43.222 INFO:tasks.workunit.client.1.vm06.stdout:5/524: chown d5/d1c/d21/f96 359086 1 2026-03-09T00:03:43.224 INFO:tasks.workunit.client.1.vm06.stdout:5/525: rename d5/d1c/d68/f9d to d5/d1c/d68/fb4 0 2026-03-09T00:03:43.224 INFO:tasks.workunit.client.0.vm03.stdout:0/190: getdents d2/da/dd 0 2026-03-09T00:03:43.235 INFO:tasks.workunit.client.0.vm03.stdout:0/191: dread d2/fe [0,4194304] 0 2026-03-09T00:03:43.237 INFO:tasks.workunit.client.0.vm03.stdout:7/218: dwrite d2/d1f/d35/f3e [0,4194304] 0 2026-03-09T00:03:43.237 INFO:tasks.workunit.client.0.vm03.stdout:7/219: write d2/d4/fb [3844457,12915] 0 2026-03-09T00:03:43.238 INFO:tasks.workunit.client.1.vm06.stdout:3/431: dwrite d11/d28/f4f [4194304,4194304] 0 2026-03-09T00:03:43.238 INFO:tasks.workunit.client.1.vm06.stdout:3/432: fdatasync d11/d28/d2e/d2f/d5b/f84 0 2026-03-09T00:03:43.240 INFO:tasks.workunit.client.0.vm03.stdout:7/220: dread d2/d1f/d3a/d31/f3f [0,4194304] 0 2026-03-09T00:03:43.242 INFO:tasks.workunit.client.0.vm03.stdout:0/192: write d2/fe [3285475,49801] 0 2026-03-09T00:03:43.242 INFO:tasks.workunit.client.0.vm03.stdout:0/193: stat d2/d1f/l26 0 2026-03-09T00:03:43.243 INFO:tasks.workunit.client.0.vm03.stdout:7/221: rename d2/d1f/d3a/d24/d32 to d2/d1f/d42/d46 0 2026-03-09T00:03:43.244 INFO:tasks.workunit.client.1.vm06.stdout:5/526: dread d5/d1c/d68/fb4 [0,4194304] 0 2026-03-09T00:03:43.244 INFO:tasks.workunit.client.1.vm06.stdout:5/527: chown d5/d1c/d21/f3c 1451951 1 2026-03-09T00:03:43.244 INFO:tasks.workunit.client.0.vm03.stdout:7/222: creat d2/d1f/d42/f47 x:0 0 0 2026-03-09T00:03:43.244 INFO:tasks.workunit.client.0.vm03.stdout:7/223: chown d2/d4 562728295 1 2026-03-09T00:03:43.246 INFO:tasks.workunit.client.1.vm06.stdout:3/433: mknod d11/d28/d4d/d89/c8e 0 2026-03-09T00:03:43.246 INFO:tasks.workunit.client.1.vm06.stdout:5/528: mknod d5/d1c/d23/d34/cb5 0 2026-03-09T00:03:43.248 INFO:tasks.workunit.client.0.vm03.stdout:7/224: rename d2/d4/l25 to d2/d1f/d35/l48 0 2026-03-09T00:03:43.248 INFO:tasks.workunit.client.0.vm03.stdout:7/225: truncate d2/d1f/d42/f47 819240 0 2026-03-09T00:03:43.281 INFO:tasks.workunit.client.0.vm03.stdout:4/217: truncate d7/f15 1342178 0 2026-03-09T00:03:43.286 INFO:tasks.workunit.client.0.vm03.stdout:4/218: symlink d7/d20/d35/l41 0 2026-03-09T00:03:43.296 INFO:tasks.workunit.client.1.vm06.stdout:0/415: getdents d3/d18/d2c 0 2026-03-09T00:03:43.318 INFO:tasks.workunit.client.0.vm03.stdout:3/134: rmdir d2/db 39 2026-03-09T00:03:43.319 INFO:tasks.workunit.client.0.vm03.stdout:3/135: mknod d2/c2b 0 2026-03-09T00:03:43.319 INFO:tasks.workunit.client.0.vm03.stdout:3/136: chown d2/db/f27 91 1 2026-03-09T00:03:43.319 INFO:tasks.workunit.client.0.vm03.stdout:3/137: truncate d2/db/f17 927860 0 2026-03-09T00:03:43.327 INFO:tasks.workunit.client.0.vm03.stdout:3/138: dread d2/db/f15 [0,4194304] 0 2026-03-09T00:03:43.327 INFO:tasks.workunit.client.0.vm03.stdout:3/139: write d2/f6 [555433,87634] 0 2026-03-09T00:03:43.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:42 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:43.338 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:42 vm03.local ceph-mon[52346]: pgmap v6: 65 pgs: 65 active+clean; 1.1 GiB data, 4.5 GiB used, 115 GiB / 120 GiB avail 2026-03-09T00:03:43.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:42 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:43.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:42 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:43.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:42 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T00:03:43.340 INFO:tasks.workunit.client.1.vm06.stdout:3/434: dwrite f9 [4194304,4194304] 0 2026-03-09T00:03:43.340 INFO:tasks.workunit.client.0.vm03.stdout:4/219: dwrite d7/d27/f31 [0,4194304] 0 2026-03-09T00:03:43.346 INFO:tasks.workunit.client.0.vm03.stdout:4/220: symlink d7/d20/l42 0 2026-03-09T00:03:43.346 INFO:tasks.workunit.client.1.vm06.stdout:3/435: mkdir d11/d28/d2e/d2f/d36/d8f 0 2026-03-09T00:03:43.346 INFO:tasks.workunit.client.1.vm06.stdout:3/436: stat d11/d28/d2e/d2f/l7a 0 2026-03-09T00:03:43.346 INFO:tasks.workunit.client.1.vm06.stdout:3/437: stat d11/d28/l34 0 2026-03-09T00:03:43.346 INFO:tasks.workunit.client.0.vm03.stdout:4/221: creat d7/d20/d29/f43 x:0 0 0 2026-03-09T00:03:43.346 INFO:tasks.workunit.client.0.vm03.stdout:4/222: truncate d7/d20/f21 5280553 0 2026-03-09T00:03:43.348 INFO:tasks.workunit.client.1.vm06.stdout:3/438: dread d11/d3f/f4c [0,4194304] 0 2026-03-09T00:03:43.348 INFO:tasks.workunit.client.1.vm06.stdout:3/439: chown d11/d28/d2e/d2f/d5b/d5f/f81 227504587 1 2026-03-09T00:03:43.348 INFO:tasks.workunit.client.1.vm06.stdout:3/440: chown d11 1019661 1 2026-03-09T00:03:43.348 INFO:tasks.workunit.client.1.vm06.stdout:3/441: mkdir d11/d28/d4d/d89/d90 0 2026-03-09T00:03:43.348 INFO:tasks.workunit.client.1.vm06.stdout:3/442: chown d11/d28/c35 2155 1 2026-03-09T00:03:43.349 INFO:tasks.workunit.client.0.vm03.stdout:4/223: mknod d7/d23/c44 0 2026-03-09T00:03:43.349 INFO:tasks.workunit.client.1.vm06.stdout:3/443: mkdir d11/d28/d2e/d2f/d5b/d5f/d91 0 2026-03-09T00:03:43.349 INFO:tasks.workunit.client.1.vm06.stdout:3/444: read d11/f27 [514303,119757] 0 2026-03-09T00:03:43.349 INFO:tasks.workunit.client.1.vm06.stdout:3/445: chown d11/f5a 1070 1 2026-03-09T00:03:43.349 INFO:tasks.workunit.client.1.vm06.stdout:7/395: sync 2026-03-09T00:03:43.349 INFO:tasks.workunit.client.1.vm06.stdout:6/399: sync 2026-03-09T00:03:43.350 INFO:tasks.workunit.client.1.vm06.stdout:9/331: sync 2026-03-09T00:03:43.350 INFO:tasks.workunit.client.1.vm06.stdout:8/383: sync 2026-03-09T00:03:43.350 INFO:tasks.workunit.client.1.vm06.stdout:4/323: sync 2026-03-09T00:03:43.350 INFO:tasks.workunit.client.1.vm06.stdout:7/396: fdatasync d0/df/d1a/d27/d4c/f6d 0 2026-03-09T00:03:43.350 INFO:tasks.workunit.client.1.vm06.stdout:7/397: chown d0/df/d1a/d22/l2e 32135 1 2026-03-09T00:03:43.350 INFO:tasks.workunit.client.1.vm06.stdout:4/324: fdatasync d17/f35 0 2026-03-09T00:03:43.350 INFO:tasks.workunit.client.1.vm06.stdout:9/332: dread - d1/d3/d12/d21/d9/f65 zero size 2026-03-09T00:03:43.350 INFO:tasks.workunit.client.1.vm06.stdout:6/400: creat d4/d27/d3e/d57/f79 x:0 0 0 2026-03-09T00:03:43.350 INFO:tasks.workunit.client.1.vm06.stdout:6/401: read - d4/d16/d53/f5f zero size 2026-03-09T00:03:43.350 
INFO:tasks.workunit.client.1.vm06.stdout:7/398: write d0/df/d17/f2d [2282804,73020] 0 2026-03-09T00:03:43.350 INFO:tasks.workunit.client.1.vm06.stdout:7/399: fdatasync d0/df/d1a/d27/f4b 0 2026-03-09T00:03:43.350 INFO:tasks.workunit.client.1.vm06.stdout:7/400: write d0/df/f5c [117826,79383] 0 2026-03-09T00:03:43.351 INFO:tasks.workunit.client.0.vm03.stdout:0/194: dwrite d2/da/d1a/f1c [0,4194304] 0 2026-03-09T00:03:43.351 INFO:tasks.workunit.client.0.vm03.stdout:0/195: creat d2/d1f/f3f x:0 0 0 2026-03-09T00:03:43.352 INFO:tasks.workunit.client.1.vm06.stdout:8/384: getdents db 0 2026-03-09T00:03:43.356 INFO:tasks.workunit.client.1.vm06.stdout:3/446: dread d11/d28/d2e/d2f/d36/f75 [0,4194304] 0 2026-03-09T00:03:43.356 INFO:tasks.workunit.client.1.vm06.stdout:9/333: rmdir d1 39 2026-03-09T00:03:43.361 INFO:tasks.workunit.client.1.vm06.stdout:3/447: read d11/f1a [1611774,91244] 0 2026-03-09T00:03:43.364 INFO:tasks.workunit.client.1.vm06.stdout:3/448: creat d11/d28/d2e/d2f/f92 x:0 0 0 2026-03-09T00:03:43.364 INFO:tasks.workunit.client.1.vm06.stdout:9/334: rename d1/d3/d12/f68 to d1/d3/d4f/d52/f6b 0 2026-03-09T00:03:43.364 INFO:tasks.workunit.client.1.vm06.stdout:9/335: chown d1/d4/d2f/f43 18498900 1 2026-03-09T00:03:43.365 INFO:tasks.workunit.client.1.vm06.stdout:3/449: mknod d11/d28/d2e/d2f/d36/c93 0 2026-03-09T00:03:43.368 INFO:tasks.workunit.client.1.vm06.stdout:7/401: write d0/df/d1a/d27/d4c/f32 [1338983,46067] 0 2026-03-09T00:03:43.368 INFO:tasks.workunit.client.1.vm06.stdout:7/402: read - d0/d39/f3e zero size 2026-03-09T00:03:43.371 INFO:tasks.workunit.client.1.vm06.stdout:3/450: mkdir d11/d28/d2e/d2f/d5b/d94 0 2026-03-09T00:03:43.375 INFO:tasks.workunit.client.0.vm03.stdout:1/232: rmdir d4/d3a/d3d/d46 39 2026-03-09T00:03:43.376 INFO:tasks.workunit.client.0.vm03.stdout:1/233: getdents d4/d15/d1a 0 2026-03-09T00:03:43.376 INFO:tasks.workunit.client.0.vm03.stdout:1/234: creat d4/d15/f4e x:0 0 0 2026-03-09T00:03:43.376 INFO:tasks.workunit.client.1.vm06.stdout:3/451: symlink d11/d3f/d8d/l95 0 2026-03-09T00:03:43.378 INFO:tasks.workunit.client.1.vm06.stdout:9/336: dread d1/d4/ff [0,4194304] 0 2026-03-09T00:03:43.386 INFO:tasks.workunit.client.1.vm06.stdout:3/452: creat d11/d3f/f96 x:0 0 0 2026-03-09T00:03:43.391 INFO:tasks.workunit.client.1.vm06.stdout:9/337: unlink d1/d3/d12/d21/d9/c17 0 2026-03-09T00:03:43.393 INFO:tasks.workunit.client.1.vm06.stdout:3/453: symlink d11/d28/l97 0 2026-03-09T00:03:43.397 INFO:tasks.workunit.client.1.vm06.stdout:9/338: unlink d1/d3/d50/f53 0 2026-03-09T00:03:43.397 INFO:tasks.workunit.client.1.vm06.stdout:9/339: fdatasync d1/d3/d12/d21/d14/d25/f4a 0 2026-03-09T00:03:43.397 INFO:tasks.workunit.client.1.vm06.stdout:9/340: readlink d1/d4/l36 0 2026-03-09T00:03:43.404 INFO:tasks.workunit.client.0.vm03.stdout:1/235: dread d4/f42 [0,4194304] 0 2026-03-09T00:03:43.405 INFO:tasks.workunit.client.0.vm03.stdout:7/226: dwrite d2/d4/f34 [0,4194304] 0 2026-03-09T00:03:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:42 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:42 vm06.local ceph-mon[58395]: pgmap v6: 65 pgs: 65 active+clean; 1.1 GiB data, 4.5 GiB used, 115 GiB / 120 GiB avail 2026-03-09T00:03:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:42 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 
2026-03-09T00:03:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:42 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:42 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch
2026-03-09T00:03:43.426 INFO:tasks.workunit.client.1.vm06.stdout:2/494: sync
2026-03-09T00:03:43.432 INFO:tasks.workunit.client.1.vm06.stdout:9/341: dread d1/d4/f24 [0,4194304] 0
2026-03-09T00:03:43.441 INFO:tasks.workunit.client.1.vm06.stdout:6/402: dwrite d4/fb [0,4194304] 0
2026-03-09T00:03:43.442 INFO:tasks.workunit.client.0.vm03.stdout:1/236: truncate f0 3968 0
2026-03-09T00:03:43.442 INFO:tasks.workunit.client.1.vm06.stdout:8/385: write db/d53/d70/d38/d4d/f65 [2898863,68800] 0
2026-03-09T00:03:43.442 INFO:tasks.workunit.client.1.vm06.stdout:8/386: fsync db/d53/d70/f54 0
2026-03-09T00:03:43.444 INFO:tasks.workunit.client.1.vm06.stdout:2/495: creat d7/d1a/d39/f91 x:0 0 0
2026-03-09T00:03:43.444 INFO:tasks.workunit.client.1.vm06.stdout:2/496: chown f2 237409 1
2026-03-09T00:03:43.444 INFO:tasks.workunit.client.1.vm06.stdout:2/497: readlink d7/d1a/l72 0
2026-03-09T00:03:43.444 INFO:tasks.workunit.client.1.vm06.stdout:2/498: read - d7/d1a/d3c/f4d zero size
2026-03-09T00:03:43.458 INFO:tasks.workunit.client.1.vm06.stdout:2/499: dread d7/f3a [0,4194304] 0
2026-03-09T00:03:43.458 INFO:tasks.workunit.client.0.vm03.stdout:6/197: sync
2026-03-09T00:03:43.458 INFO:tasks.workunit.client.0.vm03.stdout:5/176: sync
2026-03-09T00:03:43.458 INFO:tasks.workunit.client.0.vm03.stdout:2/156: sync
2026-03-09T00:03:43.458 INFO:tasks.workunit.client.0.vm03.stdout:5/177: chown d1c/d20/c28 24554 1
2026-03-09T00:03:43.458 INFO:tasks.workunit.client.0.vm03.stdout:5/178: stat d1c/d20 0
2026-03-09T00:03:43.458 INFO:tasks.workunit.client.0.vm03.stdout:2/157: chown d8/d1b/f22 1757 1
2026-03-09T00:03:43.459 INFO:tasks.workunit.client.0.vm03.stdout:4/224: write d7/f15 [1877581,82390] 0
2026-03-09T00:03:43.459 INFO:tasks.workunit.client.0.vm03.stdout:4/225: rename d7 to d7/d20/d29/d45 22
2026-03-09T00:03:43.459 INFO:tasks.workunit.client.0.vm03.stdout:9/171: sync
2026-03-09T00:03:43.459 INFO:tasks.workunit.client.0.vm03.stdout:8/184: sync
2026-03-09T00:03:43.461 INFO:tasks.workunit.client.0.vm03.stdout:2/158: write d8/d17/f1d [211920,95259] 0
2026-03-09T00:03:43.461 INFO:tasks.workunit.client.0.vm03.stdout:2/159: write d8/f11 [1397127,13737] 0
2026-03-09T00:03:43.466 INFO:tasks.workunit.client.0.vm03.stdout:1/237: write d4/d3a/f26 [3188492,98937] 0
2026-03-09T00:03:43.466 INFO:tasks.workunit.client.0.vm03.stdout:1/238: write d4/d3a/f2c [1087015,117290] 0
2026-03-09T00:03:43.466 INFO:tasks.workunit.client.1.vm06.stdout:1/375: write f0 [1813405,107455] 0
2026-03-09T00:03:43.466 INFO:tasks.workunit.client.1.vm06.stdout:1/376: fsync d6/d21/d2d/f6c 0
2026-03-09T00:03:43.473 INFO:tasks.workunit.client.0.vm03.stdout:4/226: dread d7/f28 [0,4194304] 0
2026-03-09T00:03:43.478 INFO:tasks.workunit.client.1.vm06.stdout:2/500: dread d7/d1a/d56/f50 [0,4194304] 0
2026-03-09T00:03:43.478 INFO:tasks.workunit.client.1.vm06.stdout:2/501: write f6 [56105,87319] 0
2026-03-09T00:03:43.483 INFO:tasks.workunit.client.1.vm06.stdout:7/403: dwrite d0/df/d1a/f25 [0,4194304] 0
2026-03-09T00:03:43.493 INFO:tasks.workunit.client.0.vm03.stdout:6/198: unlink d13/f2f 0
2026-03-09T00:03:43.493 INFO:tasks.workunit.client.1.vm06.stdout:1/377: dread d6/f28 [0,4194304] 0
2026-03-09T00:03:43.493 INFO:tasks.workunit.client.1.vm06.stdout:1/378: fsync d6/d4c/d79/f5c 0
2026-03-09T00:03:43.496 INFO:tasks.workunit.client.1.vm06.stdout:3/454: dwrite d11/d28/d2e/f38 [4194304,4194304] 0
2026-03-09T00:03:43.497 INFO:tasks.workunit.client.0.vm03.stdout:5/179: creat d1c/d20/d2a/f42 x:0 0 0
2026-03-09T00:03:43.502 INFO:tasks.workunit.client.1.vm06.stdout:4/325: dwrite d17/d24/f2c [4194304,4194304] 0
2026-03-09T00:03:43.514 INFO:tasks.workunit.client.0.vm03.stdout:9/172: readlink d15/d1c/d28/l2b 0
2026-03-09T00:03:43.515 INFO:tasks.workunit.client.0.vm03.stdout:9/173: write f8 [2230317,48510] 0
2026-03-09T00:03:43.518 INFO:tasks.workunit.client.1.vm06.stdout:1/379: rmdir d6/d4c/d71 39
2026-03-09T00:03:43.519 INFO:tasks.workunit.client.0.vm03.stdout:8/185: unlink d7/df/d1e/l1f 0
2026-03-09T00:03:43.523 INFO:tasks.workunit.client.1.vm06.stdout:3/455: mknod d11/d28/d2e/d2f/d5b/d5f/c98 0
2026-03-09T00:03:43.523 INFO:tasks.workunit.client.1.vm06.stdout:3/456: dread - d11/d28/d2e/d2f/f92 zero size
2026-03-09T00:03:43.523 INFO:tasks.workunit.client.1.vm06.stdout:3/457: chown d11/d28/d2e/d2f/l70 69290 1
2026-03-09T00:03:43.523 INFO:tasks.workunit.client.1.vm06.stdout:3/458: write d11/d28/d57/f7b [795761,76770] 0
2026-03-09T00:03:43.535 INFO:tasks.workunit.client.0.vm03.stdout:2/160: creat d8/d17/f34 x:0 0 0
2026-03-09T00:03:43.539 INFO:tasks.workunit.client.1.vm06.stdout:1/380: chown d6/l10 5751494 1
2026-03-09T00:03:43.548 INFO:tasks.workunit.client.0.vm03.stdout:1/239: unlink d4/d3a/f28 0
2026-03-09T00:03:43.549 INFO:tasks.workunit.client.0.vm03.stdout:0/196: dwrite d2/fe [0,4194304] 0
2026-03-09T00:03:43.549 INFO:tasks.workunit.client.0.vm03.stdout:4/227: symlink d7/d20/d29/d38/d3a/l46 0
2026-03-09T00:03:43.551 INFO:tasks.workunit.client.1.vm06.stdout:6/403: dwrite d4/f68 [0,4194304] 0
2026-03-09T00:03:43.551 INFO:tasks.workunit.client.1.vm06.stdout:6/404: write d4/ff [3113007,53138] 0
2026-03-09T00:03:43.552 INFO:tasks.workunit.client.0.vm03.stdout:4/228: dread d7/d20/f34 [0,4194304] 0
2026-03-09T00:03:43.552 INFO:tasks.workunit.client.0.vm03.stdout:4/229: chown d7/d20/d29/c2d 2386835 1
2026-03-09T00:03:43.554 INFO:tasks.workunit.client.1.vm06.stdout:1/381: truncate d6/d21/d2d/f6c 1691496 0
2026-03-09T00:03:43.563 INFO:tasks.workunit.client.0.vm03.stdout:5/180: mkdir d1c/d20/d2a/d43 0
2026-03-09T00:03:43.565 INFO:tasks.workunit.client.0.vm03.stdout:5/181: read f11 [750466,5504] 0
2026-03-09T00:03:43.565 INFO:tasks.workunit.client.0.vm03.stdout:5/182: chown d1c/d20/d2a/f3d 9 1
2026-03-09T00:03:43.569 INFO:tasks.workunit.client.0.vm03.stdout:4/230: dread d7/d20/f21 [0,4194304] 0
2026-03-09T00:03:43.575 INFO:tasks.workunit.client.0.vm03.stdout:9/174: unlink cf 0
2026-03-09T00:03:43.586 INFO:tasks.workunit.client.1.vm06.stdout:3/459: dwrite d11/f48 [0,4194304] 0
2026-03-09T00:03:43.586 INFO:tasks.workunit.client.1.vm06.stdout:3/460: creat d11/d28/d2e/d2f/f99 x:0 0 0
2026-03-09T00:03:43.586 INFO:tasks.workunit.client.1.vm06.stdout:3/461: write d11/d28/f42 [3044959,737] 0
2026-03-09T00:03:43.587 INFO:tasks.workunit.client.0.vm03.stdout:8/186: symlink d7/l39 0
2026-03-09T00:03:43.587 INFO:tasks.workunit.client.0.vm03.stdout:8/187: chown d7/df/l17 71361 1
2026-03-09T00:03:43.602 INFO:tasks.workunit.client.1.vm06.stdout:7/404: dwrite d0/df/d1a/f44 [4194304,4194304] 0
2026-03-09T00:03:43.606 INFO:tasks.workunit.client.0.vm03.stdout:2/161: creat d8/d1b/d2a/d2e/f35 x:0 0 0
2026-03-09T00:03:43.606 INFO:tasks.workunit.client.0.vm03.stdout:2/162: chown d8/d17/f34 699 1
2026-03-09T00:03:43.606 INFO:tasks.workunit.client.1.vm06.stdout:4/326: dwrite d17/f1d [0,4194304] 0
2026-03-09T00:03:43.609 INFO:tasks.workunit.client.1.vm06.stdout:7/405: dread d0/d39/f56 [0,4194304] 0
2026-03-09T00:03:43.609 INFO:tasks.workunit.client.1.vm06.stdout:7/406: fdatasync d0/df/f5c 0
2026-03-09T00:03:43.609 INFO:tasks.workunit.client.0.vm03.stdout:1/240: creat d4/d3a/d32/f4f x:0 0 0
2026-03-09T00:03:43.609 INFO:tasks.workunit.client.0.vm03.stdout:1/241: fdatasync d4/d3a/f48 0
2026-03-09T00:03:43.620 INFO:tasks.workunit.client.1.vm06.stdout:2/502: dwrite d7/d1a/d25/d66/f8d [0,4194304] 0
2026-03-09T00:03:43.621 INFO:tasks.workunit.client.1.vm06.stdout:5/529: sync
2026-03-09T00:03:43.621 INFO:tasks.workunit.client.1.vm06.stdout:5/530: fsync d5/d44/f81 0
2026-03-09T00:03:43.622 INFO:tasks.workunit.client.1.vm06.stdout:0/416: sync
2026-03-09T00:03:43.630 INFO:tasks.workunit.client.1.vm06.stdout:9/342: write d1/d4/f44 [86962,24214] 0
2026-03-09T00:03:43.641 INFO:tasks.workunit.client.1.vm06.stdout:6/405: truncate d4/d27/d3e/f41 3945471 0
2026-03-09T00:03:43.644 INFO:tasks.workunit.client.1.vm06.stdout:1/382: rename d6/f7c to d6/d21/d2d/d3b/d42/f7d 0
2026-03-09T00:03:43.656 INFO:tasks.workunit.client.0.vm03.stdout:0/197: symlink d2/da/d36/d39/l40 0
2026-03-09T00:03:43.656 INFO:tasks.workunit.client.0.vm03.stdout:0/198: creat d2/da/d36/d39/f41 x:0 0 0
2026-03-09T00:03:43.657 INFO:tasks.workunit.client.0.vm03.stdout:8/188: dwrite d7/df/f32 [0,4194304] 0
2026-03-09T00:03:43.658 INFO:tasks.workunit.client.0.vm03.stdout:6/199: rename d13/d35/l3c to d13/d1e/l43 0
2026-03-09T00:03:43.663 INFO:tasks.workunit.client.1.vm06.stdout:4/327: symlink d17/d24/d3b/d5e/l6a 0
2026-03-09T00:03:43.669 INFO:tasks.workunit.client.0.vm03.stdout:2/163: symlink d8/d1b/d2a/l36 0
2026-03-09T00:03:43.669 INFO:tasks.workunit.client.0.vm03.stdout:1/242: symlink d4/d3a/d3d/d46/l50 0
2026-03-09T00:03:43.669 INFO:tasks.workunit.client.0.vm03.stdout:1/243: chown d4/fb 105037884 1
2026-03-09T00:03:43.676 INFO:tasks.workunit.client.1.vm06.stdout:4/328: dread d17/d24/f2c [4194304,4194304] 0
2026-03-09T00:03:43.680 INFO:tasks.workunit.client.0.vm03.stdout:0/199: mknod d2/d1f/c42 0
2026-03-09T00:03:43.690 INFO:tasks.workunit.client.0.vm03.stdout:0/200: dread - d2/da/f2d zero size
2026-03-09T00:03:43.690 INFO:tasks.workunit.client.0.vm03.stdout:0/201: chown d2/da/f2d 16009605 1
2026-03-09T00:03:43.693 INFO:tasks.workunit.client.1.vm06.stdout:5/531: dwrite d5/f8e [0,4194304] 0
2026-03-09T00:03:43.697 INFO:tasks.workunit.client.1.vm06.stdout:2/503: link d7/f5d d7/da/d1c/f92 0
2026-03-09T00:03:43.703 INFO:tasks.workunit.client.1.vm06.stdout:2/504: dread d7/da/f18 [0,4194304] 0
2026-03-09T00:03:43.715 INFO:tasks.workunit.client.0.vm03.stdout:8/189: unlink d7/df/f32 0
2026-03-09T00:03:43.715 INFO:tasks.workunit.client.0.vm03.stdout:8/190: read - d7/df/f31 zero size
2026-03-09T00:03:43.715 INFO:tasks.workunit.client.0.vm03.stdout:8/191: creat d7/df/d1e/f3a x:0 0 0
2026-03-09T00:03:43.715 INFO:tasks.workunit.client.0.vm03.stdout:8/192: creat d7/df/d1a/f3b x:0 0 0
2026-03-09T00:03:43.715 INFO:tasks.workunit.client.1.vm06.stdout:6/406: truncate d4/f2a 1617099 0
2026-03-09T00:03:43.715 INFO:tasks.workunit.client.1.vm06.stdout:1/383: creat d6/d21/d2d/d3b/f7e x:0 0 0
2026-03-09T00:03:43.715 INFO:tasks.workunit.client.1.vm06.stdout:3/462: rename d11/d28/d2e/d2f/d5b/f84 to d11/d28/d2e/d7e/d83/f9a 0
2026-03-09T00:03:43.717 INFO:tasks.workunit.client.1.vm06.stdout:9/343: rename d1 to d1/d3/d12/d21/d6c 22
2026-03-09T00:03:43.725 INFO:tasks.workunit.client.0.vm03.stdout:4/231: truncate d7/d20/f21 1359284 0
2026-03-09T00:03:43.730 INFO:tasks.workunit.client.1.vm06.stdout:0/417: dwrite d3/d18/d1f/d39/d49/f64 [0,4194304] 0
2026-03-09T00:03:43.730 INFO:tasks.workunit.client.1.vm06.stdout:7/407: truncate d0/df/d1a/f25 643903 0
2026-03-09T00:03:43.743 INFO:tasks.workunit.client.0.vm03.stdout:2/164: mknod d8/d17/c37 0
2026-03-09T00:03:43.750 INFO:tasks.workunit.client.0.vm03.stdout:3/140: sync
2026-03-09T00:03:43.759 INFO:tasks.workunit.client.1.vm06.stdout:4/329: creat d17/d24/d49/d5f/f6b x:0 0 0
2026-03-09T00:03:43.762 INFO:tasks.workunit.client.0.vm03.stdout:8/193: creat d7/f3c x:0 0 0
2026-03-09T00:03:43.775 INFO:tasks.workunit.client.0.vm03.stdout:8/194: dread - d7/df/f29 zero size
2026-03-09T00:03:43.775 INFO:tasks.workunit.client.0.vm03.stdout:5/183: rename d1c/d20/d2a to d1c/d20/d44 0
2026-03-09T00:03:43.775 INFO:tasks.workunit.client.0.vm03.stdout:5/184: chown d1c/d20/d44/f3d 14529 1
2026-03-09T00:03:43.775 INFO:tasks.workunit.client.0.vm03.stdout:2/165: unlink d8/f18 0
2026-03-09T00:03:43.784 INFO:tasks.workunit.client.0.vm03.stdout:5/185: write ff [2321049,78766] 0
2026-03-09T00:03:43.784 INFO:tasks.workunit.client.0.vm03.stdout:8/195: read d7/f10 [3364612,19114] 0
2026-03-09T00:03:43.789 INFO:tasks.workunit.client.1.vm06.stdout:6/407: rmdir d4/d16/d46/d77 0
2026-03-09T00:03:43.790 INFO:tasks.workunit.client.0.vm03.stdout:3/141: mknod d2/c2c 0
2026-03-09T00:03:43.792 INFO:tasks.workunit.client.0.vm03.stdout:2/166: rmdir d8/d1b/d24 39
2026-03-09T00:03:43.798 INFO:tasks.workunit.client.1.vm06.stdout:7/408: symlink d0/df/d1a/d3a/l71 0
2026-03-09T00:03:43.808 INFO:tasks.workunit.client.1.vm06.stdout:7/409: stat d0/df/d1a/d22/f28 0
2026-03-09T00:03:43.809 INFO:tasks.workunit.client.1.vm06.stdout:4/330: creat d17/d24/d3b/d5e/f6c x:0 0 0
2026-03-09T00:03:43.809 INFO:tasks.workunit.client.0.vm03.stdout:3/142: mkdir d2/db/d2d 0
2026-03-09T00:03:43.809 INFO:tasks.workunit.client.0.vm03.stdout:2/167: creat d8/d1b/d24/f38 x:0 0 0
2026-03-09T00:03:43.811 INFO:tasks.workunit.client.0.vm03.stdout:1/244: dwrite d4/fb [4194304,4194304] 0
2026-03-09T00:03:43.811 INFO:tasks.workunit.client.0.vm03.stdout:6/200: dwrite d13/d1e/f2d [0,4194304] 0
2026-03-09T00:03:43.815 INFO:tasks.workunit.client.0.vm03.stdout:2/168: write d8/d1b/f1f [2625544,113147] 0
2026-03-09T00:03:43.815 INFO:tasks.workunit.client.0.vm03.stdout:2/169: stat d8 0
2026-03-09T00:03:43.815 INFO:tasks.workunit.client.0.vm03.stdout:2/170: dread - d8/d1b/f22 zero size
2026-03-09T00:03:43.818 INFO:tasks.workunit.client.0.vm03.stdout:4/232: write d7/d27/f31 [1453732,22492] 0
2026-03-09T00:03:43.829 INFO:tasks.workunit.client.1.vm06.stdout:6/408: creat d4/d27/d3e/f7a x:0 0 0
2026-03-09T00:03:43.833 INFO:tasks.workunit.client.0.vm03.stdout:3/143: rename d2/l23 to d2/db/d2d/l2e 0
2026-03-09T00:03:43.833 INFO:tasks.workunit.client.0.vm03.stdout:3/144: chown d2/db/f17 138690979 1
2026-03-09T00:03:43.833 INFO:tasks.workunit.client.1.vm06.stdout:7/410: creat d0/df/d1a/f72 x:0 0 0
2026-03-09T00:03:43.836 INFO:tasks.workunit.client.1.vm06.stdout:4/331: link fe d17/d24/d3b/d5e/f6d 0
2026-03-09T00:03:43.846 INFO:tasks.workunit.client.0.vm03.stdout:3/145: stat d2/c3 0
2026-03-09T00:03:43.846 INFO:tasks.workunit.client.0.vm03.stdout:6/201: mkdir d13/d1e/d44 0
2026-03-09T00:03:43.846 INFO:tasks.workunit.client.0.vm03.stdout:3/146: creat d2/db/d2d/f2f x:0 0 0
2026-03-09T00:03:43.846 INFO:tasks.workunit.client.0.vm03.stdout:3/147: readlink d2/db/l1e 0
2026-03-09T00:03:43.846 INFO:tasks.workunit.client.0.vm03.stdout:3/148: truncate f1 4025422 0
2026-03-09T00:03:43.846 INFO:tasks.workunit.client.0.vm03.stdout:3/149: link d2/db/f24 d2/f30 0
2026-03-09T00:03:43.846 INFO:tasks.workunit.client.0.vm03.stdout:3/150: fsync d2/db/f13 0
2026-03-09T00:03:43.846 INFO:tasks.workunit.client.1.vm06.stdout:7/411: dread d0/df/d1a/f50 [0,4194304] 0
2026-03-09T00:03:43.846 INFO:tasks.workunit.client.1.vm06.stdout:4/332: read f15 [278912,102608] 0
2026-03-09T00:03:43.847 INFO:tasks.workunit.client.1.vm06.stdout:6/409: unlink d4/f2d 0
2026-03-09T00:03:43.847 INFO:tasks.workunit.client.1.vm06.stdout:7/412: rename d0/df/f5c to d0/df/d1a/d3a/d4e/d5e/f73 0
2026-03-09T00:03:43.847 INFO:tasks.workunit.client.1.vm06.stdout:7/413: write d0/df/d1a/d3a/f3c [3633286,88963] 0
2026-03-09T00:03:43.847 INFO:tasks.workunit.client.1.vm06.stdout:4/333: truncate f14 1273825 0
2026-03-09T00:03:43.847 INFO:tasks.workunit.client.1.vm06.stdout:4/334: write d17/d24/f5c [987575,16082] 0
2026-03-09T00:03:43.847 INFO:tasks.workunit.client.1.vm06.stdout:6/410: symlink d4/d27/d3e/d45/l7b 0
2026-03-09T00:03:43.848 INFO:tasks.workunit.client.1.vm06.stdout:7/414: creat d0/df/d17/f74 x:0 0 0
2026-03-09T00:03:43.849 INFO:tasks.workunit.client.0.vm03.stdout:3/151: mknod d2/c31 0
2026-03-09T00:03:43.850 INFO:tasks.workunit.client.1.vm06.stdout:4/335: mkdir d17/d24/d3b/d5e/d6e 0
2026-03-09T00:03:43.853 INFO:tasks.workunit.client.1.vm06.stdout:4/336: readlink d17/l59 0
2026-03-09T00:03:43.856 INFO:tasks.workunit.client.1.vm06.stdout:6/411: unlink d4/d27/c6f 0
2026-03-09T00:03:43.859 INFO:tasks.workunit.client.1.vm06.stdout:7/415: mknod d0/df/d1a/d27/d70/c75 0
2026-03-09T00:03:43.862 INFO:tasks.workunit.client.1.vm06.stdout:6/412: mknod d4/d66/c7c 0
2026-03-09T00:03:43.880 INFO:tasks.workunit.client.1.vm06.stdout:7/416: symlink d0/df/d1a/d27/d4c/d40/d51/l76 0
2026-03-09T00:03:43.880 INFO:tasks.workunit.client.1.vm06.stdout:7/417: read - d0/df/d1a/d27/d4c/f6d zero size
2026-03-09T00:03:43.884 INFO:tasks.workunit.client.0.vm03.stdout:1/245: dread d4/d15/d1a/f1b [0,4194304] 0
2026-03-09T00:03:43.884 INFO:tasks.workunit.client.1.vm06.stdout:3/463: dwrite d11/d28/d2e/f32 [0,4194304] 0
2026-03-09T00:03:43.885 INFO:tasks.workunit.client.1.vm06.stdout:1/384: dwrite d6/d21/d2d/f5b [0,4194304] 0
2026-03-09T00:03:43.893 INFO:tasks.workunit.client.0.vm03.stdout:7/227: sync
2026-03-09T00:03:43.893 INFO:tasks.workunit.client.0.vm03.stdout:7/228: chown d2/d1f/d3a/c2d 498716506 1
2026-03-09T00:03:43.893 INFO:tasks.workunit.client.0.vm03.stdout:7/229: stat d2/d1f/d3a 0
2026-03-09T00:03:43.893 INFO:tasks.workunit.client.0.vm03.stdout:6/202: getdents d13/d35 0
2026-03-09T00:03:43.894 INFO:tasks.workunit.client.0.vm03.stdout:1/246: getdents d4/d6 0
2026-03-09T00:03:43.901 INFO:tasks.workunit.client.0.vm03.stdout:7/230: creat d2/d1f/d42/d43/f49 x:0 0 0
2026-03-09T00:03:43.901 INFO:tasks.workunit.client.0.vm03.stdout:7/231: write d2/d4/fb [622028,29696] 0
2026-03-09T00:03:43.901 INFO:tasks.workunit.client.0.vm03.stdout:7/232: creat d2/d1f/d42/d43/f4a x:0 0 0
2026-03-09T00:03:43.901 INFO:tasks.workunit.client.0.vm03.stdout:7/233: read - d2/d4/f2e zero size
2026-03-09T00:03:43.905 INFO:tasks.workunit.client.0.vm03.stdout:1/247: mknod d4/d15/c51 0
2026-03-09T00:03:43.906 INFO:tasks.workunit.client.1.vm06.stdout:3/464: mkdir d11/d28/d4d/d9b 0
2026-03-09T00:03:43.919 INFO:tasks.workunit.client.0.vm03.stdout:2/171: dwrite d8/d17/f34 [0,4194304] 0
2026-03-09T00:03:43.919 INFO:tasks.workunit.client.0.vm03.stdout:2/172: chown d8/d17/f2c 1538626764 1
2026-03-09T00:03:43.919 INFO:tasks.workunit.client.0.vm03.stdout:2/173: write d8/d1b/f1e [2787705,67215] 0
2026-03-09T00:03:43.920 INFO:tasks.workunit.client.1.vm06.stdout:0/418: dwrite d3/d18/d28/f70 [0,4194304] 0
2026-03-09T00:03:43.921 INFO:tasks.workunit.client.0.vm03.stdout:0/202: dwrite d2/da/dd/f24 [0,4194304] 0
2026-03-09T00:03:43.921 INFO:tasks.workunit.client.0.vm03.stdout:7/234: creat d2/d1f/d3a/d31/d37/d39/f4b x:0 0 0
2026-03-09T00:03:43.921 INFO:tasks.workunit.client.0.vm03.stdout:7/235: chown d2/d1f 3537300 1
2026-03-09T00:03:43.923 INFO:tasks.workunit.client.0.vm03.stdout:0/203: dread d2/da/f1b [0,4194304] 0
2026-03-09T00:03:43.923 INFO:tasks.workunit.client.0.vm03.stdout:0/204: creat d2/d1f/f43 x:0 0 0
2026-03-09T00:03:43.923 INFO:tasks.workunit.client.0.vm03.stdout:0/205: fdatasync d2/da/f2d 0
2026-03-09T00:03:43.924 INFO:tasks.workunit.client.0.vm03.stdout:6/203: mknod d13/d1e/c45 0
2026-03-09T00:03:43.925 INFO:tasks.workunit.client.1.vm06.stdout:1/385: write d6/d21/f2e [815592,121005] 0
2026-03-09T00:03:43.925 INFO:tasks.workunit.client.1.vm06.stdout:1/386: write d6/f34 [750172,103985] 0
2026-03-09T00:03:43.925 INFO:tasks.workunit.client.1.vm06.stdout:5/532: dwrite d5/d1c/d21/d28/f59 [0,4194304] 0
2026-03-09T00:03:43.925 INFO:tasks.workunit.client.1.vm06.stdout:5/533: write d5/d1c/d23/d34/d47/f87 [427980,50460] 0
2026-03-09T00:03:43.926 INFO:tasks.workunit.client.1.vm06.stdout:1/387: dread d6/d21/f7b [0,4194304] 0
2026-03-09T00:03:43.927 INFO:tasks.workunit.client.0.vm03.stdout:1/248: stat d4/d15/l22 0
2026-03-09T00:03:43.927 INFO:tasks.workunit.client.1.vm06.stdout:1/388: write d6/d21/f69 [240338,30413] 0
2026-03-09T00:03:43.933 INFO:tasks.workunit.client.1.vm06.stdout:5/534: link d5/d1c/d21/d28/c67 d5/d1c/d68/cb6 0
2026-03-09T00:03:43.934 INFO:tasks.workunit.client.1.vm06.stdout:2/505: dwrite d7/f17 [0,4194304] 0
2026-03-09T00:03:43.934 INFO:tasks.workunit.client.1.vm06.stdout:2/506: fsync d7/d1a/d39/f91 0
2026-03-09T00:03:43.936 INFO:tasks.workunit.client.0.vm03.stdout:2/174: rename d8/cc to d8/d1b/d2a/c39 0
2026-03-09T00:03:43.939 INFO:tasks.workunit.client.1.vm06.stdout:1/389: mkdir d6/d4c/d51/d7f 0
2026-03-09T00:03:43.939 INFO:tasks.workunit.client.1.vm06.stdout:1/390: write d6/d21/d2d/f6c [2554064,112044] 0
2026-03-09T00:03:43.942 INFO:tasks.workunit.client.0.vm03.stdout:1/249: dread d4/f9 [0,4194304] 0
2026-03-09T00:03:43.953 INFO:tasks.workunit.client.0.vm03.stdout:1/250: fsync d4/d3a/d32/f4b 0
2026-03-09T00:03:43.954 INFO:tasks.workunit.client.1.vm06.stdout:5/535: mknod d5/d1c/d23/d34/cb7 0
2026-03-09T00:03:43.954 INFO:tasks.workunit.client.1.vm06.stdout:2/507: mkdir d7/da/d93 0
2026-03-09T00:03:43.954 INFO:tasks.workunit.client.0.vm03.stdout:5/186: write fb [1444288,43448] 0
2026-03-09T00:03:43.954 INFO:tasks.workunit.client.0.vm03.stdout:5/187: chown d1c/d20/d44 4557 1
2026-03-09T00:03:43.954 INFO:tasks.workunit.client.0.vm03.stdout:7/236: creat d2/d1f/d3a/d31/d37/f4c x:0 0 0
2026-03-09T00:03:43.954 INFO:tasks.workunit.client.0.vm03.stdout:7/237: creat d2/f4d x:0 0 0
2026-03-09T00:03:43.956 INFO:tasks.workunit.client.0.vm03.stdout:0/206: getdents d2 0
2026-03-09T00:03:43.960 INFO:tasks.workunit.client.1.vm06.stdout:1/391: getdents d6/d21/d2d/d37/d6d 0
2026-03-09T00:03:43.961 INFO:tasks.workunit.client.0.vm03.stdout:6/204: symlink d13/d1e/d44/l46 0
2026-03-09T00:03:43.962 INFO:tasks.workunit.client.0.vm03.stdout:7/238: dread d2/fc [0,4194304] 0
2026-03-09T00:03:43.964 INFO:tasks.workunit.client.0.vm03.stdout:3/152: dwrite d2/db/f28 [0,4194304] 0
2026-03-09T00:03:43.964 INFO:tasks.workunit.client.0.vm03.stdout:3/153: stat d2/db/f17 0
2026-03-09T00:03:43.964 INFO:tasks.workunit.client.0.vm03.stdout:3/154: stat d2 0
2026-03-09T00:03:43.964 INFO:tasks.workunit.client.1.vm06.stdout:9/344: dwrite d1/d4/f6 [0,4194304] 0
2026-03-09T00:03:43.964 INFO:tasks.workunit.client.1.vm06.stdout:9/345: chown d1/d3 3 1
2026-03-09T00:03:43.964 INFO:tasks.workunit.client.1.vm06.stdout:9/346: write d1/d3/f5c [1661282,28720] 0
2026-03-09T00:03:43.967 INFO:tasks.workunit.client.1.vm06.stdout:5/536: mknod d5/cb8 0
2026-03-09T00:03:43.969 INFO:tasks.workunit.client.0.vm03.stdout:2/175: mknod d8/d1b/d2a/c3a 0
2026-03-09T00:03:43.969 INFO:tasks.workunit.client.0.vm03.stdout:2/176: readlink d8/le 0
2026-03-09T00:03:43.974 INFO:tasks.workunit.client.1.vm06.stdout:2/508: symlink d7/d1a/d25/d66/d87/l94 0
2026-03-09T00:03:43.978 INFO:tasks.workunit.client.1.vm06.stdout:5/537: write d5/fae [1452153,101316] 0
2026-03-09T00:03:43.979 INFO:tasks.workunit.client.1.vm06.stdout:2/509: dread d7/da/db/de/f53 [0,4194304] 0
2026-03-09T00:03:43.980 INFO:tasks.workunit.client.1.vm06.stdout:5/538: write d5/d1c/d21/d28/d5e/f69 [182035,74206] 0
2026-03-09T00:03:43.984 INFO:tasks.workunit.client.1.vm06.stdout:9/347: truncate d1/f16 1706515 0
2026-03-09T00:03:43.993 INFO:tasks.workunit.client.1.vm06.stdout:6/413: dwrite d4/d27/d42/d52/f6c [0,4194304] 0
2026-03-09T00:03:43.994 INFO:tasks.workunit.client.1.vm06.stdout:2/510: link d7/da/l15 d7/d1b/d5a/d86/l95 0
2026-03-09T00:03:43.995 INFO:tasks.workunit.client.1.vm06.stdout:2/511: chown d7/da/db/de/f60 53541 1
2026-03-09T00:03:43.995 INFO:tasks.workunit.client.1.vm06.stdout:2/512: write d7/d1b/d5a/d86/f8b [916640,78608] 0
2026-03-09T00:03:43.995 INFO:tasks.workunit.client.1.vm06.stdout:2/513: fdatasync d7/d1a/d56/f50 0
2026-03-09T00:03:43.995 INFO:tasks.workunit.client.1.vm06.stdout:0/419: dwrite d3/d18/d2c/f4d [0,4194304] 0
2026-03-09T00:03:43.996 INFO:tasks.workunit.client.0.vm03.stdout:1/251: mkdir d4/d6/d52 0
2026-03-09T00:03:44.000 INFO:tasks.workunit.client.1.vm06.stdout:2/514: write d7/d1a/d25/d66/f8d [1839837,58555] 0
2026-03-09T00:03:44.004 INFO:tasks.workunit.client.1.vm06.stdout:2/515: fdatasync d7/d1a/d56/f50 0
2026-03-09T00:03:44.004 INFO:tasks.workunit.client.0.vm03.stdout:4/233: write d7/d20/f21 [705434,78422] 0
2026-03-09T00:03:44.004 INFO:tasks.workunit.client.1.vm06.stdout:6/414: dread d4/fc [0,4194304] 0
2026-03-09T00:03:44.005 INFO:tasks.workunit.client.1.vm06.stdout:6/415: write d4/f40 [5500315,55305] 0
2026-03-09T00:03:44.005 INFO:tasks.workunit.client.1.vm06.stdout:8/387: sync
2026-03-09T00:03:44.005 INFO:tasks.workunit.client.1.vm06.stdout:4/337: fsync d17/d24/d49/d5f/f6b 0
2026-03-09T00:03:44.005 INFO:tasks.workunit.client.1.vm06.stdout:4/338: readlink d17/d24/d49/l46 0
2026-03-09T00:03:44.009 INFO:tasks.workunit.client.1.vm06.stdout:4/339: dread d17/f20 [0,4194304] 0
2026-03-09T00:03:44.012 INFO:tasks.workunit.client.0.vm03.stdout:9/175: sync
2026-03-09T00:03:44.012 INFO:tasks.workunit.client.0.vm03.stdout:9/176: stat fd 0
2026-03-09T00:03:44.021 INFO:tasks.workunit.client.1.vm06.stdout:9/348: creat d1/d3/d2b/f6d x:0 0 0
2026-03-09T00:03:44.021 INFO:tasks.workunit.client.1.vm06.stdout:9/349: fdatasync d1/d4/f39 0
2026-03-09T00:03:44.023 INFO:tasks.workunit.client.0.vm03.stdout:5/188: unlink f11 0
2026-03-09T00:03:44.025 INFO:tasks.workunit.client.0.vm03.stdout:9/177: dread f8 [0,4194304] 0
2026-03-09T00:03:44.031 INFO:tasks.workunit.client.0.vm03.stdout:0/207: creat d2/da/f44 x:0 0 0
2026-03-09T00:03:44.032 INFO:tasks.workunit.client.0.vm03.stdout:9/178: dread d15/f17 [0,4194304] 0
2026-03-09T00:03:44.033 INFO:tasks.workunit.client.0.vm03.stdout:6/205: read d13/d1e/f21 [1825371,68987] 0
2026-03-09T00:03:44.033 INFO:tasks.workunit.client.0.vm03.stdout:6/206: write fb [959983,116783] 0
2026-03-09T00:03:44.034 INFO:tasks.workunit.client.0.vm03.stdout:3/155: dwrite d2/db/f14 [0,4194304] 0
2026-03-09T00:03:44.034 INFO:tasks.workunit.client.0.vm03.stdout:3/156: chown d2/db/f1a 13851 1
2026-03-09T00:03:44.035 INFO:tasks.workunit.client.0.vm03.stdout:7/239: mknod d2/d1f/d3a/c4e 0
2026-03-09T00:03:44.036 INFO:tasks.workunit.client.0.vm03.stdout:3/157: dread d2/f1d [0,4194304] 0
2026-03-09T00:03:44.036 INFO:tasks.workunit.client.0.vm03.stdout:3/158: write d2/db/f21 [660251,75018] 0
2026-03-09T00:03:44.041 INFO:tasks.workunit.client.0.vm03.stdout:7/240: dread d2/f3 [0,4194304] 0
2026-03-09T00:03:44.049 INFO:tasks.workunit.client.0.vm03.stdout:7/241: chown d2/d1f/d3a/l27 3460744 1
2026-03-09T00:03:44.050 INFO:tasks.workunit.client.0.vm03.stdout:2/177: symlink d8/d17/l3b 0
2026-03-09T00:03:44.053 INFO:tasks.workunit.client.0.vm03.stdout:1/252: rename d4/f9 to d4/d3a/d32/f53 0
2026-03-09T00:03:44.059 INFO:tasks.workunit.client.0.vm03.stdout:6/207: mknod d13/d35/c47 0
2026-03-09T00:03:44.061 INFO:tasks.workunit.client.1.vm06.stdout:0/420: rename d3/d18/d28/d45/l5c to d3/d18/d2c/d2d/l8a 0
2026-03-09T00:03:44.061 INFO:tasks.workunit.client.0.vm03.stdout:3/159: mknod d2/db/c32 0
2026-03-09T00:03:44.063 INFO:tasks.workunit.client.0.vm03.stdout:7/242: symlink d2/d1f/d42/d46/l4f 0
2026-03-09T00:03:44.073 INFO:tasks.workunit.client.1.vm06.stdout:2/516: mkdir d7/d1a/d96 0
2026-03-09T00:03:44.073 INFO:tasks.workunit.client.1.vm06.stdout:5/539: dwrite d5/f14 [4194304,4194304] 0
2026-03-09T00:03:44.073 INFO:tasks.workunit.client.1.vm06.stdout:5/540: fdatasync d5/d1c/f75 0
2026-03-09T00:03:44.074 INFO:tasks.workunit.client.0.vm03.stdout:0/208: rename d2/c16 to d2/da/d36/d39/c45 0
2026-03-09T00:03:44.076 INFO:tasks.workunit.client.1.vm06.stdout:5/541: write d5/d1c/d68/fb4 [169631,86155] 0
2026-03-09T00:03:44.077 INFO:tasks.workunit.client.1.vm06.stdout:7/418: rmdir d0/df 39
2026-03-09T00:03:44.077 INFO:tasks.workunit.client.0.vm03.stdout:1/253: symlink d4/d3a/d3d/d46/l54 0
2026-03-09T00:03:44.078 INFO:tasks.workunit.client.1.vm06.stdout:6/416: mkdir d4/d27/d42/d52/d7d 0
2026-03-09T00:03:44.078 INFO:tasks.workunit.client.0.vm03.stdout:6/208: creat d13/d1e/f48 x:0 0 0
2026-03-09T00:03:44.087 INFO:tasks.workunit.client.1.vm06.stdout:8/388: unlink db/c1b 0
2026-03-09T00:03:44.088 INFO:tasks.workunit.client.1.vm06.stdout:8/389: write db/d1e/d46/f69 [3306882,41720] 0
2026-03-09T00:03:44.088 INFO:tasks.workunit.client.1.vm06.stdout:8/390: getdents db/d53/d70/d38/d4d/d79 0
2026-03-09T00:03:44.088 INFO:tasks.workunit.client.0.vm03.stdout:3/160: symlink d2/db/l33 0
2026-03-09T00:03:44.088 INFO:tasks.workunit.client.0.vm03.stdout:3/161: write d2/f5 [3042735,111062] 0
2026-03-09T00:03:44.088 INFO:tasks.workunit.client.0.vm03.stdout:3/162: chown d2/db/f10 106508010 1
2026-03-09T00:03:44.095 INFO:tasks.workunit.client.1.vm06.stdout:4/340: link d17/d5b/f64 d17/d24/d3b/d5e/f6f 0
2026-03-09T00:03:44.111 INFO:tasks.workunit.client.1.vm06.stdout:0/421: rename d3/d18/d28/d45/l76 to d3/d18/d2c/d2d/d74/l8b 0
2026-03-09T00:03:44.111 INFO:tasks.workunit.client.1.vm06.stdout:2/517: stat d7/d1a/d25/l77 0
2026-03-09T00:03:44.111 INFO:tasks.workunit.client.0.vm03.stdout:0/209: link d2/l30 d2/da/d1a/l46 0
2026-03-09T00:03:44.111 INFO:tasks.workunit.client.0.vm03.stdout:3/163: write d2/db/f28 [3781018,15302] 0
2026-03-09T00:03:44.111 INFO:tasks.workunit.client.0.vm03.stdout:1/254: truncate f2 6447740 0
2026-03-09T00:03:44.114 INFO:tasks.workunit.client.1.vm06.stdout:4/341: dread d17/d24/d3b/d5e/f6d [0,4194304] 0
2026-03-09T00:03:44.119 INFO:tasks.workunit.client.1.vm06.stdout:5/542: truncate d5/f43 53998 0
2026-03-09T00:03:44.119 INFO:tasks.workunit.client.1.vm06.stdout:5/543: read d5/d1c/f2d [164477,15155] 0
2026-03-09T00:03:44.119 INFO:tasks.workunit.client.1.vm06.stdout:5/544: chown d5/ff 32 1
2026-03-09T00:03:44.121 INFO:tasks.workunit.client.1.vm06.stdout:7/419: getdents d0/df/d1a/d3a/d4e 0
2026-03-09T00:03:44.128 INFO:tasks.workunit.client.1.vm06.stdout:0/422: mkdir d3/d18/d2c/d2d/d8c 0
2026-03-09T00:03:44.130 INFO:tasks.workunit.client.1.vm06.stdout:2/518: stat d7/c40 0
2026-03-09T00:03:44.132 INFO:tasks.workunit.client.0.vm03.stdout:4/234: dwrite d7/d20/d29/f2a [0,4194304] 0
2026-03-09T00:03:44.136 INFO:tasks.workunit.client.1.vm06.stdout:4/342: mknod d17/d24/d3b/c70 0
2026-03-09T00:03:44.140 INFO:tasks.workunit.client.0.vm03.stdout:0/210: dread d2/da/f1b [0,4194304] 0
2026-03-09T00:03:44.147 INFO:tasks.workunit.client.0.vm03.stdout:4/235: symlink d7/d20/d29/d38/d3a/l47 0
2026-03-09T00:03:44.148 INFO:tasks.workunit.client.0.vm03.stdout:4/236: link d7/d20/l42 d7/d20/d29/l48 0
2026-03-09T00:03:44.148 INFO:tasks.workunit.client.0.vm03.stdout:4/237: write d7/f1c [2972631,29893] 0
2026-03-09T00:03:44.148 INFO:tasks.workunit.client.1.vm06.stdout:4/343: rmdir d17/d24/d49/d5f 39
2026-03-09T00:03:44.149 INFO:tasks.workunit.client.0.vm03.stdout:4/238: symlink d7/d20/d29/d38/d3a/l49 0
2026-03-09T00:03:44.152 INFO:tasks.workunit.client.0.vm03.stdout:4/239: dread d7/f22 [0,4194304] 0
2026-03-09T00:03:44.166 INFO:tasks.workunit.client.1.vm06.stdout:1/392: dwrite d6/d21/f2e [0,4194304] 0
2026-03-09T00:03:44.167 INFO:tasks.workunit.client.0.vm03.stdout:5/189: dwrite d1c/d20/d44/f42 [0,4194304] 0
2026-03-09T00:03:44.169 INFO:tasks.workunit.client.0.vm03.stdout:5/190: chown d1c/f1e 3179223 1
2026-03-09T00:03:44.169 INFO:tasks.workunit.client.0.vm03.stdout:5/191: fdatasync f12 0
2026-03-09T00:03:44.170 INFO:tasks.workunit.client.0.vm03.stdout:5/192: getdents d1c/d20/d44/d3b 0
2026-03-09T00:03:44.170 INFO:tasks.workunit.client.0.vm03.stdout:5/193: readlink d1c/d20/d44/l40 0
2026-03-09T00:03:44.197 INFO:tasks.workunit.client.0.vm03.stdout:2/178: dwrite d8/d1b/f31 [0,4194304] 0
2026-03-09T00:03:44.204 INFO:tasks.workunit.client.0.vm03.stdout:4/240: write d7/d27/f2c [2213514,6165] 0
2026-03-09T00:03:44.205 INFO:tasks.workunit.client.0.vm03.stdout:4/241: creat d7/d23/f4a x:0 0 0
2026-03-09T00:03:44.206 INFO:tasks.workunit.client.0.vm03.stdout:4/242: chown d7/d27/c40 199 1
2026-03-09T00:03:44.207 INFO:tasks.workunit.client.0.vm03.stdout:2/179: read d8/d17/f34 [1899016,9690] 0
2026-03-09T00:03:44.207 INFO:tasks.workunit.client.0.vm03.stdout:2/180: dread - d8/d1b/f32 zero size
2026-03-09T00:03:44.207 INFO:tasks.workunit.client.0.vm03.stdout:2/181: write d8/f15 [1913214,70605] 0
2026-03-09T00:03:44.207 INFO:tasks.workunit.client.0.vm03.stdout:2/182: write f6 [4734808,7901] 0
2026-03-09T00:03:44.207 INFO:tasks.workunit.client.0.vm03.stdout:2/183: link d8/d17/f1c d8/d17/f3c 0
2026-03-09T00:03:44.213 INFO:tasks.workunit.client.1.vm06.stdout:8/391: dwrite db/d1e/d46/f4b [0,4194304] 0
2026-03-09T00:03:44.213 INFO:tasks.workunit.client.1.vm06.stdout:8/392: dread - db/dd/d48/f68 zero size
2026-03-09T00:03:44.214 INFO:tasks.workunit.client.1.vm06.stdout:8/393: mkdir db/d53/d6d/d7b 0
2026-03-09T00:03:44.214 INFO:tasks.workunit.client.1.vm06.stdout:8/394: getdents db/d53 0
2026-03-09T00:03:44.215 INFO:tasks.workunit.client.1.vm06.stdout:8/395: mkdir db/d53/d7c 0
2026-03-09T00:03:44.215 INFO:tasks.workunit.client.1.vm06.stdout:8/396: mknod db/d53/d70/d38/d47/c7d 0
2026-03-09T00:03:44.217 INFO:tasks.workunit.client.0.vm03.stdout:2/184: dread d8/fb [0,4194304] 0
2026-03-09T00:03:44.219 INFO:tasks.workunit.client.1.vm06.stdout:8/397: mknod db/d53/d70/c7e 0
2026-03-09T00:03:44.223 INFO:tasks.workunit.client.1.vm06.stdout:8/398: creat db/dd/d48/f7f x:0 0 0
2026-03-09T00:03:44.227 INFO:tasks.workunit.client.0.vm03.stdout:2/185: dread d8/d17/f1d [0,4194304] 0
2026-03-09T00:03:44.227 INFO:tasks.workunit.client.0.vm03.stdout:2/186: dread - d8/d1b/d2a/d2e/f35 zero size
2026-03-09T00:03:44.227 INFO:tasks.workunit.client.0.vm03.stdout:2/187: read d8/f15 [987521,22051] 0
2026-03-09T00:03:44.227 INFO:tasks.workunit.client.0.vm03.stdout:2/188: creat d8/d1b/f3d x:0 0 0
2026-03-09T00:03:44.241 INFO:tasks.workunit.client.0.vm03.stdout:0/211: dwrite d2/da/dd/f11 [0,4194304] 0
2026-03-09T00:03:44.241 INFO:tasks.workunit.client.0.vm03.stdout:0/212: chown d2/fb 264963766 1
2026-03-09T00:03:44.242 INFO:tasks.workunit.client.0.vm03.stdout:0/213: unlink d2/da/d1a/c29 0
2026-03-09T00:03:44.246 INFO:tasks.workunit.client.0.vm03.stdout:0/214: dread f0 [0,4194304] 0
2026-03-09T00:03:44.247 INFO:tasks.workunit.client.0.vm03.stdout:0/215: truncate d2/da/dd/f24 783333 0
2026-03-09T00:03:44.298 INFO:tasks.workunit.client.0.vm03.stdout:8/196: sync
2026-03-09T00:03:44.298 INFO:tasks.workunit.client.0.vm03.stdout:8/197: chown d7/df/d1a/f2e 6272 1
2026-03-09T00:03:44.298 INFO:tasks.workunit.client.0.vm03.stdout:8/198: dread - d7/df/f37 zero size
2026-03-09T00:03:44.300 INFO:tasks.workunit.client.1.vm06.stdout:0/423: dwrite d3/d18/d2c/d2d/d31/f5d [0,4194304] 0
2026-03-09T00:03:44.301 INFO:tasks.workunit.client.1.vm06.stdout:0/424: dread d3/d18/d1f/d39/d69/f71 [0,4194304] 0
2026-03-09T00:03:44.301 INFO:tasks.workunit.client.1.vm06.stdout:0/425: write d3/d18/d2c/f7e [815220,48156] 0
2026-03-09T00:03:44.304 INFO:tasks.workunit.client.1.vm06.stdout:5/545: dwrite d5/d1c/d21/d28/f3b [0,4194304] 0
2026-03-09T00:03:44.304 INFO:tasks.workunit.client.1.vm06.stdout:5/546: write d5/d1c/d21/d28/f57 [906452,47893] 0
2026-03-09T00:03:44.305 INFO:tasks.workunit.client.1.vm06.stdout:5/547: mknod d5/d1c/d23/d51/daa/cb9 0
2026-03-09T00:03:44.305 INFO:tasks.workunit.client.1.vm06.stdout:5/548: dread - d5/d44/d4b/fa9 zero size
2026-03-09T00:03:44.305 INFO:tasks.workunit.client.1.vm06.stdout:5/549: mknod d5/d44/d4b/d92/d49/da0/cba 0
2026-03-09T00:03:44.306 INFO:tasks.workunit.client.1.vm06.stdout:5/550: symlink d5/d1c/d23/lbb 0
2026-03-09T00:03:44.308 INFO:tasks.workunit.client.1.vm06.stdout:5/551: symlink d5/d1c/d23/lbc 0
2026-03-09T00:03:44.313 INFO:tasks.workunit.client.1.vm06.stdout:1/393: dwrite d6/f7 [0,4194304] 0
2026-03-09T00:03:44.315 INFO:tasks.workunit.client.0.vm03.stdout:4/243: dwrite d7/f28 [0,4194304] 0
2026-03-09T00:03:44.315 INFO:tasks.workunit.client.0.vm03.stdout:4/244: fdatasync d7/f22 0
2026-03-09T00:03:44.316 INFO:tasks.workunit.client.0.vm03.stdout:7/243: dwrite d2/d4/f22 [0,4194304] 0
2026-03-09T00:03:44.316 INFO:tasks.workunit.client.0.vm03.stdout:1/255: dwrite d4/d15/f3f [0,4194304] 0
2026-03-09T00:03:44.318 INFO:tasks.workunit.client.0.vm03.stdout:4/245: dread d7/f22 [0,4194304] 0
2026-03-09T00:03:44.329 INFO:tasks.workunit.client.0.vm03.stdout:1/256: creat d4/d15/d1a/f55 x:0 0 0
2026-03-09T00:03:44.329 INFO:tasks.workunit.client.0.vm03.stdout:4/246: getdents d7 0
2026-03-09T00:03:44.329 INFO:tasks.workunit.client.0.vm03.stdout:4/247: creat d7/d20/d29/d38/d3a/f4b x:0 0 0
2026-03-09T00:03:44.329 INFO:tasks.workunit.client.0.vm03.stdout:1/257: dread d4/d3a/d32/f53 [0,4194304] 0
2026-03-09T00:03:44.329 INFO:tasks.workunit.client.0.vm03.stdout:4/248: symlink d7/d23/d25/l4c 0
2026-03-09T00:03:44.329 INFO:tasks.workunit.client.0.vm03.stdout:4/249: symlink d7/d20/d29/l4d 0
2026-03-09T00:03:44.329 INFO:tasks.workunit.client.0.vm03.stdout:4/250: mkdir d7/d20/d29/d4e 0
2026-03-09T00:03:44.329 INFO:tasks.workunit.client.0.vm03.stdout:4/251: creat d7/d20/d29/d4e/f4f x:0 0 0
2026-03-09T00:03:44.336 INFO:tasks.workunit.client.1.vm06.stdout:8/399: dwrite db/d1e/f2e [4194304,4194304] 0
2026-03-09T00:03:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:44 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:44 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:44 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch
2026-03-09T00:03:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:44 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:03:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:44 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:03:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:44 vm03.local ceph-mon[52346]: Updating vm03:/etc/ceph/ceph.conf
2026-03-09T00:03:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:44 vm03.local ceph-mon[52346]: Updating vm06:/etc/ceph/ceph.conf
2026-03-09T00:03:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:44 vm03.local ceph-mon[52346]: Updating vm03:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf
2026-03-09T00:03:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:44 vm03.local ceph-mon[52346]: Updating vm06:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf
2026-03-09T00:03:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:44 vm03.local ceph-mon[52346]: Standby manager daemon vm03.yvcons started
2026-03-09T00:03:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:44 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.yvcons/crt"}]: dispatch
2026-03-09T00:03:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:44 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-09T00:03:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:44 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:44 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.yvcons/key"}]: dispatch
2026-03-09T00:03:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:44 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-09T00:03:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:44 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:44 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:44.339 INFO:tasks.workunit.client.0.vm03.stdout:7/244: rmdir d2/d1f/d42 39
2026-03-09T00:03:44.339 INFO:tasks.workunit.client.0.vm03.stdout:7/245: write d2/d1f/d42/f47 [1346251,119383] 0
2026-03-09T00:03:44.361 INFO:tasks.workunit.client.1.vm06.stdout:5/552: dwrite d5/d44/d4b/f91 [0,4194304] 0
2026-03-09T00:03:44.366 INFO:tasks.workunit.client.1.vm06.stdout:1/394: dwrite d6/d21/f69 [0,4194304] 0
2026-03-09T00:03:44.366 INFO:tasks.workunit.client.1.vm06.stdout:1/395: write d6/d21/d2d/d37/f77 [829799,82980] 0
2026-03-09T00:03:44.377 INFO:tasks.workunit.client.1.vm06.stdout:1/396: dread d6/d21/f2e [4194304,4194304] 0
2026-03-09T00:03:44.379 INFO:tasks.workunit.client.0.vm03.stdout:4/252: dread d7/d20/f34 [0,4194304] 0
2026-03-09T00:03:44.379 INFO:tasks.workunit.client.0.vm03.stdout:4/253: readlink d7/l14 0
2026-03-09T00:03:44.396 INFO:tasks.workunit.client.1.vm06.stdout:3/465: sync
2026-03-09T00:03:44.402 INFO:tasks.workunit.client.1.vm06.stdout:3/466: dread d11/d28/f4f [4194304,4194304] 0
2026-03-09T00:03:44.406 INFO:tasks.workunit.client.1.vm06.stdout:3/467: unlink cb 0
2026-03-09T00:03:44.406 INFO:tasks.workunit.client.1.vm06.stdout:3/468: truncate d11/d28/d2e/d2f/d5b/d5f/f60 1255571 0
2026-03-09T00:03:44.435 INFO:tasks.workunit.client.1.vm06.stdout:5/553: dwrite d5/d1c/d21/d28/f33 [0,4194304] 0
2026-03-09T00:03:44.451 INFO:tasks.workunit.client.0.vm03.stdout:8/199: creat d7/df/f3d x:0 0 0
2026-03-09T00:03:44.451 INFO:tasks.workunit.client.0.vm03.stdout:8/200: link d7/df/f3d d7/df/d1e/d38/f3e 0
2026-03-09T00:03:44.452 INFO:tasks.workunit.client.0.vm03.stdout:8/201: truncate d7/f11 983664 0
2026-03-09T00:03:44.452 INFO:tasks.workunit.client.0.vm03.stdout:8/202: write d7/df/f31 [137359,87231] 0
2026-03-09T00:03:44.452 INFO:tasks.workunit.client.0.vm03.stdout:8/203: dread - d7/df/f29 zero size
2026-03-09T00:03:44.453 INFO:tasks.workunit.client.0.vm03.stdout:8/204: mkdir d7/df/d1e/d3f 0
2026-03-09T00:03:44.458 INFO:tasks.workunit.client.0.vm03.stdout:4/254: dwrite d7/f22 [0,4194304] 0
2026-03-09T00:03:44.458 INFO:tasks.workunit.client.0.vm03.stdout:4/255: creat d7/d20/d29/d38/d3a/f50 x:0 0 0
2026-03-09T00:03:44.459 INFO:tasks.workunit.client.0.vm03.stdout:1/258: dwrite d4/d3a/f48 [0,4194304] 0
2026-03-09T00:03:44.460 INFO:tasks.workunit.client.0.vm03.stdout:1/259: dread - d4/d15/f45 zero size
2026-03-09T00:03:44.462 INFO:tasks.workunit.client.0.vm03.stdout:8/205: dread d7/df/f2c [0,4194304] 0
2026-03-09T00:03:44.462 INFO:tasks.workunit.client.1.vm06.stdout:3/469: dwrite d11/d28/f5e [0,4194304] 0
2026-03-09T00:03:44.462 INFO:tasks.workunit.client.0.vm03.stdout:4/256: mknod d7/d23/c51 0
2026-03-09T00:03:44.463 INFO:tasks.workunit.client.0.vm03.stdout:4/257: fsync d7/d20/f3d 0
2026-03-09T00:03:44.463 INFO:tasks.workunit.client.0.vm03.stdout:4/258: write d7/d20/f3d [1736607,17808] 0
2026-03-09T00:03:44.463 INFO:tasks.workunit.client.0.vm03.stdout:4/259: chown d7/d27/l3b 5169 1
2026-03-09T00:03:44.465 INFO:tasks.workunit.client.0.vm03.stdout:1/260: read d4/d15/f17 [260,69232] 0
2026-03-09T00:03:44.465 INFO:tasks.workunit.client.0.vm03.stdout:1/261: chown d4/d15 50912 1
2026-03-09T00:03:44.467 INFO:tasks.workunit.client.0.vm03.stdout:8/206: mkdir d7/df/d1a/d40 0
2026-03-09T00:03:44.467 INFO:tasks.workunit.client.0.vm03.stdout:8/207: chown d7/df/l19 1902225969 1
2026-03-09T00:03:44.467 INFO:tasks.workunit.client.0.vm03.stdout:4/260: fdatasync d7/d20/f3d 0
2026-03-09T00:03:44.467 INFO:tasks.workunit.client.0.vm03.stdout:4/261: creat d7/d27/f52 x:0 0 0
2026-03-09T00:03:44.470 INFO:tasks.workunit.client.1.vm06.stdout:3/470: creat d11/d28/d4d/f9c x:0 0 0
2026-03-09T00:03:44.470 INFO:tasks.workunit.client.0.vm03.stdout:1/262: mknod d4/d15/d1a/c56 0
2026-03-09T00:03:44.470 INFO:tasks.workunit.client.0.vm03.stdout:1/263: creat d4/d3a/d43/f57 x:0 0 0
2026-03-09T00:03:44.476 INFO:tasks.workunit.client.0.vm03.stdout:1/264: write d4/d3a/f4d [4162068,10568] 0
2026-03-09T00:03:44.482 INFO:tasks.workunit.client.0.vm03.stdout:8/208: symlink d7/df/d1e/l41 0
2026-03-09T00:03:44.495 INFO:tasks.workunit.client.0.vm03.stdout:8/209: chown d7/df/f2c 7 1
2026-03-09T00:03:44.495 INFO:tasks.workunit.client.1.vm06.stdout:3/471: creat d11/d28/d4d/d9b/f9d x:0 0 0
2026-03-09T00:03:44.495 INFO:tasks.workunit.client.1.vm06.stdout:3/472: symlink d11/d28/d2e/l9e 0
2026-03-09T00:03:44.499 INFO:tasks.workunit.client.1.vm06.stdout:9/350: rename d1/d3/d12/d21 to d1/d4/d6e 0
2026-03-09T00:03:44.499 INFO:tasks.workunit.client.1.vm06.stdout:9/351: stat d1/d3/d4f/c64 0
2026-03-09T00:03:44.501 INFO:tasks.workunit.client.1.vm06.stdout:3/473: write d11/d28/d2e/f38 [5684636,40034] 0
2026-03-09T00:03:44.501 INFO:tasks.workunit.client.1.vm06.stdout:3/474: symlink d11/d28/d2e/l9f 0
2026-03-09T00:03:44.502 INFO:tasks.workunit.client.1.vm06.stdout:3/475: write d11/d28/d2e/d2f/d36/f4a [4833021,6261] 0
2026-03-09T00:03:44.502 INFO:tasks.workunit.client.1.vm06.stdout:3/476: truncate d11/d28/f42 3379177 0
2026-03-09T00:03:44.502 INFO:tasks.workunit.client.1.vm06.stdout:3/477: fsync d11/d28/f5e 0
2026-03-09T00:03:44.502 INFO:tasks.workunit.client.1.vm06.stdout:3/478: chown d11/d28/d4d/f6e 1808291250 1
2026-03-09T00:03:44.503 INFO:tasks.workunit.client.1.vm06.stdout:3/479: mknod d11/d28/d4d/d89/d90/ca0 0
2026-03-09T00:03:44.503 INFO:tasks.workunit.client.1.vm06.stdout:3/480: creat d11/d28/d2e/d2f/d5b/d94/fa1 x:0 0 0
2026-03-09T00:03:44.504 INFO:tasks.workunit.client.1.vm06.stdout:9/352: dread d1/d4/d6e/f7 [0,4194304] 0
2026-03-09T00:03:44.504 INFO:tasks.workunit.client.1.vm06.stdout:3/481: link d11/f48 d11/fa2 0
2026-03-09T00:03:44.505 INFO:tasks.workunit.client.1.vm06.stdout:9/353: creat d1/d4/d6e/d14/d25/f6f x:0 0 0
2026-03-09T00:03:44.508 INFO:tasks.workunit.client.1.vm06.stdout:2/519: mkdir d7/d1a/d25/d97 0
2026-03-09T00:03:44.516 INFO:tasks.workunit.client.1.vm06.stdout:2/520: creat d7/da/db/f98 x:0 0 0
2026-03-09T00:03:44.519 INFO:tasks.workunit.client.0.vm03.stdout:4/262: dwrite d7/f1f [0,4194304] 0
2026-03-09T00:03:44.519 INFO:tasks.workunit.client.0.vm03.stdout:4/263: rmdir d7/d23/d25 39
2026-03-09T00:03:44.546 INFO:tasks.workunit.client.0.vm03.stdout:1/265: dwrite d4/f1e [0,4194304] 0
2026-03-09T00:03:44.549 INFO:tasks.workunit.client.0.vm03.stdout:1/266: creat d4/d3a/d3d/f58 x:0 0 0
2026-03-09T00:03:44.549 INFO:tasks.workunit.client.0.vm03.stdout:1/267: dread - d4/d3a/d43/f49 zero size
2026-03-09T00:03:44.549 INFO:tasks.workunit.client.0.vm03.stdout:1/268: write d4/d3a/d32/f53 [214991,78769] 0
2026-03-09T00:03:44.549 INFO:tasks.workunit.client.0.vm03.stdout:1/269: getdents d4/d6/d52 0
2026-03-09T00:03:44.579 INFO:tasks.workunit.client.1.vm06.stdout:2/521: dwrite d7/da/db/de/f60 [0,4194304] 0
2026-03-09T00:03:44.582 INFO:tasks.workunit.client.0.vm03.stdout:4/264: dwrite d7/d20/f33 [4194304,4194304] 0
2026-03-09T00:03:44.600 INFO:tasks.workunit.client.0.vm03.stdout:1/270: creat d4/d15/f59 x:0 0 0
2026-03-09T00:03:44.600 INFO:tasks.workunit.client.0.vm03.stdout:1/271: dread d4/d3a/d3d/f4a [0,4194304] 0
2026-03-09T00:03:44.601 INFO:tasks.workunit.client.0.vm03.stdout:9/179: rename f5 to d15/d1c/d28/f39 0
2026-03-09T00:03:44.601 INFO:tasks.workunit.client.0.vm03.stdout:9/180: chown d15 16429703 1
2026-03-09T00:03:44.601 INFO:tasks.workunit.client.0.vm03.stdout:9/181: truncate d15/d1c/d21/f25 4766981 0
2026-03-09T00:03:44.601 INFO:tasks.workunit.client.1.vm06.stdout:3/482: creat d11/d28/d2e/d2f/fa3 x:0 0 0
2026-03-09T00:03:44.602 INFO:tasks.workunit.client.1.vm06.stdout:3/483: chown d11/c63 735 1
2026-03-09T00:03:44.603 INFO:tasks.workunit.client.0.vm03.stdout:5/194: rename d1c/d20/d44/f36 to d1c/d20/d44/d3b/f45 0
2026-03-09T00:03:44.603 INFO:tasks.workunit.client.0.vm03.stdout:5/195: dread - d1c/d20/d44/d3b/f3c zero size
2026-03-09T00:03:44.603 INFO:tasks.workunit.client.0.vm03.stdout:0/216: rename d2/da/d36 to d2/da/d36/d47 22
2026-03-09T00:03:44.603 INFO:tasks.workunit.client.0.vm03.stdout:0/217: creat d2/da/d36/d39/f48 x:0 0 0
2026-03-09T00:03:44.603 INFO:tasks.workunit.client.0.vm03.stdout:0/218: chown d2/da/d1a/c3c 1361586823 1
2026-03-09T00:03:44.604 INFO:tasks.workunit.client.1.vm06.stdout:3/484: mknod d11/d28/d2e/d2f/d5b/d5f/ca4 0
2026-03-09T00:03:44.608 INFO:tasks.workunit.client.0.vm03.stdout:9/182: chown d15/d1c/d28/d30/c35 13944965 1
2026-03-09T00:03:44.608 INFO:tasks.workunit.client.0.vm03.stdout:9/183: chown l6 272496 1
2026-03-09T00:03:44.609 INFO:tasks.workunit.client.1.vm06.stdout:3/485: dread d11/d28/d2e/d2f/d5b/f7d [0,4194304] 0
2026-03-09T00:03:44.613 INFO:tasks.workunit.client.1.vm06.stdout:3/486: link d11/d28/d2e/l3b d11/d28/d2e/d2f/d5b/d5f/la5 0
2026-03-09T00:03:44.615 INFO:tasks.workunit.client.1.vm06.stdout:3/487: link d11/f24 d11/d28/d4d/d89/fa6 0
2026-03-09T00:03:44.616 INFO:tasks.workunit.client.1.vm06.stdout:3/488: write d11/d28/d2e/d2f/d5b/d5f/f81 [942737,19117] 0
2026-03-09T00:03:44.616 INFO:tasks.workunit.client.1.vm06.stdout:3/489: chown d11/d3f/d8d/l95 5 1
2026-03-09T00:03:44.619 INFO:tasks.workunit.client.0.vm03.stdout:9/184: link d15/f17 d15/d1c/d36/f3a 0
2026-03-09T00:03:44.621 INFO:tasks.workunit.client.0.vm03.stdout:5/196: dread f14 [0,4194304] 0
2026-03-09T00:03:44.623 INFO:tasks.workunit.client.0.vm03.stdout:1/272: dread d4/d3a/f2c [0,4194304] 0
2026-03-09T00:03:44.623 INFO:tasks.workunit.client.0.vm03.stdout:1/273: creat d4/d3a/d43/f5a x:0 0 0
2026-03-09T00:03:44.628 INFO:tasks.workunit.client.0.vm03.stdout:9/185: getdents d15 0
2026-03-09T00:03:44.629 INFO:tasks.workunit.client.1.vm06.stdout:8/400: mkdir db/dd/d24/d80 0
2026-03-09T00:03:44.629 INFO:tasks.workunit.client.1.vm06.stdout:8/401: chown db/c61 25 1
2026-03-09T00:03:44.629 INFO:tasks.workunit.client.0.vm03.stdout:5/197: creat d1c/d20/d44/f46 x:0 0 0
2026-03-09T00:03:44.630 INFO:tasks.workunit.client.1.vm06.stdout:8/402: creat db/d53/d5c/f81 x:0 0 0
2026-03-09T00:03:44.631 INFO:tasks.workunit.client.1.vm06.stdout:8/403: dread db/f28 [4194304,4194304] 0
2026-03-09T00:03:44.634 INFO:tasks.workunit.client.0.vm03.stdout:9/186: read d15/d1c/d21/f25 [1637406,118755] 0
2026-03-09T00:03:44.634 INFO:tasks.workunit.client.0.vm03.stdout:9/187: write d15/f2c [1018096,50831] 0
2026-03-09T00:03:44.634 INFO:tasks.workunit.client.0.vm03.stdout:9/188: rename d15/d1c to d15/d1c/d21/d3b 22
2026-03-09T00:03:44.636 INFO:tasks.workunit.client.0.vm03.stdout:1/274: dread d4/d3a/d43/f47 [0,4194304] 0
2026-03-09T00:03:44.636 INFO:tasks.workunit.client.0.vm03.stdout:1/275: write d4/d15/f17 [1898985,46657] 0
2026-03-09T00:03:44.636 INFO:tasks.workunit.client.0.vm03.stdout:1/276: stat d4/d6/c30 0
2026-03-09T00:03:44.637 INFO:tasks.workunit.client.0.vm03.stdout:4/265: dwrite d7/d27/f52 [0,4194304] 0
2026-03-09T00:03:44.637 INFO:tasks.workunit.client.0.vm03.stdout:4/266: chown d7/d20/d29/d38/d3a 419 1
2026-03-09T00:03:44.637 INFO:tasks.workunit.client.0.vm03.stdout:4/267: creat d7/d20/d29/f53 x:0 0 0
2026-03-09T00:03:44.643 INFO:tasks.workunit.client.0.vm03.stdout:9/189: dread fc [0,4194304] 0
2026-03-09T00:03:44.653 INFO:tasks.workunit.client.0.vm03.stdout:5/198: dread d1c/f1f [0,4194304] 0
2026-03-09T00:03:44.653 INFO:tasks.workunit.client.0.vm03.stdout:0/219: dwrite d2/f32 [0,4194304] 0
2026-03-09T00:03:44.657 INFO:tasks.workunit.client.0.vm03.stdout:1/277: link d4/l10 d4/d6/l5b 0
2026-03-09T00:03:44.665 INFO:tasks.workunit.client.0.vm03.stdout:4/268: mkdir d7/d20/d29/d54 0
2026-03-09T00:03:44.665 INFO:tasks.workunit.client.0.vm03.stdout:4/269: dread - d7/d23/f4a zero size
2026-03-09T00:03:44.665 INFO:tasks.workunit.client.0.vm03.stdout:5/199: rename d1c/d20/c2f to d1c/d20/d44/d43/c47 0
2026-03-09T00:03:44.665 INFO:tasks.workunit.client.0.vm03.stdout:5/200: chown d1c/d20/d44/c2b 868 1
2026-03-09T00:03:44.665 INFO:tasks.workunit.client.0.vm03.stdout:5/201: write fb [3431266,15061] 0
2026-03-09T00:03:44.665 INFO:tasks.workunit.client.0.vm03.stdout:5/202: chown d1c/d20/d44/d3b/f3f 17046 1
2026-03-09T00:03:44.666 INFO:tasks.workunit.client.0.vm03.stdout:0/220: mkdir d2/da/dd/d49 0
2026-03-09T00:03:44.668 INFO:tasks.workunit.client.0.vm03.stdout:1/278: write d4/d15/f44 [1180004,76322] 0
2026-03-09T00:03:44.668 INFO:tasks.workunit.client.1.vm06.stdout:3/490: dwrite d11/d28/d2e/d2f/d36/f75 [0,4194304] 0
2026-03-09T00:03:44.668 INFO:tasks.workunit.client.0.vm03.stdout:4/270: symlink d7/d27/l55 0
2026-03-09T00:03:44.670 INFO:tasks.workunit.client.0.vm03.stdout:7/246: creat d2/f50 x:0 0 0
2026-03-09T00:03:44.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:44 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:44.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:44 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:44.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:44 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch
2026-03-09T00:03:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:44 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:03:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:44 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:03:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:44 vm06.local ceph-mon[58395]: Updating vm03:/etc/ceph/ceph.conf
2026-03-09T00:03:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:44 vm06.local ceph-mon[58395]: Updating vm06:/etc/ceph/ceph.conf
2026-03-09T00:03:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:44 vm06.local ceph-mon[58395]: Updating vm03:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf
2026-03-09T00:03:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:44 vm06.local ceph-mon[58395]: Updating vm06:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf
2026-03-09T00:03:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:44 vm06.local ceph-mon[58395]: Standby manager daemon vm03.yvcons started
2026-03-09T00:03:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:44 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.yvcons/crt"}]: dispatch
2026-03-09T00:03:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:44 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-09T00:03:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:44 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:44 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.yvcons/key"}]: dispatch
2026-03-09T00:03:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:44 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.103:0/2' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-09T00:03:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:44 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:44.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:44 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:44.673 INFO:tasks.workunit.client.0.vm03.stdout:5/203: mknod d1c/d20/d44/c48 0
2026-03-09T00:03:44.673 INFO:tasks.workunit.client.0.vm03.stdout:4/271: dread d7/d20/f33 [0,4194304] 0
2026-03-09T00:03:44.674 INFO:tasks.workunit.client.0.vm03.stdout:0/221: symlink d2/da/d36/d39/l4a 0
2026-03-09T00:03:44.674 INFO:tasks.workunit.client.0.vm03.stdout:0/222: write d2/da/f44 [312237,3301] 0
2026-03-09T00:03:44.682 INFO:tasks.workunit.client.0.vm03.stdout:9/190: dwrite d15/d1c/d21/f25 [0,4194304] 0
2026-03-09T00:03:44.687 INFO:tasks.workunit.client.0.vm03.stdout:2/189: link d8/f21 d8/f3e 0
2026-03-09T00:03:44.687 INFO:tasks.workunit.client.0.vm03.stdout:2/190: creat d8/d1b/f3f x:0 0 0
2026-03-09T00:03:44.687 INFO:tasks.workunit.client.1.vm06.stdout:3/491: unlink d11/d28/d2e/d2f/l85 0
2026-03-09T00:03:44.690 INFO:tasks.workunit.client.1.vm06.stdout:3/492: creat d11/d28/d4d/d89/d90/fa7 x:0 0 0
2026-03-09T00:03:44.690 INFO:tasks.workunit.client.0.vm03.stdout:7/247: mknod d2/d1f/d42/d43/c51 0
2026-03-09T00:03:44.692 INFO:tasks.workunit.client.0.vm03.stdout:1/279: getdents d4 0
2026-03-09T00:03:44.696 INFO:tasks.workunit.client.0.vm03.stdout:1/280: write f2 [457050,31367] 0
2026-03-09T00:03:44.704 INFO:tasks.workunit.client.1.vm06.stdout:6/417: rename d4/d66 to d4/d27/d42/d7e 0
2026-03-09T00:03:44.704 INFO:tasks.workunit.client.1.vm06.stdout:3/493: mknod d11/d28/d2e/d2f/d36/ca8 0
2026-03-09T00:03:44.704 INFO:tasks.workunit.client.1.vm06.stdout:3/494: write d11/f1a [3456722,89183] 0
2026-03-09T00:03:44.704 INFO:tasks.workunit.client.0.vm03.stdout:4/272: link d7/d27/l36 d7/d23/l56 0
2026-03-09T00:03:44.706 INFO:tasks.workunit.client.0.vm03.stdout:9/191: rmdir d15 39
2026-03-09T00:03:44.717 INFO:tasks.workunit.client.0.vm03.stdout:9/192: write f11 [2127271,49347] 0
2026-03-09T00:03:44.717 INFO:tasks.workunit.client.0.vm03.stdout:9/193: chown d15/d1c/d21/c31 7 1
2026-03-09T00:03:44.721 INFO:tasks.workunit.client.0.vm03.stdout:1/281: mkdir d4/d15/d5c 0
2026-03-09T00:03:44.724 INFO:tasks.workunit.client.1.vm06.stdout:7/420: rename d0/df/d1a/d27/f4b to d0/df/d1a/d35/f77 0
2026-03-09T00:03:44.727 INFO:tasks.workunit.client.0.vm03.stdout:5/204: mknod d1c/c49 0
2026-03-09T00:03:44.727 INFO:tasks.workunit.client.1.vm06.stdout:6/418: symlink d4/d27/d42/l7f 0
2026-03-09T00:03:44.727 INFO:tasks.workunit.client.1.vm06.stdout:3/495: mkdir d11/d28/d2e/d2f/d5b/d5f/d91/da9 0
2026-03-09T00:03:44.732 INFO:tasks.workunit.client.0.vm03.stdout:4/273: symlink d7/d20/d29/l57 0
2026-03-09T00:03:44.735 INFO:tasks.workunit.client.0.vm03.stdout:9/194: creat d15/d1c/f3c x:0 0 0
2026-03-09T00:03:44.735 INFO:tasks.workunit.client.1.vm06.stdout:4/344: rename d17/d24/d3b/d5e/c63 to d17/d24/d49/d5f/c71 0
2026-03-09T00:03:44.740 INFO:tasks.workunit.client.1.vm06.stdout:4/345: write d17/f1d [2298237,28884] 0
2026-03-09T00:03:44.741 INFO:tasks.workunit.client.1.vm06.stdout:7/421: truncate d0/df/d1a/d27/d4c/f32 2894771 0
2026-03-09T00:03:44.742 INFO:tasks.workunit.client.1.vm06.stdout:7/422: dread - d0/df/d1a/d27/d4c/d40/f5a zero size
2026-03-09T00:03:44.742 INFO:tasks.workunit.client.1.vm06.stdout:7/423: fsync d0/df/d1a/d3a/d4e/d5e/f73 0
2026-03-09T00:03:44.742 INFO:tasks.workunit.client.1.vm06.stdout:7/424: write d0/fe [2634921,57868] 0
2026-03-09T00:03:44.749 INFO:tasks.workunit.client.0.vm03.stdout:1/282: rename d4/d15/d1a/f2b to d4/d3a/d3d/d46/f5d 0
2026-03-09T00:03:44.749 INFO:tasks.workunit.client.1.vm06.stdout:5/554: sync
2026-03-09T00:03:44.757 INFO:tasks.workunit.client.1.vm06.stdout:0/426: rename d3/d18/d3c/l78 to d3/d18/d1f/d39/d49/d60/l8d 0
2026-03-09T00:03:44.764 INFO:tasks.workunit.client.0.vm03.stdout:5/205: mknod d1c/d20/d44/d43/c4a 0
2026-03-09T00:03:44.772 INFO:tasks.workunit.client.1.vm06.stdout:4/346: mknod d17/d24/d3b/c72 0
2026-03-09T00:03:44.772 INFO:tasks.workunit.client.1.vm06.stdout:4/347: fdatasync d17/d24/f39 0
2026-03-09T00:03:44.772 INFO:tasks.workunit.client.0.vm03.stdout:4/274: mkdir d7/d20/d29/d54/d58 0
2026-03-09T00:03:44.780 INFO:tasks.workunit.client.0.vm03.stdout:9/195: creat d15/d1c/d28/d30/f3d x:0 0 0
2026-03-09T00:03:44.780 INFO:tasks.workunit.client.0.vm03.stdout:9/196: readlink d15/d1c/l20 0
2026-03-09T00:03:44.780 INFO:tasks.workunit.client.0.vm03.stdout:6/209: sync
2026-03-09T00:03:44.780 INFO:tasks.workunit.client.0.vm03.stdout:6/210: fdatasync d13/f14 0
2026-03-09T00:03:44.780 INFO:tasks.workunit.client.0.vm03.stdout:3/164: sync
2026-03-09T00:03:44.781 INFO:tasks.workunit.client.0.vm03.stdout:8/210: sync
2026-03-09T00:03:44.781 INFO:tasks.workunit.client.0.vm03.stdout:8/211: fsync d7/df/f29 0
2026-03-09T00:03:44.781 INFO:tasks.workunit.client.0.vm03.stdout:8/212: rename d7/df to d7/df/d1a/d40/d42 22
2026-03-09T00:03:44.781 INFO:tasks.workunit.client.0.vm03.stdout:6/211: write d13/d1e/f21 [738976,108268] 0
2026-03-09T00:03:44.781 INFO:tasks.workunit.client.0.vm03.stdout:6/212: readlink d13/l29 0
2026-03-09T00:03:44.781 INFO:tasks.workunit.client.0.vm03.stdout:9/197: write f8 [3436612,91333] 0
2026-03-09T00:03:44.785 INFO:tasks.workunit.client.0.vm03.stdout:3/165: dread d2/f9 [0,4194304] 0
2026-03-09T00:03:44.797 INFO:tasks.workunit.client.1.vm06.stdout:1/397: rename d6/d21/f69 to d6/d21/d2d/d3b/d42/f80 0
2026-03-09T00:03:44.797 INFO:tasks.workunit.client.0.vm03.stdout:6/213: dread d13/f31 [0,4194304] 0
2026-03-09T00:03:44.797 INFO:tasks.workunit.client.0.vm03.stdout:1/283: mkdir d4/d5e 0
2026-03-09T00:03:44.803 INFO:tasks.workunit.client.0.vm03.stdout:3/166: write d2/db/f24 [44995,126187] 0
2026-03-09T00:03:44.803 INFO:tasks.workunit.client.1.vm06.stdout:9/354: sync
2026-03-09T00:03:44.808 INFO:tasks.workunit.client.0.vm03.stdout:5/206: symlink d1c/d20/l4b 0
2026-03-09T00:03:44.808 INFO:tasks.workunit.client.0.vm03.stdout:5/207: write d1c/d20/d44/f3d [376543,84524] 0
2026-03-09T00:03:44.810 INFO:tasks.workunit.client.0.vm03.stdout:5/208: read d1c/f1f [193193,27287] 0
2026-03-09T00:03:44.810 INFO:tasks.workunit.client.0.vm03.stdout:5/209: fsync d1c/f3a 0
2026-03-09T00:03:44.810 INFO:tasks.workunit.client.0.vm03.stdout:5/210: chown d1c/f1f 7359 1
2026-03-09T00:03:44.810 INFO:tasks.workunit.client.0.vm03.stdout:5/211: dread - d1c/f37 zero size
2026-03-09T00:03:44.812 INFO:tasks.workunit.client.1.vm06.stdout:4/348: mknod d17/d21/c73 0
2026-03-09T00:03:44.812 INFO:tasks.workunit.client.1.vm06.stdout:4/349: chown d17/d24/f5c 4469044 1
2026-03-09T00:03:44.812 INFO:tasks.workunit.client.1.vm06.stdout:4/350: readlink d17/l1b 0
2026-03-09T00:03:44.814 INFO:tasks.workunit.client.1.vm06.stdout:9/355: dread d1/d4/f39 [0,4194304] 0
2026-03-09T00:03:44.817 INFO:tasks.workunit.client.1.vm06.stdout:2/522: rename d7/d1b/d85 to d7/d1a/d25/d97/d99 0
2026-03-09T00:03:44.819 INFO:tasks.workunit.client.1.vm06.stdout:6/419: dwrite d4/f40 [4194304,4194304] 0
2026-03-09T00:03:44.822 INFO:tasks.workunit.client.1.vm06.stdout:6/420: creat d4/d27/d42/d52/f80 x:0 0 0
2026-03-09T00:03:44.823 INFO:tasks.workunit.client.0.vm03.stdout:8/213: mkdir d7/df/d1a/d2b/d43 0
2026-03-09T00:03:44.823 INFO:tasks.workunit.client.1.vm06.stdout:4/351: mknod d17/d21/c74 0
2026-03-09T00:03:44.831 INFO:tasks.workunit.client.1.vm06.stdout:1/398: write d6/d21/d2d/d3b/d42/f80 [518499,58182] 0
2026-03-09T00:03:44.831 INFO:tasks.workunit.client.1.vm06.stdout:7/425: dwrite d0/df/d1a/d3a/f23 [0,4194304] 0
2026-03-09T00:03:44.831 INFO:tasks.workunit.client.1.vm06.stdout:5/555: dwrite d5/d44/d4b/fa9 [0,4194304] 0
2026-03-09T00:03:44.832 INFO:tasks.workunit.client.0.vm03.stdout:7/248: dwrite d2/f3 [0,4194304] 0
2026-03-09T00:03:44.832 INFO:tasks.workunit.client.0.vm03.stdout:3/167: dread d2/f6 [0,4194304] 0
2026-03-09T00:03:44.832 INFO:tasks.workunit.client.0.vm03.stdout:3/168: write d2/db/d2d/f2f [1009535,12182] 0
2026-03-09T00:03:44.835 INFO:tasks.workunit.client.0.vm03.stdout:9/198: creat d15/f3e x:0 0 0
2026-03-09T00:03:44.840 INFO:tasks.workunit.client.0.vm03.stdout:9/199: dread fd [4194304,4194304] 0
2026-03-09T00:03:44.840 INFO:tasks.workunit.client.0.vm03.stdout:9/200: chown c14 16224 1
2026-03-09T00:03:44.845 INFO:tasks.workunit.client.1.vm06.stdout:9/356: getdents d1/d3/d4f/d52 0
2026-03-09T00:03:44.848 INFO:tasks.workunit.client.0.vm03.stdout:6/214: unlink d13/f16 0
2026-03-09T00:03:44.854 INFO:tasks.workunit.client.0.vm03.stdout:5/212: getdents d1c 0
2026-03-09T00:03:44.859 INFO:tasks.workunit.client.0.vm03.stdout:1/284: getdents d4 0
2026-03-09T00:03:44.860 INFO:tasks.workunit.client.0.vm03.stdout:8/214: creat d7/df/d1a/d2b/f44 x:0 0 0
2026-03-09T00:03:44.860 INFO:tasks.workunit.client.0.vm03.stdout:8/215: write d7/df/d1e/d38/f3e [297763,82631] 0
2026-03-09T00:03:44.860 INFO:tasks.workunit.client.0.vm03.stdout:8/216: truncate d7/df/d1a/f2e 176417 0
2026-03-09T00:03:44.860 INFO:tasks.workunit.client.1.vm06.stdout:8/404: rename db/d53/d5c/f81 to db/d1e/f82 0
2026-03-09T00:03:44.860 INFO:tasks.workunit.client.1.vm06.stdout:2/523: truncate d7/d1a/f30 6303106 0
2026-03-09T00:03:44.860 INFO:tasks.workunit.client.1.vm06.stdout:6/421: mknod d4/d16/d53/c81 0
2026-03-09T00:03:44.860 INFO:tasks.workunit.client.1.vm06.stdout:6/422: readlink d4/d27/d3e/l56 0
2026-03-09T00:03:44.860 INFO:tasks.workunit.client.1.vm06.stdout:0/427: write d3/d18/f14 [1117280,95875] 0
2026-03-09T00:03:44.861 INFO:tasks.workunit.client.0.vm03.stdout:4/275: dwrite d7/f22 [0,4194304] 0
2026-03-09T00:03:44.863 INFO:tasks.workunit.client.0.vm03.stdout:7/249: rmdir d2/d1f/d3a/d24 39
2026-03-09T00:03:44.863 INFO:tasks.workunit.client.1.vm06.stdout:4/352: mkdir d17/d24/d3b/d75 0
2026-03-09T00:03:44.869 INFO:tasks.workunit.client.0.vm03.stdout:3/169: dwrite d2/f8 [0,4194304] 0
2026-03-09T00:03:44.869 INFO:tasks.workunit.client.1.vm06.stdout:4/353: dread d17/d24/d49/f62 [0,4194304] 0
2026-03-09T00:03:44.869 INFO:tasks.workunit.client.1.vm06.stdout:4/354: write d17/d24/d3b/d54/f58 [754727,501] 0
2026-03-09T00:03:44.869 INFO:tasks.workunit.client.1.vm06.stdout:4/355: write d17/d24/f3a [2167717,103239] 0
2026-03-09T00:03:44.870 INFO:tasks.workunit.client.0.vm03.stdout:9/201: symlink d15/d1c/d36/l3f 0
2026-03-09T00:03:44.870 INFO:tasks.workunit.client.0.vm03.stdout:9/202: readlink l13 0
2026-03-09T00:03:44.870 INFO:tasks.workunit.client.0.vm03.stdout:9/203: fdatasync d15/f1b 0
2026-03-09T00:03:44.871 INFO:tasks.workunit.client.1.vm06.stdout:1/399: creat d6/f81 x:0 0 0
2026-03-09T00:03:44.873 INFO:tasks.workunit.client.0.vm03.stdout:2/191: sync
2026-03-09T00:03:44.873 INFO:tasks.workunit.client.0.vm03.stdout:2/192: read d8/d1b/d24/f2f [2718706,130309] 0
2026-03-09T00:03:44.873 INFO:tasks.workunit.client.0.vm03.stdout:2/193: write d8/d1b/d2a/d2e/f35 [199009,110003] 0
2026-03-09T00:03:44.873 INFO:tasks.workunit.client.0.vm03.stdout:0/223: sync
2026-03-09T00:03:44.873 INFO:tasks.workunit.client.0.vm03.stdout:0/224: chown d2/da/d1a/c3c 21 1
2026-03-09T00:03:44.874 INFO:tasks.workunit.client.1.vm06.stdout:4/356: write d17/d24/d3b/d54/f58 [148784,36098] 0
2026-03-09T00:03:44.884 INFO:tasks.workunit.client.0.vm03.stdout:9/204: dread f11 [0,4194304] 0
2026-03-09T00:03:44.884 INFO:tasks.workunit.client.0.vm03.stdout:9/205: truncate d15/d1c/d21/f34 1041247 0
2026-03-09T00:03:44.884 INFO:tasks.workunit.client.0.vm03.stdout:9/206: truncate d15/d1c/d21/f25 5435193 0
2026-03-09T00:03:44.886 INFO:tasks.workunit.client.0.vm03.stdout:6/215: rename d13/f24 to d13/d1e/d44/f49 0
2026-03-09T00:03:44.891 INFO:tasks.workunit.client.0.vm03.stdout:5/213: rename d1c/f3a to d1c/f4c 0
2026-03-09T00:03:44.891 INFO:tasks.workunit.client.1.vm06.stdout:7/426: creat d0/df/d1a/d27/d4c/d40/d5b/f78 x:0 0 0
2026-03-09T00:03:44.891 INFO:tasks.workunit.client.1.vm06.stdout:7/427: chown d0/df/d1a/d27/f37 24723 1
2026-03-09T00:03:44.891 INFO:tasks.workunit.client.1.vm06.stdout:7/428: fdatasync d0/df/d1a/d3a/f23 0
2026-03-09T00:03:44.891 INFO:tasks.workunit.client.1.vm06.stdout:7/429: read - d0/df/d1a/d27/f60 zero size
2026-03-09T00:03:44.894 INFO:tasks.workunit.client.0.vm03.stdout:0/225: dread d2/fe [0,4194304] 0
2026-03-09T00:03:44.896 INFO:tasks.workunit.client.1.vm06.stdout:5/556: truncate d5/f14 5149880 0
2026-03-09T00:03:44.898 INFO:tasks.workunit.client.1.vm06.stdout:9/357: creat d1/d4/d6e/d14/d25/f70 x:0 0 0
2026-03-09T00:03:44.898 INFO:tasks.workunit.client.0.vm03.stdout:5/214: dread f18 [0,4194304] 0
2026-03-09T00:03:44.900 INFO:tasks.workunit.client.1.vm06.stdout:3/496: rename f8 to d11/d28/d2e/d2f/d5b/d94/faa 0
2026-03-09T00:03:44.902 INFO:tasks.workunit.client.1.vm06.stdout:8/405: creat db/d53/d70/d38/d4d/f83 x:0 0 0
2026-03-09T00:03:44.903 INFO:tasks.workunit.client.1.vm06.stdout:8/406: dread - db/dd/d48/f68 zero size
2026-03-09T00:03:44.903 INFO:tasks.workunit.client.1.vm06.stdout:8/407: dread - db/dd/d48/f7f zero size
2026-03-09T00:03:44.904 INFO:tasks.workunit.client.0.vm03.stdout:4/276: mknod d7/d20/c59 0
2026-03-09T00:03:44.904 INFO:tasks.workunit.client.0.vm03.stdout:7/250: mknod d2/d1f/c52 0
2026-03-09T00:03:44.909 INFO:tasks.workunit.client.1.vm06.stdout:2/524: creat d7/d1a/d25/d66/d87/d8e/f9a x:0 0 0
2026-03-09T00:03:44.909 INFO:tasks.workunit.client.1.vm06.stdout:2/525: fsync d7/da/d1c/f92 0
2026-03-09T00:03:44.912 INFO:tasks.workunit.client.1.vm06.stdout:6/423: link d4/d16/f5e d4/d16/d53/f82 0
2026-03-09T00:03:44.919 INFO:tasks.workunit.client.0.vm03.stdout:6/216: truncate d13/d1e/f30 639304 0
2026-03-09T00:03:44.920 INFO:tasks.workunit.client.1.vm06.stdout:0/428: creat d3/d18/f8e x:0 0 0
2026-03-09T00:03:44.920 INFO:tasks.workunit.client.0.vm03.stdout:1/285: rename d4/d15/f59 to d4/d15/d5c/f5f 0
2026-03-09T00:03:44.921 INFO:tasks.workunit.client.0.vm03.stdout:1/286: dread - d4/d3a/d43/f49 zero size
2026-03-09T00:03:44.921 INFO:tasks.workunit.client.0.vm03.stdout:1/287: rename d4/d3a/d43 to d4/d3a/d43/d60 22
2026-03-09T00:03:44.922 INFO:tasks.workunit.client.1.vm06.stdout:1/400: truncate d6/d21/d2d/f5b 2976414 0
2026-03-09T00:03:44.923 INFO:tasks.workunit.client.1.vm06.stdout:4/357: creat d17/d24/d49/d5f/f76 x:0 0 0
2026-03-09T00:03:44.923 INFO:tasks.workunit.client.1.vm06.stdout:4/358: write d17/f1d [4017302,100719] 0
2026-03-09T00:03:44.930 INFO:tasks.workunit.client.1.vm06.stdout:7/430: mknod d0/df/d1a/d27/c79 0
2026-03-09T00:03:44.931 INFO:tasks.workunit.client.0.vm03.stdout:8/217: rmdir d7/df 39
2026-03-09T00:03:44.931 INFO:tasks.workunit.client.0.vm03.stdout:8/218: write d7/f25 [921430,35247] 0
2026-03-09T00:03:44.931 INFO:tasks.workunit.client.0.vm03.stdout:8/219: read - d7/df/f30 zero size
2026-03-09T00:03:44.931 INFO:tasks.workunit.client.0.vm03.stdout:8/220: chown d7/df/d1a/d2b/l35 1950828 1
2026-03-09T00:03:44.931 INFO:tasks.workunit.client.1.vm06.stdout:5/557: link d5/d1c/d23/f4c d5/d1c/d23/d34/d47/fbd 0
2026-03-09T00:03:44.931 INFO:tasks.workunit.client.1.vm06.stdout:5/558: chown d5/d1c/d21/d28/f33 3844 1
2026-03-09T00:03:44.934 INFO:tasks.workunit.client.1.vm06.stdout:3/497: mknod d11/d3f/d8d/cab 0
2026-03-09T00:03:44.950 INFO:tasks.workunit.client.0.vm03.stdout:7/251: symlink d2/d4/d1e/l53 0
2026-03-09T00:03:44.950 INFO:tasks.workunit.client.0.vm03.stdout:6/217: mkdir d13/d1e/d44/d4a 0
2026-03-09T00:03:44.950 INFO:tasks.workunit.client.0.vm03.stdout:3/170: rename d2/f6 to d2/db/d2d/f34 0
2026-03-09T00:03:44.950 INFO:tasks.workunit.client.0.vm03.stdout:0/226: rmdir d2/da/d36/d39 39
2026-03-09T00:03:44.950 INFO:tasks.workunit.client.0.vm03.stdout:8/221: mknod d7/df/c45 0
2026-03-09T00:03:44.951 INFO:tasks.workunit.client.0.vm03.stdout:5/215: creat d1c/d20/d44/d43/f4d x:0 0 0
2026-03-09T00:03:44.951 INFO:tasks.workunit.client.0.vm03.stdout:7/252: mkdir d2/d1f/d42/d46/d54 0
2026-03-09T00:03:44.951 INFO:tasks.workunit.client.0.vm03.stdout:7/253: fsync d2/f3 0
2026-03-09T00:03:44.951 INFO:tasks.workunit.client.1.vm06.stdout:5/559: dread d5/d1c/d23/f4f [4194304,4194304] 0
2026-03-09T00:03:44.951 INFO:tasks.workunit.client.1.vm06.stdout:2/526: creat d7/d1a/d25/d66/d87/f9b x:0 0 0
2026-03-09T00:03:44.951 INFO:tasks.workunit.client.1.vm06.stdout:1/401: link d6/fa d6/d63/f82 0
2026-03-09T00:03:44.951 INFO:tasks.workunit.client.1.vm06.stdout:4/359: link d17/d5b/f64 d17/d5b/f77 0
2026-03-09T00:03:44.951 INFO:tasks.workunit.client.1.vm06.stdout:7/431: creat d0/df/d1a/f7a x:0 0 0
2026-03-09T00:03:44.951 INFO:tasks.workunit.client.1.vm06.stdout:3/498: getdents d11/d28/d2e 0
2026-03-09T00:03:44.951 INFO:tasks.workunit.client.1.vm06.stdout:5/560: symlink d5/d44/d4b/lbe 0
2026-03-09T00:03:44.951 INFO:tasks.workunit.client.1.vm06.stdout:8/408: mkdir db/dd/d84 0
2026-03-09T00:03:44.951 INFO:tasks.workunit.client.1.vm06.stdout:8/409: dread - db/d53/d5c/f6f zero size
2026-03-09T00:03:44.951 INFO:tasks.workunit.client.1.vm06.stdout:8/410: dread - db/dd/d24/f6e zero size
2026-03-09T00:03:44.956 INFO:tasks.workunit.client.0.vm03.stdout:7/254: dread d2/d1f/f11 [0,4194304] 0
2026-03-09T00:03:44.959 INFO:tasks.workunit.client.0.vm03.stdout:6/218: dread - d13/d1e/f28 zero size
2026-03-09T00:03:44.963 INFO:tasks.workunit.client.1.vm06.stdout:2/527: symlink d7/da/d1c/l9c 0
2026-03-09T00:03:44.963 INFO:tasks.workunit.client.1.vm06.stdout:4/360: truncate d17/d21/f4b 2059444 0
2026-03-09T00:03:44.963 INFO:tasks.workunit.client.0.vm03.stdout:3/171: rename d2/c4 to d2/db/c35 0
2026-03-09T00:03:44.963 INFO:tasks.workunit.client.0.vm03.stdout:3/172: readlink d2/db/l22 0
2026-03-09T00:03:44.963 INFO:tasks.workunit.client.0.vm03.stdout:3/173: write d2/f5 [3503978,1650] 0
2026-03-09T00:03:44.973 INFO:tasks.workunit.client.0.vm03.stdout:8/222: getdents d7/df/d1a 0
2026-03-09T00:03:44.974 INFO:tasks.workunit.client.1.vm06.stdout:7/432: rename d0/df/d1a/d27/d4c/d52 to d0/df/d7b 0
2026-03-09T00:03:44.974 INFO:tasks.workunit.client.0.vm03.stdout:5/216: creat d1c/d20/f4e x:0 0 0
2026-03-09T00:03:45.000 INFO:tasks.workunit.client.0.vm03.stdout:5/217: dread - d1c/d20/d44/d3b/f3f zero size
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:7/255: mknod d2/d1f/d42/d46/d54/c55 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:7/256: creat d2/d1f/d3a/d31/d37/f56 x:0 0 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:6/219: symlink d13/d35/l4b 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:8/223: unlink d7/l2f 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:8/224: read d7/f18 [599393,61492] 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:5/218: mkdir d1c/d20/d44/d4f 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:6/220: mkdir d13/d35/d4c 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:6/221: write d13/d1e/f48 [845894,23837] 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:5/219: symlink d1c/d20/l50 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:5/220: dread - d1c/d20/d44/f46 zero size
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:6/222: creat d13/f4d x:0 0 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:6/223: dread - d13/f1a zero size
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:5/221: rmdir d1c/d20/d44/d3b 39
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:5/222: readlink d1c/d20/l24 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:5/223: write d1c/d20/d44/f34 [928018,23971] 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:5/224: chown f14 48090350 1
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:6/224: link d13/l2a d13/l4e 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:5/225: stat d1c/f1f 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.0.vm03.stdout:5/226: write d1c/f1f [4211937,92825] 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:3/499: mknod d11/d28/d4d/d89/d90/cac 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:3/500: creat d11/d28/d2e/d2f/d5b/d94/fad x:0 0 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:4/361: rename d17/d24/d3b/d5e/l6a to d17/d21/d4c/d50/l78 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:7/433: truncate d0/df/f13 1561121 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:7/434: chown d0/df/d1a/d3a/d4e/d5e 44383 1
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:3/501: unlink f3 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:3/502: stat d11/d28/d2e/l6d 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:4/362: rmdir d17/d21/d4c/d66 39
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:4/363: unlink d17/d24/d49/l43 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:4/364: mknod d17/d21/c79 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:4/365: unlink l0 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:4/366: mkdir d17/d24/d3b/d5e/d7a 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:4/367: truncate d17/f20 864522 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:4/368: write d17/d24/d49/d5f/f6b [778472,115287] 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:4/369: creat d17/d21/d4c/d66/f7b x:0 0 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:4/370: rmdir d17/d21/d32 39
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:4/371: fdatasync d17/d21/d4c/f57 0
2026-03-09T00:03:45.001 INFO:tasks.workunit.client.1.vm06.stdout:7/435: dread d0/df/d17/f38 [0,4194304] 0
2026-03-09T00:03:45.002 INFO:tasks.workunit.client.0.vm03.stdout:2/194: dwrite d8/d17/f2c [0,4194304] 0
2026-03-09T00:03:45.004 INFO:tasks.workunit.client.0.vm03.stdout:6/225: write d13/d1e/f2d [2124543,86658] 0
2026-03-09T00:03:45.004 INFO:tasks.workunit.client.1.vm06.stdout:7/436: rename d0/l45 to d0/l7c 0
2026-03-09T00:03:45.004 INFO:tasks.workunit.client.1.vm06.stdout:7/437: fdatasync d0/df/d1a/d3a/d4e/d5e/f73 0
2026-03-09T00:03:45.004 INFO:tasks.workunit.client.1.vm06.stdout:7/438: chown d0/df/c1d 651275 1
2026-03-09T00:03:45.013 INFO:tasks.workunit.client.1.vm06.stdout:7/439: creat d0/df/d1a/d3f/f7d x:0 0 0
2026-03-09T00:03:45.013 INFO:tasks.workunit.client.1.vm06.stdout:7/440: write d0/df/d1a/d35/f61 [174470,100683] 0
2026-03-09T00:03:45.014 INFO:tasks.workunit.client.0.vm03.stdout:2/195: truncate d8/d1b/f1e 532676 0
2026-03-09T00:03:45.014 INFO:tasks.workunit.client.0.vm03.stdout:2/196: fsync d8/d1b/f1f 0
2026-03-09T00:03:45.014 INFO:tasks.workunit.client.0.vm03.stdout:2/197: chown f7 0 1
2026-03-09T00:03:45.014 INFO:tasks.workunit.client.0.vm03.stdout:7/257: write d2/d1f/d35/f3e [781061,64472] 0
2026-03-09T00:03:45.014 INFO:tasks.workunit.client.0.vm03.stdout:7/258: write d2/d1f/d3a/d31/f3f [1460971,38396] 0
2026-03-09T00:03:45.016 INFO:tasks.workunit.client.0.vm03.stdout:1/288: dread f2 [0,4194304] 0
2026-03-09T00:03:45.017 INFO:tasks.workunit.client.1.vm06.stdout:7/441: creat d0/df/d17/f7e x:0 0 0
2026-03-09T00:03:45.020 INFO:tasks.workunit.client.0.vm03.stdout:2/198: rename c0 to d8/d1b/d2a/d2e/c40 0
2026-03-09T00:03:45.045 INFO:tasks.workunit.client.0.vm03.stdout:2/199: chown d8/d1b/f32 15193 1
2026-03-09T00:03:45.045 INFO:tasks.workunit.client.0.vm03.stdout:2/200: creat d8/d1b/d24/f41 x:0 0 0
2026-03-09T00:03:45.045 INFO:tasks.workunit.client.1.vm06.stdout:7/442: mknod d0/d39/c7f 0
2026-03-09T00:03:45.045 INFO:tasks.workunit.client.0.vm03.stdout:1/289: mkdir d4/d3a/d61 0
2026-03-09T00:03:45.092 INFO:tasks.workunit.client.1.vm06.stdout:6/424: dwrite d4/d27/d3e/d57/f5c [0,4194304] 0
2026-03-09T00:03:45.094 INFO:tasks.workunit.client.0.vm03.stdout:9/207: dwrite d15/d1c/d21/f25 [0,4194304] 0
2026-03-09T00:03:45.097 INFO:tasks.workunit.client.1.vm06.stdout:6/425: unlink d4/d27/d42/d52/f80 0
2026-03-09T00:03:45.100 INFO:tasks.workunit.client.1.vm06.stdout:6/426: write d4/ff [1517753,27090] 0
2026-03-09T00:03:45.100 INFO:tasks.workunit.client.1.vm06.stdout:6/427: dread - d4/d27/d3e/f7a zero size
2026-03-09T00:03:45.100 INFO:tasks.workunit.client.1.vm06.stdout:6/428: fsync d4/f2a 0
2026-03-09T00:03:45.104 INFO:tasks.workunit.client.1.vm06.stdout:2/528: dwrite d7/da/d55/f5b [0,4194304] 0
2026-03-09T00:03:45.104 INFO:tasks.workunit.client.1.vm06.stdout:2/529: write d7/da/db/de/f53 [1012057,112678] 0
2026-03-09T00:03:45.106 INFO:tasks.workunit.client.1.vm06.stdout:2/530: getdents d7/da/db 0
2026-03-09T00:03:45.107 INFO:tasks.workunit.client.1.vm06.stdout:2/531: mkdir d7/da/d4e/d57/d9d 0
2026-03-09T00:03:45.108 INFO:tasks.workunit.client.1.vm06.stdout:2/532: creat d7/da/d1c/f9e x:0 0 0
2026-03-09T00:03:45.117 INFO:tasks.workunit.client.1.vm06.stdout:9/358: dwrite d1/d3/f11 [0,4194304] 0
2026-03-09T00:03:45.136 INFO:tasks.workunit.client.0.vm03.stdout:5/227: dwrite d1c/d20/f39 [0,4194304] 0
2026-03-09T00:03:45.136 INFO:tasks.workunit.client.0.vm03.stdout:5/228: chown f14 14012 1
2026-03-09T00:03:45.136 INFO:tasks.workunit.client.0.vm03.stdout:5/229: readlink d1c/d20/d44/l40 0
2026-03-09T00:03:45.138 INFO:tasks.workunit.client.0.vm03.stdout:8/225: dread d7/f10 [0,4194304] 0
2026-03-09T00:03:45.140 INFO:tasks.workunit.client.0.vm03.stdout:5/230: rmdir d1c/d20/d44/d3b 39
2026-03-09T00:03:45.151 INFO:tasks.workunit.client.0.vm03.stdout:5/231: chown d1c/d20/d44/d43 22359 1
2026-03-09T00:03:45.152 INFO:tasks.workunit.client.0.vm03.stdout:5/232: stat f12 0
2026-03-09T00:03:45.152 INFO:tasks.workunit.client.0.vm03.stdout:5/233: stat d1c/f1f 0
2026-03-09T00:03:45.152 INFO:tasks.workunit.client.0.vm03.stdout:8/226: mknod d7/df/d1e/c46 0
2026-03-09T00:03:45.152 INFO:tasks.workunit.client.0.vm03.stdout:8/227: creat d7/df/d1e/d3f/f47 x:0 0 0
2026-03-09T00:03:45.158 INFO:tasks.workunit.client.0.vm03.stdout:4/277: dwrite d7/f15 [0,4194304] 0
2026-03-09T00:03:45.160 INFO:tasks.workunit.client.0.vm03.stdout:4/278: unlink d7/d27/l3b 0
2026-03-09T00:03:45.163 INFO:tasks.workunit.client.0.vm03.stdout:4/279: rename d7/d23/c44 to d7/d20/d29/d54/c5a 0
2026-03-09T00:03:45.164 INFO:tasks.workunit.client.0.vm03.stdout:9/208: dread d15/f1b [0,4194304] 0
2026-03-09T00:03:45.164 INFO:tasks.workunit.client.0.vm03.stdout:9/209: mknod d15/d1c/d28/d30/c40 0
2026-03-09T00:03:45.218 INFO:tasks.workunit.client.1.vm06.stdout:0/429: dwrite d3/d18/d2c/d2d/d31/f89 [0,4194304] 0
2026-03-09T00:03:45.219 INFO:tasks.workunit.client.1.vm06.stdout:0/430: write d3/d18/d28/d45/f52 [937418,122289] 0
2026-03-09T00:03:45.219 INFO:tasks.workunit.client.0.vm03.stdout:3/174: dwrite d2/db/f21 [0,4194304] 0
2026-03-09T00:03:45.219 INFO:tasks.workunit.client.0.vm03.stdout:3/175: chown d2/db/l22 7472 1
2026-03-09T00:03:45.226 INFO:tasks.workunit.client.0.vm03.stdout:4/280: dread d7/d20/d29/f2a [0,4194304] 0
2026-03-09T00:03:45.231 INFO:tasks.workunit.client.1.vm06.stdout:0/431: link d3/d18/d1f/d39/d49/d60/l8d d3/d18/d1f/l8f 0
2026-03-09T00:03:45.231 INFO:tasks.workunit.client.1.vm06.stdout:0/432: chown d3/d18/d2c/d2d/d31/f88 99704499 1
2026-03-09T00:03:45.231 INFO:tasks.workunit.client.1.vm06.stdout:0/433: mkdir d3/d18/d2c/d2d/d74/d90 0
2026-03-09T00:03:45.231 INFO:tasks.workunit.client.1.vm06.stdout:0/434: rename d3/d18/f25 to d3/d18/d1f/d39/d69/f91 0
2026-03-09T00:03:45.231 INFO:tasks.workunit.client.1.vm06.stdout:0/435: truncate d3/d18/d2c/d2d/d31/f88 829518 0
2026-03-09T00:03:45.231 INFO:tasks.workunit.client.1.vm06.stdout:0/436: unlink d3/d18/d2c/d2d/d74/l8b 0
2026-03-09T00:03:45.231 INFO:tasks.workunit.client.1.vm06.stdout:0/437: creat d3/d18/d1f/d39/d49/d60/f92 x:0 0 0
2026-03-09T00:03:45.231 INFO:tasks.workunit.client.1.vm06.stdout:0/438: mknod d3/d18/d1f/c93 0
2026-03-09T00:03:45.232 INFO:tasks.workunit.client.1.vm06.stdout:0/439: write d3/d18/d28/d45/f52 [533205,74532] 0
2026-03-09T00:03:45.235 INFO:tasks.workunit.client.1.vm06.stdout:0/440: truncate d3/f10 319763 0
2026-03-09T00:03:45.235 INFO:tasks.workunit.client.1.vm06.stdout:0/441: fsync d3/f1e 0
2026-03-09T00:03:45.235 INFO:tasks.workunit.client.1.vm06.stdout:0/442: rmdir d3/d18/d2c 39
2026-03-09T00:03:45.236 INFO:tasks.workunit.client.0.vm03.stdout:4/281: dread d7/f1d [0,4194304] 0
2026-03-09T00:03:45.236 INFO:tasks.workunit.client.1.vm06.stdout:0/443: link d3/d18/d1f/d39/d3b/c33 d3/d18/d79/c94 0
2026-03-09T00:03:45.250 INFO:tasks.workunit.client.0.vm03.stdout:0/227: dwrite d2/d1f/f43 [0,4194304] 0
2026-03-09T00:03:45.250 INFO:tasks.workunit.client.0.vm03.stdout:0/228: chown d2/da/d1a/c20 117 1
2026-03-09T00:03:45.250 INFO:tasks.workunit.client.0.vm03.stdout:7/259: dwrite d2/d1f/d3a/f19 [4194304,4194304] 0
2026-03-09T00:03:45.250 INFO:tasks.workunit.client.0.vm03.stdout:4/282: dread d7/d27/f52 [0,4194304] 0
2026-03-09T00:03:45.250 INFO:tasks.workunit.client.0.vm03.stdout:6/226: dwrite d13/d1e/f3e [0,4194304] 0
2026-03-09T00:03:45.250 INFO:tasks.workunit.client.0.vm03.stdout:4/283: chown d7/d20/f33 198317360 1
2026-03-09T00:03:45.254 INFO:tasks.workunit.client.1.vm06.stdout:0/444: write d3/d18/d2c/d2d/f40 [3035607,30966] 0
2026-03-09T00:03:45.254 INFO:tasks.workunit.client.1.vm06.stdout:0/445: fdatasync d3/f19 0
2026-03-09T00:03:45.255 INFO:tasks.workunit.client.0.vm03.stdout:0/229: unlink d2/d1f/c42 0
2026-03-09T00:03:45.255 INFO:tasks.workunit.client.0.vm03.stdout:0/230: stat d2/da/f2d 0
2026-03-09T00:03:45.259 INFO:tasks.workunit.client.1.vm06.stdout:0/446: dread d3/d18/d2c/d2d/f85 [0,4194304] 0
2026-03-09T00:03:45.259 INFO:tasks.workunit.client.1.vm06.stdout:0/447: write d3/d18/d3c/f87 [285423,51648] 0
2026-03-09T00:03:45.261 INFO:tasks.workunit.client.0.vm03.stdout:2/201: dwrite d8/f3e [0,4194304] 0
2026-03-09T00:03:45.261 INFO:tasks.workunit.client.0.vm03.stdout:2/202: readlink d8/le 0
2026-03-09T00:03:45.262 INFO:tasks.workunit.client.0.vm03.stdout:1/290: dwrite d4/d3a/f41 [0,4194304] 0
2026-03-09T00:03:45.266 INFO:tasks.workunit.client.1.vm06.stdout:0/448: symlink d3/d18/d1f/d39/d69/l95 0
2026-03-09T00:03:45.267 INFO:tasks.workunit.client.0.vm03.stdout:6/227: creat d13/d35/d4c/f4f x:0 0 0
2026-03-09T00:03:45.268 INFO:tasks.workunit.client.0.vm03.stdout:6/228: dread d13/d1e/f48 [0,4194304] 0
2026-03-09T00:03:45.274 INFO:tasks.workunit.client.0.vm03.stdout:5/234: dwrite f18 [0,4194304] 0
2026-03-09T00:03:45.276 INFO:tasks.workunit.client.0.vm03.stdout:9/210: dread f8 [0,4194304] 0
2026-03-09T00:03:45.276 INFO:tasks.workunit.client.0.vm03.stdout:9/211: chown d15/d1c/d21/f25 743 1
2026-03-09T00:03:45.276 INFO:tasks.workunit.client.0.vm03.stdout:9/212: creat d15/d1c/d21/f41 x:0 0 0
2026-03-09T00:03:45.277 INFO:tasks.workunit.client.1.vm06.stdout:9/359: dwrite d1/d3/d4f/d52/f5e [4194304,4194304] 0
2026-03-09T00:03:45.279 INFO:tasks.workunit.client.0.vm03.stdout:4/284: symlink d7/d20/l5b 0
2026-03-09T00:03:45.280 INFO:tasks.workunit.client.0.vm03.stdout:5/235: write d1c/d20/d44/f42 [4112951,4956] 0
2026-03-09T00:03:45.280 INFO:tasks.workunit.client.0.vm03.stdout:5/236: truncate d1c/d20/d44/d3b/f3c 965039 0
2026-03-09T00:03:45.280 INFO:tasks.workunit.client.0.vm03.stdout:5/237: chown d1c/d20/d44/d3b/f3c 1048 1
2026-03-09T00:03:45.283 INFO:tasks.workunit.client.0.vm03.stdout:0/231: mkdir d2/da/d36/d39/d4b 0
2026-03-09T00:03:45.283 INFO:tasks.workunit.client.0.vm03.stdout:0/232: dread - d2/da/d36/d39/f48 zero size
2026-03-09T00:03:45.288 INFO:tasks.workunit.client.0.vm03.stdout:3/176: rename d2/db/d2d/f34 to d2/db/d2d/f36 0
2026-03-09T00:03:45.297 INFO:tasks.workunit.client.0.vm03.stdout:3/177: read d2/db/f21 [2188248,12338] 0
2026-03-09T00:03:45.302 INFO:tasks.workunit.client.1.vm06.stdout:0/449: dwrite d3/f29 [4194304,4194304] 0
2026-03-09T00:03:45.302 INFO:tasks.workunit.client.1.vm06.stdout:0/450: write d3/d18/f14 [1370330,67608] 0
2026-03-09T00:03:45.318 INFO:tasks.workunit.client.1.vm06.stdout:9/360: creat d1/d3/d4f/f71 x:0 0 0
2026-03-09T00:03:45.324 INFO:tasks.workunit.client.0.vm03.stdout:1/291: creat d4/d15/d5c/f62 x:0 0 0
2026-03-09T00:03:45.330 INFO:tasks.workunit.client.0.vm03.stdout:2/203: fsync d8/f11 0
2026-03-09T00:03:45.338 INFO:tasks.workunit.client.0.vm03.stdout:7/260: rmdir d2/d1f 39
2026-03-09T00:03:45.338 INFO:tasks.workunit.client.0.vm03.stdout:7/261: readlink d2/d4/d1e/l21 0
2026-03-09T00:03:45.338 INFO:tasks.workunit.client.0.vm03.stdout:6/229: mknod d13/c50 0
2026-03-09T00:03:45.354 INFO:tasks.workunit.client.0.vm03.stdout:4/285: symlink d7/d20/d29/d4e/l5c 0
2026-03-09T00:03:45.355 INFO:tasks.workunit.client.0.vm03.stdout:4/286: stat d7/d20/d29/l57 0
2026-03-09T00:03:45.355 INFO:tasks.workunit.client.1.vm06.stdout:9/361: creat d1/d3/d12/f72 x:0 0 0
2026-03-09T00:03:45.369 INFO:tasks.workunit.client.0.vm03.stdout:5/238: unlink d1c/d20/d44/d3b/f3f 0
2026-03-09T00:03:45.370 INFO:tasks.workunit.client.0.vm03.stdout:0/233: creat d2/da/d36/d39/d4b/f4c x:0 0 0
2026-03-09T00:03:45.378 INFO:tasks.workunit.client.0.vm03.stdout:0/234: rename d2 to d2/da/dd/d49/d4d 22
2026-03-09T00:03:45.378 INFO:tasks.workunit.client.0.vm03.stdout:1/292: mknod d4/d15/d5c/c63 0
2026-03-09T00:03:45.378 INFO:tasks.workunit.client.0.vm03.stdout:2/204: mkdir d8/d1b/d2a/d42 0
2026-03-09T00:03:45.385 INFO:tasks.workunit.client.0.vm03.stdout:6/230: readlink l11 0
2026-03-09T00:03:45.386 INFO:tasks.workunit.client.0.vm03.stdout:0/235: write d2/fe [3321395,116104] 0
2026-03-09T00:03:45.386 INFO:tasks.workunit.client.0.vm03.stdout:9/213: symlink d15/d1c/d28/l42 0
2026-03-09T00:03:45.386 INFO:tasks.workunit.client.0.vm03.stdout:9/214: dread - d15/d1c/d21/f41 zero size
2026-03-09T00:03:45.393 INFO:tasks.workunit.client.0.vm03.stdout:0/236: read d2/f32 [1933014,67018] 0
2026-03-09T00:03:45.395 INFO:tasks.workunit.client.0.vm03.stdout:7/262: dwrite d2/d1f/d3a/d31/d37/f56 [0,4194304] 0
2026-03-09T00:03:45.395 INFO:tasks.workunit.client.0.vm03.stdout:7/263: chown d2/d4/f13 1860 1
2026-03-09T00:03:45.399 INFO:tasks.workunit.client.0.vm03.stdout:2/205: dread d8/d1b/d24/f2f [0,4194304] 0
2026-03-09T00:03:45.399 INFO:tasks.workunit.client.0.vm03.stdout:2/206: fdatasync d8/d1b/d24/f41 0
2026-03-09T00:03:45.408 INFO:tasks.workunit.client.1.vm06.stdout:9/362: dwrite d1/f45 [4194304,4194304] 0
2026-03-09T00:03:45.414 INFO:tasks.workunit.client.0.vm03.stdout:4/287: creat d7/f5d x:0 0 0
2026-03-09T00:03:45.424 INFO:tasks.workunit.client.0.vm03.stdout:6/231: rename l11 to d13/d1e/d44/d4a/l51 0
2026-03-09T00:03:45.429 INFO:tasks.workunit.client.0.vm03.stdout:0/237: dread d2/da/dd/f14 [0,4194304] 0
2026-03-09T00:03:45.431 INFO:tasks.workunit.client.0.vm03.stdout:2/207: mkdir d8/d1b/d2a/d42/d43 0
2026-03-09T00:03:45.434 INFO:tasks.workunit.client.0.vm03.stdout:7/264: rename d2/c12 to d2/d1f/d42/d43/c57 0
2026-03-09T00:03:45.437 INFO:tasks.workunit.client.0.vm03.stdout:0/238: unlink d2/da/d1a/c3c 0
2026-03-09T00:03:45.441 INFO:tasks.workunit.client.0.vm03.stdout:0/239: chown d2/da/c10 0 1
2026-03-09T00:03:45.441 INFO:tasks.workunit.client.0.vm03.stdout:7/265: mknod d2/d1f/d3a/d31/d37/d39/c58 0
2026-03-09T00:03:45.441 INFO:tasks.workunit.client.0.vm03.stdout:7/266: chown d2/d1f/d3a/d24/c38 199814047 1
2026-03-09T00:03:45.442 INFO:tasks.workunit.client.1.vm06.stdout:8/411: sync
2026-03-09T00:03:45.442 INFO:tasks.workunit.client.1.vm06.stdout:6/429: sync
2026-03-09T00:03:45.442 INFO:tasks.workunit.client.1.vm06.stdout:8/412: chown db/d1e/f25 0 1
2026-03-09T00:03:45.442 INFO:tasks.workunit.client.1.vm06.stdout:6/430: read d4/d27/d3e/f41 [1873032,108998] 0
2026-03-09T00:03:45.442 INFO:tasks.workunit.client.1.vm06.stdout:6/431: fsync d4/d16/d53/f82 0
2026-03-09T00:03:45.442 INFO:tasks.workunit.client.1.vm06.stdout:5/561: sync
2026-03-09T00:03:45.442 INFO:tasks.workunit.client.1.vm06.stdout:5/562: chown d5/d44/d4b/c4d 0 1
2026-03-09T00:03:45.442 INFO:tasks.workunit.client.1.vm06.stdout:1/402: sync
2026-03-09T00:03:45.442 INFO:tasks.workunit.client.1.vm06.stdout:1/403: fdatasync d6/f81 0
2026-03-09T00:03:45.442 INFO:tasks.workunit.client.1.vm06.stdout:1/404: chown d6/d4c/d51/d7f 2394 1
2026-03-09T00:03:45.442 INFO:tasks.workunit.client.1.vm06.stdout:4/372: sync
2026-03-09T00:03:45.442 INFO:tasks.workunit.client.1.vm06.stdout:4/373: fsync d17/d24/d49/d5f/f6b 0
2026-03-09T00:03:45.442 INFO:tasks.workunit.client.1.vm06.stdout:2/533: sync
2026-03-09T00:03:45.442 INFO:tasks.workunit.client.1.vm06.stdout:2/534: fsync d7/d1b/d31/f90 0
2026-03-09T00:03:45.442 INFO:tasks.workunit.client.1.vm06.stdout:7/443: sync
2026-03-09T00:03:45.442 INFO:tasks.workunit.client.1.vm06.stdout:7/444: chown d0/df/d1a/l33 0 1
2026-03-09T00:03:45.450 INFO:tasks.workunit.client.1.vm06.stdout:6/432: creat d4/d27/d42/d4b/f83 x:0 0 0
2026-03-09T00:03:45.451 INFO:tasks.workunit.client.1.vm06.stdout:5/563: symlink d5/d1c/d21/d28/d5e/lbf 0
2026-03-09T00:03:45.451 INFO:tasks.workunit.client.1.vm06.stdout:5/564: fdatasync d5/d1c/d21/d28/f3b 0
2026-03-09T00:03:45.451 INFO:tasks.workunit.client.0.vm03.stdout:6/232: getdents d13 0
2026-03-09T00:03:45.451 INFO:tasks.workunit.client.0.vm03.stdout:6/233: fdatasync d13/f1c 0
2026-03-09T00:03:45.453 INFO:tasks.workunit.client.1.vm06.stdout:1/405: mkdir d6/d4c/d71/d83 0
2026-03-09T00:03:45.453 INFO:tasks.workunit.client.1.vm06.stdout:4/374: mknod d17/d21/d4c/c7c 0
2026-03-09T00:03:45.453 INFO:tasks.workunit.client.0.vm03.stdout:6/234: stat d13/c2b 0
2026-03-09T00:03:45.454 INFO:tasks.workunit.client.1.vm06.stdout:2/535: link d7/da/d1c/f5f d7/da/d4e/d57/f9f 0
2026-03-09T00:03:45.455 INFO:tasks.workunit.client.1.vm06.stdout:0/451: sync
2026-03-09T00:03:45.455 INFO:tasks.workunit.client.1.vm06.stdout:3/503: sync
2026-03-09T00:03:45.457 INFO:tasks.workunit.client.0.vm03.stdout:6/235: truncate d13/f17 840981 0
2026-03-09T00:03:45.457 INFO:tasks.workunit.client.0.vm03.stdout:6/236: write d13/f14 [113096,66334] 0
2026-03-09T00:03:45.457 INFO:tasks.workunit.client.0.vm03.stdout:6/237: fsync d13/d1e/f48 0
2026-03-09T00:03:45.457 INFO:tasks.workunit.client.0.vm03.stdout:6/238: stat d13/d1e/d44/f49 0
2026-03-09T00:03:45.458 INFO:tasks.workunit.client.1.vm06.stdout:5/565: rename d5/d44/d4b/d92/d49/da0/cba to d5/d1c/d68/cc0 0
2026-03-09T00:03:45.462 INFO:tasks.workunit.client.1.vm06.stdout:6/433: read d4/f36 [1854452,1561] 0
2026-03-09T00:03:45.462 INFO:tasks.workunit.client.1.vm06.stdout:6/434: fdatasync d4/d27/d3e/f55 0
2026-03-09T00:03:45.462 INFO:tasks.workunit.client.1.vm06.stdout:6/435: truncate d4/d27/d3e/f7a 152746 0
2026-03-09T00:03:45.462 INFO:tasks.workunit.client.1.vm06.stdout:6/436: creat d4/d27/f84 x:0 0 0
2026-03-09T00:03:45.462 INFO:tasks.workunit.client.1.vm06.stdout:4/375: unlink d17/d24/d3b/c72 0
2026-03-09T00:03:45.462 INFO:tasks.workunit.client.1.vm06.stdout:7/445: getdents d0/df/d17 0
2026-03-09T00:03:45.462 INFO:tasks.workunit.client.1.vm06.stdout:1/406: dread d6/d21/f3d [0,4194304] 0
2026-03-09T00:03:45.463 INFO:tasks.workunit.client.1.vm06.stdout:2/536: rename d7/f26 to d7/da/d4e/d57/d9d/fa0 0
2026-03-09T00:03:45.464 INFO:tasks.workunit.client.1.vm06.stdout:5/566: creat d5/d1c/d21/d28/d5e/d66/d78/fc1 x:0 0 0
2026-03-09T00:03:45.465 INFO:tasks.workunit.client.1.vm06.stdout:6/437: mknod d4/d27/d3e/d78/c85 0
2026-03-09T00:03:45.465 INFO:tasks.workunit.client.1.vm06.stdout:6/438: write d4/d27/d3e/d57/f79 [232454,41427] 0
2026-03-09T00:03:45.466 INFO:tasks.workunit.client.1.vm06.stdout:4/376: unlink d17/d21/c73 0
2026-03-09T00:03:45.469 INFO:tasks.workunit.client.1.vm06.stdout:1/407: unlink d6/d21/d2d/d3b/f7e 0
2026-03-09T00:03:45.481 INFO:tasks.workunit.client.1.vm06.stdout:0/452: write d3/d18/d2c/d2d/d31/f5d [1006351,11093] 0
2026-03-09T00:03:45.482 INFO:tasks.workunit.client.1.vm06.stdout:5/567: getdents d5/d44/d84 0
2026-03-09T00:03:45.482 INFO:tasks.workunit.client.1.vm06.stdout:2/537: symlink d7/d1a/d25/d97/la1 0
2026-03-09T00:03:45.482 INFO:tasks.workunit.client.1.vm06.stdout:5/568: readlink d5/l88 0
2026-03-09T00:03:45.482 INFO:tasks.workunit.client.1.vm06.stdout:2/538: chown d7/d1a/l2a 1471 1
2026-03-09T00:03:45.482 INFO:tasks.workunit.client.1.vm06.stdout:6/439: symlink d4/d27/d42/d52/l86 0
2026-03-09T00:03:45.482 INFO:tasks.workunit.client.1.vm06.stdout:6/440: stat d4/d16/f5e 0
2026-03-09T00:03:45.482 INFO:tasks.workunit.client.1.vm06.stdout:6/441: fdatasync d4/d16/d53/f82 0
2026-03-09T00:03:45.482 INFO:tasks.workunit.client.1.vm06.stdout:3/504: write d11/d28/f4f [30653,97056] 0
2026-03-09T00:03:45.482 INFO:tasks.workunit.client.1.vm06.stdout:4/377: symlink d17/d21/l7d 0
2026-03-09T00:03:45.483 INFO:tasks.workunit.client.0.vm03.stdout:8/228: sync
2026-03-09T00:03:45.484 INFO:tasks.workunit.client.1.vm06.stdout:0/453: getdents d3/d18/d28/d45 0
2026-03-09T00:03:45.484 INFO:tasks.workunit.client.0.vm03.stdout:8/229: creat d7/f48 x:0 0 0
2026-03-09T00:03:45.485 INFO:tasks.workunit.client.1.vm06.stdout:6/442: mknod d4/d16/c87 0
2026-03-09T00:03:45.486 INFO:tasks.workunit.client.1.vm06.stdout:3/505: rename d11/d28/c2a to d11/d28/d4d/d89/cae 0
2026-03-09T00:03:45.486 INFO:tasks.workunit.client.1.vm06.stdout:3/506: creat d11/d28/d2e/d2f/d36/faf x:0 0 0
2026-03-09T00:03:45.489 INFO:tasks.workunit.client.1.vm06.stdout:4/378: unlink d17/d21/d4c/f57 0
2026-03-09T00:03:45.489 INFO:tasks.workunit.client.1.vm06.stdout:0/454: creat d3/d18/d1f/d44/d6a/f96 x:0 0 0
2026-03-09T00:03:45.489 INFO:tasks.workunit.client.1.vm06.stdout:4/379: chown d17/d24/d49/l48 1317 1
2026-03-09T00:03:45.491 INFO:tasks.workunit.client.1.vm06.stdout:4/380: dread d17/d24/d49/d5f/f6b [0,4194304] 0
2026-03-09T00:03:45.495 INFO:tasks.workunit.client.1.vm06.stdout:3/507: symlink d11/d28/d4d/d89/lb0 0
2026-03-09T00:03:45.496 INFO:tasks.workunit.client.1.vm06.stdout:4/381: unlink d17/d24/d3b/d5e/f6c 0
2026-03-09T00:03:45.496 INFO:tasks.workunit.client.1.vm06.stdout:3/508: mkdir d11/d28/d2e/d2f/d5b/d5f/db1 0
2026-03-09T00:03:45.496 INFO:tasks.workunit.client.1.vm06.stdout:3/509: stat d11/d28/d2e/d2f/d36/f55 0
2026-03-09T00:03:45.496 INFO:tasks.workunit.client.1.vm06.stdout:3/510: mkdir d11/d28/d2e/db2 0
2026-03-09T00:03:45.497 INFO:tasks.workunit.client.1.vm06.stdout:3/511: creat d11/d28/d2e/d2f/d5b/d94/fb3 x:0 0 0
2026-03-09T00:03:45.497 INFO:tasks.workunit.client.1.vm06.stdout:3/512: mknod d11/d28/d2e/d2f/d5b/d5f/d91/da9/cb4 0
2026-03-09T00:03:45.497 INFO:tasks.workunit.client.1.vm06.stdout:0/455: dread d3/d18/d2c/f4d [0,4194304] 0
2026-03-09T00:03:45.499 INFO:tasks.workunit.client.1.vm06.stdout:0/456: creat d3/d18/d28/d45/f97 x:0 0 0
2026-03-09T00:03:45.558 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:45 vm03.local ceph-mon[52346]: Updating vm03:/etc/ceph/ceph.client.admin.keyring
2026-03-09T00:03:45.558 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:45 vm03.local ceph-mon[52346]: Updating vm06:/etc/ceph/ceph.client.admin.keyring
2026-03-09T00:03:45.558 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:45 vm03.local ceph-mon[52346]: Updating vm03:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.client.admin.keyring
2026-03-09T00:03:45.558 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:45 vm03.local ceph-mon[52346]: Updating vm06:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.client.admin.keyring
2026-03-09T00:03:45.558 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:45 vm03.local ceph-mon[52346]: pgmap v7: 65 pgs: 65 active+clean; 1.5 GiB data, 5.8 GiB used, 114 GiB / 120 GiB avail; 93 MiB/s rd, 121 MiB/s wr, 196 op/s
2026-03-09T00:03:45.558 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:45 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:45.558 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:45 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:45.558 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:45 vm03.local ceph-mon[52346]: mgrmap e25: vm06.rzcvhn(active, since 8s), standbys: vm03.yvcons
2026-03-09T00:03:45.558 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:45 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mgr metadata", "who": "vm03.yvcons", "id": "vm03.yvcons"}]: dispatch
2026-03-09T00:03:45.558 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:45 vm03.local ceph-mon[52346]: Reconfiguring prometheus.vm03 (dependencies changed)...
2026-03-09T00:03:45.558 INFO:tasks.workunit.client.0.vm03.stdout:1/293: dwrite d4/d6/f8 [4194304,4194304] 0
2026-03-09T00:03:45.559 INFO:tasks.workunit.client.0.vm03.stdout:1/294: chown d4/c3b 33184212 1
2026-03-09T00:03:45.559 INFO:tasks.workunit.client.0.vm03.stdout:1/295: write d4/d15/f18 [2024500,126817] 0
2026-03-09T00:03:45.559 INFO:tasks.workunit.client.0.vm03.stdout:1/296: stat d4/l7 0
2026-03-09T00:03:45.563 INFO:tasks.workunit.client.0.vm03.stdout:3/178: sync
2026-03-09T00:03:45.563 INFO:tasks.workunit.client.0.vm03.stdout:3/179: fdatasync d2/db/f1a 0
2026-03-09T00:03:45.563 INFO:tasks.workunit.client.0.vm03.stdout:3/180: creat d2/db/d2d/f37 x:0 0 0
2026-03-09T00:03:45.563 INFO:tasks.workunit.client.0.vm03.stdout:5/239: truncate d1c/d20/d44/d3b/f3c 472451 0
2026-03-09T00:03:45.566 INFO:tasks.workunit.client.1.vm06.stdout:9/363: dwrite d1/d4/f24 [0,4194304] 0
2026-03-09T00:03:45.568 INFO:tasks.workunit.client.0.vm03.stdout:3/181: symlink d2/db/l38 0
2026-03-09T00:03:45.569 INFO:tasks.workunit.client.1.vm06.stdout:9/364: mkdir d1/d73 0
2026-03-09T00:03:45.569 INFO:tasks.workunit.client.1.vm06.stdout:9/365: chown d1 505318 1
2026-03-09T00:03:45.569 INFO:tasks.workunit.client.1.vm06.stdout:0/457: dread d3/d18/d1f/d39/d69/f91 [0,4194304] 0
2026-03-09T00:03:45.569 INFO:tasks.workunit.client.0.vm03.stdout:3/182: rmdir d2/db 39
2026-03-09T00:03:45.569 INFO:tasks.workunit.client.0.vm03.stdout:5/240: mkdir d1c/d51 0
2026-03-09T00:03:45.569 INFO:tasks.workunit.client.0.vm03.stdout:5/241: fdatasync fb 0
2026-03-09T00:03:45.570 INFO:tasks.workunit.client.1.vm06.stdout:0/458: truncate d3/f7 2327854 0
2026-03-09T00:03:45.572 INFO:tasks.workunit.client.1.vm06.stdout:4/382: dread d17/f19 [0,4194304] 0
2026-03-09T00:03:45.572 INFO:tasks.workunit.client.0.vm03.stdout:3/183: truncate d2/db/f17 223743 0
2026-03-09T00:03:45.573 INFO:tasks.workunit.client.0.vm03.stdout:5/242: link ff d1c/d20/d44/f52 0
2026-03-09T00:03:45.575 INFO:tasks.workunit.client.1.vm06.stdout:0/459: symlink d3/l98 0
2026-03-09T00:03:45.575 INFO:tasks.workunit.client.1.vm06.stdout:4/383: creat d17/d24/d3b/d54/f7e x:0 0 0
2026-03-09T00:03:45.576 INFO:tasks.workunit.client.0.vm03.stdout:3/184: symlink d2/l39 0
2026-03-09T00:03:45.576 INFO:tasks.workunit.client.0.vm03.stdout:3/185: chown d2/f16 2 1
2026-03-09T00:03:45.577 INFO:tasks.workunit.client.1.vm06.stdout:0/460: symlink d3/d18/d2c/d2d/l99 0
2026-03-09T00:03:45.577 INFO:tasks.workunit.client.1.vm06.stdout:0/461: fdatasync d3/d18/f68 0
2026-03-09T00:03:45.577 INFO:tasks.workunit.client.1.vm06.stdout:0/462: readlink d3/l98 0
2026-03-09T00:03:45.577 INFO:tasks.workunit.client.0.vm03.stdout:5/243: link d1c/d20/d44/f42 d1c/d20/d44/d43/f53 0
2026-03-09T00:03:45.578 INFO:tasks.workunit.client.0.vm03.stdout:3/186: creat d2/db/f3a x:0 0 0
2026-03-09T00:03:45.590 INFO:tasks.workunit.client.1.vm06.stdout:4/384: rmdir d17/d24/d49 39
2026-03-09T00:03:45.590 INFO:tasks.workunit.client.1.vm06.stdout:4/385: mknod d17/d24/d49/d5f/c7f 0
2026-03-09T00:03:45.590 INFO:tasks.workunit.client.1.vm06.stdout:4/386: creat d17/d24/d3b/d54/f80 x:0 0 0
2026-03-09T00:03:45.593 INFO:tasks.workunit.client.0.vm03.stdout:4/288: dwrite d7/f15 [0,4194304] 0
2026-03-09T00:03:45.603 INFO:tasks.workunit.client.1.vm06.stdout:8/413: dwrite db/dd/d24/f6e [0,4194304] 0
2026-03-09T00:03:45.603 INFO:tasks.workunit.client.1.vm06.stdout:8/414: read db/d1e/d46/f5d [627272,24329] 0
2026-03-09T00:03:45.605 INFO:tasks.workunit.client.0.vm03.stdout:5/244: dread d1c/d20/f39 [0,4194304] 0
2026-03-09T00:03:45.607 INFO:tasks.workunit.client.1.vm06.stdout:8/415: truncate db/d1e/f60 212805 0
2026-03-09T00:03:45.608 INFO:tasks.workunit.client.1.vm06.stdout:8/416: getdents db/d1e/d46 0
2026-03-09T00:03:45.608 INFO:tasks.workunit.client.1.vm06.stdout:8/417: readlink db/l26 0
2026-03-09T00:03:45.608 INFO:tasks.workunit.client.1.vm06.stdout:8/418: readlink db/l2f 0
2026-03-09T00:03:45.608 INFO:tasks.workunit.client.1.vm06.stdout:8/419: mkdir db/dd/d85 0
2026-03-09T00:03:45.609 INFO:tasks.workunit.client.1.vm06.stdout:8/420: creat db/dd/f86 x:0 0 0
2026-03-09T00:03:45.610 INFO:tasks.workunit.client.1.vm06.stdout:8/421: mkdir db/d74/d87 0
2026-03-09T00:03:45.611 INFO:tasks.workunit.client.1.vm06.stdout:8/422: link db/f3f db/dd/d24/d80/f88 0
2026-03-09T00:03:45.611 INFO:tasks.workunit.client.1.vm06.stdout:8/423: creat db/dd/d48/f89 x:0 0 0
2026-03-09T00:03:45.629 INFO:tasks.workunit.client.0.vm03.stdout:9/215: dwrite d15/f17 [0,4194304] 0
2026-03-09T00:03:45.634 INFO:tasks.workunit.client.0.vm03.stdout:1/297: dread f0 [0,4194304] 0
2026-03-09T00:03:45.634 INFO:tasks.workunit.client.0.vm03.stdout:9/216: link d15/l19 d15/d1c/d28/l43 0
2026-03-09T00:03:45.638 INFO:tasks.workunit.client.0.vm03.stdout:1/298: write d4/fb [7911323,48663] 0
2026-03-09T00:03:45.640 INFO:tasks.workunit.client.0.vm03.stdout:1/299: readlink d4/l7 0
2026-03-09T00:03:45.641 INFO:tasks.workunit.client.0.vm03.stdout:1/300: creat d4/d3a/d3d/f64 x:0 0 0
2026-03-09T00:03:45.641 INFO:tasks.workunit.client.0.vm03.stdout:1/301: write d4/fb [5339032,9603] 0
2026-03-09T00:03:45.641 INFO:tasks.workunit.client.1.vm06.stdout:8/424: write db/dd/f27 [3877493,62792] 0
2026-03-09T00:03:45.644 INFO:tasks.workunit.client.0.vm03.stdout:1/302: link d4/d3a/d43/f49 d4/d3a/d61/f65 0
2026-03-09T00:03:45.645 INFO:tasks.workunit.client.0.vm03.stdout:1/303: mknod d4/d5e/c66 0
2026-03-09T00:03:45.649 INFO:tasks.workunit.client.1.vm06.stdout:3/513: rename d11/d28/d2e/d2f/d5b/d5f/d91/da9 to d11/d28/d2e/d2f/d5b/db5 0
2026-03-09T00:03:45.649 INFO:tasks.workunit.client.1.vm06.stdout:3/514: readlink d11/d28/d2e/d2f/d5b/d5f/l8b 0
2026-03-09T00:03:45.649 INFO:tasks.workunit.client.1.vm06.stdout:3/515: symlink d11/lb6 0
2026-03-09T00:03:45.651 INFO:tasks.workunit.client.1.vm06.stdout:9/366: rename d1/d4/d6e/d14/d25/f60 to d1/d3/d4f/f74 0
2026-03-09T00:03:45.652 INFO:tasks.workunit.client.1.vm06.stdout:9/367: link d1/d3/f5c d1/d3/d4f/d52/f75 0
2026-03-09T00:03:45.653 INFO:tasks.workunit.client.1.vm06.stdout:3/516: rename d11/d28/d4d/d89/fa6 to d11/d28/d2e/d2f/d36/fb7 0
2026-03-09T00:03:45.654 INFO:tasks.workunit.client.1.vm06.stdout:3/517: link d11/f27 d11/d28/d2e/fb8 0
2026-03-09T00:03:45.654 INFO:tasks.workunit.client.1.vm06.stdout:3/518: stat d11/d28/d2e/d2f/d36/fb7 0
2026-03-09T00:03:45.658 INFO:tasks.workunit.client.1.vm06.stdout:3/519: unlink d11/d28/d2e/d2f/d5b/d94/fad 0
2026-03-09T00:03:45.658 INFO:tasks.workunit.client.0.vm03.stdout:1/304: dread d4/d6/f33 [0,4194304] 0
2026-03-09T00:03:45.658 INFO:tasks.workunit.client.1.vm06.stdout:3/520: unlink l1 0
2026-03-09T00:03:45.659 INFO:tasks.workunit.client.0.vm03.stdout:1/305: rename d4/d15/d5c/c63 to d4/d3a/d3d/d46/c67 0
2026-03-09T00:03:45.659 INFO:tasks.workunit.client.0.vm03.stdout:1/306: creat d4/d3a/d32/f68 x:0 0 0
2026-03-09T00:03:45.659 INFO:tasks.workunit.client.0.vm03.stdout:1/307: write d4/d3a/d3d/f4a [124702,117378] 0
2026-03-09T00:03:45.659 INFO:tasks.workunit.client.0.vm03.stdout:1/308: dread - d4/d3a/d32/f68 zero size
2026-03-09T00:03:45.659 INFO:tasks.workunit.client.1.vm06.stdout:3/521: symlink d11/d28/lb9 0
2026-03-09T00:03:45.660 INFO:tasks.workunit.client.0.vm03.stdout:1/309: mknod d4/d15/d5c/c69 0
2026-03-09T00:03:45.660 INFO:tasks.workunit.client.0.vm03.stdout:1/310: mkdir d4/d3a/d32/d6a 0
2026-03-09T00:03:45.662 INFO:tasks.workunit.client.0.vm03.stdout:5/245: dread d1c/f1e [0,4194304] 0
2026-03-09T00:03:45.663 INFO:tasks.workunit.client.0.vm03.stdout:0/240: dwrite f0 [0,4194304] 0
2026-03-09T00:03:45.670 INFO:tasks.workunit.client.1.vm06.stdout:4/387: dwrite d17/d24/f36 [0,4194304] 0
2026-03-09T00:03:45.670 INFO:tasks.workunit.client.1.vm06.stdout:4/388: truncate d17/d21/f4b 2218704 0
2026-03-09T00:03:45.670 INFO:tasks.workunit.client.1.vm06.stdout:4/389: creat d17/f81 x:0 0 0
2026-03-09T00:03:45.670 INFO:tasks.workunit.client.0.vm03.stdout:0/241: read d2/da/dd/f14 [6650036,5384] 0
2026-03-09T00:03:45.670 INFO:tasks.workunit.client.0.vm03.stdout:0/242: chown d2/da/d36/d39/l40 3423 1
2026-03-09T00:03:45.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:45 vm06.local ceph-mon[58395]: Updating vm03:/etc/ceph/ceph.client.admin.keyring
2026-03-09T00:03:45.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:45 vm06.local ceph-mon[58395]: Updating vm06:/etc/ceph/ceph.client.admin.keyring
2026-03-09T00:03:45.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:45 vm06.local ceph-mon[58395]: Updating vm03:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.client.admin.keyring
2026-03-09T00:03:45.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:45 vm06.local ceph-mon[58395]: Updating vm06:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.client.admin.keyring
2026-03-09T00:03:45.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:45 vm06.local ceph-mon[58395]: pgmap v7: 65 pgs: 65 active+clean; 1.5 GiB data, 5.8 GiB used, 114 GiB / 120 GiB avail; 93 MiB/s rd, 121 MiB/s wr, 196 op/s
2026-03-09T00:03:45.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:45 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:45.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:45 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:45.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:45 vm06.local ceph-mon[58395]: mgrmap e25: vm06.rzcvhn(active, since 8s), standbys: vm03.yvcons
2026-03-09T00:03:45.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:45 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mgr metadata", "who": "vm03.yvcons", "id": "vm03.yvcons"}]: dispatch
2026-03-09T00:03:45.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:45 vm06.local ceph-mon[58395]: Reconfiguring prometheus.vm03 (dependencies changed)...
2026-03-09T00:03:45.673 INFO:tasks.workunit.client.0.vm03.stdout:1/311: dread d4/d3a/f48 [4194304,4194304] 0
2026-03-09T00:03:45.673 INFO:tasks.workunit.client.0.vm03.stdout:1/312: truncate d4/d3a/d3d/f4a 786990 0
2026-03-09T00:03:45.675 INFO:tasks.workunit.client.1.vm06.stdout:3/522: dread d11/d28/d4d/f6e [0,4194304] 0
2026-03-09T00:03:45.677 INFO:tasks.workunit.client.1.vm06.stdout:3/523: write d11/d28/d2e/d7e/f73 [865556,117983] 0
2026-03-09T00:03:45.677 INFO:tasks.workunit.client.1.vm06.stdout:3/524: read - d11/d28/d2e/d2f/d36/f55 zero size
2026-03-09T00:03:45.677 INFO:tasks.workunit.client.1.vm06.stdout:3/525: dread - d11/d28/d4d/f9c zero size
2026-03-09T00:03:45.677 INFO:tasks.workunit.client.1.vm06.stdout:3/526: creat d11/d28/d4d/d89/d90/fba x:0 0 0
2026-03-09T00:03:45.680 INFO:tasks.workunit.client.0.vm03.stdout:5/246: symlink d1c/d20/l54 0
2026-03-09T00:03:45.693 INFO:tasks.workunit.client.1.vm06.stdout:1/408: dwrite d6/d21/d2d/d37/f77 [0,4194304] 0
2026-03-09T00:03:45.693 INFO:tasks.workunit.client.1.vm06.stdout:5/569: dwrite d5/d44/f8b [0,4194304] 0
2026-03-09T00:03:45.693 INFO:tasks.workunit.client.1.vm06.stdout:7/446: dwrite d0/df/d1a/f25 [0,4194304] 0
2026-03-09T00:03:45.693 INFO:tasks.workunit.client.1.vm06.stdout:7/447: fsync d0/df/d1a/d27/f43 0
2026-03-09T00:03:45.697 INFO:tasks.workunit.client.1.vm06.stdout:6/443: dwrite d4/d16/f63 [0,4194304] 0
2026-03-09T00:03:45.707 INFO:tasks.workunit.client.1.vm06.stdout:6/444: fsync d4/d27/f61 0
2026-03-09T00:03:45.717 INFO:tasks.workunit.client.0.vm03.stdout:0/243: mkdir d2/da/d4e 0
2026-03-09T00:03:45.725 INFO:tasks.workunit.client.1.vm06.stdout:4/390: symlink d17/d24/d3b/d54/l82 0
2026-03-09T00:03:45.725 INFO:tasks.workunit.client.1.vm06.stdout:4/391: dread - d17/d21/d4c/d66/f7b zero size
2026-03-09T00:03:45.725 INFO:tasks.workunit.client.1.vm06.stdout:4/392: creat d17/d5b/f83 x:0 0 0
2026-03-09T00:03:45.725 INFO:tasks.workunit.client.1.vm06.stdout:6/445: dread d4/d27/f31 [0,4194304] 0
2026-03-09T00:03:45.731 INFO:tasks.workunit.client.0.vm03.stdout:9/217: getdents d15/d1c/d28 0
2026-03-09T00:03:45.735 INFO:tasks.workunit.client.0.vm03.stdout:9/218: write f11 [3884340,2729] 0
2026-03-09T00:03:45.736 INFO:tasks.workunit.client.1.vm06.stdout:3/527: symlink d11/d28/d4d/d89/d90/lbb 0
2026-03-09T00:03:45.736 INFO:tasks.workunit.client.1.vm06.stdout:5/570: truncate d5/d1c/d23/d51/f60 3714471 0
2026-03-09T00:03:45.742 INFO:tasks.workunit.client.1.vm06.stdout:5/571: dread d5/f43 [0,4194304] 0
2026-03-09T00:03:45.747 INFO:tasks.workunit.client.1.vm06.stdout:5/572: write d5/d1c/d21/d28/f63 [603770,83154] 0
2026-03-09T00:03:45.754 INFO:tasks.workunit.client.1.vm06.stdout:5/573: stat d5/d1c/d21/la7 0
2026-03-09T00:03:45.754 INFO:tasks.workunit.client.1.vm06.stdout:5/574: chown d5/d44/d4b/d92/d49/c94 1924 1
2026-03-09T00:03:45.754 INFO:tasks.workunit.client.1.vm06.stdout:5/575: write d5/f14 [3220933,52157] 0
2026-03-09T00:03:45.754 INFO:tasks.workunit.client.1.vm06.stdout:7/448: unlink d0/c19 0
2026-03-09T00:03:45.754 INFO:tasks.workunit.client.1.vm06.stdout:7/449: read d0/f7 [602882,83570] 0
2026-03-09T00:03:45.765 INFO:tasks.workunit.client.1.vm06.stdout:0/463: dwrite d3/d18/d2c/f4e [0,4194304] 0
2026-03-09T00:03:45.767 INFO:tasks.workunit.client.0.vm03.stdout:5/247: dwrite f12 [0,4194304] 0
2026-03-09T00:03:45.770 INFO:tasks.workunit.client.0.vm03.stdout:5/248: dread fe [0,4194304] 0
2026-03-09T00:03:45.789 INFO:tasks.workunit.client.0.vm03.stdout:4/289: dwrite d7/f1d [0,4194304] 0
2026-03-09T00:03:45.789 INFO:tasks.workunit.client.0.vm03.stdout:4/290: readlink d7/d20/d29/l48 0
2026-03-09T00:03:45.789 INFO:tasks.workunit.client.0.vm03.stdout:4/291: chown d7/f15 6018 1
2026-03-09T00:03:45.802 INFO:tasks.workunit.client.1.vm06.stdout:4/393: rename d17/d24/d3b/c70 to d17/d24/d3b/d5e/c84 0
2026-03-09T00:03:45.802 INFO:tasks.workunit.client.1.vm06.stdout:4/394: write d17/d24/f5c [2051213,87129] 0
2026-03-09T00:03:45.802 INFO:tasks.workunit.client.1.vm06.stdout:4/395: dread - d17/f81 zero size
2026-03-09T00:03:45.807 INFO:tasks.workunit.client.1.vm06.stdout:6/446: symlink d4/d27/d42/d52/l88 0
2026-03-09T00:03:45.807 INFO:tasks.workunit.client.1.vm06.stdout:6/447: creat d4/d16/d46/f89 x:0 0 0
2026-03-09T00:03:45.807 INFO:tasks.workunit.client.1.vm06.stdout:6/448: truncate d4/d27/d3e/d57/f65 456430 0
2026-03-09T00:03:45.807 INFO:tasks.workunit.client.1.vm06.stdout:6/449: dread - d4/d27/f84 zero size
2026-03-09T00:03:45.812 INFO:tasks.workunit.client.1.vm06.stdout:6/450: dread d4/d27/d3e/f44 [0,4194304] 0
2026-03-09T00:03:45.825 INFO:tasks.workunit.client.1.vm06.stdout:3/528: write d11/d28/d2e/d2f/d5b/d94/fa1 [679851,124185] 0
2026-03-09T00:03:45.825 INFO:tasks.workunit.client.1.vm06.stdout:3/529: write d11/f24 [4650124,40638] 0
2026-03-09T00:03:45.825 INFO:tasks.workunit.client.1.vm06.stdout:3/530: fsync d11/d28/d2e/d2f/d36/faf 0
2026-03-09T00:03:45.825 INFO:tasks.workunit.client.1.vm06.stdout:3/531: fsync d11/f1e 0
2026-03-09T00:03:45.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.831+0000 7f1c51aff700 1 -- 192.168.123.103:0/4148016384 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c4c075a40 msgr2=0x7f1c4c077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:45.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.831+0000 7f1c51aff700 1 --2- 192.168.123.103:0/4148016384 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c4c075a40 0x7f1c4c077ed0 secure :-1 s=READY pgs=299 cs=0 l=1 rev1=1 crypto rx=0x7f1c44009230 tx=0x7f1c44009260 comp rx=0 tx=0).stop
2026-03-09T00:03:45.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.831+0000 7f1c51aff700 1 -- 192.168.123.103:0/4148016384 shutdown_connections
2026-03-09T00:03:45.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.831+0000 7f1c51aff700 1 --2- 192.168.123.103:0/4148016384 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c4c075a40 0x7f1c4c077ed0 unknown :-1 s=CLOSED pgs=299 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:45.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.831+0000 7f1c51aff700 1 --2- 192.168.123.103:0/4148016384 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c4c072b50 0x7f1c4c072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:45.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.831+0000 7f1c51aff700 1 -- 192.168.123.103:0/4148016384 >> 192.168.123.103:0/4148016384 conn(0x7f1c4c06dae0 msgr2=0x7f1c4c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.831+0000 7f1c51aff700 1 -- 192.168.123.103:0/4148016384 shutdown_connections
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.831+0000 7f1c51aff700 1 -- 192.168.123.103:0/4148016384 wait complete.
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.832+0000 7f1c51aff700  1  Processor -- start
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.832+0000 7f1c51aff700  1 -- start start
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.832+0000 7f1c51aff700  1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c4c072b50 0x7f1c4c083100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.832+0000 7f1c51aff700  1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c4c083640 0x7f1c4c12e400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.832+0000 7f1c51aff700  1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1c4c083b80 con 0x7f1c4c083640
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.832+0000 7f1c51aff700  1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1c4c083cf0 con 0x7f1c4c072b50
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.832+0000 7f1c4b7fe700  1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c4c072b50 0x7f1c4c083100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.832+0000 7f1c4b7fe700  1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c4c072b50 0x7f1c4c083100 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:49258/0 (socket says 192.168.123.103:49258)
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.832+0000 7f1c4b7fe700  1 -- 192.168.123.103:0/4227357732 learned_addr learned my addr 192.168.123.103:0/4227357732 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.833+0000 7f1c4b7fe700  1 -- 192.168.123.103:0/4227357732 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c4c083640 msgr2=0x7f1c4c12e400 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.833+0000 7f1c4b7fe700  1 --2- 192.168.123.103:0/4227357732 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c4c083640 0x7f1c4c12e400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.833+0000 7f1c4b7fe700  1 -- 192.168.123.103:0/4227357732 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1c44008ee0 con 0x7f1c4c072b50
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.833+0000 7f1c4b7fe700  1 --2- 192.168.123.103:0/4227357732 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c4c072b50 0x7f1c4c083100 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f1c3c00bfd0 tx=0x7f1c3c009d70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.833+0000 7f1c48ff9700  1 -- 192.168.123.103:0/4227357732 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1c3c010040 con 0x7f1c4c072b50
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.833+0000 7f1c51aff700  1 -- 192.168.123.103:0/4227357732 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1c4c12ea60 con 0x7f1c4c072b50
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.833+0000 7f1c51aff700  1 -- 192.168.123.103:0/4227357732 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1c4c12ef60 con 0x7f1c4c072b50
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.834+0000 7f1c48ff9700  1 -- 192.168.123.103:0/4227357732 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1c3c00ec20 con 0x7f1c4c072b50
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.834+0000 7f1c48ff9700  1 -- 192.168.123.103:0/4227357732 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1c3c014e40 con 0x7f1c4c072b50
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.834+0000 7f1c51aff700  1 -- 192.168.123.103:0/4227357732 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1c38005320 con 0x7f1c4c072b50
2026-03-09T00:03:45.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.835+0000 7f1c48ff9700  1 -- 192.168.123.103:0/4227357732 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 25) v1 ==== 95238+0+0 (secure 0 0 0) 0x7f1c3c014590 con 0x7f1c4c072b50
2026-03-09T00:03:45.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.835+0000 7f1c48ff9700  1 --2- 192.168.123.103:0/4227357732 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f1c34071ea0 0x7f1c34074360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:03:45.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.836+0000 7f1c48ff9700  1 -- 192.168.123.103:0/4227357732 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f1c3c092420 con 0x7f1c4c072b50
2026-03-09T00:03:45.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.836+0000 7f1c4affd700  1 --2- 192.168.123.103:0/4227357732 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f1c34071ea0 0x7f1c34074360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:03:45.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.837+0000 7f1c4affd700  1 --2- 192.168.123.103:0/4227357732 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f1c34071ea0 0x7f1c34074360 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f1c44009200 tx=0x7f1c44007640 comp rx=0 tx=0).ready entity=mgr.24345 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:03:45.838 INFO:tasks.workunit.client.0.vm03.stdout:5/249: rename d1c/d20/d44 to d1c/d20/d55 0
2026-03-09T00:03:45.839 INFO:tasks.workunit.client.0.vm03.stdout:5/250: mkdir d1c/d20/d56 0
2026-03-09T00:03:45.839 INFO:tasks.workunit.client.0.vm03.stdout:5/251: write d1c/f4c [938632,92538] 0
2026-03-09T00:03:45.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.841+0000 7f1c48ff9700  1 -- 192.168.123.103:0/4227357732 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0  v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f1c3c05b090 con 0x7f1c4c072b50
2026-03-09T00:03:45.840 INFO:tasks.workunit.client.0.vm03.stdout:5/252: creat d1c/d20/d55/d3b/f57 x:0 0 0
2026-03-09T00:03:45.841 INFO:tasks.workunit.client.0.vm03.stdout:5/253: fdatasync d1c/f37 0
2026-03-09T00:03:45.841 INFO:tasks.workunit.client.0.vm03.stdout:5/254: getdents d1c/d20/d55/d43 0
2026-03-09T00:03:45.842 INFO:tasks.workunit.client.0.vm03.stdout:5/255: mkdir d1c/d20/d55/d4f/d58 0
2026-03-09T00:03:45.842 INFO:tasks.workunit.client.0.vm03.stdout:5/256: write d1c/f1e [1074607,129809] 0
2026-03-09T00:03:45.853 INFO:tasks.workunit.client.1.vm06.stdout:7/450: mknod d0/c80 0
2026-03-09T00:03:45.853 INFO:tasks.workunit.client.1.vm06.stdout:7/451: read - d0/df/d1a/d27/f37 zero size
2026-03-09T00:03:45.853 INFO:tasks.workunit.client.1.vm06.stdout:7/452: chown d0/l1c 16625664 1
2026-03-09T00:03:45.853 INFO:tasks.workunit.client.1.vm06.stdout:7/453: stat d0/df/d1a/d3a/f23 0
2026-03-09T00:03:45.857 INFO:tasks.workunit.client.0.vm03.stdout:0/244: dwrite d2/da/d1a/f1c [0,4194304] 0
2026-03-09T00:03:45.867 INFO:tasks.workunit.client.1.vm06.stdout:0/464: rename d3/d18/d1f/d39/d3b/l61 to d3/d18/d2c/d2d/d74/d90/l9a 0
2026-03-09T00:03:45.867 INFO:tasks.workunit.client.1.vm06.stdout:4/396: creat d17/d21/d32/f85 x:0 0 0
2026-03-09T00:03:45.868 INFO:tasks.workunit.client.1.vm06.stdout:5/576: dwrite d5/f43 [0,4194304] 0
2026-03-09T00:03:45.888 INFO:tasks.workunit.client.0.vm03.stdout:0/245: creat d2/da/f4f x:0 0 0
2026-03-09T00:03:45.889 INFO:tasks.workunit.client.0.vm03.stdout:0/246: dread d2/f22 [0,4194304] 0
2026-03-09T00:03:45.891 INFO:tasks.workunit.client.0.vm03.stdout:0/247: dread d2/f32 [0,4194304] 0
2026-03-09T00:03:45.891 INFO:tasks.workunit.client.0.vm03.stdout:0/248: fsync d2/da/d1a/f1c 0
2026-03-09T00:03:45.894 INFO:tasks.workunit.client.1.vm06.stdout:6/451: unlink d4/d27/d42/l7f 0
2026-03-09T00:03:45.894 INFO:tasks.workunit.client.1.vm06.stdout:6/452: fsync d4/d27/d3e/d57/f65 0
2026-03-09T00:03:45.895 INFO:tasks.workunit.client.0.vm03.stdout:0/249: symlink d2/da/d36/d39/d4b/l50 0
2026-03-09T00:03:45.899 INFO:tasks.workunit.client.1.vm06.stdout:3/532: mknod d11/d28/d2e/d2f/d36/d8f/cbc 0
2026-03-09T00:03:45.899 INFO:tasks.workunit.client.1.vm06.stdout:3/533: chown d11/d28/d2e/d2f/f74 433980791 1
2026-03-09T00:03:45.903 INFO:tasks.workunit.client.0.vm03.stdout:9/219: dwrite f11 [0,4194304] 0
2026-03-09T00:03:45.910 INFO:tasks.workunit.client.0.vm03.stdout:0/250: unlink d2/da/d1a/l2f 0
2026-03-09T00:03:45.923 INFO:tasks.workunit.client.0.vm03.stdout:9/220: rename d15/f18 to d15/f44 0
2026-03-09T00:03:45.926 INFO:tasks.workunit.client.1.vm06.stdout:7/454: rename d0/d39/f56 to d0/df/d1a/d35/d62/f81 0
2026-03-09T00:03:45.926 INFO:tasks.workunit.client.1.vm06.stdout:7/455: stat d0/df/d1a/l33 0
2026-03-09T00:03:45.930 INFO:tasks.workunit.client.0.vm03.stdout:9/221: mknod d15/c45 0
2026-03-09T00:03:45.932 INFO:tasks.workunit.client.1.vm06.stdout:4/397: rename d17/l1b to d17/l86 0
2026-03-09T00:03:45.933 INFO:tasks.workunit.client.1.vm06.stdout:4/398: read d17/f20 [750557,32779] 0
2026-03-09T00:03:45.937 INFO:tasks.workunit.client.0.vm03.stdout:9/222: rmdir d15/d1c/d21 39
2026-03-09T00:03:45.948 INFO:tasks.workunit.client.0.vm03.stdout:9/223: fdatasync d15/f2c 0
2026-03-09T00:03:45.948 INFO:tasks.workunit.client.0.vm03.stdout:9/224: dread - d15/d1c/d28/f29 zero size
2026-03-09T00:03:45.948 INFO:tasks.workunit.client.0.vm03.stdout:4/292: dwrite d7/d23/d25/f3e [4194304,4194304] 0
2026-03-09T00:03:45.954 INFO:tasks.workunit.client.1.vm06.stdout:2/539: dwrite d7/d1b/d31/f90 [0,4194304] 0
2026-03-09T00:03:45.954 INFO:tasks.workunit.client.1.vm06.stdout:2/540: chown d7/d1b/d31/c36 98 1
2026-03-09T00:03:45.954 INFO:tasks.workunit.client.1.vm06.stdout:2/541: write d7/d1b/f22 [473366,87550] 0
2026-03-09T00:03:45.956 INFO:tasks.workunit.client.0.vm03.stdout:5/257: dwrite d1c/d20/d55/f3d [0,4194304] 0
2026-03-09T00:03:45.956 INFO:tasks.workunit.client.0.vm03.stdout:5/258: stat d1c/d20/d55/l40 0
2026-03-09T00:03:45.967 INFO:tasks.workunit.client.0.vm03.stdout:9/225: creat d15/d1c/d21/f46 x:0 0 0
2026-03-09T00:03:45.967 INFO:tasks.workunit.client.0.vm03.stdout:9/226: chown d15/d1c/d21/f41 409238 1
2026-03-09T00:03:45.967 INFO:tasks.workunit.client.0.vm03.stdout:9/227: fdatasync d15/f1b 0
2026-03-09T00:03:45.967 INFO:tasks.workunit.client.0.vm03.stdout:9/228: write d15/d1c/d28/f39 [4072766,18500] 0
2026-03-09T00:03:45.973 INFO:tasks.workunit.client.1.vm06.stdout:4/399: creat d17/d21/d4c/f87 x:0 0 0
2026-03-09T00:03:45.973 INFO:tasks.workunit.client.1.vm06.stdout:4/400: chown d17/d21 55925943 1
2026-03-09T00:03:45.973 INFO:tasks.workunit.client.1.vm06.stdout:4/401: dread - d17/d21/d4c/d66/f7b zero size
2026-03-09T00:03:45.973 INFO:tasks.workunit.client.1.vm06.stdout:4/402: chown d17/d24/f39 12918900 1
2026-03-09T00:03:45.977 INFO:tasks.workunit.client.0.vm03.stdout:9/229: rename d15/l24 to d15/l47 0
2026-03-09T00:03:45.980 INFO:tasks.workunit.client.1.vm06.stdout:8/425: dwrite db/dd/d48/f68 [0,4194304] 0
2026-03-09T00:03:45.984 INFO:tasks.workunit.client.0.vm03.stdout:9/230: link d15/d1c/d21/f46 d15/f48 0
2026-03-09T00:03:45.984 INFO:tasks.workunit.client.0.vm03.stdout:9/231: dread - d15/f48 zero size
2026-03-09T00:03:45.984 INFO:tasks.workunit.client.0.vm03.stdout:9/232: truncate d15/d1c/d28/d30/f3d 300050 0
2026-03-09T00:03:45.984 INFO:tasks.workunit.client.0.vm03.stdout:9/233: write d15/f1f [721496,58186] 0
2026-03-09T00:03:45.987 INFO:tasks.workunit.client.1.vm06.stdout:2/542: symlink d7/d1a/d56/la2 0
2026-03-09T00:03:45.987 INFO:tasks.workunit.client.1.vm06.stdout:2/543: chown d7/d1a/d89 104490634 1
2026-03-09T00:03:45.987 INFO:tasks.workunit.client.1.vm06.stdout:2/544: fdatasync d7/da/d55/f5b 0
2026-03-09T00:03:45.987 INFO:tasks.workunit.client.1.vm06.stdout:2/545: dread - d7/f5d zero size
2026-03-09T00:03:45.988 INFO:tasks.workunit.client.1.vm06.stdout:3/534: link d11/d3f/c51 d11/d28/d2e/cbd 0
2026-03-09T00:03:45.990 INFO:tasks.workunit.client.0.vm03.stdout:9/234: mknod d15/d1c/d28/c49 0
2026-03-09T00:03:45.990 INFO:tasks.workunit.client.0.vm03.stdout:9/235: dread d15/d1c/d21/f34 [0,4194304] 0
2026-03-09T00:03:45.990 INFO:tasks.workunit.client.0.vm03.stdout:9/236: chown d15/f44 54685336 1
2026-03-09T00:03:45.990 INFO:tasks.workunit.client.0.vm03.stdout:9/237: chown d15/c16 894 1
2026-03-09T00:03:45.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.994+0000 7f1c51aff700  1 -- 192.168.123.103:0/4227357732 --> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1c38000bf0 con 0x7f1c34071ea0
2026-03-09T00:03:46.000 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:45.999+0000 7f1c48ff9700  1 -- 192.168.123.103:0/4227357732 <== mgr.24345 v2:192.168.123.106:6828/4100748704 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+310 (secure 0 0 0) 0x7f1c38000bf0 con 0x7f1c34071ea0
2026-03-09T00:03:46.000 INFO:tasks.workunit.client.0.vm03.stdout:9/238: dread fd [0,4194304] 0
2026-03-09T00:03:46.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.001+0000 7f1c327fc700  1 -- 192.168.123.103:0/4227357732 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f1c34071ea0 msgr2=0x7f1c34074360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:46.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.001+0000 7f1c327fc700  1 --2- 192.168.123.103:0/4227357732 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f1c34071ea0 0x7f1c34074360 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f1c44009200 tx=0x7f1c44007640 comp rx=0 tx=0).stop
2026-03-09T00:03:46.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.001+0000 7f1c327fc700  1 -- 192.168.123.103:0/4227357732 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c4c072b50 msgr2=0x7f1c4c083100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:46.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.001+0000 7f1c327fc700  1 --2- 192.168.123.103:0/4227357732 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c4c072b50 0x7f1c4c083100 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f1c3c00bfd0 tx=0x7f1c3c009d70 comp rx=0 tx=0).stop
2026-03-09T00:03:46.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.002+0000 7f1c327fc700  1 -- 192.168.123.103:0/4227357732 shutdown_connections
2026-03-09T00:03:46.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.002+0000 7f1c327fc700  1 --2- 192.168.123.103:0/4227357732 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1c4c072b50 0x7f1c4c083100 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.002+0000 7f1c327fc700  1 --2- 192.168.123.103:0/4227357732 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f1c34071ea0 0x7f1c34074360 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.002+0000 7f1c327fc700  1 --2- 192.168.123.103:0/4227357732 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1c4c083640 0x7f1c4c12e400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.002+0000 7f1c327fc700  1 -- 192.168.123.103:0/4227357732 >> 192.168.123.103:0/4227357732 conn(0x7f1c4c06dae0 msgr2=0x7f1c4c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:03:46.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.002+0000 7f1c327fc700  1 -- 192.168.123.103:0/4227357732 shutdown_connections
2026-03-09T00:03:46.001 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.002+0000 7f1c327fc700  1 -- 192.168.123.103:0/4227357732 wait complete.
2026-03-09T00:03:46.002 INFO:tasks.workunit.client.1.vm06.stdout:4/403: symlink d17/d24/d3b/l88 0
2026-03-09T00:03:46.005 INFO:tasks.workunit.client.0.vm03.stdout:2/208: sync
2026-03-09T00:03:46.006 INFO:tasks.workunit.client.0.vm03.stdout:4/293: dwrite d7/d27/f31 [0,4194304] 0
2026-03-09T00:03:46.010 INFO:tasks.workunit.client.1.vm06.stdout:6/453: dwrite d4/d16/f33 [0,4194304] 0
2026-03-09T00:03:46.014 INFO:tasks.workunit.client.0.vm03.stdout:5/259: dread d1c/f1e [0,4194304] 0
2026-03-09T00:03:46.014 INFO:tasks.workunit.client.0.vm03.stdout:5/260: write f14 [1607529,3707] 0
2026-03-09T00:03:46.015 INFO:tasks.workunit.client.0.vm03.stdout:1/313: dwrite d4/d15/f17 [0,4194304] 0
2026-03-09T00:03:46.015 INFO:tasks.workunit.client.0.vm03.stdout:1/314: write d4/d3a/d32/f4b [1240663,82962] 0
2026-03-09T00:03:46.017 INFO:teuthology.orchestra.run.vm03.stdout:true
2026-03-09T00:03:46.020 INFO:tasks.workunit.client.1.vm06.stdout:1/409: dwrite d6/d21/d2d/d37/f77 [0,4194304] 0
2026-03-09T00:03:46.031 INFO:tasks.workunit.client.0.vm03.stdout:9/239: creat d15/d1c/d36/f4a x:0 0 0
2026-03-09T00:03:46.032 INFO:tasks.workunit.client.0.vm03.stdout:4/294: symlink d7/d20/d29/d38/d3a/l5e 0
2026-03-09T00:03:46.032 INFO:tasks.workunit.client.0.vm03.stdout:5/261: creat d1c/d20/d56/f59 x:0 0 0
2026-03-09T00:03:46.032 INFO:tasks.workunit.client.1.vm06.stdout:5/577: dwrite d5/d44/d4b/f70 [0,4194304] 0
2026-03-09T00:03:46.032 INFO:tasks.workunit.client.1.vm06.stdout:5/578: fsync d5/d1c/d23/f4f 0
2026-03-09T00:03:46.032 INFO:tasks.workunit.client.1.vm06.stdout:5/579: fsync d5/d44/d4b/d92/f52 0
2026-03-09T00:03:46.032 INFO:tasks.workunit.client.1.vm06.stdout:2/546: rmdir d7/d1a/d25/d97/d99 0
2026-03-09T00:03:46.032 INFO:tasks.workunit.client.1.vm06.stdout:2/547: write d7/da/d1c/f9e [406084,114278] 0
2026-03-09T00:03:46.032 INFO:tasks.workunit.client.1.vm06.stdout:3/535: creat d11/d28/d4d/d89/fbe x:0 0 0
2026-03-09T00:03:46.033 INFO:tasks.workunit.client.0.vm03.stdout:0/251: dwrite f0 [0,4194304] 0
2026-03-09T00:03:46.034 INFO:tasks.workunit.client.0.vm03.stdout:1/315: rename d4/d6/l38 to d4/l6b 0
2026-03-09T00:03:46.034 INFO:tasks.workunit.client.0.vm03.stdout:4/295: rename d7 to d7/d27/d5f 22
2026-03-09T00:03:46.034 INFO:tasks.workunit.client.0.vm03.stdout:1/316: fsync d4/d3a/d43/f57 0
2026-03-09T00:03:46.034 INFO:tasks.workunit.client.0.vm03.stdout:4/296: truncate d7/d23/f4a 899734 0
2026-03-09T00:03:46.035 INFO:tasks.workunit.client.0.vm03.stdout:0/252: dread d2/f1e [0,4194304] 0
2026-03-09T00:03:46.037 INFO:tasks.workunit.client.1.vm06.stdout:4/404: link d17/d5b/f83 d17/d5b/f89 0
2026-03-09T00:03:46.037 INFO:tasks.workunit.client.1.vm06.stdout:4/405: read f14 [636320,77464] 0
2026-03-09T00:03:46.037 INFO:tasks.workunit.client.1.vm06.stdout:4/406: readlink d17/l1c 0
2026-03-09T00:03:46.043 INFO:tasks.workunit.client.0.vm03.stdout:4/297: dread d7/d27/f2c [0,4194304] 0
2026-03-09T00:03:46.043 INFO:tasks.workunit.client.0.vm03.stdout:4/298: fdatasync d7/f1f 0
2026-03-09T00:03:46.045 INFO:tasks.workunit.client.0.vm03.stdout:2/209: dwrite d8/f21 [0,4194304] 0
2026-03-09T00:03:46.046 INFO:tasks.workunit.client.0.vm03.stdout:0/253: dread f0 [0,4194304] 0
2026-03-09T00:03:46.046 INFO:tasks.workunit.client.1.vm06.stdout:0/465: dwrite d3/d18/d2c/d2d/f46 [0,4194304] 0
2026-03-09T00:03:46.046 INFO:tasks.workunit.client.1.vm06.stdout:0/466: truncate d3/d18/d1f/d39/d3b/f66 995917 0
2026-03-09T00:03:46.056 INFO:tasks.workunit.client.1.vm06.stdout:6/454: symlink d4/d27/d42/d52/d5d/l8a 0
2026-03-09T00:03:46.068 INFO:tasks.workunit.client.1.vm06.stdout:1/410: creat d6/d4c/d71/f84 x:0 0 0
2026-03-09T00:03:46.072 INFO:tasks.workunit.client.1.vm06.stdout:5/580: link d5/d44/d4b/d92/f52 d5/d44/d4b/d92/d49/fc2 0
2026-03-09T00:03:46.074 INFO:tasks.workunit.client.1.vm06.stdout:8/426: creat db/d53/d6d/d7b/f8a x:0 0 0
2026-03-09T00:03:46.080 INFO:tasks.workunit.client.1.vm06.stdout:2/548: rename d7/da/d55/f5b to d7/d1a/d25/fa3 0
2026-03-09T00:03:46.080 INFO:tasks.workunit.client.0.vm03.stdout:9/240: symlink d15/d1c/d21/l4b 0
2026-03-09T00:03:46.081 INFO:tasks.workunit.client.0.vm03.stdout:9/241: creat d15/d1c/d21/f4c x:0 0 0
2026-03-09T00:03:46.081 INFO:tasks.workunit.client.0.vm03.stdout:5/262: creat d1c/d20/d55/f5a x:0 0 0
2026-03-09T00:03:46.083 INFO:tasks.workunit.client.0.vm03.stdout:4/299: rename d7/d20/f33 to d7/d20/d29/d4e/f60 0
2026-03-09T00:03:46.087 INFO:tasks.workunit.client.1.vm06.stdout:9/368: sync
2026-03-09T00:03:46.087 INFO:tasks.workunit.client.0.vm03.stdout:3/187: sync
2026-03-09T00:03:46.087 INFO:tasks.workunit.client.0.vm03.stdout:3/188: chown d2/db/f26 6142419 1
2026-03-09T00:03:46.087 INFO:tasks.workunit.client.0.vm03.stdout:6/239: sync
2026-03-09T00:03:46.087 INFO:tasks.workunit.client.0.vm03.stdout:8/230: sync
2026-03-09T00:03:46.087 INFO:tasks.workunit.client.0.vm03.stdout:7/267: sync
2026-03-09T00:03:46.087 INFO:tasks.workunit.client.0.vm03.stdout:7/268: chown d2/d1f/c10 29 1
2026-03-09T00:03:46.094 INFO:tasks.workunit.client.1.vm06.stdout:6/455: dwrite d4/d27/d42/d4b/f83 [0,4194304] 0
2026-03-09T00:03:46.097 INFO:tasks.workunit.client.0.vm03.stdout:2/210: symlink d8/d1b/d24/l44 0
2026-03-09T00:03:46.097 INFO:tasks.workunit.client.0.vm03.stdout:2/211: creat d8/d17/f45 x:0 0 0
2026-03-09T00:03:46.097 INFO:tasks.workunit.client.0.vm03.stdout:2/212: fdatasync d8/d17/f45 0
2026-03-09T00:03:46.097 INFO:tasks.workunit.client.0.vm03.stdout:2/213: creat d8/d1b/d24/f46 x:0 0 0
2026-03-09T00:03:46.098 INFO:tasks.workunit.client.0.vm03.stdout:0/254: mknod d2/da/dd/c51 0
2026-03-09T00:03:46.109 INFO:tasks.workunit.client.1.vm06.stdout:8/427: creat db/dd/d24/d80/f8b x:0 0 0
2026-03-09T00:03:46.109 INFO:tasks.workunit.client.1.vm06.stdout:3/536: rename d11/d28/d2e/d2f/d5b/f7d to d11/d28/fbf 0
2026-03-09T00:03:46.109 INFO:tasks.workunit.client.1.vm06.stdout:3/537: fdatasync d11/d28/d57/f7b 0
2026-03-09T00:03:46.109 INFO:tasks.workunit.client.1.vm06.stdout:9/369: symlink d1/l76 0
2026-03-09T00:03:46.109 INFO:tasks.workunit.client.0.vm03.stdout:9/242: mkdir d15/d1c/d36/d4d 0
2026-03-09T00:03:46.109 INFO:tasks.workunit.client.0.vm03.stdout:9/243: read - d15/d1c/d28/f2f zero size
2026-03-09T00:03:46.109 INFO:tasks.workunit.client.0.vm03.stdout:9/244: creat d15/d1c/d28/d30/f4e x:0 0 0
2026-03-09T00:03:46.109 INFO:tasks.workunit.client.0.vm03.stdout:9/245: write d15/d1c/d21/f4c [948586,92089] 0
2026-03-09T00:03:46.109 INFO:tasks.workunit.client.0.vm03.stdout:5/263: stat d1c/c31 0
2026-03-09T00:03:46.110 INFO:tasks.workunit.client.0.vm03.stdout:1/317: truncate f2 362262 0
2026-03-09T00:03:46.110 INFO:tasks.workunit.client.0.vm03.stdout:1/318: write d4/d15/d5c/f62 [21669,28204] 0
2026-03-09T00:03:46.110 INFO:tasks.workunit.client.0.vm03.stdout:1/319: write d4/d3a/d43/f49 [380546,42202] 0
2026-03-09T00:03:46.110 INFO:tasks.workunit.client.0.vm03.stdout:3/189: mkdir d2/db/d3b 0
2026-03-09T00:03:46.110 INFO:tasks.workunit.client.0.vm03.stdout:3/190: stat d2/db/d2d/f37 0
2026-03-09T00:03:46.110 INFO:tasks.workunit.client.0.vm03.stdout:3/191: chown d2/db/f17 6871 1
2026-03-09T00:03:46.110 INFO:tasks.workunit.client.0.vm03.stdout:3/192: fsync d2/db/f27 0
2026-03-09T00:03:46.110 INFO:tasks.workunit.client.0.vm03.stdout:3/193: dread - d2/db/d2d/f37 zero size
2026-03-09T00:03:46.111 INFO:tasks.workunit.client.0.vm03.stdout:3/194: write d2/fc [61735,89783] 0
2026-03-09T00:03:46.111 INFO:tasks.workunit.client.0.vm03.stdout:3/195: chown d2/db/l1e 10 1
2026-03-09T00:03:46.113 INFO:tasks.workunit.client.0.vm03.stdout:4/300: dwrite d7/d27/f2c [0,4194304] 0
2026-03-09T00:03:46.113 INFO:tasks.workunit.client.0.vm03.stdout:6/240: mkdir d13/d1e/d44/d4a/d52 0
2026-03-09T00:03:46.117 INFO:tasks.workunit.client.0.vm03.stdout:6/241: dread d13/d1e/f21 [0,4194304] 0
2026-03-09T00:03:46.118 INFO:tasks.workunit.client.1.vm06.stdout:7/456: sync
2026-03-09T00:03:46.119 INFO:tasks.workunit.client.1.vm06.stdout:7/457: dread - d0/df/d1a/d27/f60 zero size
2026-03-09T00:03:46.119 INFO:tasks.workunit.client.1.vm06.stdout:7/458: truncate d0/df/d1a/d27/d4c/d40/f67 854198 0
2026-03-09T00:03:46.119 INFO:tasks.workunit.client.1.vm06.stdout:7/459: write d0/df/d1a/f50 [1425620,116033] 0
2026-03-09T00:03:46.119 INFO:tasks.workunit.client.1.vm06.stdout:7/460: chown d0/df/d7b 945 1
2026-03-09T00:03:46.119 INFO:tasks.workunit.client.1.vm06.stdout:7/461: chown d0/df/d1a/f44 7 1
2026-03-09T00:03:46.119 INFO:tasks.workunit.client.1.vm06.stdout:0/467: sync
2026-03-09T00:03:46.125 INFO:tasks.workunit.client.0.vm03.stdout:8/231: rename d7/df/f30 to d7/f49 0
2026-03-09T00:03:46.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.130+0000 7f74f725e700  1 -- 192.168.123.103:0/106702430 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74f010a700 msgr2=0x7f74f010cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:46.130 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.130+0000 7f74f725e700  1 --2- 192.168.123.103:0/106702430 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74f010a700 0x7f74f010cb90 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f74e4009a60 tx=0x7f74e4009d70 comp rx=0 tx=0).stop
2026-03-09T00:03:46.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.131+0000 7f74f725e700  1 -- 192.168.123.103:0/106702430 shutdown_connections
2026-03-09T00:03:46.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.131+0000 7f74f725e700  1 --2- 192.168.123.103:0/106702430 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74f010a700 0x7f74f010cb90 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.131+0000 7f74f725e700  1 --2- 192.168.123.103:0/106702430 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f74f0107d90 0x7f74f010a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.131+0000 7f74f725e700  1 -- 192.168.123.103:0/106702430 >> 192.168.123.103:0/106702430 conn(0x7f74f006dda0 msgr2=0x7f74f0070220 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:03:46.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.131+0000 7f74f725e700  1 -- 192.168.123.103:0/106702430 shutdown_connections
2026-03-09T00:03:46.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.132+0000 7f74f725e700  1 -- 192.168.123.103:0/106702430 wait complete.
2026-03-09T00:03:46.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.132+0000 7f74f725e700  1  Processor -- start
2026-03-09T00:03:46.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.132+0000 7f74f725e700  1 -- start start
2026-03-09T00:03:46.131 INFO:tasks.workunit.client.1.vm06.stdout:2/549: sync
2026-03-09T00:03:46.131 INFO:tasks.workunit.client.1.vm06.stdout:5/581: sync
2026-03-09T00:03:46.131 INFO:tasks.workunit.client.1.vm06.stdout:5/582: chown d5/d44/d84/c8d 1 1
2026-03-09T00:03:46.131 INFO:tasks.workunit.client.1.vm06.stdout:2/550: write f3 [1450535,107814] 0
2026-03-09T00:03:46.132 INFO:tasks.workunit.client.1.vm06.stdout:2/551: chown d7/da/d4e/d57 588578 1
2026-03-09T00:03:46.132 INFO:tasks.workunit.client.1.vm06.stdout:2/552: chown d7/d1a/d25/l59 227 1
2026-03-09T00:03:46.132 INFO:tasks.workunit.client.1.vm06.stdout:6/456: mknod d4/d16/c8b 0
2026-03-09T00:03:46.132 INFO:tasks.workunit.client.1.vm06.stdout:6/457: chown d4/le 0 1
2026-03-09T00:03:46.132 INFO:tasks.workunit.client.1.vm06.stdout:6/458: chown d4/d16/c8b 40930 1
2026-03-09T00:03:46.132 INFO:tasks.workunit.client.1.vm06.stdout:6/459: write d4/f38 [99492,26982] 0
2026-03-09T00:03:46.132 INFO:tasks.workunit.client.1.vm06.stdout:6/460: truncate d4/d27/d3e/f41 4592687 0
2026-03-09T00:03:46.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.132+0000 7f74f725e700  1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f74f0107d90 0x7f74f01a55c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:03:46.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.132+0000 7f74f725e700  1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74f010a700 0x7f74f01a5b00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:03:46.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.132+0000 7f74f725e700  1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f74f01a6120 con 0x7f74f0107d90
2026-03-09T00:03:46.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.132+0000 7f74f725e700  1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f74f01a6260 con 0x7f74f010a700
2026-03-09T00:03:46.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.132+0000 7f74effff700  1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74f010a700 0x7f74f01a5b00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:03:46.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.133+0000 7f74effff700  1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74f010a700 0x7f74f01a5b00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:49270/0 (socket says 192.168.123.103:49270)
2026-03-09T00:03:46.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.133+0000 7f74effff700  1 -- 192.168.123.103:0/3845565312 learned_addr learned my addr 192.168.123.103:0/3845565312 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:03:46.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.133+0000 7f74effff700  1 -- 192.168.123.103:0/3845565312 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f74f0107d90 msgr2=0x7f74f01a55c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:46.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.133+0000 7f74effff700  1 --2- 192.168.123.103:0/3845565312 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f74f0107d90 0x7f74f01a55c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.133+0000 7f74effff700  1 -- 192.168.123.103:0/3845565312 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f74e4009710 con 0x7f74f010a700
2026-03-09T00:03:46.132 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.133+0000 7f74effff700  1 --2- 192.168.123.103:0/3845565312 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74f010a700 0x7f74f01a5b00 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f74e4009fd0 tx=0x7f74e400f740 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:03:46.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.134+0000 7f74edffb700  1 -- 192.168.123.103:0/3845565312 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f74e401d070 con 0x7f74f010a700
2026-03-09T00:03:46.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.134+0000 7f74f725e700  1 -- 192.168.123.103:0/3845565312 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f74f01aacb0 con 0x7f74f010a700
2026-03-09T00:03:46.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.134+0000 7f74edffb700  1 -- 192.168.123.103:0/3845565312 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f74e400fca0 con 0x7f74f010a700
2026-03-09T00:03:46.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.134+0000 7f74f725e700  1 -- 192.168.123.103:0/3845565312 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f74f01ab1a0 con 0x7f74f010a700
2026-03-09T00:03:46.134 INFO:tasks.workunit.client.0.vm03.stdout:7/269: getdents d2/d4/d1e 0
2026-03-09T00:03:46.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.135+0000 7f74edffb700  1 -- 192.168.123.103:0/3845565312 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f74e4017750 con 0x7f74f010a700
2026-03-09T00:03:46.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.135+0000 7f74edffb700  1 -- 192.168.123.103:0/3845565312 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 25) v1 ==== 95238+0+0 (secure 0 0 0) 0x7f74e4017930 con 0x7f74f010a700
2026-03-09T00:03:46.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.135+0000 7f74d77fe700  1 -- 192.168.123.103:0/3845565312 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f74dc005320 con 0x7f74f010a700
2026-03-09T00:03:46.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.136+0000 7f74edffb700  1 --2- 192.168.123.103:0/3845565312 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f74d8071c80 0x7f74d8074140 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:03:46.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.136+0000 7f74edffb700  1 -- 192.168.123.103:0/3845565312 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f74e4095000 con 0x7f74f010a700
2026-03-09T00:03:46.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.136+0000 7f74f4ffa700  1 --2- 192.168.123.103:0/3845565312 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f74d8071c80 0x7f74d8074140 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:03:46.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.136+0000 7f74f4ffa700  1 --2- 192.168.123.103:0/3845565312 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f74d8071c80 0x7f74d8074140 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f74e0005950 tx=0x7f74e00058e0 comp rx=0 tx=0).ready entity=mgr.24345 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:03:46.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.139+0000 7f74edffb700  1 -- 192.168.123.103:0/3845565312 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0  v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f74e405dd20 con 0x7f74f010a700
2026-03-09T00:03:46.143 INFO:tasks.workunit.client.1.vm06.stdout:8/428: symlink db/dd/d24/d80/l8c 0
2026-03-09T00:03:46.143 INFO:tasks.workunit.client.1.vm06.stdout:4/407: rename d17/d24/d3b/d5e/c84 to d17/c8a 0
2026-03-09T00:03:46.143 INFO:tasks.workunit.client.1.vm06.stdout:4/408: dread - d17/d24/d3b/d5e/f6f zero size
2026-03-09T00:03:46.143 INFO:tasks.workunit.client.0.vm03.stdout:2/214: link d8/d1b/d24/f41 d8/d1b/f47 0
2026-03-09T00:03:46.143 INFO:tasks.workunit.client.0.vm03.stdout:2/215: write d8/d1b/d2a/d2e/f35 [708078,90316] 0
2026-03-09T00:03:46.143 INFO:tasks.workunit.client.0.vm03.stdout:2/216: fdatasync d8/d17/f3c 0
2026-03-09T00:03:46.143 INFO:tasks.workunit.client.0.vm03.stdout:0/255: link d2/da/dd/f38 d2/da/d36/d39/f52 0
2026-03-09T00:03:46.143 INFO:tasks.workunit.client.0.vm03.stdout:0/256: creat d2/da/d1a/f53 x:0 0 0
2026-03-09T00:03:46.147 INFO:tasks.workunit.client.0.vm03.stdout:2/217: read f2 [1542385,56006] 0
2026-03-09T00:03:46.152 INFO:tasks.workunit.client.1.vm06.stdout:3/538: mknod d11/d28/d2e/d2f/d5b/d5f/cc0 0
2026-03-09T00:03:46.163 INFO:tasks.workunit.client.1.vm06.stdout:3/539: truncate d11/d28/d4d/d89/d90/fba 1026574 0
2026-03-09T00:03:46.164 INFO:tasks.workunit.client.0.vm03.stdout:9/246: link d15/d1c/d28/d30/l32 d15/d1c/d28/l4f 0
2026-03-09T00:03:46.165 INFO:tasks.workunit.client.1.vm06.stdout:9/370: mknod d1/d3/d12/c77 0
2026-03-09T00:03:46.166 INFO:tasks.workunit.client.1.vm06.stdout:9/371: chown d1/d4/d6e/d9/f40 208138 1
2026-03-09T00:03:46.174 INFO:tasks.workunit.client.0.vm03.stdout:1/320: mkdir d4/d15/d5c/d6c 0
2026-03-09T00:03:46.185 INFO:tasks.workunit.client.0.vm03.stdout:3/196: mknod d2/db/d2d/c3c 0
2026-03-09T00:03:46.185 INFO:tasks.workunit.client.0.vm03.stdout:3/197: dread - d2/db/f26 zero size
2026-03-09T00:03:46.185 INFO:tasks.workunit.client.1.vm06.stdout:5/583: dwrite d5/d1c/d23/f4c [0,4194304] 0
2026-03-09T00:03:46.185 INFO:tasks.workunit.client.1.vm06.stdout:5/584: write d5/d44/d4b/d92/d95/fb0 [90896,58015] 0
2026-03-09T00:03:46.188 INFO:tasks.workunit.client.1.vm06.stdout:0/468: truncate d3/d18/f14 462267 0
2026-03-09T00:03:46.189 INFO:tasks.workunit.client.1.vm06.stdout:0/469: chown d3/d18/c20 203 1
2026-03-09T00:03:46.189 INFO:tasks.workunit.client.0.vm03.stdout:9/247: dwrite d15/d1c/d28/f33 [0,4194304] 0
2026-03-09T00:03:46.189 INFO:tasks.workunit.client.0.vm03.stdout:9/248: read d15/d1c/d28/d30/f3d [235571,16971] 0
2026-03-09T00:03:46.193 INFO:tasks.workunit.client.1.vm06.stdout:5/585: dread d5/d1c/d21/d28/d5e/d66/f8a [0,4194304] 0
2026-03-09T00:03:46.193 INFO:tasks.workunit.client.1.vm06.stdout:5/586: readlink d5/l8 0
2026-03-09T00:03:46.196 INFO:tasks.workunit.client.0.vm03.stdout:6/242: stat f12 0
2026-03-09T00:03:46.197 INFO:tasks.workunit.client.1.vm06.stdout:6/461: link d4/d16/d53/l5b d4/d16/d53/d67/l8c 0
2026-03-09T00:03:46.197 INFO:tasks.workunit.client.1.vm06.stdout:8/429: creat db/dd/d84/f8d x:0 0 0
2026-03-09T00:03:46.197 INFO:tasks.workunit.client.1.vm06.stdout:8/430: write db/d1e/d46/f5d [780769,95427] 0
2026-03-09T00:03:46.197 INFO:tasks.workunit.client.0.vm03.stdout:5/264: rename d1c/d20/c38 to d1c/d20/d56/c5b 0
2026-03-09T00:03:46.202 INFO:tasks.workunit.client.1.vm06.stdout:4/409: mknod d17/d21/d4c/c8b 0
2026-03-09T00:03:46.202 INFO:tasks.workunit.client.0.vm03.stdout:8/232: unlink d7/df/d1e/f36 0
2026-03-09T00:03:46.203 INFO:tasks.workunit.client.1.vm06.stdout:1/411: rename d6/d4c/d71/c72 to d6/d21/d2d/c85 0
2026-03-09T00:03:46.205 INFO:tasks.workunit.client.0.vm03.stdout:7/270: truncate d2/d1f/d3a/f19 4670053 0
2026-03-09T00:03:46.206 INFO:tasks.workunit.client.1.vm06.stdout:3/540: mkdir d11/d28/d2e/d2f/dc1 0
2026-03-09T00:03:46.206 INFO:tasks.workunit.client.1.vm06.stdout:3/541: dread - d11/d28/d2e/d2f/d36/f55 zero size
2026-03-09T00:03:46.206 INFO:tasks.workunit.client.1.vm06.stdout:3/542: chown d11/d28/d2e/d7e/d83/f9a 1 1
2026-03-09T00:03:46.209 INFO:tasks.workunit.client.1.vm06.stdout:9/372: creat d1/f78 x:0 0 0
2026-03-09T00:03:46.209 INFO:tasks.workunit.client.1.vm06.stdout:9/373: fsync d1/d3/d4f/d52/f6b 0
2026-03-09T00:03:46.210 INFO:tasks.workunit.client.0.vm03.stdout:2/218: creat d8/d1b/d2a/d42/f48 x:0 0 0
2026-03-09T00:03:46.210 INFO:tasks.workunit.client.0.vm03.stdout:2/219: fsync d8/d17/f1d 0
2026-03-09T00:03:46.217 INFO:tasks.workunit.client.0.vm03.stdout:6/243: write d13/d1e/d44/f49 [3451353,18747] 0
2026-03-09T00:03:46.225 INFO:tasks.workunit.client.1.vm06.stdout:0/470: link d3/d18/d1f/d44/d6a/l7a d3/d18/d79/l9b 0
2026-03-09T00:03:46.229 INFO:tasks.workunit.client.1.vm06.stdout:9/374: dread d1/f45 [0,4194304] 0
2026-03-09T00:03:46.230 INFO:tasks.workunit.client.0.vm03.stdout:8/233: dread d7/f10 [0,4194304] 0
2026-03-09T00:03:46.230 INFO:tasks.workunit.client.0.vm03.stdout:8/234: fdatasync d7/df/d1a/d2b/f44 0
2026-03-09T00:03:46.230 INFO:tasks.workunit.client.0.vm03.stdout:8/235: fdatasync d7/df/d1e/d38/f3e 0
2026-03-09T00:03:46.245 INFO:tasks.workunit.client.0.vm03.stdout:3/198: creat d2/f3d x:0 0 0
2026-03-09T00:03:46.245 INFO:tasks.workunit.client.0.vm03.stdout:3/199: chown d2/db/f28 141426075 1
2026-03-09T00:03:46.249 INFO:tasks.workunit.client.1.vm06.stdout:5/587: mkdir d5/d1c/d23/d51/dc3 0
2026-03-09T00:03:46.257 INFO:tasks.workunit.client.1.vm06.stdout:6/462: mkdir d4/d8d 0
2026-03-09T00:03:46.257 INFO:tasks.workunit.client.0.vm03.stdout:5/265: mknod d1c/d20/d55/d4f/c5c 0
2026-03-09T00:03:46.265 INFO:tasks.workunit.client.0.vm03.stdout:4/301: rename d7/d27/l55 to d7/d20/d29/d54/l61 0
2026-03-09T00:03:46.270 INFO:tasks.workunit.client.1.vm06.stdout:1/412: rmdir d6/d4c/d51/d7f 0
2026-03-09T00:03:46.294 INFO:tasks.workunit.client.1.vm06.stdout:1/413: creat d6/d21/d2d/d37/f86 x:0 0 0
2026-03-09T00:03:46.295 INFO:tasks.workunit.client.0.vm03.stdout:7/271: creat d2/d1f/d42/f59 x:0 0 0
2026-03-09T00:03:46.295 INFO:tasks.workunit.client.0.vm03.stdout:4/302: dread d7/f22 [0,4194304] 0
2026-03-09T00:03:46.295 INFO:tasks.workunit.client.0.vm03.stdout:4/303: chown d7/d23/d25/l2e 104578 1
2026-03-09T00:03:46.295 INFO:tasks.workunit.client.0.vm03.stdout:2/220: mknod d8/d17/c49 0
2026-03-09T00:03:46.295 INFO:tasks.workunit.client.0.vm03.stdout:5/266: mkdir d1c/d20/d55/d4f/d58/d5d 0
2026-03-09T00:03:46.295 INFO:tasks.workunit.client.1.vm06.stdout:7/462: rename d0/df/d1a/d27/c79 to d0/df/d1a/d3a/d4e/c82 0
2026-03-09T00:03:46.295 INFO:tasks.workunit.client.1.vm06.stdout:3/543: truncate f7 135686 0
2026-03-09T00:03:46.295 INFO:tasks.workunit.client.1.vm06.stdout:9/375: symlink d1/d4/d6e/d9/l79 0
2026-03-09T00:03:46.295 INFO:tasks.workunit.client.1.vm06.stdout:9/376: write d1/d4/d6e/d14/d25/f70 [965313,3508] 0
2026-03-09T00:03:46.295 INFO:tasks.workunit.client.1.vm06.stdout:9/377: creat d1/d4/d6e/d14/d25/f7a x:0 0 0
2026-03-09T00:03:46.295 INFO:tasks.workunit.client.1.vm06.stdout:9/378: truncate d1/d3/d2b/d58/f5f 746194 0
2026-03-09T00:03:46.295 INFO:tasks.workunit.client.1.vm06.stdout:5/588: creat d5/d1c/d21/d28/d5e/fc4 x:0 0 0
2026-03-09T00:03:46.295 INFO:tasks.workunit.client.1.vm06.stdout:9/379: write d1/d3/d4f/d52/f75 [6690928,41821] 0
2026-03-09T00:03:46.298 INFO:tasks.workunit.client.0.vm03.stdout:4/304: dread d7/d27/f2c [0,4194304] 0
2026-03-09T00:03:46.298 INFO:tasks.workunit.client.0.vm03.stdout:4/305: creat d7/f62 x:0 0 0
2026-03-09T00:03:46.304 INFO:tasks.workunit.client.0.vm03.stdout:7/272: dread d2/d1f/d3a/f29 [0,4194304] 0
2026-03-09T00:03:46.315 INFO:tasks.workunit.client.0.vm03.stdout:0/257: rename d2/da/d1a/c20 to d2/d1f/c54 0
2026-03-09T00:03:46.318 INFO:tasks.workunit.client.0.vm03.stdout:1/321: dwrite d4/d3a/d43/f49 [0,4194304] 0
2026-03-09T00:03:46.318 INFO:tasks.workunit.client.0.vm03.stdout:1/322: creat d4/d15/f6d x:0 0 0
2026-03-09T00:03:46.322 INFO:tasks.workunit.client.1.vm06.stdout:6/463: dread d4/ff [0,4194304] 0
2026-03-09T00:03:46.324 INFO:tasks.workunit.client.1.vm06.stdout:6/464: chown d4/d27/d42/d4b/l58 49500 1
2026-03-09T00:03:46.325 INFO:tasks.workunit.client.0.vm03.stdout:4/306: write d7/f15 [2903213,63085] 0
2026-03-09T00:03:46.325 INFO:tasks.workunit.client.0.vm03.stdout:4/307: chown d7/d20/d29/d4e 1913756 1
2026-03-09T00:03:46.325 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.325+0000 7f74d77fe700  1 -- 192.168.123.103:0/3845565312 --> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f74dc000bf0 con 0x7f74d8071c80
2026-03-09T00:03:46.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.328+0000 7f74edffb700  1 -- 192.168.123.103:0/3845565312 <== mgr.24345 v2:192.168.123.106:6828/4100748704 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+310 (secure 0 0 0) 0x7f74dc000bf0 con 0x7f74d8071c80
2026-03-09T00:03:46.328 INFO:tasks.workunit.client.0.vm03.stdout:2/221: dread f6 [0,4194304] 0
2026-03-09T00:03:46.329 INFO:tasks.workunit.client.0.vm03.stdout:5/267: creat d1c/d20/d55/d4f/d58/d5d/f5e x:0 0 0
2026-03-09T00:03:46.329 INFO:tasks.workunit.client.0.vm03.stdout:5/268: readlink d1c/d20/l50 0
2026-03-09T00:03:46.329 INFO:tasks.workunit.client.0.vm03.stdout:5/269: chown d1c/c31 503461 1
2026-03-09T00:03:46.332 INFO:tasks.workunit.client.1.vm06.stdout:1/414: mkdir d6/d21/d2d/d3b/d87 0
2026-03-09T00:03:46.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.333+0000 7f74f725e700  1 -- 192.168.123.103:0/3845565312 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f74d8071c80 msgr2=0x7f74d8074140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:46.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.333+0000 7f74f725e700  1 --2- 192.168.123.103:0/3845565312 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f74d8071c80 0x7f74d8074140 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f74e0005950 tx=0x7f74e00058e0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.333+0000 7f74f725e700  1 -- 192.168.123.103:0/3845565312 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74f010a700 msgr2=0x7f74f01a5b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:46.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.333+0000 7f74f725e700  1 --2- 192.168.123.103:0/3845565312 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74f010a700 0x7f74f01a5b00 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f74e4009fd0 tx=0x7f74e400f740 comp rx=0 tx=0).stop
2026-03-09T00:03:46.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.333+0000 7f74f725e700  1 -- 192.168.123.103:0/3845565312 shutdown_connections
2026-03-09T00:03:46.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.333+0000 7f74f725e700  1 --2- 192.168.123.103:0/3845565312 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f74d8071c80 0x7f74d8074140 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.333+0000 7f74f725e700  1 --2- 192.168.123.103:0/3845565312 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f74f0107d90 0x7f74f01a55c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.333+0000 7f74f725e700  1 --2- 192.168.123.103:0/3845565312 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f74f010a700 0x7f74f01a5b00 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.333+0000 7f74f725e700  1 -- 192.168.123.103:0/3845565312 >> 192.168.123.103:0/3845565312 conn(0x7f74f006dda0 msgr2=0x7f74f010c130 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:03:46.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.333+0000 7f74f725e700  1 -- 192.168.123.103:0/3845565312 shutdown_connections
2026-03-09T00:03:46.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.333+0000 7f74f725e700  1 -- 192.168.123.103:0/3845565312 wait complete.
2026-03-09T00:03:46.337 INFO:tasks.workunit.client.1.vm06.stdout:8/431: dwrite db/f1d [0,4194304] 0
2026-03-09T00:03:46.341 INFO:tasks.workunit.client.1.vm06.stdout:7/463: creat d0/df/d1a/d3a/f83 x:0 0 0
2026-03-09T00:03:46.345 INFO:tasks.workunit.client.0.vm03.stdout:0/258: mkdir d2/da/d36/d39/d4b/d55 0
2026-03-09T00:03:46.345 INFO:tasks.workunit.client.0.vm03.stdout:0/259: readlink d2/da/l13 0
2026-03-09T00:03:46.346 INFO:tasks.workunit.client.1.vm06.stdout:8/432: read db/d1e/f52 [2281900,36914] 0
2026-03-09T00:03:46.346 INFO:tasks.workunit.client.1.vm06.stdout:8/433: getdents db/d74/d87 0
2026-03-09T00:03:46.354 INFO:tasks.workunit.client.1.vm06.stdout:2/553: rename d7/da/d4e/f83 to d7/d1a/d56/fa4 0
2026-03-09T00:03:46.356 INFO:tasks.workunit.client.0.vm03.stdout:9/249: rename d15/f48 to d15/d1c/d28/d30/f50 0
2026-03-09T00:03:46.357 INFO:tasks.workunit.client.0.vm03.stdout:9/250: write d15/d1c/d36/f4a [329032,111182] 0
2026-03-09T00:03:46.357 INFO:tasks.workunit.client.1.vm06.stdout:3/544: readlink d11/lb6 0
2026-03-09T00:03:46.357 INFO:tasks.workunit.client.1.vm06.stdout:3/545: read d11/d28/d2e/f62 [5263453,23880] 0
2026-03-09T00:03:46.357 INFO:tasks.workunit.client.1.vm06.stdout:3/546: stat d11/d28/d2e/d2f 0
2026-03-09T00:03:46.360 INFO:tasks.workunit.client.1.vm06.stdout:9/380: mknod d1/d3/d2b/d58/c7b 0
2026-03-09T00:03:46.362 INFO:tasks.workunit.client.1.vm06.stdout:2/554: write d7/da/f18 [2415473,111809] 0
2026-03-09T00:03:46.364 INFO:tasks.workunit.client.0.vm03.stdout:8/236: dwrite d7/f3c [0,4194304] 0
2026-03-09T00:03:46.365 INFO:tasks.workunit.client.0.vm03.stdout:8/237: dread d7/f25 [0,4194304] 0
2026-03-09T00:03:46.374 INFO:tasks.workunit.client.0.vm03.stdout:1/323: creat d4/d6/f6e x:0 0 0
2026-03-09T00:03:46.376 INFO:tasks.workunit.client.1.vm06.stdout:4/410: dwrite d17/d24/d3b/d54/f80 [0,4194304] 0
2026-03-09T00:03:46.376 INFO:tasks.workunit.client.1.vm06.stdout:4/411: creat d17/d21/d4c/d50/f8c x:0 0 0
2026-03-09T00:03:46.377 INFO:tasks.workunit.client.0.vm03.stdout:4/308: mknod d7/d20/d35/c63 0
2026-03-09T00:03:46.379 INFO:tasks.workunit.client.0.vm03.stdout:2/222: link d8/d17/c49 d8/d1b/d24/c4a 0
2026-03-09T00:03:46.385 INFO:tasks.workunit.client.1.vm06.stdout:7/464: truncate d0/df/d1a/d22/f28 895040 0
2026-03-09T00:03:46.385 INFO:tasks.workunit.client.1.vm06.stdout:7/465: stat d0/df/d1a/d27/d4c/d40/f41 0
2026-03-09T00:03:46.386 INFO:tasks.workunit.client.0.vm03.stdout:5/270: unlink d1c/d20/d55/d4f/d58/d5d/f5e 0
2026-03-09T00:03:46.386 INFO:tasks.workunit.client.0.vm03.stdout:5/271: readlink d1c/d20/d55/l40 0
2026-03-09T00:03:46.389 INFO:tasks.workunit.client.1.vm06.stdout:8/434: creat db/d74/f8e x:0 0 0
2026-03-09T00:03:46.389 INFO:tasks.workunit.client.1.vm06.stdout:8/435: fsync db/d53/d70/f71 0
2026-03-09T00:03:46.390 INFO:tasks.workunit.client.1.vm06.stdout:0/471: dwrite d3/fa [0,4194304] 0
2026-03-09T00:03:46.390 INFO:tasks.workunit.client.1.vm06.stdout:0/472: write d3/d18/d1f/d39/f83 [44631,98627] 0
2026-03-09T00:03:46.397 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:46 vm03.local ceph-mon[52346]: Reconfiguring daemon prometheus.vm03 on vm03
2026-03-09T00:03:46.407 INFO:tasks.workunit.client.0.vm03.stdout:3/200: dread d2/db/d2d/f36 [0,4194304] 0
2026-03-09T00:03:46.420 INFO:tasks.workunit.client.1.vm06.stdout:2/555: dread d7/da/db/f6e [0,4194304] 0
2026-03-09T00:03:46.420 INFO:tasks.workunit.client.1.vm06.stdout:2/556: fdatasync d7/f4c 0
2026-03-09T00:03:46.421 INFO:tasks.workunit.client.1.vm06.stdout:3/547: mkdir d11/d28/d2e/db2/dc2 0
2026-03-09T00:03:46.421 INFO:tasks.workunit.client.1.vm06.stdout:3/548: readlink d11/d28/l43 0
2026-03-09T00:03:46.421 INFO:tasks.workunit.client.1.vm06.stdout:3/549: write d11/d28/f6b [293874,42227] 0
2026-03-09T00:03:46.421 INFO:tasks.workunit.client.1.vm06.stdout:3/550: dread - d11/d28/d2e/d2f/f64 zero size
2026-03-09T00:03:46.421 INFO:tasks.workunit.client.1.vm06.stdout:3/551: dread - d11/d28/d2e/d2f/d36/f44 zero size
2026-03-09T00:03:46.421 INFO:tasks.workunit.client.1.vm06.stdout:2/557: write d7/d1b/d5a/d86/f8a [125051,95917] 0
2026-03-09T00:03:46.421 INFO:tasks.workunit.client.1.vm06.stdout:2/558: readlink d7/d1a/d56/l6f 0
2026-03-09T00:03:46.424 INFO:tasks.workunit.client.0.vm03.stdout:8/238: link d7/l39 d7/df/d1e/d38/l4a 0
2026-03-09T00:03:46.427 INFO:tasks.workunit.client.0.vm03.stdout:2/223: read d8/d1b/f30 [3918439,55047] 0
2026-03-09T00:03:46.441 INFO:tasks.workunit.client.0.vm03.stdout:1/324: unlink d4/d3a/d3d/f4a 0
2026-03-09T00:03:46.450 INFO:tasks.workunit.client.1.vm06.stdout:8/436: dread f7 [0,4194304] 0
2026-03-09T00:03:46.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.451+0000 7fbcc75bd700  1 -- 192.168.123.103:0/1479827385 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbcc0075a40 msgr2=0x7fbcc0077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:46.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.451+0000 7fbcc75bd700  1 --2- 192.168.123.103:0/1479827385 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbcc0075a40 0x7fbcc0077ed0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fbcb8009230 tx=0x7fbcb8009260 comp rx=0 tx=0).stop
2026-03-09T00:03:46.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.452+0000 7fbcc75bd700  1 -- 192.168.123.103:0/1479827385 shutdown_connections
2026-03-09T00:03:46.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.452+0000 7fbcc75bd700  1 --2- 192.168.123.103:0/1479827385 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbcc0075a40 0x7fbcc0077ed0 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.452+0000 7fbcc75bd700  1 --2- 192.168.123.103:0/1479827385 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbcc0072b50 0x7fbcc0072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.452+0000 7fbcc75bd700  1 -- 192.168.123.103:0/1479827385 >> 192.168.123.103:0/1479827385 conn(0x7fbcc006dae0 msgr2=0x7fbcc006ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:03:46.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.452+0000 7fbcc75bd700  1 -- 192.168.123.103:0/1479827385 shutdown_connections
2026-03-09T00:03:46.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.452+0000 7fbcc75bd700  1 -- 192.168.123.103:0/1479827385 wait complete.
2026-03-09T00:03:46.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.452+0000 7fbcc75bd700  1  Processor -- start
2026-03-09T00:03:46.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.453+0000 7fbcc75bd700  1 -- start start
2026-03-09T00:03:46.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.453+0000 7fbcc75bd700  1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbcc0072b50 0x7fbcc00830a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:03:46.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.453+0000 7fbcc75bd700  1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbcc00835e0 0x7fbcc012e3f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:03:46.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.453+0000 7fbcc75bd700  1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbcc0083af0 con 0x7fbcc00835e0
2026-03-09T00:03:46.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.453+0000 7fbcc75bd700  1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbcc0083c60 con 0x7fbcc0072b50
2026-03-09T00:03:46.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.453+0000 7fbcc4b58700  1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbcc00835e0 0x7fbcc012e3f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:03:46.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.453+0000 7fbcc4b58700  1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbcc00835e0 0x7fbcc012e3f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56946/0 (socket says 192.168.123.103:56946)
2026-03-09T00:03:46.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.453+0000 7fbcc4b58700  1 -- 192.168.123.103:0/3799204622 learned_addr learned my addr 192.168.123.103:0/3799204622 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:03:46.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.453+0000 7fbcc5359700  1 --2- 192.168.123.103:0/3799204622 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbcc0072b50 0x7fbcc00830a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:03:46.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.453+0000 7fbcc4b58700  1 -- 192.168.123.103:0/3799204622 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbcc0072b50 msgr2=0x7fbcc00830a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:46.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.453+0000 7fbcc4b58700  1 --2- 192.168.123.103:0/3799204622 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbcc0072b50 0x7fbcc00830a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.453+0000 7fbcc4b58700  1 -- 192.168.123.103:0/3799204622 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbcb8008ee0 con 0x7fbcc00835e0
2026-03-09T00:03:46.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.454+0000 7fbcc4b58700  1 --2- 192.168.123.103:0/3799204622 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbcc00835e0 0x7fbcc012e3f0 secure :-1 s=READY pgs=300 cs=0 l=1 rev1=1 crypto rx=0x7fbcb8003fa0 tx=0x7fbcb8008e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:03:46.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.454+0000 7fbcb67fc700  1 -- 192.168.123.103:0/3799204622 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbcb801d070 con 0x7fbcc00835e0
2026-03-09T00:03:46.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.455+0000 7fbcc75bd700  1 -- 192.168.123.103:0/3799204622 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbcc012e930 con 0x7fbcc00835e0
2026-03-09T00:03:46.460 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.455+0000 7fbcc75bd700  1 -- 192.168.123.103:0/3799204622 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbcc012ee20 con 0x7fbcc00835e0
2026-03-09T00:03:46.461 INFO:tasks.workunit.client.1.vm06.stdout:8/437: write db/dd/d48/f4e [143595,9263] 0
2026-03-09T00:03:46.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.455+0000 7fbcb67fc700  1 -- 192.168.123.103:0/3799204622 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbcb8007cb0 con 0x7fbcc00835e0
2026-03-09T00:03:46.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.456+0000 7fbcb67fc700  1 -- 192.168.123.103:0/3799204622 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbcb800eaf0 con 0x7fbcc00835e0
2026-03-09T00:03:46.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.456+0000 7fbcc75bd700  1 -- 192.168.123.103:0/3799204622 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbca4005320 con 0x7fbcc00835e0
2026-03-09T00:03:46.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.457+0000 7fbcb67fc700  1 -- 192.168.123.103:0/3799204622 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 25) v1 ==== 95238+0+0 (secure 0 0 0) 0x7fbcb800ec50 con 0x7fbcc00835e0
2026-03-09T00:03:46.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.457+0000 7fbcb67fc700  1 --2- 192.168.123.103:0/3799204622 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7fbcac071f50 0x7fbcac074410 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:03:46.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.458+0000 7fbcb67fc700  1 -- 192.168.123.103:0/3799204622 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fbcb8012070 con 0x7fbcc00835e0
2026-03-09T00:03:46.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.458+0000 7fbcc5359700  1 --2- 192.168.123.103:0/3799204622 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7fbcac071f50 0x7fbcac074410 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:03:46.461 INFO:tasks.workunit.client.0.vm03.stdout:5/272: mkdir d1c/d20/d55/d4f/d5f 0
2026-03-09T00:03:46.461 INFO:tasks.workunit.client.0.vm03.stdout:5/273: stat d1c/f1e 0
2026-03-09T00:03:46.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.460+0000 7fbcc5359700  1 --2- 192.168.123.103:0/3799204622 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7fbcac071f50 0x7fbcac074410 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fbcbc00b3c0 tx=0x7fbcbc00d040 comp rx=0 tx=0).ready entity=mgr.24345 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:03:46.462 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.463+0000 7fbcb67fc700  1 -- 192.168.123.103:0/3799204622 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0  v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fbcb805cc40 con 0x7fbcc00835e0
2026-03-09T00:03:46.463 INFO:tasks.workunit.client.0.vm03.stdout:9/251: dwrite d15/f26 [0,4194304] 0
2026-03-09T00:03:46.463 INFO:tasks.workunit.client.0.vm03.stdout:9/252: readlink d15/l19 0
2026-03-09T00:03:46.463 INFO:tasks.workunit.client.1.vm06.stdout:5/589: dwrite d5/d44/d4b/d92/f4e [0,4194304] 0
2026-03-09T00:03:46.463 INFO:tasks.workunit.client.1.vm06.stdout:5/590: stat d5/f36 0
2026-03-09T00:03:46.487 INFO:tasks.workunit.client.1.vm06.stdout:4/412: unlink d17/c8a 0
2026-03-09T00:03:46.489 INFO:tasks.workunit.client.0.vm03.stdout:1/325: unlink d4/d15/c51 0
2026-03-09T00:03:46.489 INFO:tasks.workunit.client.0.vm03.stdout:1/326: creat d4/d15/d5c/f6f x:0 0 0
2026-03-09T00:03:46.492 INFO:tasks.workunit.client.1.vm06.stdout:6/465: rmdir d4/d16/d53 39
2026-03-09T00:03:46.492 INFO:tasks.workunit.client.1.vm06.stdout:6/466: dread - d4/d27/d3e/d45/f4d zero size
2026-03-09T00:03:46.510 INFO:tasks.workunit.client.0.vm03.stdout:5/274: mknod d1c/d20/d55/d4f/c60 0
2026-03-09T00:03:46.510 INFO:tasks.workunit.client.0.vm03.stdout:5/275: write d1c/f1f [5037371,35421] 0
2026-03-09T00:03:46.510 INFO:tasks.workunit.client.0.vm03.stdout:5/276: chown f15 980998696 1
2026-03-09T00:03:46.510 INFO:tasks.workunit.client.0.vm03.stdout:5/277: chown d1c/d20/d55/l40 552 1
2026-03-09T00:03:46.510 INFO:tasks.workunit.client.0.vm03.stdout:5/278: readlink d1c/d20/l41 0
2026-03-09T00:03:46.510 INFO:tasks.workunit.client.0.vm03.stdout:5/279: write d1c/d20/d55/d3b/f57 [789488,96803] 0
2026-03-09T00:03:46.511 INFO:tasks.workunit.client.0.vm03.stdout:3/201: dwrite d2/db/f14 [4194304,4194304] 0
2026-03-09T00:03:46.520 INFO:tasks.workunit.client.0.vm03.stdout:5/280: dread d1c/d20/f39 [0,4194304] 0
2026-03-09T00:03:46.522 INFO:tasks.workunit.client.0.vm03.stdout:5/281: write f14 [934607,73740] 0
2026-03-09T00:03:46.522 INFO:tasks.workunit.client.0.vm03.stdout:5/282: write d1c/d20/f4e [901821,3044] 0
2026-03-09T00:03:46.531 INFO:tasks.workunit.client.1.vm06.stdout:0/473: truncate d3/f1b 2382422 0
2026-03-09T00:03:46.538 INFO:tasks.workunit.client.0.vm03.stdout:4/309: dwrite d7/d20/f34 [0,4194304] 0
2026-03-09T00:03:46.543 INFO:tasks.workunit.client.1.vm06.stdout:7/466: dwrite d0/df/d1a/d3a/d4e/d5e/f73 [0,4194304] 0
2026-03-09T00:03:46.550 INFO:tasks.workunit.client.0.vm03.stdout:1/327: dwrite d4/d3a/d43/f49 [0,4194304] 0
2026-03-09T00:03:46.550 INFO:tasks.workunit.client.0.vm03.stdout:1/328: chown d4/d15/d5c/c69 332790 1
2026-03-09T00:03:46.554 INFO:tasks.workunit.client.1.vm06.stdout:1/415: rename d6/d21/d2d/c85 to d6/c88 0
2026-03-09T00:03:46.560 INFO:tasks.workunit.client.0.vm03.stdout:2/224: getdents d8/d1b/d2a/d2e 0
2026-03-09T00:03:46.560
INFO:tasks.workunit.client.0.vm03.stdout:2/225: stat d8 0
2026-03-09T00:03:46.564 INFO:tasks.workunit.client.1.vm06.stdout:3/552: write d11/d28/d2e/d2f/d5b/d94/faa [1334960,29609] 0
2026-03-09T00:03:46.566 INFO:tasks.workunit.client.1.vm06.stdout:8/438: mkdir db/d53/d7c/d8f 0
2026-03-09T00:03:46.567 INFO:tasks.workunit.client.1.vm06.stdout:4/413: unlink d17/d21/d4c/f56 0
2026-03-09T00:03:46.568 INFO:tasks.workunit.client.1.vm06.stdout:4/414: write f15 [535850,126942] 0
2026-03-09T00:03:46.568 INFO:tasks.workunit.client.1.vm06.stdout:4/415: write d17/d24/d49/f5a [225772,9111] 0
2026-03-09T00:03:46.569 INFO:tasks.workunit.client.1.vm06.stdout:9/381: dwrite d1/d3/d4f/f74 [0,4194304] 0
2026-03-09T00:03:46.569 INFO:tasks.workunit.client.1.vm06.stdout:9/382: chown d1/d4/ff 27 1
2026-03-09T00:03:46.569 INFO:tasks.workunit.client.1.vm06.stdout:6/467: mknod d4/d27/d3e/d57/c8e 0
2026-03-09T00:03:46.570 INFO:tasks.workunit.client.1.vm06.stdout:8/439: write db/dd/f1c [1400017,17780] 0
2026-03-09T00:03:46.570 INFO:tasks.workunit.client.1.vm06.stdout:8/440: write db/d1e/f51 [4345472,86634] 0
2026-03-09T00:03:46.570 INFO:tasks.workunit.client.1.vm06.stdout:8/441: readlink db/d53/d70/l77 0
2026-03-09T00:03:46.571 INFO:tasks.workunit.client.0.vm03.stdout:8/239: dwrite d7/df/d1a/f2a [4194304,4194304] 0
2026-03-09T00:03:46.571 INFO:tasks.workunit.client.0.vm03.stdout:8/240: fsync d7/f10 0
2026-03-09T00:03:46.581 INFO:tasks.workunit.client.1.vm06.stdout:5/591: dwrite d5/d1c/d21/d28/f59 [0,4194304] 0
2026-03-09T00:03:46.583 INFO:tasks.workunit.client.1.vm06.stdout:5/592: dread d5/d1c/d21/d28/f57 [0,4194304] 0
2026-03-09T00:03:46.583 INFO:tasks.workunit.client.0.vm03.stdout:5/283: link d1c/f37 d1c/d20/d55/f61 0
2026-03-09T00:03:46.592 INFO:tasks.workunit.client.1.vm06.stdout:0/474: symlink d3/l9c 0
2026-03-09T00:03:46.602 INFO:tasks.workunit.client.0.vm03.stdout:1/329: creat d4/d3a/d3d/d46/f70 x:0 0 0
2026-03-09T00:03:46.602 INFO:tasks.workunit.client.0.vm03.stdout:1/330: readlink d4/d3a/d3d/d46/l50 0
2026-03-09T00:03:46.610 INFO:tasks.workunit.client.0.vm03.stdout:4/310: dwrite d7/f1d [0,4194304] 0
2026-03-09T00:03:46.611 INFO:tasks.workunit.client.0.vm03.stdout:8/241: dwrite d7/df/d1a/d2b/f44 [0,4194304] 0
2026-03-09T00:03:46.611 INFO:tasks.workunit.client.0.vm03.stdout:8/242: fsync d7/df/d1a/d2b/f44 0
2026-03-09T00:03:46.611 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.610+0000 7fbcc75bd700 1 -- 192.168.123.103:0/3799204622 --> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fbca4000bf0 con 0x7fbcac071f50
2026-03-09T00:03:46.616 INFO:tasks.workunit.client.0.vm03.stdout:9/253: getdents d15/d1c/d36 0
2026-03-09T00:03:46.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.619+0000 7fbcb67fc700 1 -- 192.168.123.103:0/3799204622 <== mgr.24345 v2:192.168.123.106:6828/4100748704 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3600 (secure 0 0 0) 0x7fbca4000bf0 con 0x7fbcac071f50
2026-03-09T00:03:46.618 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T00:03:46.618 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (3m) 4s ago 4m 24.7M - 0.25.0 c8568f914cd2 9b05d2f3502a
2026-03-09T00:03:46.618 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (4m) 4s ago 4m 8321k - 18.2.1 5be31c24972a b93f8a220f71
2026-03-09T00:03:46.618 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (3m) 5s ago 3m 8460k - 18.2.1 5be31c24972a d06aea65065e
2026-03-09T00:03:46.618 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (4m) 4s ago 4m 7402k - 18.2.1 5be31c24972a 320f8ef2d2cb
2026-03-09T00:03:46.618 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (3m) 5s ago 3m 7411k - 18.2.1 5be31c24972a d9eb9a54d81d
2026-03-09T00:03:46.618 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (3m) 4s ago 3m 82.8M - 9.4.7 954c08fa6188 9db2e5805e97
2026-03-09T00:03:46.618 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (2m) 4s ago 2m 15.6M - 18.2.1 5be31c24972a 404501ca3f76
2026-03-09T00:03:46.618 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (2m) 4s ago 2m 204M - 18.2.1 5be31c24972a b71cb8823eff
2026-03-09T00:03:46.618 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (119s) 5s ago 119s 18.0M - 18.2.1 5be31c24972a 868f24dd3b07
2026-03-09T00:03:46.618 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 running (2m) 5s ago 2m 13.8M - 18.2.1 5be31c24972a 84dbd6c37a69
2026-03-09T00:03:46.618 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:9283,8765,8443 running (4m) 4s ago 4m 282M - 18.2.1 5be31c24972a e48c90025d56
2026-03-09T00:03:46.619 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (18s) 5s ago 3m 544M - 19.2.3-678-ge911bdeb 654f31e6858e 86f8a8de528c
2026-03-09T00:03:46.619 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (4m) 4s ago 4m 54.8M 2048M 18.2.1 5be31c24972a f9863944dcfb
2026-03-09T00:03:46.619 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (3m) 5s ago 3m 52.0M 2048M 18.2.1 5be31c24972a 1e39c7ad3e9f
2026-03-09T00:03:46.619 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (4m) 4s ago 4m 14.4M - 1.5.0 0da6a335fe13 750af7597536
2026-03-09T00:03:46.619 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (3m) 5s ago 3m 15.5M - 1.5.0 0da6a335fe13 a82b7dc84593
2026-03-09T00:03:46.619 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (3m) 4s ago 3m 283M 4096M 18.2.1 5be31c24972a 7582c56d43e3
2026-03-09T00:03:46.619 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (3m) 4s ago 3m 289M 4096M 18.2.1 5be31c24972a 7bc729875521
2026-03-09T00:03:46.619 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (2m) 4s ago 2m 274M 4096M 18.2.1 5be31c24972a 00566abbcc16
2026-03-09T00:03:46.619 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (2m) 5s ago 2m 352M 4096M 18.2.1 5be31c24972a 1ece32056ab6
2026-03-09T00:03:46.619 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (2m) 5s ago 2m 309M 4096M 18.2.1 5be31c24972a ee6260a1124c
2026-03-09T00:03:46.619 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (2m) 5s ago 2m 274M 4096M 18.2.1 5be31c24972a f51e8cd94301
2026-03-09T00:03:46.619 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (3m) 4s ago 3m 53.7M - 2.43.0 a07b618ecd1d a4a1b4f06180
2026-03-09T00:03:46.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.623+0000 7fbcc75bd700 1 -- 192.168.123.103:0/3799204622 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7fbcac071f50 msgr2=0x7fbcac074410 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:46.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.623+0000 7fbcc75bd700 1 --2- 192.168.123.103:0/3799204622 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7fbcac071f50 0x7fbcac074410 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fbcbc00b3c0 tx=0x7fbcbc00d040 comp rx=0 tx=0).stop
2026-03-09T00:03:46.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.623+0000 7fbcc75bd700 1 -- 192.168.123.103:0/3799204622 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbcc00835e0 msgr2=0x7fbcc012e3f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:46.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.623+0000 7fbcc75bd700 1 --2- 192.168.123.103:0/3799204622 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbcc00835e0 0x7fbcc012e3f0 secure :-1 s=READY pgs=300 cs=0 l=1 rev1=1 crypto rx=0x7fbcb8003fa0 tx=0x7fbcb8008e70 comp rx=0 tx=0).stop
2026-03-09T00:03:46.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.623+0000 7fbcc75bd700 1 -- 192.168.123.103:0/3799204622 shutdown_connections
2026-03-09T00:03:46.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.623+0000 7fbcc75bd700 1 --2- 192.168.123.103:0/3799204622 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbcc0072b50 0x7fbcc00830a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.623+0000 7fbcc75bd700 1 --2- 192.168.123.103:0/3799204622 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7fbcac071f50 0x7fbcac074410 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.623+0000 7fbcc75bd700 1 --2- 192.168.123.103:0/3799204622 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbcc00835e0 0x7fbcc012e3f0 unknown :-1 s=CLOSED pgs=300 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.623+0000 7fbcc75bd700 1 -- 192.168.123.103:0/3799204622 >> 192.168.123.103:0/3799204622 conn(0x7fbcc006dae0 msgr2=0x7fbcc006ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:03:46.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.623+0000 7fbcc75bd700 1 -- 192.168.123.103:0/3799204622 shutdown_connections
2026-03-09T00:03:46.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.624+0000 7fbcc75bd700 1 -- 192.168.123.103:0/3799204622 wait complete.
2026-03-09T00:03:46.641 INFO:tasks.workunit.client.0.vm03.stdout:1/331: dwrite d4/d6/f20 [0,4194304] 0 2026-03-09T00:03:46.670 INFO:tasks.workunit.client.0.vm03.stdout:2/226: unlink d8/d1b/f3f 0 2026-03-09T00:03:46.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:46 vm06.local ceph-mon[58395]: Reconfiguring daemon prometheus.vm03 on vm03 2026-03-09T00:03:46.683 INFO:tasks.workunit.client.1.vm06.stdout:7/467: getdents d0/df/d1a/d27/d4c/d40/d51 0 2026-03-09T00:03:46.683 INFO:tasks.workunit.client.1.vm06.stdout:7/468: fsync d0/df/d17/f38 0 2026-03-09T00:03:46.684 INFO:tasks.workunit.client.1.vm06.stdout:1/416: mknod d6/d21/d2d/d3b/d42/c89 0 2026-03-09T00:03:46.688 INFO:tasks.workunit.client.0.vm03.stdout:9/254: dwrite f8 [0,4194304] 0 2026-03-09T00:03:46.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.710+0000 7f0033527700 1 -- 192.168.123.103:0/1055988893 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f002c075a40 msgr2=0x7f002c077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:03:46.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.710+0000 7f0033527700 1 --2- 192.168.123.103:0/1055988893 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f002c075a40 0x7f002c077ed0 secure :-1 s=READY pgs=301 cs=0 l=1 rev1=1 crypto rx=0x7f002400d3f0 tx=0x7f002400d700 comp rx=0 tx=0).stop 2026-03-09T00:03:46.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.711+0000 7f0033527700 1 -- 192.168.123.103:0/1055988893 shutdown_connections 2026-03-09T00:03:46.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.711+0000 7f0033527700 1 --2- 192.168.123.103:0/1055988893 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f002c075a40 0x7f002c077ed0 unknown :-1 s=CLOSED pgs=301 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:46.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.711+0000 7f0033527700 1 --2- 192.168.123.103:0/1055988893 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f002c072b50 0x7f002c072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:46.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.711+0000 7f0033527700 1 -- 192.168.123.103:0/1055988893 >> 192.168.123.103:0/1055988893 conn(0x7f002c06dae0 msgr2=0x7f002c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:03:46.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.711+0000 7f0033527700 1 -- 192.168.123.103:0/1055988893 shutdown_connections 2026-03-09T00:03:46.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.711+0000 7f0033527700 1 -- 192.168.123.103:0/1055988893 wait complete. 
2026-03-09T00:03:46.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.711+0000 7f0033527700 1 Processor -- start 2026-03-09T00:03:46.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.711+0000 7f0033527700 1 -- start start 2026-03-09T00:03:46.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.711+0000 7f0033527700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f002c072b50 0x7f002c083180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:03:46.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.711+0000 7f0033527700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f002c0836c0 0x7f002c1b3240 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:03:46.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.711+0000 7f0033527700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f002c083b40 con 0x7f002c0836c0 2026-03-09T00:03:46.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.711+0000 7f0033527700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f002c083cb0 con 0x7f002c072b50 2026-03-09T00:03:46.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.711+0000 7f0030ac2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f002c0836c0 0x7f002c1b3240 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:03:46.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.711+0000 7f00312c3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f002c072b50 0x7f002c083180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:03:46.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.712+0000 7f00312c3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f002c072b50 0x7f002c083180 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:49312/0 (socket says 192.168.123.103:49312) 2026-03-09T00:03:46.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.712+0000 7f00312c3700 1 -- 192.168.123.103:0/3149584614 learned_addr learned my addr 192.168.123.103:0/3149584614 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:03:46.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.712+0000 7f00312c3700 1 -- 192.168.123.103:0/3149584614 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f002c0836c0 msgr2=0x7f002c1b3240 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:03:46.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.712+0000 7f00312c3700 1 --2- 192.168.123.103:0/3149584614 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f002c0836c0 0x7f002c1b3240 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:46.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.712+0000 7f00312c3700 1 -- 192.168.123.103:0/3149584614 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0024007ed0 con 0x7f002c072b50 
2026-03-09T00:03:46.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.712+0000 7f00312c3700 1 --2- 192.168.123.103:0/3149584614 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f002c072b50 0x7f002c083180 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f002800c610 tx=0x7f002800c920 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:03:46.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.712+0000 7f00227fc700 1 -- 192.168.123.103:0/3149584614 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f002801e410 con 0x7f002c072b50 2026-03-09T00:03:46.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.712+0000 7f0033527700 1 -- 192.168.123.103:0/3149584614 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f002c1b37e0 con 0x7f002c072b50 2026-03-09T00:03:46.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.712+0000 7f0033527700 1 -- 192.168.123.103:0/3149584614 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f002c1b3d00 con 0x7f002c072b50 2026-03-09T00:03:46.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.713+0000 7f00227fc700 1 -- 192.168.123.103:0/3149584614 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f002801ea50 con 0x7f002c072b50 2026-03-09T00:03:46.712 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.713+0000 7f00227fc700 1 -- 192.168.123.103:0/3149584614 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0028016860 con 0x7f002c072b50 2026-03-09T00:03:46.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.713+0000 7f0033527700 1 -- 192.168.123.103:0/3149584614 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0010005320 con 0x7f002c072b50 2026-03-09T00:03:46.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.714+0000 7f00227fc700 1 -- 192.168.123.103:0/3149584614 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 25) v1 ==== 95238+0+0 (secure 0 0 0) 0x7f0028016030 con 0x7f002c072b50 2026-03-09T00:03:46.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.714+0000 7f00227fc700 1 --2- 192.168.123.103:0/3149584614 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f0018071e80 0x7f0018074340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:03:46.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.715+0000 7f00227fc700 1 -- 192.168.123.103:0/3149584614 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f002809abf0 con 0x7f002c072b50 2026-03-09T00:03:46.714 INFO:tasks.workunit.client.1.vm06.stdout:6/468: creat d4/d16/d53/d67/f8f x:0 0 0 2026-03-09T00:03:46.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.715+0000 7f0030ac2700 1 --2- 192.168.123.103:0/3149584614 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f0018071e80 0x7f0018074340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:03:46.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.716+0000 7f0030ac2700 1 --2- 
192.168.123.103:0/3149584614 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f0018071e80 0x7f0018074340 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f002400e010 tx=0x7f00240061f0 comp rx=0 tx=0).ready entity=mgr.24345 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:03:46.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.718+0000 7f00227fc700 1 -- 192.168.123.103:0/3149584614 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f0028063890 con 0x7f002c072b50 2026-03-09T00:03:46.724 INFO:tasks.workunit.client.1.vm06.stdout:7/469: dwrite d0/df/d1a/d27/d4c/d40/d5b/f78 [0,4194304] 0 2026-03-09T00:03:46.724 INFO:tasks.workunit.client.1.vm06.stdout:7/470: stat d0/df/d1a/d27/d4c/f32 0 2026-03-09T00:03:46.738 INFO:tasks.workunit.client.1.vm06.stdout:8/442: mknod db/d53/d70/d38/c90 0 2026-03-09T00:03:46.740 INFO:tasks.workunit.client.0.vm03.stdout:3/202: getdents d2/db/d2d 0 2026-03-09T00:03:46.741 INFO:tasks.workunit.client.0.vm03.stdout:1/332: dwrite d4/d3a/d43/f47 [0,4194304] 0 2026-03-09T00:03:46.750 INFO:tasks.workunit.client.0.vm03.stdout:6/244: sync 2026-03-09T00:03:46.750 INFO:tasks.workunit.client.0.vm03.stdout:5/284: mknod d1c/d20/c62 0 2026-03-09T00:03:46.750 INFO:tasks.workunit.client.0.vm03.stdout:5/285: getdents d1c/d51 0 2026-03-09T00:03:46.750 INFO:tasks.workunit.client.0.vm03.stdout:7/273: sync 2026-03-09T00:03:46.750 INFO:tasks.workunit.client.0.vm03.stdout:0/260: sync 2026-03-09T00:03:46.750 INFO:tasks.workunit.client.1.vm06.stdout:6/469: dread d4/d16/f34 [4194304,4194304] 0 2026-03-09T00:03:46.754 INFO:tasks.workunit.client.0.vm03.stdout:6/245: dread d13/d1e/f21 [0,4194304] 0 2026-03-09T00:03:46.756 INFO:tasks.workunit.client.1.vm06.stdout:6/470: dread d4/f3d [0,4194304] 0 2026-03-09T00:03:46.767 INFO:tasks.workunit.client.1.vm06.stdout:5/593: mkdir d5/d44/d84/dc5 0 2026-03-09T00:03:46.770 INFO:tasks.workunit.client.0.vm03.stdout:8/243: mknod d7/df/d1a/d2b/d43/c4b 0 2026-03-09T00:03:46.782 INFO:tasks.workunit.client.0.vm03.stdout:2/227: mkdir d8/d1b/d2a/d42/d4b 0 2026-03-09T00:03:46.782 INFO:tasks.workunit.client.0.vm03.stdout:2/228: read d8/d1b/d24/f2f [505944,67838] 0 2026-03-09T00:03:46.786 INFO:tasks.workunit.client.0.vm03.stdout:2/229: write d8/f3e [3376351,71309] 0 2026-03-09T00:03:46.798 INFO:tasks.workunit.client.0.vm03.stdout:9/255: mknod d15/d1c/d21/c51 0 2026-03-09T00:03:46.798 INFO:tasks.workunit.client.0.vm03.stdout:9/256: write d15/d1c/d28/d30/f4e [748076,54234] 0 2026-03-09T00:03:46.813 INFO:tasks.workunit.client.0.vm03.stdout:4/311: sync 2026-03-09T00:03:46.832 INFO:tasks.workunit.client.0.vm03.stdout:1/333: link d4/d3a/f2c d4/d15/d5c/d6c/f71 0 2026-03-09T00:03:46.832 INFO:tasks.workunit.client.0.vm03.stdout:1/334: write d4/fb [4438590,5627] 0 2026-03-09T00:03:46.841 INFO:tasks.workunit.client.0.vm03.stdout:0/261: dwrite d2/da/d36/d39/f52 [0,4194304] 0 2026-03-09T00:03:46.842 INFO:tasks.workunit.client.0.vm03.stdout:5/286: mknod d1c/d20/d55/d3b/c63 0 2026-03-09T00:03:46.842 INFO:tasks.workunit.client.0.vm03.stdout:7/274: rmdir d2/d1f/d3a/d31/d37 39 2026-03-09T00:03:46.849 INFO:tasks.workunit.client.0.vm03.stdout:8/244: dread d7/df/f2c [0,4194304] 0 2026-03-09T00:03:46.849 INFO:tasks.workunit.client.0.vm03.stdout:8/245: write d7/df/d1a/f2e [456453,54428] 0 2026-03-09T00:03:46.860 INFO:tasks.workunit.client.0.vm03.stdout:2/230: link d8/d1b/f32 d8/d1b/d2a/f4c 0 2026-03-09T00:03:46.860 
INFO:tasks.workunit.client.0.vm03.stdout:2/231: chown d8/d17 910 1
2026-03-09T00:03:46.874 INFO:tasks.workunit.client.1.vm06.stdout:0/475: mknod d3/d18/d1f/d39/d3b/c9d 0
2026-03-09T00:03:46.876 INFO:tasks.workunit.client.0.vm03.stdout:3/203: getdents d2/db 0
2026-03-09T00:03:46.876 INFO:tasks.workunit.client.0.vm03.stdout:3/204: readlink d2/db/l33 0
2026-03-09T00:03:46.892 INFO:tasks.workunit.client.0.vm03.stdout:4/312: rename d7/d20/d29/d38/d3a/l49 to d7/d20/d29/d4e/l64 0
2026-03-09T00:03:46.902 INFO:tasks.workunit.client.1.vm06.stdout:1/417: mknod d6/d21/d2d/d3b/c8a 0
2026-03-09T00:03:46.903 INFO:tasks.workunit.client.1.vm06.stdout:1/418: dread d6/d4c/d79/f59 [0,4194304] 0
2026-03-09T00:03:46.903 INFO:tasks.workunit.client.1.vm06.stdout:1/419: write d6/d4c/d71/f84 [806169,76721] 0
2026-03-09T00:03:46.903 INFO:tasks.workunit.client.1.vm06.stdout:1/420: chown d6/fb 273 1
2026-03-09T00:03:46.903 INFO:tasks.workunit.client.1.vm06.stdout:1/421: truncate d6/f34 1159585 0
2026-03-09T00:03:46.903 INFO:tasks.workunit.client.1.vm06.stdout:1/422: readlink d6/d4c/d79/l53 0
2026-03-09T00:03:46.903 INFO:tasks.workunit.client.1.vm06.stdout:1/423: creat d6/d21/d2d/d37/f8b x:0 0 0
2026-03-09T00:03:46.904 INFO:tasks.workunit.client.1.vm06.stdout:3/553: rmdir d11/d3f 39
2026-03-09T00:03:46.909 INFO:tasks.workunit.client.1.vm06.stdout:4/416: rmdir d17/d21/d4c/d50 39
2026-03-09T00:03:46.913 INFO:tasks.workunit.client.1.vm06.stdout:4/417: stat d17/d24/d49/d5f/c7f 0
2026-03-09T00:03:46.913 INFO:tasks.workunit.client.1.vm06.stdout:4/418: fsync f1 0
2026-03-09T00:03:46.913 INFO:tasks.workunit.client.0.vm03.stdout:9/257: getdents d15/d1c/d21 0
2026-03-09T00:03:46.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.913+0000 7f0033527700 1 -- 192.168.123.103:0/3149584614 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f0010005cc0 con 0x7f002c072b50
2026-03-09T00:03:46.914 INFO:tasks.workunit.client.0.vm03.stdout:1/335: rename d4/d6/le to d4/l72 0
2026-03-09T00:03:46.914 INFO:tasks.workunit.client.1.vm06.stdout:9/383: getdents d1/d3/d2b/d58 0
2026-03-09T00:03:46.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.915+0000 7f00227fc700 1 -- 192.168.123.103:0/3149584614 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+770 (secure 0 0 0) 0x7f0028062fe0 con 0x7f002c072b50
2026-03-09T00:03:46.914 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:03:46.914 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T00:03:46.914 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2
2026-03-09T00:03:46.914 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:03:46.914 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T00:03:46.914 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 1,
2026-03-09T00:03:46.914 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-09T00:03:46.914 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:03:46.914 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T00:03:46.914 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6
2026-03-09T00:03:46.914 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:03:46.914 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T00:03:46.914 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-09T00:03:46.914 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:03:46.914 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T00:03:46.915 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 13,
2026-03-09T00:03:46.915 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-09T00:03:46.915 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-09T00:03:46.915 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:03:46.917 INFO:tasks.workunit.client.0.vm03.stdout:4/313: rmdir d7/d20/d35 39
2026-03-09T00:03:46.918 INFO:tasks.workunit.client.0.vm03.stdout:9/258: mknod d15/c52 0
2026-03-09T00:03:46.918 INFO:tasks.workunit.client.1.vm06.stdout:9/384: write d1/d4/d6e/d9/f40 [5057407,28664] 0
2026-03-09T00:03:46.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.919+0000 7f0033527700 1 -- 192.168.123.103:0/3149584614 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f0018071e80 msgr2=0x7f0018074340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:46.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.919+0000 7f0033527700 1 --2- 192.168.123.103:0/3149584614 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f0018071e80 0x7f0018074340 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f002400e010 tx=0x7f00240061f0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.919+0000 7f0033527700 1 -- 192.168.123.103:0/3149584614 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f002c072b50 msgr2=0x7f002c083180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:46.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.919+0000 7f0033527700 1 --2- 192.168.123.103:0/3149584614 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f002c072b50 0x7f002c083180 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f002800c610 tx=0x7f002800c920 comp rx=0 tx=0).stop
2026-03-09T00:03:46.919 INFO:tasks.workunit.client.1.vm06.stdout:7/471: rename d0/df/d1a/f7a to d0/df/d1a/d3a/f84 0
2026-03-09T00:03:46.919 INFO:tasks.workunit.client.0.vm03.stdout:7/275: rename d2/d1f/f28 to d2/d1f/d35/f5a 0
2026-03-09T00:03:46.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.920+0000 7f0033527700 1 -- 192.168.123.103:0/3149584614 shutdown_connections
2026-03-09T00:03:46.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.920+0000 7f0033527700 1 --2- 192.168.123.103:0/3149584614 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f002c072b50 0x7f002c083180 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.920+0000 7f0033527700 1 --2- 192.168.123.103:0/3149584614 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7f0018071e80 0x7f0018074340 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:03:46.920
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.920+0000 7f0033527700 1 --2- 192.168.123.103:0/3149584614 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f002c0836c0 0x7f002c1b3240 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:46.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.920+0000 7f0033527700 1 -- 192.168.123.103:0/3149584614 >> 192.168.123.103:0/3149584614 conn(0x7f002c06dae0 msgr2=0x7f002c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:03:46.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.921+0000 7f0033527700 1 -- 192.168.123.103:0/3149584614 shutdown_connections 2026-03-09T00:03:46.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:46.921+0000 7f0033527700 1 -- 192.168.123.103:0/3149584614 wait complete. 2026-03-09T00:03:46.921 INFO:tasks.workunit.client.0.vm03.stdout:4/314: dread d7/fe [0,4194304] 0 2026-03-09T00:03:46.921 INFO:tasks.workunit.client.0.vm03.stdout:4/315: creat d7/d20/d29/d38/d3a/f65 x:0 0 0 2026-03-09T00:03:46.922 INFO:tasks.workunit.client.0.vm03.stdout:8/246: dwrite d7/df/f3d [0,4194304] 0 2026-03-09T00:03:46.932 INFO:tasks.workunit.client.0.vm03.stdout:9/259: mknod d15/d1c/d28/d30/c53 0 2026-03-09T00:03:46.932 INFO:tasks.workunit.client.0.vm03.stdout:8/247: dread d7/f25 [0,4194304] 0 2026-03-09T00:03:46.932 INFO:tasks.workunit.client.0.vm03.stdout:7/276: link d2/d1f/d3a/f19 d2/d1f/d42/d46/f5b 0 2026-03-09T00:03:46.932 INFO:tasks.workunit.client.0.vm03.stdout:7/277: write d2/d1f/d35/f3e [3647346,62533] 0 2026-03-09T00:03:46.932 INFO:tasks.workunit.client.0.vm03.stdout:7/278: truncate d2/d1f/d3a/d31/d37/d39/f4b 734985 0 2026-03-09T00:03:46.932 INFO:tasks.workunit.client.0.vm03.stdout:9/260: mkdir d15/d1c/d21/d54 0 2026-03-09T00:03:46.932 INFO:tasks.workunit.client.0.vm03.stdout:9/261: fsync d15/f44 0 2026-03-09T00:03:46.932 INFO:tasks.workunit.client.1.vm06.stdout:5/594: mknod d5/d44/d4b/d92/cc6 0 2026-03-09T00:03:46.932 INFO:tasks.workunit.client.1.vm06.stdout:5/595: creat d5/d1c/d68/fc7 x:0 0 0 2026-03-09T00:03:46.932 INFO:tasks.workunit.client.1.vm06.stdout:3/554: mknod d11/d28/d4d/d89/cc3 0 2026-03-09T00:03:46.932 INFO:tasks.workunit.client.1.vm06.stdout:9/385: truncate d1/d3/d12/f28 3373928 0 2026-03-09T00:03:46.933 INFO:tasks.workunit.client.1.vm06.stdout:9/386: fsync d1/d4/d6e/d14/d25/f6f 0 2026-03-09T00:03:46.933 INFO:tasks.workunit.client.1.vm06.stdout:7/472: mkdir d0/d55/d85 0 2026-03-09T00:03:46.933 INFO:tasks.workunit.client.1.vm06.stdout:7/473: chown d0/df/d1a/d27/d70 6 1 2026-03-09T00:03:46.933 INFO:tasks.workunit.client.1.vm06.stdout:6/471: rename d4/d27/d42/d52/d5d to d4/d16/d46/d90 0 2026-03-09T00:03:46.933 INFO:tasks.workunit.client.0.vm03.stdout:6/246: dwrite d13/f4d [0,4194304] 0 2026-03-09T00:03:46.941 INFO:tasks.workunit.client.1.vm06.stdout:6/472: read d4/d27/d42/f60 [2409511,27757] 0 2026-03-09T00:03:46.950 INFO:tasks.workunit.client.1.vm06.stdout:6/473: dread d4/d16/f21 [0,4194304] 0 2026-03-09T00:03:46.951 INFO:tasks.workunit.client.1.vm06.stdout:0/476: dwrite d3/d18/d2c/f6b [0,4194304] 0 2026-03-09T00:03:46.951 INFO:tasks.workunit.client.0.vm03.stdout:0/262: dwrite d2/f22 [0,4194304] 0 2026-03-09T00:03:46.951 INFO:tasks.workunit.client.0.vm03.stdout:0/263: chown d2/da/d36/d39/d4b/d55 8 1 2026-03-09T00:03:46.951 INFO:tasks.workunit.client.0.vm03.stdout:0/264: chown d2/f32 834633 1 2026-03-09T00:03:46.959 INFO:tasks.workunit.client.0.vm03.stdout:8/248: truncate d7/f25 92088 0 
2026-03-09T00:03:46.965 INFO:tasks.workunit.client.0.vm03.stdout:5/287: dwrite d1c/d20/d55/f46 [0,4194304] 0 2026-03-09T00:03:46.965 INFO:tasks.workunit.client.0.vm03.stdout:5/288: creat d1c/d20/d55/d4f/d58/d5d/f64 x:0 0 0 2026-03-09T00:03:46.965 INFO:tasks.workunit.client.0.vm03.stdout:5/289: creat d1c/d20/f65 x:0 0 0 2026-03-09T00:03:46.966 INFO:tasks.workunit.client.1.vm06.stdout:2/559: sync 2026-03-09T00:03:46.966 INFO:tasks.workunit.client.1.vm06.stdout:2/560: fdatasync f2 0 2026-03-09T00:03:46.972 INFO:tasks.workunit.client.1.vm06.stdout:4/419: dwrite f15 [0,4194304] 0 2026-03-09T00:03:46.975 INFO:tasks.workunit.client.1.vm06.stdout:2/561: write d7/d1b/f46 [1124729,39244] 0 2026-03-09T00:03:46.978 INFO:tasks.workunit.client.1.vm06.stdout:4/420: write f1 [2747333,129956] 0 2026-03-09T00:03:46.998 INFO:tasks.workunit.client.1.vm06.stdout:8/443: getdents db/d53/d70/d38/d4d 0 2026-03-09T00:03:47.002 INFO:tasks.workunit.client.0.vm03.stdout:7/279: rename d2/d1f/d3a/l27 to d2/d1f/d35/l5c 0 2026-03-09T00:03:47.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.003+0000 7faf1da1f700 1 -- 192.168.123.103:0/3798501169 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf18075a40 msgr2=0x7faf18077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:03:47.002 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.003+0000 7faf1da1f700 1 --2- 192.168.123.103:0/3798501169 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf18075a40 0x7faf18077ed0 secure :-1 s=READY pgs=302 cs=0 l=1 rev1=1 crypto rx=0x7faf1000d3f0 tx=0x7faf1000d700 comp rx=0 tx=0).stop 2026-03-09T00:03:47.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.003+0000 7faf1da1f700 1 -- 192.168.123.103:0/3798501169 shutdown_connections 2026-03-09T00:03:47.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.003+0000 7faf1da1f700 1 --2- 192.168.123.103:0/3798501169 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf18075a40 0x7faf18077ed0 unknown :-1 s=CLOSED pgs=302 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:47.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.003+0000 7faf1da1f700 1 --2- 192.168.123.103:0/3798501169 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf18072b50 0x7faf18072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:47.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.003+0000 7faf1da1f700 1 -- 192.168.123.103:0/3798501169 >> 192.168.123.103:0/3798501169 conn(0x7faf1806dae0 msgr2=0x7faf1806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:03:47.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.003+0000 7faf1da1f700 1 -- 192.168.123.103:0/3798501169 shutdown_connections 2026-03-09T00:03:47.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.003+0000 7faf1da1f700 1 -- 192.168.123.103:0/3798501169 wait complete. 
2026-03-09T00:03:47.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.004+0000 7faf1da1f700 1 Processor -- start 2026-03-09T00:03:47.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.004+0000 7faf1da1f700 1 -- start start 2026-03-09T00:03:47.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.004+0000 7faf1da1f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf18072b50 0x7faf18083180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:03:47.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.004+0000 7faf1da1f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf180836c0 0x7faf1812e500 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:03:47.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.004+0000 7faf1da1f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faf18083bd0 con 0x7faf180836c0 2026-03-09T00:03:47.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.004+0000 7faf1da1f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faf18083d40 con 0x7faf18072b50 2026-03-09T00:03:47.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.004+0000 7faf167fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf180836c0 0x7faf1812e500 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:03:47.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.004+0000 7faf16ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf18072b50 0x7faf18083180 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:03:47.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.004+0000 7faf16ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf18072b50 0x7faf18083180 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:49316/0 (socket says 192.168.123.103:49316) 2026-03-09T00:03:47.003 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.004+0000 7faf16ffd700 1 -- 192.168.123.103:0/2944668462 learned_addr learned my addr 192.168.123.103:0/2944668462 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:03:47.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.004+0000 7faf16ffd700 1 -- 192.168.123.103:0/2944668462 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf180836c0 msgr2=0x7faf1812e500 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:03:47.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.004+0000 7faf16ffd700 1 --2- 192.168.123.103:0/2944668462 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf180836c0 0x7faf1812e500 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:47.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.004+0000 7faf16ffd700 1 -- 192.168.123.103:0/2944668462 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faf10007ed0 con 0x7faf18072b50 
2026-03-09T00:03:47.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.005+0000 7faf16ffd700 1 --2- 192.168.123.103:0/2944668462 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf18072b50 0x7faf18083180 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7faf0800d8d0 tx=0x7faf0800dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:03:47.004 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.005+0000 7faf1ca1d700 1 -- 192.168.123.103:0/2944668462 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faf08009940 con 0x7faf18072b50 2026-03-09T00:03:47.005 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.005+0000 7faf1da1f700 1 -- 192.168.123.103:0/2944668462 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faf1812eaa0 con 0x7faf18072b50 2026-03-09T00:03:47.005 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.005+0000 7faf1da1f700 1 -- 192.168.123.103:0/2944668462 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faf1812eff0 con 0x7faf18072b50 2026-03-09T00:03:47.005 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.006+0000 7faf1ca1d700 1 -- 192.168.123.103:0/2944668462 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7faf08010460 con 0x7faf18072b50 2026-03-09T00:03:47.005 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.006+0000 7faf1ca1d700 1 -- 192.168.123.103:0/2944668462 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faf0800f5d0 con 0x7faf18072b50 2026-03-09T00:03:47.005 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.006+0000 7faf1da1f700 1 -- 192.168.123.103:0/2944668462 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faf04005320 con 0x7faf18072b50 2026-03-09T00:03:47.006 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.007+0000 7faf1ca1d700 1 -- 192.168.123.103:0/2944668462 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 25) v1 ==== 95238+0+0 (secure 0 0 0) 0x7faf080105d0 con 0x7faf18072b50 2026-03-09T00:03:47.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.007+0000 7faf1ca1d700 1 --2- 192.168.123.103:0/2944668462 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7faf00071ea0 0x7faf00074360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:03:47.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.007+0000 7faf1ca1d700 1 -- 192.168.123.103:0/2944668462 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(41..41 src has 1..41) v4 ==== 5534+0+0 (secure 0 0 0) 0x7faf08092ca0 con 0x7faf18072b50 2026-03-09T00:03:47.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.008+0000 7faf167fc700 1 --2- 192.168.123.103:0/2944668462 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7faf00071ea0 0x7faf00074360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:03:47.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.008+0000 7faf167fc700 1 --2- 192.168.123.103:0/2944668462 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] 
conn(0x7faf00071ea0 0x7faf00074360 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7faf10000f80 tx=0x7faf1000db00 comp rx=0 tx=0).ready entity=mgr.24345 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:03:47.010 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.010+0000 7faf1ca1d700 1 -- 192.168.123.103:0/2944668462 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7faf0805b240 con 0x7faf18072b50 2026-03-09T00:03:47.019 INFO:tasks.workunit.client.0.vm03.stdout:7/280: write d2/d4/f34 [3263754,98109] 0 2026-03-09T00:03:47.041 INFO:tasks.workunit.client.0.vm03.stdout:4/316: getdents d7/d20/d29/d4e 0 2026-03-09T00:03:47.044 INFO:tasks.workunit.client.0.vm03.stdout:9/262: creat d15/d1c/d28/f55 x:0 0 0 2026-03-09T00:03:47.044 INFO:tasks.workunit.client.0.vm03.stdout:9/263: truncate d15/d1c/d28/f2f 919550 0 2026-03-09T00:03:47.045 INFO:tasks.workunit.client.1.vm06.stdout:1/424: rename d6/ff to d6/f8c 0 2026-03-09T00:03:47.045 INFO:tasks.workunit.client.1.vm06.stdout:9/387: symlink d1/d3/d4f/d52/l7c 0 2026-03-09T00:03:47.063 INFO:tasks.workunit.client.1.vm06.stdout:7/474: mkdir d0/df/d1a/d27/d4c/d40/d51/d86 0 2026-03-09T00:03:47.078 INFO:tasks.workunit.client.0.vm03.stdout:2/232: dwrite d8/fb [0,4194304] 0 2026-03-09T00:03:47.090 INFO:tasks.workunit.client.0.vm03.stdout:8/249: unlink d7/l39 0 2026-03-09T00:03:47.093 INFO:tasks.workunit.client.1.vm06.stdout:2/562: mkdir d7/d1b/da5 0 2026-03-09T00:03:47.093 INFO:tasks.workunit.client.1.vm06.stdout:2/563: read d7/d1a/d25/d66/f8d [3441361,51785] 0 2026-03-09T00:03:47.095 INFO:tasks.workunit.client.1.vm06.stdout:1/425: dwrite d6/d63/f75 [0,4194304] 0 2026-03-09T00:03:47.110 INFO:tasks.workunit.client.1.vm06.stdout:8/444: link db/f28 db/d53/d70/f91 0 2026-03-09T00:03:47.110 INFO:tasks.workunit.client.1.vm06.stdout:4/421: mknod d17/d24/d3b/d5e/d7a/c8d 0 2026-03-09T00:03:47.110 INFO:tasks.workunit.client.1.vm06.stdout:4/422: write f14 [2136760,41455] 0 2026-03-09T00:03:47.126 INFO:tasks.workunit.client.1.vm06.stdout:9/388: truncate d1/d4/f24 3617697 0 2026-03-09T00:03:47.130 INFO:tasks.workunit.client.1.vm06.stdout:7/475: dwrite d0/df/d1a/d35/f61 [0,4194304] 0 2026-03-09T00:03:47.139 INFO:tasks.workunit.client.1.vm06.stdout:6/474: rmdir d4/d27/d42/d52 39 2026-03-09T00:03:47.139 INFO:tasks.workunit.client.1.vm06.stdout:6/475: write d4/d16/d46/f89 [125851,126631] 0 2026-03-09T00:03:47.142 INFO:tasks.workunit.client.1.vm06.stdout:2/564: unlink d7/da/d55/c7e 0 2026-03-09T00:03:47.142 INFO:tasks.workunit.client.1.vm06.stdout:2/565: read - d7/da/d1c/f92 zero size 2026-03-09T00:03:47.143 INFO:tasks.workunit.client.0.vm03.stdout:6/247: rename d13/l4e to d13/d1e/d44/l53 0 2026-03-09T00:03:47.144 INFO:tasks.workunit.client.1.vm06.stdout:6/476: dread d4/d27/d3e/f41 [0,4194304] 0 2026-03-09T00:03:47.144 INFO:tasks.workunit.client.1.vm06.stdout:6/477: chown d4/d27/d42/d7e 379351672 1 2026-03-09T00:03:47.144 INFO:tasks.workunit.client.1.vm06.stdout:6/478: chown d4/f36 327 1 2026-03-09T00:03:47.146 INFO:tasks.workunit.client.0.vm03.stdout:4/317: getdents d7/d20/d29/d38 0 2026-03-09T00:03:47.147 INFO:tasks.workunit.client.1.vm06.stdout:1/426: link d6/d21/f2e d6/d21/d2d/d3b/d87/f8d 0 2026-03-09T00:03:47.150 INFO:tasks.workunit.client.0.vm03.stdout:9/264: mknod d15/d1c/d36/d4d/c56 0 2026-03-09T00:03:47.166 INFO:tasks.workunit.client.1.vm06.stdout:7/476: dwrite d0/df/d1a/d27/d4c/d40/d5b/f78 [0,4194304] 0 2026-03-09T00:03:47.167 
INFO:tasks.workunit.client.0.vm03.stdout:7/281: dwrite d2/d1f/d35/f3e [0,4194304] 0
2026-03-09T00:03:47.182 INFO:tasks.workunit.client.1.vm06.stdout:7/477: dread d0/df/d17/f21 [0,4194304] 0
2026-03-09T00:03:47.183 INFO:tasks.workunit.client.1.vm06.stdout:8/445: rmdir db/d53 39
2026-03-09T00:03:47.184 INFO:tasks.workunit.client.1.vm06.stdout:8/446: read db/d53/d70/d38/d4d/f65 [1256753,2135] 0
2026-03-09T00:03:47.186 INFO:tasks.workunit.client.1.vm06.stdout:4/423: mknod d17/d24/d3b/d5e/c8e 0
2026-03-09T00:03:47.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.187+0000 7faf1da1f700 1 -- 192.168.123.103:0/2944668462 --> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7faf04000bf0 con 0x7faf00071ea0
2026-03-09T00:03:47.188 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.189+0000 7faf1ca1d700 1 -- 192.168.123.103:0/2944668462 <== mgr.24345 v2:192.168.123.106:6828/4100748704 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+310 (secure 0 0 0) 0x7faf04000bf0 con 0x7faf00071ea0
2026-03-09T00:03:47.188 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:03:47.188 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T00:03:47.188 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true,
2026-03-09T00:03:47.188 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading daemons of type(s) mgr",
2026-03-09T00:03:47.188 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [],
2026-03-09T00:03:47.188 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "1/2 daemons upgraded",
2026-03-09T00:03:47.188 INFO:teuthology.orchestra.run.vm03.stdout: "message": "",
2026-03-09T00:03:47.188 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false
2026-03-09T00:03:47.188 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:03:47.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.191+0000 7faefe7fc700 1 -- 192.168.123.103:0/2944668462 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7faf00071ea0 msgr2=0x7faf00074360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:47.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.191+0000 7faefe7fc700 1 --2- 192.168.123.103:0/2944668462 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7faf00071ea0 0x7faf00074360 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7faf10000f80 tx=0x7faf1000db00 comp rx=0 tx=0).stop
2026-03-09T00:03:47.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.191+0000 7faefe7fc700 1 -- 192.168.123.103:0/2944668462 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf18072b50 msgr2=0x7faf18083180 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:03:47.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.191+0000 7faefe7fc700 1 --2- 192.168.123.103:0/2944668462 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf18072b50 0x7faf18083180 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7faf0800d8d0 tx=0x7faf0800dc90 comp rx=0 tx=0).stop
2026-03-09T00:03:47.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.192+0000 7faefe7fc700 1 -- 192.168.123.103:0/2944668462 shutdown_connections
2026-03-09T00:03:47.191
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.192+0000 7faefe7fc700 1 --2- 192.168.123.103:0/2944668462 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7faf18072b50 0x7faf18083180 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:47.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.192+0000 7faefe7fc700 1 --2- 192.168.123.103:0/2944668462 >> [v2:192.168.123.106:6828/4100748704,v1:192.168.123.106:6829/4100748704] conn(0x7faf00071ea0 0x7faf00074360 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:47.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.192+0000 7faefe7fc700 1 --2- 192.168.123.103:0/2944668462 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7faf180836c0 0x7faf1812e500 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:03:47.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.192+0000 7faefe7fc700 1 -- 192.168.123.103:0/2944668462 >> 192.168.123.103:0/2944668462 conn(0x7faf1806dae0 msgr2=0x7faf1806ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:03:47.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.192+0000 7faefe7fc700 1 -- 192.168.123.103:0/2944668462 shutdown_connections 2026-03-09T00:03:47.191 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:03:47.192+0000 7faefe7fc700 1 -- 192.168.123.103:0/2944668462 wait complete. 2026-03-09T00:03:47.194 INFO:tasks.workunit.client.1.vm06.stdout:7/478: write d0/df/d1a/d22/f2c [1868504,58361] 0 2026-03-09T00:03:47.194 INFO:tasks.workunit.client.1.vm06.stdout:7/479: write d0/df/d1a/d27/f43 [4294313,97896] 0 2026-03-09T00:03:47.194 INFO:tasks.workunit.client.0.vm03.stdout:2/233: mknod d8/d1b/d2a/d42/d4b/c4d 0 2026-03-09T00:03:47.194 INFO:tasks.workunit.client.0.vm03.stdout:2/234: dread - d8/d1b/f22 zero size 2026-03-09T00:03:47.205 INFO:tasks.workunit.client.1.vm06.stdout:7/480: write d0/df/d1a/d27/d4c/d40/f41 [520712,10607] 0 2026-03-09T00:03:47.208 INFO:tasks.workunit.client.1.vm06.stdout:5/596: rename d5/d1c/d23/d51 to d5/d1c/d21/d28/d5e/d66/d78/dc8 0 2026-03-09T00:03:47.218 INFO:tasks.workunit.client.1.vm06.stdout:2/566: truncate d7/d1a/d25/f33 2262364 0 2026-03-09T00:03:47.224 INFO:tasks.workunit.client.1.vm06.stdout:2/567: chown d7/d1a/d56/fa4 110210 1 2026-03-09T00:03:47.224 INFO:tasks.workunit.client.1.vm06.stdout:2/568: chown d7/d1a/d56/fa4 29 1 2026-03-09T00:03:47.224 INFO:tasks.workunit.client.1.vm06.stdout:6/479: link d4/d27/d3e/f41 d4/d27/d3e/d78/f91 0 2026-03-09T00:03:47.227 INFO:tasks.workunit.client.1.vm06.stdout:6/480: read d4/d27/d42/d52/f6c [1358775,16319] 0 2026-03-09T00:03:47.227 INFO:tasks.workunit.client.1.vm06.stdout:6/481: chown d4/d27/f84 2164 1 2026-03-09T00:03:47.227 INFO:tasks.workunit.client.1.vm06.stdout:6/482: truncate d4/ff 5013874 0 2026-03-09T00:03:47.227 INFO:tasks.workunit.client.1.vm06.stdout:6/483: truncate d4/d27/d3e/d57/f65 1007124 0 2026-03-09T00:03:47.229 INFO:tasks.workunit.client.0.vm03.stdout:5/290: rename d1c/d20/d55/d4f/d5f to d1c/d20/d55/d66 0 2026-03-09T00:03:47.240 INFO:tasks.workunit.client.0.vm03.stdout:7/282: dwrite d2/d4/f13 [0,4194304] 0 2026-03-09T00:03:47.240 INFO:tasks.workunit.client.0.vm03.stdout:7/283: fdatasync d2/d1f/d35/f5a 0 2026-03-09T00:03:47.246 INFO:tasks.workunit.client.0.vm03.stdout:6/248: creat d13/d1e/d44/d4a/d52/f54 x:0 0 0 2026-03-09T00:03:47.250 INFO:tasks.workunit.client.0.vm03.stdout:3/205: sync 
2026-03-09T00:03:47.250 INFO:tasks.workunit.client.0.vm03.stdout:3/206: stat d2/db 0
2026-03-09T00:03:47.263 INFO:tasks.workunit.client.1.vm06.stdout:1/427: creat d6/d4c/f8e x:0 0 0
2026-03-09T00:03:47.264 INFO:tasks.workunit.client.1.vm06.stdout:8/447: dwrite db/dd/f27 [0,4194304] 0
2026-03-09T00:03:47.264 INFO:tasks.workunit.client.1.vm06.stdout:8/448: dread - db/dd/f67 zero size
2026-03-09T00:03:47.265 INFO:tasks.workunit.client.1.vm06.stdout:4/424: mkdir d17/d5b/d8f 0
2026-03-09T00:03:47.265 INFO:tasks.workunit.client.1.vm06.stdout:4/425: dread - d17/f81 zero size
2026-03-09T00:03:47.265 INFO:tasks.workunit.client.1.vm06.stdout:4/426: chown d17/d21/f5d 724127 1
2026-03-09T00:03:47.265 INFO:tasks.workunit.client.1.vm06.stdout:4/427: creat d17/d21/d4c/f90 x:0 0 0
2026-03-09T00:03:47.265 INFO:tasks.workunit.client.0.vm03.stdout:4/318: mkdir d7/d20/d35/d66 0
2026-03-09T00:03:47.265 INFO:tasks.workunit.client.0.vm03.stdout:4/319: write d7/f1c [3587799,52388] 0
2026-03-09T00:03:47.265 INFO:tasks.workunit.client.0.vm03.stdout:4/320: dread - d7/d20/d29/f53 zero size
2026-03-09T00:03:47.273 INFO:tasks.workunit.client.0.vm03.stdout:9/265: dwrite d15/f1b [0,4194304] 0
2026-03-09T00:03:47.273 INFO:tasks.workunit.client.0.vm03.stdout:2/235: dwrite d8/d17/f2c [0,4194304] 0
2026-03-09T00:03:47.273 INFO:tasks.workunit.client.1.vm06.stdout:8/449: write db/f3f [3350719,92626] 0
2026-03-09T00:03:47.273 INFO:tasks.workunit.client.1.vm06.stdout:8/450: truncate db/d53/d70/d38/d4d/f65 4442959 0
2026-03-09T00:03:47.273 INFO:tasks.workunit.client.1.vm06.stdout:8/451: chown db/dd/f13 2076 1
2026-03-09T00:03:47.279 INFO:tasks.workunit.client.1.vm06.stdout:7/481: unlink d0/df/d1a/d27/d4c/d40/l5f 0
2026-03-09T00:03:47.281 INFO:tasks.workunit.client.0.vm03.stdout:9/266: write f8 [4154700,110081] 0
2026-03-09T00:03:47.286 INFO:tasks.workunit.client.0.vm03.stdout:8/250: getdents d7/df/d1e 0
2026-03-09T00:03:47.289 INFO:tasks.workunit.client.1.vm06.stdout:5/597: creat d5/d1c/d21/d28/d5e/d66/fc9 x:0 0 0
2026-03-09T00:03:47.292 INFO:tasks.workunit.client.1.vm06.stdout:3/555: rename d11/d28/l34 to d11/d28/d4d/lc4 0
2026-03-09T00:03:47.293 INFO:tasks.workunit.client.0.vm03.stdout:5/291: dwrite - open d1c/d20/f3e failed 14
2026-03-09T00:03:47.295 INFO:tasks.workunit.client.1.vm06.stdout:2/569: getdents d7/d1b 0
2026-03-09T00:03:47.308 INFO:tasks.workunit.client.1.vm06.stdout:6/484: creat d4/d27/d3e/d78/f92 x:0 0 0
2026-03-09T00:03:47.308 INFO:tasks.workunit.client.1.vm06.stdout:6/485: chown d4/d27/d3e/d57/f5c 32604440 1
2026-03-09T00:03:47.308 INFO:tasks.workunit.client.1.vm06.stdout:6/486: fsync d4/d16/f5e 0
2026-03-09T00:03:47.310 INFO:tasks.workunit.client.0.vm03.stdout:7/284: creat d2/d1f/d3a/f5d x:0 0 0
2026-03-09T00:03:47.310 INFO:tasks.workunit.client.0.vm03.stdout:7/285: write d2/fc [2112472,69741] 0
2026-03-09T00:03:47.310 INFO:tasks.workunit.client.0.vm03.stdout:7/286: write d2/d4/f2e [836158,99419] 0
2026-03-09T00:03:47.310 INFO:tasks.workunit.client.0.vm03.stdout:7/287: fsync d2/d1f/d35/f5a 0
2026-03-09T00:03:47.311 INFO:tasks.workunit.client.1.vm06.stdout:6/487: dread d4/f26 [0,4194304] 0
2026-03-09T00:03:47.311 INFO:tasks.workunit.client.0.vm03.stdout:7/288: write d2/f3 [4239415,40511] 0
2026-03-09T00:03:47.313 INFO:tasks.workunit.client.1.vm06.stdout:6/488: dread d4/d27/d3e/d78/f91 [0,4194304] 0
2026-03-09T00:03:47.313 INFO:tasks.workunit.client.1.vm06.stdout:6/489: readlink d4/d27/d3e/l56 0
2026-03-09T00:03:47.314 INFO:tasks.workunit.client.0.vm03.stdout:6/249: creat d13/f55 x:0 0 0
2026-03-09T00:03:47.317 INFO:tasks.workunit.client.0.vm03.stdout:4/321: mknod d7/d20/d29/d54/c67 0
2026-03-09T00:03:47.317 INFO:tasks.workunit.client.0.vm03.stdout:4/322: dread - d7/d20/d29/d38/d3a/f4b zero size
2026-03-09T00:03:47.320 INFO:tasks.workunit.client.1.vm06.stdout:4/428: rmdir d17/d24 39
2026-03-09T00:03:47.333 INFO:tasks.workunit.client.0.vm03.stdout:9/267: dwrite d15/f1f [0,4194304] 0
2026-03-09T00:03:47.333 INFO:tasks.workunit.client.0.vm03.stdout:9/268: stat d15/d1c/d28/f39 0
2026-03-09T00:03:47.334 INFO:tasks.workunit.client.1.vm06.stdout:1/428: dwrite d6/d21/d2d/d3b/d42/f80 [4194304,4194304] 0
2026-03-09T00:03:47.340 INFO:tasks.workunit.client.0.vm03.stdout:9/269: write fb [1782770,53185] 0
2026-03-09T00:03:47.348 INFO:tasks.workunit.client.0.vm03.stdout:1/336: sync
2026-03-09T00:03:47.348 INFO:tasks.workunit.client.0.vm03.stdout:1/337: stat d4/d15/f35 0
2026-03-09T00:03:47.348 INFO:tasks.workunit.client.0.vm03.stdout:1/338: write d4/d15/f45 [142653,6257] 0
2026-03-09T00:03:47.349 INFO:tasks.workunit.client.0.vm03.stdout:0/265: sync
2026-03-09T00:03:47.349 INFO:tasks.workunit.client.0.vm03.stdout:0/266: fsync d2/da/dd/f24 0
2026-03-09T00:03:47.349 INFO:tasks.workunit.client.0.vm03.stdout:0/267: creat d2/da/d1a/f56 x:0 0 0
2026-03-09T00:03:47.354 INFO:tasks.workunit.client.1.vm06.stdout:7/482: creat d0/df/d1a/d35/d62/f87 x:0 0 0
2026-03-09T00:03:47.355 INFO:tasks.workunit.client.1.vm06.stdout:5/598: truncate d5/f3d 2376907 0
2026-03-09T00:03:47.366 INFO:tasks.workunit.client.0.vm03.stdout:8/251: truncate d7/df/f3d 1292712 0
2026-03-09T00:03:47.369 INFO:tasks.workunit.client.1.vm06.stdout:0/477: rename d3/d18/c16 to d3/c9e 0
2026-03-09T00:03:47.369 INFO:tasks.workunit.client.1.vm06.stdout:0/478: stat d3/d18/d2c/d2d/d31/f5d 0
2026-03-09T00:03:47.371 INFO:tasks.workunit.client.1.vm06.stdout:6/490: rmdir d4/d16/d53 39
2026-03-09T00:03:47.371 INFO:tasks.workunit.client.1.vm06.stdout:6/491: creat d4/d16/d46/f93 x:0 0 0
2026-03-09T00:03:47.372 INFO:tasks.workunit.client.1.vm06.stdout:0/479: dread d3/d18/d1f/f5e [0,4194304] 0
2026-03-09T00:03:47.373 INFO:tasks.workunit.client.0.vm03.stdout:7/289: stat d2/d1f/l20 0
2026-03-09T00:03:47.373 INFO:tasks.workunit.client.0.vm03.stdout:7/290: fdatasync d2/d1f/d3a/d31/f3f 0
2026-03-09T00:03:47.377 INFO:tasks.workunit.client.0.vm03.stdout:4/323: link d7/d20/d29/f2a d7/d20/d35/f68 0
2026-03-09T00:03:47.380 INFO:tasks.workunit.client.0.vm03.stdout:9/270: dwrite d15/d1c/d28/f55 [0,4194304] 0
2026-03-09T00:03:47.382 INFO:tasks.workunit.client.0.vm03.stdout:4/324: dread d7/d23/d25/f3e [0,4194304] 0
2026-03-09T00:03:47.382 INFO:tasks.workunit.client.0.vm03.stdout:4/325: truncate d7/d20/d29/d38/d3a/f4b 220682 0
2026-03-09T00:03:47.390 INFO:tasks.workunit.client.1.vm06.stdout:7/483: symlink d0/df/d1a/d27/d4c/d40/d51/d86/l88 0
2026-03-09T00:03:47.394 INFO:tasks.workunit.client.1.vm06.stdout:7/484: read - d0/df/d1a/d27/f37 zero size
2026-03-09T00:03:47.395 INFO:tasks.workunit.client.0.vm03.stdout:4/326: dread d7/f1d [0,4194304] 0
2026-03-09T00:03:47.396 INFO:tasks.workunit.client.1.vm06.stdout:4/429: dwrite d17/d5b/f83 [0,4194304] 0
2026-03-09T00:03:47.399 INFO:tasks.workunit.client.1.vm06.stdout:3/556: link d11/fa2 d11/d28/d4d/d9b/fc5 0
2026-03-09T00:03:47.406 INFO:tasks.workunit.client.1.vm06.stdout:9/389: rename d1/d4/d6e/d14/d25/c38 to d1/d73/c7d 0
2026-03-09T00:03:47.412 INFO:tasks.workunit.client.1.vm06.stdout:6/492: mkdir d4/d16/d46/d94 0
2026-03-09T00:03:47.413 INFO:tasks.workunit.client.0.vm03.stdout:1/339: mknod d4/d15/d1a/c73 0
2026-03-09T00:03:47.435 INFO:tasks.workunit.client.1.vm06.stdout:5/599: dwrite d5/d1c/d23/f54 [0,4194304] 0
2026-03-09T00:03:47.435 INFO:tasks.workunit.client.1.vm06.stdout:5/600: creat d5/d1c/d21/d28/d5e/d66/fca x:0 0 0
2026-03-09T00:03:47.437 INFO:tasks.workunit.client.0.vm03.stdout:0/268: creat d2/da/d36/d39/f57 x:0 0 0
2026-03-09T00:03:47.444 INFO:tasks.workunit.client.1.vm06.stdout:1/429: rmdir d6/d63 39
2026-03-09T00:03:47.444 INFO:tasks.workunit.client.1.vm06.stdout:1/430: chown d6/c36 8900515 1
2026-03-09T00:03:47.450 INFO:tasks.workunit.client.0.vm03.stdout:3/207: sync
2026-03-09T00:03:47.450 INFO:tasks.workunit.client.0.vm03.stdout:3/208: write d2/f30 [3252735,26926] 0
2026-03-09T00:03:47.450 INFO:tasks.workunit.client.0.vm03.stdout:2/236: rmdir d8/d1b 39
2026-03-09T00:03:47.451 INFO:tasks.workunit.client.0.vm03.stdout:8/252: truncate f6 911679 0
2026-03-09T00:03:47.451 INFO:tasks.workunit.client.0.vm03.stdout:8/253: dread - d7/f48 zero size
2026-03-09T00:03:47.451 INFO:tasks.workunit.client.0.vm03.stdout:8/254: chown d7/df/d1a/d2b 479387519 1
2026-03-09T00:03:47.451 INFO:tasks.workunit.client.0.vm03.stdout:5/292: truncate d1c/f30 1250161 0
2026-03-09T00:03:47.464 INFO:tasks.workunit.client.1.vm06.stdout:4/430: getdents d17/d24 0
2026-03-09T00:03:47.471 INFO:tasks.workunit.client.0.vm03.stdout:7/291: mkdir d2/d4/d1e/d5e 0
2026-03-09T00:03:47.471 INFO:tasks.workunit.client.0.vm03.stdout:6/250: link d13/d1e/l43 d13/d1e/l56 0
2026-03-09T00:03:47.471 INFO:tasks.workunit.client.0.vm03.stdout:6/251: chown d13/d1e/f30 671668 1
2026-03-09T00:03:47.474 INFO:tasks.workunit.client.1.vm06.stdout:7/485: dwrite d0/f7 [0,4194304] 0
2026-03-09T00:03:47.476 INFO:tasks.workunit.client.0.vm03.stdout:6/252: write d13/f1c [4084017,95045] 0
2026-03-09T00:03:47.487 INFO:tasks.workunit.client.0.vm03.stdout:7/292: read d2/d1f/d35/f3e [728361,3647] 0
2026-03-09T00:03:47.496 INFO:tasks.workunit.client.0.vm03.stdout:7/293: dread d2/d4/f13 [0,4194304] 0
2026-03-09T00:03:47.496 INFO:tasks.workunit.client.0.vm03.stdout:7/294: write d2/d1f/d42/d43/f4a [433695,93176] 0
2026-03-09T00:03:47.503 INFO:tasks.workunit.client.0.vm03.stdout:4/327: creat d7/d20/d35/d66/f69 x:0 0 0
2026-03-09T00:03:47.503 INFO:tasks.workunit.client.0.vm03.stdout:2/237: rmdir d8/d1b/d2a/d42/d43 0
2026-03-09T00:03:47.513 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:47 vm03.local ceph-mon[52346]: pgmap v8: 65 pgs: 65 active+clean; 1.5 GiB data, 5.8 GiB used, 114 GiB / 120 GiB avail; 71 MiB/s rd, 92 MiB/s wr, 149 op/s
2026-03-09T00:03:47.513 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:47 vm03.local ceph-mon[52346]: from='client.24375 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:03:47.513 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:47 vm03.local ceph-mon[52346]: from='client.24379 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:03:47.513 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:47 vm03.local ceph-mon[52346]: from='client.? 192.168.123.103:0/3149584614' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:03:47.513 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:47 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:47.513 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:47 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:47.513 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:47 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T00:03:47.513 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:47 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:03:47.518 INFO:tasks.workunit.client.1.vm06.stdout:8/452: rename db/dd/f13 to db/dd/d24/d63/f92 0
2026-03-09T00:03:47.518 INFO:tasks.workunit.client.1.vm06.stdout:8/453: chown db/d74/d78 959102 1
2026-03-09T00:03:47.518 INFO:tasks.workunit.client.1.vm06.stdout:8/454: truncate db/d53/d5c/f6f 721264 0
2026-03-09T00:03:47.519 INFO:tasks.workunit.client.1.vm06.stdout:6/493: rmdir d4/d16/d53 39
2026-03-09T00:03:47.519 INFO:tasks.workunit.client.1.vm06.stdout:6/494: creat d4/d27/d42/d4b/f95 x:0 0 0
2026-03-09T00:03:47.521 INFO:tasks.workunit.client.1.vm06.stdout:0/480: getdents d3/d18/d1f/d44/d6a 0
2026-03-09T00:03:47.522 INFO:tasks.workunit.client.1.vm06.stdout:0/481: fsync d3/f29 0
2026-03-09T00:03:47.536 INFO:tasks.workunit.client.1.vm06.stdout:5/601: mknod d5/d1c/d21/d28/d5e/d66/d78/ccb 0
2026-03-09T00:03:47.537 INFO:tasks.workunit.client.0.vm03.stdout:1/340: dwrite d4/d15/d5c/f6f [0,4194304] 0
2026-03-09T00:03:47.537 INFO:tasks.workunit.client.0.vm03.stdout:1/341: fdatasync d4/d3a/d32/f4b 0
2026-03-09T00:03:47.537 INFO:tasks.workunit.client.0.vm03.stdout:1/342: creat d4/d15/d5c/f74 x:0 0 0
2026-03-09T00:03:47.539 INFO:tasks.workunit.client.1.vm06.stdout:5/602: dread d5/d1c/d21/d28/d5e/d66/d78/dc8/f7a [0,4194304] 0
2026-03-09T00:03:47.550 INFO:tasks.workunit.client.0.vm03.stdout:3/209: dwrite d2/f8 [0,4194304] 0
2026-03-09T00:03:47.550 INFO:tasks.workunit.client.1.vm06.stdout:4/431: mknod d17/c91 0
2026-03-09T00:03:47.550 INFO:tasks.workunit.client.1.vm06.stdout:4/432: chown ca 124346 1
2026-03-09T00:03:47.555 INFO:tasks.workunit.client.0.vm03.stdout:7/295: rmdir d2/d1f/d42/d46 39
2026-03-09T00:03:47.555 INFO:tasks.workunit.client.0.vm03.stdout:7/296: readlink d2/l2f 0
2026-03-09T00:03:47.556 INFO:tasks.workunit.client.0.vm03.stdout:6/253: dread f10 [0,4194304] 0
2026-03-09T00:03:47.557 INFO:tasks.workunit.client.0.vm03.stdout:4/328: dwrite d7/d20/d29/d38/d3a/f4b [0,4194304] 0
2026-03-09T00:03:47.558 INFO:tasks.workunit.client.0.vm03.stdout:4/329: write d7/f62 [355605,11248] 0
2026-03-09T00:03:47.558 INFO:tasks.workunit.client.0.vm03.stdout:9/271: truncate f8 3145583 0
2026-03-09T00:03:47.558 INFO:tasks.workunit.client.0.vm03.stdout:9/272: chown d15/d1c/d28/d30/f3d 180897 1
2026-03-09T00:03:47.559 INFO:tasks.workunit.client.0.vm03.stdout:2/238: mknod d8/d1b/d2a/d2e/c4e 0
2026-03-09T00:03:47.559 INFO:tasks.workunit.client.0.vm03.stdout:8/255: mkdir d7/df/d1e/d38/d4c 0
2026-03-09T00:03:47.559 INFO:tasks.workunit.client.0.vm03.stdout:8/256: write d7/df/d1e/f3a [19494,94305] 0
2026-03-09T00:03:47.561 INFO:tasks.workunit.client.0.vm03.stdout:1/343: stat d4/d6/l5b 0
2026-03-09T00:03:47.561 INFO:tasks.workunit.client.0.vm03.stdout:1/344: chown f0 7550 1
2026-03-09T00:03:47.561 INFO:tasks.workunit.client.0.vm03.stdout:3/210: creat d2/db/d3b/f3e x:0 0 0
2026-03-09T00:03:47.563 INFO:tasks.workunit.client.1.vm06.stdout:8/455: rmdir db/dd/d24 39
2026-03-09T00:03:47.573 INFO:tasks.workunit.client.1.vm06.stdout:1/431: unlink d6/c11 0
2026-03-09T00:03:47.573 INFO:tasks.workunit.client.1.vm06.stdout:6/495: rename d4/d27/d3e/d57/c8e to d4/d16/d46/d94/c96 0
2026-03-09T00:03:47.573 INFO:tasks.workunit.client.1.vm06.stdout:1/432: write d6/d21/d2d/f74 [807481,93621] 0
2026-03-09T00:03:47.573 INFO:tasks.workunit.client.1.vm06.stdout:8/456: rename db/d1e/f58 to db/d74/d78/f93 0
2026-03-09T00:03:47.573 INFO:tasks.workunit.client.0.vm03.stdout:6/254: mknod d13/d1e/d44/d4a/c57 0
2026-03-09T00:03:47.573 INFO:tasks.workunit.client.0.vm03.stdout:4/330: truncate d7/d20/f34 554245 0
2026-03-09T00:03:47.573 INFO:tasks.workunit.client.0.vm03.stdout:9/273: mknod d15/d1c/c57 0
2026-03-09T00:03:47.573 INFO:tasks.workunit.client.0.vm03.stdout:2/239: mkdir d8/d1b/d2a/d42/d4b/d4f 0
2026-03-09T00:03:47.573 INFO:tasks.workunit.client.0.vm03.stdout:8/257: creat d7/df/d1a/d40/f4d x:0 0 0
2026-03-09T00:03:47.574 INFO:tasks.workunit.client.0.vm03.stdout:1/345: link d4/d15/f35 d4/d3a/d61/f75 0
2026-03-09T00:03:47.574 INFO:tasks.workunit.client.0.vm03.stdout:1/346: truncate d4/d3a/d3d/d46/f70 206314 0
2026-03-09T00:03:47.574 INFO:tasks.workunit.client.0.vm03.stdout:1/347: fdatasync d4/d3a/f48 0
2026-03-09T00:03:47.576 INFO:tasks.workunit.client.0.vm03.stdout:3/211: unlink d2/c3 0
2026-03-09T00:03:47.577 INFO:tasks.workunit.client.0.vm03.stdout:6/255: creat d13/d1e/d44/d4a/f58 x:0 0 0
2026-03-09T00:03:47.579 INFO:tasks.workunit.client.1.vm06.stdout:1/433: write d6/d21/f3d [4133739,58861] 0
2026-03-09T00:03:47.579 INFO:tasks.workunit.client.1.vm06.stdout:1/434: truncate d6/f25 5597302 0
2026-03-09T00:03:47.579 INFO:tasks.workunit.client.0.vm03.stdout:4/331: dread d7/f15 [4194304,4194304] 0
2026-03-09T00:03:47.579 INFO:tasks.workunit.client.0.vm03.stdout:4/332: fsync d7/fe 0
2026-03-09T00:03:47.580 INFO:tasks.workunit.client.1.vm06.stdout:8/457: read db/d53/d70/d38/f5b [1183238,81750] 0
2026-03-09T00:03:47.580 INFO:tasks.workunit.client.0.vm03.stdout:2/240: mkdir d8/d1b/d2a/d42/d4b/d50 0
2026-03-09T00:03:47.583 INFO:tasks.workunit.client.0.vm03.stdout:6/256: dread d13/f31 [0,4194304] 0
2026-03-09T00:03:47.583 INFO:tasks.workunit.client.0.vm03.stdout:6/257: write d13/f55 [856979,43652] 0
2026-03-09T00:03:47.584 INFO:tasks.workunit.client.0.vm03.stdout:1/348: rename d4/d15/f35 to d4/d3a/d32/d6a/f76 0
2026-03-09T00:03:47.584 INFO:tasks.workunit.client.0.vm03.stdout:1/349: chown d4/d3a/f2c 35340051 1
2026-03-09T00:03:47.584 INFO:tasks.workunit.client.0.vm03.stdout:4/333: mkdir d7/d20/d6a 0
2026-03-09T00:03:47.586 INFO:tasks.workunit.client.1.vm06.stdout:1/435: mkdir d6/d8f 0
2026-03-09T00:03:47.586 INFO:tasks.workunit.client.0.vm03.stdout:2/241: write d8/d17/f1c [4021831,38415] 0
2026-03-09T00:03:47.595 INFO:tasks.workunit.client.1.vm06.stdout:1/436: creat d6/d4c/f90 x:0 0 0
2026-03-09T00:03:47.597 INFO:tasks.workunit.client.0.vm03.stdout:3/212: write d2/db/f10 [4056474,36184] 0
2026-03-09T00:03:47.597 INFO:tasks.workunit.client.0.vm03.stdout:3/213: getdents d2/db 0
2026-03-09T00:03:47.604 INFO:tasks.workunit.client.1.vm06.stdout:5/603: dwrite d5/f19 [0,4194304] 0
2026-03-09T00:03:47.604 INFO:tasks.workunit.client.1.vm06.stdout:5/604: chown d5/d1c/d23/f82 2579376 1
2026-03-09T00:03:47.633 INFO:tasks.workunit.client.0.vm03.stdout:6/258: mkdir d13/d1e/d44/d59 0
2026-03-09T00:03:47.638 INFO:tasks.workunit.client.0.vm03.stdout:1/350: mkdir d4/d15/d77 0
2026-03-09T00:03:47.651 INFO:tasks.workunit.client.0.vm03.stdout:4/334: unlink d7/d20/d29/d38/d3a/l5e 0
2026-03-09T00:03:47.651 INFO:tasks.workunit.client.0.vm03.stdout:4/335: getdents d7/d20/d6a 0
2026-03-09T00:03:47.651 INFO:tasks.workunit.client.0.vm03.stdout:4/336: truncate d7/fe 4853141 0
2026-03-09T00:03:47.657 INFO:tasks.workunit.client.0.vm03.stdout:7/297: dread d2/d4/f2e [0,4194304] 0
2026-03-09T00:03:47.667 INFO:tasks.workunit.client.0.vm03.stdout:1/351: mkdir d4/d3a/d61/d78 0
2026-03-09T00:03:47.667 INFO:tasks.workunit.client.0.vm03.stdout:1/352: write d4/d3a/d32/f4b [1091077,74122] 0
2026-03-09T00:03:47.673 INFO:tasks.workunit.client.1.vm06.stdout:2/570: sync
2026-03-09T00:03:47.673 INFO:tasks.workunit.client.1.vm06.stdout:2/571: chown d7/d1a/d3c/c69 59788387 1
2026-03-09T00:03:47.673 INFO:tasks.workunit.client.1.vm06.stdout:2/572: readlink d7/d1b/d5a/l8f 0
2026-03-09T00:03:47.673 INFO:tasks.workunit.client.1.vm06.stdout:9/390: sync
2026-03-09T00:03:47.673 INFO:tasks.workunit.client.1.vm06.stdout:9/391: write d1/d3/d2b/f6d [699614,118420] 0
2026-03-09T00:03:47.673 INFO:tasks.workunit.client.1.vm06.stdout:3/557: sync
2026-03-09T00:03:47.674 INFO:tasks.workunit.client.1.vm06.stdout:9/392: getdents d1/d3/d4f/d52 0
2026-03-09T00:03:47.675 INFO:tasks.workunit.client.1.vm06.stdout:6/496: getdents d4/d27/d3e/d57 0
2026-03-09T00:03:47.675 INFO:tasks.workunit.client.1.vm06.stdout:6/497: getdents d4/d27/d42/d52/d7d 0
2026-03-09T00:03:47.676 INFO:tasks.workunit.client.1.vm06.stdout:3/558: stat d11/d3f/c4b 0
2026-03-09T00:03:47.677 INFO:tasks.workunit.client.1.vm06.stdout:9/393: rmdir d1/d3/d2b/d58 39
2026-03-09T00:03:47.677 INFO:tasks.workunit.client.1.vm06.stdout:9/394: fsync d1/d4/f44 0
2026-03-09T00:03:47.677 INFO:tasks.workunit.client.1.vm06.stdout:6/498: mkdir d4/d27/d3e/d78/d97 0
2026-03-09T00:03:47.678 INFO:tasks.workunit.client.1.vm06.stdout:3/559: fsync d11/f1e 0
2026-03-09T00:03:47.678 INFO:tasks.workunit.client.1.vm06.stdout:3/560: creat d11/d28/d4d/fc6 x:0 0 0
2026-03-09T00:03:47.678 INFO:tasks.workunit.client.1.vm06.stdout:3/561: chown d11/d28/d2e/d2f/d36/d8f/cbc 21227 1
2026-03-09T00:03:47.678 INFO:tasks.workunit.client.1.vm06.stdout:3/562: readlink d11/d28/d2e/d2f/d5b/d5f/l8b 0
2026-03-09T00:03:47.678 INFO:tasks.workunit.client.1.vm06.stdout:3/563: chown d11/d28/d2e/d2f/f49 23 1
2026-03-09T00:03:47.678 INFO:tasks.workunit.client.1.vm06.stdout:6/499: mknod d4/d27/c98 0
2026-03-09T00:03:47.679 INFO:tasks.workunit.client.1.vm06.stdout:6/500: link d4/d27/d3e/d57/c71 d4/d27/d3e/c99 0
2026-03-09T00:03:47.680 INFO:tasks.workunit.client.1.vm06.stdout:3/564: rename d11/d28/d4d/fc6 to d11/d28/d4d/fc7 0
2026-03-09T00:03:47.681 INFO:tasks.workunit.client.1.vm06.stdout:6/501: rmdir d4/d16/d46/d94 39
2026-03-09T00:03:47.688 INFO:tasks.workunit.client.1.vm06.stdout:0/482: dwrite d3/f7 [0,4194304] 0
2026-03-09T00:03:47.690 INFO:tasks.workunit.client.1.vm06.stdout:0/483: mkdir d3/d18/d2c/d2d/d74/d7d/d9f 0
2026-03-09T00:03:47.690 INFO:tasks.workunit.client.1.vm06.stdout:0/484: chown d3/d18/d1f/d39/d3b 0 1
2026-03-09T00:03:47.690 INFO:tasks.workunit.client.1.vm06.stdout:0/485: creat d3/d18/d3c/fa0 x:0 0 0
2026-03-09T00:03:47.690 INFO:tasks.workunit.client.1.vm06.stdout:0/486: rmdir d3/d18/d1f/d39/d3b 39
2026-03-09T00:03:47.692 INFO:tasks.workunit.client.1.vm06.stdout:9/395: dread d1/d4/d6e/d14/d25/f32 [0,4194304] 0
2026-03-09T00:03:47.692 INFO:tasks.workunit.client.1.vm06.stdout:9/396: write d1/d4/d6e/f5d [577010,66497] 0
2026-03-09T00:03:47.693 INFO:tasks.workunit.client.1.vm06.stdout:0/487: mknod d3/d18/d1f/d39/d3b/ca1 0
2026-03-09T00:03:47.693 INFO:tasks.workunit.client.1.vm06.stdout:0/488: stat d3/d18/l67 0
2026-03-09T00:03:47.693 INFO:tasks.workunit.client.1.vm06.stdout:9/397: symlink d1/d73/l7e 0
2026-03-09T00:03:47.694 INFO:tasks.workunit.client.1.vm06.stdout:9/398: link d1/d4/d6e/d14/d25/f4e d1/d4/d2f/f7f 0
2026-03-09T00:03:47.699 INFO:tasks.workunit.client.1.vm06.stdout:0/489: write d3/d18/d1f/d44/f5a [3759218,82399] 0
2026-03-09T00:03:47.699 INFO:tasks.workunit.client.1.vm06.stdout:0/490: creat d3/d18/d3c/fa2 x:0 0 0
2026-03-09T00:03:47.699 INFO:tasks.workunit.client.1.vm06.stdout:0/491: chown d3/d18/d1f/d39/d3b/f57 318727636 1
2026-03-09T00:03:47.700 INFO:tasks.workunit.client.1.vm06.stdout:0/492: creat d3/d18/d2c/d2d/d74/d90/fa3 x:0 0 0
2026-03-09T00:03:47.722 INFO:tasks.workunit.client.1.vm06.stdout:6/502: dread d4/f38 [0,4194304] 0
2026-03-09T00:03:47.722 INFO:tasks.workunit.client.1.vm06.stdout:6/503: readlink d4/d27/l3a 0
2026-03-09T00:03:47.722 INFO:tasks.workunit.client.1.vm06.stdout:1/437: dwrite d6/d21/d2d/f3c [0,4194304] 0
2026-03-09T00:03:47.722 INFO:tasks.workunit.client.1.vm06.stdout:1/438: dread - d6/f81 zero size
2026-03-09T00:03:47.722 INFO:tasks.workunit.client.1.vm06.stdout:1/439: readlink d6/d4c/d79/l53 0
2026-03-09T00:03:47.726 INFO:tasks.workunit.client.1.vm06.stdout:6/504: mknod d4/d16/c9a 0
2026-03-09T00:03:47.726 INFO:tasks.workunit.client.1.vm06.stdout:1/440: symlink d6/d21/d2d/d37/l91 0
2026-03-09T00:03:47.726 INFO:tasks.workunit.client.1.vm06.stdout:1/441: mknod d6/d8f/c92 0
2026-03-09T00:03:47.726 INFO:tasks.workunit.client.1.vm06.stdout:1/442: write d6/d4c/d71/f4b [282387,45082] 0
2026-03-09T00:03:47.727 INFO:tasks.workunit.client.1.vm06.stdout:6/505: creat d4/d27/d3e/d78/d97/f9b x:0 0 0
2026-03-09T00:03:47.727 INFO:tasks.workunit.client.1.vm06.stdout:6/506: write d4/d16/f34 [6698853,63399] 0
2026-03-09T00:03:47.728 INFO:tasks.workunit.client.1.vm06.stdout:6/507: mknod d4/d27/c9c 0
2026-03-09T00:03:47.728 INFO:tasks.workunit.client.1.vm06.stdout:1/443: rmdir d6/d21/d2d/d3b 39
2026-03-09T00:03:47.729 INFO:tasks.workunit.client.1.vm06.stdout:1/444: getdents d6/d21/d2d/d3b 0
2026-03-09T00:03:47.730 INFO:tasks.workunit.client.1.vm06.stdout:1/445: mknod d6/d21/d2d/c93 0
2026-03-09T00:03:47.730 INFO:tasks.workunit.client.1.vm06.stdout:1/446: rename d6/d21/d2d/l33 to d6/d21/d2d/d3b/d87/l94 0
2026-03-09T00:03:47.735 INFO:tasks.workunit.client.1.vm06.stdout:6/508: write d4/f3d [1589673,96648] 0
2026-03-09T00:03:47.735 INFO:tasks.workunit.client.1.vm06.stdout:6/509: chown d4/d27/d3e 4785711 1
2026-03-09T00:03:47.735 INFO:tasks.workunit.client.1.vm06.stdout:1/447: unlink d6/c8 0
2026-03-09T00:03:47.735 INFO:tasks.workunit.client.1.vm06.stdout:6/510: link d4/d27/d42/d52/f6c d4/d27/d42/d52/d7d/f9d 0
2026-03-09T00:03:47.735 INFO:tasks.workunit.client.1.vm06.stdout:6/511: creat d4/d27/d42/d52/f9e x:0 0 0
2026-03-09T00:03:47.735 INFO:tasks.workunit.client.1.vm06.stdout:6/512: write d4/d27/d42/d52/f6c [2097370,73429] 0
2026-03-09T00:03:47.735 INFO:tasks.workunit.client.1.vm06.stdout:6/513: readlink d4/d27/d42/d4b/l4f 0
2026-03-09T00:03:47.743 INFO:tasks.workunit.client.1.vm06.stdout:8/458: dwrite db/d53/d70/f54 [0,4194304] 0
2026-03-09T00:03:47.743 INFO:tasks.workunit.client.1.vm06.stdout:8/459: chown db/d1e/d46/f4b 275884 1
2026-03-09T00:03:47.746 INFO:tasks.workunit.client.1.vm06.stdout:6/514: rename d4/d16/d46/d94/c96 to d4/d27/d3e/d78/d97/c9f 0
2026-03-09T00:03:47.759 INFO:tasks.workunit.client.1.vm06.stdout:8/460: stat db/dd/l19 0
2026-03-09T00:03:47.767 INFO:tasks.workunit.client.0.vm03.stdout:3/214: dwrite d2/db/f28 [0,4194304] 0
2026-03-09T00:03:47.767 INFO:tasks.workunit.client.0.vm03.stdout:3/215: chown d2/f2a 21 1
2026-03-09T00:03:47.767 INFO:tasks.workunit.client.0.vm03.stdout:6/259: dwrite d13/d1e/d44/d4a/f58 [0,4194304] 0
2026-03-09T00:03:47.772 INFO:tasks.workunit.client.1.vm06.stdout:7/486: dwrite d0/df/d1a/d35/d62/f81 [0,4194304] 0
2026-03-09T00:03:47.777 INFO:tasks.workunit.client.1.vm06.stdout:6/515: rename d4/d27/d42/d4b/l4f to d4/d27/d3e/d78/d97/la0 0
2026-03-09T00:03:47.781 INFO:tasks.workunit.client.1.vm06.stdout:6/516: write d4/f36 [6115709,119111] 0
2026-03-09T00:03:47.783 INFO:tasks.workunit.client.0.vm03.stdout:9/274: dwrite d15/f1b [0,4194304] 0
2026-03-09T00:03:47.787 INFO:tasks.workunit.client.0.vm03.stdout:8/258: rmdir d7 39
2026-03-09T00:03:47.787 INFO:tasks.workunit.client.0.vm03.stdout:8/259: chown d7/df/d1e/f24 2 1
2026-03-09T00:03:47.787 INFO:tasks.workunit.client.0.vm03.stdout:8/260: fsync d7/f34 0
2026-03-09T00:03:47.787 INFO:tasks.workunit.client.0.vm03.stdout:8/261: readlink d7/df/l17 0
2026-03-09T00:03:47.789 INFO:tasks.workunit.client.1.vm06.stdout:6/517: write d4/d16/f21 [3562576,2826] 0
2026-03-09T00:03:47.790 INFO:tasks.workunit.client.1.vm06.stdout:7/487: dread d0/df/d1a/d3a/d4e/d5e/f73 [0,4194304] 0
2026-03-09T00:03:47.792 INFO:tasks.workunit.client.1.vm06.stdout:7/488: dread d0/df/d1a/d3a/d4e/d5e/f73 [0,4194304] 0
2026-03-09T00:03:47.792 INFO:tasks.workunit.client.1.vm06.stdout:7/489: write d0/df/d1a/d27/f66 [429282,117616] 0
2026-03-09T00:03:47.809 INFO:tasks.workunit.client.1.vm06.stdout:8/461: unlink db/d1e/f60 0
2026-03-09T00:03:47.821 INFO:tasks.workunit.client.1.vm06.stdout:5/605: dwrite d5/d1c/d23/fb3 [0,4194304] 0
2026-03-09T00:03:47.834 INFO:tasks.workunit.client.1.vm06.stdout:6/518: symlink d4/d27/d42/d4b/la1 0
2026-03-09T00:03:47.839 INFO:tasks.workunit.client.1.vm06.stdout:6/519: write d4/d16/f33 [5082548,128350] 0
2026-03-09T00:03:47.839 INFO:tasks.workunit.client.1.vm06.stdout:7/490: mknod d0/df/d1a/d3f/d53/c89 0
2026-03-09T00:03:47.839 INFO:tasks.workunit.client.1.vm06.stdout:5/606: mkdir d5/db1/dcc 0
2026-03-09T00:03:47.839 INFO:tasks.workunit.client.1.vm06.stdout:7/491: creat d0/df/f8a x:0 0 0
2026-03-09T00:03:47.839 INFO:tasks.workunit.client.1.vm06.stdout:7/492: chown d0/df/d1a/d27/d4c/f6d 22921 1
2026-03-09T00:03:47.840 INFO:tasks.workunit.client.1.vm06.stdout:5/607: mknod d5/d1c/ccd 0
2026-03-09T00:03:47.840 INFO:tasks.workunit.client.1.vm06.stdout:5/608: fdatasync d5/d1c/d21/d28/d5e/d66/f8a 0
2026-03-09T00:03:47.841 INFO:tasks.workunit.client.1.vm06.stdout:7/493: dread d0/df/d1a/d3a/f5d [0,4194304] 0
2026-03-09T00:03:47.841 INFO:tasks.workunit.client.1.vm06.stdout:7/494: rmdir d0/df/d1a/d3f/d53 39
2026-03-09T00:03:47.841 INFO:tasks.workunit.client.1.vm06.stdout:7/495: dread - d0/df/d1a/d3f/f7d zero size
2026-03-09T00:03:47.858 INFO:tasks.workunit.client.0.vm03.stdout:9/275: dread fb [0,4194304] 0
2026-03-09T00:03:47.859 INFO:tasks.workunit.client.1.vm06.stdout:1/448: read d6/d21/d2d/f74 [529659,75408] 0
2026-03-09T00:03:47.875 INFO:tasks.workunit.client.0.vm03.stdout:9/276: read d15/d1c/d28/f55 [2408805,109702] 0
2026-03-09T00:03:47.878 INFO:tasks.workunit.client.0.vm03.stdout:9/277: symlink d15/d1c/l58 0
2026-03-09T00:03:47.878 INFO:tasks.workunit.client.0.vm03.stdout:9/278: dread d15/d1c/d28/f2f [0,4194304] 0
2026-03-09T00:03:47.882 INFO:tasks.workunit.client.0.vm03.stdout:9/279: symlink d15/l59 0
2026-03-09T00:03:47.883 INFO:tasks.workunit.client.0.vm03.stdout:9/280: dread d15/d1c/d28/d30/f3d [0,4194304] 0
2026-03-09T00:03:47.884 INFO:tasks.workunit.client.0.vm03.stdout:9/281: rename d15/l47 to d15/d1c/d21/d54/l5a 0
2026-03-09T00:03:47.886 INFO:tasks.workunit.client.0.vm03.stdout:9/282: rmdir d15/d1c/d36/d4d 39
2026-03-09T00:03:47.886 INFO:tasks.workunit.client.0.vm03.stdout:9/283: creat d15/d1c/d28/f5b x:0 0 0
2026-03-09T00:03:47.889 INFO:tasks.workunit.client.1.vm06.stdout:4/433: dwrite d17/d5b/f83 [0,4194304] 0
2026-03-09T00:03:47.889 INFO:tasks.workunit.client.1.vm06.stdout:4/434: stat d17/l59 0
2026-03-09T00:03:47.894 INFO:tasks.workunit.client.1.vm06.stdout:4/435: mkdir d17/d21/d32/d92 0
2026-03-09T00:03:47.895 INFO:tasks.workunit.client.1.vm06.stdout:4/436: rename d17/d21/c30 to d17/d24/c93 0
2026-03-09T00:03:47.896 INFO:tasks.workunit.client.1.vm06.stdout:4/437: rename d17/d24/d49/d5f/c71 to d17/d24/d3b/d75/c94 0
2026-03-09T00:03:47.896 INFO:tasks.workunit.client.1.vm06.stdout:4/438: stat d17/d24/l4f 0
2026-03-09T00:03:47.896 INFO:tasks.workunit.client.1.vm06.stdout:4/439: chown d17/d24/f2c 3650 1
2026-03-09T00:03:47.899 INFO:tasks.workunit.client.1.vm06.stdout:4/440: dread d17/d24/f3a [0,4194304] 0
2026-03-09T00:03:47.908 INFO:tasks.workunit.client.1.vm06.stdout:3/565: dwrite d11/d28/d2e/f65 [0,4194304] 0
2026-03-09T00:03:47.908 INFO:tasks.workunit.client.1.vm06.stdout:4/441: write d17/d5b/f83 [234501,87402] 0
2026-03-09T00:03:47.911 INFO:tasks.workunit.client.1.vm06.stdout:4/442: unlink fe 0
2026-03-09T00:03:47.911 INFO:tasks.workunit.client.1.vm06.stdout:4/443: write d17/d21/f38 [376493,77914] 0
2026-03-09T00:03:47.911 INFO:tasks.workunit.client.1.vm06.stdout:4/444: readlink d17/l28 0
2026-03-09T00:03:47.920 INFO:tasks.workunit.client.0.vm03.stdout:5/293: dwrite d1c/f29 [0,4194304] 0
2026-03-09T00:03:47.921 INFO:tasks.workunit.client.0.vm03.stdout:5/294: dread - d1c/d20/d56/f59 zero size
2026-03-09T00:03:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:47 vm06.local ceph-mon[58395]: pgmap v8: 65 pgs: 65 active+clean; 1.5 GiB data, 5.8 GiB used, 114 GiB / 120 GiB avail; 71 MiB/s rd, 92 MiB/s wr, 149 op/s
2026-03-09T00:03:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:47 vm06.local ceph-mon[58395]: from='client.24375 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:03:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:47 vm06.local ceph-mon[58395]: from='client.24379 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:03:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:47 vm06.local ceph-mon[58395]: from='client.? 192.168.123.103:0/3149584614' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:03:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:47 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:47 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:47 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T00:03:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:47 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:03:47.921 INFO:tasks.workunit.client.0.vm03.stdout:2/242: dwrite d8/d1b/d2a/f4c [0,4194304] 0
2026-03-09T00:03:47.929 INFO:tasks.workunit.client.0.vm03.stdout:2/243: dread d8/f9 [0,4194304] 0
2026-03-09T00:03:47.930 INFO:tasks.workunit.client.0.vm03.stdout:2/244: readlink d8/l19 0
2026-03-09T00:03:47.930 INFO:tasks.workunit.client.0.vm03.stdout:2/245: chown d8/d1b/d2a/d42/d4b/d50 923 1
2026-03-09T00:03:47.930 INFO:tasks.workunit.client.0.vm03.stdout:2/246: fsync d8/f11 0
2026-03-09T00:03:47.930 INFO:tasks.workunit.client.0.vm03.stdout:2/247: chown d8/d1b 1 1
2026-03-09T00:03:47.930 INFO:tasks.workunit.client.0.vm03.stdout:2/248: symlink d8/d1b/d2a/d2e/l51 0
2026-03-09T00:03:47.931 INFO:tasks.workunit.client.0.vm03.stdout:2/249: rename d8/d1b/d2a/c39 to d8/d1b/d24/c52 0
2026-03-09T00:03:47.933 INFO:tasks.workunit.client.0.vm03.stdout:2/250: unlink d8/d1b/d2a/d2e/f35 0
2026-03-09T00:03:47.999 INFO:tasks.workunit.client.0.vm03.stdout:1/353: dwrite d4/d15/d5c/f62 [0,4194304] 0
2026-03-09T00:03:48.021 INFO:tasks.workunit.client.0.vm03.stdout:6/260: dwrite d13/f4d [0,4194304] 0
2026-03-09T00:03:48.022 INFO:tasks.workunit.client.0.vm03.stdout:6/261: link d13/c41 d13/d1e/d44/c5a 0
2026-03-09T00:03:48.022 INFO:tasks.workunit.client.0.vm03.stdout:6/262: truncate d13/d1e/f34 531466 0
2026-03-09T00:03:48.023 INFO:tasks.workunit.client.0.vm03.stdout:6/263: creat d13/f5b x:0 0 0
2026-03-09T00:03:48.023 INFO:tasks.workunit.client.0.vm03.stdout:6/264: write d13/d1e/f48 [984656,3606] 0
2026-03-09T00:03:48.030 INFO:tasks.workunit.client.0.vm03.stdout:6/265: dread d13/d1e/d44/d4a/f58 [0,4194304] 0
2026-03-09T00:03:48.031 INFO:tasks.workunit.client.0.vm03.stdout:6/266: mknod d13/d1e/c5c 0
2026-03-09T00:03:48.045 INFO:tasks.workunit.client.1.vm06.stdout:9/399: dwrite d1/d3/d12/f28 [0,4194304] 0
2026-03-09T00:03:48.045 INFO:tasks.workunit.client.1.vm06.stdout:9/400: readlink d1/d3/d12/l30 0
2026-03-09T00:03:48.046 INFO:tasks.workunit.client.1.vm06.stdout:8/462: dwrite db/d1e/f82 [0,4194304] 0
2026-03-09T00:03:48.046 INFO:tasks.workunit.client.1.vm06.stdout:8/463: read - db/dd/d84/f8d zero size
2026-03-09T00:03:48.047 INFO:tasks.workunit.client.1.vm06.stdout:9/401: mknod d1/d4/d6e/d14/d25/c80 0
2026-03-09T00:03:48.054 INFO:tasks.workunit.client.1.vm06.stdout:9/402: dread d1/d4/d6e/d14/d25/f70 [0,4194304] 0
2026-03-09T00:03:48.055 INFO:tasks.workunit.client.0.vm03.stdout:1/354: write d4/d15/d5c/f62 [3026673,127807] 0
2026-03-09T00:03:48.056 INFO:tasks.workunit.client.0.vm03.stdout:4/337: dwrite d7/d20/d29/f2a [0,4194304] 0
2026-03-09T00:03:48.061 INFO:tasks.workunit.client.1.vm06.stdout:8/464: write db/dd/d24/f33 [3298151,103977] 0
2026-03-09T00:03:48.061 INFO:tasks.workunit.client.1.vm06.stdout:8/465: truncate db/dd/d84/f8d 987097 0
2026-03-09T00:03:48.063 INFO:tasks.workunit.client.1.vm06.stdout:6/520: dwrite d4/f22 [0,4194304] 0
2026-03-09T00:03:48.064 INFO:tasks.workunit.client.0.vm03.stdout:1/355: creat d4/d3a/d61/d78/f79 x:0 0 0
2026-03-09T00:03:48.066 INFO:tasks.workunit.client.1.vm06.stdout:8/466: dread db/d53/d70/d38/d4d/f65 [0,4194304] 0
2026-03-09T00:03:48.066 INFO:tasks.workunit.client.1.vm06.stdout:8/467: write db/d1e/f2e [2428605,68679] 0
2026-03-09T00:03:48.066 INFO:tasks.workunit.client.0.vm03.stdout:5/295: dwrite d1c/d20/f4e [0,4194304] 0
2026-03-09T00:03:48.067 INFO:tasks.workunit.client.1.vm06.stdout:4/445: fsync d17/d21/f38 0
2026-03-09T00:03:48.072 INFO:tasks.workunit.client.1.vm06.stdout:1/449: dwrite d6/d63/f6a [0,4194304] 0
2026-03-09T00:03:48.072 INFO:tasks.workunit.client.0.vm03.stdout:4/338: creat d7/d20/d29/d54/d58/f6b x:0 0 0
2026-03-09T00:03:48.072 INFO:tasks.workunit.client.0.vm03.stdout:4/339: creat d7/d20/d35/d66/f6c x:0 0 0
2026-03-09T00:03:48.073 INFO:tasks.workunit.client.0.vm03.stdout:2/251: dwrite f6 [0,4194304] 0
2026-03-09T00:03:48.073 INFO:tasks.workunit.client.0.vm03.stdout:8/262: dwrite d7/df/f37 [0,4194304] 0
2026-03-09T00:03:48.096 INFO:tasks.workunit.client.1.vm06.stdout:7/496: dwrite d0/df/d17/f7e [0,4194304] 0
2026-03-09T00:03:48.096 INFO:tasks.workunit.client.1.vm06.stdout:7/497: fsync d0/d39/f3e 0
2026-03-09T00:03:48.096 INFO:tasks.workunit.client.1.vm06.stdout:7/498: write d0/df/d1a/d35/d62/f87 [863342,12993] 0
2026-03-09T00:03:48.096 INFO:tasks.workunit.client.1.vm06.stdout:7/499: chown d0/df/d1a/d27/d4c/f6d 1788 1
2026-03-09T00:03:48.112 INFO:tasks.workunit.client.1.vm06.stdout:3/566: dwrite d11/d28/d2e/d2f/f92 [0,4194304] 0
2026-03-09T00:03:48.112 INFO:tasks.workunit.client.1.vm06.stdout:3/567: readlink d11/d28/d2e/d2f/d5b/l8c 0
2026-03-09T00:03:48.124 INFO:tasks.workunit.client.0.vm03.stdout:6/267: write d13/d1e/d44/d4a/f58 [4236006,88861] 0
2026-03-09T00:03:48.124 INFO:tasks.workunit.client.0.vm03.stdout:6/268: write d13/d1e/f28 [99710,54548] 0
2026-03-09T00:03:48.128 INFO:tasks.workunit.client.1.vm06.stdout:5/609: write d5/d1c/d21/d28/d5e/d66/d78/dc8/f60 [1915955,55637] 0
2026-03-09T00:03:48.128 INFO:tasks.workunit.client.1.vm06.stdout:5/610: write d5/f6b [367745,90761] 0
2026-03-09T00:03:48.129 INFO:tasks.workunit.client.1.vm06.stdout:9/403: symlink d1/d3/d4f/d52/l81 0
2026-03-09T00:03:48.131 INFO:tasks.workunit.client.0.vm03.stdout:2/252: mknod d8/d26/c53 0
2026-03-09T00:03:48.131 INFO:tasks.workunit.client.0.vm03.stdout:2/253: write d8/d17/f45 [948672,1151] 0
2026-03-09T00:03:48.131 INFO:tasks.workunit.client.0.vm03.stdout:2/254: write f7 [458157,116241] 0
2026-03-09T00:03:48.132 INFO:tasks.workunit.client.1.vm06.stdout:8/468: mkdir db/d1e/d46/d94 0
2026-03-09T00:03:48.132 INFO:tasks.workunit.client.1.vm06.stdout:8/469: stat db/f2d 0
2026-03-09T00:03:48.132 INFO:tasks.workunit.client.1.vm06.stdout:8/470: readlink db/d53/d70/l41 0
2026-03-09T00:03:48.140 INFO:tasks.workunit.client.1.vm06.stdout:4/446: dwrite d17/d21/d4c/d50/f8c [0,4194304] 0
2026-03-09T00:03:48.142 INFO:tasks.workunit.client.0.vm03.stdout:1/356: getdents d4/d6 0
2026-03-09T00:03:48.144 INFO:tasks.workunit.client.1.vm06.stdout:1/450: mknod d6/d4c/d51/c95 0
2026-03-09T00:03:48.145 INFO:tasks.workunit.client.0.vm03.stdout:5/296: mkdir d1c/d67 0
2026-03-09T00:03:48.148 INFO:tasks.workunit.client.0.vm03.stdout:2/255: creat d8/d1b/d2a/d42/d4b/d50/f54 x:0 0 0
2026-03-09T00:03:48.154 INFO:tasks.workunit.client.0.vm03.stdout:2/256: write d8/d1b/d2a/f33 [270944,67646] 0
2026-03-09T00:03:48.154 INFO:tasks.workunit.client.0.vm03.stdout:5/297: creat d1c/d51/f68 x:0 0 0
2026-03-09T00:03:48.154 INFO:tasks.workunit.client.0.vm03.stdout:5/298: write d1c/f4c [1624506,43367] 0
2026-03-09T00:03:48.154 INFO:tasks.workunit.client.0.vm03.stdout:5/299: link d1c/d20/d56/f59 d1c/d20/d55/d4f/f69 0
2026-03-09T00:03:48.154 INFO:tasks.workunit.client.0.vm03.stdout:5/300: chown d1c/d20/d56/f59 627554 1
2026-03-09T00:03:48.154 INFO:tasks.workunit.client.1.vm06.stdout:7/500: symlink d0/df/d1a/d27/d70/l8b 0
2026-03-09T00:03:48.154 INFO:tasks.workunit.client.1.vm06.stdout:7/501: chown d0/df/d1a/d27/d4c/d40/d51/l76 6 1
2026-03-09T00:03:48.154 INFO:tasks.workunit.client.1.vm06.stdout:7/502: write d0/df/d1a/d3a/f3c [4297267,72172] 0
2026-03-09T00:03:48.154 INFO:tasks.workunit.client.1.vm06.stdout:5/611: creat d5/d44/d4b/d92/d49/da0/fce x:0 0 0
2026-03-09T00:03:48.160 INFO:tasks.workunit.client.0.vm03.stdout:5/301: dread d1c/d20/d55/f52 [0,4194304] 0
2026-03-09T00:03:48.160 INFO:tasks.workunit.client.0.vm03.stdout:5/302: chown f14 6 1
2026-03-09T00:03:48.160 INFO:tasks.workunit.client.0.vm03.stdout:5/303: readlink d1c/d20/l41 0
2026-03-09T00:03:48.160 INFO:tasks.workunit.client.0.vm03.stdout:5/304: write f12 [5215621,72283] 0
2026-03-09T00:03:48.169 INFO:tasks.workunit.client.0.vm03.stdout:2/257: write d8/d1b/f32 [2899428,92476] 0
2026-03-09T00:03:48.170 INFO:tasks.workunit.client.0.vm03.stdout:2/258: readlink d8/le 0
2026-03-09T00:03:48.171 INFO:tasks.workunit.client.0.vm03.stdout:3/216: dwrite d2/f30 [0,4194304] 0
2026-03-09T00:03:48.176 INFO:tasks.workunit.client.1.vm06.stdout:6/521: dwrite d4/d16/f5e [0,4194304] 0
2026-03-09T00:03:48.176 INFO:tasks.workunit.client.1.vm06.stdout:6/522: chown d4/d27/l3a 36538472 1
2026-03-09T00:03:48.177 INFO:tasks.workunit.client.0.vm03.stdout:6/269: dread d13/d1e/f48 [0,4194304] 0
2026-03-09T00:03:48.177 INFO:tasks.workunit.client.0.vm03.stdout:6/270: creat d13/f5d x:0 0 0
2026-03-09T00:03:48.191 INFO:tasks.workunit.client.1.vm06.stdout:3/568: dwrite d11/d28/d2e/f38 [0,4194304] 0
2026-03-09T00:03:48.191 INFO:tasks.workunit.client.1.vm06.stdout:3/569: chown d11/d28/f3a 17516 1
2026-03-09T00:03:48.195 INFO:tasks.workunit.client.1.vm06.stdout:8/471: creat db/d53/d7c/f95 x:0 0 0
2026-03-09T00:03:48.195 INFO:tasks.workunit.client.1.vm06.stdout:9/404: creat d1/d4/d6e/d9/f82 x:0 0 0
2026-03-09T00:03:48.195 INFO:tasks.workunit.client.1.vm06.stdout:8/472: dread - db/d53/d7c/f95 zero size
2026-03-09T00:03:48.196 INFO:tasks.workunit.client.1.vm06.stdout:4/447: rename d17/c51 to d17/d5b/d8f/c95 0
2026-03-09T00:03:48.196 INFO:tasks.workunit.client.1.vm06.stdout:4/448: truncate d17/d24/d49/d5f/f76 815757 0
2026-03-09T00:03:48.196 INFO:tasks.workunit.client.1.vm06.stdout:4/449: write d17/d21/d4c/f90 [85469,38126] 0
2026-03-09T00:03:48.205 INFO:tasks.workunit.client.1.vm06.stdout:1/451: mknod d6/d63/c96 0
2026-03-09T00:03:48.209 INFO:tasks.workunit.client.0.vm03.stdout:1/357: dread d4/d15/d5c/f62 [0,4194304] 0
2026-03-09T00:03:48.214 INFO:tasks.workunit.client.1.vm06.stdout:7/503: creat d0/df/d1a/d27/d4c/f8c x:0 0 0
2026-03-09T00:03:48.227 INFO:tasks.workunit.client.0.vm03.stdout:4/340: fsync d7/d20/d35/d66/f6c 0
2026-03-09T00:03:48.227 INFO:tasks.workunit.client.0.vm03.stdout:4/341: write d7/d20/f21 [277188,37968] 0
2026-03-09T00:03:48.228 INFO:tasks.workunit.client.0.vm03.stdout:0/269: sync
2026-03-09T00:03:48.231 INFO:tasks.workunit.client.0.vm03.stdout:4/342: write d7/d27/f2c [2253919,98076] 0
2026-03-09T00:03:48.231 INFO:tasks.workunit.client.0.vm03.stdout:4/343: fdatasync d7/f22 0
2026-03-09T00:03:48.231 INFO:tasks.workunit.client.0.vm03.stdout:4/344: write d7/d27/f2c [3526820,72282] 0
2026-03-09T00:03:48.231 INFO:tasks.workunit.client.0.vm03.stdout:4/345: chown d7/d20/d29/d4e/f60 639 1
2026-03-09T00:03:48.231 INFO:tasks.workunit.client.0.vm03.stdout:4/346: chown d7/d23 241 1
2026-03-09T00:03:48.235 INFO:tasks.workunit.client.0.vm03.stdout:3/217: mkdir d2/db/d3b/d3f 0
2026-03-09T00:03:48.236 INFO:tasks.workunit.client.0.vm03.stdout:1/358: rename d4/d15/f3c to d4/d15/d77/f7a 0
2026-03-09T00:03:48.238 INFO:tasks.workunit.client.0.vm03.stdout:3/218: mkdir d2/db/d40 0
2026-03-09T00:03:48.241 INFO:tasks.workunit.client.1.vm06.stdout:2/573: sync
2026-03-09T00:03:48.243 INFO:tasks.workunit.client.1.vm06.stdout:8/473: link db/dd/d48/f4a db/d53/d70/d38/d4d/d79/f96 0
2026-03-09T00:03:48.243 INFO:tasks.workunit.client.1.vm06.stdout:9/405: mknod d1/d3/c83 0
2026-03-09T00:03:48.243 INFO:tasks.workunit.client.1.vm06.stdout:6/523: mknod d4/d27/d42/d7e/ca2 0
2026-03-09T00:03:48.243 INFO:tasks.workunit.client.0.vm03.stdout:1/359: unlink d4/d15/d1a/l3e 0
2026-03-09T00:03:48.243 INFO:tasks.workunit.client.0.vm03.stdout:1/360: symlink d4/d3a/d61/l7b 0
2026-03-09T00:03:48.247 INFO:tasks.workunit.client.1.vm06.stdout:2/574: link d7/d1b/f5c d7/d1a/d25/d66/fa6 0
2026-03-09T00:03:48.249 INFO:tasks.workunit.client.1.vm06.stdout:1/452: rename d6/d21/d2d/l39 to d6/d4c/l97 0
2026-03-09T00:03:48.252 INFO:tasks.workunit.client.0.vm03.stdout:5/305: dwrite d1c/d20/f25 [0,4194304] 0
2026-03-09T00:03:48.261 INFO:tasks.workunit.client.1.vm06.stdout:6/524: creat d4/d27/d42/fa3 x:0 0 0
2026-03-09T00:03:48.261 INFO:tasks.workunit.client.1.vm06.stdout:6/525: fsync d4/d27/d42/d4b/f95 0
2026-03-09T00:03:48.263 INFO:tasks.workunit.client.1.vm06.stdout:8/474: link db/dd/d84/f8d db/dd/f97 0
2026-03-09T00:03:48.276 INFO:tasks.workunit.client.0.vm03.stdout:5/306: mkdir d1c/d51/d6a 0
2026-03-09T00:03:48.276 INFO:tasks.workunit.client.0.vm03.stdout:5/307: fsync f15 0
2026-03-09T00:03:48.279 INFO:tasks.workunit.client.1.vm06.stdout:1/453: unlink d6/d21/d2d/l73 0
2026-03-09T00:03:48.285 INFO:tasks.workunit.client.1.vm06.stdout:6/526: mkdir d4/d27/d3e/da4 0
2026-03-09T00:03:48.286 INFO:tasks.workunit.client.0.vm03.stdout:2/259: dwrite d8/d1b/f1f [0,4194304] 0
2026-03-09T00:03:48.286 INFO:tasks.workunit.client.0.vm03.stdout:2/260: stat d8/d1b/d2a/d42/d4b/d4f 0
2026-03-09T00:03:48.286 INFO:tasks.workunit.client.0.vm03.stdout:2/261: chown d8/f15 21464928 1
2026-03-09T00:03:48.288 INFO:tasks.workunit.client.0.vm03.stdout:5/308: unlink la 0
2026-03-09T00:03:48.288 INFO:tasks.workunit.client.0.vm03.stdout:5/309: stat d1c/d20/d55/c2b 0
2026-03-09T00:03:48.289 INFO:tasks.workunit.client.1.vm06.stdout:8/475: mkdir db/d74/d78/d98 0
2026-03-09T00:03:48.291 INFO:tasks.workunit.client.1.vm06.stdout:6/527: creat d4/d27/d42/d52/d7d/fa5 x:0 0 0
2026-03-09T00:03:48.292 INFO:tasks.workunit.client.0.vm03.stdout:2/262: mknod d8/d1b/d2a/d42/d4b/d4f/c55 0
2026-03-09T00:03:48.292 INFO:tasks.workunit.client.0.vm03.stdout:2/263: stat d8/f9 0
2026-03-09T00:03:48.299 INFO:tasks.workunit.client.1.vm06.stdout:8/476: creat db/d53/d70/d38/f99 x:0 0 0
2026-03-09T00:03:48.318 INFO:tasks.workunit.client.1.vm06.stdout:6/528: mkdir d4/d27/d42/da6 0
2026-03-09T00:03:48.318 INFO:tasks.workunit.client.1.vm06.stdout:6/529: chown d4/d27/d3e/d78/d97 355 1
2026-03-09T00:03:48.318 INFO:tasks.workunit.client.1.vm06.stdout:6/530: creat d4/d27/d3e/d57/fa7 x:0 0 0
2026-03-09T00:03:48.318 INFO:tasks.workunit.client.1.vm06.stdout:6/531: getdents d4/d27/d3e/da4 0
2026-03-09T00:03:48.326 INFO:tasks.workunit.client.1.vm06.stdout:8/477: dread db/d1e/f51 [0,4194304] 0
2026-03-09T00:03:48.326 INFO:tasks.workunit.client.1.vm06.stdout:8/478: creat db/d53/d6d/d7b/f9a x:0 0 0
2026-03-09T00:03:48.327 INFO:tasks.workunit.client.0.vm03.stdout:3/219: write d2/db/f24 [49539,121520] 0
2026-03-09T00:03:48.328 INFO:tasks.workunit.client.0.vm03.stdout:3/220: chown d2/db/d3b/d3f 250945488 1
2026-03-09T00:03:48.330 INFO:tasks.workunit.client.1.vm06.stdout:8/479: dread db/dd/f1c [0,4194304] 0
2026-03-09T00:03:48.331 INFO:tasks.workunit.client.0.vm03.stdout:4/347: dwrite d7/d20/d29/f53 [0,4194304] 0
2026-03-09T00:03:48.342 INFO:tasks.workunit.client.0.vm03.stdout:3/221: rmdir d2/db 39
2026-03-09T00:03:48.345 INFO:tasks.workunit.client.1.vm06.stdout:8/480: mkdir db/d1e/d9b 0
2026-03-09T00:03:48.345 INFO:tasks.workunit.client.1.vm06.stdout:8/481: chown db/d53/d70/d38/d47/c49 1 1
2026-03-09T00:03:48.345 INFO:tasks.workunit.client.1.vm06.stdout:8/482: write db/dd/f67 [115767,52637] 0
2026-03-09T00:03:48.348 INFO:tasks.workunit.client.0.vm03.stdout:3/222: dread d2/db/f28 [0,4194304] 0
2026-03-09T00:03:48.348 INFO:tasks.workunit.client.0.vm03.stdout:3/223: write d2/db/f1a [1331172,36300] 0
2026-03-09T00:03:48.349 INFO:tasks.workunit.client.0.vm03.stdout:4/348: symlink d7/l6d 0
2026-03-09T00:03:48.350 INFO:tasks.workunit.client.1.vm06.stdout:8/483: mkdir db/d74/d78/d98/d9c 0
2026-03-09T00:03:48.350 INFO:tasks.workunit.client.1.vm06.stdout:8/484: dread - db/dd/d24/d80/f8b zero size
2026-03-09T00:03:48.350 INFO:tasks.workunit.client.1.vm06.stdout:8/485: read db/d74/d78/f93 [1513939,84961] 0
2026-03-09T00:03:48.350 INFO:tasks.workunit.client.1.vm06.stdout:8/486: getdents db/dd/d85 0
2026-03-09T00:03:48.353 INFO:tasks.workunit.client.0.vm03.stdout:3/224: write d2/fc [757738,80448] 0
2026-03-09T00:03:48.353 INFO:tasks.workunit.client.0.vm03.stdout:3/225: stat d2/db/f27 0
2026-03-09T00:03:48.359 INFO:tasks.workunit.client.1.vm06.stdout:3/570: dwrite d11/f48 [0,4194304] 0
2026-03-09T00:03:48.374 INFO:tasks.workunit.client.1.vm06.stdout:0/493: sync
2026-03-09T00:03:48.374 INFO:tasks.workunit.client.1.vm06.stdout:0/494: chown d3/d18/f59 0 1
2026-03-09T00:03:48.374 INFO:tasks.workunit.client.1.vm06.stdout:0/495: chown d3/d18/d2c/d2d/l99 31260 1
2026-03-09T00:03:48.375 INFO:tasks.workunit.client.1.vm06.stdout:0/496: creat d3/d18/d1f/d39/d3b/fa4 x:0 0 0
2026-03-09T00:03:48.375 INFO:tasks.workunit.client.1.vm06.stdout:0/497: chown d3/d18/d1f/d44/f58 82019946 1
2026-03-09T00:03:48.376 INFO:tasks.workunit.client.1.vm06.stdout:0/498: rename d3/ce to d3/d18/d79/ca5 0
2026-03-09T00:03:48.376 INFO:tasks.workunit.client.1.vm06.stdout:0/499: truncate d3/d18/d1f/d44/d6a/f96 683928 0
2026-03-09T00:03:48.426 INFO:tasks.workunit.client.1.vm06.stdout:6/532: dread d4/d27/d3e/d57/f65 [0,4194304] 0
2026-03-09T00:03:48.427 INFO:tasks.workunit.client.0.vm03.stdout:1/361: dwrite d4/d3a/d3d/d46/f4c [0,4194304] 0
2026-03-09T00:03:48.428 INFO:tasks.workunit.client.1.vm06.stdout:7/504: dwrite d0/df/d1a/d22/f2c [4194304,4194304] 0
2026-03-09T00:03:48.430 INFO:tasks.workunit.client.1.vm06.stdout:6/533: mknod d4/d27/d3e/d78/d97/ca8 0
2026-03-09T00:03:48.433 INFO:tasks.workunit.client.0.vm03.stdout:1/362: link d4/d3a/f48 d4/d15/d77/f7c 0
2026-03-09T00:03:48.433 INFO:tasks.workunit.client.0.vm03.stdout:1/363: fdatasync f0 0
2026-03-09T00:03:48.443 INFO:tasks.workunit.client.0.vm03.stdout:1/364: dread d4/d15/d1a/f1d [0,4194304] 0
2026-03-09T00:03:48.443 INFO:tasks.workunit.client.0.vm03.stdout:1/365: creat d4/f7d x:0 0 0
2026-03-09T00:03:48.446 INFO:tasks.workunit.client.1.vm06.stdout:9/406: dwrite d1/d4/d6e/f5d [0,4194304] 0
2026-03-09T00:03:48.461 INFO:tasks.workunit.client.0.vm03.stdout:0/270: dwrite d2/da/f2d [0,4194304] 0
2026-03-09T00:03:48.463 INFO:tasks.workunit.client.0.vm03.stdout:0/271: creat d2/da/d36/f58 x:0 0 0
2026-03-09T00:03:48.492 INFO:tasks.workunit.client.1.vm06.stdout:0/500: dwrite d3/d18/d2c/d2d/d74/f77 [0,4194304] 0
2026-03-09T00:03:48.504 INFO:tasks.workunit.client.1.vm06.stdout:4/450: dwrite d17/d24/d3b/d5e/f6f [0,4194304] 0
2026-03-09T00:03:48.504 INFO:tasks.workunit.client.1.vm06.stdout:4/451: truncate d17/d21/d4c/f87 459308 0
2026-03-09T00:03:48.504 INFO:tasks.workunit.client.1.vm06.stdout:4/452: creat d17/d21/d32/f96 x:0 0 0
2026-03-09T00:03:48.508 INFO:tasks.workunit.client.0.vm03.stdout:1/366: write d4/f1e [944644,36492] 0
2026-03-09T00:03:48.509 INFO:tasks.workunit.client.1.vm06.stdout:4/453: mkdir d17/d24/d3b/d97 0
2026-03-09T00:03:48.511 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:48 vm03.local ceph-mon[52346]: from='client.14606 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:03:48.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:48 vm03.local ceph-mon[52346]: from='client.24391 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:03:48.535 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:48 vm03.local ceph-mon[52346]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T00:03:48.536 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:48 vm03.local ceph-mon[52346]: Upgrade: Need to upgrade myself (mgr.vm06.rzcvhn)
2026-03-09T00:03:48.536 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:48 vm03.local ceph-mon[52346]: Upgrade: Need to upgrade myself (mgr.vm06.rzcvhn)
2026-03-09T00:03:48.536 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:48 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:48.536 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:48 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.yvcons", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T00:03:48.536 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:48 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T00:03:48.536 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:48 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:03:48.543 INFO:tasks.workunit.client.1.vm06.stdout:4/454: dread d17/d24/d3b/d5e/f6d [4194304,4194304] 0
2026-03-09T00:03:48.545 INFO:tasks.workunit.client.1.vm06.stdout:4/455: symlink d17/d24/d3b/l98 0
2026-03-09T00:03:48.545 INFO:tasks.workunit.client.1.vm06.stdout:4/456: truncate d17/d5b/f77 4736832 0
2026-03-09T00:03:48.545 INFO:tasks.workunit.client.1.vm06.stdout:4/457: chown d17/d21/d4c/f87 1 1
2026-03-09T00:03:48.548 INFO:tasks.workunit.client.1.vm06.stdout:4/458: link d17/d24/d3b/d5e/c8e d17/d24/d3b/d5e/c99 0
2026-03-09T00:03:48.572 INFO:tasks.workunit.client.1.vm06.stdout:4/459: chown d17/d21/d32/c42 18274983 1
2026-03-09T00:03:48.580 INFO:tasks.workunit.client.1.vm06.stdout:2/575: dwrite d7/d1b/f5c [4194304,4194304] 0
2026-03-09T00:03:48.580 INFO:tasks.workunit.client.0.vm03.stdout:5/310: dwrite d1c/d20/d55/f46 [0,4194304] 0
2026-03-09T00:03:48.581 INFO:tasks.workunit.client.0.vm03.stdout:5/311: read d1c/d20/d55/f52 [3075183,79695] 0
2026-03-09T00:03:48.584 INFO:tasks.workunit.client.0.vm03.stdout:5/312: mkdir d1c/d20/d55/d66/d6b 0
2026-03-09T00:03:48.586 INFO:tasks.workunit.client.0.vm03.stdout:5/313: rmdir d1c/d20/d55/d4f/d58 39
2026-03-09T00:03:48.586 INFO:tasks.workunit.client.0.vm03.stdout:5/314: chown d1c/d20/c28 180001 1
2026-03-09T00:03:48.586 INFO:tasks.workunit.client.0.vm03.stdout:5/315: dread - d1c/d20/d55/f5a zero size
2026-03-09T00:03:48.588 INFO:tasks.workunit.client.1.vm06.stdout:2/576: symlink d7/da/db/la7 0
2026-03-09T00:03:48.590 INFO:tasks.workunit.client.1.vm06.stdout:2/577: mkdir d7/d1a/d25/d66/d87/da8 0
2026-03-09T00:03:48.616 INFO:tasks.workunit.client.1.vm06.stdout:7/505: dwrite d0/df/d1a/f44 [4194304,4194304] 0
2026-03-09T00:03:48.617 INFO:tasks.workunit.client.0.vm03.stdout:1/367: dwrite d4/d3a/d32/f4b [0,4194304] 0
2026-03-09T00:03:48.633 INFO:tasks.workunit.client.0.vm03.stdout:7/298: sync
2026-03-09T00:03:48.637 INFO:tasks.workunit.client.0.vm03.stdout:7/299: getdents d2/d1f/d3a 0
2026-03-09T00:03:48.637 INFO:tasks.workunit.client.0.vm03.stdout:7/300: write d2/d1f/d3a/d31/f44 [2109986,1394] 0
2026-03-09T00:03:48.637 INFO:tasks.workunit.client.0.vm03.stdout:7/301: readlink d2/d1f/d42/d43/l45 0
2026-03-09T00:03:48.639 INFO:tasks.workunit.client.0.vm03.stdout:7/302: creat d2/d1f/d42/d43/f5f x:0 0 0 2026-03-09T00:03:48.641 INFO:tasks.workunit.client.1.vm06.stdout:7/506: dread d0/f4f [0,4194304] 0 2026-03-09T00:03:48.641 INFO:tasks.workunit.client.1.vm06.stdout:7/507: write d0/df/d1a/d3a/d4e/d5e/f6f [488132,112106] 0 2026-03-09T00:03:48.641 INFO:tasks.workunit.client.1.vm06.stdout:7/508: fsync d0/df/d1a/d22/f2c 0 2026-03-09T00:03:48.642 INFO:tasks.workunit.client.0.vm03.stdout:7/303: unlink d2/d1f/d3a/d31/d37/d39/f4b 0 2026-03-09T00:03:48.645 INFO:tasks.workunit.client.1.vm06.stdout:7/509: symlink d0/df/d7b/l8d 0 2026-03-09T00:03:48.651 INFO:tasks.workunit.client.1.vm06.stdout:7/510: symlink d0/df/d1a/d27/d70/l8e 0 2026-03-09T00:03:48.651 INFO:tasks.workunit.client.1.vm06.stdout:7/511: fsync d0/df/d1a/d27/f60 0 2026-03-09T00:03:48.651 INFO:tasks.workunit.client.1.vm06.stdout:7/512: chown d0/df/d1a/d27/d4c/c42 58102 1 2026-03-09T00:03:48.651 INFO:tasks.workunit.client.1.vm06.stdout:7/513: stat d0/df/d1a/d3f 0 2026-03-09T00:03:48.656 INFO:tasks.workunit.client.1.vm06.stdout:7/514: mknod d0/df/d1a/d27/d4c/d40/c8f 0 2026-03-09T00:03:48.668 INFO:tasks.workunit.client.0.vm03.stdout:2/264: rename d8/d1b/d2a/d42/d4b/d4f to d8/d1b/d2a/d56 0 2026-03-09T00:03:48.668 INFO:tasks.workunit.client.0.vm03.stdout:2/265: fdatasync d8/f9 0 2026-03-09T00:03:48.668 INFO:tasks.workunit.client.0.vm03.stdout:4/349: dwrite d7/f22 [0,4194304] 0 2026-03-09T00:03:48.668 INFO:tasks.workunit.client.0.vm03.stdout:3/226: dwrite d2/f16 [0,4194304] 0 2026-03-09T00:03:48.668 INFO:tasks.workunit.client.1.vm06.stdout:9/407: dwrite d1/d4/f6 [4194304,4194304] 0 2026-03-09T00:03:48.668 INFO:tasks.workunit.client.1.vm06.stdout:6/534: dwrite d4/d27/d3e/f7a [0,4194304] 0 2026-03-09T00:03:48.668 INFO:tasks.workunit.client.1.vm06.stdout:3/571: write d11/f1d [1250045,2417] 0 2026-03-09T00:03:48.673 INFO:tasks.workunit.client.1.vm06.stdout:7/515: write d0/f14 [6200919,112984] 0 2026-03-09T00:03:48.673 INFO:tasks.workunit.client.1.vm06.stdout:0/501: dwrite d3/d18/d1f/d44/d6a/f96 [0,4194304] 0 2026-03-09T00:03:48.678 INFO:tasks.workunit.client.0.vm03.stdout:5/316: write d1c/d20/d55/f46 [3692410,108907] 0 2026-03-09T00:03:48.686 INFO:tasks.workunit.client.1.vm06.stdout:9/408: creat d1/d4/d2f/f84 x:0 0 0 2026-03-09T00:03:48.691 INFO:tasks.workunit.client.0.vm03.stdout:1/368: dwrite d4/d6/f33 [0,4194304] 0 2026-03-09T00:03:48.693 INFO:tasks.workunit.client.1.vm06.stdout:4/460: dwrite d17/d5b/f64 [0,4194304] 0 2026-03-09T00:03:48.693 INFO:tasks.workunit.client.1.vm06.stdout:4/461: creat d17/d21/d4c/d66/f9a x:0 0 0 2026-03-09T00:03:48.693 INFO:tasks.workunit.client.1.vm06.stdout:4/462: truncate d17/d24/f5c 2551692 0 2026-03-09T00:03:48.702 INFO:tasks.workunit.client.1.vm06.stdout:4/463: write d17/d5b/f64 [1467764,115351] 0 2026-03-09T00:03:48.707 INFO:tasks.workunit.client.1.vm06.stdout:4/464: truncate d17/d21/d4c/d50/f69 294258 0 2026-03-09T00:03:48.708 INFO:tasks.workunit.client.0.vm03.stdout:3/227: symlink d2/db/d2d/l41 0 2026-03-09T00:03:48.708 INFO:tasks.workunit.client.1.vm06.stdout:3/572: symlink d11/d28/d4d/d9b/lc8 0 2026-03-09T00:03:48.714 INFO:tasks.workunit.client.0.vm03.stdout:0/272: dwrite d2/da/d36/d39/f3b [0,4194304] 0 2026-03-09T00:03:48.714 INFO:tasks.workunit.client.0.vm03.stdout:0/273: chown d2/da/dd/c51 70 1 2026-03-09T00:03:48.719 INFO:tasks.workunit.client.1.vm06.stdout:7/516: mkdir d0/df/d1a/d27/d4c/d40/d51/d90 0 2026-03-09T00:03:48.719 INFO:tasks.workunit.client.1.vm06.stdout:7/517: read - d0/df/d1a/d27/d4c/f8c 
zero size 2026-03-09T00:03:48.720 INFO:tasks.workunit.client.1.vm06.stdout:0/502: symlink d3/d18/d1f/d39/la6 0 2026-03-09T00:03:48.721 INFO:tasks.workunit.client.0.vm03.stdout:5/317: symlink d1c/d51/l6c 0 2026-03-09T00:03:48.733 INFO:tasks.workunit.client.0.vm03.stdout:1/369: symlink d4/d3a/d61/d78/l7e 0 2026-03-09T00:03:48.733 INFO:tasks.workunit.client.0.vm03.stdout:1/370: chown d4/d3a/d3d/f64 332 1 2026-03-09T00:03:48.733 INFO:tasks.workunit.client.0.vm03.stdout:1/371: write d4/d3a/d3d/f58 [378388,58381] 0 2026-03-09T00:03:48.733 INFO:tasks.workunit.client.0.vm03.stdout:1/372: creat d4/d15/f7f x:0 0 0 2026-03-09T00:03:48.733 INFO:tasks.workunit.client.0.vm03.stdout:1/373: write f2 [4381573,107903] 0 2026-03-09T00:03:48.738 INFO:tasks.workunit.client.0.vm03.stdout:3/228: mknod d2/db/d3b/d3f/c42 0 2026-03-09T00:03:48.744 INFO:tasks.workunit.client.1.vm06.stdout:3/573: truncate d11/f12 3332413 0 2026-03-09T00:03:48.750 INFO:tasks.workunit.client.0.vm03.stdout:0/274: link d2/f1e d2/f59 0 2026-03-09T00:03:48.751 INFO:tasks.workunit.client.1.vm06.stdout:7/518: symlink d0/df/d1a/d27/d4c/d40/d51/d86/l91 0 2026-03-09T00:03:48.751 INFO:tasks.workunit.client.0.vm03.stdout:5/318: mknod d1c/d20/d55/d4f/c6d 0 2026-03-09T00:03:48.753 INFO:tasks.workunit.client.0.vm03.stdout:3/229: creat d2/db/d2d/f43 x:0 0 0 2026-03-09T00:03:48.754 INFO:tasks.workunit.client.0.vm03.stdout:0/275: mkdir d2/d5a 0 2026-03-09T00:03:48.754 INFO:tasks.workunit.client.0.vm03.stdout:0/276: write d2/da/d36/d39/f41 [1021337,28213] 0 2026-03-09T00:03:48.757 INFO:tasks.workunit.client.1.vm06.stdout:7/519: rename d0/df/d1a/d27/d4c/d40/c4a to d0/df/d1a/d3a/d4e/d5e/c92 0 2026-03-09T00:03:48.757 INFO:tasks.workunit.client.1.vm06.stdout:7/520: getdents d0/d55/d85 0 2026-03-09T00:03:48.758 INFO:tasks.workunit.client.0.vm03.stdout:3/230: mkdir d2/db/d40/d44 0 2026-03-09T00:03:48.759 INFO:tasks.workunit.client.0.vm03.stdout:0/277: creat d2/da/d36/d39/d4b/d55/f5b x:0 0 0 2026-03-09T00:03:48.759 INFO:tasks.workunit.client.0.vm03.stdout:0/278: creat d2/da/d36/d39/f5c x:0 0 0 2026-03-09T00:03:48.760 INFO:tasks.workunit.client.1.vm06.stdout:7/521: getdents d0/df/d1a/d35 0 2026-03-09T00:03:48.760 INFO:tasks.workunit.client.1.vm06.stdout:7/522: chown d0/df 217594364 1 2026-03-09T00:03:48.760 INFO:tasks.workunit.client.1.vm06.stdout:7/523: stat d0/df/d1a/d27/d70/l8e 0 2026-03-09T00:03:48.760 INFO:tasks.workunit.client.1.vm06.stdout:7/524: chown d0/df/f13 183 1 2026-03-09T00:03:48.760 INFO:tasks.workunit.client.1.vm06.stdout:7/525: dread - d0/df/d1a/d27/d4c/f8c zero size 2026-03-09T00:03:48.760 INFO:tasks.workunit.client.0.vm03.stdout:7/304: dwrite d2/d1f/d3a/d31/d37/f56 [0,4194304] 0 2026-03-09T00:03:48.762 INFO:tasks.workunit.client.1.vm06.stdout:7/526: link d0/df/d1a/d27/f66 d0/df/d1a/d3a/d4e/d5e/f93 0 2026-03-09T00:03:48.780 INFO:tasks.workunit.client.0.vm03.stdout:7/305: getdents d2/d1f/d3a/d31 0 2026-03-09T00:03:48.781 INFO:tasks.workunit.client.1.vm06.stdout:1/454: sync 2026-03-09T00:03:48.781 INFO:tasks.workunit.client.1.vm06.stdout:5/612: sync 2026-03-09T00:03:48.781 INFO:tasks.workunit.client.1.vm06.stdout:5/613: dread - d5/d1c/d21/d28/d5e/fc4 zero size 2026-03-09T00:03:48.781 INFO:tasks.workunit.client.1.vm06.stdout:5/614: mkdir d5/d1c/d23/d34/d47/dcf 0 2026-03-09T00:03:48.781 INFO:tasks.workunit.client.1.vm06.stdout:5/615: readlink d5/d1c/l30 0 2026-03-09T00:03:48.781 INFO:tasks.workunit.client.0.vm03.stdout:3/231: dread d2/fc [0,4194304] 0 2026-03-09T00:03:48.781 INFO:tasks.workunit.client.0.vm03.stdout:3/232: dread - d2/db/f26 zero 
size 2026-03-09T00:03:48.781 INFO:tasks.workunit.client.0.vm03.stdout:3/233: stat d2/db/f24 0 2026-03-09T00:03:48.781 INFO:tasks.workunit.client.0.vm03.stdout:3/234: chown d2/db/d40 60 1 2026-03-09T00:03:48.783 INFO:tasks.workunit.client.0.vm03.stdout:3/235: unlink d2/l12 0 2026-03-09T00:03:48.809 INFO:tasks.workunit.client.1.vm06.stdout:6/535: dwrite d4/d27/d42/d4b/f83 [0,4194304] 0 2026-03-09T00:03:48.809 INFO:tasks.workunit.client.1.vm06.stdout:6/536: read - d4/d27/d42/fa3 zero size 2026-03-09T00:03:48.809 INFO:tasks.workunit.client.1.vm06.stdout:6/537: stat d4/d27/l4c 0 2026-03-09T00:03:48.813 INFO:tasks.workunit.client.1.vm06.stdout:5/616: dread d5/d1c/d21/f73 [0,4194304] 0 2026-03-09T00:03:48.813 INFO:tasks.workunit.client.1.vm06.stdout:5/617: fdatasync d5/d44/f81 0 2026-03-09T00:03:48.813 INFO:tasks.workunit.client.1.vm06.stdout:6/538: write d4/f22 [5079069,73221] 0 2026-03-09T00:03:48.813 INFO:tasks.workunit.client.1.vm06.stdout:6/539: creat d4/d27/d42/d4b/fa9 x:0 0 0 2026-03-09T00:03:48.816 INFO:tasks.workunit.client.1.vm06.stdout:5/618: symlink d5/d44/d4b/d92/d49/ld0 0 2026-03-09T00:03:48.819 INFO:tasks.workunit.client.0.vm03.stdout:0/279: write d2/f22 [1239181,91650] 0 2026-03-09T00:03:48.821 INFO:tasks.workunit.client.1.vm06.stdout:5/619: getdents d5/d1c 0 2026-03-09T00:03:48.821 INFO:tasks.workunit.client.1.vm06.stdout:5/620: creat d5/d1c/d23/fd1 x:0 0 0 2026-03-09T00:03:48.822 INFO:tasks.workunit.client.1.vm06.stdout:6/540: rmdir d4/d16/d53/d67 39 2026-03-09T00:03:48.828 INFO:tasks.workunit.client.0.vm03.stdout:0/280: dread d2/d1f/f43 [0,4194304] 0 2026-03-09T00:03:48.830 INFO:tasks.workunit.client.0.vm03.stdout:0/281: creat d2/da/d36/f5d x:0 0 0 2026-03-09T00:03:48.830 INFO:tasks.workunit.client.0.vm03.stdout:7/306: read d2/d4/fb [3840903,41741] 0 2026-03-09T00:03:48.831 INFO:tasks.workunit.client.1.vm06.stdout:9/409: dwrite d1/d3/f11 [0,4194304] 0 2026-03-09T00:03:48.835 INFO:tasks.workunit.client.0.vm03.stdout:0/282: symlink d2/da/d1a/l5e 0 2026-03-09T00:03:48.839 INFO:tasks.workunit.client.0.vm03.stdout:7/307: mkdir d2/d1f/d42/d46/d54/d60 0 2026-03-09T00:03:48.845 INFO:tasks.workunit.client.1.vm06.stdout:9/410: rename d1/d3/d12 to d1/d4/d6e/d14/d25/d85 0 2026-03-09T00:03:48.850 INFO:tasks.workunit.client.1.vm06.stdout:2/578: dwrite d7/d1b/d31/f90 [4194304,4194304] 0 2026-03-09T00:03:48.851 INFO:tasks.workunit.client.1.vm06.stdout:9/411: read d1/d3/f23 [3232049,36951] 0 2026-03-09T00:03:48.852 INFO:tasks.workunit.client.1.vm06.stdout:2/579: mknod d7/da/ca9 0 2026-03-09T00:03:48.854 INFO:tasks.workunit.client.1.vm06.stdout:9/412: creat d1/d3/d2b/d58/f86 x:0 0 0 2026-03-09T00:03:48.861 INFO:tasks.workunit.client.1.vm06.stdout:9/413: read d1/d3/d4f/d52/f5e [4555155,40971] 0 2026-03-09T00:03:48.862 INFO:tasks.workunit.client.1.vm06.stdout:4/465: dwrite d17/d24/d49/f65 [0,4194304] 0 2026-03-09T00:03:48.864 INFO:tasks.workunit.client.1.vm06.stdout:9/414: rmdir d1/d4/d6e 39 2026-03-09T00:03:48.864 INFO:tasks.workunit.client.1.vm06.stdout:9/415: creat d1/d4/d6e/d9/f87 x:0 0 0 2026-03-09T00:03:48.867 INFO:tasks.workunit.client.1.vm06.stdout:9/416: dread d1/d4/d6e/d14/d25/f32 [0,4194304] 0 2026-03-09T00:03:48.867 INFO:tasks.workunit.client.1.vm06.stdout:9/417: chown d1/d4/f6 10 1 2026-03-09T00:03:48.867 INFO:tasks.workunit.client.1.vm06.stdout:9/418: read - d1/f78 zero size 2026-03-09T00:03:48.869 INFO:tasks.workunit.client.1.vm06.stdout:2/580: write d7/da/d4e/d57/d9d/fa0 [2633415,4293] 0 2026-03-09T00:03:48.870 INFO:tasks.workunit.client.1.vm06.stdout:0/503: dwrite d3/f19 
[0,4194304] 0 2026-03-09T00:03:48.870 INFO:tasks.workunit.client.1.vm06.stdout:9/419: symlink d1/d4/d6e/d14/d25/d85/d49/l88 0 2026-03-09T00:03:48.872 INFO:tasks.workunit.client.1.vm06.stdout:2/581: truncate d7/d1b/f22 912465 0 2026-03-09T00:03:48.874 INFO:tasks.workunit.client.1.vm06.stdout:0/504: creat d3/d18/d2c/d2d/d74/fa7 x:0 0 0 2026-03-09T00:03:48.876 INFO:tasks.workunit.client.1.vm06.stdout:9/420: truncate d1/d4/d6e/f5d 3504693 0 2026-03-09T00:03:48.876 INFO:tasks.workunit.client.1.vm06.stdout:9/421: write d1/d4/d6e/d14/f1a [625023,62886] 0 2026-03-09T00:03:48.881 INFO:tasks.workunit.client.1.vm06.stdout:2/582: rename d7/d1a/d25/d66/d87/d8e to d7/d1b/da5/daa 0 2026-03-09T00:03:48.881 INFO:tasks.workunit.client.1.vm06.stdout:2/583: creat d7/d1b/d31/fab x:0 0 0 2026-03-09T00:03:48.895 INFO:tasks.workunit.client.1.vm06.stdout:2/584: link d7/l68 d7/d1a/d25/d97/lac 0 2026-03-09T00:03:48.895 INFO:tasks.workunit.client.1.vm06.stdout:2/585: fsync d7/da/f18 0 2026-03-09T00:03:48.898 INFO:tasks.workunit.client.0.vm03.stdout:1/374: dwrite d4/d3a/d43/f49 [0,4194304] 0 2026-03-09T00:03:48.902 INFO:tasks.workunit.client.1.vm06.stdout:3/574: dwrite d11/f48 [0,4194304] 0 2026-03-09T00:03:48.910 INFO:tasks.workunit.client.1.vm06.stdout:2/586: link f3 d7/d1b/da5/daa/fad 0 2026-03-09T00:03:48.919 INFO:tasks.workunit.client.1.vm06.stdout:3/575: mknod d11/d3f/d8d/cc9 0 2026-03-09T00:03:48.926 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:48 vm06.local ceph-mon[58395]: from='client.14606 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:03:48.926 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:48 vm06.local ceph-mon[58395]: from='client.24391 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:03:48.926 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:48 vm06.local ceph-mon[58395]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T00:03:48.926 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:48 vm06.local ceph-mon[58395]: Upgrade: Need to upgrade myself (mgr.vm06.rzcvhn) 2026-03-09T00:03:48.926 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:48 vm06.local ceph-mon[58395]: Upgrade: Need to upgrade myself (mgr.vm06.rzcvhn) 2026-03-09T00:03:48.926 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:48 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' 2026-03-09T00:03:48.926 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:48 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.yvcons", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T00:03:48.926 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:48 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T00:03:48.926 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:48 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:03:48.926 INFO:tasks.workunit.client.1.vm06.stdout:2/587: rename f3 to d7/d1a/d25/fae 0 2026-03-09T00:03:48.926 INFO:tasks.workunit.client.0.vm03.stdout:9/284: sync 2026-03-09T00:03:48.927 INFO:tasks.workunit.client.0.vm03.stdout:9/285: read d15/d1c/d21/f34 [1036054,50906] 0 2026-03-09T00:03:48.928 INFO:tasks.workunit.client.0.vm03.stdout:2/266: dwrite d8/d1b/d2a/d42/f48 [0,4194304] 0 2026-03-09T00:03:48.930 INFO:tasks.workunit.client.0.vm03.stdout:0/283: dwrite d2/da/d36/d39/f48 [0,4194304] 0 2026-03-09T00:03:48.930 INFO:tasks.workunit.client.0.vm03.stdout:0/284: write d2/da/f4f [989254,130621] 0 2026-03-09T00:03:48.931 INFO:tasks.workunit.client.1.vm06.stdout:3/576: rename d11/d28/d2e/d7e/f73 to d11/d28/d2e/d2f/d36/d8f/fca 0 2026-03-09T00:03:48.931 INFO:tasks.workunit.client.1.vm06.stdout:2/588: getdents d7/da/d4e/d57/d9d 0 2026-03-09T00:03:48.932 INFO:tasks.workunit.client.0.vm03.stdout:9/286: creat d15/d1c/d36/f5c x:0 0 0 2026-03-09T00:03:48.935 INFO:tasks.workunit.client.1.vm06.stdout:2/589: getdents d7/d1a/d25/d66 0 2026-03-09T00:03:48.935 INFO:tasks.workunit.client.1.vm06.stdout:3/577: rename d11/d28/d4d/d89/d90/lbb to d11/d28/d2e/d2f/d36/d8f/lcb 0 2026-03-09T00:03:48.935 INFO:tasks.workunit.client.1.vm06.stdout:3/578: write f7 [196438,89098] 0 2026-03-09T00:03:48.935 INFO:tasks.workunit.client.0.vm03.stdout:9/287: dread d15/d1c/d28/d30/f3d [0,4194304] 0 2026-03-09T00:03:48.935 INFO:tasks.workunit.client.0.vm03.stdout:2/267: creat d8/d1b/d2a/d56/f57 x:0 0 0 2026-03-09T00:03:48.939 INFO:tasks.workunit.client.0.vm03.stdout:2/268: dread d8/d1b/d24/f2f [0,4194304] 0 2026-03-09T00:03:48.947 INFO:tasks.workunit.client.0.vm03.stdout:0/285: symlink d2/da/d36/d39/l5f 0 2026-03-09T00:03:48.947 INFO:tasks.workunit.client.0.vm03.stdout:2/269: symlink d8/d1b/d2a/d56/l58 0 2026-03-09T00:03:48.947 INFO:tasks.workunit.client.0.vm03.stdout:9/288: getdents d15/d1c/d28/d30 0 2026-03-09T00:03:48.947 INFO:tasks.workunit.client.0.vm03.stdout:9/289: read - d15/d1c/d21/f41 zero size 2026-03-09T00:03:48.948 INFO:tasks.workunit.client.0.vm03.stdout:0/286: unlink d2/da/c34 0 2026-03-09T00:03:48.948 INFO:tasks.workunit.client.0.vm03.stdout:0/287: chown 
d2/da/d36/d39/l40 16374331 1 2026-03-09T00:03:48.952 INFO:tasks.workunit.client.1.vm06.stdout:3/579: symlink d11/d28/d2e/d2f/d5b/d5f/lcc 0 2026-03-09T00:03:48.967 INFO:tasks.workunit.client.0.vm03.stdout:9/290: dread d15/d1c/d21/f25 [0,4194304] 0 2026-03-09T00:03:48.968 INFO:tasks.workunit.client.0.vm03.stdout:9/291: getdents d15/d1c/d28 0 2026-03-09T00:03:48.968 INFO:tasks.workunit.client.0.vm03.stdout:9/292: read fb [369516,73599] 0 2026-03-09T00:03:48.969 INFO:tasks.workunit.client.0.vm03.stdout:9/293: creat d15/d1c/d36/d4d/f5d x:0 0 0 2026-03-09T00:03:48.969 INFO:tasks.workunit.client.0.vm03.stdout:9/294: chown d15/d1c/d28/f5b 1739732954 1 2026-03-09T00:03:48.969 INFO:tasks.workunit.client.0.vm03.stdout:9/295: readlink d15/l19 0 2026-03-09T00:03:48.970 INFO:tasks.workunit.client.0.vm03.stdout:9/296: unlink d15/f3e 0 2026-03-09T00:03:48.970 INFO:tasks.workunit.client.0.vm03.stdout:9/297: creat d15/d1c/d28/f5e x:0 0 0 2026-03-09T00:03:48.970 INFO:tasks.workunit.client.0.vm03.stdout:9/298: fsync fb 0 2026-03-09T00:03:48.971 INFO:tasks.workunit.client.0.vm03.stdout:9/299: mknod d15/d1c/d28/d30/c5f 0 2026-03-09T00:03:48.973 INFO:tasks.workunit.client.0.vm03.stdout:9/300: mknod d15/c60 0 2026-03-09T00:03:48.998 INFO:tasks.workunit.client.1.vm06.stdout:7/527: dwrite d0/df/d1a/f50 [0,4194304] 0 2026-03-09T00:03:49.000 INFO:tasks.workunit.client.1.vm06.stdout:7/528: creat d0/df/d1a/d35/f94 x:0 0 0 2026-03-09T00:03:49.001 INFO:tasks.workunit.client.0.vm03.stdout:7/308: dwrite d2/d1f/d42/d43/f49 [0,4194304] 0 2026-03-09T00:03:49.001 INFO:tasks.workunit.client.0.vm03.stdout:7/309: write d2/d1f/f3b [1119973,1426] 0 2026-03-09T00:03:49.007 INFO:tasks.workunit.client.1.vm06.stdout:7/529: link d0/c49 d0/df/d1a/d27/d4c/d40/d51/d90/c95 0 2026-03-09T00:03:49.009 INFO:tasks.workunit.client.0.vm03.stdout:9/301: dread d15/d1c/d28/f55 [0,4194304] 0 2026-03-09T00:03:49.010 INFO:tasks.workunit.client.1.vm06.stdout:0/505: dwrite d3/d18/d3c/fa0 [0,4194304] 0 2026-03-09T00:03:49.015 INFO:tasks.workunit.client.0.vm03.stdout:8/263: sync 2026-03-09T00:03:49.015 INFO:tasks.workunit.client.0.vm03.stdout:8/264: stat d7/df/c45 0 2026-03-09T00:03:49.015 INFO:tasks.workunit.client.0.vm03.stdout:8/265: write d7/f18 [250080,107672] 0 2026-03-09T00:03:49.015 INFO:tasks.workunit.client.0.vm03.stdout:8/266: write d7/f34 [872874,90050] 0 2026-03-09T00:03:49.015 INFO:tasks.workunit.client.0.vm03.stdout:6/271: sync 2026-03-09T00:03:49.017 INFO:tasks.workunit.client.1.vm06.stdout:7/530: mknod d0/df/d1a/d3a/d4e/d5e/c96 0 2026-03-09T00:03:49.022 INFO:tasks.workunit.client.1.vm06.stdout:0/506: mkdir d3/d18/d2c/d2d/d74/da8 0 2026-03-09T00:03:49.029 INFO:tasks.workunit.client.1.vm06.stdout:0/507: dread - d3/d18/f82 zero size 2026-03-09T00:03:49.029 INFO:tasks.workunit.client.1.vm06.stdout:7/531: mkdir d0/df/d1a/d27/d4c/d40/d5b/d97 0 2026-03-09T00:03:49.038 INFO:tasks.workunit.client.0.vm03.stdout:9/302: creat d15/d1c/d21/f61 x:0 0 0 2026-03-09T00:03:49.038 INFO:tasks.workunit.client.0.vm03.stdout:9/303: read d15/d1c/d28/f55 [1096700,56474] 0 2026-03-09T00:03:49.041 INFO:tasks.workunit.client.1.vm06.stdout:0/508: dread d3/d18/d2c/d2d/d74/f77 [0,4194304] 0 2026-03-09T00:03:49.057 INFO:tasks.workunit.client.1.vm06.stdout:0/509: stat d3/d18/d2c/l32 0 2026-03-09T00:03:49.058 INFO:tasks.workunit.client.1.vm06.stdout:0/510: link d3/f19 d3/d18/d28/d45/fa9 0 2026-03-09T00:03:49.058 INFO:tasks.workunit.client.1.vm06.stdout:0/511: mknod d3/d18/d1f/d39/d3b/caa 0 2026-03-09T00:03:49.058 INFO:tasks.workunit.client.0.vm03.stdout:8/267: mknod 
d7/df/d1e/d38/d4c/c4e 0 2026-03-09T00:03:49.058 INFO:tasks.workunit.client.0.vm03.stdout:8/268: stat d7/df/d1a 0 2026-03-09T00:03:49.058 INFO:tasks.workunit.client.0.vm03.stdout:6/272: rename d13/d1e/c45 to d13/d35/d4c/c5e 0 2026-03-09T00:03:49.058 INFO:tasks.workunit.client.0.vm03.stdout:9/304: mkdir d15/d1c/d21/d54/d62 0 2026-03-09T00:03:49.059 INFO:tasks.workunit.client.0.vm03.stdout:0/288: dwrite d2/da/f1b [0,4194304] 0 2026-03-09T00:03:49.069 INFO:tasks.workunit.client.0.vm03.stdout:9/305: rename d15/d1c/d28/l43 to d15/d1c/d36/l63 0 2026-03-09T00:03:49.075 INFO:tasks.workunit.client.0.vm03.stdout:6/273: write f12 [3494453,9935] 0 2026-03-09T00:03:49.089 INFO:tasks.workunit.client.0.vm03.stdout:6/274: dread d13/d1e/f3e [0,4194304] 0 2026-03-09T00:03:49.089 INFO:tasks.workunit.client.0.vm03.stdout:7/310: dwrite d2/f3 [0,4194304] 0 2026-03-09T00:03:49.091 INFO:tasks.workunit.client.0.vm03.stdout:6/275: mknod d13/d1e/d44/d4a/d52/c5f 0 2026-03-09T00:03:49.094 INFO:tasks.workunit.client.0.vm03.stdout:7/311: truncate d2/d4/f34 434715 0 2026-03-09T00:03:49.097 INFO:tasks.workunit.client.0.vm03.stdout:6/276: symlink d13/d35/d4c/l60 0 2026-03-09T00:03:49.099 INFO:tasks.workunit.client.1.vm06.stdout:2/590: dwrite d7/f8 [8388608,4194304] 0 2026-03-09T00:03:49.099 INFO:tasks.workunit.client.0.vm03.stdout:6/277: mknod d13/d1e/d44/d4a/c61 0 2026-03-09T00:03:49.102 INFO:tasks.workunit.client.1.vm06.stdout:2/591: symlink d7/d1a/d25/d97/laf 0 2026-03-09T00:03:49.104 INFO:tasks.workunit.client.1.vm06.stdout:2/592: rename d7/d1b/l64 to d7/da/d63/d81/lb0 0 2026-03-09T00:03:49.111 INFO:tasks.workunit.client.0.vm03.stdout:0/289: dread d2/fb [0,4194304] 0 2026-03-09T00:03:49.125 INFO:tasks.workunit.client.0.vm03.stdout:0/290: link d2/da/d36/d39/f57 d2/da/d36/d39/d4b/d55/f60 0 2026-03-09T00:03:49.129 INFO:tasks.workunit.client.1.vm06.stdout:5/621: dwrite d5/f6b [0,4194304] 0 2026-03-09T00:03:49.129 INFO:tasks.workunit.client.1.vm06.stdout:5/622: write d5/d1c/d21/d28/f63 [697543,119770] 0 2026-03-09T00:03:49.131 INFO:tasks.workunit.client.0.vm03.stdout:1/375: getdents d4/d3a/d61/d78 0 2026-03-09T00:03:49.133 INFO:tasks.workunit.client.1.vm06.stdout:8/487: sync 2026-03-09T00:03:49.133 INFO:tasks.workunit.client.1.vm06.stdout:8/488: chown db/d53/d70/l41 0 1 2026-03-09T00:03:49.133 INFO:tasks.workunit.client.0.vm03.stdout:3/236: getdents d2/db/d3b/d3f 0 2026-03-09T00:03:49.137 INFO:tasks.workunit.client.1.vm06.stdout:0/512: dwrite d3/d18/d3c/f87 [0,4194304] 0 2026-03-09T00:03:49.143 INFO:tasks.workunit.client.0.vm03.stdout:5/319: dwrite d1c/d20/d55/f46 [0,4194304] 0 2026-03-09T00:03:49.164 INFO:tasks.workunit.client.0.vm03.stdout:0/291: mkdir d2/da/d36/d39/d4b/d61 0 2026-03-09T00:03:49.164 INFO:tasks.workunit.client.1.vm06.stdout:7/532: dwrite d0/df/f8a [0,4194304] 0 2026-03-09T00:03:49.165 INFO:tasks.workunit.client.1.vm06.stdout:7/533: chown d0/df/d1a/d27/d4c/d40/d5b 0 1 2026-03-09T00:03:49.165 INFO:tasks.workunit.client.1.vm06.stdout:7/534: dread - d0/d39/f3e zero size 2026-03-09T00:03:49.166 INFO:tasks.workunit.client.1.vm06.stdout:4/466: rmdir d17/d21 39 2026-03-09T00:03:49.166 INFO:tasks.workunit.client.1.vm06.stdout:4/467: read - d17/d24/d3b/d54/f7e zero size 2026-03-09T00:03:49.166 INFO:tasks.workunit.client.0.vm03.stdout:0/292: link d2/da/d36/d39/l4a d2/l62 0 2026-03-09T00:03:49.168 INFO:tasks.workunit.client.0.vm03.stdout:0/293: link d2/f59 d2/d5a/f63 0 2026-03-09T00:03:49.171 INFO:tasks.workunit.client.0.vm03.stdout:0/294: link d2/da/d36/d39/l4a d2/da/d36/d39/l64 0 2026-03-09T00:03:49.173 
INFO:tasks.workunit.client.0.vm03.stdout:0/295: mknod d2/da/d36/d39/d4b/d55/c65 0 2026-03-09T00:03:49.181 INFO:tasks.workunit.client.1.vm06.stdout:5/623: link d5/d44/f81 d5/d44/d84/dc5/fd2 0 2026-03-09T00:03:49.188 INFO:tasks.workunit.client.1.vm06.stdout:8/489: mknod db/c9d 0 2026-03-09T00:03:49.188 INFO:tasks.workunit.client.1.vm06.stdout:0/513: creat d3/d18/d1f/d44/d6a/d73/fab x:0 0 0 2026-03-09T00:03:49.188 INFO:tasks.workunit.client.1.vm06.stdout:7/535: truncate d0/df/d1a/d35/d62/f81 1095564 0 2026-03-09T00:03:49.188 INFO:tasks.workunit.client.1.vm06.stdout:7/536: readlink d0/df/d1a/d27/d4c/d40/d51/l76 0 2026-03-09T00:03:49.189 INFO:tasks.workunit.client.1.vm06.stdout:7/537: creat d0/df/d1a/d35/d62/f98 x:0 0 0 2026-03-09T00:03:49.189 INFO:tasks.workunit.client.1.vm06.stdout:4/468: symlink d17/d24/d3b/d97/l9b 0 2026-03-09T00:03:49.189 INFO:tasks.workunit.client.1.vm06.stdout:4/469: dread d17/d21/d4c/f87 [0,4194304] 0 2026-03-09T00:03:49.189 INFO:tasks.workunit.client.1.vm06.stdout:5/624: rename d5/d1c/l3a to d5/d1c/d21/d28/d5e/d66/d78/dc8/dc3/ld3 0 2026-03-09T00:03:49.190 INFO:tasks.workunit.client.1.vm06.stdout:2/593: dwrite d7/d1a/f30 [4194304,4194304] 0 2026-03-09T00:03:49.190 INFO:tasks.workunit.client.1.vm06.stdout:2/594: truncate d7/f4c 1141023 0 2026-03-09T00:03:49.192 INFO:tasks.workunit.client.1.vm06.stdout:7/538: mkdir d0/d55/d99 0 2026-03-09T00:03:49.197 INFO:tasks.workunit.client.1.vm06.stdout:4/470: dread d17/d24/f36 [0,4194304] 0 2026-03-09T00:03:49.197 INFO:tasks.workunit.client.1.vm06.stdout:0/514: rename d3/d18/f8e to d3/d18/d2c/d2d/d74/d90/fac 0 2026-03-09T00:03:49.197 INFO:tasks.workunit.client.1.vm06.stdout:0/515: chown d3/d18/d2c/d2d/d74/d7d 5844 1 2026-03-09T00:03:49.199 INFO:tasks.workunit.client.1.vm06.stdout:7/539: symlink d0/l9a 0 2026-03-09T00:03:49.227 INFO:tasks.workunit.client.0.vm03.stdout:1/376: dwrite d4/d15/f44 [4194304,4194304] 0 2026-03-09T00:03:49.228 INFO:tasks.workunit.client.0.vm03.stdout:5/320: dwrite ff [0,4194304] 0 2026-03-09T00:03:49.233 INFO:tasks.workunit.client.0.vm03.stdout:1/377: dread d4/d15/d5c/d6c/f71 [0,4194304] 0 2026-03-09T00:03:49.233 INFO:tasks.workunit.client.0.vm03.stdout:5/321: dread d1c/f29 [0,4194304] 0 2026-03-09T00:03:49.233 INFO:tasks.workunit.client.0.vm03.stdout:5/322: stat d1c/d51 0 2026-03-09T00:03:49.238 INFO:tasks.workunit.client.0.vm03.stdout:5/323: rmdir d1c/d51 39 2026-03-09T00:03:49.238 INFO:tasks.workunit.client.0.vm03.stdout:5/324: fsync fb 0 2026-03-09T00:03:49.238 INFO:tasks.workunit.client.0.vm03.stdout:5/325: chown d1c/d20/d55/d43/f53 7240391 1 2026-03-09T00:03:49.238 INFO:tasks.workunit.client.0.vm03.stdout:5/326: unlink c19 0 2026-03-09T00:03:49.275 INFO:tasks.workunit.client.1.vm06.stdout:8/490: dwrite db/dd/f27 [0,4194304] 0 2026-03-09T00:03:49.279 INFO:tasks.workunit.client.1.vm06.stdout:8/491: mknod db/dd/d24/d80/c9e 0 2026-03-09T00:03:49.280 INFO:tasks.workunit.client.1.vm06.stdout:8/492: truncate db/dd/d48/f4e 104862 0 2026-03-09T00:03:49.280 INFO:tasks.workunit.client.1.vm06.stdout:7/540: dread d0/df/d1a/d22/f28 [0,4194304] 0 2026-03-09T00:03:49.283 INFO:tasks.workunit.client.1.vm06.stdout:7/541: unlink d0/df/d1a/d27/d70/l8e 0 2026-03-09T00:03:49.288 INFO:tasks.workunit.client.0.vm03.stdout:8/269: dwrite f3 [0,4194304] 0 2026-03-09T00:03:49.289 INFO:tasks.workunit.client.1.vm06.stdout:2/595: write d7/d1b/f22 [21887,101244] 0 2026-03-09T00:03:49.289 INFO:tasks.workunit.client.1.vm06.stdout:2/596: read d7/d1b/f46 [2860931,12748] 0 2026-03-09T00:03:49.301 
INFO:tasks.workunit.client.1.vm06.stdout:9/422: write d1/d4/d6e/f5d [1238747,125469] 0 2026-03-09T00:03:49.301 INFO:tasks.workunit.client.1.vm06.stdout:9/423: chown d1/d4/l42 17247 1 2026-03-09T00:03:49.302 INFO:tasks.workunit.client.1.vm06.stdout:2/597: symlink d7/da/d4e/d57/d9d/lb1 0 2026-03-09T00:03:49.305 INFO:tasks.workunit.client.1.vm06.stdout:2/598: unlink d7/da/d4e/d57/d9d/lb1 0 2026-03-09T00:03:49.308 INFO:tasks.workunit.client.1.vm06.stdout:2/599: mkdir d7/d1a/d25/d66/d87/da8/db2 0 2026-03-09T00:03:49.309 INFO:tasks.workunit.client.1.vm06.stdout:3/580: fsync d11/d28/d2e/d2f/d36/f59 0 2026-03-09T00:03:49.309 INFO:tasks.workunit.client.1.vm06.stdout:3/581: truncate d11/d28/d2e/f62 7910062 0 2026-03-09T00:03:49.309 INFO:tasks.workunit.client.1.vm06.stdout:3/582: stat d11/d28/d2e/d2f/d5b/d94/faa 0 2026-03-09T00:03:49.316 INFO:tasks.workunit.client.0.vm03.stdout:9/306: rename d15/d1c/d28/d30 to d15/d1c/d21/d64 0 2026-03-09T00:03:49.316 INFO:tasks.workunit.client.1.vm06.stdout:9/424: write d1/d4/d6e/d14/d25/f32 [1022746,104749] 0 2026-03-09T00:03:49.316 INFO:tasks.workunit.client.1.vm06.stdout:9/425: write d1/f2a [15813,47546] 0 2026-03-09T00:03:49.319 INFO:tasks.workunit.client.1.vm06.stdout:2/600: dread d7/da/db/de/f49 [0,4194304] 0 2026-03-09T00:03:49.319 INFO:tasks.workunit.client.1.vm06.stdout:2/601: fdatasync d7/d1a/d25/fae 0 2026-03-09T00:03:49.320 INFO:tasks.workunit.client.0.vm03.stdout:9/307: dread d15/d1c/d21/f25 [0,4194304] 0 2026-03-09T00:03:49.324 INFO:tasks.workunit.client.1.vm06.stdout:5/625: dwrite d5/d1c/d68/fc7 [0,4194304] 0 2026-03-09T00:03:49.326 INFO:tasks.workunit.client.1.vm06.stdout:0/516: dwrite d3/d18/d1f/d44/f58 [0,4194304] 0 2026-03-09T00:03:49.326 INFO:tasks.workunit.client.1.vm06.stdout:0/517: stat d3/f7 0 2026-03-09T00:03:49.332 INFO:tasks.workunit.client.1.vm06.stdout:4/471: dwrite d17/f20 [0,4194304] 0 2026-03-09T00:03:49.333 INFO:tasks.workunit.client.0.vm03.stdout:1/378: dwrite d4/d15/d5c/d6c/f71 [0,4194304] 0 2026-03-09T00:03:49.337 INFO:tasks.workunit.client.0.vm03.stdout:9/308: write d15/d1c/d36/f3a [3927447,118355] 0 2026-03-09T00:03:49.337 INFO:tasks.workunit.client.0.vm03.stdout:9/309: readlink d15/l19 0 2026-03-09T00:03:49.345 INFO:tasks.workunit.client.0.vm03.stdout:9/310: dread fb [0,4194304] 0 2026-03-09T00:03:49.345 INFO:tasks.workunit.client.0.vm03.stdout:9/311: write d15/d1c/d28/f29 [969153,94535] 0 2026-03-09T00:03:49.345 INFO:tasks.workunit.client.1.vm06.stdout:9/426: truncate d1/d4/d2f/f43 24282 0 2026-03-09T00:03:49.346 INFO:tasks.workunit.client.1.vm06.stdout:2/602: truncate d7/d1a/d25/d66/f84 54743 0 2026-03-09T00:03:49.347 INFO:tasks.workunit.client.0.vm03.stdout:3/237: rename d2/f3d to d2/db/d2d/f45 0 2026-03-09T00:03:49.347 INFO:tasks.workunit.client.0.vm03.stdout:3/238: readlink d2/db/d2d/l41 0 2026-03-09T00:03:49.348 INFO:tasks.workunit.client.1.vm06.stdout:5/626: creat d5/d1c/d21/d28/d5e/d66/d78/da6/fd4 x:0 0 0 2026-03-09T00:03:49.349 INFO:tasks.workunit.client.1.vm06.stdout:0/518: rmdir d3/d18/d1f 39 2026-03-09T00:03:49.351 INFO:tasks.workunit.client.1.vm06.stdout:2/603: dread d7/f8 [8388608,4194304] 0 2026-03-09T00:03:49.352 INFO:tasks.workunit.client.1.vm06.stdout:2/604: readlink d7/l68 0 2026-03-09T00:03:49.354 INFO:tasks.workunit.client.1.vm06.stdout:7/542: dwrite d0/df/d17/f38 [4194304,4194304] 0 2026-03-09T00:03:49.354 INFO:tasks.workunit.client.1.vm06.stdout:2/605: dread d7/da/db/de/f49 [0,4194304] 0 2026-03-09T00:03:49.354 INFO:tasks.workunit.client.1.vm06.stdout:7/543: write d0/df/d1a/d27/f37 [985158,93694] 0 
2026-03-09T00:03:49.362 INFO:tasks.workunit.client.1.vm06.stdout:3/583: dwrite d11/f3c [0,4194304] 0 2026-03-09T00:03:49.368 INFO:tasks.workunit.client.0.vm03.stdout:1/379: rename d4/d3a/d3d/d46/l54 to d4/d3a/d32/l80 0 2026-03-09T00:03:49.374 INFO:tasks.workunit.client.0.vm03.stdout:1/380: mkdir d4/d3a/d61/d78/d81 0 2026-03-09T00:03:49.396 INFO:tasks.workunit.client.0.vm03.stdout:1/381: write d4/d3a/d3d/d46/f70 [612798,62743] 0 2026-03-09T00:03:49.397 INFO:tasks.workunit.client.0.vm03.stdout:6/278: getdents d13/d1e/d44/d4a 0 2026-03-09T00:03:49.397 INFO:tasks.workunit.client.1.vm06.stdout:9/427: symlink d1/d3/d50/l89 0 2026-03-09T00:03:49.397 INFO:tasks.workunit.client.1.vm06.stdout:5/627: mkdir d5/d1c/d21/d28/d5e/d66/d78/dd5 0 2026-03-09T00:03:49.397 INFO:tasks.workunit.client.1.vm06.stdout:0/519: symlink d3/d18/d1f/d44/lad 0 2026-03-09T00:03:49.397 INFO:tasks.workunit.client.1.vm06.stdout:0/520: dread - d3/d18/d2c/d2d/d74/d90/fac zero size 2026-03-09T00:03:49.397 INFO:tasks.workunit.client.1.vm06.stdout:0/521: chown d3/d18/d1f/f26 319 1 2026-03-09T00:03:49.397 INFO:tasks.workunit.client.1.vm06.stdout:2/606: mknod d7/d1a/d25/d66/d87/cb3 0 2026-03-09T00:03:49.397 INFO:tasks.workunit.client.1.vm06.stdout:7/544: mkdir d0/df/d1a/d27/d70/d9b 0 2026-03-09T00:03:49.397 INFO:tasks.workunit.client.1.vm06.stdout:7/545: chown d0/df/d1a/d3a/l71 5611772 1 2026-03-09T00:03:49.397 INFO:tasks.workunit.client.1.vm06.stdout:7/546: write d0/fe [631133,83221] 0 2026-03-09T00:03:49.397 INFO:tasks.workunit.client.1.vm06.stdout:9/428: rmdir d1 39 2026-03-09T00:03:49.397 INFO:tasks.workunit.client.1.vm06.stdout:9/429: readlink d1/d3/d4f/d52/l59 0 2026-03-09T00:03:49.397 INFO:tasks.workunit.client.1.vm06.stdout:9/430: stat d1/d4/d6e 0 2026-03-09T00:03:49.397 INFO:tasks.workunit.client.1.vm06.stdout:9/431: dread - d1/d4/d6e/d14/d25/f7a zero size 2026-03-09T00:03:49.397 INFO:tasks.workunit.client.1.vm06.stdout:9/432: creat d1/d4/d6e/d9/f8a x:0 0 0 2026-03-09T00:03:49.397 INFO:tasks.workunit.client.1.vm06.stdout:7/547: rmdir d0/df/d1a/d27/d4c/d40/d5b/d97 0 2026-03-09T00:03:49.399 INFO:tasks.workunit.client.1.vm06.stdout:9/433: unlink d1/d4/f44 0 2026-03-09T00:03:49.399 INFO:tasks.workunit.client.1.vm06.stdout:9/434: chown d1/d4/d6e/d14/d25/d85/d49/f69 1819783642 1 2026-03-09T00:03:49.399 INFO:tasks.workunit.client.1.vm06.stdout:9/435: dread - d1/d4/d6e/d14/d25/f6f zero size 2026-03-09T00:03:49.401 INFO:tasks.workunit.client.0.vm03.stdout:1/382: write d4/d3a/d61/f65 [3879695,32249] 0 2026-03-09T00:03:49.402 INFO:tasks.workunit.client.1.vm06.stdout:0/522: write d3/d18/d28/d45/fa9 [1215235,74068] 0 2026-03-09T00:03:49.402 INFO:tasks.workunit.client.1.vm06.stdout:0/523: readlink d3/d18/d28/d45/l6d 0 2026-03-09T00:03:49.406 INFO:tasks.workunit.client.1.vm06.stdout:7/548: truncate d0/df/d1a/d3a/d4e/f63 4121109 0 2026-03-09T00:03:49.409 INFO:tasks.workunit.client.1.vm06.stdout:9/436: creat d1/d3/d4f/d52/f8b x:0 0 0 2026-03-09T00:03:49.410 INFO:tasks.workunit.client.0.vm03.stdout:1/383: rename d4/d15/f3f to d4/d5e/f82 0 2026-03-09T00:03:49.427 INFO:tasks.workunit.client.0.vm03.stdout:8/270: creat d7/df/d1a/f4f x:0 0 0 2026-03-09T00:03:49.430 INFO:tasks.workunit.client.0.vm03.stdout:3/239: dwrite d2/db/f24 [0,4194304] 0 2026-03-09T00:03:49.430 INFO:tasks.workunit.client.0.vm03.stdout:3/240: chown d2/db/c29 898 1 2026-03-09T00:03:49.446 INFO:tasks.workunit.client.0.vm03.stdout:9/312: dwrite d15/d1c/d28/f2f [0,4194304] 0 2026-03-09T00:03:49.450 INFO:tasks.workunit.client.0.vm03.stdout:8/271: write d7/fd [3122078,81714] 0 
2026-03-09T00:03:49.452 INFO:tasks.workunit.client.0.vm03.stdout:9/313: read d15/d1c/d28/f2f [2266208,83306] 0 2026-03-09T00:03:49.457 INFO:tasks.workunit.client.0.vm03.stdout:1/384: creat d4/d6/f83 x:0 0 0 2026-03-09T00:03:49.468 INFO:tasks.workunit.client.1.vm06.stdout:2/607: dwrite d7/da/db/f74 [0,4194304] 0 2026-03-09T00:03:49.468 INFO:tasks.workunit.client.1.vm06.stdout:2/608: chown d7/d1a/d3c/l75 345849948 1 2026-03-09T00:03:49.468 INFO:tasks.workunit.client.0.vm03.stdout:8/272: mknod d7/df/d1e/d38/d4c/c50 0 2026-03-09T00:03:49.468 INFO:tasks.workunit.client.0.vm03.stdout:9/314: link d15/f17 d15/d1c/d21/d54/f65 0 2026-03-09T00:03:49.468 INFO:tasks.workunit.client.0.vm03.stdout:9/315: write d15/d1c/d28/f55 [1847407,27297] 0 2026-03-09T00:03:49.468 INFO:tasks.workunit.client.0.vm03.stdout:8/273: mknod d7/df/d1e/d38/d4c/c51 0 2026-03-09T00:03:49.468 INFO:tasks.workunit.client.0.vm03.stdout:8/274: rename d7/df/d1e/l41 to d7/df/l52 0 2026-03-09T00:03:49.468 INFO:tasks.workunit.client.0.vm03.stdout:8/275: stat d7/df/c14 0 2026-03-09T00:03:49.468 INFO:tasks.workunit.client.0.vm03.stdout:8/276: rmdir d7/df/d1a/d40 39 2026-03-09T00:03:49.471 INFO:tasks.workunit.client.0.vm03.stdout:1/385: dread d4/d15/f45 [0,4194304] 0 2026-03-09T00:03:49.478 INFO:tasks.workunit.client.0.vm03.stdout:8/277: getdents d7/df/d1a/d2b/d43 0 2026-03-09T00:03:49.490 INFO:tasks.workunit.client.0.vm03.stdout:8/278: creat d7/df/f53 x:0 0 0 2026-03-09T00:03:49.490 INFO:tasks.workunit.client.0.vm03.stdout:8/279: rmdir d7 39 2026-03-09T00:03:49.490 INFO:tasks.workunit.client.0.vm03.stdout:8/280: write d7/df/d1a/f1c [2288104,89986] 0 2026-03-09T00:03:49.490 INFO:tasks.workunit.client.0.vm03.stdout:8/281: truncate d7/df/d1e/d38/f3e 540650 0 2026-03-09T00:03:49.490 INFO:tasks.workunit.client.0.vm03.stdout:8/282: symlink d7/df/d1e/d3f/l54 0 2026-03-09T00:03:49.490 INFO:tasks.workunit.client.0.vm03.stdout:8/283: creat d7/df/f55 x:0 0 0 2026-03-09T00:03:49.490 INFO:tasks.workunit.client.0.vm03.stdout:8/284: stat d7/df/f37 0 2026-03-09T00:03:49.491 INFO:tasks.workunit.client.0.vm03.stdout:6/279: dwrite d13/d1e/f28 [0,4194304] 0 2026-03-09T00:03:49.492 INFO:tasks.workunit.client.0.vm03.stdout:8/285: rename d7/fd to d7/df/f56 0 2026-03-09T00:03:49.495 INFO:tasks.workunit.client.0.vm03.stdout:6/280: mkdir d13/d35/d4c/d62 0 2026-03-09T00:03:49.497 INFO:tasks.workunit.client.0.vm03.stdout:8/286: creat d7/df/d1a/d2b/d43/f57 x:0 0 0 2026-03-09T00:03:49.514 INFO:tasks.workunit.client.0.vm03.stdout:6/281: truncate f2 1875393 0 2026-03-09T00:03:49.516 INFO:tasks.workunit.client.0.vm03.stdout:6/282: symlink d13/d1e/d44/d4a/l63 0 2026-03-09T00:03:49.520 INFO:tasks.workunit.client.0.vm03.stdout:6/283: truncate d13/f31 2263991 0 2026-03-09T00:03:49.543 INFO:tasks.workunit.client.1.vm06.stdout:1/455: sync 2026-03-09T00:03:49.544 INFO:tasks.workunit.client.1.vm06.stdout:6/541: sync 2026-03-09T00:03:49.545 INFO:tasks.workunit.client.1.vm06.stdout:6/542: mkdir d4/d27/d42/d7e/daa 0 2026-03-09T00:03:49.549 INFO:tasks.workunit.client.0.vm03.stdout:9/316: dwrite d15/d1c/d36/d4d/f5d [0,4194304] 0 2026-03-09T00:03:49.553 INFO:tasks.workunit.client.1.vm06.stdout:0/524: dwrite d3/d18/d28/d45/f97 [0,4194304] 0 2026-03-09T00:03:49.557 INFO:tasks.workunit.client.0.vm03.stdout:9/317: mknod d15/d1c/d21/c66 0 2026-03-09T00:03:49.557 INFO:tasks.workunit.client.0.vm03.stdout:9/318: mkdir d15/d1c/d21/d67 0 2026-03-09T00:03:49.561 INFO:tasks.workunit.client.0.vm03.stdout:9/319: write d15/d1c/d36/d4d/f5d [2737534,122684] 0 2026-03-09T00:03:49.561 
INFO:tasks.workunit.client.0.vm03.stdout:9/320: fsync d15/f23 0 2026-03-09T00:03:49.562 INFO:tasks.workunit.client.1.vm06.stdout:8/493: rename db/dd/d24/d80 to db/dd/d85/d9f 0 2026-03-09T00:03:49.562 INFO:tasks.workunit.client.1.vm06.stdout:8/494: unlink db/dd/d48/f89 0 2026-03-09T00:03:49.563 INFO:tasks.workunit.client.1.vm06.stdout:8/495: rename db/d53/d70/d38/d4d/f83 to db/d53/d7c/fa0 0 2026-03-09T00:03:49.565 INFO:tasks.workunit.client.0.vm03.stdout:9/321: dread d15/d1c/d21/f4c [0,4194304] 0 2026-03-09T00:03:49.565 INFO:tasks.workunit.client.1.vm06.stdout:8/496: symlink db/d1e/d46/d94/la1 0 2026-03-09T00:03:49.565 INFO:tasks.workunit.client.1.vm06.stdout:8/497: fdatasync db/d1e/f5f 0 2026-03-09T00:03:49.566 INFO:tasks.workunit.client.1.vm06.stdout:2/609: dwrite d7/d1b/f22 [0,4194304] 0 2026-03-09T00:03:49.567 INFO:tasks.workunit.client.1.vm06.stdout:8/498: rename db/d1e/d46/f69 to db/d53/d6d/fa2 0 2026-03-09T00:03:49.568 INFO:tasks.workunit.client.1.vm06.stdout:8/499: mknod db/d74/d78/d98/ca3 0 2026-03-09T00:03:49.569 INFO:tasks.workunit.client.1.vm06.stdout:8/500: mknod db/dd/d85/ca4 0 2026-03-09T00:03:49.569 INFO:tasks.workunit.client.1.vm06.stdout:8/501: mknod db/dd/d85/d9f/ca5 0 2026-03-09T00:03:49.569 INFO:tasks.workunit.client.1.vm06.stdout:8/502: chown db/d53/d70/f75 15 1 2026-03-09T00:03:49.569 INFO:tasks.workunit.client.1.vm06.stdout:8/503: fsync db/d1e/f2e 0 2026-03-09T00:03:49.570 INFO:tasks.workunit.client.1.vm06.stdout:8/504: creat db/d53/d70/fa6 x:0 0 0 2026-03-09T00:03:49.570 INFO:tasks.workunit.client.1.vm06.stdout:8/505: mkdir db/dd/d24/da7 0 2026-03-09T00:03:49.576 INFO:tasks.workunit.client.0.vm03.stdout:8/287: dwrite d7/df/f2c [0,4194304] 0 2026-03-09T00:03:49.576 INFO:tasks.workunit.client.0.vm03.stdout:8/288: write d7/f11 [1355848,31601] 0 2026-03-09T00:03:49.577 INFO:tasks.workunit.client.1.vm06.stdout:6/543: dwrite d4/d27/d42/d4b/f83 [0,4194304] 0 2026-03-09T00:03:49.583 INFO:tasks.workunit.client.1.vm06.stdout:6/544: rmdir d4/d16/d46/d94 0 2026-03-09T00:03:49.584 INFO:tasks.workunit.client.1.vm06.stdout:6/545: symlink d4/d16/d46/lab 0 2026-03-09T00:03:49.584 INFO:tasks.workunit.client.1.vm06.stdout:6/546: fsync d4/d27/d42/d52/f9e 0 2026-03-09T00:03:49.585 INFO:tasks.workunit.client.1.vm06.stdout:6/547: mkdir d4/d27/d42/d7e/dac 0 2026-03-09T00:03:49.585 INFO:tasks.workunit.client.1.vm06.stdout:6/548: creat d4/d27/d42/d4b/fad x:0 0 0 2026-03-09T00:03:49.585 INFO:tasks.workunit.client.1.vm06.stdout:6/549: stat d4/d27/d42/d7e/daa 0 2026-03-09T00:03:49.588 INFO:tasks.workunit.client.1.vm06.stdout:8/506: write db/dd/d48/f68 [3452109,256] 0 2026-03-09T00:03:49.588 INFO:tasks.workunit.client.1.vm06.stdout:8/507: creat db/d53/d70/d38/fa8 x:0 0 0 2026-03-09T00:03:49.588 INFO:tasks.workunit.client.1.vm06.stdout:8/508: fdatasync db/d53/d70/d38/d4d/d79/f96 0 2026-03-09T00:03:49.590 INFO:tasks.workunit.client.1.vm06.stdout:6/550: write d4/f68 [2116355,29404] 0 2026-03-09T00:03:49.590 INFO:tasks.workunit.client.1.vm06.stdout:3/584: creat d11/d3f/fcd x:0 0 0 2026-03-09T00:03:49.590 INFO:tasks.workunit.client.1.vm06.stdout:3/585: fsync d11/d28/d2e/d2f/d36/d8f/fca 0 2026-03-09T00:03:49.591 INFO:tasks.workunit.client.1.vm06.stdout:3/586: readlink d11/d28/d2e/d2f/d36/d8f/lcb 0 2026-03-09T00:03:49.591 INFO:tasks.workunit.client.1.vm06.stdout:3/587: read - d11/d28/d2e/d2f/d5b/d94/fb3 zero size 2026-03-09T00:03:49.591 INFO:tasks.workunit.client.1.vm06.stdout:3/588: write d11/d28/d2e/d2f/d5b/d5f/f81 [1668639,72871] 0 2026-03-09T00:03:49.591 
INFO:tasks.workunit.client.1.vm06.stdout:3/589: write d11/d28/d2e/d2f/d36/fb7 [5595417,30478] 0 2026-03-09T00:03:49.593 INFO:tasks.workunit.client.1.vm06.stdout:6/551: rmdir d4/d16/d46/d90 39 2026-03-09T00:03:49.593 INFO:tasks.workunit.client.1.vm06.stdout:8/509: dread db/dd/f7a [0,4194304] 0 2026-03-09T00:03:49.594 INFO:tasks.workunit.client.1.vm06.stdout:3/590: creat d11/d28/d2e/d2f/d5b/d5f/d91/fce x:0 0 0 2026-03-09T00:03:49.594 INFO:tasks.workunit.client.1.vm06.stdout:3/591: truncate d11/d28/d4d/d9b/f9d 725150 0 2026-03-09T00:03:49.594 INFO:tasks.workunit.client.1.vm06.stdout:6/552: creat d4/fae x:0 0 0 2026-03-09T00:03:49.594 INFO:tasks.workunit.client.1.vm06.stdout:6/553: creat d4/d27/d42/d52/d7d/faf x:0 0 0 2026-03-09T00:03:49.594 INFO:tasks.workunit.client.1.vm06.stdout:6/554: dread - d4/f6e zero size 2026-03-09T00:03:49.594 INFO:tasks.workunit.client.1.vm06.stdout:6/555: chown d4/d27/c48 37 1 2026-03-09T00:03:49.594 INFO:tasks.workunit.client.1.vm06.stdout:6/556: write d4/d27/d42/f75 [191904,63415] 0 2026-03-09T00:03:49.595 INFO:tasks.workunit.client.1.vm06.stdout:3/592: symlink d11/d28/d2e/d7e/d83/lcf 0 2026-03-09T00:03:49.595 INFO:tasks.workunit.client.1.vm06.stdout:8/510: symlink db/dd/d24/da7/la9 0 2026-03-09T00:03:49.595 INFO:tasks.workunit.client.1.vm06.stdout:8/511: chown db/dd/d85/d9f/f88 700 1 2026-03-09T00:03:49.609 INFO:tasks.workunit.client.1.vm06.stdout:3/593: symlink d11/d28/d2e/d2f/d5b/d5f/d91/ld0 0 2026-03-09T00:03:49.609 INFO:tasks.workunit.client.1.vm06.stdout:3/594: readlink d11/d28/d2e/d7e/d83/l8a 0 2026-03-09T00:03:49.609 INFO:tasks.workunit.client.1.vm06.stdout:3/595: fsync d11/d28/f5e 0 2026-03-09T00:03:49.612 INFO:tasks.workunit.client.1.vm06.stdout:3/596: creat d11/d28/d2e/db2/dc2/fd1 x:0 0 0 2026-03-09T00:03:49.612 INFO:tasks.workunit.client.1.vm06.stdout:3/597: read d11/d28/d2e/d2f/d5b/d94/faa [976587,84316] 0 2026-03-09T00:03:49.612 INFO:tasks.workunit.client.1.vm06.stdout:3/598: stat f9 0 2026-03-09T00:03:49.612 INFO:tasks.workunit.client.1.vm06.stdout:3/599: write d11/d28/d2e/d2f/d36/f55 [170291,50455] 0 2026-03-09T00:03:49.613 INFO:tasks.workunit.client.0.vm03.stdout:4/350: sync 2026-03-09T00:03:49.615 INFO:tasks.workunit.client.1.vm06.stdout:3/600: mkdir d11/d28/d4d/d89/d90/dd2 0 2026-03-09T00:03:49.617 INFO:tasks.workunit.client.0.vm03.stdout:4/351: creat d7/d20/d29/d38/f6e x:0 0 0 2026-03-09T00:03:49.617 INFO:tasks.workunit.client.0.vm03.stdout:4/352: write d7/d20/d35/d66/f6c [38616,52281] 0 2026-03-09T00:03:49.617 INFO:tasks.workunit.client.0.vm03.stdout:4/353: write d7/d23/d25/f3e [2081559,98010] 0 2026-03-09T00:03:49.619 INFO:tasks.workunit.client.1.vm06.stdout:3/601: rmdir d11/d28/d2e/d2f/d36/d8f 39 2026-03-09T00:03:49.624 INFO:tasks.workunit.client.1.vm06.stdout:0/525: rmdir d3/d18/d2c/d2d/d74 39 2026-03-09T00:03:49.624 INFO:tasks.workunit.client.1.vm06.stdout:0/526: chown d3/d18/d2c 526 1 2026-03-09T00:03:49.625 INFO:tasks.workunit.client.1.vm06.stdout:0/527: mknod d3/d18/d2c/d2d/d74/da8/cae 0 2026-03-09T00:03:49.635 INFO:tasks.workunit.client.1.vm06.stdout:3/602: write d11/d28/d2e/f65 [1553347,121879] 0 2026-03-09T00:03:49.635 INFO:tasks.workunit.client.1.vm06.stdout:3/603: write d11/d28/d57/f7b [1777057,8531] 0 2026-03-09T00:03:49.652 INFO:tasks.workunit.client.1.vm06.stdout:0/528: rmdir d3/d18/d1f/d44 39 2026-03-09T00:03:49.652 INFO:tasks.workunit.client.1.vm06.stdout:0/529: chown d3/f7 14492 1 2026-03-09T00:03:49.658 INFO:tasks.workunit.client.1.vm06.stdout:8/512: dwrite db/d53/d70/d38/d4d/f65 [0,4194304] 0 2026-03-09T00:03:49.658 
INFO:tasks.workunit.client.1.vm06.stdout:8/513: creat db/d74/faa x:0 0 0 2026-03-09T00:03:49.659 INFO:tasks.workunit.client.1.vm06.stdout:0/530: unlink d3/d18/d1f/d39/d49/d60/l8d 0 2026-03-09T00:03:49.673 INFO:tasks.workunit.client.1.vm06.stdout:8/514: mkdir db/dd/d24/da7/dab 0 2026-03-09T00:03:49.678 INFO:tasks.workunit.client.1.vm06.stdout:8/515: dread db/dd/d24/f6e [0,4194304] 0 2026-03-09T00:03:49.679 INFO:tasks.workunit.client.0.vm03.stdout:4/354: dwrite d7/d27/f2c [4194304,4194304] 0 2026-03-09T00:03:49.679 INFO:tasks.workunit.client.0.vm03.stdout:4/355: chown d7/f1d 48439 1 2026-03-09T00:03:49.679 INFO:tasks.workunit.client.0.vm03.stdout:4/356: dread - d7/d20/d29/d38/d3a/f50 zero size 2026-03-09T00:03:49.680 INFO:tasks.workunit.client.1.vm06.stdout:1/456: creat d6/f98 x:0 0 0 2026-03-09T00:03:49.680 INFO:tasks.workunit.client.1.vm06.stdout:9/437: symlink d1/d3/d4f/l8c 0 2026-03-09T00:03:49.680 INFO:tasks.workunit.client.1.vm06.stdout:1/457: fdatasync d6/d4c/d71/f84 0 2026-03-09T00:03:49.680 INFO:tasks.workunit.client.1.vm06.stdout:9/438: fdatasync d1/d3/d4f/f71 0 2026-03-09T00:03:49.680 INFO:tasks.workunit.client.1.vm06.stdout:9/439: dread - d1/d3/d4f/f71 zero size 2026-03-09T00:03:49.680 INFO:tasks.workunit.client.1.vm06.stdout:9/440: truncate d1/d4/d6e/d14/d25/f7a 354168 0 2026-03-09T00:03:49.680 INFO:tasks.workunit.client.1.vm06.stdout:9/441: fsync d1/d4/d6e/d9/f65 0 2026-03-09T00:03:49.682 INFO:tasks.workunit.client.0.vm03.stdout:4/357: mkdir d7/d6f 0 2026-03-09T00:03:49.683 INFO:tasks.workunit.client.1.vm06.stdout:8/516: mkdir db/dd/d24/dac 0 2026-03-09T00:03:49.684 INFO:tasks.workunit.client.0.vm03.stdout:4/358: creat d7/d20/f70 x:0 0 0 2026-03-09T00:03:49.684 INFO:tasks.workunit.client.0.vm03.stdout:4/359: getdents d7/d20/d29/d4e 0 2026-03-09T00:03:49.689 INFO:tasks.workunit.client.1.vm06.stdout:7/549: write d0/df/d1a/d3a/d4e/f63 [369985,36427] 0 2026-03-09T00:03:49.689 INFO:tasks.workunit.client.1.vm06.stdout:7/550: chown d0/df/d1a/d22 299354908 1 2026-03-09T00:03:49.689 INFO:tasks.workunit.client.1.vm06.stdout:7/551: read - d0/df/d17/f74 zero size 2026-03-09T00:03:49.690 INFO:tasks.workunit.client.0.vm03.stdout:4/360: dread d7/d20/f34 [0,4194304] 0 2026-03-09T00:03:49.690 INFO:tasks.workunit.client.0.vm03.stdout:4/361: creat d7/f71 x:0 0 0 2026-03-09T00:03:49.696 INFO:tasks.workunit.client.1.vm06.stdout:7/552: symlink d0/df/d1a/d27/d70/l9c 0 2026-03-09T00:03:49.699 INFO:tasks.workunit.client.0.vm03.stdout:4/362: unlink d7/d20/d35/d66/f6c 0 2026-03-09T00:03:49.699 INFO:tasks.workunit.client.0.vm03.stdout:4/363: chown d7/f28 0 1 2026-03-09T00:03:49.699 INFO:tasks.workunit.client.0.vm03.stdout:4/364: stat d7/d20/d29/d54/d58 0 2026-03-09T00:03:49.705 INFO:tasks.workunit.client.0.vm03.stdout:6/284: rmdir d13/d35/d4c 39 2026-03-09T00:03:49.705 INFO:tasks.workunit.client.0.vm03.stdout:6/285: truncate d13/f3a 974789 0 2026-03-09T00:03:49.705 INFO:tasks.workunit.client.1.vm06.stdout:7/553: write d0/f7 [2501333,24852] 0 2026-03-09T00:03:49.709 INFO:tasks.workunit.client.1.vm06.stdout:7/554: mknod d0/d55/d99/c9d 0 2026-03-09T00:03:49.713 INFO:tasks.workunit.client.1.vm06.stdout:7/555: creat d0/df/d1a/d22/f9e x:0 0 0 2026-03-09T00:03:49.713 INFO:tasks.workunit.client.1.vm06.stdout:7/556: dread - d0/df/d1a/f72 zero size 2026-03-09T00:03:49.713 INFO:tasks.workunit.client.0.vm03.stdout:6/286: dread d13/f17 [0,4194304] 0 2026-03-09T00:03:49.713 INFO:tasks.workunit.client.1.vm06.stdout:7/557: write d0/df/d1a/d3a/d4e/f63 [2090044,92271] 0 2026-03-09T00:03:49.713 
INFO:tasks.workunit.client.1.vm06.stdout:7/558: readlink d0/l9a 0 2026-03-09T00:03:49.713 INFO:tasks.workunit.client.1.vm06.stdout:7/559: fsync d0/df/d1a/d35/d62/f81 0 2026-03-09T00:03:49.713 INFO:tasks.workunit.client.1.vm06.stdout:7/560: fsync d0/df/d17/f38 0 2026-03-09T00:03:49.714 INFO:tasks.workunit.client.1.vm06.stdout:7/561: getdents d0/df/d1a/d27/d4c/d40/d51 0 2026-03-09T00:03:49.714 INFO:tasks.workunit.client.1.vm06.stdout:7/562: chown d0/df/d1a/d3a/f83 1 1 2026-03-09T00:03:49.714 INFO:tasks.workunit.client.1.vm06.stdout:7/563: fsync d0/df/d1a/f25 0 2026-03-09T00:03:49.714 INFO:tasks.workunit.client.1.vm06.stdout:7/564: getdents d0/df/d17 0 2026-03-09T00:03:49.714 INFO:tasks.workunit.client.1.vm06.stdout:7/565: creat d0/df/d1a/d27/d70/d9b/f9f x:0 0 0 2026-03-09T00:03:49.716 INFO:tasks.workunit.client.1.vm06.stdout:7/566: symlink d0/d39/la0 0 2026-03-09T00:03:49.758 INFO:tasks.workunit.client.1.vm06.stdout:3/604: rename d11/f20 to d11/d28/d2e/d7e/fd3 0 2026-03-09T00:03:49.760 INFO:tasks.workunit.client.1.vm06.stdout:1/458: rename d6/d4c/d71/f4b to d6/d63/f99 0 2026-03-09T00:03:49.761 INFO:tasks.workunit.client.1.vm06.stdout:8/517: dwrite db/d74/faa [0,4194304] 0 2026-03-09T00:03:49.761 INFO:tasks.workunit.client.1.vm06.stdout:1/459: creat d6/d21/d2d/d3b/d42/f9a x:0 0 0 2026-03-09T00:03:49.762 INFO:tasks.workunit.client.0.vm03.stdout:4/365: dwrite f4 [0,4194304] 0 2026-03-09T00:03:49.763 INFO:tasks.workunit.client.1.vm06.stdout:8/518: mknod db/d74/d87/cad 0 2026-03-09T00:03:49.764 INFO:tasks.workunit.client.1.vm06.stdout:9/442: dwrite d1/d4/f6 [0,4194304] 0 2026-03-09T00:03:49.777 INFO:tasks.workunit.client.1.vm06.stdout:9/443: mknod d1/d3/d50/c8d 0 2026-03-09T00:03:49.777 INFO:tasks.workunit.client.1.vm06.stdout:9/444: chown d1/d3/d4f/d52/l7c 16 1 2026-03-09T00:03:49.777 INFO:tasks.workunit.client.1.vm06.stdout:9/445: write d1/d4/d6e/d14/d25/f4e [166231,62176] 0 2026-03-09T00:03:49.780 INFO:tasks.workunit.client.0.vm03.stdout:1/386: mkdir d4/d3a/d84 0 2026-03-09T00:03:49.780 INFO:tasks.workunit.client.0.vm03.stdout:1/387: chown d4/d3a/d61/d78/d81 140214845 1 2026-03-09T00:03:49.780 INFO:tasks.workunit.client.1.vm06.stdout:9/446: symlink d1/d3/d50/l8e 0 2026-03-09T00:03:49.780 INFO:tasks.workunit.client.0.vm03.stdout:1/388: mknod d4/d3a/d3d/d46/c85 0 2026-03-09T00:03:49.780 INFO:tasks.workunit.client.0.vm03.stdout:1/389: fdatasync d4/d15/d5c/d6c/f71 0 2026-03-09T00:03:49.780 INFO:tasks.workunit.client.0.vm03.stdout:1/390: fdatasync d4/f7d 0 2026-03-09T00:03:49.782 INFO:tasks.workunit.client.0.vm03.stdout:1/391: mkdir d4/d15/d86 0 2026-03-09T00:03:49.790 INFO:tasks.workunit.client.0.vm03.stdout:1/392: truncate f1 827148 0 2026-03-09T00:03:49.790 INFO:tasks.workunit.client.0.vm03.stdout:1/393: chown d4/l7 12333 1 2026-03-09T00:03:49.791 INFO:tasks.workunit.client.0.vm03.stdout:1/394: unlink d4/l72 0 2026-03-09T00:03:49.793 INFO:tasks.workunit.client.0.vm03.stdout:0/296: sync 2026-03-09T00:03:49.793 INFO:tasks.workunit.client.0.vm03.stdout:7/312: sync 2026-03-09T00:03:49.793 INFO:tasks.workunit.client.0.vm03.stdout:0/297: chown d2/d1f/f3f 3 1 2026-03-09T00:03:49.793 INFO:tasks.workunit.client.0.vm03.stdout:5/327: rmdir d1c/d20/d55/d66 39 2026-03-09T00:03:49.793 INFO:tasks.workunit.client.0.vm03.stdout:5/328: getdents d1c/d20/d55/d43 0 2026-03-09T00:03:49.793 INFO:tasks.workunit.client.0.vm03.stdout:5/329: chown d1c/d20/d55/d43/f53 7124 1 2026-03-09T00:03:49.793 INFO:tasks.workunit.client.0.vm03.stdout:5/330: write d1c/d20/d55/f5a [800646,50787] 0 2026-03-09T00:03:49.793 
INFO:tasks.workunit.client.0.vm03.stdout:5/331: dread - d1c/d51/f68 zero size 2026-03-09T00:03:49.793 INFO:tasks.workunit.client.0.vm03.stdout:2/270: sync 2026-03-09T00:03:49.793 INFO:tasks.workunit.client.0.vm03.stdout:2/271: dread - d8/d1b/f3d zero size 2026-03-09T00:03:49.793 INFO:tasks.workunit.client.0.vm03.stdout:2/272: read - d8/d1b/f22 zero size 2026-03-09T00:03:49.800 INFO:tasks.workunit.client.0.vm03.stdout:9/322: rename d15/d1c/c57 to d15/d1c/d36/d4d/c68 0 2026-03-09T00:03:49.805 INFO:tasks.workunit.client.0.vm03.stdout:9/323: fdatasync d15/d1c/d28/f39 0 2026-03-09T00:03:49.805 INFO:tasks.workunit.client.0.vm03.stdout:6/287: mknod d13/d1e/c64 0 2026-03-09T00:03:49.805 INFO:tasks.workunit.client.0.vm03.stdout:6/288: truncate d13/f17 1121416 0 2026-03-09T00:03:49.805 INFO:tasks.workunit.client.0.vm03.stdout:2/273: link d8/f3e d8/f59 0 2026-03-09T00:03:49.805 INFO:tasks.workunit.client.0.vm03.stdout:2/274: write d8/d1b/d24/f46 [497057,21465] 0 2026-03-09T00:03:49.807 INFO:tasks.workunit.client.0.vm03.stdout:1/395: dread d4/d15/d5c/f6f [0,4194304] 0 2026-03-09T00:03:49.810 INFO:tasks.workunit.client.0.vm03.stdout:9/324: rename d15/f1f to d15/d1c/d21/d54/f69 0 2026-03-09T00:03:49.821 INFO:tasks.workunit.client.0.vm03.stdout:2/275: truncate d8/d1b/f1f 1268077 0 2026-03-09T00:03:49.821 INFO:tasks.workunit.client.0.vm03.stdout:2/276: write d8/d17/f27 [5083880,79319] 0 2026-03-09T00:03:49.822 INFO:tasks.workunit.client.0.vm03.stdout:5/332: dread f12 [0,4194304] 0 2026-03-09T00:03:49.823 INFO:tasks.workunit.client.1.vm06.stdout:2/610: rmdir d7/da/d55 0 2026-03-09T00:03:49.825 INFO:tasks.workunit.client.0.vm03.stdout:1/396: mkdir d4/d3a/d32/d87 0 2026-03-09T00:03:49.825 INFO:tasks.workunit.client.0.vm03.stdout:2/277: creat d8/d26/f5a x:0 0 0 2026-03-09T00:03:49.826 INFO:tasks.workunit.client.0.vm03.stdout:5/333: mknod d1c/d20/d55/d43/c6e 0 2026-03-09T00:03:49.826 INFO:tasks.workunit.client.1.vm06.stdout:2/611: mkdir d7/d1b/d71/d79/db4 0 2026-03-09T00:03:49.827 INFO:tasks.workunit.client.1.vm06.stdout:4/472: sync 2026-03-09T00:03:49.827 INFO:tasks.workunit.client.1.vm06.stdout:5/628: sync 2026-03-09T00:03:49.834 INFO:tasks.workunit.client.1.vm06.stdout:2/612: dread d7/d1b/f3b [0,4194304] 0 2026-03-09T00:03:49.837 INFO:tasks.workunit.client.1.vm06.stdout:4/473: dread d17/d21/d4c/d50/f8c [0,4194304] 0 2026-03-09T00:03:49.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:49 vm03.local ceph-mon[52346]: Upgrade: Updating mgr.vm03.yvcons 2026-03-09T00:03:49.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:49 vm03.local ceph-mon[52346]: Deploying daemon mgr.vm03.yvcons on vm03 2026-03-09T00:03:49.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:49 vm03.local ceph-mon[52346]: pgmap v9: 65 pgs: 65 active+clean; 1.8 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 128 MiB/s rd, 154 MiB/s wr, 238 op/s 2026-03-09T00:03:49.838 INFO:tasks.workunit.client.1.vm06.stdout:5/629: truncate d5/d1c/d68/fb4 660505 0 2026-03-09T00:03:49.849 INFO:tasks.workunit.client.1.vm06.stdout:5/630: rename d5/d1c/c37 to d5/d44/d4b/cd6 0 2026-03-09T00:03:49.849 INFO:tasks.workunit.client.0.vm03.stdout:1/397: creat d4/d5e/f88 x:0 0 0 2026-03-09T00:03:49.850 INFO:tasks.workunit.client.0.vm03.stdout:1/398: creat d4/d15/d5c/f89 x:0 0 0 2026-03-09T00:03:49.850 INFO:tasks.workunit.client.0.vm03.stdout:8/289: rename d7/df/d1a/d2b/d43 to d7/df/d1a/d40/d58 0 2026-03-09T00:03:49.850 INFO:tasks.workunit.client.0.vm03.stdout:0/298: rename d2/da/d36/d39/d4b/d61 to d2/da/d36/d39/d4b/d61/d66 22 
2026-03-09T00:03:49.850 INFO:tasks.workunit.client.0.vm03.stdout:0/299: creat d2/da/d36/d39/d4b/f67 x:0 0 0
2026-03-09T00:03:49.852 INFO:tasks.workunit.client.0.vm03.stdout:8/290: rmdir d7/df/d1e 39
2026-03-09T00:03:49.856 INFO:tasks.workunit.client.1.vm06.stdout:4/474: dread f15 [0,4194304] 0
2026-03-09T00:03:49.857 INFO:tasks.workunit.client.1.vm06.stdout:4/475: rename d17/d24/d3b/d54/f58 to d17/d21/d4c/d50/f9c 0
2026-03-09T00:03:49.857 INFO:tasks.workunit.client.1.vm06.stdout:4/476: readlink d17/d24/l67 0
2026-03-09T00:03:49.857 INFO:tasks.workunit.client.1.vm06.stdout:4/477: getdents d17/d24/d3b/d97 0
2026-03-09T00:03:49.857 INFO:tasks.workunit.client.0.vm03.stdout:8/291: dread d7/f15 [0,4194304] 0
2026-03-09T00:03:49.857 INFO:tasks.workunit.client.0.vm03.stdout:8/292: fsync d7/df/d1e/f24 0
2026-03-09T00:03:49.858 INFO:tasks.workunit.client.1.vm06.stdout:2/613: dread d7/da/d1c/f9e [0,4194304] 0
2026-03-09T00:03:49.858 INFO:tasks.workunit.client.1.vm06.stdout:4/478: creat d17/d24/f9d x:0 0 0
2026-03-09T00:03:49.859 INFO:tasks.workunit.client.0.vm03.stdout:8/293: creat d7/df/d1e/d3f/f59 x:0 0 0
2026-03-09T00:03:49.859 INFO:tasks.workunit.client.0.vm03.stdout:8/294: fsync d7/df/f37 0
2026-03-09T00:03:49.859 INFO:tasks.workunit.client.1.vm06.stdout:4/479: creat d17/d24/d3b/d75/f9e x:0 0 0
2026-03-09T00:03:49.860 INFO:tasks.workunit.client.0.vm03.stdout:8/295: truncate d7/df/d1e/f3a 99623 0
2026-03-09T00:03:49.860 INFO:tasks.workunit.client.0.vm03.stdout:8/296: write d7/f18 [187532,107657] 0
2026-03-09T00:03:49.860 INFO:tasks.workunit.client.0.vm03.stdout:8/297: fsync d7/f9 0
2026-03-09T00:03:49.862 INFO:tasks.workunit.client.1.vm06.stdout:2/614: dread d7/d1a/d25/fa3 [0,4194304] 0
2026-03-09T00:03:49.862 INFO:tasks.workunit.client.1.vm06.stdout:4/480: link d17/d24/d3b/c41 d17/d21/d4c/d66/c9f 0
2026-03-09T00:03:49.862 INFO:tasks.workunit.client.0.vm03.stdout:8/298: dread d7/df/d1e/d38/f3e [0,4194304] 0
2026-03-09T00:03:49.864 INFO:tasks.workunit.client.1.vm06.stdout:2/615: rename d7/f17 to d7/da/d4e/d57/fb5 0
2026-03-09T00:03:49.864 INFO:tasks.workunit.client.1.vm06.stdout:2/616: rename d7 to d7/d1a/d3c/db6 22
2026-03-09T00:03:49.869 INFO:tasks.workunit.client.1.vm06.stdout:4/481: mknod d17/d24/ca0 0
2026-03-09T00:03:49.870 INFO:tasks.workunit.client.1.vm06.stdout:4/482: chown d17/d21/f4b 8199 1
2026-03-09T00:03:49.870 INFO:tasks.workunit.client.1.vm06.stdout:2/617: link d7/da/d4e/d57/d9d/fa0 d7/d1a/d89/fb7 0
2026-03-09T00:03:49.876 INFO:tasks.workunit.client.1.vm06.stdout:2/618: write d7/d1b/d31/f90 [6419496,87677] 0
2026-03-09T00:03:49.897 INFO:tasks.workunit.client.1.vm06.stdout:8/519: dwrite db/dd/d85/d9f/f8b [0,4194304] 0
2026-03-09T00:03:49.897 INFO:tasks.workunit.client.0.vm03.stdout:4/366: dwrite d7/d20/d29/d38/f6e [0,4194304] 0
2026-03-09T00:03:49.900 INFO:tasks.workunit.client.0.vm03.stdout:4/367: mknod d7/d20/d29/d4e/c72 0
2026-03-09T00:03:49.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:49 vm06.local ceph-mon[58395]: Upgrade: Updating mgr.vm03.yvcons
2026-03-09T00:03:49.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:49 vm06.local ceph-mon[58395]: Deploying daemon mgr.vm03.yvcons on vm03
2026-03-09T00:03:49.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:49 vm06.local ceph-mon[58395]: pgmap v9: 65 pgs: 65 active+clean; 1.8 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 128 MiB/s rd, 154 MiB/s wr, 238 op/s
2026-03-09T00:03:49.924 INFO:tasks.workunit.client.1.vm06.stdout:8/520: dread db/d53/d70/f54 [0,4194304] 0
2026-03-09T00:03:49.925 INFO:tasks.workunit.client.1.vm06.stdout:8/521: dread db/d53/d5c/f6f [0,4194304] 0
2026-03-09T00:03:49.928 INFO:tasks.workunit.client.0.vm03.stdout:4/368: dread d7/d27/f52 [0,4194304] 0
2026-03-09T00:03:49.928 INFO:tasks.workunit.client.0.vm03.stdout:4/369: fdatasync d7/fe 0
2026-03-09T00:03:49.932 INFO:tasks.workunit.client.1.vm06.stdout:8/522: dread db/dd/d85/d9f/f88 [0,4194304] 0
2026-03-09T00:03:49.960 INFO:tasks.workunit.client.1.vm06.stdout:1/460: dwrite d6/d21/d2d/d37/f78 [0,4194304] 0
2026-03-09T00:03:49.964 INFO:tasks.workunit.client.1.vm06.stdout:1/461: creat d6/d4c/d71/d83/f9b x:0 0 0
2026-03-09T00:03:49.985 INFO:tasks.workunit.client.1.vm06.stdout:9/447: dwrite d1/d4/ff [4194304,4194304] 0
2026-03-09T00:03:49.987 INFO:tasks.workunit.client.0.vm03.stdout:7/313: dwrite d2/f50 [0,4194304] 0
2026-03-09T00:03:49.987 INFO:tasks.workunit.client.0.vm03.stdout:9/325: dwrite d15/d1c/d21/d64/f3d [0,4194304] 0
2026-03-09T00:03:49.987 INFO:tasks.workunit.client.0.vm03.stdout:5/334: dwrite d1c/d20/f39 [0,4194304] 0
2026-03-09T00:03:49.987 INFO:tasks.workunit.client.0.vm03.stdout:5/335: fdatasync fb 0
2026-03-09T00:03:49.998 INFO:tasks.workunit.client.1.vm06.stdout:9/448: rename d1/d4/f6 to d1/d73/f8f 0
2026-03-09T00:03:49.998 INFO:tasks.workunit.client.1.vm06.stdout:9/449: write d1/d4/d6e/d9/f87 [834216,46671] 0
2026-03-09T00:03:49.998 INFO:tasks.workunit.client.1.vm06.stdout:9/450: chown d1/f2a 3865734 1
2026-03-09T00:03:50.003 INFO:tasks.workunit.client.0.vm03.stdout:7/314: truncate d2/d1f/d3a/d31/d37/f56 3943868 0
2026-03-09T00:03:50.003 INFO:tasks.workunit.client.0.vm03.stdout:9/326: link d15/l1a d15/d1c/l6a 0
2026-03-09T00:03:50.010 INFO:tasks.workunit.client.0.vm03.stdout:5/336: dread d1c/d20/d55/f52 [0,4194304] 0
2026-03-09T00:03:50.010 INFO:tasks.workunit.client.0.vm03.stdout:5/337: write d1c/d20/d56/f59 [14963,41742] 0
2026-03-09T00:03:50.010 INFO:tasks.workunit.client.0.vm03.stdout:7/315: mknod d2/d1f/d3a/d24/c61 0
2026-03-09T00:03:50.011 INFO:tasks.workunit.client.0.vm03.stdout:7/316: stat d2/d1f/d3a/d31/f3f 0
2026-03-09T00:03:50.011 INFO:tasks.workunit.client.0.vm03.stdout:5/338: getdents d1c/d20/d55/d4f 0
2026-03-09T00:03:50.011 INFO:tasks.workunit.client.0.vm03.stdout:5/339: readlink d1c/d20/l54 0
2026-03-09T00:03:50.011 INFO:tasks.workunit.client.0.vm03.stdout:5/340: fdatasync d1c/f30 0
2026-03-09T00:03:50.012 INFO:tasks.workunit.client.0.vm03.stdout:5/341: fdatasync d1c/d51/f68 0
2026-03-09T00:03:50.015 INFO:tasks.workunit.client.0.vm03.stdout:5/342: creat d1c/d20/d55/d3b/f6f x:0 0 0
2026-03-09T00:03:50.020 INFO:tasks.workunit.client.0.vm03.stdout:5/343: truncate d1c/d20/d55/d3b/f45 324129 0
2026-03-09T00:03:50.020 INFO:tasks.workunit.client.0.vm03.stdout:5/344: mkdir d1c/d20/d55/d66/d70 0
2026-03-09T00:03:50.020 INFO:tasks.workunit.client.0.vm03.stdout:5/345: fdatasync d1c/d20/d55/f52 0
2026-03-09T00:03:50.060 INFO:tasks.workunit.client.1.vm06.stdout:8/523: dwrite db/d1e/f52 [4194304,4194304] 0
2026-03-09T00:03:50.061 INFO:tasks.workunit.client.1.vm06.stdout:8/524: symlink db/d53/d7c/lae 0
2026-03-09T00:03:50.061 INFO:tasks.workunit.client.1.vm06.stdout:8/525: truncate db/f2d 970631 0
2026-03-09T00:03:50.069 INFO:tasks.workunit.client.1.vm06.stdout:2/619: dwrite d7/d1b/d5a/d86/f8b [0,4194304] 0
2026-03-09T00:03:50.069 INFO:tasks.workunit.client.1.vm06.stdout:9/451: write d1/d3/d2b/f6d [668501,661] 0
2026-03-09T00:03:50.071 INFO:tasks.workunit.client.1.vm06.stdout:2/620: mknod d7/da/d4e/d57/cb8 0
2026-03-09T00:03:50.071 INFO:tasks.workunit.client.1.vm06.stdout:2/621: write d7/da/d1c/f9e [426303,101012] 0
2026-03-09T00:03:50.071 INFO:tasks.workunit.client.1.vm06.stdout:2/622: fdatasync d7/d1a/d56/fa4 0
2026-03-09T00:03:50.072 INFO:tasks.workunit.client.1.vm06.stdout:9/452: rename d1/d4/d6e/d14/d25/f32 to d1/d4/d6e/d14/d25/d85/f90 0
2026-03-09T00:03:50.095 INFO:tasks.workunit.client.1.vm06.stdout:2/623: read d7/d1b/d31/f90 [1678176,5804] 0
2026-03-09T00:03:50.099 INFO:tasks.workunit.client.1.vm06.stdout:2/624: read d7/d1a/d25/d66/f84 [44097,1764] 0
2026-03-09T00:03:50.100 INFO:tasks.workunit.client.1.vm06.stdout:2/625: truncate d7/da/d4e/d57/f7a 1735421 0
2026-03-09T00:03:50.101 INFO:tasks.workunit.client.1.vm06.stdout:2/626: symlink d7/d1a/d56/lb9 0
2026-03-09T00:03:50.102 INFO:tasks.workunit.client.1.vm06.stdout:2/627: creat d7/d1a/d96/fba x:0 0 0
2026-03-09T00:03:50.127 INFO:tasks.workunit.client.1.vm06.stdout:1/462: dwrite d6/f8c [0,4194304] 0
2026-03-09T00:03:50.129 INFO:tasks.workunit.client.1.vm06.stdout:1/463: creat d6/d63/f9c x:0 0 0
2026-03-09T00:03:50.129 INFO:tasks.workunit.client.1.vm06.stdout:1/464: stat d6/f81 0
2026-03-09T00:03:50.129 INFO:tasks.workunit.client.1.vm06.stdout:1/465: chown d6/f28 0 1
2026-03-09T00:03:50.136 INFO:tasks.workunit.client.1.vm06.stdout:2/628: write d7/d1b/f22 [2106665,75090] 0
2026-03-09T00:03:50.138 INFO:tasks.workunit.client.1.vm06.stdout:2/629: rename d7/f4c to d7/d1b/d5a/fbb 0
2026-03-09T00:03:50.138 INFO:tasks.workunit.client.1.vm06.stdout:2/630: chown d7/da/db/de/f49 88 1
2026-03-09T00:03:50.146 INFO:tasks.workunit.client.1.vm06.stdout:2/631: write d7/da/db/f74 [719162,26607] 0
2026-03-09T00:03:50.153 INFO:tasks.workunit.client.1.vm06.stdout:2/632: symlink d7/d1a/d25/d66/d87/da8/db2/lbc 0
2026-03-09T00:03:50.153 INFO:tasks.workunit.client.1.vm06.stdout:2/633: write d7/da/db/f6e [2350472,73715] 0
2026-03-09T00:03:50.153 INFO:tasks.workunit.client.1.vm06.stdout:4/483: dwrite d17/d24/f9d [0,4194304] 0
2026-03-09T00:03:50.153 INFO:tasks.workunit.client.1.vm06.stdout:4/484: dread - d17/d21/d4c/d66/f7b zero size
2026-03-09T00:03:50.153 INFO:tasks.workunit.client.1.vm06.stdout:4/485: rename d17/d24/d3b/d5e to d17/d24/d3b/d5e/d6e/da1 22
2026-03-09T00:03:50.158 INFO:tasks.workunit.client.1.vm06.stdout:4/486: creat d17/d21/d4c/d66/fa2 x:0 0 0
2026-03-09T00:03:50.163 INFO:tasks.workunit.client.1.vm06.stdout:4/487: rename d17/d24/d3b/d5e/d7a/c8d to d17/d21/d32/ca3 0
2026-03-09T00:03:50.169 INFO:tasks.workunit.client.1.vm06.stdout:4/488: write d17/d24/d49/d5f/f6b [6739,44239] 0
2026-03-09T00:03:50.171 INFO:tasks.workunit.client.1.vm06.stdout:5/631: dwrite d5/ff [8388608,4194304] 0
2026-03-09T00:03:50.171 INFO:tasks.workunit.client.1.vm06.stdout:5/632: fsync d5/d1c/f62 0
2026-03-09T00:03:50.171 INFO:tasks.workunit.client.1.vm06.stdout:5/633: truncate d5/d1c/d21/d28/d5e/fc4 620000 0
2026-03-09T00:03:50.172 INFO:tasks.workunit.client.1.vm06.stdout:4/489: write d17/d24/d49/f65 [502618,103539] 0
2026-03-09T00:03:50.172 INFO:tasks.workunit.client.1.vm06.stdout:8/526: dwrite db/f1d [0,4194304] 0
2026-03-09T00:03:50.173 INFO:tasks.workunit.client.0.vm03.stdout:9/327: dwrite d15/d1c/d28/f5b [0,4194304] 0
2026-03-09T00:03:50.173 INFO:tasks.workunit.client.0.vm03.stdout:9/328: creat d15/d1c/d36/d4d/f6b x:0 0 0
2026-03-09T00:03:50.173 INFO:tasks.workunit.client.0.vm03.stdout:5/346: dwrite d1c/f37 [0,4194304] 0
2026-03-09T00:03:50.174 INFO:tasks.workunit.client.0.vm03.stdout:4/370: dwrite d7/d20/d29/d38/d3a/f50 [0,4194304] 0
2026-03-09T00:03:50.174 INFO:tasks.workunit.client.0.vm03.stdout:4/371: write d7/d20/d29/f43 [306987,58321] 0
2026-03-09T00:03:50.177 INFO:tasks.workunit.client.0.vm03.stdout:5/347: dread d1c/d20/d55/f3d [0,4194304] 0
2026-03-09T00:03:50.186 INFO:tasks.workunit.client.0.vm03.stdout:4/372: dread d7/d20/d29/d38/d3a/f50 [0,4194304] 0
2026-03-09T00:03:50.186 INFO:tasks.workunit.client.0.vm03.stdout:2/278: dwrite d8/d1b/d24/f46 [0,4194304] 0
2026-03-09T00:03:50.186 INFO:tasks.workunit.client.0.vm03.stdout:2/279: chown d8/d1b/d2a/d2e 14869135 1
2026-03-09T00:03:50.186 INFO:tasks.workunit.client.0.vm03.stdout:2/280: fsync d8/d17/f45 0
2026-03-09T00:03:50.191 INFO:tasks.workunit.client.1.vm06.stdout:5/634: symlink d5/d1c/d23/d34/d47/dcf/ld7 0
2026-03-09T00:03:50.194 INFO:tasks.workunit.client.0.vm03.stdout:3/241: sync
2026-03-09T00:03:50.194 INFO:tasks.workunit.client.1.vm06.stdout:4/490: truncate d17/d24/d49/f2a 2081051 0
2026-03-09T00:03:50.194 INFO:tasks.workunit.client.1.vm06.stdout:4/491: readlink d17/l59 0
2026-03-09T00:03:50.201 INFO:tasks.workunit.client.0.vm03.stdout:9/329: symlink d15/d1c/d36/l6c 0
2026-03-09T00:03:50.201 INFO:tasks.workunit.client.0.vm03.stdout:9/330: stat d15/d1c/d21/d64/f50 0
2026-03-09T00:03:50.201 INFO:tasks.workunit.client.0.vm03.stdout:9/331: chown d15/f26 916 1
2026-03-09T00:03:50.209 INFO:tasks.workunit.client.1.vm06.stdout:9/453: dwrite d1/d4/d6e/d14/d25/f70 [0,4194304] 0
2026-03-09T00:03:50.212 INFO:tasks.workunit.client.1.vm06.stdout:8/527: truncate db/d1e/f34 2416574 0
2026-03-09T00:03:50.213 INFO:tasks.workunit.client.1.vm06.stdout:5/635: truncate d5/d1c/d21/d28/d5e/d66/f8a 4152259 0
2026-03-09T00:03:50.219 INFO:tasks.workunit.client.1.vm06.stdout:4/492: rename d17/f81 to d17/d21/d32/d92/fa4 0
2026-03-09T00:03:50.219 INFO:tasks.workunit.client.1.vm06.stdout:4/493: creat d17/d24/d3b/d54/fa5 x:0 0 0
2026-03-09T00:03:50.219 INFO:tasks.workunit.client.1.vm06.stdout:1/466: dwrite f0 [0,4194304] 0
2026-03-09T00:03:50.219 INFO:tasks.workunit.client.1.vm06.stdout:1/467: write d6/d4c/d71/d83/f9b [544185,7319] 0
2026-03-09T00:03:50.219 INFO:tasks.workunit.client.1.vm06.stdout:1/468: fsync d6/d21/d2d/d3b/d42/f4e 0
2026-03-09T00:03:50.219 INFO:tasks.workunit.client.1.vm06.stdout:1/469: write d6/d4c/f90 [493833,102593] 0
2026-03-09T00:03:50.223 INFO:tasks.workunit.client.1.vm06.stdout:2/634: dwrite d7/da/d4e/d57/fb5 [0,4194304] 0
2026-03-09T00:03:50.230 INFO:tasks.workunit.client.0.vm03.stdout:5/348: stat d1c/c1d 0
2026-03-09T00:03:50.233 INFO:tasks.workunit.client.0.vm03.stdout:5/349: dread d1c/d20/f33 [0,4194304] 0
2026-03-09T00:03:50.233 INFO:tasks.workunit.client.0.vm03.stdout:5/350: readlink d1c/d20/l24 0
2026-03-09T00:03:50.233 INFO:tasks.workunit.client.0.vm03.stdout:5/351: write d1c/f4c [2626429,55179] 0
2026-03-09T00:03:50.237 INFO:tasks.workunit.client.0.vm03.stdout:2/281: link d8/d1b/d2a/d56/l58 d8/d1b/d2a/d42/d4b/l5b 0
2026-03-09T00:03:50.253 INFO:tasks.workunit.client.0.vm03.stdout:3/242: creat d2/db/d3b/d3f/f46 x:0 0 0
2026-03-09T00:03:50.253 INFO:tasks.workunit.client.1.vm06.stdout:5/636: unlink d5/d44/d4b/d92/d49/c77 0
2026-03-09T00:03:50.253 INFO:tasks.workunit.client.1.vm06.stdout:4/494: creat d17/d21/fa6 x:0 0 0
2026-03-09T00:03:50.268 INFO:tasks.workunit.client.1.vm06.stdout:1/470: mkdir d6/d21/d2d/d3b/d87/d9d 0
2026-03-09T00:03:50.270 INFO:tasks.workunit.client.0.vm03.stdout:9/332: stat d15/d1c/d21/d64/l32 0
2026-03-09T00:03:50.279 INFO:tasks.workunit.client.0.vm03.stdout:4/373: getdents d7/d20/d35/d66 0
2026-03-09T00:03:50.279 INFO:tasks.workunit.client.1.vm06.stdout:9/454: truncate d1/d4/d2f/f43 912862 0
2026-03-09T00:03:50.283 INFO:tasks.workunit.client.0.vm03.stdout:6/289: dwrite d13/d1e/f34 [0,4194304] 0
2026-03-09T00:03:50.289 INFO:tasks.workunit.client.0.vm03.stdout:6/290: dread d13/d1e/f2d [0,4194304] 0
2026-03-09T00:03:50.290 INFO:tasks.workunit.client.1.vm06.stdout:5/637: mknod d5/d44/cd8 0
2026-03-09T00:03:50.295 INFO:tasks.workunit.client.0.vm03.stdout:2/282: mknod d8/d26/c5c 0
2026-03-09T00:03:50.308 INFO:tasks.workunit.client.1.vm06.stdout:1/471: creat d6/d21/d2d/d3b/d87/f9e x:0 0 0
2026-03-09T00:03:50.314 INFO:tasks.workunit.client.0.vm03.stdout:3/243: symlink d2/db/l47 0
2026-03-09T00:03:50.320 INFO:tasks.workunit.client.1.vm06.stdout:0/531: sync
2026-03-09T00:03:50.320 INFO:tasks.workunit.client.1.vm06.stdout:6/557: sync
2026-03-09T00:03:50.320 INFO:tasks.workunit.client.1.vm06.stdout:6/558: fsync d4/d27/f70 0
2026-03-09T00:03:50.320 INFO:tasks.workunit.client.0.vm03.stdout:5/352: dwrite d1c/d20/f39 [0,4194304] 0
2026-03-09T00:03:50.323 INFO:tasks.workunit.client.1.vm06.stdout:9/455: truncate d1/f45 434617 0
2026-03-09T00:03:50.323 INFO:tasks.workunit.client.1.vm06.stdout:9/456: write d1/d4/d6e/d14/d25/d85/f90 [5228452,34445] 0
2026-03-09T00:03:50.323 INFO:tasks.workunit.client.1.vm06.stdout:9/457: stat d1/d3/d50/l89 0
2026-03-09T00:03:50.323 INFO:tasks.workunit.client.1.vm06.stdout:9/458: chown d1/d4/d6e/f5d 50 1
2026-03-09T00:03:50.323 INFO:tasks.workunit.client.1.vm06.stdout:9/459: chown d1/d4/l42 62345278 1
2026-03-09T00:03:50.324 INFO:tasks.workunit.client.0.vm03.stdout:1/399: getdents d4/d3a/d32 0
2026-03-09T00:03:50.325 INFO:tasks.workunit.client.0.vm03.stdout:4/374: creat d7/d20/d29/d4e/f73 x:0 0 0
2026-03-09T00:03:50.325 INFO:tasks.workunit.client.0.vm03.stdout:4/375: write d7/d20/f21 [236752,39794] 0
2026-03-09T00:03:50.330 INFO:tasks.workunit.client.1.vm06.stdout:5/638: mkdir d5/d1c/d21/d28/d5e/d66/d78/dc8/dc3/dd9 0
2026-03-09T00:03:50.339 INFO:tasks.workunit.client.1.vm06.stdout:5/639: creat d5/d44/d4b/d92/d49/da0/fda x:0 0 0
2026-03-09T00:03:50.340 INFO:tasks.workunit.client.1.vm06.stdout:5/640: chown d5/d44/d4b/c80 45762062 1
2026-03-09T00:03:50.340 INFO:tasks.workunit.client.1.vm06.stdout:8/528: dwrite db/dd/d84/f8d [0,4194304] 0
2026-03-09T00:03:50.340 INFO:tasks.workunit.client.1.vm06.stdout:9/460: dread d1/d4/d6e/d9/f10 [0,4194304] 0
2026-03-09T00:03:50.340 INFO:tasks.workunit.client.1.vm06.stdout:9/461: write d1/d4/d6e/d9/f87 [346637,101530] 0
2026-03-09T00:03:50.343 INFO:tasks.workunit.client.1.vm06.stdout:8/529: dread db/dd/d85/d9f/f8b [0,4194304] 0
2026-03-09T00:03:50.345 INFO:tasks.workunit.client.1.vm06.stdout:4/495: dwrite d17/d24/d49/f65 [0,4194304] 0
2026-03-09T00:03:50.345 INFO:tasks.workunit.client.1.vm06.stdout:4/496: dread - d17/d21/d32/d92/fa4 zero size
2026-03-09T00:03:50.345 INFO:tasks.workunit.client.1.vm06.stdout:4/497: chown d17/d5b/d8f/c95 1497681358 1
2026-03-09T00:03:50.356 INFO:tasks.workunit.client.1.vm06.stdout:4/498: write d17/d21/f2f [661205,121358] 0
2026-03-09T00:03:50.359 INFO:tasks.workunit.client.1.vm06.stdout:2/635: dwrite d7/d1b/d31/fab [0,4194304] 0
2026-03-09T00:03:50.359 INFO:tasks.workunit.client.1.vm06.stdout:4/499: creat d17/d24/d3b/d75/fa7 x:0 0 0
2026-03-09T00:03:50.359 INFO:tasks.workunit.client.1.vm06.stdout:0/532: unlink d3/d18/d3c/fa2 0
2026-03-09T00:03:50.362 INFO:tasks.workunit.client.1.vm06.stdout:2/636: dread d7/da/db/de/f49 [0,4194304] 0
2026-03-09T00:03:50.370 INFO:tasks.workunit.client.1.vm06.stdout:6/559: unlink d4/f40 0
2026-03-09T00:03:50.379 INFO:tasks.workunit.client.0.vm03.stdout:6/291: rename d13 to d13/d1e/d65 22
2026-03-09T00:03:50.380 INFO:tasks.workunit.client.0.vm03.stdout:2/283: creat d8/f5d x:0 0 0
2026-03-09T00:03:50.380 INFO:tasks.workunit.client.0.vm03.stdout:2/284: chown d8/d1b/d2a/d56 1786961 1
2026-03-09T00:03:50.380 INFO:tasks.workunit.client.0.vm03.stdout:2/285: write d8/d17/f1c [3656518,70248] 0
2026-03-09T00:03:50.384 INFO:tasks.workunit.client.1.vm06.stdout:1/472: dwrite d6/d21/d2d/d3b/d42/f9a [0,4194304] 0
2026-03-09T00:03:50.385 INFO:tasks.workunit.client.0.vm03.stdout:2/286: dread d8/f9 [0,4194304] 0
2026-03-09T00:03:50.385 INFO:tasks.workunit.client.0.vm03.stdout:2/287: fsync d8/d1b/f32 0
2026-03-09T00:03:50.385 INFO:tasks.workunit.client.0.vm03.stdout:2/288: truncate d8/d26/f5a 796304 0
2026-03-09T00:03:50.385 INFO:tasks.workunit.client.0.vm03.stdout:2/289: stat d8/d26/f5a 0
2026-03-09T00:03:50.391 INFO:tasks.workunit.client.0.vm03.stdout:3/244: dread d2/f8 [0,4194304] 0
2026-03-09T00:03:50.391 INFO:tasks.workunit.client.1.vm06.stdout:5/641: symlink d5/d1c/d23/d34/d47/ldb 0
2026-03-09T00:03:50.397 INFO:tasks.workunit.client.1.vm06.stdout:8/530: dwrite db/d53/d70/f75 [0,4194304] 0
2026-03-09T00:03:50.397 INFO:tasks.workunit.client.0.vm03.stdout:1/400: dwrite d4/d15/f18 [4194304,4194304] 0
2026-03-09T00:03:50.397 INFO:tasks.workunit.client.0.vm03.stdout:8/299: truncate d7/df/d1e/f3a 791550 0
2026-03-09T00:03:50.397 INFO:tasks.workunit.client.1.vm06.stdout:5/642: write d5/f6b [2151956,82454] 0
2026-03-09T00:03:50.397 INFO:tasks.workunit.client.1.vm06.stdout:5/643: write d5/d44/d4b/d92/d95/f9c [228530,59608] 0
2026-03-09T00:03:50.402 INFO:tasks.workunit.client.1.vm06.stdout:5/644: dread d5/d44/f4a [0,4194304] 0
2026-03-09T00:03:50.408 INFO:tasks.workunit.client.1.vm06.stdout:9/462: truncate d1/d4/d6e/d9/f10 4320520 0
2026-03-09T00:03:50.408 INFO:tasks.workunit.client.1.vm06.stdout:9/463: dread - d1/d3/d2b/d58/f86 zero size
2026-03-09T00:03:50.408 INFO:tasks.workunit.client.0.vm03.stdout:4/376: link d7/f15 d7/d20/d29/d4e/f74 0
2026-03-09T00:03:50.410 INFO:tasks.workunit.client.1.vm06.stdout:3/605: sync
2026-03-09T00:03:50.414 INFO:tasks.workunit.client.1.vm06.stdout:7/567: sync
2026-03-09T00:03:50.417 INFO:tasks.workunit.client.0.vm03.stdout:6/292: creat d13/d1e/d44/d59/f66 x:0 0 0
2026-03-09T00:03:50.427 INFO:tasks.workunit.client.0.vm03.stdout:2/290: mkdir d8/d26/d5e 0
2026-03-09T00:03:50.427 INFO:tasks.workunit.client.0.vm03.stdout:2/291: stat d8/d1b/d2a/d42/d4b/c4d 0
2026-03-09T00:03:50.427 INFO:tasks.workunit.client.0.vm03.stdout:2/292: chown d8/d26/c53 329840 1
2026-03-09T00:03:50.444 INFO:tasks.workunit.client.1.vm06.stdout:4/500: link d17/f19 d17/d5b/d8f/fa8 0
2026-03-09T00:03:50.444 INFO:tasks.workunit.client.1.vm06.stdout:4/501: truncate d17/f1d 4830898 0
2026-03-09T00:03:50.444 INFO:tasks.workunit.client.1.vm06.stdout:4/502: fdatasync d17/d24/f29 0
2026-03-09T00:03:50.446 INFO:tasks.workunit.client.1.vm06.stdout:0/533: stat d3/d18/d1f/d44/f7c 0
2026-03-09T00:03:50.447 INFO:tasks.workunit.client.0.vm03.stdout:1/401: dwrite d4/f7d [0,4194304] 0
2026-03-09T00:03:50.447 INFO:tasks.workunit.client.0.vm03.stdout:1/402: creat d4/d15/f8a x:0 0 0
2026-03-09T00:03:50.458 INFO:tasks.workunit.client.1.vm06.stdout:2/637: creat d7/d1a/d25/fbd x:0 0 0
2026-03-09T00:03:50.466 INFO:tasks.workunit.client.0.vm03.stdout:8/300: mkdir d7/df/d1e/d5a 0
2026-03-09T00:03:50.485 INFO:tasks.workunit.client.0.vm03.stdout:4/377: mknod d7/c75 0
2026-03-09T00:03:50.492 INFO:tasks.workunit.client.1.vm06.stdout:7/568: dwrite d0/df/d17/f1f [0,4194304] 0
2026-03-09T00:03:50.492 INFO:tasks.workunit.client.1.vm06.stdout:7/569: truncate d0/df/d1a/d27/f37 1379996 0
2026-03-09T00:03:50.495 INFO:tasks.workunit.client.1.vm06.stdout:6/560: write d4/f22 [2762318,112180] 0
2026-03-09T00:03:50.503 INFO:tasks.workunit.client.0.vm03.stdout:6/293: symlink d13/l67 0
2026-03-09T00:03:50.504 INFO:tasks.workunit.client.0.vm03.stdout:5/353: dwrite d1c/f30 [0,4194304] 0
2026-03-09T00:03:50.520 INFO:tasks.workunit.client.1.vm06.stdout:1/473: getdents d6/d21 0
2026-03-09T00:03:50.520 INFO:tasks.workunit.client.1.vm06.stdout:1/474: creat d6/d21/d2d/d3b/d42/f9f x:0 0 0
2026-03-09T00:03:50.522 INFO:tasks.workunit.client.0.vm03.stdout:1/403: dwrite f1 [0,4194304] 0
2026-03-09T00:03:50.523 INFO:tasks.workunit.client.1.vm06.stdout:8/531: symlink db/d53/d5c/laf 0
2026-03-09T00:03:50.523 INFO:tasks.workunit.client.1.vm06.stdout:8/532: readlink db/dd/d24/da7/la9 0
2026-03-09T00:03:50.524 INFO:tasks.workunit.client.0.vm03.stdout:3/245: dwrite d2/f30 [0,4194304] 0
2026-03-09T00:03:50.524 INFO:tasks.workunit.client.0.vm03.stdout:3/246: read - d2/db/d2d/f45 zero size
2026-03-09T00:03:50.524 INFO:tasks.workunit.client.0.vm03.stdout:3/247: fdatasync d2/db/f1a 0
2026-03-09T00:03:50.524 INFO:tasks.workunit.client.0.vm03.stdout:3/248: fdatasync f1 0
2026-03-09T00:03:50.524 INFO:tasks.workunit.client.0.vm03.stdout:3/249: chown d2/db/c18 4 1
2026-03-09T00:03:50.527 INFO:tasks.workunit.client.1.vm06.stdout:1/475: dread d6/d4c/f90 [0,4194304] 0
2026-03-09T00:03:50.527 INFO:tasks.workunit.client.1.vm06.stdout:1/476: chown d6/d21/d2d/d3b/d42 29 1
2026-03-09T00:03:50.529 INFO:tasks.workunit.client.0.vm03.stdout:4/378: dwrite d7/d20/d29/d54/d58/f6b [0,4194304] 0
2026-03-09T00:03:50.529 INFO:tasks.workunit.client.0.vm03.stdout:6/294: unlink d13/d35/l3b 0
2026-03-09T00:03:50.529 INFO:tasks.workunit.client.1.vm06.stdout:5/645: symlink d5/d1c/d21/d28/d5e/d66/d78/dd5/ldc 0
2026-03-09T00:03:50.533 INFO:tasks.workunit.client.1.vm06.stdout:1/477: dread d6/f19 [0,4194304] 0
2026-03-09T00:03:50.554 INFO:tasks.workunit.client.0.vm03.stdout:5/354: creat d1c/d20/d55/d66/d70/f71 x:0 0 0
2026-03-09T00:03:50.557 INFO:tasks.workunit.client.0.vm03.stdout:5/355: dread d1c/f29 [0,4194304] 0
2026-03-09T00:03:50.557 INFO:tasks.workunit.client.0.vm03.stdout:5/356: chown d1c/d20/d55/d43/f53 79317 1
2026-03-09T00:03:50.557 INFO:tasks.workunit.client.0.vm03.stdout:5/357: creat d1c/d20/d55/d66/d70/f72 x:0 0 0
2026-03-09T00:03:50.562 INFO:tasks.workunit.client.1.vm06.stdout:3/606: mknod d11/d28/d2e/d2f/dc1/cd4 0
2026-03-09T00:03:50.562 INFO:tasks.workunit.client.1.vm06.stdout:3/607: chown d11/d28/d2e/d2f/d5b/d5f/c72 3073 1
2026-03-09T00:03:50.575 INFO:tasks.workunit.client.1.vm06.stdout:0/534: readlink d3/d18/d1f/d39/d69/l95 0
2026-03-09T00:03:50.575 INFO:tasks.workunit.client.1.vm06.stdout:5/646: dwrite d5/d1c/d23/f82 [0,4194304] 0
2026-03-09T00:03:50.575 INFO:tasks.workunit.client.1.vm06.stdout:5/647: getdents d5/db1/dcc 0
2026-03-09T00:03:50.575 INFO:tasks.workunit.client.1.vm06.stdout:5/648: dread - d5/d44/d4b/d92/d49/da0/fda zero size
2026-03-09T00:03:50.582 INFO:tasks.workunit.client.0.vm03.stdout:1/404: unlink d4/d3a/d32/f68 0
2026-03-09T00:03:50.582 INFO:tasks.workunit.client.0.vm03.stdout:1/405: read - d4/d3a/d32/f4f zero size
2026-03-09T00:03:50.584 INFO:tasks.workunit.client.1.vm06.stdout:0/535: dread d3/f1a [0,4194304] 0
2026-03-09T00:03:50.584 INFO:tasks.workunit.client.1.vm06.stdout:4/503: dwrite d17/d24/d49/f2a [0,4194304] 0
2026-03-09T00:03:50.586 INFO:tasks.workunit.client.1.vm06.stdout:5/649: write d5/f19 [1990714,88496] 0
2026-03-09T00:03:50.587 INFO:tasks.workunit.client.1.vm06.stdout:2/638: creat d7/da/d4e/d57/d9d/fbe x:0 0 0
2026-03-09T00:03:50.587 INFO:tasks.workunit.client.1.vm06.stdout:2/639: chown d7/da/d1c 1545590 1
2026-03-09T00:03:50.604 INFO:tasks.workunit.client.0.vm03.stdout:3/250: chown d2/c31 6 1
2026-03-09T00:03:50.607 INFO:tasks.workunit.client.0.vm03.stdout:2/293: rename d8/d1b/d2a/d42 to d8/d26/d5e/d5f 0
2026-03-09T00:03:50.609 INFO:tasks.workunit.client.1.vm06.stdout:7/570: getdents d0/df/d1a/d3a/d4e/d5e 0
2026-03-09T00:03:50.619 INFO:tasks.workunit.client.1.vm06.stdout:3/608: dwrite d11/d28/d2e/d2f/f49 [0,4194304] 0
2026-03-09T00:03:50.623 INFO:tasks.workunit.client.1.vm06.stdout:6/561: mknod d4/d8d/cb0 0
2026-03-09T00:03:50.625 INFO:tasks.workunit.client.0.vm03.stdout:4/379: creat d7/d20/d6a/f76 x:0 0 0
2026-03-09T00:03:50.627 INFO:tasks.workunit.client.1.vm06.stdout:8/533: getdents db/d53/d70/d38/d47 0
2026-03-09T00:03:50.627 INFO:tasks.workunit.client.1.vm06.stdout:8/534: read - db/d53/d7c/f95 zero size
2026-03-09T00:03:50.627 INFO:tasks.workunit.client.1.vm06.stdout:8/535: stat db/dd/d48/f68 0
2026-03-09T00:03:50.634 INFO:tasks.workunit.client.0.vm03.stdout:6/295: creat d13/d35/f68 x:0 0 0
2026-03-09T00:03:50.639 INFO:tasks.workunit.client.1.vm06.stdout:1/478: mkdir d6/d8f/da0 0
2026-03-09T00:03:50.642 INFO:tasks.workunit.client.0.vm03.stdout:5/358: mkdir d1c/d20/d55/d4f/d58/d73 0
2026-03-09T00:03:50.654 INFO:tasks.workunit.client.1.vm06.stdout:0/536: mkdir d3/d18/d2c/d2d/d74/daf 0
2026-03-09T00:03:50.663 INFO:tasks.workunit.client.1.vm06.stdout:4/504: truncate d17/d24/f36 1895068 0
2026-03-09T00:03:50.663 INFO:tasks.workunit.client.1.vm06.stdout:4/505: chown d17/d24/d3b/d75/fa7 0 1
2026-03-09T00:03:50.669 INFO:tasks.workunit.client.0.vm03.stdout:2/294: link d8/d17/f3c d8/d26/d5e/d5f/f60 0
2026-03-09T00:03:50.670 INFO:tasks.workunit.client.0.vm03.stdout:5/359: dwrite d1c/d20/d55/f5a [0,4194304] 0
2026-03-09T00:03:50.670 INFO:tasks.workunit.client.0.vm03.stdout:5/360: write d1c/d20/d55/d4f/f69 [336319,75450] 0
2026-03-09T00:03:50.674 INFO:tasks.workunit.client.0.vm03.stdout:5/361: write fb [602169,130702] 0
2026-03-09T00:03:50.674 INFO:tasks.workunit.client.0.vm03.stdout:4/380: rename d7/d23 to d7/d20/d6a/d77 0
2026-03-09T00:03:50.677 INFO:tasks.workunit.client.0.vm03.stdout:6/296: mkdir d13/d35/d69 0
2026-03-09T00:03:50.679 INFO:tasks.workunit.client.0.vm03.stdout:6/297: dread d13/f17 [0,4194304] 0
2026-03-09T00:03:50.687 INFO:tasks.workunit.client.1.vm06.stdout:7/571: mknod d0/df/d1a/d3a/ca1 0
2026-03-09T00:03:50.689 INFO:tasks.workunit.client.0.vm03.stdout:3/251: mknod d2/c48 0
2026-03-09T00:03:50.690 INFO:tasks.workunit.client.0.vm03.stdout:2/295: mkdir d8/d1b/d2a/d61 0
2026-03-09T00:03:50.691 INFO:tasks.workunit.client.0.vm03.stdout:0/300: sync
2026-03-09T00:03:50.691 INFO:tasks.workunit.client.0.vm03.stdout:7/317: sync
2026-03-09T00:03:50.691 INFO:tasks.workunit.client.0.vm03.stdout:7/318: write d2/d1f/d3a/f29 [4297906,55390] 0
2026-03-09T00:03:50.694 INFO:tasks.workunit.client.1.vm06.stdout:3/609: rmdir d11/d28/d4d 39
2026-03-09T00:03:50.695 INFO:tasks.workunit.client.0.vm03.stdout:5/362: truncate d1c/d20/f33 3118845 0
2026-03-09T00:03:50.697 INFO:tasks.workunit.client.0.vm03.stdout:4/381: mkdir d7/d20/d29/d78 0
2026-03-09T00:03:50.702 INFO:tasks.workunit.client.0.vm03.stdout:3/252: chown d2/c2c 276 1
2026-03-09T00:03:50.709 INFO:tasks.workunit.client.0.vm03.stdout:2/296: truncate d8/d17/f1d 607247 0
2026-03-09T00:03:50.709 INFO:tasks.workunit.client.0.vm03.stdout:2/297: chown d8/d1b/f1e 3394899 1
2026-03-09T00:03:50.710 INFO:tasks.workunit.client.0.vm03.stdout:9/333: sync
2026-03-09T00:03:50.719 INFO:tasks.workunit.client.0.vm03.stdout:0/301: mknod d2/d1f/c68 0
2026-03-09T00:03:50.723 INFO:tasks.workunit.client.0.vm03.stdout:0/302: readlink d2/da/d1a/l5e 0
2026-03-09T00:03:50.724 INFO:tasks.workunit.client.0.vm03.stdout:7/319: rename d2/d4/f34 to d2/d1f/f62 0
2026-03-09T00:03:50.726 INFO:tasks.workunit.client.0.vm03.stdout:4/382: dwrite d7/d20/d29/f43 [0,4194304] 0
2026-03-09T00:03:50.726 INFO:tasks.workunit.client.1.vm06.stdout:9/464: write d1/f16 [1126265,99451] 0
2026-03-09T00:03:50.738 INFO:tasks.workunit.client.0.vm03.stdout:5/363: mkdir d1c/d20/d56/d74 0
2026-03-09T00:03:50.738 INFO:tasks.workunit.client.0.vm03.stdout:5/364: write f15 [1443759,62647] 0
2026-03-09T00:03:50.738 INFO:tasks.workunit.client.0.vm03.stdout:5/365: write d1c/d51/f68 [392769,40775] 0
2026-03-09T00:03:50.748 INFO:tasks.workunit.client.0.vm03.stdout:6/298: getdents d13 0
2026-03-09T00:03:50.749 INFO:tasks.workunit.client.0.vm03.stdout:6/299: creat d13/d35/f6a x:0 0 0
2026-03-09T00:03:50.753 INFO:tasks.workunit.client.0.vm03.stdout:5/366: write d1c/d20/f3e [1195865,60684] 0
2026-03-09T00:03:50.759 INFO:tasks.workunit.client.1.vm06.stdout:9/465: dwrite d1/d4/d6e/d9/f65 [0,4194304] 0
2026-03-09T00:03:50.762 INFO:tasks.workunit.client.0.vm03.stdout:3/253: unlink d2/c31 0
2026-03-09T00:03:50.762 INFO:tasks.workunit.client.0.vm03.stdout:8/301: sync
2026-03-09T00:03:50.766 INFO:tasks.workunit.client.0.vm03.stdout:2/298: mknod d8/d1b/d2a/d56/c62 0
2026-03-09T00:03:50.772 INFO:tasks.workunit.client.0.vm03.stdout:9/334: truncate d15/d1c/d21/d64/f3d 3570682 0
2026-03-09T00:03:50.772 INFO:tasks.workunit.client.0.vm03.stdout:9/335: creat d15/d1c/d36/f6d x:0 0 0
2026-03-09T00:03:50.772 INFO:tasks.workunit.client.0.vm03.stdout:9/336: write fb [5211693,40130] 0
2026-03-09T00:03:50.772 INFO:tasks.workunit.client.0.vm03.stdout:1/406: sync
2026-03-09T00:03:50.772 INFO:tasks.workunit.client.0.vm03.stdout:1/407: dread - d4/f39 zero size
2026-03-09T00:03:50.774 INFO:tasks.workunit.client.0.vm03.stdout:7/320: unlink d2/d1f/d42/d43/c57 0
2026-03-09T00:03:50.787 INFO:tasks.workunit.client.0.vm03.stdout:0/303: rename d2/d1f/f2c to d2/da/dd/d49/f69 0
2026-03-09T00:03:50.804 INFO:tasks.workunit.client.0.vm03.stdout:1/408: dwrite d4/d5e/f88 [0,4194304] 0
2026-03-09T00:03:50.807 INFO:tasks.workunit.client.1.vm06.stdout:8/536: mkdir db/dd/d24/db0 0
2026-03-09T00:03:50.812 INFO:tasks.workunit.client.0.vm03.stdout:4/383: getdents d7/d20/d6a/d77 0
2026-03-09T00:03:50.813 INFO:tasks.workunit.client.1.vm06.stdout:1/479: creat d6/d21/fa1 x:0 0 0
2026-03-09T00:03:50.817 INFO:tasks.workunit.client.1.vm06.stdout:0/537: symlink d3/d18/d2c/d2d/d74/lb0 0
2026-03-09T00:03:50.819 INFO:tasks.workunit.client.0.vm03.stdout:4/384: write d7/d20/d29/d4e/f60 [7931583,73122] 0
2026-03-09T00:03:50.821 INFO:tasks.workunit.client.1.vm06.stdout:5/650: dwrite d5/f3d [0,4194304] 0
2026-03-09T00:03:50.821 INFO:tasks.workunit.client.0.vm03.stdout:6/300: symlink d13/d1e/d44/d4a/l6b 0
2026-03-09T00:03:50.821 INFO:tasks.workunit.client.0.vm03.stdout:6/301: chown f2 58943 1
2026-03-09T00:03:50.821 INFO:tasks.workunit.client.0.vm03.stdout:6/302: creat d13/d1e/d44/d59/f6c x:0 0 0
2026-03-09T00:03:50.827 INFO:tasks.workunit.client.1.vm06.stdout:4/506: mknod d17/d21/d4c/d66/ca9 0
2026-03-09T00:03:50.828 INFO:tasks.workunit.client.0.vm03.stdout:5/367: mkdir d1c/d51/d6a/d75 0
2026-03-09T00:03:50.829 INFO:tasks.workunit.client.0.vm03.stdout:5/368: write d1c/d20/d55/d3b/f57 [555266,102762] 0
2026-03-09T00:03:50.836 INFO:tasks.workunit.client.1.vm06.stdout:7/572: rename d0/l9a to d0/df/d1a/d27/d4c/d40/d51/d90/la2 0
2026-03-09T00:03:50.837 INFO:tasks.workunit.client.0.vm03.stdout:3/254: mknod d2/db/d3b/d3f/c49 0
2026-03-09T00:03:50.842 INFO:tasks.workunit.client.0.vm03.stdout:8/302: mknod d7/df/d1a/c5b 0
2026-03-09T00:03:50.842 INFO:tasks.workunit.client.0.vm03.stdout:8/303: fsync d7/df/d1e/d38/f3e 0
2026-03-09T00:03:50.847 INFO:tasks.workunit.client.0.vm03.stdout:2/299: creat d8/d26/d5e/d5f/d4b/d50/f63 x:0 0 0
2026-03-09T00:03:50.847 INFO:tasks.workunit.client.0.vm03.stdout:2/300: write d8/f59 [4252824,26744] 0
2026-03-09T00:03:50.847 INFO:tasks.workunit.client.0.vm03.stdout:2/301: write f7 [700980,47349] 0
2026-03-09T00:03:50.847 INFO:tasks.workunit.client.0.vm03.stdout:2/302: truncate d8/d1b/f3d 911241 0
2026-03-09T00:03:50.855 INFO:tasks.workunit.client.0.vm03.stdout:9/337: mkdir d15/d1c/d28/d6e 0
2026-03-09T00:03:50.860 INFO:tasks.workunit.client.1.vm06.stdout:6/562: rmdir d4/d27/d3e/d78/d97 39
2026-03-09T00:03:50.860 INFO:tasks.workunit.client.0.vm03.stdout:7/321: stat d2/l41 0
2026-03-09T00:03:50.866 INFO:tasks.workunit.client.1.vm06.stdout:1/480: unlink d6/d4c/d71/l70 0
2026-03-09T00:03:50.866 INFO:tasks.workunit.client.1.vm06.stdout:8/537: mkdir db/d53/d70/d38/d4d/db1 0
2026-03-09T00:03:50.866 INFO:tasks.workunit.client.1.vm06.stdout:8/538: readlink db/d53/d5c/l6b 0
2026-03-09T00:03:50.870 INFO:tasks.workunit.client.0.vm03.stdout:0/304: creat d2/da/dd/d49/f6a x:0 0 0
2026-03-09T00:03:50.870 INFO:tasks.workunit.client.0.vm03.stdout:0/305: truncate d2/da/d36/d39/d4b/f67 154806 0
2026-03-09T00:03:50.870 INFO:tasks.workunit.client.1.vm06.stdout:0/538: write d3/d18/d1f/f5e [2423758,85218] 0
2026-03-09T00:03:50.876 INFO:tasks.workunit.client.1.vm06.stdout:4/507: truncate f14 1977955 0
2026-03-09T00:03:50.876 INFO:tasks.workunit.client.1.vm06.stdout:4/508: fsync d17/d24/d3b/f4a 0
2026-03-09T00:03:50.880 INFO:tasks.workunit.client.1.vm06.stdout:2/640: sync
2026-03-09T00:03:50.887 INFO:tasks.workunit.client.0.vm03.stdout:1/409: mknod d4/d15/c8b 0
2026-03-09T00:03:50.889 INFO:tasks.workunit.client.1.vm06.stdout:7/573: rename d0/df/d1a/d3a/l71 to d0/df/d1a/d27/d4c/d40/d51/d86/la3 0
2026-03-09T00:03:50.895 INFO:tasks.workunit.client.1.vm06.stdout:1/481: truncate d6/d21/d2d/f74 417728 0
2026-03-09T00:03:50.897 INFO:tasks.workunit.client.1.vm06.stdout:3/610: sync
2026-03-09T00:03:50.898 INFO:tasks.workunit.client.0.vm03.stdout:4/385: mknod d7/d20/c79 0
2026-03-09T00:03:50.898 INFO:tasks.workunit.client.0.vm03.stdout:4/386: dread - d7/d20/d6a/f76 zero size
2026-03-09T00:03:50.900 INFO:tasks.workunit.client.1.vm06.stdout:5/651: getdents d5/d1c/d68/da2 0
2026-03-09T00:03:50.905 INFO:tasks.workunit.client.0.vm03.stdout:6/303: getdents d13/d1e/d44 0
2026-03-09T00:03:50.905 INFO:tasks.workunit.client.0.vm03.stdout:6/304: stat d13/f1a 0
2026-03-09T00:03:50.915 INFO:tasks.workunit.client.1.vm06.stdout:4/509: mknod d17/d21/caa 0
2026-03-09T00:03:50.916 INFO:tasks.workunit.client.1.vm06.stdout:2/641: creat d7/da/fbf x:0 0 0
2026-03-09T00:03:50.922 INFO:tasks.workunit.client.0.vm03.stdout:5/369: mkdir d1c/d20/d55/d4f/d58/d73/d76 0
2026-03-09T00:03:50.930 INFO:tasks.workunit.client.1.vm06.stdout:7/574: rename d0/df/d1a/d27/d4c/f8c to d0/df/d1a/d3a/d4e/fa4 0
2026-03-09T00:03:50.931 INFO:tasks.workunit.client.0.vm03.stdout:8/304: creat d7/df/d1a/d2b/f5c x:0 0 0
2026-03-09T00:03:50.935 INFO:tasks.workunit.client.0.vm03.stdout:7/322: creat d2/d4/d1e/f63 x:0 0 0
2026-03-09T00:03:50.936 INFO:tasks.workunit.client.1.vm06.stdout:4/510: mknod d17/d21/d4c/d66/cab 0
2026-03-09T00:03:50.936 INFO:tasks.workunit.client.0.vm03.stdout:7/323: read d2/d1f/d42/f47 [546479,15954] 0
2026-03-09T00:03:50.936 INFO:tasks.workunit.client.0.vm03.stdout:7/324: getdents d2/d1f/d42/d46/d54/d60 0
2026-03-09T00:03:50.936 INFO:tasks.workunit.client.0.vm03.stdout:7/325: fdatasync d2/d1f/d3a/d31/d37/f56 0
2026-03-09T00:03:50.938 INFO:tasks.workunit.client.0.vm03.stdout:0/306: mkdir d2/da/dd/d49/d6b 0
2026-03-09T00:03:50.942 INFO:tasks.workunit.client.0.vm03.stdout:1/410: mkdir d4/d15/d77/d8c 0
2026-03-09T00:03:50.952 INFO:tasks.workunit.client.0.vm03.stdout:1/411: dread - d4/d3a/d61/d78/f79 zero size
2026-03-09T00:03:50.953 INFO:tasks.workunit.client.1.vm06.stdout:5/652: rename d5/d1c/d21/d28/d5e/d66/d78/dc8/dc3 to d5/d1c/d23/d34/d47/ddd 0
2026-03-09T00:03:50.953 INFO:tasks.workunit.client.0.vm03.stdout:3/255: dwrite d2/db/d2d/f36 [0,4194304] 0
2026-03-09T00:03:50.956 INFO:tasks.workunit.client.1.vm06.stdout:2/642: dread d7/da/d4e/d57/f7a [0,4194304] 0
2026-03-09T00:03:50.958 INFO:tasks.workunit.client.1.vm06.stdout:2/643: stat d7/da/d4e/d57/d9d/fbe 0
2026-03-09T00:03:50.958 INFO:tasks.workunit.client.1.vm06.stdout:7/575: creat d0/df/d1a/d27/d4c/d40/fa5 x:0 0 0
2026-03-09T00:03:50.960 INFO:tasks.workunit.client.1.vm06.stdout:7/576: dread d0/df/d17/f2d [0,4194304] 0
2026-03-09T00:03:50.964 INFO:tasks.workunit.client.1.vm06.stdout:3/611: unlink d11/c26 0
2026-03-09T00:03:50.965 INFO:tasks.workunit.client.0.vm03.stdout:4/387: symlink d7/d20/d6a/d77/l7a 0
2026-03-09T00:03:50.966 INFO:tasks.workunit.client.1.vm06.stdout:4/511: getdents d17/d5b/d8f 0
2026-03-09T00:03:50.969 INFO:tasks.workunit.client.1.vm06.stdout:5/653: rmdir d5/d1c/d21/d28 39
2026-03-09T00:03:50.972 INFO:tasks.workunit.client.1.vm06.stdout:2/644: truncate d7/d1a/d25/d66/f84 818798 0
2026-03-09T00:03:50.977 INFO:tasks.workunit.client.1.vm06.stdout:7/577: link d0/df/c6a d0/df/ca6 0
2026-03-09T00:03:50.977 INFO:tasks.workunit.client.1.vm06.stdout:2/645: write f6 [6252706,105064] 0
2026-03-09T00:03:50.977 INFO:tasks.workunit.client.1.vm06.stdout:2/646: chown d7/d1a/d25/l59 22189 1
2026-03-09T00:03:50.977 INFO:tasks.workunit.client.1.vm06.stdout:2/647: readlink d7/da/l2e 0
2026-03-09T00:03:50.979 INFO:tasks.workunit.client.1.vm06.stdout:3/612: rename f9 to d11/d28/d2e/d2f/d36/fd5 0
2026-03-09T00:03:50.979 INFO:tasks.workunit.client.1.vm06.stdout:3/613: read d11/d28/d2e/d2f/d36/d8f/fca [706066,93211] 0
2026-03-09T00:03:50.983 INFO:tasks.workunit.client.1.vm06.stdout:5/654: symlink d5/d1c/d21/d28/d5e/d66/d78/lde 0
2026-03-09T00:03:50.983 INFO:tasks.workunit.client.0.vm03.stdout:8/305: mknod d7/df/d1a/d2b/c5d 0
2026-03-09T00:03:50.987 INFO:tasks.workunit.client.1.vm06.stdout:2/648: rmdir d7/da/db 39
2026-03-09T00:03:51.001 INFO:tasks.workunit.client.1.vm06.stdout:9/466: truncate d1/f16 271092 0
2026-03-09T00:03:51.002 INFO:tasks.workunit.client.0.vm03.stdout:1/412: creat d4/d3a/d32/f8d x:0 0 0
2026-03-09T00:03:51.010 INFO:tasks.workunit.client.1.vm06.stdout:3/614: symlink d11/d28/d2e/d7e/d83/d87/ld6 0
2026-03-09T00:03:51.010 INFO:tasks.workunit.client.1.vm06.stdout:3/615: stat d11/d28/d4d/d89/fbe 0
2026-03-09T00:03:51.014 INFO:tasks.workunit.client.1.vm06.stdout:2/649: link d7/da/db/l76 d7/d1a/d25/d66/d87/da8/lc0 0
2026-03-09T00:03:51.023 INFO:tasks.workunit.client.1.vm06.stdout:2/650: dread d7/da/db/f74 [0,4194304] 0
2026-03-09T00:03:51.023 INFO:tasks.workunit.client.1.vm06.stdout:2/651: fsync d7/d1a/f30 0
2026-03-09T00:03:51.023 INFO:tasks.workunit.client.1.vm06.stdout:1/482: dwrite d6/d21/d2d/d37/f86 [0,4194304] 0
2026-03-09T00:03:51.023 INFO:tasks.workunit.client.1.vm06.stdout:0/539: dwrite d3/d18/f68 [0,4194304] 0
2026-03-09T00:03:51.023 INFO:tasks.workunit.client.1.vm06.stdout:0/540: truncate d3/f1e 735849 0
2026-03-09T00:03:51.024 INFO:tasks.workunit.client.1.vm06.stdout:3/616: mknod d11/d28/d2e/db2/dc2/cd7 0
2026-03-09T00:03:51.024 INFO:tasks.workunit.client.0.vm03.stdout:4/388: symlink d7/d20/l7b 0
2026-03-09T00:03:51.046 INFO:tasks.workunit.client.1.vm06.stdout:0/541: creat d3/d18/d1f/d39/fb1 x:0 0 0
2026-03-09T00:03:51.047 INFO:tasks.workunit.client.0.vm03.stdout:8/306: getdents d7/df/d1a 0
2026-03-09T00:03:51.056 INFO:tasks.workunit.client.0.vm03.stdout:8/307: fdatasync d7/df/d1a/f1c 0
2026-03-09T00:03:51.056 INFO:tasks.workunit.client.1.vm06.stdout:3/617: mkdir d11/d28/d2e/d7e/d83/dd8 0
2026-03-09T00:03:51.056 INFO:tasks.workunit.client.1.vm06.stdout:3/618: write d11/d28/d2e/d2f/f79 [1503697,48368] 0
2026-03-09T00:03:51.056 INFO:tasks.workunit.client.1.vm06.stdout:3/619: chown d11/d28/d2e/d2f/d5b/d94/faa 5 1
2026-03-09T00:03:51.056 INFO:tasks.workunit.client.0.vm03.stdout:0/307: rename d2/da/d36/d39 to d2/da/dd/d49/d6c 0
2026-03-09T00:03:51.057 INFO:tasks.workunit.client.1.vm06.stdout:0/542: unlink d3/d18/d1f/d39/d3b/f6f 0
2026-03-09T00:03:51.057 INFO:tasks.workunit.client.1.vm06.stdout:0/543: creat d3/d18/d3c/fb2 x:0 0 0
2026-03-09T00:03:51.057 INFO:tasks.workunit.client.0.vm03.stdout:8/308: creat d7/df/d1a/d40/f5e x:0 0 0
2026-03-09T00:03:51.062 INFO:tasks.workunit.client.0.vm03.stdout:8/309: rename d7/df/d1a/f3b to d7/df/d1e/d38/d4c/f5f 0
2026-03-09T00:03:51.064 INFO:tasks.workunit.client.1.vm06.stdout:0/544: link d3/d18/f68 d3/d18/d1f/d39/d49/d60/fb3 0
2026-03-09T00:03:51.067 INFO:tasks.workunit.client.0.vm03.stdout:8/310: unlink d7/df/c16 0
2026-03-09T00:03:51.074 INFO:tasks.workunit.client.1.vm06.stdout:0/545: rename d3/d18/d28/d45/f97 to d3/d18/d1f/d39/d69/fb4 0
2026-03-09T00:03:51.079 INFO:tasks.workunit.client.1.vm06.stdout:0/546: unlink d3/d18/d1f/d39/f83 0
2026-03-09T00:03:51.081 INFO:tasks.workunit.client.0.vm03.stdout:8/311: mkdir d7/df/d1e/d38/d60 0
2026-03-09T00:03:51.081 INFO:tasks.workunit.client.0.vm03.stdout:8/312: dread - d7/df/d1a/d40/f5e zero size
2026-03-09T00:03:51.087 INFO:tasks.workunit.client.0.vm03.stdout:9/338: dwrite d15/d1c/d21/f41 [0,4194304] 0
2026-03-09T00:03:51.088 INFO:tasks.workunit.client.0.vm03.stdout:9/339: fsync d15/d1c/d21/f46 0
2026-03-09T00:03:51.088 INFO:tasks.workunit.client.1.vm06.stdout:8/539: dwrite db/d1e/f5f [0,4194304] 0
2026-03-09T00:03:51.088 INFO:tasks.workunit.client.1.vm06.stdout:8/540: dread - db/d53/d70/d38/d4d/d79/f96 zero size
2026-03-09T00:03:51.091 INFO:tasks.workunit.client.1.vm06.stdout:7/578: dread d0/df/d1a/d27/d4c/f32 [0,4194304] 0
2026-03-09T00:03:51.094 INFO:tasks.workunit.client.1.vm06.stdout:0/547: creat d3/d18/d2c/d2d/d8c/fb5 x:0 0 0
2026-03-09T00:03:51.119 INFO:tasks.workunit.client.1.vm06.stdout:8/541: creat db/d53/d70/d38/d47/fb2 x:0 0 0
2026-03-09T00:03:51.119 INFO:tasks.workunit.client.1.vm06.stdout:8/542: chown db/dd/d84/f8d 1161874 1
2026-03-09T00:03:51.119 INFO:tasks.workunit.client.1.vm06.stdout:7/579: rmdir d0/df/d1a/d3a/d4e/d5e 39
2026-03-09T00:03:51.119 INFO:tasks.workunit.client.1.vm06.stdout:0/548: symlink d3/d18/lb6 0
2026-03-09T00:03:51.119 INFO:tasks.workunit.client.0.vm03.stdout:9/340: symlink d15/d1c/d28/d6e/l6f 0
2026-03-09T00:03:51.119 INFO:tasks.workunit.client.0.vm03.stdout:9/341: symlink d15/d1c/d21/d64/l70 0
2026-03-09T00:03:51.119 INFO:tasks.workunit.client.0.vm03.stdout:9/342: link fc d15/d1c/d21/f71 0
2026-03-09T00:03:51.119 INFO:tasks.workunit.client.0.vm03.stdout:9/343: write d15/d1c/d28/f29 [889795,115767] 0
2026-03-09T00:03:51.119 INFO:tasks.workunit.client.0.vm03.stdout:9/344: creat d15/d1c/d36/f72 x:0 0 0
2026-03-09T00:03:51.119 INFO:tasks.workunit.client.0.vm03.stdout:9/345: unlink d15/d1c/d28/f33 0
2026-03-09T00:03:51.119 INFO:tasks.workunit.client.1.vm06.stdout:8/543: mknod db/d1e/cb3 0
2026-03-09T00:03:51.119 INFO:tasks.workunit.client.1.vm06.stdout:7/580: symlink d0/df/d1a/d35/d62/la7 0
2026-03-09T00:03:51.119 INFO:tasks.workunit.client.1.vm06.stdout:7/581: chown d0/df/d1a/f25 8076502 1
2026-03-09T00:03:51.119 INFO:tasks.workunit.client.1.vm06.stdout:7/582: chown d0/df/d1a/d3a/d4e/d5e/c96 784 1
2026-03-09T00:03:51.128 INFO:tasks.workunit.client.0.vm03.stdout:0/308: dread d2/da/dd/d49/d6c/f41 [0,4194304] 0
2026-03-09T00:03:51.129 INFO:tasks.workunit.client.0.vm03.stdout:0/309: chown d2/da/f1b 32135 1
2026-03-09T00:03:51.129 INFO:tasks.workunit.client.0.vm03.stdout:0/310: fsync d2/d5a/f63 0
2026-03-09T00:03:51.129 INFO:tasks.workunit.client.1.vm06.stdout:8/544: read db/d53/d70/f75 [1680257,25641] 0
2026-03-09T00:03:51.129 INFO:tasks.workunit.client.1.vm06.stdout:8/545: chown db/dd/d24/dac 598760 1
2026-03-09T00:03:51.129 INFO:tasks.workunit.client.1.vm06.stdout:5/655: dwrite d5/d44/d4b/d92/f86 [0,4194304] 0
2026-03-09T00:03:51.130 INFO:tasks.workunit.client.0.vm03.stdout:9/346: creat d15/d1c/d21/d54/f73 x:0 0 0
2026-03-09T00:03:51.131 INFO:tasks.workunit.client.1.vm06.stdout:7/583: write d0/df/d17/f7e [1614796,19487] 0
2026-03-09T00:03:51.136 INFO:tasks.workunit.client.0.vm03.stdout:3/256: dwrite d2/f30 [0,4194304] 0
2026-03-09T00:03:51.147 INFO:tasks.workunit.client.0.vm03.stdout:7/326: dwrite d2/d1f/d3a/d31/f3f [0,4194304] 0
2026-03-09T00:03:51.147 INFO:tasks.workunit.client.0.vm03.stdout:7/327: fdatasync d2/d1f/d42/f47 0
2026-03-09T00:03:51.147 INFO:tasks.workunit.client.0.vm03.stdout:6/305: dwrite d13/d1e/d44/d59/f6c [0,4194304] 0
2026-03-09T00:03:51.147 INFO:tasks.workunit.client.0.vm03.stdout:5/370: dwrite d1c/f37 [0,4194304] 0
2026-03-09T00:03:51.147 INFO:tasks.workunit.client.0.vm03.stdout:0/311: mknod d2/da/dd/d49/c6d 0
2026-03-09T00:03:51.148 INFO:tasks.workunit.client.1.vm06.stdout:9/467: write d1/d4/d6e/d9/f4c [3579540,109057] 0
2026-03-09T00:03:51.148 INFO:tasks.workunit.client.1.vm06.stdout:8/546: rename db/d74/d78/d98/ca3 to db/dd/d85/cb4 0
2026-03-09T00:03:51.148 INFO:tasks.workunit.client.1.vm06.stdout:8/547: dread - db/d53/d7c/fa0 zero size
2026-03-09T00:03:51.148 INFO:tasks.workunit.client.1.vm06.stdout:8/548: readlink db/d1e/l2b 0
2026-03-09T00:03:51.148 INFO:tasks.workunit.client.1.vm06.stdout:8/549: truncate db/d1e/f82 4202068 0
2026-03-09T00:03:51.148 INFO:tasks.workunit.client.1.vm06.stdout:5/656: getdents d5/d1c/d21/d28 0
2026-03-09T00:03:51.148 INFO:tasks.workunit.client.1.vm06.stdout:7/584: unlink d0/c2b 0
2026-03-09T00:03:51.148 INFO:tasks.workunit.client.1.vm06.stdout:7/585: chown d0/df/d1a/d3a/d4e/f63 125 1
2026-03-09T00:03:51.148 INFO:tasks.workunit.client.1.vm06.stdout:7/586: read d0/df/d1a/d3a/f5d [309129,72537] 0
2026-03-09T00:03:51.148 INFO:tasks.workunit.client.1.vm06.stdout:7/587: dread - d0/df/d1a/d35/f94 zero size
2026-03-09T00:03:51.148 INFO:tasks.workunit.client.1.vm06.stdout:7/588: write d0/df/d1a/d35/f77 [1320027,110054] 0
2026-03-09T00:03:51.148 INFO:tasks.workunit.client.1.vm06.stdout:7/589: dread - d0/df/d1a/d35/f94 zero size
2026-03-09T00:03:51.148 INFO:tasks.workunit.client.0.vm03.stdout:3/257: creat d2/db/d40/f4a x:0 0 0
2026-03-09T00:03:51.148 INFO:tasks.workunit.client.0.vm03.stdout:3/258: write d2/db/d2d/f37 [673770,38760] 0
2026-03-09T00:03:51.177 INFO:tasks.workunit.client.0.vm03.stdout:4/389: dwrite d7/d20/d29/f53 [0,4194304] 0
2026-03-09T00:03:51.177 INFO:tasks.workunit.client.0.vm03.stdout:4/390: chown d7/d20/d6a/d77/d25/l2e 2590 1
2026-03-09T00:03:51.177 INFO:tasks.workunit.client.0.vm03.stdout:4/391: readlink d7/d20/d35/l41 0
2026-03-09T00:03:51.177 INFO:tasks.workunit.client.0.vm03.stdout:4/392: chown d7/d27/c40 2 1
2026-03-09T00:03:51.188 INFO:tasks.workunit.client.1.vm06.stdout:9/468: mkdir d1/d3/d4f/d91 0
2026-03-09T00:03:51.201 INFO:tasks.workunit.client.0.vm03.stdout:7/328: creat d2/d1f/d42/d46/d54/d60/f64 x:0 0 0
2026-03-09T00:03:51.201 INFO:tasks.workunit.client.0.vm03.stdout:7/329: read d2/d4/f2e [799109,62782] 0
2026-03-09T00:03:51.201 INFO:tasks.workunit.client.0.vm03.stdout:6/306: rename d13/f1c to d13/d1e/d44/d4a/d52/f6d 0
2026-03-09T00:03:51.202 INFO:tasks.workunit.client.0.vm03.stdout:6/307: chown d13/d35/l37 957 1
2026-03-09T00:03:51.202 INFO:tasks.workunit.client.0.vm03.stdout:6/308: dread - d13/d35/d4c/f4f zero size
2026-03-09T00:03:51.202 INFO:tasks.workunit.client.0.vm03.stdout:6/309: dread - d13/f1a zero size
2026-03-09T00:03:51.202 INFO:tasks.workunit.client.0.vm03.stdout:6/310: stat d13/d35/d69 0
2026-03-09T00:03:51.202 INFO:tasks.workunit.client.1.vm06.stdout:2/652: dread d7/d1b/d5a/d86/f8b [0,4194304] 0
2026-03-09T00:03:51.202 INFO:tasks.workunit.client.1.vm06.stdout:5/657: mknod d5/db1/cdf 0
2026-03-09T00:03:51.202 INFO:tasks.workunit.client.1.vm06.stdout:9/469: symlink d1/d3/d4f/d52/l92 0
2026-03-09T00:03:51.204 INFO:tasks.workunit.client.0.vm03.stdout:8/313: dwrite d7/f3c [0,4194304] 0
2026-03-09T00:03:51.207 INFO:tasks.workunit.client.1.vm06.stdout:3/620: fdatasync d11/d28/d2e/d2f/f79 0
2026-03-09T00:03:51.213 INFO:tasks.workunit.client.1.vm06.stdout:1/483: dwrite d6/d21/d2d/d37/f77 [4194304,4194304] 0
2026-03-09T00:03:51.213 INFO:tasks.workunit.client.1.vm06.stdout:1/484: fdatasync d6/d4c/d71/d83/f9b 0
2026-03-09T00:03:51.213 INFO:tasks.workunit.client.1.vm06.stdout:1/485: write d6/d21/d2d/d3b/d87/f9e [795650,71193] 0
2026-03-09T00:03:51.213 INFO:tasks.workunit.client.1.vm06.stdout:1/486: dread - d6/d4c/d71/f4a zero size
2026-03-09T00:03:51.213 INFO:tasks.workunit.client.1.vm06.stdout:1/487: write d6/f1d [3586074,5905] 0
2026-03-09T00:03:51.218 INFO:tasks.workunit.client.0.vm03.stdout:5/371: creat d1c/d51/d6a/d75/f77 x:0 0 0
2026-03-09T00:03:51.218 INFO:tasks.workunit.client.0.vm03.stdout:9/347: dwrite d15/f2c [0,4194304] 0
2026-03-09T00:03:51.218 INFO:tasks.workunit.client.0.vm03.stdout:9/348: fsync d15/d1c/d21/f61 0
2026-03-09T00:03:51.218 INFO:tasks.workunit.client.1.vm06.stdout:5/658: symlink d5/d1c/d68/da2/le0 0
2026-03-09T00:03:51.223 INFO:tasks.workunit.client.1.vm06.stdout:0/549: fsync d3/d18/d1f/f5e 0
2026-03-09T00:03:51.223 INFO:tasks.workunit.client.1.vm06.stdout:0/550: dread - d3/d18/d28/f81 zero size
2026-03-09T00:03:51.224 INFO:tasks.workunit.client.1.vm06.stdout:4/512: dwrite d17/d21/d4c/d50/f60 [4194304,4194304] 0
2026-03-09T00:03:51.224 INFO:tasks.workunit.client.1.vm06.stdout:4/513: readlink d17/d24/d3b/d97/l9b 0
2026-03-09T00:03:51.239 INFO:tasks.workunit.client.1.vm06.stdout:0/551: dread d3/d18/d1f/d44/d6a/f96 [0,4194304] 0
2026-03-09T00:03:51.252 INFO:tasks.workunit.client.1.vm06.stdout:8/550: mknod db/dd/cb5 0
2026-03-09T00:03:51.253 INFO:tasks.workunit.client.1.vm06.stdout:8/551: write db/f2d [1241446,59849] 0
2026-03-09T00:03:51.253 INFO:tasks.workunit.client.1.vm06.stdout:8/552: chown db/d1e/f50 26998 1
2026-03-09T00:03:51.256 INFO:tasks.workunit.client.0.vm03.stdout:1/413: dwrite d4/d3a/d32/d6a/f76 [0,4194304] 0
2026-03-09T00:03:51.275 INFO:tasks.workunit.client.0.vm03.stdout:1/414: write d4/d3a/d3d/d46/f4c [4457329,74446] 0
2026-03-09T00:03:51.275 INFO:tasks.workunit.client.0.vm03.stdout:4/393: dwrite d7/d20/d29/d54/d58/f6b [0,4194304] 0
2026-03-09T00:03:51.275 INFO:tasks.workunit.client.0.vm03.stdout:4/394: creat d7/d20/d29/d38/d3a/f7c x:0 0 0
2026-03-09T00:03:51.280 INFO:tasks.workunit.client.0.vm03.stdout:1/415: write d4/d15/f44 [4405635,22186] 0
2026-03-09T00:03:51.280 INFO:tasks.workunit.client.0.vm03.stdout:1/416: chown d4/d3a/d32/c40 8 1
2026-03-09T00:03:51.281 INFO:tasks.workunit.client.0.vm03.stdout:3/259: creat d2/f4b x:0 0 0
2026-03-09T00:03:51.287 INFO:tasks.workunit.client.1.vm06.stdout:7/590: dwrite d0/df/d1a/d27/d4c/f32 [0,4194304] 0
2026-03-09T00:03:51.289 INFO:tasks.workunit.client.1.vm06.stdout:7/591: dread d0/df/d1a/d35/d62/f81 [0,4194304] 0
2026-03-09T00:03:51.289 INFO:tasks.workunit.client.1.vm06.stdout:7/592: truncate d0/df/d1a/d27/d4c/d40/fa5 719775 0
2026-03-09T00:03:51.289 INFO:tasks.workunit.client.1.vm06.stdout:7/593: chown d0/df/d1a/l33 0 1
2026-03-09T00:03:51.290 INFO:tasks.workunit.client.1.vm06.stdout:9/470: rename d1/d4/d6e/d9/f65 to d1/d4/d6e/f93 0
2026-03-09T00:03:51.298 INFO:tasks.workunit.client.0.vm03.stdout:6/311: dwrite d13/d1e/d44/f49 [0,4194304] 0
2026-03-09T00:03:51.298 INFO:tasks.workunit.client.0.vm03.stdout:6/312: readlink d13/d1e/d44/l53 0
2026-03-09T00:03:51.299 INFO:tasks.workunit.client.0.vm03.stdout:6/313: creat d13/d1e/d44/d59/f6e x:0 0 0
2026-03-09T00:03:51.299 INFO:tasks.workunit.client.1.vm06.stdout:3/621: mknod d11/d28/d4d/cd9 0
2026-03-09T00:03:51.304 INFO:tasks.workunit.client.0.vm03.stdout:8/314: dwrite d7/df/d1e/d38/f3e [0,4194304] 0
2026-03-09T00:03:51.305 INFO:tasks.workunit.client.0.vm03.stdout:8/315: dread d7/df/d1a/f2e [0,4194304] 0
2026-03-09T00:03:51.305 INFO:tasks.workunit.client.0.vm03.stdout:8/316: truncate d7/f49 507758 0
2026-03-09T00:03:51.305 INFO:tasks.workunit.client.0.vm03.stdout:8/317: chown d7/c23 100837084 1
2026-03-09T00:03:51.308 INFO:tasks.workunit.client.0.vm03.stdout:6/314: write d13/d1e/f3e [1656669,97857] 0
2026-03-09T00:03:51.308 INFO:tasks.workunit.client.0.vm03.stdout:6/315: chown d13/l2a 16565040 1
2026-03-09T00:03:51.309 INFO:tasks.workunit.client.0.vm03.stdout:6/316: write d13/d1e/d44/d4a/f58 [968232,53068] 0
2026-03-09T00:03:51.309 INFO:tasks.workunit.client.0.vm03.stdout:6/317: stat d13/f3a 0
2026-03-09T00:03:51.311 INFO:tasks.workunit.client.0.vm03.stdout:0/312: rename d2/da/dd/d49/d6b to d2/da/dd/d6e 0
2026-03-09T00:03:51.312 INFO:tasks.workunit.client.0.vm03.stdout:5/372: mknod d1c/d20/d55/d66/c78 0
2026-03-09T00:03:51.313 INFO:tasks.workunit.client.1.vm06.stdout:6/563: sync
2026-03-09T00:03:51.313 INFO:tasks.workunit.client.1.vm06.stdout:6/564: write d4/d16/f34 [6133215,96138] 0
2026-03-09T00:03:51.313 INFO:tasks.workunit.client.1.vm06.stdout:6/565: dread - d4/d16/d53/d67/f8f zero size
2026-03-09T00:03:51.313 INFO:tasks.workunit.client.1.vm06.stdout:1/488: creat d6/d21/d2d/d3b/fa2 x:0 0 0
2026-03-09T00:03:51.318 INFO:tasks.workunit.client.1.vm06.stdout:3/622: dread d11/d28/d2e/d2f/d36/f4a [0,4194304] 0
2026-03-09T00:03:51.318 INFO:tasks.workunit.client.1.vm06.stdout:4/514: dwrite d17/d21/d32/f85 [0,4194304] 0
2026-03-09T00:03:51.318 INFO:tasks.workunit.client.1.vm06.stdout:2/653: getdents d7/d1b/d5a/d86 0
2026-03-09T00:03:51.321 INFO:tasks.workunit.client.1.vm06.stdout:5/659: link d5/d1c/d21/d28/d5e/d66/d78/da6/fd4 d5/d44/d4b/fe1 0
2026-03-09T00:03:51.329 INFO:tasks.workunit.client.0.vm03.stdout:9/349: mkdir d15/d1c/d21/d67/d74 0
2026-03-09T00:03:51.335 INFO:tasks.workunit.client.1.vm06.stdout:1/489: dread d6/f1d [0,4194304] 0
2026-03-09T00:03:51.335 INFO:tasks.workunit.client.1.vm06.stdout:1/490: readlink d6/d21/l60 0
2026-03-09T00:03:51.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:51 vm03.local ceph-mon[52346]: pgmap v10: 65 pgs: 65 active+clean; 1.8 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 110 MiB/s rd, 133 MiB/s wr, 206 op/s
2026-03-09T00:03:51.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:51 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:51.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:51 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:51.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:51 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:03:51.341 INFO:tasks.workunit.client.0.vm03.stdout:9/350: dread f10 [0,4194304] 0
2026-03-09T00:03:51.341 INFO:tasks.workunit.client.0.vm03.stdout:9/351: chown d15/d1c/d21/f61 3583 1
2026-03-09T00:03:51.342 INFO:tasks.workunit.client.1.vm06.stdout:8/553: mkdir db/d74/d78/d98/db6 0
2026-03-09T00:03:51.347 INFO:tasks.workunit.client.0.vm03.stdout:2/303: sync
2026-03-09T00:03:51.347 INFO:tasks.workunit.client.1.vm06.stdout:7/594: mknod d0/df/d1a/d27/d4c/d40/d51/ca8 0
2026-03-09T00:03:51.351 INFO:tasks.workunit.client.0.vm03.stdout:9/352: read d15/f44 [3677176,84321] 0
2026-03-09T00:03:51.355 INFO:tasks.workunit.client.1.vm06.stdout:0/552: rename d3/d18/cd to d3/d18/d1f/d39/cb7 0
2026-03-09T00:03:51.361 INFO:tasks.workunit.client.0.vm03.stdout:1/417: unlink f1 0
2026-03-09T00:03:51.364 INFO:tasks.workunit.client.1.vm06.stdout:6/566: mknod d4/d27/d42/d7e/dac/cb1 0
2026-03-09T00:03:51.365 INFO:tasks.workunit.client.1.vm06.stdout:3/623: creat d11/d28/d2e/db2/fda x:0 0 0
2026-03-09T00:03:51.372 INFO:tasks.workunit.client.1.vm06.stdout:4/515: mkdir d17/d5b/dac 0
2026-03-09T00:03:51.372 INFO:tasks.workunit.client.1.vm06.stdout:4/516: fsync d17/d24/f39 0
2026-03-09T00:03:51.395 INFO:tasks.workunit.client.0.vm03.stdout:7/330: rmdir d2 39
2026-03-09T00:03:51.396 INFO:tasks.workunit.client.0.vm03.stdout:7/331: fdatasync d2/d1f/d42/d43/f4a 0
2026-03-09T00:03:51.397 INFO:tasks.workunit.client.1.vm06.stdout:8/554: mkdir db/dd/d85/d9f/db7 0
2026-03-09T00:03:51.397 INFO:tasks.workunit.client.0.vm03.stdout:7/332: dread d2/d1f/d42/f47 [0,4194304] 0
2026-03-09T00:03:51.410 INFO:tasks.workunit.client.0.vm03.stdout:8/318: creat d7/df/d1e/d38/f61 x:0 0 0
2026-03-09T00:03:51.410 INFO:tasks.workunit.client.0.vm03.stdout:8/319: chown d7/df/d1e/d38/d4c/f5f 16907202 1
2026-03-09T00:03:51.410 INFO:tasks.workunit.client.1.vm06.stdout:2/654: rename d7/d1b/d5a to d7/d1b/d71/d79/db4/dc1 0
2026-03-09T00:03:51.410 INFO:tasks.workunit.client.1.vm06.stdout:2/655: readlink d7/da/db/la7 0
2026-03-09T00:03:51.412 INFO:tasks.workunit.client.1.vm06.stdout:4/517: dwrite d17/d5b/d8f/fa8 [0,4194304] 0
2026-03-09T00:03:51.413 INFO:tasks.workunit.client.0.vm03.stdout:6/318: rename d13/f2c to d13/f6f 0
2026-03-09T00:03:51.414 INFO:tasks.workunit.client.0.vm03.stdout:0/313: truncate d2/da/f2d 3373800 0
2026-03-09T00:03:51.414 INFO:tasks.workunit.client.0.vm03.stdout:5/373: truncate ff 4079292 0
2026-03-09T00:03:51.418 INFO:tasks.workunit.client.0.vm03.stdout:6/319: dread d13/d1e/f28 [0,4194304] 0
2026-03-09T00:03:51.420 INFO:tasks.workunit.client.1.vm06.stdout:9/471: getdents d1/d4/d6e/d14/d25 0
2026-03-09T00:03:51.420 INFO:tasks.workunit.client.1.vm06.stdout:9/472: write d1/f45 [1013157,80496] 0
2026-03-09T00:03:51.420 INFO:tasks.workunit.client.0.vm03.stdout:2/304: truncate d8/f59 604230 0
2026-03-09T00:03:51.432 INFO:tasks.workunit.client.0.vm03.stdout:9/353: mkdir d15/d1c/d21/d75 0
2026-03-09T00:03:51.432 INFO:tasks.workunit.client.0.vm03.stdout:9/354: write d15/d1c/d36/f6d [376204,5566] 0
2026-03-09T00:03:51.432 INFO:tasks.workunit.client.0.vm03.stdout:9/355: rename d15/d1c to d15/d1c/d28/d6e/d76 22
2026-03-09T00:03:51.432 INFO:tasks.workunit.client.0.vm03.stdout:4/395: rmdir d7/d20/d6a 39
2026-03-09T00:03:51.439 INFO:tasks.workunit.client.0.vm03.stdout:1/418: creat d4/d3a/d61/d78/f8e x:0 0 0
2026-03-09T00:03:51.442 INFO:tasks.workunit.client.1.vm06.stdout:3/624: unlink d11/d28/d4d/d9b/fc5 0
2026-03-09T00:03:51.442 INFO:tasks.workunit.client.1.vm06.stdout:6/567: symlink d4/d27/d3e/lb2 0
2026-03-09T00:03:51.442 INFO:tasks.workunit.client.1.vm06.stdout:3/625: chown d11/d3f/c77 2007580 1
2026-03-09T00:03:51.442 INFO:tasks.workunit.client.1.vm06.stdout:6/568: dread - d4/d27/d42/d52/d7d/faf zero size
2026-03-09T00:03:51.442 INFO:tasks.workunit.client.0.vm03.stdout:3/260: mknod d2/c4c 0
2026-03-09T00:03:51.442 INFO:tasks.workunit.client.0.vm03.stdout:3/261: chown d2/db/d3b/f3e 206 1
2026-03-09T00:03:51.453 INFO:tasks.workunit.client.1.vm06.stdout:8/555: truncate db/d53/d70/f91 2529064 0
2026-03-09T00:03:51.453 INFO:tasks.workunit.client.1.vm06.stdout:8/556: write db/d1e/f4f [1035111,48094] 0
2026-03-09T00:03:51.454 INFO:tasks.workunit.client.1.vm06.stdout:8/557: creat db/d53/d7c/fb8 x:0 0 0
2026-03-09T00:03:51.454 INFO:tasks.workunit.client.1.vm06.stdout:8/558: dread - db/d53/d70/f45 zero size
2026-03-09T00:03:51.454 INFO:tasks.workunit.client.1.vm06.stdout:8/559: chown db/dd/d48/f7f 0 1
2026-03-09T00:03:51.457 INFO:tasks.workunit.client.0.vm03.stdout:8/320: mkdir d7/df/d1a/d2b/d62 0
2026-03-09T00:03:51.457 INFO:tasks.workunit.client.1.vm06.stdout:1/491: rename d6/d21/d2d/d3b/d42/f7d to d6/d21/d2d/d3b/d87/d9d/fa3 0
2026-03-09T00:03:51.457 INFO:tasks.workunit.client.1.vm06.stdout:1/492: stat d6/f25 0
2026-03-09T00:03:51.458 INFO:tasks.workunit.client.0.vm03.stdout:0/314: mkdir d2/da/dd/d49/d6c/d4b/d55/d6f 0
2026-03-09T00:03:51.465 INFO:tasks.workunit.client.0.vm03.stdout:5/374: symlink d1c/d20/d55/d43/l79 0
2026-03-09T00:03:51.471 INFO:tasks.workunit.client.0.vm03.stdout:6/320: creat d13/f70 x:0 0 0
2026-03-09T00:03:51.478 INFO:tasks.workunit.client.1.vm06.stdout:4/518: creat d17/d24/d49/d5f/fad x:0 0 0
2026-03-09T00:03:51.485 INFO:tasks.workunit.client.0.vm03.stdout:2/305: creat d8/d26/d5e/f64 x:0 0 0
2026-03-09T00:03:51.485 INFO:tasks.workunit.client.0.vm03.stdout:2/306: chown d8/fd 48 1
2026-03-09T00:03:51.496 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:51 vm06.local ceph-mon[58395]: pgmap v10: 65 pgs: 65 active+clean; 1.8 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 110 MiB/s rd, 133 MiB/s wr, 206 op/s
2026-03-09T00:03:51.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:51 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:51.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:51 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:51.498 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:51 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:03:51.498 INFO:tasks.workunit.client.1.vm06.stdout:9/473: mkdir d1/d3/d4f/d91/d94 0
2026-03-09T00:03:51.498 INFO:tasks.workunit.client.1.vm06.stdout:9/474: chown d1/d3/d2b/f6d 11 1
2026-03-09T00:03:51.504 INFO:tasks.workunit.client.1.vm06.stdout:6/569: creat d4/d27/d42/da6/fb3 x:0 0 0
2026-03-09T00:03:51.510 INFO:tasks.workunit.client.0.vm03.stdout:3/262: creat d2/db/d40/d44/f4d x:0 0 0
2026-03-09T00:03:51.521 INFO:tasks.workunit.client.0.vm03.stdout:5/375: mknod d1c/d20/d55/c7a 0
2026-03-09T00:03:51.525 INFO:tasks.workunit.client.0.vm03.stdout:5/376: chown d1c/f37 1 1
2026-03-09T00:03:51.528 INFO:tasks.workunit.client.1.vm06.stdout:7/595: rename d0/c30 to d0/df/d1a/d27/d4c/d40/d51/d86/ca9 0
2026-03-09T00:03:51.531 INFO:tasks.workunit.client.1.vm06.stdout:9/475: link d1/d4/d6e/d9/f82 d1/d3/d4f/d91/d94/f95 0
2026-03-09T00:03:51.536 INFO:tasks.workunit.client.1.vm06.stdout:8/560: rename db/d53/d6d/l73 to db/d74/d78/d98/lb9 0
2026-03-09T00:03:51.538 INFO:tasks.workunit.client.1.vm06.stdout:7/596: creat d0/d55/d99/faa x:0 0 0
2026-03-09T00:03:51.539 INFO:tasks.workunit.client.1.vm06.stdout:9/476: mknod d1/d3/d2b/d58/c96 0
2026-03-09T00:03:51.541 INFO:tasks.workunit.client.1.vm06.stdout:6/570: rename d4/d27/d3e/da4 to d4/db4 0
2026-03-09T00:03:51.541 INFO:tasks.workunit.client.1.vm06.stdout:6/571: write d4/f12 [483140,41934] 0
2026-03-09T00:03:51.541 INFO:tasks.workunit.client.1.vm06.stdout:8/561: write db/d53/d70/d38/f72 [3041012,10847] 0
2026-03-09T00:03:51.542 INFO:tasks.workunit.client.1.vm06.stdout:9/477: symlink d1/d3/d2b/d58/l97 0
2026-03-09T00:03:51.547 INFO:tasks.workunit.client.1.vm06.stdout:6/572: mknod d4/d16/cb5 0
2026-03-09T00:03:51.548 INFO:tasks.workunit.client.1.vm06.stdout:6/573: dread - d4/d27/d42/da6/fb3 zero size
2026-03-09T00:03:51.548 INFO:tasks.workunit.client.1.vm06.stdout:6/574: dread - d4/d27/f70 zero size
2026-03-09T00:03:51.548 INFO:tasks.workunit.client.1.vm06.stdout:6/575: chown d4/d27/l51 413 1
2026-03-09T00:03:51.550 INFO:tasks.workunit.client.1.vm06.stdout:7/597: rename d0/df/d1a/d27/d4c/d40/d51/l76 to d0/d55/lab 0
2026-03-09T00:03:51.551 INFO:tasks.workunit.client.1.vm06.stdout:9/478: rmdir d1/d3 39
2026-03-09T00:03:51.551 INFO:tasks.workunit.client.1.vm06.stdout:9/479: write d1/d4/d6e/d14/d25/d85/d49/f56 [1873813,37052] 0
2026-03-09T00:03:51.552 INFO:tasks.workunit.client.1.vm06.stdout:6/576: creat d4/d27/fb6 x:0 0 0
2026-03-09T00:03:51.555 INFO:tasks.workunit.client.1.vm06.stdout:6/577: creat d4/d16/d53/fb7 x:0 0 0
2026-03-09T00:03:51.555 INFO:tasks.workunit.client.1.vm06.stdout:9/480: symlink d1/d4/d6e/d9/l98 0
2026-03-09T00:03:51.565 INFO:tasks.workunit.client.1.vm06.stdout:1/493: dwrite d6/d21/d2d/f5d [0,4194304] 0
2026-03-09T00:03:51.574 INFO:tasks.workunit.client.1.vm06.stdout:6/578: dread d4/f22 [4194304,4194304] 0
2026-03-09T00:03:51.574 INFO:tasks.workunit.client.1.vm06.stdout:6/579: truncate d4/d27/d42/da6/fb3 161897 0
2026-03-09T00:03:51.574 INFO:tasks.workunit.client.1.vm06.stdout:9/481: dread d1/d3/f11 [0,4194304] 0
2026-03-09T00:03:51.581 INFO:tasks.workunit.client.1.vm06.stdout:9/482: read d1/d4/ff [1675851,49653] 0
2026-03-09T00:03:51.589 INFO:tasks.workunit.client.0.vm03.stdout:0/315: dwrite d2/f1e [0,4194304] 0
2026-03-09T00:03:51.591 INFO:tasks.workunit.client.0.vm03.stdout:7/333: dwrite d2/f3 [0,4194304] 0
2026-03-09T00:03:51.591 INFO:tasks.workunit.client.0.vm03.stdout:7/334: write d2/f3 [4401239,49567] 0
2026-03-09T00:03:51.591 INFO:tasks.workunit.client.0.vm03.stdout:7/335: fsync d2/d1f/d42/d46/d54/d60/f64 0
2026-03-09T00:03:51.596 INFO:tasks.workunit.client.0.vm03.stdout:0/316: read d2/da/d1a/f1c [1516189,114410] 0
2026-03-09T00:03:51.601 INFO:tasks.workunit.client.0.vm03.stdout:0/317: chown d2/da/dd/d49/d6c/d4b/d55/f5b 609 1
2026-03-09T00:03:51.601 INFO:tasks.workunit.client.0.vm03.stdout:7/336: symlink d2/d1f/d3a/l65 0
2026-03-09T00:03:51.607 INFO:tasks.workunit.client.0.vm03.stdout:0/318: symlink d2/d1f/l70 0
2026-03-09T00:03:51.610 INFO:tasks.workunit.client.0.vm03.stdout:7/337: unlink d2/d4/f22 0
2026-03-09T00:03:51.610 INFO:tasks.workunit.client.0.vm03.stdout:0/319: mkdir d2/d71 0
2026-03-09T00:03:51.613 INFO:tasks.workunit.client.0.vm03.stdout:7/338: truncate d2/f50 3671047 0
2026-03-09T00:03:51.614 INFO:tasks.workunit.client.1.vm06.stdout:0/553: sync
2026-03-09T00:03:51.614 INFO:tasks.workunit.client.1.vm06.stdout:0/554: truncate d3/d18/d28/d45/f52 1327638 0
2026-03-09T00:03:51.614 INFO:tasks.workunit.client.1.vm06.stdout:0/555: truncate d3/d18/f59 1036691 0
2026-03-09T00:03:51.614 INFO:tasks.workunit.client.1.vm06.stdout:5/660: sync
2026-03-09T00:03:51.615 INFO:tasks.workunit.client.0.vm03.stdout:0/320: unlink d2/da/d36/f5d 0
2026-03-09T00:03:51.616 INFO:tasks.workunit.client.1.vm06.stdout:0/556: rmdir d3/d18/d2c 39
2026-03-09T00:03:51.624 INFO:tasks.workunit.client.1.vm06.stdout:5/661: rename d5/f19 to d5/d1c/d21/d28/d5e/d66/d78/dc8/daa/fe2 0
2026-03-09T00:03:51.631 INFO:tasks.workunit.client.1.vm06.stdout:8/562: write db/dd/d84/f8d [3839143,19848] 0
2026-03-09T00:03:51.631 INFO:tasks.workunit.client.1.vm06.stdout:8/563: write db/d74/f8e [310036,30799] 0
2026-03-09T00:03:51.631 INFO:tasks.workunit.client.1.vm06.stdout:8/564: fsync db/f2d 0
2026-03-09T00:03:51.631 INFO:tasks.workunit.client.1.vm06.stdout:8/565: read - db/d53/d7c/f95 zero size
2026-03-09T00:03:51.631 INFO:tasks.workunit.client.0.vm03.stdout:3/263: dread d2/db/f17 [0,4194304] 0
2026-03-09T00:03:51.631 INFO:tasks.workunit.client.0.vm03.stdout:9/356: dwrite d15/d1c/d21/d54/f73 [0,4194304] 0
2026-03-09T00:03:51.635 INFO:tasks.workunit.client.1.vm06.stdout:1/494: dread d6/d21/d2d/d3b/d42/f80 [0,4194304] 0
2026-03-09T00:03:51.641 INFO:tasks.workunit.client.0.vm03.stdout:3/264: write d2/db/d2d/f36 [903016,97258] 0
2026-03-09T00:03:51.641 INFO:tasks.workunit.client.1.vm06.stdout:0/557: creat d3/d18/d1f/d44/d6a/d73/fb8 x:0 0 0
2026-03-09T00:03:51.647 INFO:tasks.workunit.client.1.vm06.stdout:5/662: write d5/d1c/d23/f4f [8474757,37625] 0
2026-03-09T00:03:51.649 INFO:tasks.workunit.client.1.vm06.stdout:4/519: dwrite d17/d24/f36 [0,4194304] 0
2026-03-09T00:03:51.649 INFO:tasks.workunit.client.1.vm06.stdout:4/520: stat d17/d24/d3b/l45 0
2026-03-09T00:03:51.651 INFO:tasks.workunit.client.1.vm06.stdout:5/663: dread d5/d1c/d23/f42 [0,4194304] 0
2026-03-09T00:03:51.661 INFO:tasks.workunit.client.1.vm06.stdout:5/664: write d5/d1c/d23/f4f [10522632,12383] 0
2026-03-09T00:03:51.661 INFO:tasks.workunit.client.1.vm06.stdout:5/665: write d5/d1c/d23/d34/fb2 [872840,38523] 0
2026-03-09T00:03:51.662 INFO:tasks.workunit.client.1.vm06.stdout:8/566: creat db/dd/d24/fba x:0 0 0
2026-03-09T00:03:51.662 INFO:tasks.workunit.client.1.vm06.stdout:1/495: creat d6/d4c/d79/fa4 x:0 0 0
2026-03-09T00:03:51.662 INFO:tasks.workunit.client.1.vm06.stdout:1/496: chown d6/d21/d2d/d3b/d42 334989948 1
2026-03-09T00:03:51.662 INFO:tasks.workunit.client.1.vm06.stdout:0/558: rename d3/d18/d1f/d39/d3b/c3e to d3/d18/cb9 0
2026-03-09T00:03:51.662 INFO:tasks.workunit.client.1.vm06.stdout:0/559: creat d3/d18/d79/fba x:0 0 0
2026-03-09T00:03:51.662 INFO:tasks.workunit.client.1.vm06.stdout:0/560: creat d3/d18/d1f/d39/d3b/fbb x:0 0 0
2026-03-09T00:03:51.662 INFO:tasks.workunit.client.1.vm06.stdout:4/521: mknod d17/d24/d3b/d5e/d6e/cae 0
2026-03-09T00:03:51.662 INFO:tasks.workunit.client.1.vm06.stdout:4/522: write d17/d24/d3b/d75/f9e [235339,56437] 0
2026-03-09T00:03:51.663 INFO:tasks.workunit.client.0.vm03.stdout:9/357: truncate d15/d1c/d21/d54/f69 1833324 0
2026-03-09T00:03:51.663 INFO:tasks.workunit.client.0.vm03.stdout:9/358: truncate d15/d1c/d36/f6d 1310543 0
2026-03-09T00:03:51.663 INFO:tasks.workunit.client.0.vm03.stdout:9/359: fsync d15/d1c/d36/f5c 0
2026-03-09T00:03:51.663 INFO:tasks.workunit.client.0.vm03.stdout:4/396: dwrite d7/d20/f3d [0,4194304] 0
2026-03-09T00:03:51.664 INFO:tasks.workunit.client.0.vm03.stdout:4/397: write d7/d20/d35/d66/f69 [349040,95015] 0
2026-03-09T00:03:51.664 INFO:tasks.workunit.client.0.vm03.stdout:4/398: write d7/f62 [1007099,79499] 0
2026-03-09T00:03:51.664 INFO:tasks.workunit.client.1.vm06.stdout:5/666: dread d5/d1c/d21/d28/d5e/d66/d78/dc8/f7a [0,4194304] 0
2026-03-09T00:03:51.664 INFO:tasks.workunit.client.1.vm06.stdout:5/667: readlink d5/d1c/d68/l53 0
2026-03-09T00:03:51.664 INFO:tasks.workunit.client.1.vm06.stdout:5/668: fdatasync d5/d1c/d21/d28/f56 0
2026-03-09T00:03:51.665 INFO:tasks.workunit.client.1.vm06.stdout:1/497: creat d6/d4c/d51/fa5 x:0 0 0
2026-03-09T00:03:51.669 INFO:tasks.workunit.client.1.vm06.stdout:6/580: dread d4/f36 [4194304,4194304] 0
2026-03-09T00:03:51.669 INFO:tasks.workunit.client.1.vm06.stdout:6/581: creat d4/d27/d42/fb8 x:0 0 0
2026-03-09T00:03:51.670 INFO:tasks.workunit.client.0.vm03.stdout:2/307: dwrite d8/d1b/d24/f2f [0,4194304] 0
2026-03-09T00:03:51.671 INFO:tasks.workunit.client.0.vm03.stdout:3/265: dread d2/f9 [0,4194304] 0
2026-03-09T00:03:51.672 INFO:tasks.workunit.client.0.vm03.stdout:9/360: mkdir d15/d77 0
2026-03-09T00:03:51.672 INFO:tasks.workunit.client.0.vm03.stdout:9/361: creat d15/d1c/d36/f78 x:0 0 0
2026-03-09T00:03:51.673 INFO:tasks.workunit.client.1.vm06.stdout:6/582: dread d4/d27/d42/f75 [0,4194304] 0
2026-03-09T00:03:51.673 INFO:tasks.workunit.client.1.vm06.stdout:6/583: write d4/d27/fb6 [261244,98055] 0
2026-03-09T00:03:51.688 INFO:tasks.workunit.client.1.vm06.stdout:0/561: creat d3/d18/d2c/d2d/d74/fbc x:0 0 0
2026-03-09T00:03:51.688 INFO:tasks.workunit.client.1.vm06.stdout:0/562: chown d3/d18/d28/f86 3 1
2026-03-09T00:03:51.692 INFO:tasks.workunit.client.1.vm06.stdout:4/523: truncate d17/d24/d3b/d5e/f6d 5598664 0
2026-03-09T00:03:51.692 INFO:tasks.workunit.client.1.vm06.stdout:4/524: fdatasync d17/d24/d3b/d75/fa7 0
2026-03-09T00:03:51.692 INFO:tasks.workunit.client.1.vm06.stdout:4/525: readlink d17/d24/d49/l55 0
2026-03-09T00:03:51.692 INFO:tasks.workunit.client.1.vm06.stdout:4/526: chown f14 7770 1
2026-03-09T00:03:51.694 INFO:tasks.workunit.client.1.vm06.stdout:5/669: creat d5/d1c/d21/d28/d5e/d66/dab/fe3 x:0 0 0
2026-03-09T00:03:51.701 INFO:tasks.workunit.client.1.vm06.stdout:1/498:
rename d6/d8f/da0 to d6/d21/da6 0 2026-03-09T00:03:51.705 INFO:tasks.workunit.client.1.vm06.stdout:1/499: write d6/d21/d2d/f3c [4756117,53684] 0 2026-03-09T00:03:51.706 INFO:tasks.workunit.client.1.vm06.stdout:5/670: write d5/d44/d4b/f6c [710499,47167] 0 2026-03-09T00:03:51.706 INFO:tasks.workunit.client.1.vm06.stdout:9/483: dwrite d1/d3/d4f/d52/f8b [0,4194304] 0 2026-03-09T00:03:51.711 INFO:tasks.workunit.client.0.vm03.stdout:9/362: mknod d15/d77/c79 0 2026-03-09T00:03:51.711 INFO:tasks.workunit.client.0.vm03.stdout:9/363: write d15/d1c/f3c [1043750,123443] 0 2026-03-09T00:03:51.715 INFO:tasks.workunit.client.0.vm03.stdout:6/321: dwrite d13/d1e/f3f [0,4194304] 0 2026-03-09T00:03:51.719 INFO:tasks.workunit.client.0.vm03.stdout:6/322: dread d13/d1e/f3e [0,4194304] 0 2026-03-09T00:03:51.723 INFO:tasks.workunit.client.0.vm03.stdout:1/419: dwrite d4/d3a/f41 [0,4194304] 0 2026-03-09T00:03:51.724 INFO:tasks.workunit.client.1.vm06.stdout:0/563: creat d3/d18/d2c/d2d/d74/d7d/fbd x:0 0 0 2026-03-09T00:03:51.726 INFO:tasks.workunit.client.0.vm03.stdout:4/399: getdents d7/d20/d29/d4e 0 2026-03-09T00:03:51.730 INFO:tasks.workunit.client.1.vm06.stdout:1/500: creat d6/fa7 x:0 0 0 2026-03-09T00:03:51.743 INFO:tasks.workunit.client.0.vm03.stdout:1/420: mkdir d4/d3a/d8f 0 2026-03-09T00:03:51.750 INFO:tasks.workunit.client.1.vm06.stdout:5/671: symlink d5/d1c/d23/d34/d47/dcf/le4 0 2026-03-09T00:03:51.750 INFO:tasks.workunit.client.1.vm06.stdout:0/564: creat d3/d18/d1f/d44/fbe x:0 0 0 2026-03-09T00:03:51.750 INFO:tasks.workunit.client.1.vm06.stdout:1/501: rename d6/d21/d2d/d3b/d42/c66 to d6/d21/d2d/d3b/d42/ca8 0 2026-03-09T00:03:51.750 INFO:tasks.workunit.client.1.vm06.stdout:5/672: link d5/d1c/d21/d28/d5e/d66/d78/dc8/c9a d5/ce5 0 2026-03-09T00:03:51.750 INFO:tasks.workunit.client.1.vm06.stdout:0/565: getdents d3 0 2026-03-09T00:03:51.750 INFO:tasks.workunit.client.1.vm06.stdout:1/502: symlink d6/d21/da6/la9 0 2026-03-09T00:03:51.750 INFO:tasks.workunit.client.1.vm06.stdout:0/566: rmdir d3/d18/d2c/d2d 39 2026-03-09T00:03:51.753 INFO:tasks.workunit.client.1.vm06.stdout:0/567: getdents d3/d18/d1f/d39 0 2026-03-09T00:03:51.753 INFO:tasks.workunit.client.1.vm06.stdout:0/568: fsync d3/d18/d2c/f4e 0 2026-03-09T00:03:51.753 INFO:tasks.workunit.client.1.vm06.stdout:1/503: dread d6/d21/d2d/d3b/d87/f9e [0,4194304] 0 2026-03-09T00:03:51.753 INFO:tasks.workunit.client.1.vm06.stdout:1/504: write d6/d4c/d79/f5c [5742857,65930] 0 2026-03-09T00:03:51.753 INFO:tasks.workunit.client.1.vm06.stdout:1/505: write d6/d21/d2d/d3b/d42/f4e [268380,47975] 0 2026-03-09T00:03:51.753 INFO:tasks.workunit.client.1.vm06.stdout:0/569: unlink d3/d18/f68 0 2026-03-09T00:03:51.753 INFO:tasks.workunit.client.1.vm06.stdout:1/506: symlink d6/d21/d2d/d37/d6d/laa 0 2026-03-09T00:03:51.753 INFO:tasks.workunit.client.1.vm06.stdout:5/673: dread d5/d1c/d21/d28/f59 [0,4194304] 0 2026-03-09T00:03:51.757 INFO:tasks.workunit.client.1.vm06.stdout:0/570: dread d3/f51 [0,4194304] 0 2026-03-09T00:03:51.760 INFO:tasks.workunit.client.1.vm06.stdout:5/674: link d5/d44/d4b/cd6 d5/d1c/d23/d34/d47/dcf/ce6 0 2026-03-09T00:03:51.761 INFO:tasks.workunit.client.1.vm06.stdout:0/571: symlink d3/d18/lbf 0 2026-03-09T00:03:51.762 INFO:tasks.workunit.client.1.vm06.stdout:5/675: link d5/d1c/d23/d34/d47/fbd d5/fe7 0 2026-03-09T00:03:51.763 INFO:tasks.workunit.client.1.vm06.stdout:5/676: mkdir d5/d44/d84/dc5/de8 0 2026-03-09T00:03:51.764 INFO:tasks.workunit.client.1.vm06.stdout:5/677: truncate d5/d44/d4b/f70 4039418 0 2026-03-09T00:03:51.765 
INFO:tasks.workunit.client.1.vm06.stdout:5/678: truncate d5/d44/d4b/d92/d49/f83 1329571 0 2026-03-09T00:03:51.765 INFO:tasks.workunit.client.1.vm06.stdout:5/679: dread - d5/d44/d4b/d92/f46 zero size 2026-03-09T00:03:51.765 INFO:tasks.workunit.client.1.vm06.stdout:5/680: creat d5/d1c/d23/d34/d47/ddd/fe9 x:0 0 0 2026-03-09T00:03:51.807 INFO:tasks.workunit.client.1.vm06.stdout:9/484: dread d1/d4/d6e/f5d [0,4194304] 0 2026-03-09T00:03:51.807 INFO:tasks.workunit.client.1.vm06.stdout:9/485: write d1/d4/f39 [3742336,40246] 0 2026-03-09T00:03:51.808 INFO:tasks.workunit.client.1.vm06.stdout:7/598: dwrite d0/df/d1a/d3a/d4e/d5e/f73 [0,4194304] 0 2026-03-09T00:03:51.812 INFO:tasks.workunit.client.0.vm03.stdout:2/308: dwrite d8/f11 [0,4194304] 0 2026-03-09T00:03:51.816 INFO:tasks.workunit.client.1.vm06.stdout:9/486: truncate d1/d4/fe 1063002 0 2026-03-09T00:03:51.825 INFO:tasks.workunit.client.1.vm06.stdout:4/527: dwrite d17/d24/f2c [0,4194304] 0 2026-03-09T00:03:51.827 INFO:tasks.workunit.client.0.vm03.stdout:0/321: dwrite d2/da/dd/d49/d6c/d4b/f67 [0,4194304] 0 2026-03-09T00:03:51.837 INFO:tasks.workunit.client.0.vm03.stdout:9/364: dwrite d15/d1c/d21/d64/f50 [0,4194304] 0 2026-03-09T00:03:51.837 INFO:tasks.workunit.client.0.vm03.stdout:1/421: dwrite d4/d5e/f82 [4194304,4194304] 0 2026-03-09T00:03:51.837 INFO:tasks.workunit.client.0.vm03.stdout:9/365: creat d15/d1c/d36/f7a x:0 0 0 2026-03-09T00:03:51.837 INFO:tasks.workunit.client.0.vm03.stdout:9/366: chown d15/d1c/d36/d4d/f6b 350251289 1 2026-03-09T00:03:51.837 INFO:tasks.workunit.client.0.vm03.stdout:9/367: creat d15/f7b x:0 0 0 2026-03-09T00:03:51.842 INFO:tasks.workunit.client.0.vm03.stdout:9/368: dread d15/d1c/d21/f4c [0,4194304] 0 2026-03-09T00:03:51.853 INFO:tasks.workunit.client.0.vm03.stdout:9/369: dread - d15/d1c/d36/f72 zero size 2026-03-09T00:03:51.853 INFO:tasks.workunit.client.0.vm03.stdout:1/422: write d4/d3a/d61/f75 [2034944,121416] 0 2026-03-09T00:03:51.857 INFO:tasks.workunit.client.1.vm06.stdout:6/584: dwrite d4/d27/d3e/d57/f79 [0,4194304] 0 2026-03-09T00:03:51.866 INFO:tasks.workunit.client.0.vm03.stdout:2/309: mknod d8/d26/d5e/d5f/c65 0 2026-03-09T00:03:51.874 INFO:tasks.workunit.client.0.vm03.stdout:2/310: fdatasync d8/d1b/f3d 0 2026-03-09T00:03:51.887 INFO:tasks.workunit.client.0.vm03.stdout:5/377: dwrite d1c/d20/d55/f46 [0,4194304] 0 2026-03-09T00:03:51.890 INFO:tasks.workunit.client.1.vm06.stdout:2/656: write d7/da/d4e/d57/f7a [2510608,54996] 0 2026-03-09T00:03:51.910 INFO:tasks.workunit.client.0.vm03.stdout:1/423: dwrite d4/d15/f6d [0,4194304] 0 2026-03-09T00:03:51.910 INFO:tasks.workunit.client.0.vm03.stdout:1/424: creat d4/d6/f90 x:0 0 0 2026-03-09T00:03:51.912 INFO:tasks.workunit.client.1.vm06.stdout:4/528: dread d17/d24/d3b/d75/f9e [0,4194304] 0 2026-03-09T00:03:51.912 INFO:tasks.workunit.client.1.vm06.stdout:4/529: dread d17/d21/d4c/f87 [0,4194304] 0 2026-03-09T00:03:51.913 INFO:tasks.workunit.client.1.vm06.stdout:5/681: dwrite d5/d44/d4b/d92/d49/da0/fce [0,4194304] 0 2026-03-09T00:03:51.913 INFO:tasks.workunit.client.1.vm06.stdout:5/682: truncate d5/d1c/d21/d28/d5e/d66/d78/dc8/f7a 1300218 0 2026-03-09T00:03:51.913 INFO:tasks.workunit.client.1.vm06.stdout:5/683: stat d5/d1c/d21/d28/d5e/f69 0 2026-03-09T00:03:51.914 INFO:tasks.workunit.client.1.vm06.stdout:0/572: rmdir d3/d18/d1f/d44/d6a/d73 39 2026-03-09T00:03:51.915 INFO:tasks.workunit.client.1.vm06.stdout:8/567: fsync db/dd/d24/fba 0 2026-03-09T00:03:51.924 INFO:tasks.workunit.client.1.vm06.stdout:8/568: chown db/d1e/d46 0 1 2026-03-09T00:03:51.924 
INFO:tasks.workunit.client.0.vm03.stdout:9/370: truncate fd 6617543 0 2026-03-09T00:03:51.924 INFO:tasks.workunit.client.1.vm06.stdout:7/599: getdents d0/df/d7b 0 2026-03-09T00:03:51.924 INFO:tasks.workunit.client.1.vm06.stdout:7/600: creat d0/d55/d99/fac x:0 0 0 2026-03-09T00:03:51.924 INFO:tasks.workunit.client.1.vm06.stdout:7/601: chown d0/f4f 55 1 2026-03-09T00:03:51.925 INFO:tasks.workunit.client.0.vm03.stdout:0/322: truncate d2/f1e 1992537 0 2026-03-09T00:03:51.934 INFO:tasks.workunit.client.1.vm06.stdout:9/487: creat d1/d3/d4f/d91/f99 x:0 0 0 2026-03-09T00:03:51.936 INFO:tasks.workunit.client.1.vm06.stdout:1/507: getdents d6/d21 0 2026-03-09T00:03:51.936 INFO:tasks.workunit.client.1.vm06.stdout:1/508: creat d6/d4c/d71/fab x:0 0 0 2026-03-09T00:03:51.938 INFO:tasks.workunit.client.1.vm06.stdout:6/585: dwrite d4/d16/d46/f93 [0,4194304] 0 2026-03-09T00:03:51.938 INFO:tasks.workunit.client.1.vm06.stdout:6/586: chown d4/d27/d3e/d57/c5a 1146577 1 2026-03-09T00:03:51.938 INFO:tasks.workunit.client.1.vm06.stdout:6/587: chown d4/d27/d42/d7e/daa 10 1 2026-03-09T00:03:51.945 INFO:tasks.workunit.client.0.vm03.stdout:3/266: write d2/db/f17 [908724,77079] 0 2026-03-09T00:03:51.951 INFO:tasks.workunit.client.0.vm03.stdout:1/425: rename d4/d6/f83 to d4/d3a/d84/f91 0 2026-03-09T00:03:51.951 INFO:tasks.workunit.client.0.vm03.stdout:1/426: dread - d4/d15/f7f zero size 2026-03-09T00:03:51.951 INFO:tasks.workunit.client.0.vm03.stdout:1/427: creat d4/d15/d1a/f92 x:0 0 0 2026-03-09T00:03:51.957 INFO:tasks.workunit.client.1.vm06.stdout:4/530: creat d17/d21/d4c/faf x:0 0 0 2026-03-09T00:03:51.957 INFO:tasks.workunit.client.1.vm06.stdout:4/531: write d17/d24/d3b/d54/fa5 [540616,107926] 0 2026-03-09T00:03:51.957 INFO:tasks.workunit.client.1.vm06.stdout:4/532: fsync d17/d21/fa6 0 2026-03-09T00:03:51.957 INFO:tasks.workunit.client.0.vm03.stdout:9/371: link d15/d1c/d28/f29 d15/d1c/d28/d6e/f7c 0 2026-03-09T00:03:51.957 INFO:tasks.workunit.client.0.vm03.stdout:9/372: fdatasync f11 0 2026-03-09T00:03:51.958 INFO:tasks.workunit.client.0.vm03.stdout:9/373: fsync d15/f17 0 2026-03-09T00:03:51.961 INFO:tasks.workunit.client.0.vm03.stdout:0/323: rmdir d2/da/dd/d49/d6c/d4b/d61 0 2026-03-09T00:03:51.967 INFO:tasks.workunit.client.1.vm06.stdout:5/684: mknod d5/cea 0 2026-03-09T00:03:51.967 INFO:tasks.workunit.client.1.vm06.stdout:5/685: write d5/d1c/d21/d28/f63 [5442770,85890] 0 2026-03-09T00:03:51.967 INFO:tasks.workunit.client.1.vm06.stdout:5/686: chown d5/d44/d4b/d92/d49/fc2 1161 1 2026-03-09T00:03:51.968 INFO:tasks.workunit.client.1.vm06.stdout:0/573: symlink d3/d18/d1f/d39/lc0 0 2026-03-09T00:03:51.968 INFO:tasks.workunit.client.1.vm06.stdout:0/574: truncate d3/d18/f14 612487 0 2026-03-09T00:03:51.971 INFO:tasks.workunit.client.1.vm06.stdout:5/687: dread d5/d44/d4b/d92/f4e [0,4194304] 0 2026-03-09T00:03:51.973 INFO:tasks.workunit.client.0.vm03.stdout:2/311: dwrite d8/d1b/d24/f41 [0,4194304] 0 2026-03-09T00:03:51.974 INFO:tasks.workunit.client.0.vm03.stdout:4/400: dwrite d7/d20/d35/d66/f69 [0,4194304] 0 2026-03-09T00:03:51.974 INFO:tasks.workunit.client.0.vm03.stdout:4/401: chown d7/c24 1 1 2026-03-09T00:03:51.974 INFO:tasks.workunit.client.1.vm06.stdout:0/575: dread d3/d18/d2c/f4d [0,4194304] 0 2026-03-09T00:03:51.974 INFO:tasks.workunit.client.1.vm06.stdout:0/576: creat d3/d18/d79/fc1 x:0 0 0 2026-03-09T00:03:51.980 INFO:tasks.workunit.client.0.vm03.stdout:4/402: dread d7/f1c [0,4194304] 0 2026-03-09T00:03:51.982 INFO:tasks.workunit.client.0.vm03.stdout:5/378: dwrite d1c/d20/d55/d3b/f6f [0,4194304] 0 
2026-03-09T00:03:51.988 INFO:tasks.workunit.client.1.vm06.stdout:9/488: dwrite d1/d4/d6e/d14/d25/d85/f28 [0,4194304] 0 2026-03-09T00:03:52.014 INFO:tasks.workunit.client.0.vm03.stdout:9/374: rename d15/d1c/d21/c51 to d15/d1c/d21/c7d 0 2026-03-09T00:03:52.014 INFO:tasks.workunit.client.0.vm03.stdout:9/375: fsync d15/d1c/d28/f55 0 2026-03-09T00:03:52.016 INFO:tasks.workunit.client.1.vm06.stdout:7/602: getdents d0/d55 0 2026-03-09T00:03:52.017 INFO:tasks.workunit.client.0.vm03.stdout:0/324: mknod d2/da/dd/d49/c72 0 2026-03-09T00:03:52.020 INFO:tasks.workunit.client.1.vm06.stdout:1/509: symlink d6/d4c/d71/d83/lac 0 2026-03-09T00:03:52.023 INFO:tasks.workunit.client.0.vm03.stdout:2/312: creat d8/d1b/d24/f66 x:0 0 0 2026-03-09T00:03:52.023 INFO:tasks.workunit.client.0.vm03.stdout:2/313: truncate d8/f59 1043396 0 2026-03-09T00:03:52.036 INFO:tasks.workunit.client.1.vm06.stdout:2/657: rename d7/d1b/da5/daa/f9a to d7/fc2 0 2026-03-09T00:03:52.036 INFO:tasks.workunit.client.1.vm06.stdout:2/658: creat d7/d1a/d25/d66/d87/fc3 x:0 0 0 2026-03-09T00:03:52.058 INFO:tasks.workunit.client.1.vm06.stdout:4/533: mkdir d17/d24/d3b/d5e/d6e/db0 0 2026-03-09T00:03:52.059 INFO:tasks.workunit.client.0.vm03.stdout:1/428: dwrite d4/d15/f44 [0,4194304] 0 2026-03-09T00:03:52.059 INFO:tasks.workunit.client.0.vm03.stdout:1/429: write d4/f12 [1176362,4553] 0 2026-03-09T00:03:52.059 INFO:tasks.workunit.client.0.vm03.stdout:3/267: dwrite d2/db/f17 [0,4194304] 0 2026-03-09T00:03:52.063 INFO:tasks.workunit.client.0.vm03.stdout:1/430: dread d4/d15/f45 [0,4194304] 0 2026-03-09T00:03:52.066 INFO:tasks.workunit.client.0.vm03.stdout:1/431: dread d4/d5e/f88 [0,4194304] 0 2026-03-09T00:03:52.086 INFO:tasks.workunit.client.0.vm03.stdout:5/379: mknod d1c/d20/d55/d3b/c7b 0 2026-03-09T00:03:52.086 INFO:tasks.workunit.client.0.vm03.stdout:8/321: sync 2026-03-09T00:03:52.086 INFO:tasks.workunit.client.0.vm03.stdout:7/339: sync 2026-03-09T00:03:52.086 INFO:tasks.workunit.client.0.vm03.stdout:8/322: chown d7/df/d1e/d5a 0 1 2026-03-09T00:03:52.086 INFO:tasks.workunit.client.0.vm03.stdout:7/340: readlink d2/l36 0 2026-03-09T00:03:52.088 INFO:tasks.workunit.client.1.vm06.stdout:3/626: sync 2026-03-09T00:03:52.105 INFO:tasks.workunit.client.0.vm03.stdout:0/325: symlink d2/da/dd/d49/d6c/l73 0 2026-03-09T00:03:52.105 INFO:tasks.workunit.client.0.vm03.stdout:9/376: creat d15/d1c/d21/d67/f7e x:0 0 0 2026-03-09T00:03:52.105 INFO:tasks.workunit.client.0.vm03.stdout:2/314: unlink d8/d1b/d2a/d56/l58 0 2026-03-09T00:03:52.106 INFO:tasks.workunit.client.1.vm06.stdout:0/577: dread - d3/d18/d1f/d39/d49/d60/f92 zero size 2026-03-09T00:03:52.106 INFO:tasks.workunit.client.1.vm06.stdout:8/569: rmdir db/d53/d70/d38 39 2026-03-09T00:03:52.110 INFO:tasks.workunit.client.0.vm03.stdout:2/315: dread d8/d1b/f30 [0,4194304] 0 2026-03-09T00:03:52.113 INFO:tasks.workunit.client.0.vm03.stdout:9/377: dread d15/d1c/d28/f55 [0,4194304] 0 2026-03-09T00:03:52.126 INFO:tasks.workunit.client.1.vm06.stdout:9/489: mknod d1/d4/d6e/d14/d25/c9a 0 2026-03-09T00:03:52.126 INFO:tasks.workunit.client.1.vm06.stdout:9/490: write d1/d4/f54 [732230,87695] 0 2026-03-09T00:03:52.130 INFO:tasks.workunit.client.0.vm03.stdout:2/316: dread d8/d1b/d24/f46 [0,4194304] 0 2026-03-09T00:03:52.140 INFO:tasks.workunit.client.1.vm06.stdout:6/588: rename d4/d27/f74 to d4/d16/d46/fb9 0 2026-03-09T00:03:52.140 INFO:tasks.workunit.client.1.vm06.stdout:6/589: truncate d4/d27/f70 558608 0 2026-03-09T00:03:52.140 INFO:tasks.workunit.client.1.vm06.stdout:6/590: creat d4/d27/d42/d4b/fba x:0 0 0 
2026-03-09T00:03:52.140 INFO:tasks.workunit.client.1.vm06.stdout:6/591: dread - d4/d27/d3e/d57/fa7 zero size 2026-03-09T00:03:52.141 INFO:tasks.workunit.client.0.vm03.stdout:1/432: mkdir d4/d3a/d61/d78/d81/d93 0 2026-03-09T00:03:52.141 INFO:tasks.workunit.client.0.vm03.stdout:1/433: chown d4/l7 1626802334 1 2026-03-09T00:03:52.141 INFO:tasks.workunit.client.0.vm03.stdout:1/434: creat d4/d3a/d61/d78/f94 x:0 0 0 2026-03-09T00:03:52.141 INFO:tasks.workunit.client.0.vm03.stdout:3/268: truncate d2/f1d 1775402 0 2026-03-09T00:03:52.148 INFO:tasks.workunit.client.0.vm03.stdout:5/380: truncate d1c/f37 3060053 0 2026-03-09T00:03:52.148 INFO:tasks.workunit.client.0.vm03.stdout:0/326: dwrite d2/d1f/f43 [0,4194304] 0 2026-03-09T00:03:52.148 INFO:tasks.workunit.client.0.vm03.stdout:0/327: read d2/f59 [848248,106962] 0 2026-03-09T00:03:52.148 INFO:tasks.workunit.client.0.vm03.stdout:2/317: dread d8/fd [0,4194304] 0 2026-03-09T00:03:52.152 INFO:tasks.workunit.client.0.vm03.stdout:0/328: read d2/da/d1a/f25 [417120,76815] 0 2026-03-09T00:03:52.159 INFO:tasks.workunit.client.0.vm03.stdout:6/323: sync 2026-03-09T00:03:52.160 INFO:tasks.workunit.client.0.vm03.stdout:8/323: truncate d7/f11 944995 0 2026-03-09T00:03:52.160 INFO:tasks.workunit.client.0.vm03.stdout:8/324: stat d7/df/f29 0 2026-03-09T00:03:52.161 INFO:tasks.workunit.client.1.vm06.stdout:4/534: mknod d17/d24/d3b/d5e/d6e/db0/cb1 0 2026-03-09T00:03:52.163 INFO:tasks.workunit.client.0.vm03.stdout:7/341: symlink d2/d1f/d35/l66 0 2026-03-09T00:03:52.170 INFO:tasks.workunit.client.1.vm06.stdout:4/535: dread d17/f19 [0,4194304] 0 2026-03-09T00:03:52.170 INFO:tasks.workunit.client.1.vm06.stdout:4/536: dread d17/d5b/f64 [4194304,4194304] 0 2026-03-09T00:03:52.170 INFO:tasks.workunit.client.1.vm06.stdout:4/537: write d17/d5b/f83 [1060379,1324] 0 2026-03-09T00:03:52.171 INFO:tasks.workunit.client.0.vm03.stdout:9/378: mkdir d15/d7f 0 2026-03-09T00:03:52.175 INFO:tasks.workunit.client.0.vm03.stdout:4/403: sync 2026-03-09T00:03:52.197 INFO:tasks.workunit.client.0.vm03.stdout:1/435: mkdir d4/d15/d77/d95 0 2026-03-09T00:03:52.197 INFO:tasks.workunit.client.0.vm03.stdout:1/436: readlink d4/d3a/d61/d78/l7e 0 2026-03-09T00:03:52.198 INFO:tasks.workunit.client.0.vm03.stdout:1/437: write d4/d3a/d84/f91 [676319,27348] 0 2026-03-09T00:03:52.205 INFO:tasks.workunit.client.0.vm03.stdout:3/269: creat d2/f4e x:0 0 0 2026-03-09T00:03:52.220 INFO:tasks.workunit.client.0.vm03.stdout:3/270: dread d2/db/f17 [0,4194304] 0 2026-03-09T00:03:52.221 INFO:tasks.workunit.client.0.vm03.stdout:3/271: creat d2/db/d3b/f4f x:0 0 0 2026-03-09T00:03:52.221 INFO:tasks.workunit.client.0.vm03.stdout:3/272: dread - d2/db/f26 zero size 2026-03-09T00:03:52.221 INFO:tasks.workunit.client.0.vm03.stdout:5/381: symlink d1c/d20/d56/d74/l7c 0 2026-03-09T00:03:52.225 INFO:tasks.workunit.client.1.vm06.stdout:3/627: mkdir d11/d28/d2e/d2f/d5b/ddb 0 2026-03-09T00:03:52.228 INFO:tasks.workunit.client.0.vm03.stdout:2/318: mknod d8/d1b/c67 0 2026-03-09T00:03:52.231 INFO:tasks.workunit.client.0.vm03.stdout:0/329: symlink d2/da/dd/d49/d6c/d4b/d55/d6f/l74 0 2026-03-09T00:03:52.241 INFO:tasks.workunit.client.1.vm06.stdout:0/578: mknod d3/d18/d2c/d2d/d74/d7d/cc2 0 2026-03-09T00:03:52.241 INFO:tasks.workunit.client.1.vm06.stdout:0/579: chown d3/d18/d79/c94 31236 1 2026-03-09T00:03:52.245 INFO:tasks.workunit.client.0.vm03.stdout:1/438: dwrite d4/d3a/d61/f75 [4194304,4194304] 0 2026-03-09T00:03:52.245 INFO:tasks.workunit.client.0.vm03.stdout:1/439: readlink d4/d3a/d61/l7b 0 2026-03-09T00:03:52.245 
INFO:tasks.workunit.client.0.vm03.stdout:9/379: dwrite d15/d1c/d28/f39 [0,4194304] 0 2026-03-09T00:03:52.245 INFO:tasks.workunit.client.0.vm03.stdout:9/380: creat d15/d1c/d21/d54/f80 x:0 0 0 2026-03-09T00:03:52.247 INFO:tasks.workunit.client.0.vm03.stdout:6/324: mkdir d13/d35/d71 0 2026-03-09T00:03:52.247 INFO:tasks.workunit.client.0.vm03.stdout:6/325: truncate d13/d35/f6a 204462 0 2026-03-09T00:03:52.260 INFO:tasks.workunit.client.0.vm03.stdout:8/325: unlink d7/df/f56 0 2026-03-09T00:03:52.264 INFO:tasks.workunit.client.0.vm03.stdout:6/326: write d13/d1e/f3f [808023,82332] 0 2026-03-09T00:03:52.265 INFO:tasks.workunit.client.1.vm06.stdout:9/491: creat d1/d3/f9b x:0 0 0 2026-03-09T00:03:52.265 INFO:tasks.workunit.client.1.vm06.stdout:9/492: chown d1/d3/d50/l89 43015 1 2026-03-09T00:03:52.265 INFO:tasks.workunit.client.1.vm06.stdout:9/493: chown d1/d3/d4f/d91/f99 46 1 2026-03-09T00:03:52.267 INFO:tasks.workunit.client.0.vm03.stdout:7/342: rename d2/d1f/d42/d46/d54/d60 to d2/d1f/d40/d67 0 2026-03-09T00:03:52.267 INFO:tasks.workunit.client.0.vm03.stdout:7/343: write d2/d1f/d42/d43/f4a [617375,59439] 0 2026-03-09T00:03:52.267 INFO:tasks.workunit.client.0.vm03.stdout:7/344: read - d2/d1f/d3a/d31/d37/f4c zero size 2026-03-09T00:03:52.269 INFO:tasks.workunit.client.1.vm06.stdout:1/510: getdents d6/d4c/d71 0 2026-03-09T00:03:52.291 INFO:tasks.workunit.client.1.vm06.stdout:5/688: rename d5/l11 to d5/d1c/d21/d28/d5e/leb 0 2026-03-09T00:03:52.291 INFO:tasks.workunit.client.1.vm06.stdout:5/689: dread - d5/d44/d4b/d92/d49/fc2 zero size 2026-03-09T00:03:52.291 INFO:tasks.workunit.client.1.vm06.stdout:5/690: readlink d5/d1c/d68/da2/le0 0 2026-03-09T00:03:52.292 INFO:tasks.workunit.client.0.vm03.stdout:4/404: dwrite d7/d20/f21 [0,4194304] 0 2026-03-09T00:03:52.298 INFO:tasks.workunit.client.0.vm03.stdout:0/330: dwrite d2/da/dd/d49/d6c/d4b/f4c [0,4194304] 0 2026-03-09T00:03:52.298 INFO:tasks.workunit.client.0.vm03.stdout:0/331: chown d2/da/dd/d49/d6c/l73 1 1 2026-03-09T00:03:52.301 INFO:tasks.workunit.client.0.vm03.stdout:5/382: unlink d1c/d20/d55/f42 0 2026-03-09T00:03:52.301 INFO:tasks.workunit.client.0.vm03.stdout:5/383: chown d1c/d20/d56 0 1 2026-03-09T00:03:52.302 INFO:tasks.workunit.client.0.vm03.stdout:5/384: write f12 [4440023,118923] 0 2026-03-09T00:03:52.302 INFO:tasks.workunit.client.0.vm03.stdout:5/385: chown d1c/d20/d55/c7a 7690069 1 2026-03-09T00:03:52.307 INFO:tasks.workunit.client.1.vm06.stdout:2/659: sync 2026-03-09T00:03:52.308 INFO:tasks.workunit.client.0.vm03.stdout:9/381: dwrite d15/d1c/d21/d67/f7e [0,4194304] 0 2026-03-09T00:03:52.310 INFO:tasks.workunit.client.1.vm06.stdout:2/660: dread d7/d1b/d31/f90 [0,4194304] 0 2026-03-09T00:03:52.314 INFO:tasks.workunit.client.0.vm03.stdout:1/440: symlink d4/d3a/d32/d87/l96 0 2026-03-09T00:03:52.327 INFO:tasks.workunit.client.1.vm06.stdout:4/538: mkdir d17/d24/d49/d5f/db2 0 2026-03-09T00:03:52.337 INFO:tasks.workunit.client.1.vm06.stdout:4/539: stat d17/d24/d3b/d54/f80 0 2026-03-09T00:03:52.337 INFO:tasks.workunit.client.1.vm06.stdout:3/628: creat d11/d28/d2e/d7e/fdc x:0 0 0 2026-03-09T00:03:52.337 INFO:tasks.workunit.client.1.vm06.stdout:3/629: stat d11/d3f/d8d/cc9 0 2026-03-09T00:03:52.337 INFO:tasks.workunit.client.0.vm03.stdout:9/382: read d15/d1c/d21/d67/f7e [1821928,123369] 0 2026-03-09T00:03:52.337 INFO:tasks.workunit.client.0.vm03.stdout:9/383: write d15/d1c/d36/f4a [718765,45316] 0 2026-03-09T00:03:52.337 INFO:tasks.workunit.client.0.vm03.stdout:8/326: creat d7/df/d1e/f63 x:0 0 0 2026-03-09T00:03:52.337 
INFO:tasks.workunit.client.1.vm06.stdout:8/570: rmdir db/d1e 39 2026-03-09T00:03:52.337 INFO:tasks.workunit.client.0.vm03.stdout:1/441: write d4/d15/d1a/f1d [527034,90130] 0 2026-03-09T00:03:52.337 INFO:tasks.workunit.client.0.vm03.stdout:6/327: mkdir d13/d35/d72 0 2026-03-09T00:03:52.337 INFO:tasks.workunit.client.0.vm03.stdout:6/328: truncate d13/f6f 4998990 0 2026-03-09T00:03:52.337 INFO:tasks.workunit.client.0.vm03.stdout:6/329: truncate d13/d35/f68 227772 0 2026-03-09T00:03:52.340 INFO:tasks.workunit.client.0.vm03.stdout:1/442: write d4/d15/f17 [2515695,50625] 0 2026-03-09T00:03:52.340 INFO:tasks.workunit.client.0.vm03.stdout:1/443: stat d4/d3a/d32/l36 0 2026-03-09T00:03:52.340 INFO:tasks.workunit.client.0.vm03.stdout:1/444: read d4/d3a/f26 [2224601,86740] 0 2026-03-09T00:03:52.341 INFO:tasks.workunit.client.1.vm06.stdout:0/580: write d3/d18/d1f/f4a [718113,33886] 0 2026-03-09T00:03:52.341 INFO:tasks.workunit.client.1.vm06.stdout:0/581: stat d3/d18/d2c/d2d/d31/l65 0 2026-03-09T00:03:52.341 INFO:tasks.workunit.client.1.vm06.stdout:0/582: fdatasync d3/d18/d1f/d44/f7c 0 2026-03-09T00:03:52.343 INFO:tasks.workunit.client.1.vm06.stdout:9/494: unlink d1/d3/d4f/d52/f8b 0 2026-03-09T00:03:52.347 INFO:tasks.workunit.client.0.vm03.stdout:3/273: rename d2/c2c to d2/db/d3b/c50 0 2026-03-09T00:03:52.349 INFO:tasks.workunit.client.0.vm03.stdout:3/274: read d2/f9 [2576751,45410] 0 2026-03-09T00:03:52.349 INFO:tasks.workunit.client.0.vm03.stdout:3/275: write d2/f30 [4432248,105936] 0 2026-03-09T00:03:52.355 INFO:tasks.workunit.client.1.vm06.stdout:1/511: dwrite d6/d21/d2d/d37/f86 [0,4194304] 0 2026-03-09T00:03:52.356 INFO:tasks.workunit.client.1.vm06.stdout:7/603: rename d0/df/d1a/d27/d70/l9c to d0/df/d1a/d22/lad 0 2026-03-09T00:03:52.357 INFO:tasks.workunit.client.0.vm03.stdout:7/345: link d2/d1f/c52 d2/d1f/d40/d67/c68 0 2026-03-09T00:03:52.357 INFO:tasks.workunit.client.0.vm03.stdout:7/346: read - d2/d1f/d40/d67/f64 zero size 2026-03-09T00:03:52.357 INFO:tasks.workunit.client.0.vm03.stdout:7/347: write d2/d1f/f11 [4443943,111670] 0 2026-03-09T00:03:52.357 INFO:tasks.workunit.client.0.vm03.stdout:7/348: chown d2/d1f/d3a/d24/c38 1300 1 2026-03-09T00:03:52.358 INFO:tasks.workunit.client.0.vm03.stdout:7/349: fsync d2/d1f/d3a/d31/f44 0 2026-03-09T00:03:52.358 INFO:tasks.workunit.client.1.vm06.stdout:5/691: mkdir d5/d1c/d68/dec 0 2026-03-09T00:03:52.358 INFO:tasks.workunit.client.1.vm06.stdout:5/692: read - d5/d1c/d21/d28/d5e/d66/d78/da6/fd4 zero size 2026-03-09T00:03:52.363 INFO:tasks.workunit.client.1.vm06.stdout:7/604: dread d0/df/d1a/f50 [0,4194304] 0 2026-03-09T00:03:52.363 INFO:tasks.workunit.client.1.vm06.stdout:2/661: symlink d7/d1a/d25/d66/d87/da8/db2/lc4 0 2026-03-09T00:03:52.363 INFO:tasks.workunit.client.1.vm06.stdout:2/662: write d7/d1a/d96/fba [88447,78361] 0 2026-03-09T00:03:52.371 INFO:tasks.workunit.client.1.vm06.stdout:4/540: symlink d17/d24/d3b/d5e/d7a/lb3 0 2026-03-09T00:03:52.373 INFO:tasks.workunit.client.1.vm06.stdout:8/571: creat db/d74/d78/d98/fbb x:0 0 0 2026-03-09T00:03:52.377 INFO:tasks.workunit.client.1.vm06.stdout:0/583: mknod d3/d18/d1f/d44/d6a/cc3 0 2026-03-09T00:03:52.385 INFO:tasks.workunit.client.0.vm03.stdout:4/405: dwrite d7/d20/d29/d38/d3a/f7c [0,4194304] 0 2026-03-09T00:03:52.399 INFO:tasks.workunit.client.0.vm03.stdout:4/406: fdatasync d7/d20/d29/d38/f6e 0 2026-03-09T00:03:52.400 INFO:tasks.workunit.client.1.vm06.stdout:9/495: creat d1/d4/f9c x:0 0 0 2026-03-09T00:03:52.400 INFO:tasks.workunit.client.0.vm03.stdout:0/332: creat d2/da/dd/f75 x:0 0 0 
2026-03-09T00:03:52.400 INFO:tasks.workunit.client.0.vm03.stdout:5/386: creat d1c/d20/d55/f7d x:0 0 0 2026-03-09T00:03:52.404 INFO:tasks.workunit.client.0.vm03.stdout:0/333: write d2/da/dd/d49/d6c/f52 [2591315,53393] 0 2026-03-09T00:03:52.408 INFO:tasks.workunit.client.1.vm06.stdout:6/592: rename d4/d27/d3e/d78/d97 to d4/d27/d42/da6/dbb 0 2026-03-09T00:03:52.408 INFO:tasks.workunit.client.1.vm06.stdout:6/593: readlink d4/d27/d42/d4b/l73 0 2026-03-09T00:03:52.410 INFO:tasks.workunit.client.1.vm06.stdout:5/693: symlink d5/d1c/d68/dec/led 0 2026-03-09T00:03:52.410 INFO:tasks.workunit.client.1.vm06.stdout:5/694: stat d5/d1c/d21/d28/d5e/d66/d78/dc8/daa 0 2026-03-09T00:03:52.411 INFO:tasks.workunit.client.0.vm03.stdout:2/319: getdents d8/d26/d5e 0 2026-03-09T00:03:52.418 INFO:tasks.workunit.client.1.vm06.stdout:7/605: mkdir d0/df/d1a/d27/d4c/d40/d51/d90/dae 0 2026-03-09T00:03:52.420 INFO:tasks.workunit.client.0.vm03.stdout:9/384: symlink d15/d1c/d36/l81 0 2026-03-09T00:03:52.422 INFO:tasks.workunit.client.1.vm06.stdout:4/541: symlink d17/d21/d4c/d66/d68/lb4 0 2026-03-09T00:03:52.425 INFO:tasks.workunit.client.0.vm03.stdout:8/327: creat d7/f64 x:0 0 0 2026-03-09T00:03:52.425 INFO:tasks.workunit.client.1.vm06.stdout:1/512: getdents d6/d4c/d51 0 2026-03-09T00:03:52.432 INFO:tasks.workunit.client.1.vm06.stdout:3/630: dwrite d11/d28/d2e/d7e/d83/f9a [0,4194304] 0 2026-03-09T00:03:52.432 INFO:tasks.workunit.client.1.vm06.stdout:2/663: dwrite d7/da/d4e/d57/d9d/fbe [0,4194304] 0 2026-03-09T00:03:52.434 INFO:tasks.workunit.client.0.vm03.stdout:9/385: dread d15/d1c/d21/f25 [0,4194304] 0 2026-03-09T00:03:52.439 INFO:tasks.workunit.client.1.vm06.stdout:5/695: mknod d5/cee 0 2026-03-09T00:03:52.450 INFO:tasks.workunit.client.0.vm03.stdout:6/330: unlink d13/d1e/c40 0 2026-03-09T00:03:52.451 INFO:tasks.workunit.client.0.vm03.stdout:9/386: dread d15/f26 [0,4194304] 0 2026-03-09T00:03:52.451 INFO:tasks.workunit.client.0.vm03.stdout:9/387: chown d15/d77/c79 7508 1 2026-03-09T00:03:52.451 INFO:tasks.workunit.client.1.vm06.stdout:7/606: truncate d0/df/d1a/d27/d4c/d40/d5b/f78 875691 0 2026-03-09T00:03:52.451 INFO:tasks.workunit.client.1.vm06.stdout:7/607: readlink d0/df/l10 0 2026-03-09T00:03:52.451 INFO:tasks.workunit.client.1.vm06.stdout:7/608: dread - d0/df/d17/f74 zero size 2026-03-09T00:03:52.451 INFO:tasks.workunit.client.1.vm06.stdout:7/609: read - d0/df/d1a/d35/d62/f98 zero size 2026-03-09T00:03:52.451 INFO:tasks.workunit.client.1.vm06.stdout:7/610: write d0/df/d1a/d27/f37 [343151,68198] 0 2026-03-09T00:03:52.451 INFO:tasks.workunit.client.1.vm06.stdout:7/611: creat d0/df/d1a/d27/d4c/d40/d5b/faf x:0 0 0 2026-03-09T00:03:52.451 INFO:tasks.workunit.client.1.vm06.stdout:7/612: fsync d0/df/d1a/d27/f60 0 2026-03-09T00:03:52.451 INFO:tasks.workunit.client.1.vm06.stdout:7/613: creat d0/df/d1a/d27/d4c/fb0 x:0 0 0 2026-03-09T00:03:52.455 INFO:tasks.workunit.client.0.vm03.stdout:1/445: mknod d4/d3a/d3d/c97 0 2026-03-09T00:03:52.463 INFO:tasks.workunit.client.1.vm06.stdout:0/584: dwrite d3/d18/d2c/d2d/d74/d90/fa3 [0,4194304] 0 2026-03-09T00:03:52.472 INFO:tasks.workunit.client.0.vm03.stdout:3/276: mkdir d2/db/d40/d51 0 2026-03-09T00:03:52.472 INFO:tasks.workunit.client.0.vm03.stdout:3/277: creat d2/db/d2d/f52 x:0 0 0 2026-03-09T00:03:52.472 INFO:tasks.workunit.client.1.vm06.stdout:4/542: creat d17/d24/d3b/d97/fb5 x:0 0 0 2026-03-09T00:03:52.476 INFO:tasks.workunit.client.0.vm03.stdout:1/446: write d4/d3a/d32/d6a/f76 [2198137,21455] 0 2026-03-09T00:03:52.479 INFO:tasks.workunit.client.1.vm06.stdout:1/513: symlink 
d6/d21/d2d/d3b/d87/d9d/lad 0 2026-03-09T00:03:52.479 INFO:tasks.workunit.client.1.vm06.stdout:1/514: fsync d6/d21/d2d/d3b/d87/f8d 0 2026-03-09T00:03:52.479 INFO:tasks.workunit.client.1.vm06.stdout:1/515: chown d6/d21/d2d/d3b/l5e 75904 1 2026-03-09T00:03:52.488 INFO:tasks.workunit.client.0.vm03.stdout:7/350: symlink d2/d1f/d42/d43/l69 0 2026-03-09T00:03:52.488 INFO:tasks.workunit.client.0.vm03.stdout:7/351: chown d2/d1f/l20 1678 1 2026-03-09T00:03:52.501 INFO:tasks.workunit.client.0.vm03.stdout:4/407: rename d7/d20/l42 to d7/d20/d6a/d77/d25/l7d 0 2026-03-09T00:03:52.504 INFO:tasks.workunit.client.1.vm06.stdout:2/664: creat d7/da/d4e/d57/fc5 x:0 0 0 2026-03-09T00:03:52.504 INFO:tasks.workunit.client.0.vm03.stdout:5/387: symlink d1c/d20/d55/d4f/d58/l7e 0 2026-03-09T00:03:52.507 INFO:tasks.workunit.client.1.vm06.stdout:6/594: dwrite d4/f68 [0,4194304] 0 2026-03-09T00:03:52.507 INFO:tasks.workunit.client.1.vm06.stdout:6/595: fdatasync d4/ff 0 2026-03-09T00:03:52.507 INFO:tasks.workunit.client.1.vm06.stdout:6/596: creat d4/d27/d3e/d78/fbc x:0 0 0 2026-03-09T00:03:52.507 INFO:tasks.workunit.client.1.vm06.stdout:6/597: fsync d4/d27/d3e/f44 0 2026-03-09T00:03:52.507 INFO:tasks.workunit.client.1.vm06.stdout:6/598: write d4/f6e [1009211,85412] 0 2026-03-09T00:03:52.507 INFO:tasks.workunit.client.1.vm06.stdout:6/599: stat d4/d8d/cb0 0 2026-03-09T00:03:52.508 INFO:tasks.workunit.client.1.vm06.stdout:9/496: dwrite d1/d4/d6e/f5d [0,4194304] 0 2026-03-09T00:03:52.510 INFO:tasks.workunit.client.0.vm03.stdout:3/278: dread d2/db/d2d/f2f [0,4194304] 0 2026-03-09T00:03:52.510 INFO:tasks.workunit.client.0.vm03.stdout:5/388: dread d1c/f1f [0,4194304] 0 2026-03-09T00:03:52.510 INFO:tasks.workunit.client.0.vm03.stdout:5/389: creat d1c/d20/d55/d3b/f7f x:0 0 0 2026-03-09T00:03:52.510 INFO:tasks.workunit.client.0.vm03.stdout:5/390: dread - d1c/d20/d55/f7d zero size 2026-03-09T00:03:52.513 INFO:tasks.workunit.client.1.vm06.stdout:1/516: dread d6/d21/d2d/f74 [0,4194304] 0 2026-03-09T00:03:52.513 INFO:tasks.workunit.client.1.vm06.stdout:1/517: chown d6/d21/d2d/d3b/d87/l94 8271880 1 2026-03-09T00:03:52.517 INFO:tasks.workunit.client.0.vm03.stdout:5/391: write d1c/d20/f4e [1507200,72585] 0 2026-03-09T00:03:52.517 INFO:tasks.workunit.client.0.vm03.stdout:5/392: creat d1c/d20/d55/d66/d70/f80 x:0 0 0 2026-03-09T00:03:52.520 INFO:tasks.workunit.client.0.vm03.stdout:8/328: dwrite d7/f34 [0,4194304] 0 2026-03-09T00:03:52.532 INFO:tasks.workunit.client.0.vm03.stdout:8/329: dread - d7/df/d1e/d3f/f47 zero size 2026-03-09T00:03:52.532 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:52 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:03:52.546 INFO:tasks.workunit.client.0.vm03.stdout:8/330: write d7/df/f37 [1533376,110404] 0 2026-03-09T00:03:52.546 INFO:tasks.workunit.client.0.vm03.stdout:8/331: write d7/f9 [2811442,10875] 0 2026-03-09T00:03:52.547 INFO:tasks.workunit.client.0.vm03.stdout:6/331: link d13/c26 d13/d1e/d44/d4a/c73 0 2026-03-09T00:03:52.563 INFO:tasks.workunit.client.1.vm06.stdout:0/585: dwrite d3/d18/d1f/f5e [0,4194304] 0 2026-03-09T00:03:52.569 INFO:tasks.workunit.client.1.vm06.stdout:0/586: dread d3/d18/d1f/d39/d49/f4b [0,4194304] 0 2026-03-09T00:03:52.569 INFO:tasks.workunit.client.1.vm06.stdout:0/587: write d3/d18/d2c/d2d/d74/fbc [633676,89462] 0 2026-03-09T00:03:52.572 INFO:tasks.workunit.client.1.vm06.stdout:5/696: dwrite d5/d44/d4b/f6c [0,4194304] 0 2026-03-09T00:03:52.582 
INFO:tasks.workunit.client.0.vm03.stdout:4/408: rename d7/d20/d29/d4e/f60 to d7/f7e 0 2026-03-09T00:03:52.600 INFO:tasks.workunit.client.1.vm06.stdout:3/631: getdents d11/d28/d2e/db2 0 2026-03-09T00:03:52.605 INFO:tasks.workunit.client.1.vm06.stdout:6/600: unlink d4/d16/d46/f93 0 2026-03-09T00:03:52.605 INFO:tasks.workunit.client.0.vm03.stdout:3/279: symlink d2/db/d40/d44/l53 0 2026-03-09T00:03:52.605 INFO:tasks.workunit.client.0.vm03.stdout:3/280: truncate d2/f4b 580146 0 2026-03-09T00:03:52.605 INFO:tasks.workunit.client.0.vm03.stdout:3/281: stat d2/db/d40/d44 0 2026-03-09T00:03:52.605 INFO:tasks.workunit.client.0.vm03.stdout:5/393: symlink d1c/d20/d55/d66/d6b/l81 0 2026-03-09T00:03:52.605 INFO:tasks.workunit.client.0.vm03.stdout:5/394: chown f15 4 1 2026-03-09T00:03:52.606 INFO:tasks.workunit.client.0.vm03.stdout:1/447: write f2 [3751957,4866] 0 2026-03-09T00:03:52.606 INFO:tasks.workunit.client.1.vm06.stdout:6/601: dread d4/f12 [4194304,4194304] 0 2026-03-09T00:03:52.606 INFO:tasks.workunit.client.1.vm06.stdout:6/602: symlink d4/d27/d42/da6/dbb/lbd 0 2026-03-09T00:03:52.606 INFO:tasks.workunit.client.0.vm03.stdout:2/320: rmdir d8 39 2026-03-09T00:03:52.613 INFO:tasks.workunit.client.1.vm06.stdout:3/632: mkdir d11/d28/d2e/d2f/d5b/d94/ddd 0 2026-03-09T00:03:52.613 INFO:tasks.workunit.client.1.vm06.stdout:3/633: chown d11/d28/d2e/l9e 619152794 1 2026-03-09T00:03:52.614 INFO:tasks.workunit.client.0.vm03.stdout:8/332: symlink d7/df/d1a/d2b/l65 0 2026-03-09T00:03:52.621 INFO:tasks.workunit.client.0.vm03.stdout:6/332: mkdir d13/d35/d74 0 2026-03-09T00:03:52.626 INFO:tasks.workunit.client.1.vm06.stdout:3/634: symlink d11/d28/d2e/d2f/d5b/lde 0 2026-03-09T00:03:52.627 INFO:tasks.workunit.client.0.vm03.stdout:4/409: creat d7/d20/d6a/d77/d25/f7f x:0 0 0 2026-03-09T00:03:52.627 INFO:tasks.workunit.client.0.vm03.stdout:4/410: write d7/f62 [1865130,14961] 0 2026-03-09T00:03:52.634 INFO:tasks.workunit.client.1.vm06.stdout:3/635: symlink d11/d28/d2e/d2f/ldf 0 2026-03-09T00:03:52.634 INFO:tasks.workunit.client.1.vm06.stdout:3/636: mknod d11/d28/d2e/d2f/ce0 0 2026-03-09T00:03:52.635 INFO:tasks.workunit.client.1.vm06.stdout:3/637: write d11/d28/d2e/d2f/d36/f59 [790822,103871] 0 2026-03-09T00:03:52.635 INFO:tasks.workunit.client.1.vm06.stdout:3/638: write d11/d28/d2e/d2f/fa3 [980253,121587] 0 2026-03-09T00:03:52.647 INFO:tasks.workunit.client.0.vm03.stdout:5/395: mknod d1c/d67/c82 0 2026-03-09T00:03:52.665 INFO:tasks.workunit.client.1.vm06.stdout:3/639: rename d11/d28/d4d/d89/c8e to d11/d28/d2e/d2f/d5b/d5f/d91/ce1 0 2026-03-09T00:03:52.665 INFO:tasks.workunit.client.1.vm06.stdout:7/614: dwrite d0/df/d1a/d3a/d4e/f63 [0,4194304] 0 2026-03-09T00:03:52.665 INFO:tasks.workunit.client.1.vm06.stdout:7/615: truncate d0/d55/d99/fac 12720 0 2026-03-09T00:03:52.665 INFO:tasks.workunit.client.0.vm03.stdout:1/448: mkdir d4/d3a/d3d/d98 0 2026-03-09T00:03:52.665 INFO:tasks.workunit.client.0.vm03.stdout:2/321: creat d8/d17/f68 x:0 0 0 2026-03-09T00:03:52.665 INFO:tasks.workunit.client.0.vm03.stdout:4/411: mknod d7/d20/d29/d54/d58/c80 0 2026-03-09T00:03:52.665 INFO:tasks.workunit.client.0.vm03.stdout:4/412: readlink d7/d20/d29/d54/l61 0 2026-03-09T00:03:52.665 INFO:tasks.workunit.client.0.vm03.stdout:4/413: write d7/d20/d29/d38/d3a/f65 [384230,23792] 0 2026-03-09T00:03:52.665 INFO:tasks.workunit.client.0.vm03.stdout:4/414: chown d7/d20/d6a/d77/d25/l2e 34564 1 2026-03-09T00:03:52.666 INFO:tasks.workunit.client.0.vm03.stdout:4/415: truncate d7/d20/d6a/d77/d25/f3e 8482121 0 2026-03-09T00:03:52.666 
INFO:tasks.workunit.client.0.vm03.stdout:4/416: creat d7/f81 x:0 0 0 2026-03-09T00:03:52.666 INFO:tasks.workunit.client.0.vm03.stdout:5/396: creat d1c/d20/d55/d66/f83 x:0 0 0 2026-03-09T00:03:52.666 INFO:tasks.workunit.client.0.vm03.stdout:1/449: symlink d4/d3a/d61/d78/l99 0 2026-03-09T00:03:52.666 INFO:tasks.workunit.client.0.vm03.stdout:2/322: creat d8/d26/d5e/d5f/f69 x:0 0 0 2026-03-09T00:03:52.666 INFO:tasks.workunit.client.0.vm03.stdout:4/417: link d7/d20/d29/f53 d7/d20/d6a/d77/f82 0 2026-03-09T00:03:52.666 INFO:tasks.workunit.client.0.vm03.stdout:5/397: creat d1c/d20/d56/d74/f84 x:0 0 0 2026-03-09T00:03:52.666 INFO:tasks.workunit.client.0.vm03.stdout:4/418: link d7/d20/d29/d54/d58/f6b d7/d20/d6a/d77/f83 0 2026-03-09T00:03:52.667 INFO:tasks.workunit.client.0.vm03.stdout:2/323: mknod d8/d26/d5e/d5f/d4b/d50/c6a 0 2026-03-09T00:03:52.673 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:52 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:03:52.673 INFO:tasks.workunit.client.0.vm03.stdout:4/419: creat d7/d20/d35/f84 x:0 0 0 2026-03-09T00:03:52.673 INFO:tasks.workunit.client.0.vm03.stdout:4/420: readlink d7/d20/d29/d4e/l5c 0 2026-03-09T00:03:52.673 INFO:tasks.workunit.client.0.vm03.stdout:4/421: chown d7/d20/l7b 37720 1 2026-03-09T00:03:52.673 INFO:tasks.workunit.client.0.vm03.stdout:4/422: write d7/f81 [146360,114196] 0 2026-03-09T00:03:52.673 INFO:tasks.workunit.client.0.vm03.stdout:4/423: write d7/d20/d29/d38/f6e [4618584,8108] 0 2026-03-09T00:03:52.673 INFO:tasks.workunit.client.0.vm03.stdout:4/424: truncate d7/f7e 7552143 0 2026-03-09T00:03:52.676 INFO:tasks.workunit.client.0.vm03.stdout:1/450: read d4/d3a/d43/f47 [6169775,44729] 0 2026-03-09T00:03:52.687 INFO:tasks.workunit.client.1.vm06.stdout:1/518: dwrite d6/d63/f99 [0,4194304] 0 2026-03-09T00:03:52.687 INFO:tasks.workunit.client.0.vm03.stdout:7/352: dwrite d2/d1f/d35/f5a [0,4194304] 0 2026-03-09T00:03:52.687 INFO:tasks.workunit.client.1.vm06.stdout:1/519: read d6/d21/d2d/d3b/d87/f9e [722655,1369] 0 2026-03-09T00:03:52.691 INFO:tasks.workunit.client.0.vm03.stdout:4/425: mkdir d7/d20/d29/d54/d58/d85 0 2026-03-09T00:03:52.692 INFO:tasks.workunit.client.0.vm03.stdout:1/451: creat d4/d6/d52/f9a x:0 0 0 2026-03-09T00:03:52.695 INFO:tasks.workunit.client.1.vm06.stdout:1/520: write d6/d21/d2d/d37/f77 [7874811,23974] 0 2026-03-09T00:03:52.701 INFO:tasks.workunit.client.0.vm03.stdout:7/353: mknod d2/d1f/d3a/d24/c6a 0 2026-03-09T00:03:52.704 INFO:tasks.workunit.client.1.vm06.stdout:1/521: dread d6/f1d [0,4194304] 0 2026-03-09T00:03:52.704 INFO:tasks.workunit.client.1.vm06.stdout:1/522: chown d6/d4c/d79/f59 218330 1 2026-03-09T00:03:52.705 INFO:tasks.workunit.client.1.vm06.stdout:1/523: truncate d6/d63/f9c 773802 0 2026-03-09T00:03:52.705 INFO:tasks.workunit.client.1.vm06.stdout:1/524: chown d6/d21/da6/la9 814306 1 2026-03-09T00:03:52.705 INFO:tasks.workunit.client.1.vm06.stdout:1/525: creat d6/d21/d2d/d3b/fae x:0 0 0 2026-03-09T00:03:52.707 INFO:tasks.workunit.client.1.vm06.stdout:0/588: dwrite d3/d18/d28/d45/fa9 [0,4194304] 0 2026-03-09T00:03:52.707 INFO:tasks.workunit.client.0.vm03.stdout:0/334: dwrite d2/da/dd/f38 [0,4194304] 0 2026-03-09T00:03:52.716 INFO:tasks.workunit.client.0.vm03.stdout:7/354: mkdir d2/d1f/d40/d67/d6b 0 2026-03-09T00:03:52.731 INFO:tasks.workunit.client.0.vm03.stdout:0/335: mkdir d2/da/d76 0 2026-03-09T00:03:52.731 INFO:tasks.workunit.client.1.vm06.stdout:1/526: unlink l2 0 
2026-03-09T00:03:52.731 INFO:tasks.workunit.client.1.vm06.stdout:0/589: mknod d3/d18/cc4 0 2026-03-09T00:03:52.731 INFO:tasks.workunit.client.1.vm06.stdout:1/527: symlink d6/d21/d2d/d3b/d87/laf 0 2026-03-09T00:03:52.731 INFO:tasks.workunit.client.1.vm06.stdout:0/590: symlink d3/d18/d2c/d2d/d8c/lc5 0 2026-03-09T00:03:52.732 INFO:tasks.workunit.client.0.vm03.stdout:0/336: creat d2/d71/f77 x:0 0 0 2026-03-09T00:03:52.733 INFO:tasks.workunit.client.1.vm06.stdout:0/591: rename d3/c4 to d3/d18/d28/cc6 0 2026-03-09T00:03:52.734 INFO:tasks.workunit.client.1.vm06.stdout:0/592: fsync d3/d18/d2c/f6b 0 2026-03-09T00:03:52.740 INFO:tasks.workunit.client.1.vm06.stdout:6/603: dwrite d4/d27/d3e/d78/f91 [4194304,4194304] 0 2026-03-09T00:03:52.742 INFO:tasks.workunit.client.1.vm06.stdout:6/604: mknod d4/d27/d42/d52/d7d/cbe 0 2026-03-09T00:03:52.742 INFO:tasks.workunit.client.1.vm06.stdout:6/605: rename d4/d27/d3e to d4/d27/d3e/d78/dbf 22 2026-03-09T00:03:52.744 INFO:tasks.workunit.client.1.vm06.stdout:6/606: link d4/l43 d4/d16/d46/lc0 0 2026-03-09T00:03:52.744 INFO:tasks.workunit.client.1.vm06.stdout:6/607: write d4/d16/d53/fb7 [696914,130587] 0 2026-03-09T00:03:52.745 INFO:tasks.workunit.client.1.vm06.stdout:6/608: mknod d4/d27/d3e/d57/cc1 0 2026-03-09T00:03:52.745 INFO:tasks.workunit.client.1.vm06.stdout:6/609: creat d4/d27/fc2 x:0 0 0 2026-03-09T00:03:52.745 INFO:tasks.workunit.client.1.vm06.stdout:6/610: dread d4/d27/d42/da6/fb3 [0,4194304] 0 2026-03-09T00:03:52.747 INFO:tasks.workunit.client.1.vm06.stdout:6/611: rename d4/d27/fc2 to d4/d27/d3e/fc3 0 2026-03-09T00:03:52.748 INFO:tasks.workunit.client.1.vm06.stdout:6/612: write d4/f6e [658367,20569] 0 2026-03-09T00:03:52.748 INFO:tasks.workunit.client.1.vm06.stdout:6/613: creat d4/d16/d46/fc4 x:0 0 0 2026-03-09T00:03:52.748 INFO:tasks.workunit.client.1.vm06.stdout:6/614: readlink d4/d16/d53/d67/l8c 0 2026-03-09T00:03:52.750 INFO:tasks.workunit.client.1.vm06.stdout:6/615: rename d4/d27/c3f to d4/d16/d46/cc5 0 2026-03-09T00:03:52.751 INFO:tasks.workunit.client.1.vm06.stdout:6/616: write d4/d27/d42/d52/d7d/f9d [1469262,28883] 0 2026-03-09T00:03:52.751 INFO:tasks.workunit.client.1.vm06.stdout:6/617: truncate d4/d27/f84 737128 0 2026-03-09T00:03:52.755 INFO:tasks.workunit.client.1.vm06.stdout:6/618: dread d4/f68 [0,4194304] 0 2026-03-09T00:03:52.756 INFO:tasks.workunit.client.1.vm06.stdout:6/619: link d4/d27/d3e/f44 d4/d27/d42/da6/fc6 0 2026-03-09T00:03:52.763 INFO:tasks.workunit.client.1.vm06.stdout:3/640: dwrite d11/d28/d2e/d7e/fdc [0,4194304] 0 2026-03-09T00:03:52.763 INFO:tasks.workunit.client.1.vm06.stdout:3/641: chown d11/d28/d4d/d89 1309550603 1 2026-03-09T00:03:52.763 INFO:tasks.workunit.client.1.vm06.stdout:3/642: chown d11/d28/d2e/f38 3077996 1 2026-03-09T00:03:52.765 INFO:tasks.workunit.client.1.vm06.stdout:5/697: dwrite d5/d44/d4b/d92/f52 [0,4194304] 0 2026-03-09T00:03:52.772 INFO:tasks.workunit.client.0.vm03.stdout:8/333: dwrite d7/df/d1a/d40/f4d [0,4194304] 0 2026-03-09T00:03:52.789 INFO:tasks.workunit.client.0.vm03.stdout:1/452: dwrite d4/d15/f17 [0,4194304] 0 2026-03-09T00:03:52.789 INFO:tasks.workunit.client.1.vm06.stdout:7/616: dwrite d0/df/d1a/d3a/d4e/d5e/f73 [0,4194304] 0 2026-03-09T00:03:52.819 INFO:tasks.workunit.client.1.vm06.stdout:7/617: dwrite d0/df/d1a/f25 [4194304,4194304] 0 2026-03-09T00:03:52.819 INFO:tasks.workunit.client.1.vm06.stdout:7/618: truncate d0/df/d1a/d27/d4c/d40/fa5 1597640 0 2026-03-09T00:03:52.819 INFO:tasks.workunit.client.1.vm06.stdout:7/619: write d0/f14 [5244339,73426] 0 2026-03-09T00:03:52.819 
INFO:tasks.workunit.client.1.vm06.stdout:7/620: chown d0/d39/c7f 1831084 1 2026-03-09T00:03:52.820 INFO:tasks.workunit.client.1.vm06.stdout:7/621: write d0/d55/d99/fac [720479,31994] 0 2026-03-09T00:03:52.820 INFO:tasks.workunit.client.1.vm06.stdout:7/622: write d0/fe [3332134,13886] 0 2026-03-09T00:03:52.822 INFO:tasks.workunit.client.1.vm06.stdout:3/643: unlink d11/d28/d4d/l82 0 2026-03-09T00:03:52.833 INFO:tasks.workunit.client.0.vm03.stdout:8/334: dread d7/f49 [0,4194304] 0 2026-03-09T00:03:52.895 INFO:tasks.workunit.client.0.vm03.stdout:1/453: rmdir d4/d3a/d84 39 2026-03-09T00:03:52.901 INFO:tasks.workunit.client.1.vm06.stdout:5/698: link d5/d44/d4b/fe1 d5/d1c/d21/d28/d5e/d66/d78/da6/fef 0 2026-03-09T00:03:52.906 INFO:tasks.workunit.client.1.vm06.stdout:7/623: unlink d0/df/d1a/d27/d4c/d40/d51/ca8 0 2026-03-09T00:03:52.906 INFO:tasks.workunit.client.1.vm06.stdout:7/624: chown d0/l1c 241259823 1 2026-03-09T00:03:52.912 INFO:tasks.workunit.client.1.vm06.stdout:3/644: rename d11/d3f/fcd to d11/d28/d4d/d9b/fe2 0 2026-03-09T00:03:52.919 INFO:tasks.workunit.client.1.vm06.stdout:5/699: truncate d5/f8e 3844640 0 2026-03-09T00:03:52.928 INFO:tasks.workunit.client.1.vm06.stdout:7/625: creat d0/fb1 x:0 0 0 2026-03-09T00:03:52.929 INFO:tasks.workunit.client.1.vm06.stdout:3/645: read d11/d28/d4d/d9b/f9d [113117,127994] 0 2026-03-09T00:03:52.930 INFO:tasks.workunit.client.1.vm06.stdout:7/626: write d0/df/d1a/f50 [505727,43649] 0 2026-03-09T00:03:52.930 INFO:tasks.workunit.client.1.vm06.stdout:7/627: fdatasync d0/df/d1a/d27/d4c/f6d 0 2026-03-09T00:03:52.930 INFO:tasks.workunit.client.1.vm06.stdout:3/646: dread d11/d28/f6b [0,4194304] 0 2026-03-09T00:03:52.930 INFO:tasks.workunit.client.1.vm06.stdout:3/647: chown f7 982671 1 2026-03-09T00:03:52.930 INFO:tasks.workunit.client.1.vm06.stdout:3/648: fsync d11/d28/d2e/d2f/d5b/d5f/f60 0 2026-03-09T00:03:52.941 INFO:tasks.workunit.client.1.vm06.stdout:5/700: mkdir d5/d1c/d21/d28/d5e/d66/d78/dc8/daa/df0 0 2026-03-09T00:03:52.944 INFO:tasks.workunit.client.1.vm06.stdout:3/649: rename d11/d28/d2e/fb8 to d11/d28/d4d/d9b/fe3 0 2026-03-09T00:03:52.947 INFO:tasks.workunit.client.1.vm06.stdout:3/650: symlink d11/d28/d2e/d2f/d5b/d94/le4 0 2026-03-09T00:03:52.959 INFO:tasks.workunit.client.0.vm03.stdout:2/324: rename d8/d26/d5e/d5f/d4b to d8/d1b/d2a/d6b 0 2026-03-09T00:03:52.959 INFO:tasks.workunit.client.0.vm03.stdout:2/325: read - d8/d1b/d2a/d6b/d50/f63 zero size 2026-03-09T00:03:52.960 INFO:tasks.workunit.client.0.vm03.stdout:7/355: rename d2/d1f/d3a/d31 to d2/d4/d1e/d5e/d6c 0 2026-03-09T00:03:52.960 INFO:tasks.workunit.client.0.vm03.stdout:7/356: read d2/d4/f2e [269077,44892] 0 2026-03-09T00:03:52.961 INFO:tasks.workunit.client.0.vm03.stdout:2/326: mkdir d8/d1b/d6c 0 2026-03-09T00:03:52.961 INFO:tasks.workunit.client.0.vm03.stdout:2/327: write d8/d1b/f32 [1590787,6587] 0 2026-03-09T00:03:52.962 INFO:tasks.workunit.client.0.vm03.stdout:2/328: read d8/d17/f27 [191172,11228] 0 2026-03-09T00:03:52.963 INFO:tasks.workunit.client.0.vm03.stdout:0/337: rename d2/da/dd/f14 to d2/da/dd/d49/d6c/d4b/d55/f78 0 2026-03-09T00:03:52.963 INFO:tasks.workunit.client.0.vm03.stdout:0/338: fsync d2/da/dd/d49/d6c/f41 0 2026-03-09T00:03:52.966 INFO:tasks.workunit.client.0.vm03.stdout:8/335: rename d7/f49 to d7/df/d1e/f66 0 2026-03-09T00:03:52.966 INFO:tasks.workunit.client.0.vm03.stdout:8/336: stat d7/df/c20 0 2026-03-09T00:03:52.966 INFO:tasks.workunit.client.0.vm03.stdout:8/337: chown d7/f3c 2126835 1 2026-03-09T00:03:52.966 INFO:tasks.workunit.client.0.vm03.stdout:8/338: chown d7/f34 
64999513 1
2026-03-09T00:03:52.966 INFO:tasks.workunit.client.0.vm03.stdout:8/339: chown d7/df/l19 2300 1
2026-03-09T00:03:52.970 INFO:tasks.workunit.client.0.vm03.stdout:8/340: unlink d7/df/d1e/l21 0
2026-03-09T00:03:52.971 INFO:tasks.workunit.client.0.vm03.stdout:2/329: dread d8/d1b/d2a/f4c [0,4194304] 0
2026-03-09T00:03:52.971 INFO:tasks.workunit.client.0.vm03.stdout:2/330: fsync d8/d26/d5e/d5f/f48 0
2026-03-09T00:03:52.975 INFO:tasks.workunit.client.0.vm03.stdout:8/341: unlink d7/f3c 0
2026-03-09T00:03:52.976 INFO:tasks.workunit.client.0.vm03.stdout:8/342: creat d7/f67 x:0 0 0
2026-03-09T00:03:52.980 INFO:tasks.workunit.client.0.vm03.stdout:2/331: mknod d8/d1b/d2a/d6b/d50/c6d 0
2026-03-09T00:03:52.980 INFO:tasks.workunit.client.0.vm03.stdout:2/332: creat d8/d17/f6e x:0 0 0
2026-03-09T00:03:52.980 INFO:tasks.workunit.client.0.vm03.stdout:2/333: write d8/d1b/d2a/d6b/d50/f54 [18155,97869] 0
2026-03-09T00:03:52.981 INFO:tasks.workunit.client.0.vm03.stdout:8/343: write d7/f34 [109225,107712] 0
2026-03-09T00:03:52.982 INFO:tasks.workunit.client.1.vm06.stdout:8/572: sync
2026-03-09T00:03:52.992 INFO:tasks.workunit.client.1.vm06.stdout:8/573: creat db/d1e/d9b/fbc x:0 0 0
2026-03-09T00:03:52.993 INFO:tasks.workunit.client.1.vm06.stdout:1/528: rmdir d6/d21/d2d/d3b/d87 39
2026-03-09T00:03:52.993 INFO:tasks.workunit.client.1.vm06.stdout:1/529: chown d6/d21/da6 1422762 1
2026-03-09T00:03:52.995 INFO:tasks.workunit.client.0.vm03.stdout:8/344: symlink d7/df/d1e/d38/l68 0
2026-03-09T00:03:52.998 INFO:tasks.workunit.client.0.vm03.stdout:8/345: getdents d7/df/d1e/d3f 0
2026-03-09T00:03:52.998 INFO:tasks.workunit.client.0.vm03.stdout:8/346: creat d7/df/d1a/d40/f69 x:0 0 0
2026-03-09T00:03:53.001 INFO:tasks.workunit.client.1.vm06.stdout:8/574: read db/f28 [470485,126751] 0
2026-03-09T00:03:53.013 INFO:tasks.workunit.client.1.vm06.stdout:8/575: creat db/d53/fbd x:0 0 0
2026-03-09T00:03:53.013 INFO:tasks.workunit.client.1.vm06.stdout:8/576: read - db/d53/d6d/d7b/f8a zero size
2026-03-09T00:03:53.013 INFO:tasks.workunit.client.1.vm06.stdout:8/577: fsync db/dd/f7a 0
2026-03-09T00:03:53.013 INFO:tasks.workunit.client.1.vm06.stdout:8/578: chown db/d53/d7c/d8f 1906 1
2026-03-09T00:03:53.015 INFO:tasks.workunit.client.1.vm06.stdout:1/530: mkdir d6/db0 0
2026-03-09T00:03:53.016 INFO:tasks.workunit.client.1.vm06.stdout:6/620: fsync d4/d27/d3e/f44 0
2026-03-09T00:03:53.016 INFO:tasks.workunit.client.0.vm03.stdout:8/347: symlink d7/df/d1e/d38/d4c/l6a 0
2026-03-09T00:03:53.016 INFO:tasks.workunit.client.0.vm03.stdout:2/334: dwrite d8/f59 [0,4194304] 0
2026-03-09T00:03:53.016 INFO:tasks.workunit.client.0.vm03.stdout:2/335: getdents d8/d1b/d6c 0
2026-03-09T00:03:53.023 INFO:tasks.workunit.client.1.vm06.stdout:8/579: rename f7 to db/dd/d24/d63/fbe 0
2026-03-09T00:03:53.027 INFO:tasks.workunit.client.0.vm03.stdout:2/336: dread d8/f15 [0,4194304] 0
2026-03-09T00:03:53.027 INFO:tasks.workunit.client.0.vm03.stdout:2/337: write d8/d1b/f30 [1632039,4256] 0
2026-03-09T00:03:53.030 INFO:tasks.workunit.client.1.vm06.stdout:0/593: dwrite d3/f1b [0,4194304] 0
2026-03-09T00:03:53.031 INFO:tasks.workunit.client.1.vm06.stdout:1/531: symlink d6/d21/lb1 0
2026-03-09T00:03:53.031 INFO:tasks.workunit.client.1.vm06.stdout:1/532: dread - d6/f98 zero size
2026-03-09T00:03:53.032 INFO:tasks.workunit.client.1.vm06.stdout:1/533: getdents d6/d4c/d71 0
2026-03-09T00:03:53.032 INFO:tasks.workunit.client.1.vm06.stdout:1/534: creat d6/d4c/d79/fb2 x:0 0 0
2026-03-09T00:03:53.032 INFO:tasks.workunit.client.1.vm06.stdout:1/535: fsync d6/d21/d2d/d3b/fae 0
2026-03-09T00:03:53.032 INFO:tasks.workunit.client.1.vm06.stdout:1/536: write d6/f1b [1657950,29643] 0
2026-03-09T00:03:53.040 INFO:tasks.workunit.client.0.vm03.stdout:2/338: unlink d8/f21 0
2026-03-09T00:03:53.040 INFO:tasks.workunit.client.0.vm03.stdout:2/339: fsync d8/f11 0
2026-03-09T00:03:53.041 INFO:tasks.workunit.client.1.vm06.stdout:0/594: write d3/d18/d2c/d2d/f46 [3271374,2698] 0
2026-03-09T00:03:53.045 INFO:tasks.workunit.client.0.vm03.stdout:2/340: mkdir d8/d26/d5e/d6f 0
2026-03-09T00:03:53.046 INFO:tasks.workunit.client.1.vm06.stdout:1/537: write d6/d21/d2d/d37/f77 [1275421,35626] 0
2026-03-09T00:03:53.052 INFO:tasks.workunit.client.1.vm06.stdout:0/595: dread d3/d18/d28/d45/f48 [0,4194304] 0
2026-03-09T00:03:53.052 INFO:tasks.workunit.client.1.vm06.stdout:0/596: chown d3/d18/d1f/d39/d3b/ca1 25505 1
2026-03-09T00:03:53.056 INFO:tasks.workunit.client.1.vm06.stdout:9/497: sync
2026-03-09T00:03:53.056 INFO:tasks.workunit.client.1.vm06.stdout:9/498: write d1/d4/d6e/d9/f4c [3515623,106419] 0
2026-03-09T00:03:53.056 INFO:tasks.workunit.client.1.vm06.stdout:9/499: chown d1/d3/d4f/d52 224242 1
2026-03-09T00:03:53.056 INFO:tasks.workunit.client.1.vm06.stdout:9/500: write d1/d4/d6e/f2c [844661,114431] 0
2026-03-09T00:03:53.056 INFO:tasks.workunit.client.1.vm06.stdout:2/665: sync
2026-03-09T00:03:53.056 INFO:tasks.workunit.client.1.vm06.stdout:4/543: sync
2026-03-09T00:03:53.056 INFO:tasks.workunit.client.1.vm06.stdout:4/544: chown d17/d24/d3b/d54/fa5 978 1
2026-03-09T00:03:53.063 INFO:tasks.workunit.client.1.vm06.stdout:6/621: truncate d4/f5 6037555 0
2026-03-09T00:03:53.063 INFO:tasks.workunit.client.1.vm06.stdout:6/622: fdatasync d4/d27/d42/d52/f6c 0
2026-03-09T00:03:53.063 INFO:tasks.workunit.client.1.vm06.stdout:6/623: write d4/f3d [4075723,94079] 0
2026-03-09T00:03:53.069 INFO:tasks.workunit.client.1.vm06.stdout:0/597: mkdir d3/d18/d2c/d2d/d74/dc7 0
2026-03-09T00:03:53.069 INFO:tasks.workunit.client.1.vm06.stdout:0/598: fsync d3/d18/d2c/d2d/d74/d90/fac 0
2026-03-09T00:03:53.076 INFO:tasks.workunit.client.1.vm06.stdout:9/501: mknod d1/d3/d4f/d52/c9d 0
2026-03-09T00:03:53.076 INFO:tasks.workunit.client.1.vm06.stdout:2/666: mknod d7/da/db/cc6 0
2026-03-09T00:03:53.079 INFO:tasks.workunit.client.1.vm06.stdout:4/545: rename d17/d21/c52 to d17/d24/d3b/d5e/d7a/cb6 0
2026-03-09T00:03:53.079 INFO:tasks.workunit.client.1.vm06.stdout:4/546: readlink d17/l37 0
2026-03-09T00:03:53.083 INFO:tasks.workunit.client.0.vm03.stdout:2/341: dwrite d8/d26/f5a [0,4194304] 0
2026-03-09T00:03:53.093 INFO:tasks.workunit.client.1.vm06.stdout:6/624: symlink d4/d16/d46/d90/lc7 0
2026-03-09T00:03:53.098 INFO:tasks.workunit.client.0.vm03.stdout:2/342: write d8/d1b/d24/f46 [2207389,94423] 0
2026-03-09T00:03:53.100 INFO:tasks.workunit.client.0.vm03.stdout:8/348: dwrite d7/df/d1e/d38/f3e [0,4194304] 0
2026-03-09T00:03:53.100 INFO:tasks.workunit.client.0.vm03.stdout:8/349: read - d7/df/d1a/f4f zero size
2026-03-09T00:03:53.103 INFO:tasks.workunit.client.1.vm06.stdout:3/651: sync
2026-03-09T00:03:53.103 INFO:tasks.workunit.client.1.vm06.stdout:3/652: readlink d11/d28/l30 0
2026-03-09T00:03:53.103 INFO:tasks.workunit.client.1.vm06.stdout:5/701: sync
2026-03-09T00:03:53.103 INFO:tasks.workunit.client.1.vm06.stdout:5/702: truncate d5/d1c/d23/f5b 638749 0
2026-03-09T00:03:53.103 INFO:tasks.workunit.client.1.vm06.stdout:7/628: sync
2026-03-09T00:03:53.103 INFO:tasks.workunit.client.1.vm06.stdout:7/629: write d0/df/d1a/d27/f60 [20162,51116] 0
2026-03-09T00:03:53.103 INFO:tasks.workunit.client.1.vm06.stdout:7/630: chown d0/df/c6e 598266 1
2026-03-09T00:03:53.112 INFO:tasks.workunit.client.0.vm03.stdout:6/333: sync
2026-03-09T00:03:53.112 INFO:tasks.workunit.client.0.vm03.stdout:6/334: dread - d13/d1e/d44/d59/f6e zero size
2026-03-09T00:03:53.112 INFO:tasks.workunit.client.0.vm03.stdout:1/454: sync
2026-03-09T00:03:53.112 INFO:tasks.workunit.client.0.vm03.stdout:5/398: sync
2026-03-09T00:03:53.112 INFO:tasks.workunit.client.0.vm03.stdout:9/388: sync
2026-03-09T00:03:53.112 INFO:tasks.workunit.client.0.vm03.stdout:4/426: sync
2026-03-09T00:03:53.112 INFO:tasks.workunit.client.0.vm03.stdout:3/282: sync
2026-03-09T00:03:53.114 INFO:tasks.workunit.client.1.vm06.stdout:9/502: stat d1/d73/c7d 0
2026-03-09T00:03:53.129 INFO:tasks.workunit.client.1.vm06.stdout:2/667: mknod d7/d1a/d25/d97/cc7 0
2026-03-09T00:03:53.132 INFO:tasks.workunit.client.0.vm03.stdout:5/399: write d1c/f30 [1706441,7782] 0
2026-03-09T00:03:53.141 INFO:tasks.workunit.client.0.vm03.stdout:0/339: getdents d2/da/dd/d49/d6c/d4b/d55 0
2026-03-09T00:03:53.148 INFO:tasks.workunit.client.0.vm03.stdout:8/350: mkdir d7/df/d6b 0
2026-03-09T00:03:53.148 INFO:tasks.workunit.client.0.vm03.stdout:8/351: write d7/f48 [285011,68694] 0
2026-03-09T00:03:53.148 INFO:tasks.workunit.client.0.vm03.stdout:8/352: fdatasync d7/f15 0
2026-03-09T00:03:53.149 INFO:tasks.workunit.client.1.vm06.stdout:4/547: mkdir d17/d24/d3b/d97/db7 0
2026-03-09T00:03:53.152 INFO:tasks.workunit.client.0.vm03.stdout:6/335: rename d13/f4d to d13/d1e/d44/d4a/f75 0
2026-03-09T00:03:53.152 INFO:tasks.workunit.client.0.vm03.stdout:6/336: write d13/d1e/f3e [348428,41469] 0
2026-03-09T00:03:53.154 INFO:tasks.workunit.client.1.vm06.stdout:0/599: truncate d3/d18/d1f/f26 2192348 0
2026-03-09T00:03:53.154 INFO:tasks.workunit.client.1.vm06.stdout:0/600: getdents d3/d18/d2c/d2d/d74/daf 0
2026-03-09T00:03:53.155 INFO:tasks.workunit.client.0.vm03.stdout:1/455: creat d4/d15/d86/f9b x:0 0 0
2026-03-09T00:03:53.155 INFO:tasks.workunit.client.0.vm03.stdout:1/456: write d4/d6/f90 [296644,38485] 0
2026-03-09T00:03:53.155 INFO:tasks.workunit.client.0.vm03.stdout:1/457: write d4/d15/d1a/f1b [1170999,81620] 0
2026-03-09T00:03:53.160 INFO:tasks.workunit.client.0.vm03.stdout:1/458: dread d4/d3a/d43/f47 [0,4194304] 0
2026-03-09T00:03:53.165 INFO:tasks.workunit.client.0.vm03.stdout:1/459: truncate d4/d3a/d3d/d46/f4c 5479689 0
2026-03-09T00:03:53.165 INFO:tasks.workunit.client.1.vm06.stdout:3/653: mknod d11/d28/d2e/d2f/d5b/db5/ce5 0
2026-03-09T00:03:53.166 INFO:tasks.workunit.client.0.vm03.stdout:7/357: sync
2026-03-09T00:03:53.168 INFO:tasks.workunit.client.0.vm03.stdout:1/460: write d4/d3a/d3d/d46/f70 [437611,66619] 0
2026-03-09T00:03:53.168 INFO:tasks.workunit.client.0.vm03.stdout:1/461: creat d4/d15/d5c/f9c x:0 0 0
2026-03-09T00:03:53.170 INFO:tasks.workunit.client.1.vm06.stdout:6/625: dwrite d4/d27/d3e/d45/f4d [0,4194304] 0
2026-03-09T00:03:53.170 INFO:tasks.workunit.client.1.vm06.stdout:6/626: chown d4/d27/d3e/c99 1399599 1
2026-03-09T00:03:53.170 INFO:tasks.workunit.client.1.vm06.stdout:6/627: chown d4/d27/d42/da6/dbb/lbd 119 1
2026-03-09T00:03:53.173 INFO:tasks.workunit.client.0.vm03.stdout:9/389: rmdir d15/d1c/d21/d54/d62 0
2026-03-09T00:03:53.177 INFO:tasks.workunit.client.0.vm03.stdout:9/390: truncate d15/d1c/d36/d4d/f6b 821741 0
2026-03-09T00:03:53.177 INFO:tasks.workunit.client.0.vm03.stdout:9/391: dread - d15/d1c/d28/f5e zero size
2026-03-09T00:03:53.182 INFO:tasks.workunit.client.1.vm06.stdout:5/703: creat d5/d1c/d21/d28/d5e/d66/d78/dc8/daa/ff1 x:0 0 0
2026-03-09T00:03:53.185 INFO:tasks.workunit.client.0.vm03.stdout:4/427: mknod d7/c86 0
2026-03-09T00:03:53.189 INFO:tasks.workunit.client.0.vm03.stdout:5/400: symlink d1c/d20/d55/d4f/l85 0
2026-03-09T00:03:53.189 INFO:tasks.workunit.client.0.vm03.stdout:5/401: stat d1c/d51 0
2026-03-09T00:03:53.197 INFO:tasks.workunit.client.0.vm03.stdout:5/402: write d1c/d20/d55/f5a [2612873,76375] 0
2026-03-09T00:03:53.207 INFO:tasks.workunit.client.0.vm03.stdout:4/428: write d7/f1f [3571861,62067] 0
2026-03-09T00:03:53.219 INFO:tasks.workunit.client.0.vm03.stdout:0/340: link d2/d1f/c54 d2/da/dd/d49/d6c/d4b/d55/d6f/c79 0
2026-03-09T00:03:53.223 INFO:tasks.workunit.client.0.vm03.stdout:0/341: write d2/d5a/f63 [426662,115351] 0
2026-03-09T00:03:53.231 INFO:tasks.workunit.client.1.vm06.stdout:8/580: sync
2026-03-09T00:03:53.232 INFO:tasks.workunit.client.1.vm06.stdout:2/668: mkdir d7/d1a/d96/dc8 0
2026-03-09T00:03:53.232 INFO:tasks.workunit.client.1.vm06.stdout:2/669: readlink d7/d1b/d71/d79/db4/dc1/d86/l95 0
2026-03-09T00:03:53.232 INFO:tasks.workunit.client.1.vm06.stdout:2/670: chown d7/f3a 35533 1
2026-03-09T00:03:53.238 INFO:tasks.workunit.client.0.vm03.stdout:2/343: dwrite d8/d17/f68 [0,4194304] 0
2026-03-09T00:03:53.238 INFO:tasks.workunit.client.0.vm03.stdout:8/353: symlink d7/df/d1a/d2b/l6c 0
2026-03-09T00:03:53.238 INFO:tasks.workunit.client.0.vm03.stdout:2/344: read f7 [326145,44746] 0
2026-03-09T00:03:53.245 INFO:tasks.workunit.client.0.vm03.stdout:6/337: rename d13/c1b to d13/d35/c76 0
2026-03-09T00:03:53.245 INFO:tasks.workunit.client.0.vm03.stdout:6/338: dread - d13/d1e/d44/d59/f6e zero size
2026-03-09T00:03:53.247 INFO:tasks.workunit.client.0.vm03.stdout:7/358: unlink d2/d1f/d3a/c4e 0
2026-03-09T00:03:53.249 INFO:tasks.workunit.client.1.vm06.stdout:5/704: symlink d5/db1/lf2 0
2026-03-09T00:03:53.252 INFO:tasks.workunit.client.1.vm06.stdout:9/503: dwrite d1/d4/d6e/d14/d25/f4a [0,4194304] 0
2026-03-09T00:03:53.253 INFO:tasks.workunit.client.1.vm06.stdout:8/581: getdents db/dd/d85/d9f 0
2026-03-09T00:03:53.253 INFO:tasks.workunit.client.1.vm06.stdout:8/582: chown db/d53/d70/d38/d47/c5e 1091 1
2026-03-09T00:03:53.254 INFO:tasks.workunit.client.0.vm03.stdout:1/462: mknod d4/d3a/d61/d78/d81/d93/c9d 0
2026-03-09T00:03:53.268 INFO:tasks.workunit.client.0.vm03.stdout:3/283: dwrite d2/f16 [0,4194304] 0
2026-03-09T00:03:53.268 INFO:tasks.workunit.client.0.vm03.stdout:9/392: symlink d15/l82 0
2026-03-09T00:03:53.268 INFO:tasks.workunit.client.0.vm03.stdout:9/393: dread - d15/d1c/d36/f78 zero size
2026-03-09T00:03:53.268 INFO:tasks.workunit.client.1.vm06.stdout:2/671: mkdir d7/d1a/d25/d66/d87/da8/db2/dc9 0
2026-03-09T00:03:53.268 INFO:tasks.workunit.client.1.vm06.stdout:4/548: getdents d17/d24/d3b/d54 0
2026-03-09T00:03:53.268 INFO:tasks.workunit.client.1.vm06.stdout:4/549: creat d17/d21/fb8 x:0 0 0
2026-03-09T00:03:53.268 INFO:tasks.workunit.client.1.vm06.stdout:5/705: creat d5/d1c/d21/ff3 x:0 0 0
2026-03-09T00:03:53.268 INFO:tasks.workunit.client.1.vm06.stdout:5/706: write d5/d1c/d21/d28/d5e/d66/d78/dc8/f60 [1639310,88645] 0
2026-03-09T00:03:53.268 INFO:tasks.workunit.client.1.vm06.stdout:5/707: chown d5/d1c/d23/d34/d47/ddd 2610 1
2026-03-09T00:03:53.268 INFO:tasks.workunit.client.1.vm06.stdout:5/708: dread d5/d1c/d21/d28/d5e/fc4 [0,4194304] 0
2026-03-09T00:03:53.279 INFO:tasks.workunit.client.1.vm06.stdout:3/654: dwrite d11/d28/d2e/d2f/d5b/d5f/f60 [0,4194304] 0
2026-03-09T00:03:53.289 INFO:tasks.workunit.client.1.vm06.stdout:7/631: dwrite d0/df/d1a/d3a/f3c [4194304,4194304] 0
2026-03-09T00:03:53.289 INFO:tasks.workunit.client.1.vm06.stdout:7/632: chown d0/df/d1a/d3a/f5d 33497181 1
2026-03-09T00:03:53.289 INFO:tasks.workunit.client.1.vm06.stdout:7/633: write d0/df/d1a/d3a/d4e/fa4 [65297,53992] 0
2026-03-09T00:03:53.289 INFO:tasks.workunit.client.1.vm06.stdout:0/601: dwrite d3/f19 [0,4194304] 0
2026-03-09T00:03:53.294 INFO:tasks.workunit.client.0.vm03.stdout:0/342: dwrite d2/f22 [4194304,4194304] 0
2026-03-09T00:03:53.294 INFO:tasks.workunit.client.0.vm03.stdout:0/343: write d2/da/d1a/f25 [594593,32073] 0
2026-03-09T00:03:53.308 INFO:tasks.workunit.client.1.vm06.stdout:6/628: dwrite d4/d27/f4e [0,4194304] 0
2026-03-09T00:03:53.308 INFO:tasks.workunit.client.1.vm06.stdout:6/629: chown d4/d27/d3e/d57/fa7 421840 1
2026-03-09T00:03:53.316 INFO:tasks.workunit.client.1.vm06.stdout:9/504: mkdir d1/d3/d4f/d91/d94/d9e 0
2026-03-09T00:03:53.320 INFO:tasks.workunit.client.1.vm06.stdout:8/583: rename db/dd/d48/f4a to db/d74/d78/fbf 0
2026-03-09T00:03:53.320 INFO:tasks.workunit.client.0.vm03.stdout:4/429: rename d7/d20/d6a/d77/d25/l2e to d7/d20/d6a/l87 0
2026-03-09T00:03:53.335 INFO:tasks.workunit.client.1.vm06.stdout:9/505: write d1/d4/d2f/f7f [3397127,111575] 0
2026-03-09T00:03:53.338 INFO:tasks.workunit.client.0.vm03.stdout:6/339: rmdir d13/d35/d4c 39
2026-03-09T00:03:53.338 INFO:tasks.workunit.client.1.vm06.stdout:2/672: unlink d7/d1b/d71/d79/db4/dc1/d86/f8b 0
2026-03-09T00:03:53.338 INFO:tasks.workunit.client.1.vm06.stdout:2/673: readlink d7/d1a/d25/d66/d87/l94 0
2026-03-09T00:03:53.345 INFO:tasks.workunit.client.0.vm03.stdout:7/359: rmdir d2/d1f/d42 39
2026-03-09T00:03:53.345 INFO:tasks.workunit.client.0.vm03.stdout:7/360: stat d2/d4/d1e/l2b 0
2026-03-09T00:03:53.354 INFO:tasks.workunit.client.1.vm06.stdout:7/634: dwrite d0/df/d1a/d35/d62/f98 [0,4194304] 0
2026-03-09T00:03:53.355 INFO:tasks.workunit.client.1.vm06.stdout:4/550: rmdir d17/d21/d32 39
2026-03-09T00:03:53.367 INFO:tasks.workunit.client.1.vm06.stdout:5/709: rmdir d5/d44/d4b/d92/d49 39
2026-03-09T00:03:53.372 INFO:tasks.workunit.client.1.vm06.stdout:3/655: mknod d11/d28/d2e/d2f/d5b/d94/ddd/ce6 0
2026-03-09T00:03:53.372 INFO:tasks.workunit.client.1.vm06.stdout:3/656: readlink d11/d28/d2e/d2f/d5b/d5f/l8b 0
2026-03-09T00:03:53.380 INFO:tasks.workunit.client.1.vm06.stdout:0/602: dread d3/d18/d1f/d39/d3b/f66 [0,4194304] 0
2026-03-09T00:03:53.392 INFO:tasks.workunit.client.0.vm03.stdout:0/344: link d2/da/dd/d49/d6c/l4a d2/da/d36/l7a 0
2026-03-09T00:03:53.392 INFO:tasks.workunit.client.0.vm03.stdout:0/345: fdatasync d2/da/d1a/f3a 0
2026-03-09T00:03:53.392 INFO:tasks.workunit.client.0.vm03.stdout:0/346: truncate d2/da/d36/f58 816123 0
2026-03-09T00:03:53.392 INFO:tasks.workunit.client.0.vm03.stdout:0/347: creat d2/da/dd/f7b x:0 0 0
2026-03-09T00:03:53.392 INFO:tasks.workunit.client.0.vm03.stdout:0/348: chown d2/f1e 441313 1
2026-03-09T00:03:53.392 INFO:tasks.workunit.client.0.vm03.stdout:0/349: read - d2/da/dd/d49/f6a zero size
2026-03-09T00:03:53.392 INFO:tasks.workunit.client.0.vm03.stdout:0/350: creat d2/d71/f7c x:0 0 0
2026-03-09T00:03:53.393 INFO:tasks.workunit.client.1.vm06.stdout:6/630: mkdir d4/d27/d42/dc8 0
2026-03-09T00:03:53.400 INFO:tasks.workunit.client.0.vm03.stdout:7/361: dwrite d2/d1f/d42/d43/f49 [0,4194304] 0
2026-03-09T00:03:53.400 INFO:tasks.workunit.client.0.vm03.stdout:7/362: readlink d2/l36 0
2026-03-09T00:03:53.404 INFO:tasks.workunit.client.0.vm03.stdout:4/430: dwrite d7/d20/d29/d38/d3a/f7c [0,4194304] 0
2026-03-09T00:03:53.407 INFO:tasks.workunit.client.1.vm06.stdout:9/506: symlink d1/l9f 0
2026-03-09T00:03:53.407 INFO:tasks.workunit.client.1.vm06.stdout:9/507: write d1/d3/f5c [8208986,22282] 0
2026-03-09T00:03:53.407 INFO:tasks.workunit.client.1.vm06.stdout:9/508: creat d1/d4/d2f/fa0 x:0 0 0
2026-03-09T00:03:53.409 INFO:tasks.workunit.client.1.vm06.stdout:2/674: mkdir d7/d1b/da5/dca 0
2026-03-09T00:03:53.413 INFO:tasks.workunit.client.1.vm06.stdout:2/675: dread d7/d1b/d31/fab [0,4194304] 0
2026-03-09T00:03:53.417 INFO:tasks.workunit.client.0.vm03.stdout:8/354: rename d7/df to d7/df/d1e/d38/d6d 22
2026-03-09T00:03:53.417 INFO:tasks.workunit.client.0.vm03.stdout:8/355: truncate d7/f25 535837 0
2026-03-09T00:03:53.417 INFO:tasks.workunit.client.0.vm03.stdout:8/356: chown d7/f48 87 1
2026-03-09T00:03:53.425 INFO:tasks.workunit.client.1.vm06.stdout:7/635: mkdir d0/d55/d99/db2 0
2026-03-09T00:03:53.432 INFO:tasks.workunit.client.1.vm06.stdout:4/551: creat d17/d24/d49/d5f/db2/fb9 x:0 0 0
2026-03-09T00:03:53.440 INFO:tasks.workunit.client.0.vm03.stdout:6/340: mkdir d13/d1e/d44/d59/d77 0
2026-03-09T00:03:53.445 INFO:tasks.workunit.client.1.vm06.stdout:6/631: truncate d4/d27/f31 1156744 0
2026-03-09T00:03:53.447 INFO:tasks.workunit.client.0.vm03.stdout:0/351: mknod d2/da/c7d 0
2026-03-09T00:03:53.447 INFO:tasks.workunit.client.0.vm03.stdout:0/352: creat d2/da/f7e x:0 0 0
2026-03-09T00:03:53.449 INFO:tasks.workunit.client.0.vm03.stdout:4/431: mknod d7/d20/c88 0
2026-03-09T00:03:53.450 INFO:tasks.workunit.client.1.vm06.stdout:5/710: dwrite d5/d1c/d23/f82 [0,4194304] 0
2026-03-09T00:03:53.461 INFO:tasks.workunit.client.0.vm03.stdout:4/432: dread d7/d20/d29/d38/d3a/f50 [0,4194304] 0
2026-03-09T00:03:53.470 INFO:tasks.workunit.client.0.vm03.stdout:4/433: write d7/f81 [568990,84039] 0
2026-03-09T00:03:53.470 INFO:tasks.workunit.client.0.vm03.stdout:4/434: creat d7/d27/f89 x:0 0 0
2026-03-09T00:03:53.471 INFO:tasks.workunit.client.1.vm06.stdout:6/632: dread d4/d27/d42/f60 [0,4194304] 0
2026-03-09T00:03:53.471 INFO:tasks.workunit.client.1.vm06.stdout:6/633: write d4/d27/d42/d52/f6c [529784,25380] 0
2026-03-09T00:03:53.473 INFO:tasks.workunit.client.1.vm06.stdout:2/676: creat d7/d1b/d71/fcb x:0 0 0
2026-03-09T00:03:53.475 INFO:tasks.workunit.client.0.vm03.stdout:3/284: rename d2/db/f27 to d2/db/d2d/f54 0
2026-03-09T00:03:53.492 INFO:tasks.workunit.client.0.vm03.stdout:6/341: symlink d13/d1e/d44/d59/d77/l78 0
2026-03-09T00:03:53.500 INFO:tasks.workunit.client.1.vm06.stdout:5/711: mkdir d5/d1c/d21/d28/df4 0
2026-03-09T00:03:53.503 INFO:tasks.workunit.client.1.vm06.stdout:5/712: write d5/d1c/d21/d28/d5e/d66/dab/fe3 [452022,26365] 0
2026-03-09T00:03:53.505 INFO:tasks.workunit.client.0.vm03.stdout:0/353: unlink d2/d1f/c2a 0
2026-03-09T00:03:53.505 INFO:tasks.workunit.client.0.vm03.stdout:0/354: chown d2/da/d1a/f53 65259914 1
2026-03-09T00:03:53.506 INFO:tasks.workunit.client.0.vm03.stdout:8/357: dwrite d7/f34 [0,4194304] 0
2026-03-09T00:03:53.508 INFO:tasks.workunit.client.0.vm03.stdout:9/394: write fd [7303033,23213] 0
2026-03-09T00:03:53.516 INFO:tasks.workunit.client.1.vm06.stdout:0/603: dwrite d3/f29 [0,4194304] 0
2026-03-09T00:03:53.517 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:53 vm03.local ceph-mon[52346]: pgmap v11: 65 pgs: 65 active+clean; 1.8 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 110 MiB/s rd, 133 MiB/s wr, 206 op/s
2026-03-09T00:03:53.517 INFO:tasks.workunit.client.0.vm03.stdout:7/363: dwrite d2/d4/fb [0,4194304] 0
2026-03-09T00:03:53.521 INFO:tasks.workunit.client.1.vm06.stdout:6/634: rename d4/d16/f21 to d4/d27/d3e/d78/fc9 0
2026-03-09T00:03:53.530 INFO:tasks.workunit.client.0.vm03.stdout:4/435: unlink d7/d20/d35/f84 0
2026-03-09T00:03:53.542 INFO:tasks.workunit.client.1.vm06.stdout:4/552: dwrite d17/d24/d49/f5a [0,4194304] 0
2026-03-09T00:03:53.543 INFO:tasks.workunit.client.1.vm06.stdout:1/538: sync
2026-03-09T00:03:53.551 INFO:tasks.workunit.client.0.vm03.stdout:8/358: creat d7/df/d1e/d38/d60/f6e x:0 0 0
2026-03-09T00:03:53.567 INFO:tasks.workunit.client.1.vm06.stdout:0/604: chown d3/d18/d1f/d39/d49/d60/fb3 0 1
2026-03-09T00:03:53.568 INFO:tasks.workunit.client.0.vm03.stdout:7/364: mknod d2/d4/c6d 0
2026-03-09T00:03:53.568 INFO:tasks.workunit.client.0.vm03.stdout:7/365: stat d2/d1f/d42/d43/f49 0
2026-03-09T00:03:53.572 INFO:tasks.workunit.client.1.vm06.stdout:2/677: dwrite d7/da/d4e/d57/d9d/fbe [0,4194304] 0
2026-03-09T00:03:53.572 INFO:tasks.workunit.client.1.vm06.stdout:2/678: chown d7/da/db/de/f60 2405802 1
2026-03-09T00:03:53.575 INFO:tasks.workunit.client.0.vm03.stdout:2/345: rename d8/cf to d8/d1b/d2a/d6b/d50/c70 0
2026-03-09T00:03:53.584 INFO:tasks.workunit.client.1.vm06.stdout:7/636: dwrite d0/f14 [0,4194304] 0
2026-03-09T00:03:53.613 INFO:tasks.workunit.client.0.vm03.stdout:3/285: dwrite d2/db/f3a [0,4194304] 0
2026-03-09T00:03:53.613 INFO:tasks.workunit.client.0.vm03.stdout:9/395: rename d15/d1c/d21/c31 to d15/d7f/c83 0
2026-03-09T00:03:53.613 INFO:tasks.workunit.client.0.vm03.stdout:3/286: write d2/db/d2d/f54 [559480,43021] 0
2026-03-09T00:03:53.616 INFO:tasks.workunit.client.0.vm03.stdout:9/396: dread d15/d1c/d21/d64/f3d [0,4194304] 0
2026-03-09T00:03:53.621 INFO:tasks.workunit.client.1.vm06.stdout:1/539: mkdir d6/d4c/d51/db3 0
2026-03-09T00:03:53.622 INFO:tasks.workunit.client.0.vm03.stdout:0/355: dwrite d2/da/d1a/f53 [0,4194304] 0
2026-03-09T00:03:53.628 INFO:tasks.workunit.client.1.vm06.stdout:0/605: link d3/d18/d2c/d2d/f85 d3/d18/d1f/d39/d3b/fc8 0
2026-03-09T00:03:53.632 INFO:tasks.workunit.client.0.vm03.stdout:5/403: sync
2026-03-09T00:03:53.632 INFO:tasks.workunit.client.0.vm03.stdout:5/404: write f12 [5014664,86986] 0
2026-03-09T00:03:53.632 INFO:tasks.workunit.client.0.vm03.stdout:5/405: fdatasync d1c/f30 0
2026-03-09T00:03:53.632 INFO:tasks.workunit.client.0.vm03.stdout:5/406: chown d1c/d20/f25 1149694 1
2026-03-09T00:03:53.632 INFO:tasks.workunit.client.0.vm03.stdout:1/463: sync
2026-03-09T00:03:53.633 INFO:tasks.workunit.client.0.vm03.stdout:5/407: dread d1c/d51/f68 [0,4194304] 0
2026-03-09T00:03:53.643 INFO:tasks.workunit.client.0.vm03.stdout:6/342: dwrite d13/d1e/f34 [0,4194304] 0
2026-03-09T00:03:53.643 INFO:tasks.workunit.client.0.vm03.stdout:6/343: write d13/f70 [880201,124606] 0
2026-03-09T00:03:53.646 INFO:tasks.workunit.client.0.vm03.stdout:6/344: read d13/f55 [771425,44462] 0
2026-03-09T00:03:53.646 INFO:tasks.workunit.client.0.vm03.stdout:6/345: fdatasync d13/d35/f68 0
2026-03-09T00:03:53.651 INFO:tasks.workunit.client.1.vm06.stdout:4/553: symlink d17/d24/d3b/d5e/lba 0
2026-03-09T00:03:53.653 INFO:tasks.workunit.client.0.vm03.stdout:4/436: dwrite d7/d20/d29/d4e/f74 [4194304,4194304] 0
2026-03-09T00:03:53.653 INFO:tasks.workunit.client.0.vm03.stdout:4/437: write d7/d20/f34 [839426,61905] 0
2026-03-09T00:03:53.656 INFO:tasks.workunit.client.1.vm06.stdout:1/540: truncate d6/d21/d2d/d37/f78 3413436 0
2026-03-09T00:03:53.659 INFO:tasks.workunit.client.1.vm06.stdout:5/713: dwrite d5/d1c/d21/d28/d5e/d66/dab/fe3 [0,4194304] 0
2026-03-09T00:03:53.667 INFO:tasks.workunit.client.0.vm03.stdout:7/366: dwrite d2/f3 [0,4194304] 0
2026-03-09T00:03:53.667 INFO:tasks.workunit.client.0.vm03.stdout:7/367: chown d2/d4/d1e/d5e/d6c/f44 2007 1
2026-03-09T00:03:53.667 INFO:tasks.workunit.client.0.vm03.stdout:7/368: write d2/d1f/f3b [3975962,31828] 0
2026-03-09T00:03:53.669 INFO:tasks.workunit.client.1.vm06.stdout:3/657: write d11/d28/d2e/d2f/f3e [396959,5758] 0
2026-03-09T00:03:53.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:53 vm06.local ceph-mon[58395]: pgmap v11: 65 pgs: 65 active+clean; 1.8 GiB data, 6.7 GiB used, 113 GiB / 120 GiB avail; 110 MiB/s rd, 133 MiB/s wr, 206 op/s
2026-03-09T00:03:53.671 INFO:tasks.workunit.client.1.vm06.stdout:7/637: dwrite d0/df/d1a/f44 [0,4194304] 0
2026-03-09T00:03:53.687 INFO:tasks.workunit.client.1.vm06.stdout:0/606: mknod d3/d18/d1f/d39/d69/cc9 0
2026-03-09T00:03:53.689 INFO:tasks.workunit.client.1.vm06.stdout:0/607: creat d3/d18/d79/fca x:0 0 0
2026-03-09T00:03:53.692 INFO:tasks.workunit.client.1.vm06.stdout:0/608: dread d3/d18/d2c/f6b [0,4194304] 0
2026-03-09T00:03:53.692 INFO:tasks.workunit.client.1.vm06.stdout:0/609: getdents d3/d18/d2c 0
2026-03-09T00:03:53.692 INFO:tasks.workunit.client.1.vm06.stdout:4/554: rename d17/d24/d3b/d5e/c8e to d17/d24/d49/d5f/cbb 0
2026-03-09T00:03:53.694 INFO:tasks.workunit.client.1.vm06.stdout:1/541: link d6/d21/d2d/f3c d6/d21/d2d/d3b/d42/fb4 0
2026-03-09T00:03:53.712 INFO:tasks.workunit.client.0.vm03.stdout:3/287: unlink d2/db/l1f 0
2026-03-09T00:03:53.716 INFO:tasks.workunit.client.0.vm03.stdout:9/397: truncate f10 1400896 0
2026-03-09T00:03:53.717 INFO:tasks.workunit.client.0.vm03.stdout:9/398: dread d15/d1c/d28/d6e/f7c [0,4194304] 0
2026-03-09T00:03:53.718 INFO:tasks.workunit.client.1.vm06.stdout:9/509: dwrite d1/d4/d2f/f43 [0,4194304] 0
2026-03-09T00:03:53.720 INFO:tasks.workunit.client.1.vm06.stdout:8/584: sync
2026-03-09T00:03:53.721 INFO:tasks.workunit.client.1.vm06.stdout:5/714: dwrite d5/d1c/d21/d28/f33 [4194304,4194304] 0
2026-03-09T00:03:53.728 INFO:tasks.workunit.client.0.vm03.stdout:8/359: rmdir d7 39
2026-03-09T00:03:53.728 INFO:tasks.workunit.client.0.vm03.stdout:8/360: write d7/df/d1a/f2a [3842854,9072] 0
2026-03-09T00:03:53.732 INFO:tasks.workunit.client.0.vm03.stdout:0/356: unlink d2/da/d1a/f53 0
2026-03-09T00:03:53.735 INFO:tasks.workunit.client.0.vm03.stdout:1/464: mkdir d4/d3a/d61/d78/d81/d9e 0
2026-03-09T00:03:53.735 INFO:tasks.workunit.client.1.vm06.stdout:0/610: creat d3/d18/d2c/d2d/d74/d7d/d9f/fcb x:0 0 0
2026-03-09T00:03:53.736 INFO:tasks.workunit.client.0.vm03.stdout:5/408: symlink d1c/d51/l86 0
2026-03-09T00:03:53.738 INFO:tasks.workunit.client.1.vm06.stdout:3/658: rename d11/d28/d4d/d89/d90/ca0 to d11/d28/d2e/d2f/dc1/ce7 0
2026-03-09T00:03:53.741 INFO:tasks.workunit.client.1.vm06.stdout:4/555: mknod d17/d21/d32/d92/cbc 0
2026-03-09T00:03:53.742 INFO:tasks.workunit.client.0.vm03.stdout:6/346: stat d13/d1e/l22 0
2026-03-09T00:03:53.745 INFO:tasks.workunit.client.0.vm03.stdout:6/347: dread d13/f1d [0,4194304] 0
2026-03-09T00:03:53.745 INFO:tasks.workunit.client.1.vm06.stdout:2/679: sync
2026-03-09T00:03:53.746 INFO:tasks.workunit.client.1.vm06.stdout:6/635: sync
2026-03-09T00:03:53.761 INFO:tasks.workunit.client.1.vm06.stdout:7/638: write d0/df/d1a/d27/d4c/d40/d5b/f78 [1012450,124175] 0
2026-03-09T00:03:53.769 INFO:tasks.workunit.client.0.vm03.stdout:4/438: symlink d7/d6f/l8a 0
2026-03-09T00:03:53.769 INFO:tasks.workunit.client.0.vm03.stdout:4/439: stat d7/d20/d29/f53 0
2026-03-09T00:03:53.773 INFO:tasks.workunit.client.1.vm06.stdout:8/585: symlink db/d74/d87/lc0 0
2026-03-09T00:03:53.786 INFO:tasks.workunit.client.1.vm06.stdout:0/611: symlink d3/d18/lcc 0
2026-03-09T00:03:53.789 INFO:tasks.workunit.client.1.vm06.stdout:1/542: rename f0 to d6/d21/d2d/d37/fb5 0
2026-03-09T00:03:53.795 INFO:tasks.workunit.client.1.vm06.stdout:1/543: write d6/f7 [922339,33354] 0
2026-03-09T00:03:53.799 INFO:tasks.workunit.client.1.vm06.stdout:4/556: rmdir d17/d24/d3b/d54 39
2026-03-09T00:03:53.810 INFO:tasks.workunit.client.1.vm06.stdout:9/510: rmdir d1/d4 39
2026-03-09T00:03:53.814 INFO:tasks.workunit.client.1.vm06.stdout:5/715: getdents d5/d1c/d21/d28/d5e/d66 0
2026-03-09T00:03:53.816 INFO:tasks.workunit.client.1.vm06.stdout:6/636: dwrite d4/d27/d3e/d78/f92 [0,4194304] 0
2026-03-09T00:03:53.816 INFO:tasks.workunit.client.1.vm06.stdout:2/680: dwrite d7/da/d4e/d57/f9f [8388608,4194304] 0
2026-03-09T00:03:53.827 INFO:tasks.workunit.client.1.vm06.stdout:3/659: sync
2026-03-09T00:03:53.828 INFO:tasks.workunit.client.1.vm06.stdout:7/639: dwrite d0/df/d17/f1f [0,4194304] 0
2026-03-09T00:03:53.829 INFO:tasks.workunit.client.1.vm06.stdout:8/586: dwrite db/d53/d70/d38/d4d/d79/f96 [0,4194304] 0
2026-03-09T00:03:53.829 INFO:tasks.workunit.client.1.vm06.stdout:8/587: fdatasync db/d53/d70/f75 0
2026-03-09T00:03:53.829 INFO:tasks.workunit.client.1.vm06.stdout:8/588: stat db/d53/d70/d38/d4d/d79 0
2026-03-09T00:03:53.829 INFO:tasks.workunit.client.1.vm06.stdout:8/589: write db/d53/d7c/fa0 [699336,30429] 0
2026-03-09T00:03:53.829 INFO:tasks.workunit.client.1.vm06.stdout:0/612: mknod d3/d18/d3c/ccd 0
2026-03-09T00:03:53.830 INFO:tasks.workunit.client.1.vm06.stdout:7/640: write d0/df/d1a/d27/d4c/d40/d5b/f78 [478725,57464] 0
2026-03-09T00:03:53.830 INFO:tasks.workunit.client.1.vm06.stdout:7/641: fsync d0/df/d1a/d3a/d4e/d5e/f93 0
2026-03-09T00:03:53.830 INFO:tasks.workunit.client.1.vm06.stdout:7/642: write d0/fb1 [1000431,58730] 0
2026-03-09T00:03:53.830 INFO:tasks.workunit.client.1.vm06.stdout:7/643: chown d0/df/d1a/d27/d4c/f6d 17206882 1
2026-03-09T00:03:53.832 INFO:tasks.workunit.client.0.vm03.stdout:3/288: write d2/f1d [97042,68509] 0
2026-03-09T00:03:53.833 INFO:tasks.workunit.client.1.vm06.stdout:0/613: dread d3/f19 [0,4194304] 0
2026-03-09T00:03:53.834 INFO:tasks.workunit.client.1.vm06.stdout:7/644: dread d0/df/f13 [0,4194304] 0
2026-03-09T00:03:53.834 INFO:tasks.workunit.client.1.vm06.stdout:7/645: readlink d0/d39/la0 0
2026-03-09T00:03:53.835 INFO:tasks.workunit.client.0.vm03.stdout:9/399: mknod d15/d1c/d21/d75/c84 0
2026-03-09T00:03:53.836 INFO:tasks.workunit.client.1.vm06.stdout:7/646: write d0/df/d1a/d35/d62/f87 [341632,122630] 0
2026-03-09T00:03:53.842 INFO:tasks.workunit.client.1.vm06.stdout:1/544: link d6/d21/d2d/d3b/d87/l94 d6/d21/d2d/lb6 0
2026-03-09T00:03:53.842 INFO:tasks.workunit.client.1.vm06.stdout:1/545: readlink d6/d4c/d79/l53 0
2026-03-09T00:03:53.848 INFO:tasks.workunit.client.1.vm06.stdout:4/557: rename d17/d5b/f89 to d17/d21/d32/fbd 0
2026-03-09T00:03:53.848 INFO:tasks.workunit.client.1.vm06.stdout:4/558: stat d17/f1d 0
2026-03-09T00:03:53.849 INFO:tasks.workunit.client.1.vm06.stdout:9/511: symlink d1/d3/d4f/d91/d94/la1 0
2026-03-09T00:03:53.849 INFO:tasks.workunit.client.1.vm06.stdout:9/512: fdatasync d1/d3/d4f/f71 0
2026-03-09T00:03:53.849 INFO:tasks.workunit.client.1.vm06.stdout:5/716: symlink d5/d1c/d21/d28/d5e/d66/d78/dc8/daa/lf5 0
2026-03-09T00:03:53.857 INFO:tasks.workunit.client.0.vm03.stdout:8/361: unlink d7/f15 0
2026-03-09T00:03:53.858 INFO:tasks.workunit.client.0.vm03.stdout:8/362: write d7/df/d1a/f2e [497897,94179] 0
2026-03-09T00:03:53.858 INFO:tasks.workunit.client.0.vm03.stdout:0/357: creat d2/f7f x:0 0 0
2026-03-09T00:03:53.858 INFO:tasks.workunit.client.0.vm03.stdout:1/465: mknod d4/d3a/d3d/c9f 0
2026-03-09T00:03:53.858 INFO:tasks.workunit.client.0.vm03.stdout:1/466: chown d4/d15/d86/f9b 0 1
2026-03-09T00:03:53.858 INFO:tasks.workunit.client.0.vm03.stdout:5/409: mknod d1c/d20/d56/c87 0
2026-03-09T00:03:53.858 INFO:tasks.workunit.client.0.vm03.stdout:0/358: creat d2/da/d1a/f80 x:0 0 0
2026-03-09T00:03:53.860 INFO:tasks.workunit.client.0.vm03.stdout:1/467: dread f2 [0,4194304] 0
2026-03-09T00:03:53.866 INFO:tasks.workunit.client.1.vm06.stdout:6/637: mknod d4/d27/d42/d7e/dac/cca 0
2026-03-09T00:03:53.874 INFO:tasks.workunit.client.0.vm03.stdout:4/440: symlink d7/d20/d29/d38/d3a/l8b 0
2026-03-09T00:03:53.876 INFO:tasks.workunit.client.0.vm03.stdout:4/441: chown d7/d20/d29/c2d 0 1
2026-03-09T00:03:53.876 INFO:tasks.workunit.client.0.vm03.stdout:4/442: chown d7/d20/d29/d38/d3a/l8b 1010232 1
2026-03-09T00:03:53.877 INFO:tasks.workunit.client.0.vm03.stdout:4/443: creat d7/d20/d35/f8c x:0 0 0
2026-03-09T00:03:53.882 INFO:tasks.workunit.client.1.vm06.stdout:2/681: creat d7/d1b/d71/d79/db4/dc1/d86/fcc x:0 0 0
2026-03-09T00:03:53.890 INFO:tasks.workunit.client.0.vm03.stdout:9/400: dwrite d15/d1c/d21/f25 [0,4194304] 0
2026-03-09T00:03:53.892 INFO:tasks.workunit.client.1.vm06.stdout:1/546: dwrite d6/fa [0,4194304] 0
2026-03-09T00:03:53.892 INFO:tasks.workunit.client.1.vm06.stdout:1/547: dread - d6/d21/d2d/d3b/d42/f9f zero size
2026-03-09T00:03:53.894 INFO:tasks.workunit.client.0.vm03.stdout:7/369: truncate d2/d1f/d42/d43/f49 4065863 0
2026-03-09T00:03:53.894 INFO:tasks.workunit.client.0.vm03.stdout:7/370: fsync d2/d1f/d35/f5a 0
2026-03-09T00:03:53.894 INFO:tasks.workunit.client.0.vm03.stdout:7/371: fsync d2/d1f/d40/d67/f64 0
2026-03-09T00:03:53.901 INFO:tasks.workunit.client.1.vm06.stdout:3/660: creat d11/d28/d2e/d7e/d83/fe8 x:0 0 0
2026-03-09T00:03:53.902 INFO:tasks.workunit.client.1.vm06.stdout:3/661: dread d11/d28/d2e/d2f/d36/f55 [0,4194304] 0
2026-03-09T00:03:53.902 INFO:tasks.workunit.client.1.vm06.stdout:3/662: fdatasync d11/fa2 0
2026-03-09T00:03:53.917 INFO:tasks.workunit.client.0.vm03.stdout:6/348: rmdir d13/d1e/d44/d4a/d52 39
2026-03-09T00:03:53.919 INFO:tasks.workunit.client.0.vm03.stdout:7/372: dread d2/d1f/d35/f3e [0,4194304] 0
2026-03-09T00:03:53.919 INFO:tasks.workunit.client.0.vm03.stdout:7/373: fdatasync d2/d1f/f11 0
2026-03-09T00:03:53.929 INFO:tasks.workunit.client.0.vm03.stdout:8/363: symlink d7/df/l6f 0
2026-03-09T00:03:53.931 INFO:tasks.workunit.client.1.vm06.stdout:8/590: creat db/d53/d70/d38/d4d/d79/fc1 x:0 0 0
2026-03-09T00:03:53.939 INFO:tasks.workunit.client.0.vm03.stdout:5/410: unlink d1c/d20/d55/d66/d6b/l81 0
2026-03-09T00:03:53.952 INFO:tasks.workunit.client.1.vm06.stdout:0/614: rename d3/d18/d1f/d39/la6 to d3/d18/d2c/d2d/d74/lce 0
2026-03-09T00:03:53.963 INFO:tasks.workunit.client.0.vm03.stdout:0/359: mkdir d2/da/dd/d49/d6c/d81 0
2026-03-09T00:03:53.963 INFO:tasks.workunit.client.0.vm03.stdout:0/360: readlink d2/da/dd/l35 0
2026-03-09T00:03:53.963 INFO:tasks.workunit.client.0.vm03.stdout:0/361: rename d2/da/dd/d49 to d2/da/dd/d49/d6c/d4b/d82 22
2026-03-09T00:03:53.971 INFO:tasks.workunit.client.1.vm06.stdout:4/559: mkdir d17/d21/d4c/d66/d68/dbe 0
2026-03-09T00:03:53.974 INFO:tasks.workunit.client.0.vm03.stdout:1/468: link d4/d3a/d61/f75 d4/fa0 0
2026-03-09T00:03:53.974 INFO:tasks.workunit.client.0.vm03.stdout:1/469: chown d4/d3a 38 1
2026-03-09T00:03:53.974 INFO:tasks.workunit.client.0.vm03.stdout:1/470: dread - d4/d3a/d32/f8d zero size
2026-03-09T00:03:53.974 INFO:tasks.workunit.client.0.vm03.stdout:1/471: truncate d4/d3a/d32/f8d 342515 0
2026-03-09T00:03:53.974 INFO:tasks.workunit.client.0.vm03.stdout:1/472: fdatasync d4/d3a/d3d/f64 0
2026-03-09T00:03:53.974 INFO:tasks.workunit.client.0.vm03.stdout:1/473: readlink d4/l10 0
2026-03-09T00:03:53.974 INFO:tasks.workunit.client.0.vm03.stdout:1/474: write d4/d15/f4e [18728,118657] 0
2026-03-09T00:03:53.988 INFO:tasks.workunit.client.0.vm03.stdout:3/289: readlink d2/db/l22 0
2026-03-09T00:03:53.990 INFO:tasks.workunit.client.1.vm06.stdout:5/717: unlink d5/d44/d4b/f6d 0
2026-03-09T00:03:54.008 INFO:tasks.workunit.client.1.vm06.stdout:6/638: unlink d4/d27/l2f 0
2026-03-09T00:03:54.015 INFO:tasks.workunit.client.0.vm03.stdout:9/401: rmdir d15/d1c/d21/d64 39
2026-03-09T00:03:54.015 INFO:tasks.workunit.client.0.vm03.stdout:9/402: readlink l13 0
2026-03-09T00:03:54.024 INFO:tasks.workunit.client.0.vm03.stdout:6/349: creat d13/d35/d4c/d62/f79 x:0 0 0
2026-03-09T00:03:54.034 INFO:tasks.workunit.client.0.vm03.stdout:8/364: symlink d7/df/l70 0
2026-03-09T00:03:54.040 INFO:tasks.workunit.client.0.vm03.stdout:5/411: dwrite d1c/d20/f33 [0,4194304] 0
2026-03-09T00:03:54.040 INFO:tasks.workunit.client.0.vm03.stdout:5/412: fdatasync d1c/d20/d55/f61 0
2026-03-09T00:03:54.040 INFO:tasks.workunit.client.0.vm03.stdout:5/413: fdatasync fb 0
2026-03-09T00:03:54.043 INFO:tasks.workunit.client.0.vm03.stdout:9/403: dread d15/d1c/d28/f55 [0,4194304] 0
2026-03-09T00:03:54.043 INFO:tasks.workunit.client.0.vm03.stdout:9/404: truncate d15/d1c/d36/f6d 1335858 0
2026-03-09T00:03:54.043 INFO:tasks.workunit.client.0.vm03.stdout:9/405: chown d15/d7f 212 1
2026-03-09T00:03:54.048 INFO:tasks.workunit.client.0.vm03.stdout:2/346: sync
2026-03-09T00:03:54.063 INFO:tasks.workunit.client.0.vm03.stdout:2/347: truncate d8/d1b/d24/f38 271742 0
2026-03-09T00:03:54.064 INFO:tasks.workunit.client.1.vm06.stdout:8/591: creat db/dd/d24/da7/fc2 x:0 0 0
2026-03-09T00:03:54.064 INFO:tasks.workunit.client.1.vm06.stdout:0/615: link d3/fa d3/d18/d1f/d39/d3b/fcf 0
2026-03-09T00:03:54.064 INFO:tasks.workunit.client.1.vm06.stdout:7/647: rename d0/df/d1a/d3f/d53/c89 to d0/df/d1a/d3a/cb3 0
2026-03-09T00:03:54.064 INFO:tasks.workunit.client.1.vm06.stdout:0/616: fsync d3/d18/d2c/d2d/d74/d90/fac 0
2026-03-09T00:03:54.064 INFO:tasks.workunit.client.0.vm03.stdout:3/290: readlink d2/db/l47 0
2026-03-09T00:03:54.064 INFO:tasks.workunit.client.0.vm03.stdout:7/374: dwrite d2/d1f/d35/f3e [0,4194304] 0
2026-03-09T00:03:54.064 INFO:tasks.workunit.client.0.vm03.stdout:7/375: fsync d2/d1f/d42/d46/f5b 0
2026-03-09T00:03:54.064 INFO:tasks.workunit.client.0.vm03.stdout:8/365: truncate d7/df/f3d 194732 0
2026-03-09T00:03:54.064 INFO:tasks.workunit.client.0.vm03.stdout:9/406: mknod d15/d7f/c85 0
2026-03-09T00:03:54.064 INFO:tasks.workunit.client.0.vm03.stdout:8/366: write d7/f67 [167047,27159] 0
2026-03-09T00:03:54.064 INFO:tasks.workunit.client.0.vm03.stdout:9/407: write d15/d1c/d28/f55 [1086473,121048] 0
2026-03-09T00:03:54.064 INFO:tasks.workunit.client.0.vm03.stdout:8/367: dread - d7/df/d1e/d3f/f59 zero size
2026-03-09T00:03:54.064 INFO:tasks.workunit.client.0.vm03.stdout:9/408: creat d15/d1c/d36/f86 x:0 0 0
2026-03-09T00:03:54.065 INFO:tasks.workunit.client.0.vm03.stdout:0/362: dread d2/da/f4f [0,4194304] 0
2026-03-09T00:03:54.066 INFO:tasks.workunit.client.1.vm06.stdout:4/560: mkdir d17/d24/d3b/dbf 0
2026-03-09T00:03:54.066 INFO:tasks.workunit.client.1.vm06.stdout:2/682: symlink d7/lcd 0
2026-03-09T00:03:54.066 INFO:tasks.workunit.client.1.vm06.stdout:6/639: symlink d4/d27/d42/dc8/lcb 0
2026-03-09T00:03:54.074 INFO:tasks.workunit.client.0.vm03.stdout:9/409: dread d15/d1c/d21/d64/f50 [0,4194304] 0
2026-03-09T00:03:54.075 INFO:tasks.workunit.client.0.vm03.stdout:9/410: write d15/f23 [40164,56538] 0
2026-03-09T00:03:54.075 INFO:tasks.workunit.client.0.vm03.stdout:9/411: chown d15/c52 4045 1
2026-03-09T00:03:54.079 INFO:tasks.workunit.client.1.vm06.stdout:9/513: rename d1/d3/f1f to d1/d4/fa2 0
2026-03-09T00:03:54.093 INFO:tasks.workunit.client.1.vm06.stdout:9/514: stat d1/d4/d6e/d14/d25/d85/d49/l88 0
2026-03-09T00:03:54.093 INFO:tasks.workunit.client.0.vm03.stdout:7/376: mkdir d2/d4/d1e/d5e/d6c/d37/d39/d6e 0
2026-03-09T00:03:54.093 INFO:tasks.workunit.client.0.vm03.stdout:2/348: unlink d8/d26/c2b 0
2026-03-09T00:03:54.094 INFO:tasks.workunit.client.1.vm06.stdout:7/648: mknod d0/df/d1a/d3f/d53/cb4 0
2026-03-09T00:03:54.094 INFO:tasks.workunit.client.1.vm06.stdout:0/617: mknod d3/d18/d2c/d2d/cd0 0
2026-03-09T00:03:54.094 INFO:tasks.workunit.client.1.vm06.stdout:9/515: creat d1/fa3 x:0 0 0
2026-03-09T00:03:54.094 INFO:tasks.workunit.client.1.vm06.stdout:9/516: fsync d1/d4/d6e/d14/d25/d85/f28 0
2026-03-09T00:03:54.094 INFO:tasks.workunit.client.1.vm06.stdout:4/561: creat d17/d24/d3b/d5e/d6e/fc0 x:0 0 0
2026-03-09T00:03:54.094 INFO:tasks.workunit.client.1.vm06.stdout:1/548: rename d6/d21/d2d/d3b/d87/l94 to d6/d21/d2d/d3b/lb7 0
2026-03-09T00:03:54.094 INFO:tasks.workunit.client.1.vm06.stdout:4/562: read d17/d21/d4c/f90 [30391,12192] 0
2026-03-09T00:03:54.094 INFO:tasks.workunit.client.1.vm06.stdout:9/517: link d1/d4/d6e/f2c d1/d4/d6e/fa4 0
2026-03-09T00:03:54.094 INFO:tasks.workunit.client.1.vm06.stdout:9/518: creat d1/d3/d4f/d52/fa5 x:0 0 0
2026-03-09T00:03:54.100 INFO:tasks.workunit.client.0.vm03.stdout:0/363: creat d2/da/dd/d49/d6c/d4b/d55/f83 x:0 0 0
2026-03-09T00:03:54.100 INFO:tasks.workunit.client.0.vm03.stdout:0/364: dread - d2/d71/f7c zero size
2026-03-09T00:03:54.100 INFO:tasks.workunit.client.0.vm03.stdout:0/365: creat d2/da/d1a/f84 x:0 0 0
2026-03-09T00:03:54.100 INFO:tasks.workunit.client.0.vm03.stdout:0/366: dread - d2/d71/f77 zero size
2026-03-09T00:03:54.103 INFO:tasks.workunit.client.0.vm03.stdout:9/412: mkdir d15/d1c/d21/d54/d87 0
2026-03-09T00:03:54.109 INFO:tasks.workunit.client.1.vm06.stdout:4/563: creat d17/d24/d3b/dbf/fc1 x:0 0 0
2026-03-09T00:03:54.109 INFO:tasks.workunit.client.0.vm03.stdout:7/377: link d2/d4/c1c d2/d1f/d35/c6f 0
2026-03-09T00:03:54.123 INFO:tasks.workunit.client.0.vm03.stdout:3/291: rmdir d2/db/d40/d44 39
2026-03-09T00:03:54.124 INFO:tasks.workunit.client.1.vm06.stdout:1/549: getdents d6 0
2026-03-09T00:03:54.125 INFO:tasks.workunit.client.0.vm03.stdout:0/367: link d2/da/dd/d49/d6c/d4b/l50 d2/da/dd/d49/d6c/d81/l85 0
2026-03-09T00:03:54.125 INFO:tasks.workunit.client.0.vm03.stdout:9/413: creat d15/d7f/f88 x:0 0 0
2026-03-09T00:03:54.126 INFO:tasks.workunit.client.0.vm03.stdout:7/378: creat d2/d1f/d40/d67/f70 x:0 0 0
2026-03-09T00:03:54.126 INFO:tasks.workunit.client.0.vm03.stdout:7/379: chown d2/d1f/d42/f59 93 1
2026-03-09T00:03:54.126 INFO:tasks.workunit.client.0.vm03.stdout:7/380: readlink d2/d4/d1e/l2b 0
2026-03-09T00:03:54.130 INFO:tasks.workunit.client.1.vm06.stdout:5/718: dwrite d5/d44/d4b/d92/f40 [0,4194304] 0
2026-03-09T00:03:54.143 INFO:tasks.workunit.client.0.vm03.stdout:2/349: unlink d8/d17/f45 0
2026-03-09T00:03:54.143 INFO:tasks.workunit.client.0.vm03.stdout:2/350: chown d8/d1b/d24/c52 85160 1
2026-03-09T00:03:54.153 INFO:tasks.workunit.client.0.vm03.stdout:3/292: dread d2/db/d2d/f36 [0,4194304] 0
2026-03-09T00:03:54.155 INFO:tasks.workunit.client.0.vm03.stdout:3/293: mkdir d2/db/d2d/d55 0
2026-03-09T00:03:54.166 INFO:tasks.workunit.client.0.vm03.stdout:3/294: truncate d2/f5 5021443 0
2026-03-09T00:03:54.169 INFO:tasks.workunit.client.1.vm06.stdout:8/592: dwrite db/d53/d70/d38/f99 [0,4194304] 0
2026-03-09T00:03:54.187 INFO:tasks.workunit.client.1.vm06.stdout:1/550: unlink d6/d21/d2d/d3b/d42/ca8 0
2026-03-09T00:03:54.190 INFO:tasks.workunit.client.1.vm06.stdout:5/719: link d5/d1c/d68/l53 d5/d1c/d21/d28/d5e/d66/d78/da6/lf6 0
2026-03-09T00:03:54.195 INFO:tasks.workunit.client.1.vm06.stdout:2/683: dwrite f2 [0,4194304] 0
2026-03-09T00:03:54.195 INFO:tasks.workunit.client.1.vm06.stdout:5/720: write d5/f3d [281792,16371] 0
2026-03-09T00:03:54.196 INFO:tasks.workunit.client.1.vm06.stdout:2/684: write d7/da/db/de/f32 [330228,36666] 0
2026-03-09T00:03:54.196 INFO:tasks.workunit.client.1.vm06.stdout:2/685: chown d7/d1a/d25/d66/f84 305482 1
2026-03-09T00:03:54.198 INFO:tasks.workunit.client.0.vm03.stdout:5/414: dwrite d1c/d51/f68 [0,4194304] 0
2026-03-09T00:03:54.199 INFO:tasks.workunit.client.0.vm03.stdout:5/415: dread - d1c/d20/d55/d4f/d58/d5d/f64 zero size
2026-03-09T00:03:54.202 INFO:tasks.workunit.client.1.vm06.stdout:0/618: dwrite d3/d18/d79/f7f [0,4194304] 0
2026-03-09T00:03:54.204 INFO:tasks.workunit.client.1.vm06.stdout:7/649: dwrite d0/df/d1a/d3a/d4e/d5e/f6f [0,4194304] 0
2026-03-09T00:03:54.204 INFO:tasks.workunit.client.1.vm06.stdout:7/650: truncate d0/df/d1a/d3a/f5d 1626819 0
2026-03-09T00:03:54.206 INFO:tasks.workunit.client.1.vm06.stdout:8/593: symlink db/d53/d7c/lc3 0
2026-03-09T00:03:54.213 INFO:tasks.workunit.client.1.vm06.stdout:8/594: dread db/d74/f8e [0,4194304] 0
2026-03-09T00:03:54.213 INFO:tasks.workunit.client.1.vm06.stdout:8/595: readlink db/d1e/d46/d94/la1 0
2026-03-09T00:03:54.215 INFO:tasks.workunit.client.1.vm06.stdout:8/596: dread db/d1e/f4f [0,4194304] 0
2026-03-09T00:03:54.215 INFO:tasks.workunit.client.1.vm06.stdout:8/597: fsync db/d53/d7c/fa0 0
2026-03-09T00:03:54.215 INFO:tasks.workunit.client.1.vm06.stdout:8/598: chown db/d1e/f25 1 1
2026-03-09T00:03:54.215 INFO:tasks.workunit.client.1.vm06.stdout:8/599: stat db/d53/d6d/fa2 0
2026-03-09T00:03:54.220 INFO:tasks.workunit.client.0.vm03.stdout:3/295: dwrite d2/db/d2d/f37 [0,4194304] 0
2026-03-09T00:03:54.223 INFO:tasks.workunit.client.1.vm06.stdout:0/619: dread d3/d18/d1f/d39/d49/d60/fb3 [0,4194304] 0
2026-03-09T00:03:54.241 INFO:tasks.workunit.client.1.vm06.stdout:6/640: dwrite d4/d16/d53/fb7 [0,4194304] 0
2026-03-09T00:03:54.245 INFO:tasks.workunit.client.1.vm06.stdout:1/551: mknod d6/d4c/d71/d83/cb8 0
2026-03-09T00:03:54.249 INFO:tasks.workunit.client.1.vm06.stdout:6/641: dread d4/fb [0,4194304] 0
2026-03-09T00:03:54.249 INFO:tasks.workunit.client.1.vm06.stdout:6/642: stat d4/d16/c47 0
2026-03-09T00:03:54.249 INFO:tasks.workunit.client.1.vm06.stdout:6/643: creat d4/d27/d42/da6/fcc x:0 0 0
2026-03-09T00:03:54.249 INFO:tasks.workunit.client.1.vm06.stdout:6/644: stat d4/d27/d3e/d57/f65 0
2026-03-09T00:03:54.249 INFO:tasks.workunit.client.1.vm06.stdout:6/645: write d4/ff [1126066,55313] 0
2026-03-09T00:03:54.268 INFO:tasks.workunit.client.0.vm03.stdout:3/296: chown d2/db/f10 43646347 1
2026-03-09T00:03:54.269 INFO:tasks.workunit.client.0.vm03.stdout:1/475: dwrite d4/d15/f8a [0,4194304] 0
2026-03-09T00:03:54.269 INFO:tasks.workunit.client.0.vm03.stdout:1/476: write d4/d15/d1a/f92 [841566,4438] 0
2026-03-09T00:03:54.271 INFO:tasks.workunit.client.0.vm03.stdout:2/351: dwrite d8/d1b/f22 [0,4194304] 0
2026-03-09T00:03:54.278 INFO:tasks.workunit.client.1.vm06.stdout:8/600: getdents db/dd/d24 0
2026-03-09T00:03:54.286 INFO:tasks.workunit.client.1.vm06.stdout:3/663: sync
2026-03-09T00:03:54.292 INFO:tasks.workunit.client.0.vm03.stdout:5/416: link d1c/d20/d55/d43/l79 d1c/d20/d55/d66/d70/l88 0
2026-03-09T00:03:54.292 INFO:tasks.workunit.client.1.vm06.stdout:0/620: unlink d3/d18/d79/l9b 0
2026-03-09T00:03:54.298 INFO:tasks.workunit.client.1.vm06.stdout:5/721: dwrite d5/d44/d4b/d92/d49/da0/fda [0,4194304] 0
2026-03-09T00:03:54.318 INFO:tasks.workunit.client.0.vm03.stdout:8/368: read d7/df/f3d [132290,35620] 0
2026-03-09T00:03:54.321 INFO:tasks.workunit.client.1.vm06.stdout:1/552: truncate d6/d21/d2d/f3c 1989810 0
2026-03-09T00:03:54.327 INFO:tasks.workunit.client.0.vm03.stdout:3/297: dwrite d2/f8 [0,4194304] 0
2026-03-09T00:03:54.335 INFO:tasks.workunit.client.0.vm03.stdout:3/298: dread d2/db/f10 [4194304,4194304] 0
2026-03-09T00:03:54.344 INFO:tasks.workunit.client.1.vm06.stdout:6/646: mkdir d4/d27/d42/d7e/dac/dcd 0
2026-03-09T00:03:54.356 INFO:tasks.workunit.client.1.vm06.stdout:8/601: creat db/d74/d78/d98/d9c/fc4 x:0 0 0
2026-03-09T00:03:54.375 INFO:tasks.workunit.client.0.vm03.stdout:4/444: sync
2026-03-09T00:03:54.375 INFO:tasks.workunit.client.0.vm03.stdout:4/445: chown d7/d20/d29/d54/d58 50632 1
2026-03-09T00:03:54.375 INFO:tasks.workunit.client.0.vm03.stdout:4/446: write d7/f1d [4621709,129334] 0
2026-03-09T00:03:54.406 INFO:tasks.workunit.client.0.vm03.stdout:1/477: dwrite d4/d3a/f2c [0,4194304] 0
2026-03-09T00:03:54.411 INFO:tasks.workunit.client.1.vm06.stdout:0/621: creat d3/d18/d28/fd1 x:0 0 0
2026-03-09T00:03:54.418 INFO:tasks.workunit.client.0.vm03.stdout:2/352: creat d8/d1b/f71 x:0 0 0
2026-03-09T00:03:54.428 INFO:tasks.workunit.client.1.vm06.stdout:7/651: dwrite d0/df/d1a/f44 [8388608,4194304] 0
2026-03-09T00:03:54.428 INFO:tasks.workunit.client.0.vm03.stdout:8/369: rmdir d7/df/d1a/d40 39
2026-03-09T00:03:54.428 INFO:tasks.workunit.client.0.vm03.stdout:8/370: chown d7/df/d1e/c28 8066 1
2026-03-09T00:03:54.441 INFO:tasks.workunit.client.0.vm03.stdout:4/447: mknod d7/d27/c8d 0
2026-03-09T00:03:54.447 INFO:tasks.workunit.client.1.vm06.stdout:1/553: mknod d6/d4c/d79/cb9 0
2026-03-09T00:03:54.447 INFO:tasks.workunit.client.1.vm06.stdout:1/554: write d6/d4c/d71/f45 [498083,17507] 0
2026-03-09T00:03:54.447 INFO:tasks.workunit.client.0.vm03.stdout:4/448: dread d7/d20/d6a/d77/f83 [0,4194304] 0
2026-03-09T00:03:54.452 INFO:tasks.workunit.client.0.vm03.stdout:4/449: write d7/d20/f3d [651774,25737] 0
2026-03-09T00:03:54.454 INFO:tasks.workunit.client.1.vm06.stdout:6/647: symlink d4/d27/d3e/d57/lce 0
2026-03-09T00:03:54.473 INFO:tasks.workunit.client.1.vm06.stdout:8/602: symlink db/d1e/d46/d94/lc5 0
2026-03-09T00:03:54.478 INFO:tasks.workunit.client.1.vm06.stdout:8/603: dread db/f17 [0,4194304] 0
2026-03-09T00:03:54.483 INFO:tasks.workunit.client.1.vm06.stdout:8/604: dread db/dd/f40 [0,4194304] 0
2026-03-09T00:03:54.484 INFO:tasks.workunit.client.1.vm06.stdout:8/605: fdatasync db/d53/d70/d38/f3a 0
2026-03-09T00:03:54.495 INFO:tasks.workunit.client.1.vm06.stdout:0/622: symlink d3/d18/d2c/d2d/d74/d7d/d9f/ld2 0
2026-03-09T00:03:54.523 INFO:tasks.workunit.client.1.vm06.stdout:7/652: symlink d0/df/d1a/d3a/lb5 0
2026-03-09T00:03:54.523 INFO:tasks.workunit.client.1.vm06.stdout:7/653: chown d0/fe 1663 1
2026-03-09T00:03:54.526 INFO:tasks.workunit.client.0.vm03.stdout:1/478: truncate f2 2690554 0
2026-03-09T00:03:54.526 INFO:tasks.workunit.client.0.vm03.stdout:1/479: write d4/d3a/d61/d78/f8e [248058,18913] 0
2026-03-09T00:03:54.526 INFO:tasks.workunit.client.0.vm03.stdout:1/480: chown d4/d3a/d3d 113097818 1
2026-03-09T00:03:54.541 INFO:tasks.workunit.client.1.vm06.stdout:5/722: symlink d5/d1c/d23/d34/lf7 0
2026-03-09T00:03:54.541 INFO:tasks.workunit.client.1.vm06.stdout:5/723: dread - d5/d1c/d21/ff3 zero size
2026-03-09T00:03:54.542 INFO:tasks.workunit.client.1.vm06.stdout:1/555: getdents d6/d8f 0
2026-03-09T00:03:54.543 INFO:tasks.workunit.client.0.vm03.stdout:2/353: unlink d8/d1b/f32 0
2026-03-09T00:03:54.548 INFO:tasks.workunit.client.1.vm06.stdout:8/606: creat db/dd/d24/dac/fc6 x:0 0 0
2026-03-09T00:03:54.553 INFO:tasks.workunit.client.0.vm03.stdout:4/450: truncate d7/f7e 4007456 0
2026-03-09T00:03:54.553 INFO:tasks.workunit.client.1.vm06.stdout:5/724: truncate d5/d44/d4b/d92/f52 237101 0
2026-03-09T00:03:54.553 INFO:tasks.workunit.client.1.vm06.stdout:5/725: dread - d5/d1c/d21/d28/d5e/d66/fc9 zero size
2026-03-09T00:03:54.555 INFO:tasks.workunit.client.0.vm03.stdout:2/354: symlink d8/d1b/d2a/d56/l72 0
2026-03-09T00:03:54.556 INFO:tasks.workunit.client.1.vm06.stdout:5/726: dread d5/d1c/d68/f3f [0,4194304] 0
2026-03-09T00:03:54.557 INFO:tasks.workunit.client.1.vm06.stdout:6/648: dwrite d4/f3d [0,4194304] 0
2026-03-09T00:03:54.559 INFO:tasks.workunit.client.0.vm03.stdout:4/451: read d7/f28 [98426,111248] 0
2026-03-09T00:03:54.559 INFO:tasks.workunit.client.0.vm03.stdout:4/452: dread - d7/d20/d6a/d77/d25/f7f zero size
2026-03-09T00:03:54.559 INFO:tasks.workunit.client.0.vm03.stdout:4/453: write d7/d20/d29/d4e/f73 [733063,47549] 0
2026-03-09T00:03:54.559 INFO:tasks.workunit.client.0.vm03.stdout:4/454: readlink d7/d20/d6a/d77/l7a 0
2026-03-09T00:03:54.559 INFO:tasks.workunit.client.0.vm03.stdout:4/455: getdents d7/d20/d35/d66 0
2026-03-09T00:03:54.578 INFO:tasks.workunit.client.1.vm06.stdout:1/556: creat d6/d4c/d51/fba x:0 0 0
2026-03-09T00:03:54.581 INFO:tasks.workunit.client.1.vm06.stdout:8/607: mkdir db/d74/d78/d98/db6/dc7 0
2026-03-09T00:03:54.583 INFO:tasks.workunit.client.0.vm03.stdout:2/355: write d8/d17/f68 [4090677,124822] 0
2026-03-09T00:03:54.589 INFO:tasks.workunit.client.1.vm06.stdout:1/557: dread d6/d63/f6a [0,4194304] 0
2026-03-09T00:03:54.589 INFO:tasks.workunit.client.1.vm06.stdout:0/623: dwrite d3/d18/d1f/d44/f7c [0,4194304] 0
2026-03-09T00:03:54.589 INFO:tasks.workunit.client.0.vm03.stdout:4/456: dwrite d7/d20/d29/d54/d58/f6b [4194304,4194304] 0
2026-03-09T00:03:54.589 INFO:tasks.workunit.client.0.vm03.stdout:4/457: chown d7/d20/d29/f2a 3551105 1
2026-03-09T00:03:54.589 INFO:tasks.workunit.client.0.vm03.stdout:4/458: stat d7/d20/d29/d38/d3a/l8b 0
2026-03-09T00:03:54.589 INFO:tasks.workunit.client.0.vm03.stdout:2/356: mknod d8/d1b/d2a/d2e/c73 0
2026-03-09T00:03:54.591 INFO:tasks.workunit.client.1.vm06.stdout:5/727: truncate d5/f8e 2874546 0
2026-03-09T00:03:54.616 INFO:tasks.workunit.client.1.vm06.stdout:1/558: truncate d6/d4c/d79/f59 139989 0
2026-03-09T00:03:54.616 INFO:tasks.workunit.client.1.vm06.stdout:1/559: write d6/d4c/d79/fb2 [110923,49505] 0
2026-03-09T00:03:54.618 INFO:tasks.workunit.client.0.vm03.stdout:4/459: symlink d7/d20/d29/l8e 0
2026-03-09T00:03:54.618 INFO:tasks.workunit.client.0.vm03.stdout:4/460: creat d7/d20/d29/d38/f8f x:0 0 0
2026-03-09T00:03:54.641 INFO:tasks.workunit.client.0.vm03.stdout:2/357: truncate d8/f5d 335124 0
2026-03-09T00:03:54.641 INFO:tasks.workunit.client.0.vm03.stdout:2/358: chown d8/d1b/d2a/d2e/c40 27544 1
2026-03-09T00:03:54.650 INFO:tasks.workunit.client.0.vm03.stdout:2/359: getdents d8/d26 0
2026-03-09T00:03:54.654 INFO:tasks.workunit.client.0.vm03.stdout:2/360: unlink d8/d1b/d24/f46 0
2026-03-09T00:03:54.666 INFO:tasks.workunit.client.1.vm06.stdout:0/624: dwrite d3/d18/d1f/d39/d3b/f57 [0,4194304] 0
2026-03-09T00:03:54.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:54 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:54.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:54 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:54.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:54 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:54.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:54 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:54.672 INFO:tasks.workunit.client.0.vm03.stdout:2/361: dread f6 [0,4194304] 0
2026-03-09T00:03:54.677 INFO:tasks.workunit.client.1.vm06.stdout:5/728: dwrite d5/d1c/d21/d28/d5e/d66/d78/dc8/daa/ff1 [0,4194304] 0
2026-03-09T00:03:54.681 INFO:tasks.workunit.client.1.vm06.stdout:0/625: creat d3/d18/d1f/d39/d49/d60/fd3 x:0 0 0
2026-03-09T00:03:54.681 INFO:tasks.workunit.client.0.vm03.stdout:4/461: dwrite d7/d20/d35/d66/f69 [0,4194304] 0
2026-03-09T00:03:54.681 INFO:tasks.workunit.client.0.vm03.stdout:4/462: stat d7/d20/c79 0
2026-03-09T00:03:54.684 INFO:tasks.workunit.client.0.vm03.stdout:4/463: mkdir d7/d20/d29/d38/d3a/d90 0
2026-03-09T00:03:54.684 INFO:tasks.workunit.client.0.vm03.stdout:4/464: chown d7/d20/d29/f43 15447 1
2026-03-09T00:03:54.685 INFO:tasks.workunit.client.1.vm06.stdout:5/729: mknod d5/d1c/d68/dec/cf8 0
2026-03-09T00:03:54.685 INFO:tasks.workunit.client.1.vm06.stdout:5/730: stat d5/d1c/d23/d34/d47/ldb 0
2026-03-09T00:03:54.690 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:54 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:54.690 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:54 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:54.690 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:54 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:54.690 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:54 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:54.702 INFO:tasks.workunit.client.1.vm06.stdout:0/626: creat d3/d18/d2c/d2d/d74/da8/fd4 x:0 0 0
2026-03-09T00:03:54.702 INFO:tasks.workunit.client.1.vm06.stdout:0/627: chown d3/d18/d2c/d2d/d74/d7d/cc2 188776810 1
2026-03-09T00:03:54.702 INFO:tasks.workunit.client.1.vm06.stdout:0/628: chown d3/d18/d28/d45/f52 18764917 1
2026-03-09T00:03:54.707 INFO:tasks.workunit.client.0.vm03.stdout:4/465: rmdir d7/d20/d6a 39
2026-03-09T00:03:54.723 INFO:tasks.workunit.client.0.vm03.stdout:2/362: dwrite d8/d17/f34 [4194304,4194304] 0
2026-03-09T00:03:54.723 INFO:tasks.workunit.client.0.vm03.stdout:2/363: chown d8/d1b/d2a/f33 4624 1
2026-03-09T00:03:54.741 INFO:tasks.workunit.client.0.vm03.stdout:4/466: creat d7/d20/d29/d38/d3a/d90/f91 x:0 0 0
2026-03-09T00:03:54.741 INFO:tasks.workunit.client.0.vm03.stdout:4/467: read d7/d20/d6a/d77/d25/f3e [2400577,123009] 0
2026-03-09T00:03:54.741 INFO:tasks.workunit.client.0.vm03.stdout:4/468: readlink d7/d20/l5b 0
2026-03-09T00:03:54.753 INFO:tasks.workunit.client.1.vm06.stdout:7/654: symlink d0/df/d1a/d35/lb6 0
2026-03-09T00:03:54.754 INFO:tasks.workunit.client.1.vm06.stdout:7/655: symlink d0/d55/d85/lb7 0
2026-03-09T00:03:54.773 INFO:tasks.workunit.client.1.vm06.stdout:4/564: mkdir d17/d21/d4c/dc2 0
2026-03-09T00:03:54.773 INFO:tasks.workunit.client.1.vm06.stdout:4/565: dread - d17/d24/d3b/d54/f7e zero size
2026-03-09T00:03:54.773 INFO:tasks.workunit.client.1.vm06.stdout:4/566: chown d17/d24/d3b/l45 13 1
2026-03-09T00:03:54.773 INFO:tasks.workunit.client.1.vm06.stdout:4/567: chown d17/d24/d49/f5a 784051894 1
2026-03-09T00:03:54.896 INFO:tasks.workunit.client.1.vm06.stdout:9/519: rename d1/d4/d6e/d14/d25/f7a to d1/d3/fa6 0
2026-03-09T00:03:54.896 INFO:tasks.workunit.client.1.vm06.stdout:9/520: mkdir d1/da7 0
2026-03-09T00:03:54.897 INFO:tasks.workunit.client.1.vm06.stdout:3/664: rename d11/d28/d2e/d2f/d36/ca8 to d11/d3f/ce9 0
2026-03-09T00:03:54.897 INFO:tasks.workunit.client.1.vm06.stdout:3/665: write d11/d28/d2e/d7e/d83/fe8 [1016454,85006] 0
2026-03-09T00:03:54.901 INFO:tasks.workunit.client.0.vm03.stdout:4/469: dwrite d7/d20/d29/f53 [0,4194304] 0
2026-03-09T00:03:54.906 INFO:tasks.workunit.client.1.vm06.stdout:3/666: dread d11/d28/d2e/f62 [0,4194304] 0
2026-03-09T00:03:54.913 INFO:tasks.workunit.client.1.vm06.stdout:9/521: getdents d1/d4/d6e 0
2026-03-09T00:03:54.914 INFO:tasks.workunit.client.1.vm06.stdout:9/522: readlink d1/d4/d6e/d9/l79 0
2026-03-09T00:03:54.923 INFO:tasks.workunit.client.1.vm06.stdout:1/560: rename d6/d21/da6/la9 to d6/d4c/d51/lbb 0
2026-03-09T00:03:54.924 INFO:tasks.workunit.client.0.vm03.stdout:4/470: unlink d7/d20/d6a/d77/d25/l7d 0
2026-03-09T00:03:54.924 INFO:tasks.workunit.client.0.vm03.stdout:4/471: fsync d7/f28 0
2026-03-09T00:03:54.930 INFO:tasks.workunit.client.1.vm06.stdout:1/561: mkdir d6/d21/d2d/d37/dbc 0
2026-03-09T00:03:54.938 INFO:tasks.workunit.client.0.vm03.stdout:4/472: symlink d7/d20/d29/d38/d3a/l92 0
2026-03-09T00:03:54.938 INFO:tasks.workunit.client.0.vm03.stdout:4/473: dread - d7/d20/d29/d38/d3a/d90/f91 zero size
2026-03-09T00:03:54.940 INFO:tasks.workunit.client.0.vm03.stdout:4/474: unlink d7/d20/c3f 0
2026-03-09T00:03:54.944 INFO:tasks.workunit.client.0.vm03.stdout:5/417: rmdir d1c/d20 39
2026-03-09T00:03:54.949 INFO:tasks.workunit.client.0.vm03.stdout:4/475: link d7/d20/d29/d54/c5a d7/d20/d29/d38/c93 0
2026-03-09T00:03:54.957 INFO:tasks.workunit.client.0.vm03.stdout:4/476: mknod d7/d20/d35/d66/c94 0
2026-03-09T00:03:54.957 INFO:tasks.workunit.client.0.vm03.stdout:4/477: truncate d7/d20/d6a/d77/d25/f7f 548825 0
2026-03-09T00:03:54.957 INFO:tasks.workunit.client.0.vm03.stdout:4/478: write d7/d20/d29/f2a [5043739,52825] 0
2026-03-09T00:03:54.957 INFO:tasks.workunit.client.0.vm03.stdout:4/479: truncate d7/d20/f70 52228 0
2026-03-09T00:03:54.957 INFO:tasks.workunit.client.1.vm06.stdout:0/629: rmdir d3/d18/d2c/d2d/d74/d7d/d9f 39
2026-03-09T00:03:54.957 INFO:tasks.workunit.client.1.vm06.stdout:6/649: creat d4/d16/fcf x:0 0 0
2026-03-09T00:03:54.957 INFO:tasks.workunit.client.1.vm06.stdout:6/650: truncate d4/d27/d3e/d57/f5c 5137269 0
2026-03-09T00:03:54.964 INFO:tasks.workunit.client.1.vm06.stdout:3/667: dwrite d11/d28/d2e/f47 [4194304,4194304] 0
2026-03-09T00:03:54.966 INFO:tasks.workunit.client.0.vm03.stdout:4/480: symlink d7/d20/d29/d54/d58/l95 0
2026-03-09T00:03:54.971 INFO:tasks.workunit.client.1.vm06.stdout:6/651: dread d4/d27/d3e/f41 [0,4194304] 0
2026-03-09T00:03:54.973 INFO:tasks.workunit.client.1.vm06.stdout:3/668: creat d11/d28/d2e/d2f/d5b/fea x:0 0 0
2026-03-09T00:03:54.974 INFO:tasks.workunit.client.0.vm03.stdout:4/481: rmdir d7/d20/d35 39
2026-03-09T00:03:54.980 INFO:tasks.workunit.client.1.vm06.stdout:6/652: creat d4/d16/d46/d90/fd0 x:0 0 0
2026-03-09T00:03:54.980 INFO:tasks.workunit.client.1.vm06.stdout:6/653: chown d4/f2a 1675319114 1
2026-03-09T00:03:54.980 INFO:tasks.workunit.client.1.vm06.stdout:6/654: chown d4/c25 0 1
2026-03-09T00:03:54.982 INFO:tasks.workunit.client.0.vm03.stdout:5/418: write ff [3882762,83924] 0
2026-03-09T00:03:54.986 INFO:tasks.workunit.client.1.vm06.stdout:9/523: dwrite d1/d3/d4f/f74 [0,4194304] 0
2026-03-09T00:03:54.986 INFO:tasks.workunit.client.1.vm06.stdout:9/524: dread - d1/d4/d6e/d14/d25/f6f zero size
2026-03-09T00:03:54.986 INFO:tasks.workunit.client.1.vm06.stdout:9/525: chown d1/d4/d6e/d14/d25/c9a 16 1
2026-03-09T00:03:55.010 INFO:tasks.workunit.client.1.vm06.stdout:3/669: rename d11/c63 to d11/d28/d2e/d2f/d5b/d5f/db1/ceb 0
2026-03-09T00:03:55.010 INFO:tasks.workunit.client.1.vm06.stdout:3/670: dread - d11/d28/d2e/d2f/f78 zero size
2026-03-09T00:03:55.017 INFO:tasks.workunit.client.1.vm06.stdout:2/686: creat d7/d1b/fce x:0 0 0
2026-03-09T00:03:55.023 INFO:tasks.workunit.client.1.vm06.stdout:2/687: fdatasync d7/da/db/de/f11 0
2026-03-09T00:03:55.024 INFO:tasks.workunit.client.1.vm06.stdout:2/688: symlink d7/da/d4e/d57/lcf 0
2026-03-09T00:03:55.024 INFO:tasks.workunit.client.1.vm06.stdout:2/689: symlink d7/da/d4e/d57/ld0 0
2026-03-09T00:03:55.024 INFO:tasks.workunit.client.1.vm06.stdout:2/690: mkdir d7/dd1 0
2026-03-09T00:03:55.024 INFO:tasks.workunit.client.1.vm06.stdout:0/630: dread d3/d18/d2c/d2d/d31/f88 [0,4194304] 0
2026-03-09T00:03:55.024 INFO:tasks.workunit.client.1.vm06.stdout:0/631: chown d3/d18/d3c/fa0 6470 1
2026-03-09T00:03:55.025 INFO:tasks.workunit.client.1.vm06.stdout:2/691: unlink d7/da/db/l35 0
2026-03-09T00:03:55.025 INFO:tasks.workunit.client.1.vm06.stdout:2/692: fdatasync d7/da/d4e/d57/f7a 0
2026-03-09T00:03:55.025 INFO:tasks.workunit.client.1.vm06.stdout:0/632: truncate d3/d18/d1f/d44/f5a 4093789 0
2026-03-09T00:03:55.027 INFO:tasks.workunit.client.1.vm06.stdout:2/693: creat d7/d1b/d71/d79/db4/dc1/fd2 x:0 0 0
2026-03-09T00:03:55.028 INFO:tasks.workunit.client.0.vm03.stdout:4/482: creat d7/d20/d29/d54/f96 x:0 0 0
2026-03-09T00:03:55.028 INFO:tasks.workunit.client.1.vm06.stdout:3/671: write d11/d28/d2e/d2f/d36/f75 [2625186,113765] 0
2026-03-09T00:03:55.029 INFO:tasks.workunit.client.1.vm06.stdout:0/633: dread d3/d18/d1f/d39/d69/f71 [0,4194304] 0
2026-03-09T00:03:55.029 INFO:tasks.workunit.client.1.vm06.stdout:2/694: unlink d7/d1a/d56/fa4 0
2026-03-09T00:03:55.031 INFO:tasks.workunit.client.1.vm06.stdout:1/562: dwrite d6/d21/d2d/d3b/d87/f9e [0,4194304] 0
2026-03-09T00:03:55.037 INFO:tasks.workunit.client.0.vm03.stdout:1/481: dwrite f2 [0,4194304] 0
2026-03-09T00:03:55.038 INFO:tasks.workunit.client.1.vm06.stdout:0/634: write d3/d18/d1f/d39/d49/d60/fb3 [2871120,20070] 0
2026-03-09T00:03:55.038 INFO:tasks.workunit.client.1.vm06.stdout:0/635: write d3/d18/d1f/d39/d49/d60/fd3 [625256,91683] 0
2026-03-09T00:03:55.041 INFO:tasks.workunit.client.1.vm06.stdout:1/563: dread d6/f19 [0,4194304] 0
2026-03-09T00:03:55.041 INFO:tasks.workunit.client.1.vm06.stdout:1/564: creat d6/d63/fbd x:0 0 0
2026-03-09T00:03:55.047 INFO:tasks.workunit.client.1.vm06.stdout:9/526: dwrite d1/d73/f8f [4194304,4194304] 0
2026-03-09T00:03:55.047 INFO:tasks.workunit.client.1.vm06.stdout:9/527: truncate d1/d3/f23 4622145 0
2026-03-09T00:03:55.048 INFO:tasks.workunit.client.0.vm03.stdout:4/483: truncate d7/f15 237898 0
2026-03-09T00:03:55.048 INFO:tasks.workunit.client.0.vm03.stdout:4/484: chown d7/d20/d29/d4e/c72 0 1
2026-03-09T00:03:55.048 INFO:tasks.workunit.client.0.vm03.stdout:4/485: write d7/f1c [3710083,12858] 0
2026-03-09T00:03:55.048 INFO:tasks.workunit.client.0.vm03.stdout:4/486: chown d7/d20/d29/d38 2004 1
2026-03-09T00:03:55.053 INFO:tasks.workunit.client.0.vm03.stdout:5/419: dwrite d1c/f30 [0,4194304] 0
2026-03-09T00:03:55.058 INFO:tasks.workunit.client.0.vm03.stdout:0/368: symlink d2/l86 0
2026-03-09T00:03:55.059 INFO:tasks.workunit.client.0.vm03.stdout:0/369: fsync d2/f22 0
2026-03-09T00:03:55.059 INFO:tasks.workunit.client.0.vm03.stdout:0/370: readlink d2/d1f/l2e 0
2026-03-09T00:03:55.059 INFO:tasks.workunit.client.0.vm03.stdout:0/371: creat d2/da/dd/d49/f87 x:0 0 0
2026-03-09T00:03:55.066 INFO:tasks.workunit.client.1.vm06.stdout:3/672: creat d11/d28/d2e/d2f/fec x:0 0 0
2026-03-09T00:03:55.066 INFO:tasks.workunit.client.1.vm06.stdout:3/673: chown d11/d28/d2e/d7e/fd3 28413840 1
2026-03-09T00:03:55.067 INFO:tasks.workunit.client.1.vm06.stdout:7/656: creat d0/df/fb8 x:0 0 0
2026-03-09T00:03:55.067 INFO:tasks.workunit.client.1.vm06.stdout:7/657: chown d0/df/d1a/d3a/d4e/f63 15555584 1
2026-03-09T00:03:55.068 INFO:tasks.workunit.client.1.vm06.stdout:7/658: dread d0/df/d1a/d3a/f5d [0,4194304] 0
2026-03-09T00:03:55.069 INFO:tasks.workunit.client.1.vm06.stdout:2/695: rename d7/d1b/d31/f90 to d7/d1a/fd3 0
2026-03-09T00:03:55.069 INFO:tasks.workunit.client.1.vm06.stdout:2/696: write d7/da/f18 [594357,6667] 0
2026-03-09T00:03:55.069 INFO:tasks.workunit.client.1.vm06.stdout:2/697: chown d7/da/d63/d81 12638 1
2026-03-09T00:03:55.074 INFO:tasks.workunit.client.0.vm03.stdout:1/482: mkdir d4/d3a/d32/da1 0
2026-03-09T00:03:55.081 INFO:tasks.workunit.client.1.vm06.stdout:0/636: symlink d3/d18/d1f/d39/d69/ld5 0
2026-03-09T00:03:55.081 INFO:tasks.workunit.client.1.vm06.stdout:0/637: chown d3/d18/d2c/d2d/d74/d7d/fbd 97 1
2026-03-09T00:03:55.082 INFO:tasks.workunit.client.1.vm06.stdout:8/608: link db/d53/d7c/lc3 db/d74/d78/lc8 0
2026-03-09T00:03:55.085 INFO:tasks.workunit.client.1.vm06.stdout:1/565: creat d6/d4c/fbe x:0 0 0
2026-03-09T00:03:55.085 INFO:tasks.workunit.client.1.vm06.stdout:1/566: chown d6/l12 2682 1
2026-03-09T00:03:55.085 INFO:tasks.workunit.client.1.vm06.stdout:1/567: stat d6/d63/f9c 0
2026-03-09T00:03:55.096 INFO:tasks.workunit.client.1.vm06.stdout:9/528: dwrite d1/f2a [0,4194304] 0
2026-03-09T00:03:55.101 INFO:tasks.workunit.client.0.vm03.stdout:1/483: creat d4/d3a/d3d/fa2 x:0 0 0
2026-03-09T00:03:55.104 INFO:tasks.workunit.client.1.vm06.stdout:3/674: mknod d11/d28/d2e/d2f/d5b/d94/ddd/ced 0
2026-03-09T00:03:55.104 INFO:tasks.workunit.client.1.vm06.stdout:9/529: write d1/d4/f24 [196656,126953] 0
2026-03-09T00:03:55.118 INFO:tasks.workunit.client.0.vm03.stdout:5/420: link d1c/d20/d55/d43/c6e d1c/d20/d55/d66/c89 0
2026-03-09T00:03:55.118 INFO:tasks.workunit.client.0.vm03.stdout:5/421: fsync d1c/f29 0
2026-03-09T00:03:55.124 INFO:tasks.workunit.client.1.vm06.stdout:0/638: dwrite d3/d18/d1f/d44/d6a/d73/fab [0,4194304] 0
2026-03-09T00:03:55.124 INFO:tasks.workunit.client.1.vm06.stdout:0/639: creat d3/d18/d2c/d2d/d8c/fd6 x:0 0 0
2026-03-09T00:03:55.124 INFO:tasks.workunit.client.1.vm06.stdout:0/640: creat d3/d18/d28/d45/fd7 x:0 0 0
2026-03-09T00:03:55.124 INFO:tasks.workunit.client.1.vm06.stdout:0/641: chown d3/d18/d1f/d39/d3b/f57 12 1
2026-03-09T00:03:55.124 INFO:tasks.workunit.client.1.vm06.stdout:0/642: write d3/d18/d1f/d39/d49/d60/fb3 [4599807,95929] 0
2026-03-09T00:03:55.124 INFO:tasks.workunit.client.1.vm06.stdout:0/643: chown d3/d18/d1f/d39/d3b/f57 1738853 1
2026-03-09T00:03:55.124 INFO:tasks.workunit.client.1.vm06.stdout:0/644: chown d3/d18/d1f/d44/d6a/cc3 0 1
2026-03-09T00:03:55.130 INFO:tasks.workunit.client.1.vm06.stdout:7/659: unlink d0/df/d17/f21 0
2026-03-09T00:03:55.130 INFO:tasks.workunit.client.1.vm06.stdout:7/660: readlink d0/df/l10 0
2026-03-09T00:03:55.130 INFO:tasks.workunit.client.1.vm06.stdout:7/661: chown d0/df/d17/f74 1021 1
2026-03-09T00:03:55.130 INFO:tasks.workunit.client.1.vm06.stdout:7/662: write d0/df/d1a/f72 [542383,46583] 0
2026-03-09T00:03:55.130 INFO:tasks.workunit.client.1.vm06.stdout:7/663: creat d0/d55/d99/fb9 x:0 0 0
2026-03-09T00:03:55.130 INFO:tasks.workunit.client.1.vm06.stdout:7/664: dread - d0/df/d1a/d27/d4c/d40/d5b/faf zero size
2026-03-09T00:03:55.130 INFO:tasks.workunit.client.1.vm06.stdout:7/665: chown d0/df/d1a/d27/d4c/d40/d5b/f78 1 1
2026-03-09T00:03:55.136 INFO:tasks.workunit.client.0.vm03.stdout:9/414: rename d15/c60 to d15/d1c/d21/d54/d87/c89 0
2026-03-09T00:03:55.136 INFO:tasks.workunit.client.0.vm03.stdout:9/415: fsync d15/d1c/d36/f3a 0
2026-03-09T00:03:55.136 INFO:tasks.workunit.client.0.vm03.stdout:9/416: write d15/d1c/d21/d64/f50 [4850431,1120] 0
2026-03-09T00:03:55.138 INFO:tasks.workunit.client.1.vm06.stdout:2/698: creat d7/d1a/d56/fd4 x:0 0 0
2026-03-09T00:03:55.138 INFO:tasks.workunit.client.1.vm06.stdout:2/699: chown d7/d1a/f30 104803 1
2026-03-09T00:03:55.138 INFO:tasks.workunit.client.1.vm06.stdout:2/700: getdents d7/d1a/d25/d66 0
2026-03-09T00:03:55.138 INFO:tasks.workunit.client.1.vm06.stdout:2/701: fdatasync d7/d1b/f46 0
2026-03-09T00:03:55.145 INFO:tasks.workunit.client.0.vm03.stdout:1/484: rename d4/d3a/d84 to d4/d3a/d32/da3 0
2026-03-09T00:03:55.148 INFO:tasks.workunit.client.1.vm06.stdout:3/675: fdatasync d11/d28/d2e/d2f/d36/f75 0
2026-03-09T00:03:55.148 INFO:tasks.workunit.client.1.vm06.stdout:3/676: fsync d11/d28/d2e/d2f/d36/f4a 0
2026-03-09T00:03:55.152 INFO:tasks.workunit.client.1.vm06.stdout:8/609: unlink db/dd/f64 0
2026-03-09T00:03:55.153 INFO:tasks.workunit.client.1.vm06.stdout:1/568: rmdir d6/d21/d2d 39
2026-03-09T00:03:55.159 INFO:tasks.workunit.client.0.vm03.stdout:9/417: mknod d15/d1c/d21/d54/c8a 0
2026-03-09T00:03:55.171 INFO:tasks.workunit.client.0.vm03.stdout:9/418: truncate f10 2198548 0
2026-03-09T00:03:55.172 INFO:tasks.workunit.client.0.vm03.stdout:1/485: symlink d4/d15/d77/la4 0
2026-03-09T00:03:55.173 INFO:tasks.workunit.client.1.vm06.stdout:8/610: write db/f16 [428865,20364] 0
2026-03-09T00:03:55.179 INFO:tasks.workunit.client.0.vm03.stdout:1/486: write f2 [3413515,32638] 0
2026-03-09T00:03:55.189 INFO:tasks.workunit.client.0.vm03.stdout:9/419: read d15/d1c/d21/d64/f3d [121058,9341] 0
2026-03-09T00:03:55.189 INFO:tasks.workunit.client.0.vm03.stdout:9/420: write f8 [40853,22398] 0
2026-03-09T00:03:55.190 INFO:tasks.workunit.client.0.vm03.stdout:9/421: dread d15/d1c/d28/d6e/f7c [0,4194304] 0
2026-03-09T00:03:55.195 INFO:tasks.workunit.client.0.vm03.stdout:9/422: stat d15/d77/c79 0
2026-03-09T00:03:55.195 INFO:tasks.workunit.client.1.vm06.stdout:5/731: unlink d5/d1c/f75 0
2026-03-09T00:03:55.197 INFO:tasks.workunit.client.1.vm06.stdout:0/645: getdents d3/d18/d2c/d2d/d74/d7d/d9f 0
2026-03-09T00:03:55.205 INFO:tasks.workunit.client.1.vm06.stdout:7/666: dwrite d0/df/d1a/d3a/f23 [0,4194304] 0
2026-03-09T00:03:55.209 INFO:tasks.workunit.client.1.vm06.stdout:2/702: rename d7/d1a/d56/la2 to d7/d1a/d3c/ld5 0
2026-03-09T00:03:55.209 INFO:tasks.workunit.client.1.vm06.stdout:2/703: chown d7/da/d1c/l9c 40971 1
2026-03-09T00:03:55.210 INFO:tasks.workunit.client.1.vm06.stdout:6/655: rmdir d4/d27/d42/d4b 39
2026-03-09T00:03:55.210 INFO:tasks.workunit.client.1.vm06.stdout:6/656: write d4/d16/d53/f5f [908626,105290] 0
2026-03-09T00:03:55.210 INFO:tasks.workunit.client.1.vm06.stdout:6/657: truncate d4/d27/d42/da6/fcc 591659 0
2026-03-09T00:03:55.210 INFO:tasks.workunit.client.1.vm06.stdout:6/658: write d4/d27/d3e/d78/f91 [1721493,79910] 0
2026-03-09T00:03:55.210 INFO:tasks.workunit.client.1.vm06.stdout:6/659: chown d4/d27/d3e/d78/fc9 75065822 1
2026-03-09T00:03:55.210 INFO:tasks.workunit.client.1.vm06.stdout:7/667: dread d0/df/d1a/d27/d4c/d40/f41 [0,4194304] 0
2026-03-09T00:03:55.210 INFO:tasks.workunit.client.1.vm06.stdout:7/668: chown d0/f4f 20 1
2026-03-09T00:03:55.210 INFO:tasks.workunit.client.1.vm06.stdout:7/669: chown d0/df/d1a/d27/d70 3205 1
2026-03-09T00:03:55.211 INFO:tasks.workunit.client.1.vm06.stdout:3/677: mknod d11/d28/d57/cee 0
2026-03-09T00:03:55.221 INFO:tasks.workunit.client.1.vm06.stdout:1/569: creat d6/d4c/d71/fbf x:0 0 0
2026-03-09T00:03:55.221 INFO:tasks.workunit.client.1.vm06.stdout:1/570: truncate d6/f81 199985 0
2026-03-09T00:03:55.221 INFO:tasks.workunit.client.1.vm06.stdout:1/571: write d6/d21/f55 [302027,112841] 0
2026-03-09T00:03:55.222 INFO:tasks.workunit.client.1.vm06.stdout:8/611: creat db/d74/d78/fc9 x:0 0 0
2026-03-09T00:03:55.222 INFO:tasks.workunit.client.1.vm06.stdout:8/612: fdatasync db/dd/d48/f68 0
2026-03-09T00:03:55.230 INFO:tasks.workunit.client.1.vm06.stdout:4/568: dwrite d17/d21/d4c/d50/f69 [0,4194304] 0
2026-03-09T00:03:55.247 INFO:tasks.workunit.client.1.vm06.stdout:4/569: dread d17/d21/f4b [0,4194304] 0
2026-03-09T00:03:55.248 INFO:tasks.workunit.client.1.vm06.stdout:5/732: creat d5/d44/d4b/ff9 x:0 0 0
2026-03-09T00:03:55.249 INFO:tasks.workunit.client.1.vm06.stdout:0/646: symlink d3/d18/d1f/d39/ld8 0
2026-03-09T00:03:55.249 INFO:tasks.workunit.client.1.vm06.stdout:2/704: truncate d7/d1a/d56/f50 1618344 0
2026-03-09T00:03:55.249 INFO:tasks.workunit.client.1.vm06.stdout:0/647: chown d3/d18/d1f/f4a 367873529 1
2026-03-09T00:03:55.250 INFO:tasks.workunit.client.1.vm06.stdout:6/660: dwrite d4/f3d [4194304,4194304] 0
2026-03-09T00:03:55.250 INFO:tasks.workunit.client.1.vm06.stdout:6/661: chown d4/d16/d46/d90/l8a 0 1
2026-03-09T00:03:55.274 INFO:tasks.workunit.client.1.vm06.stdout:1/572: mknod d6/d21/d2d/d37/d6d/cc0 0
2026-03-09T00:03:55.276 INFO:tasks.workunit.client.1.vm06.stdout:8/613: creat db/d74/d87/fca x:0 0 0
2026-03-09T00:03:55.276 INFO:tasks.workunit.client.1.vm06.stdout:8/614: fsync db/f28 0
2026-03-09T00:03:55.300 INFO:tasks.workunit.client.1.vm06.stdout:5/733: dwrite d5/d1c/d21/d28/f63 [4194304,4194304] 0
2026-03-09T00:03:55.300 INFO:tasks.workunit.client.1.vm06.stdout:5/734: readlink d5/d1c/d21/d28/d5e/leb 0
2026-03-09T00:03:55.306 INFO:tasks.workunit.client.1.vm06.stdout:5/735: write d5/d44/d4b/fa9 [1560422,58284] 0
2026-03-09T00:03:55.306 INFO:tasks.workunit.client.1.vm06.stdout:5/736: write d5/d1c/d21/d28/d5e/d66/fc9 [647683,34977] 0
2026-03-09T00:03:55.307 INFO:tasks.workunit.client.1.vm06.stdout:5/737: dread d5/d44/d4b/d92/f52 [0,4194304] 0
2026-03-09T00:03:55.314 INFO:tasks.workunit.client.0.vm03.stdout:5/422: dwrite d1c/d20/d55/f46 [0,4194304] 0
2026-03-09T00:03:55.328 INFO:tasks.workunit.client.0.vm03.stdout:1/487: dwrite d4/d3a/d32/da3/f91 [0,4194304] 0
2026-03-09T00:03:55.329 INFO:tasks.workunit.client.1.vm06.stdout:2/705: symlink d7/d1b/d31/ld6 0
2026-03-09T00:03:55.330
INFO:tasks.workunit.client.1.vm06.stdout:2/706: chown d7/d1a/d39 11930014 1 2026-03-09T00:03:55.330 INFO:tasks.workunit.client.1.vm06.stdout:2/707: readlink d7/d1a/d25/d97/lac 0 2026-03-09T00:03:55.330 INFO:tasks.workunit.client.1.vm06.stdout:2/708: read - d7/d1b/d71/fcb zero size 2026-03-09T00:03:55.334 INFO:tasks.workunit.client.1.vm06.stdout:2/709: read d7/da/db/de/f11 [2300004,33616] 0 2026-03-09T00:03:55.342 INFO:tasks.workunit.client.1.vm06.stdout:0/648: rename d3/d18/d1f/d39/d3b/fc8 to d3/d18/d2c/d2d/d31/fd9 0 2026-03-09T00:03:55.348 INFO:tasks.workunit.client.1.vm06.stdout:7/670: dread d0/df/d1a/f44 [8388608,4194304] 0 2026-03-09T00:03:55.348 INFO:tasks.workunit.client.1.vm06.stdout:7/671: chown d0/df/d7b 1640852 1 2026-03-09T00:03:55.357 INFO:tasks.workunit.client.0.vm03.stdout:3/299: sync 2026-03-09T00:03:55.358 INFO:tasks.workunit.client.0.vm03.stdout:8/371: sync 2026-03-09T00:03:55.358 INFO:tasks.workunit.client.0.vm03.stdout:7/381: sync 2026-03-09T00:03:55.358 INFO:tasks.workunit.client.0.vm03.stdout:6/350: sync 2026-03-09T00:03:55.358 INFO:tasks.workunit.client.0.vm03.stdout:2/364: sync 2026-03-09T00:03:55.358 INFO:tasks.workunit.client.0.vm03.stdout:2/365: fdatasync f6 0 2026-03-09T00:03:55.358 INFO:tasks.workunit.client.0.vm03.stdout:2/366: write d8/d1b/f1f [4608037,104941] 0 2026-03-09T00:03:55.358 INFO:tasks.workunit.client.0.vm03.stdout:2/367: chown f6 388 1 2026-03-09T00:03:55.362 INFO:tasks.workunit.client.0.vm03.stdout:6/351: dread d13/d1e/d44/d59/f6c [0,4194304] 0 2026-03-09T00:03:55.364 INFO:tasks.workunit.client.0.vm03.stdout:6/352: write d13/d1e/f2d [1897625,21716] 0 2026-03-09T00:03:55.364 INFO:tasks.workunit.client.0.vm03.stdout:6/353: write d13/d35/f68 [1002854,101551] 0 2026-03-09T00:03:55.365 INFO:tasks.workunit.client.0.vm03.stdout:5/423: dwrite d1c/d20/d55/d3b/f57 [0,4194304] 0 2026-03-09T00:03:55.367 INFO:tasks.workunit.client.1.vm06.stdout:3/678: truncate d11/f48 406772 0 2026-03-09T00:03:55.368 INFO:tasks.workunit.client.0.vm03.stdout:9/423: dwrite d15/d1c/d21/d54/f65 [4194304,4194304] 0 2026-03-09T00:03:55.372 INFO:tasks.workunit.client.1.vm06.stdout:1/573: unlink d6/d21/d2d/f5d 0 2026-03-09T00:03:55.386 INFO:tasks.workunit.client.1.vm06.stdout:8/615: creat db/d53/d70/fcb x:0 0 0 2026-03-09T00:03:55.388 INFO:tasks.workunit.client.0.vm03.stdout:1/488: creat d4/d3a/d32/d87/fa5 x:0 0 0 2026-03-09T00:03:55.394 INFO:tasks.workunit.client.0.vm03.stdout:8/372: creat d7/df/d1e/d38/d60/f71 x:0 0 0 2026-03-09T00:03:55.394 INFO:tasks.workunit.client.0.vm03.stdout:8/373: chown d7/df/d1a/f33 1713 1 2026-03-09T00:03:55.397 INFO:tasks.workunit.client.0.vm03.stdout:7/382: mknod d2/d4/d1e/d5e/d6c/c71 0 2026-03-09T00:03:55.399 INFO:tasks.workunit.client.1.vm06.stdout:4/570: mknod d17/d24/d3b/d5e/cc3 0 2026-03-09T00:03:55.399 INFO:tasks.workunit.client.1.vm06.stdout:4/571: getdents d17/d5b/dac 0 2026-03-09T00:03:55.399 INFO:tasks.workunit.client.1.vm06.stdout:4/572: readlink d17/d24/d3b/l45 0 2026-03-09T00:03:55.400 INFO:tasks.workunit.client.0.vm03.stdout:2/368: readlink d8/l19 0 2026-03-09T00:03:55.405 INFO:tasks.workunit.client.0.vm03.stdout:2/369: write f6 [1148050,87707] 0 2026-03-09T00:03:55.419 INFO:tasks.workunit.client.0.vm03.stdout:6/354: stat ce 0 2026-03-09T00:03:55.420 INFO:tasks.workunit.client.0.vm03.stdout:5/424: rename d1c/d20/d55/c2b to d1c/d51/c8a 0 2026-03-09T00:03:55.421 INFO:tasks.workunit.client.1.vm06.stdout:2/710: read - d7/da/d4e/d57/fc5 zero size 2026-03-09T00:03:55.421 INFO:tasks.workunit.client.1.vm06.stdout:2/711: chown d7/d1b/d31/f7d 36 1 
2026-03-09T00:03:55.421 INFO:tasks.workunit.client.1.vm06.stdout:2/712: chown d7/d1b/da5/daa 2145548351 1 2026-03-09T00:03:55.432 INFO:tasks.workunit.client.0.vm03.stdout:6/355: write f10 [176645,92184] 0 2026-03-09T00:03:55.432 INFO:tasks.workunit.client.0.vm03.stdout:6/356: chown d13/d1e/d44/d4a/c73 2421282 1 2026-03-09T00:03:55.432 INFO:tasks.workunit.client.0.vm03.stdout:6/357: chown d13/l67 223916 1 2026-03-09T00:03:55.433 INFO:tasks.workunit.client.0.vm03.stdout:2/370: dwrite d8/f5d [0,4194304] 0 2026-03-09T00:03:55.433 INFO:tasks.workunit.client.0.vm03.stdout:2/371: write d8/d17/f1c [2352763,18789] 0 2026-03-09T00:03:55.453 INFO:tasks.workunit.client.0.vm03.stdout:9/424: getdents d15/d1c/d21/d75 0 2026-03-09T00:03:55.453 INFO:tasks.workunit.client.0.vm03.stdout:9/425: fdatasync d15/d1c/d21/f71 0 2026-03-09T00:03:55.453 INFO:tasks.workunit.client.0.vm03.stdout:9/426: chown d15/d1c/d21/d64/l32 0 1 2026-03-09T00:03:55.453 INFO:tasks.workunit.client.0.vm03.stdout:9/427: chown fb 9269240 1 2026-03-09T00:03:55.454 INFO:tasks.workunit.client.0.vm03.stdout:4/487: truncate d7/f7e 691111 0 2026-03-09T00:03:55.454 INFO:tasks.workunit.client.1.vm06.stdout:5/738: rename d5/d1c/d21/la7 to d5/d1c/d21/d28/d5e/d66/d78/lfa 0 2026-03-09T00:03:55.454 INFO:tasks.workunit.client.1.vm06.stdout:5/739: creat d5/d44/d4b/ffb x:0 0 0 2026-03-09T00:03:55.454 INFO:tasks.workunit.client.1.vm06.stdout:5/740: fsync d5/d44/d4b/d92/d49/fc2 0 2026-03-09T00:03:55.454 INFO:tasks.workunit.client.1.vm06.stdout:5/741: fsync d5/d1c/d21/d28/f59 0 2026-03-09T00:03:55.454 INFO:tasks.workunit.client.1.vm06.stdout:5/742: stat d5/d44/d4b/d92/d49 0 2026-03-09T00:03:55.454 INFO:tasks.workunit.client.1.vm06.stdout:5/743: creat d5/d44/d4b/d92/d49/da0/ffc x:0 0 0 2026-03-09T00:03:55.454 INFO:tasks.workunit.client.1.vm06.stdout:0/649: symlink d3/lda 0 2026-03-09T00:03:55.454 INFO:tasks.workunit.client.1.vm06.stdout:7/672: mkdir d0/df/d17/dba 0 2026-03-09T00:03:55.460 INFO:tasks.workunit.client.1.vm06.stdout:0/650: dread d3/d18/d2c/d2d/f85 [0,4194304] 0 2026-03-09T00:03:55.474 INFO:tasks.workunit.client.1.vm06.stdout:3/679: link d11/d28/d2e/d2f/d36/fb7 d11/d28/d4d/d89/d90/dd2/fef 0 2026-03-09T00:03:55.479 INFO:tasks.workunit.client.0.vm03.stdout:1/489: mkdir d4/d3a/d61/da6 0 2026-03-09T00:03:55.479 INFO:tasks.workunit.client.1.vm06.stdout:9/530: sync 2026-03-09T00:03:55.480 INFO:tasks.workunit.client.1.vm06.stdout:9/531: write d1/d4/d6e/d14/d25/d85/f5a [1019420,39177] 0 2026-03-09T00:03:55.481 INFO:tasks.workunit.client.1.vm06.stdout:1/574: creat d6/d21/fc1 x:0 0 0 2026-03-09T00:03:55.482 INFO:tasks.workunit.client.1.vm06.stdout:1/575: creat d6/d4c/d79/fc2 x:0 0 0 2026-03-09T00:03:55.482 INFO:tasks.workunit.client.1.vm06.stdout:1/576: fdatasync d6/f19 0 2026-03-09T00:03:55.485 INFO:tasks.workunit.client.0.vm03.stdout:3/300: mkdir d2/db/d56 0 2026-03-09T00:03:55.487 INFO:tasks.workunit.client.1.vm06.stdout:8/616: truncate db/d1e/f2e 3651733 0 2026-03-09T00:03:55.488 INFO:tasks.workunit.client.1.vm06.stdout:1/577: dread d6/d21/d2d/f3c [0,4194304] 0 2026-03-09T00:03:55.488 INFO:tasks.workunit.client.1.vm06.stdout:1/578: truncate d6/d4c/d51/fba 384436 0 2026-03-09T00:03:55.490 INFO:tasks.workunit.client.1.vm06.stdout:9/532: dread d1/d4/d6e/d14/d25/d85/f90 [0,4194304] 0 2026-03-09T00:03:55.494 INFO:tasks.workunit.client.0.vm03.stdout:8/374: creat d7/df/d1a/d2b/f72 x:0 0 0 2026-03-09T00:03:55.497 INFO:tasks.workunit.client.1.vm06.stdout:5/744: dwrite d5/d44/d4b/ff9 [0,4194304] 0 2026-03-09T00:03:55.497 
INFO:tasks.workunit.client.1.vm06.stdout:5/745: fsync d5/d1c/d21/d28/d5e/d66/d78/fc1 0 2026-03-09T00:03:55.497 INFO:tasks.workunit.client.1.vm06.stdout:5/746: dread - d5/d44/d4b/fe1 zero size 2026-03-09T00:03:55.500 INFO:tasks.workunit.client.1.vm06.stdout:4/573: mknod d17/d5b/d8f/cc4 0 2026-03-09T00:03:55.500 INFO:tasks.workunit.client.1.vm06.stdout:4/574: readlink d17/l1c 0 2026-03-09T00:03:55.500 INFO:tasks.workunit.client.0.vm03.stdout:7/383: rmdir d2/d1f/d42/d43 39 2026-03-09T00:03:55.500 INFO:tasks.workunit.client.0.vm03.stdout:7/384: dread - d2/d4/d1e/d5e/d6c/d37/f4c zero size 2026-03-09T00:03:55.500 INFO:tasks.workunit.client.0.vm03.stdout:7/385: chown d2/d1f/d40/d67 6 1 2026-03-09T00:03:55.504 INFO:tasks.workunit.client.1.vm06.stdout:4/575: dread d17/d24/d3b/d54/f80 [0,4194304] 0 2026-03-09T00:03:55.524 INFO:tasks.workunit.client.0.vm03.stdout:6/358: creat d13/d1e/d44/d4a/d52/f7a x:0 0 0 2026-03-09T00:03:55.526 INFO:tasks.workunit.client.1.vm06.stdout:7/673: mknod d0/df/d1a/d3a/d4e/d5e/cbb 0 2026-03-09T00:03:55.537 INFO:tasks.workunit.client.1.vm06.stdout:6/662: sync 2026-03-09T00:03:55.538 INFO:tasks.workunit.client.1.vm06.stdout:0/651: symlink d3/d18/d1f/d44/d6a/d73/ldb 0 2026-03-09T00:03:55.539 INFO:tasks.workunit.client.1.vm06.stdout:6/663: truncate d4/d27/f70 595997 0 2026-03-09T00:03:55.544 INFO:tasks.workunit.client.0.vm03.stdout:7/386: dwrite d2/d4/d1e/d5e/d6c/d37/f4c [0,4194304] 0 2026-03-09T00:03:55.548 INFO:tasks.workunit.client.1.vm06.stdout:4/576: dwrite d17/d24/f39 [0,4194304] 0 2026-03-09T00:03:55.550 INFO:tasks.workunit.client.1.vm06.stdout:3/680: creat d11/d28/ff0 x:0 0 0 2026-03-09T00:03:55.554 INFO:tasks.workunit.client.0.vm03.stdout:0/372: sync 2026-03-09T00:03:55.559 INFO:tasks.workunit.client.0.vm03.stdout:9/428: mknod d15/d1c/d28/d6e/c8b 0 2026-03-09T00:03:55.567 INFO:tasks.workunit.client.1.vm06.stdout:8/617: truncate db/d1e/f51 700132 0 2026-03-09T00:03:55.567 INFO:tasks.workunit.client.1.vm06.stdout:8/618: truncate db/d53/d5c/f6f 1270978 0 2026-03-09T00:03:55.573 INFO:tasks.workunit.client.0.vm03.stdout:4/488: read d7/d20/d29/d38/f6e [1036664,58602] 0 2026-03-09T00:03:55.588 INFO:tasks.workunit.client.0.vm03.stdout:9/429: dwrite d15/d1c/d21/f61 [0,4194304] 0 2026-03-09T00:03:55.593 INFO:tasks.workunit.client.1.vm06.stdout:9/533: rename d1/d3/d2b/d58/l61 to d1/d4/d6e/d14/d25/la8 0 2026-03-09T00:03:55.597 INFO:tasks.workunit.client.0.vm03.stdout:1/490: rmdir d4/d15/d77/d95 0 2026-03-09T00:03:55.608 INFO:tasks.workunit.client.0.vm03.stdout:3/301: creat d2/db/d40/d51/f57 x:0 0 0 2026-03-09T00:03:55.616 INFO:tasks.workunit.client.1.vm06.stdout:5/747: link d5/d1c/d21/d28/f56 d5/d44/d4b/d92/d49/da0/ffd 0 2026-03-09T00:03:55.623 INFO:tasks.workunit.client.1.vm06.stdout:5/748: write d5/d1c/d23/f82 [1450627,9397] 0 2026-03-09T00:03:55.623 INFO:tasks.workunit.client.1.vm06.stdout:5/749: chown d5/d1c/f62 184 1 2026-03-09T00:03:55.623 INFO:tasks.workunit.client.1.vm06.stdout:5/750: truncate d5/d1c/d21/d28/d5e/d66/d78/fc1 359401 0 2026-03-09T00:03:55.635 INFO:tasks.workunit.client.0.vm03.stdout:8/375: symlink d7/df/d6b/l73 0 2026-03-09T00:03:55.647 INFO:tasks.workunit.client.1.vm06.stdout:2/713: link d7/d1a/f30 d7/da/d4e/d57/d9d/fd7 0 2026-03-09T00:03:55.647 INFO:tasks.workunit.client.1.vm06.stdout:7/674: mknod d0/df/d1a/d27/d4c/d40/d51/d90/cbc 0 2026-03-09T00:03:55.647 INFO:tasks.workunit.client.0.vm03.stdout:5/425: link d1c/d20/d55/d43/l79 d1c/d20/d55/d43/l8b 0 2026-03-09T00:03:55.647 INFO:tasks.workunit.client.0.vm03.stdout:5/426: creat d1c/d20/d55/d66/d70/f8c x:0 0 
0 2026-03-09T00:03:55.647 INFO:tasks.workunit.client.0.vm03.stdout:5/427: creat d1c/d20/d55/d4f/d58/d5d/f8d x:0 0 0 2026-03-09T00:03:55.647 INFO:tasks.workunit.client.0.vm03.stdout:6/359: symlink d13/d1e/d44/d4a/d52/l7b 0 2026-03-09T00:03:55.659 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:55 vm03.local ceph-mon[52346]: pgmap v12: 65 pgs: 65 active+clean; 2.2 GiB data, 7.8 GiB used, 112 GiB / 120 GiB avail; 165 MiB/s rd, 199 MiB/s wr, 303 op/s 2026-03-09T00:03:55.666 INFO:tasks.workunit.client.0.vm03.stdout:2/372: sync 2026-03-09T00:03:55.666 INFO:tasks.workunit.client.0.vm03.stdout:2/373: write d8/d17/f1d [977143,81158] 0 2026-03-09T00:03:55.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:55 vm06.local ceph-mon[58395]: pgmap v12: 65 pgs: 65 active+clean; 2.2 GiB data, 7.8 GiB used, 112 GiB / 120 GiB avail; 165 MiB/s rd, 199 MiB/s wr, 303 op/s 2026-03-09T00:03:55.674 INFO:tasks.workunit.client.1.vm06.stdout:4/577: link l2 d17/d24/d3b/d97/lc5 0 2026-03-09T00:03:55.674 INFO:tasks.workunit.client.1.vm06.stdout:4/578: stat d17/d21/fa6 0 2026-03-09T00:03:55.674 INFO:tasks.workunit.client.1.vm06.stdout:4/579: readlink d17/d24/d3b/d97/lc5 0 2026-03-09T00:03:55.674 INFO:tasks.workunit.client.1.vm06.stdout:4/580: fsync d17/d24/d49/f2a 0 2026-03-09T00:03:55.674 INFO:tasks.workunit.client.0.vm03.stdout:7/387: truncate d2/d4/d1e/d5e/d6c/f44 2282574 0 2026-03-09T00:03:55.675 INFO:tasks.workunit.client.1.vm06.stdout:5/751: dwrite d5/d1c/d21/d28/f59 [0,4194304] 0 2026-03-09T00:03:55.675 INFO:tasks.workunit.client.1.vm06.stdout:5/752: stat d5/d1c/d23/d34/cb7 0 2026-03-09T00:03:55.675 INFO:tasks.workunit.client.1.vm06.stdout:3/681: mkdir d11/d28/d2e/d2f/d5b/ddb/df1 0 2026-03-09T00:03:55.676 INFO:tasks.workunit.client.0.vm03.stdout:0/373: rename d2/d5a/f63 to d2/da/dd/d49/d6c/d4b/f88 0 2026-03-09T00:03:55.684 INFO:tasks.workunit.client.0.vm03.stdout:9/430: unlink d15/d1c/d36/d4d/f6b 0 2026-03-09T00:03:55.692 INFO:tasks.workunit.client.1.vm06.stdout:2/714: dwrite d7/d1a/d96/fba [0,4194304] 0 2026-03-09T00:03:55.696 INFO:tasks.workunit.client.1.vm06.stdout:2/715: dread d7/da/db/de/f49 [0,4194304] 0 2026-03-09T00:03:55.696 INFO:tasks.workunit.client.1.vm06.stdout:2/716: creat d7/d1b/fd8 x:0 0 0 2026-03-09T00:03:55.696 INFO:tasks.workunit.client.1.vm06.stdout:2/717: dread - d7/d1b/fd8 zero size 2026-03-09T00:03:55.696 INFO:tasks.workunit.client.1.vm06.stdout:2/718: write d7/d1a/d25/d66/f6b [212523,84960] 0 2026-03-09T00:03:55.697 INFO:tasks.workunit.client.1.vm06.stdout:7/675: dwrite d0/df/d1a/d27/d4c/d40/d5b/faf [0,4194304] 0 2026-03-09T00:03:55.722 INFO:tasks.workunit.client.1.vm06.stdout:2/719: write d7/da/db/de/f60 [1233477,25087] 0 2026-03-09T00:03:55.723 INFO:tasks.workunit.client.1.vm06.stdout:8/619: creat db/d53/d7c/d8f/fcc x:0 0 0 2026-03-09T00:03:55.723 INFO:tasks.workunit.client.1.vm06.stdout:8/620: readlink db/l12 0 2026-03-09T00:03:55.725 INFO:tasks.workunit.client.0.vm03.stdout:1/491: creat d4/d3a/d61/da6/fa7 x:0 0 0 2026-03-09T00:03:55.731 INFO:tasks.workunit.client.1.vm06.stdout:8/621: dread db/d1e/d46/f4b [0,4194304] 0 2026-03-09T00:03:55.747 INFO:tasks.workunit.client.1.vm06.stdout:1/579: getdents d6/d4c/d71/d83 0 2026-03-09T00:03:55.747 INFO:tasks.workunit.client.1.vm06.stdout:1/580: chown d6/d21/l26 76 1 2026-03-09T00:03:55.750 INFO:tasks.workunit.client.1.vm06.stdout:7/676: dwrite d0/d39/f68 [0,4194304] 0 2026-03-09T00:03:55.750 INFO:tasks.workunit.client.1.vm06.stdout:7/677: fsync d0/df/d17/f1f 0 2026-03-09T00:03:55.751 INFO:tasks.workunit.client.0.vm03.stdout:3/302: 
truncate f1 1352232 0 2026-03-09T00:03:55.752 INFO:tasks.workunit.client.1.vm06.stdout:0/652: sync 2026-03-09T00:03:55.752 INFO:tasks.workunit.client.1.vm06.stdout:0/653: fsync d3/d18/d1f/d39/d49/d60/f92 0 2026-03-09T00:03:55.752 INFO:tasks.workunit.client.1.vm06.stdout:0/654: dread - d3/d18/d2c/d2d/d74/d7d/d9f/fcb zero size 2026-03-09T00:03:55.754 INFO:tasks.workunit.client.0.vm03.stdout:8/376: link d7/df/d1a/f2e d7/df/d6b/f74 0 2026-03-09T00:03:55.755 INFO:tasks.workunit.client.1.vm06.stdout:9/534: rename d1/f2a to d1/d4/d6e/fa9 0 2026-03-09T00:03:55.759 INFO:tasks.workunit.client.1.vm06.stdout:6/664: getdents d4/d27/d3e/d57 0 2026-03-09T00:03:55.759 INFO:tasks.workunit.client.1.vm06.stdout:6/665: fdatasync d4/d27/d3e/f44 0 2026-03-09T00:03:55.762 INFO:tasks.workunit.client.1.vm06.stdout:2/720: dwrite d7/d1a/d25/d66/d87/f9b [0,4194304] 0 2026-03-09T00:03:55.769 INFO:tasks.workunit.client.0.vm03.stdout:5/428: mkdir d1c/d20/d55/d4f/d58/d73/d76/d8e 0 2026-03-09T00:03:55.769 INFO:tasks.workunit.client.0.vm03.stdout:5/429: read - d1c/d20/d55/d66/f83 zero size 2026-03-09T00:03:55.769 INFO:tasks.workunit.client.0.vm03.stdout:5/430: chown d1c/d51/d6a/d75 126 1 2026-03-09T00:03:55.776 INFO:tasks.workunit.client.1.vm06.stdout:4/581: link d17/d21/d4c/c7c d17/d24/d49/d5f/db2/cc6 0 2026-03-09T00:03:55.776 INFO:tasks.workunit.client.1.vm06.stdout:4/582: chown d17/d24/d3b/d5e/d6e/db0/cb1 1486 1 2026-03-09T00:03:55.776 INFO:tasks.workunit.client.1.vm06.stdout:4/583: read d17/d21/d4c/d50/f9c [323204,107209] 0 2026-03-09T00:03:55.778 INFO:tasks.workunit.client.0.vm03.stdout:6/360: mknod d13/d1e/d44/d59/c7c 0 2026-03-09T00:03:55.780 INFO:tasks.workunit.client.1.vm06.stdout:5/753: symlink d5/d1c/d21/d28/lfe 0 2026-03-09T00:03:55.786 INFO:tasks.workunit.client.1.vm06.stdout:5/754: write d5/f36 [1507529,83122] 0 2026-03-09T00:03:55.788 INFO:tasks.workunit.client.1.vm06.stdout:3/682: rmdir d11/d28/d2e/d7e/d83/dd8 0 2026-03-09T00:03:55.792 INFO:tasks.workunit.client.0.vm03.stdout:7/388: creat d2/d1f/d40/f72 x:0 0 0 2026-03-09T00:03:55.799 INFO:tasks.workunit.client.0.vm03.stdout:0/374: creat d2/da/dd/d49/d6c/f89 x:0 0 0 2026-03-09T00:03:55.802 INFO:tasks.workunit.client.0.vm03.stdout:4/489: rename d7/d27/c40 to d7/d20/d29/d4e/c97 0 2026-03-09T00:03:55.807 INFO:tasks.workunit.client.1.vm06.stdout:8/622: creat db/d53/d70/d38/fcd x:0 0 0 2026-03-09T00:03:55.808 INFO:tasks.workunit.client.1.vm06.stdout:8/623: write db/dd/f97 [1720461,47186] 0 2026-03-09T00:03:55.808 INFO:tasks.workunit.client.1.vm06.stdout:8/624: truncate db/dd/d85/d9f/f88 5148114 0 2026-03-09T00:03:55.808 INFO:tasks.workunit.client.0.vm03.stdout:9/431: getdents d15/d1c/d28/d6e 0 2026-03-09T00:03:55.808 INFO:tasks.workunit.client.0.vm03.stdout:1/492: mknod d4/d3a/d32/da1/ca8 0 2026-03-09T00:03:55.809 INFO:tasks.workunit.client.0.vm03.stdout:3/303: mkdir d2/db/d40/d58 0 2026-03-09T00:03:55.811 INFO:tasks.workunit.client.0.vm03.stdout:1/493: dread f2 [0,4194304] 0 2026-03-09T00:03:55.811 INFO:tasks.workunit.client.0.vm03.stdout:8/377: creat d7/df/d6b/f75 x:0 0 0 2026-03-09T00:03:55.811 INFO:tasks.workunit.client.0.vm03.stdout:8/378: chown d7/df/d1a/c5b 1343973 1 2026-03-09T00:03:55.815 INFO:tasks.workunit.client.1.vm06.stdout:2/721: dwrite d7/d1b/d71/d79/db4/dc1/fd2 [0,4194304] 0 2026-03-09T00:03:55.818 INFO:tasks.workunit.client.1.vm06.stdout:7/678: creat d0/df/d1a/d27/d4c/d40/d51/d86/fbd x:0 0 0 2026-03-09T00:03:55.819 INFO:tasks.workunit.client.0.vm03.stdout:5/431: mkdir d1c/d20/d55/d66/d6b/d8f 0 2026-03-09T00:03:55.824 
INFO:tasks.workunit.client.0.vm03.stdout:5/432: write d1c/d20/d55/f46 [1129682,98358] 0 2026-03-09T00:03:55.824 INFO:tasks.workunit.client.0.vm03.stdout:5/433: stat d1c/d20/d55/d66/c78 0 2026-03-09T00:03:55.826 INFO:tasks.workunit.client.1.vm06.stdout:0/655: mknod d3/d18/d3c/cdc 0 2026-03-09T00:03:55.826 INFO:tasks.workunit.client.1.vm06.stdout:0/656: chown d3/d18/d2c/d2d/d74/fa7 1746102221 1 2026-03-09T00:03:55.827 INFO:tasks.workunit.client.0.vm03.stdout:6/361: symlink d13/d35/l7d 0 2026-03-09T00:03:55.827 INFO:tasks.workunit.client.0.vm03.stdout:6/362: fsync d13/d1e/f28 0 2026-03-09T00:03:55.828 INFO:tasks.workunit.client.1.vm06.stdout:9/535: mknod d1/caa 0 2026-03-09T00:03:55.828 INFO:tasks.workunit.client.0.vm03.stdout:2/374: mkdir d8/d74 0 2026-03-09T00:03:55.828 INFO:tasks.workunit.client.0.vm03.stdout:2/375: creat d8/d17/f75 x:0 0 0 2026-03-09T00:03:55.831 INFO:tasks.workunit.client.0.vm03.stdout:6/363: write d13/d35/f68 [678336,48745] 0 2026-03-09T00:03:55.831 INFO:tasks.workunit.client.0.vm03.stdout:6/364: stat d13/c39 0 2026-03-09T00:03:55.831 INFO:tasks.workunit.client.0.vm03.stdout:6/365: chown f2 801494 1 2026-03-09T00:03:55.848 INFO:tasks.workunit.client.0.vm03.stdout:7/389: creat d2/f73 x:0 0 0 2026-03-09T00:03:55.849 INFO:tasks.workunit.client.0.vm03.stdout:0/375: read d2/da/d36/f58 [53310,76546] 0 2026-03-09T00:03:55.851 INFO:tasks.workunit.client.0.vm03.stdout:4/490: link d7/d27/l36 d7/d27/l98 0 2026-03-09T00:03:55.851 INFO:tasks.workunit.client.0.vm03.stdout:4/491: readlink d7/d20/d6a/d77/d25/l4c 0 2026-03-09T00:03:55.854 INFO:tasks.workunit.client.0.vm03.stdout:9/432: mknod d15/d1c/c8c 0 2026-03-09T00:03:55.854 INFO:tasks.workunit.client.0.vm03.stdout:0/376: write d2/da/dd/f11 [911808,124259] 0 2026-03-09T00:03:55.859 INFO:tasks.workunit.client.1.vm06.stdout:9/536: dwrite d1/d4/d2f/fa0 [0,4194304] 0 2026-03-09T00:03:55.861 INFO:tasks.workunit.client.0.vm03.stdout:1/494: rename d4/d15/f44 to d4/d15/d77/fa9 0 2026-03-09T00:03:55.861 INFO:tasks.workunit.client.0.vm03.stdout:1/495: fdatasync d4/d3a/d61/d78/f94 0 2026-03-09T00:03:55.863 INFO:tasks.workunit.client.0.vm03.stdout:8/379: getdents d7/df/d1a/d40/d58 0 2026-03-09T00:03:55.867 INFO:tasks.workunit.client.1.vm06.stdout:4/584: truncate d17/d21/d4c/f87 444406 0 2026-03-09T00:03:55.867 INFO:tasks.workunit.client.1.vm06.stdout:4/585: stat d17/d21/d32/d92 0 2026-03-09T00:03:55.870 INFO:tasks.workunit.client.1.vm06.stdout:3/683: symlink d11/d28/d2e/d2f/d5b/d5f/lf2 0 2026-03-09T00:03:55.871 INFO:tasks.workunit.client.0.vm03.stdout:5/434: mknod d1c/d20/d55/d4f/d58/d73/c90 0 2026-03-09T00:03:55.874 INFO:tasks.workunit.client.1.vm06.stdout:3/684: write d11/d28/d2e/d2f/d5b/d5f/f60 [3661204,2363] 0 2026-03-09T00:03:55.874 INFO:tasks.workunit.client.1.vm06.stdout:3/685: chown d11/d28/d2e/d2f/d5b/db5/cb4 56975 1 2026-03-09T00:03:55.875 INFO:tasks.workunit.client.1.vm06.stdout:8/625: link db/d1e/l37 db/d53/d7c/d8f/lce 0 2026-03-09T00:03:55.877 INFO:tasks.workunit.client.0.vm03.stdout:6/366: mkdir d13/d35/d7e 0 2026-03-09T00:03:55.886 INFO:tasks.workunit.client.0.vm03.stdout:7/390: unlink d2/d1f/d3a/f29 0 2026-03-09T00:03:55.888 INFO:tasks.workunit.client.0.vm03.stdout:7/391: dread d2/d4/d1e/d5e/d6c/f44 [0,4194304] 0 2026-03-09T00:03:55.889 INFO:tasks.workunit.client.0.vm03.stdout:4/492: unlink d7/d20/d35/f8c 0 2026-03-09T00:03:55.891 INFO:tasks.workunit.client.0.vm03.stdout:9/433: mkdir d15/d1c/d8d 0 2026-03-09T00:03:55.896 INFO:tasks.workunit.client.0.vm03.stdout:0/377: stat d2/l30 0 2026-03-09T00:03:55.902 
INFO:tasks.workunit.client.1.vm06.stdout:7/679: mknod d0/df/d1a/d27/d4c/d40/d51/d90/dae/cbe 0 2026-03-09T00:03:55.902 INFO:tasks.workunit.client.1.vm06.stdout:0/657: mknod d3/d18/d1f/d39/d3b/cdd 0 2026-03-09T00:03:55.902 INFO:tasks.workunit.client.0.vm03.stdout:1/496: symlink d4/d3a/d32/da3/laa 0 2026-03-09T00:03:55.902 INFO:tasks.workunit.client.0.vm03.stdout:1/497: write d4/d3a/d32/d87/fa5 [124347,69122] 0 2026-03-09T00:03:55.904 INFO:tasks.workunit.client.1.vm06.stdout:8/626: dwrite db/d74/d78/d98/fbb [0,4194304] 0 2026-03-09T00:03:55.904 INFO:tasks.workunit.client.1.vm06.stdout:8/627: creat db/d1e/d9b/fcf x:0 0 0 2026-03-09T00:03:55.916 INFO:tasks.workunit.client.0.vm03.stdout:5/435: mkdir d1c/d20/d55/d4f/d58/d73/d76/d91 0 2026-03-09T00:03:55.923 INFO:tasks.workunit.client.1.vm06.stdout:1/581: rename d6/d21/d2d/f74 to d6/d4c/fc3 0 2026-03-09T00:03:55.926 INFO:tasks.workunit.client.1.vm06.stdout:1/582: truncate d6/d21/d2d/d3b/d42/f9f 42186 0 2026-03-09T00:03:55.926 INFO:tasks.workunit.client.1.vm06.stdout:1/583: read - d6/d4c/f8e zero size 2026-03-09T00:03:55.927 INFO:tasks.workunit.client.0.vm03.stdout:6/367: mknod d13/d35/c7f 0 2026-03-09T00:03:55.930 INFO:tasks.workunit.client.1.vm06.stdout:8/628: dread db/d53/d70/d38/f99 [0,4194304] 0 2026-03-09T00:03:55.931 INFO:tasks.workunit.client.1.vm06.stdout:8/629: stat db/f16 0 2026-03-09T00:03:55.940 INFO:tasks.workunit.client.1.vm06.stdout:0/658: dwrite d3/d18/d2c/d2d/f46 [0,4194304] 0 2026-03-09T00:03:55.940 INFO:tasks.workunit.client.1.vm06.stdout:0/659: dread - d3/d18/d3c/fb2 zero size 2026-03-09T00:03:55.950 INFO:tasks.workunit.client.0.vm03.stdout:7/392: creat d2/d1f/d42/d43/f74 x:0 0 0 2026-03-09T00:03:55.954 INFO:tasks.workunit.client.1.vm06.stdout:8/630: dread f3 [0,4194304] 0 2026-03-09T00:03:55.964 INFO:tasks.workunit.client.0.vm03.stdout:4/493: link d7/d20/f34 d7/d20/d29/d4e/f99 0 2026-03-09T00:03:55.964 INFO:tasks.workunit.client.0.vm03.stdout:4/494: fdatasync d7/d20/d29/d4e/f99 0 2026-03-09T00:03:55.968 INFO:tasks.workunit.client.1.vm06.stdout:4/586: mknod d17/d24/d3b/d75/cc7 0 2026-03-09T00:03:55.972 INFO:tasks.workunit.client.0.vm03.stdout:9/434: symlink d15/d1c/d28/d6e/l8e 0 2026-03-09T00:03:55.972 INFO:tasks.workunit.client.0.vm03.stdout:9/435: chown d15/d1c/d8d 222493089 1 2026-03-09T00:03:55.974 INFO:tasks.workunit.client.1.vm06.stdout:5/755: sync 2026-03-09T00:03:55.974 INFO:tasks.workunit.client.1.vm06.stdout:2/722: sync 2026-03-09T00:03:55.974 INFO:tasks.workunit.client.1.vm06.stdout:6/666: sync 2026-03-09T00:03:55.974 INFO:tasks.workunit.client.1.vm06.stdout:6/667: chown d4/d27/d42/f60 934577 1 2026-03-09T00:03:55.974 INFO:tasks.workunit.client.0.vm03.stdout:2/376: sync 2026-03-09T00:03:55.975 INFO:tasks.workunit.client.0.vm03.stdout:0/378: mkdir d2/da/d76/d8a 0 2026-03-09T00:03:55.975 INFO:tasks.workunit.client.0.vm03.stdout:0/379: readlink d2/da/dd/d49/d6c/d81/l85 0 2026-03-09T00:03:55.975 INFO:tasks.workunit.client.0.vm03.stdout:0/380: write d2/f59 [1705310,18243] 0 2026-03-09T00:03:55.979 INFO:tasks.workunit.client.1.vm06.stdout:6/668: read d4/d16/d53/f82 [4022105,58550] 0 2026-03-09T00:03:55.979 INFO:tasks.workunit.client.1.vm06.stdout:6/669: write d4/d16/d53/f82 [2905046,122170] 0 2026-03-09T00:03:55.979 INFO:tasks.workunit.client.1.vm06.stdout:6/670: dread - d4/d16/d46/f76 zero size 2026-03-09T00:03:55.982 INFO:tasks.workunit.client.1.vm06.stdout:6/671: read d4/d27/d42/da6/fc6 [2085038,15118] 0 2026-03-09T00:03:55.983 INFO:tasks.workunit.client.0.vm03.stdout:3/304: link d2/db/c35 d2/db/d3b/d3f/c59 0 
2026-03-09T00:03:55.991 INFO:tasks.workunit.client.0.vm03.stdout:1/498: symlink d4/d15/d1a/lab 0 2026-03-09T00:03:55.993 INFO:tasks.workunit.client.1.vm06.stdout:9/537: rename d1/d4/d2f/f84 to d1/d3/d50/fab 0 2026-03-09T00:03:55.993 INFO:tasks.workunit.client.1.vm06.stdout:9/538: chown d1/d4/d6e/d14/d25/d85/l30 1 1 2026-03-09T00:03:55.993 INFO:tasks.workunit.client.0.vm03.stdout:6/368: truncate d13/d1e/f3e 532978 0 2026-03-09T00:03:55.997 INFO:tasks.workunit.client.1.vm06.stdout:1/584: truncate d6/d21/d2d/d3b/d42/f9a 1564536 0 2026-03-09T00:03:55.998 INFO:tasks.workunit.client.0.vm03.stdout:7/393: symlink d2/d4/d1e/d5e/d6c/d37/d39/l75 0 2026-03-09T00:03:56.002 INFO:tasks.workunit.client.0.vm03.stdout:4/495: rename d7/f1d to d7/d20/d29/d38/d3a/d90/f9a 0 2026-03-09T00:03:56.004 INFO:tasks.workunit.client.1.vm06.stdout:0/660: unlink d3/d18/d1f/d39/d3b/l38 0 2026-03-09T00:03:56.011 INFO:tasks.workunit.client.1.vm06.stdout:4/587: symlink d17/d21/d4c/d50/lc8 0 2026-03-09T00:03:56.011 INFO:tasks.workunit.client.1.vm06.stdout:5/756: rmdir d5/d1c/d21/d28/d5e/d66 39 2026-03-09T00:03:56.011 INFO:tasks.workunit.client.1.vm06.stdout:3/686: getdents d11/d28/d2e/d2f/d5b 0 2026-03-09T00:03:56.012 INFO:tasks.workunit.client.0.vm03.stdout:9/436: link d15/d1c/d36/f5c d15/f8f 0 2026-03-09T00:03:56.012 INFO:tasks.workunit.client.0.vm03.stdout:9/437: chown d15/f2c 256242168 1 2026-03-09T00:03:56.012 INFO:tasks.workunit.client.0.vm03.stdout:3/305: fsync d2/db/f14 0 2026-03-09T00:03:56.012 INFO:tasks.workunit.client.0.vm03.stdout:1/499: truncate d4/d3a/d3d/d46/f4c 1709595 0 2026-03-09T00:03:56.012 INFO:tasks.workunit.client.0.vm03.stdout:6/369: symlink d13/d35/d72/l80 0 2026-03-09T00:03:56.012 INFO:tasks.workunit.client.0.vm03.stdout:7/394: symlink d2/d1f/d3a/d24/l76 0 2026-03-09T00:03:56.012 INFO:tasks.workunit.client.0.vm03.stdout:9/438: dread d15/d1c/d36/f4a [0,4194304] 0 2026-03-09T00:03:56.016 INFO:tasks.workunit.client.0.vm03.stdout:0/381: rename d2/d1f/c54 to d2/da/dd/d49/d6c/d4b/d55/d6f/c8b 0 2026-03-09T00:03:56.017 INFO:tasks.workunit.client.1.vm06.stdout:6/672: dwrite d4/d16/f33 [0,4194304] 0 2026-03-09T00:03:56.025 INFO:tasks.workunit.client.0.vm03.stdout:8/380: sync 2026-03-09T00:03:56.026 INFO:tasks.workunit.client.0.vm03.stdout:5/436: sync 2026-03-09T00:03:56.034 INFO:tasks.workunit.client.1.vm06.stdout:7/680: rename d0/df/d1a/d27/d4c/d40/d51/d90/cbc to d0/df/d1a/d27/d4c/d40/d51/d86/cbf 0 2026-03-09T00:03:56.034 INFO:tasks.workunit.client.0.vm03.stdout:4/496: rmdir d7/d6f 39 2026-03-09T00:03:56.034 INFO:tasks.workunit.client.0.vm03.stdout:4/497: fsync d7/d27/f31 0 2026-03-09T00:03:56.035 INFO:tasks.workunit.client.1.vm06.stdout:9/539: mknod d1/d3/d50/cac 0 2026-03-09T00:03:56.035 INFO:tasks.workunit.client.0.vm03.stdout:2/377: mknod d8/c76 0 2026-03-09T00:03:56.035 INFO:tasks.workunit.client.1.vm06.stdout:1/585: mkdir d6/dc4 0 2026-03-09T00:03:56.036 INFO:tasks.workunit.client.0.vm03.stdout:1/500: symlink d4/d15/d77/lac 0 2026-03-09T00:03:56.036 INFO:tasks.workunit.client.1.vm06.stdout:0/661: creat d3/d18/d2c/d2d/d74/d90/fde x:0 0 0 2026-03-09T00:03:56.037 INFO:tasks.workunit.client.1.vm06.stdout:0/662: write d3/d18/d2c/d2d/d31/f88 [786562,36070] 0 2026-03-09T00:03:56.037 INFO:tasks.workunit.client.0.vm03.stdout:6/370: creat d13/d35/d74/f81 x:0 0 0 2026-03-09T00:03:56.038 INFO:tasks.workunit.client.1.vm06.stdout:5/757: unlink d5/d1c/d21/d28/d5e/d66/l85 0 2026-03-09T00:03:56.045 INFO:tasks.workunit.client.0.vm03.stdout:1/501: dread d4/d3a/f2c [0,4194304] 0 2026-03-09T00:03:56.045 
INFO:tasks.workunit.client.0.vm03.stdout:7/395: creat d2/d1f/d42/d46/d54/f77 x:0 0 0 2026-03-09T00:03:56.045 INFO:tasks.workunit.client.0.vm03.stdout:9/439: truncate d15/d1c/d21/f4c 759671 0 2026-03-09T00:03:56.045 INFO:tasks.workunit.client.0.vm03.stdout:9/440: truncate d15/f7b 669233 0 2026-03-09T00:03:56.045 INFO:tasks.workunit.client.0.vm03.stdout:9/441: fdatasync d15/f8f 0 2026-03-09T00:03:56.045 INFO:tasks.workunit.client.0.vm03.stdout:9/442: chown d15/l2a 70870 1 2026-03-09T00:03:56.045 INFO:tasks.workunit.client.0.vm03.stdout:1/502: dread d4/d15/f4e [0,4194304] 0 2026-03-09T00:03:56.049 INFO:tasks.workunit.client.0.vm03.stdout:9/443: dread d15/d1c/d28/f39 [0,4194304] 0 2026-03-09T00:03:56.049 INFO:tasks.workunit.client.0.vm03.stdout:9/444: stat d15/d1c/d21/d64/f50 0 2026-03-09T00:03:56.052 INFO:tasks.workunit.client.1.vm06.stdout:6/673: dwrite d4/d27/d3e/d57/fa7 [0,4194304] 0 2026-03-09T00:03:56.055 INFO:tasks.workunit.client.0.vm03.stdout:9/445: dread d15/d1c/d21/d64/f3d [0,4194304] 0 2026-03-09T00:03:56.062 INFO:tasks.workunit.client.0.vm03.stdout:9/446: dread d15/d1c/d28/f55 [0,4194304] 0 2026-03-09T00:03:56.068 INFO:tasks.workunit.client.0.vm03.stdout:0/382: link d2/da/dd/c12 d2/da/dd/d49/c8c 0 2026-03-09T00:03:56.068 INFO:tasks.workunit.client.0.vm03.stdout:8/381: creat d7/df/d1a/d40/f76 x:0 0 0 2026-03-09T00:03:56.069 INFO:tasks.workunit.client.0.vm03.stdout:8/382: creat d7/df/d1a/d2b/f77 x:0 0 0 2026-03-09T00:03:56.071 INFO:tasks.workunit.client.1.vm06.stdout:8/631: rename db/d1e/f5f to db/fd0 0 2026-03-09T00:03:56.071 INFO:tasks.workunit.client.1.vm06.stdout:8/632: write db/d53/d7c/fb8 [1009831,109062] 0 2026-03-09T00:03:56.071 INFO:tasks.workunit.client.1.vm06.stdout:8/633: creat db/d53/d70/d38/d4d/d79/fd1 x:0 0 0 2026-03-09T00:03:56.071 INFO:tasks.workunit.client.1.vm06.stdout:8/634: dread - db/dd/d24/dac/fc6 zero size 2026-03-09T00:03:56.071 INFO:tasks.workunit.client.1.vm06.stdout:7/681: creat d0/df/d7b/fc0 x:0 0 0 2026-03-09T00:03:56.071 INFO:tasks.workunit.client.1.vm06.stdout:7/682: chown d0/d55/d99/db2 849672 1 2026-03-09T00:03:56.074 INFO:tasks.workunit.client.1.vm06.stdout:9/540: creat d1/da7/fad x:0 0 0 2026-03-09T00:03:56.076 INFO:tasks.workunit.client.0.vm03.stdout:7/396: dwrite d2/d4/d1e/d5e/d6c/f44 [0,4194304] 0 2026-03-09T00:03:56.080 INFO:tasks.workunit.client.0.vm03.stdout:9/447: mknod d15/d1c/c90 0 2026-03-09T00:03:56.089 INFO:tasks.workunit.client.1.vm06.stdout:4/588: sync 2026-03-09T00:03:56.102 INFO:tasks.workunit.client.0.vm03.stdout:3/306: rename d2/f2a to d2/db/d40/d51/f5a 0 2026-03-09T00:03:56.107 INFO:tasks.workunit.client.0.vm03.stdout:5/437: rename d1c/d51 to d1c/d51/d6a/d75/d92 22 2026-03-09T00:03:56.107 INFO:tasks.workunit.client.0.vm03.stdout:5/438: readlink d1c/d20/d55/l40 0 2026-03-09T00:03:56.107 INFO:tasks.workunit.client.0.vm03.stdout:5/439: fsync d1c/f1e 0 2026-03-09T00:03:56.107 INFO:tasks.workunit.client.0.vm03.stdout:8/383: unlink d7/df/d1e/d38/l68 0 2026-03-09T00:03:56.107 INFO:tasks.workunit.client.0.vm03.stdout:8/384: stat d7/f48 0 2026-03-09T00:03:56.107 INFO:tasks.workunit.client.1.vm06.stdout:0/663: symlink d3/d18/d2c/d2d/d74/d90/ldf 0 2026-03-09T00:03:56.107 INFO:tasks.workunit.client.1.vm06.stdout:0/664: write d3/d18/d2c/d2d/d8c/fb5 [154531,31720] 0 2026-03-09T00:03:56.107 INFO:tasks.workunit.client.1.vm06.stdout:0/665: truncate d3/f51 4734397 0 2026-03-09T00:03:56.107 INFO:tasks.workunit.client.1.vm06.stdout:5/758: creat d5/d1c/d21/d28/d5e/d66/d78/dc8/fff x:0 0 0 2026-03-09T00:03:56.108 
INFO:tasks.workunit.client.1.vm06.stdout:3/687: truncate d11/f27 471216 0 2026-03-09T00:03:56.109 INFO:tasks.workunit.client.0.vm03.stdout:4/498: link d7/d20/d29/f43 d7/d6f/f9b 0 2026-03-09T00:03:56.112 INFO:tasks.workunit.client.0.vm03.stdout:6/371: getdents d13/d1e/d44/d59 0 2026-03-09T00:03:56.112 INFO:tasks.workunit.client.0.vm03.stdout:6/372: readlink d13/d35/d4c/l60 0 2026-03-09T00:03:56.113 INFO:tasks.workunit.client.1.vm06.stdout:6/674: symlink d4/d16/d53/d67/ld1 0 2026-03-09T00:03:56.113 INFO:tasks.workunit.client.0.vm03.stdout:7/397: mkdir d2/d4/d1e/d78 0 2026-03-09T00:03:56.115 INFO:tasks.workunit.client.0.vm03.stdout:9/448: dwrite f10 [0,4194304] 0 2026-03-09T00:03:56.115 INFO:tasks.workunit.client.0.vm03.stdout:1/503: rename d4/d15/d5c/f62 to d4/d15/d86/fad 0 2026-03-09T00:03:56.116 INFO:tasks.workunit.client.1.vm06.stdout:8/635: read db/d1e/f2e [2509394,28465] 0 2026-03-09T00:03:56.116 INFO:tasks.workunit.client.1.vm06.stdout:8/636: chown db/dd/d24/c30 13 1 2026-03-09T00:03:56.116 INFO:tasks.workunit.client.0.vm03.stdout:8/385: rmdir d7/df 39 2026-03-09T00:03:56.117 INFO:tasks.workunit.client.1.vm06.stdout:7/683: creat d0/df/d1a/d27/d4c/d40/d51/d90/dae/fc1 x:0 0 0 2026-03-09T00:03:56.117 INFO:tasks.workunit.client.1.vm06.stdout:7/684: dread - d0/df/d7b/fc0 zero size 2026-03-09T00:03:56.117 INFO:tasks.workunit.client.0.vm03.stdout:4/499: mknod d7/d20/d29/d4e/c9c 0 2026-03-09T00:03:56.117 INFO:tasks.workunit.client.0.vm03.stdout:4/500: chown d7/d20/d29/d38/c93 431449 1 2026-03-09T00:03:56.118 INFO:tasks.workunit.client.0.vm03.stdout:2/378: mknod d8/d1b/d2a/d6b/c77 0 2026-03-09T00:03:56.119 INFO:tasks.workunit.client.1.vm06.stdout:9/541: mkdir d1/d3/d4f/d91/dae 0 2026-03-09T00:03:56.119 INFO:tasks.workunit.client.1.vm06.stdout:9/542: dread - d1/d4/d6e/d9/f82 zero size 2026-03-09T00:03:56.119 INFO:tasks.workunit.client.1.vm06.stdout:9/543: dread - d1/d4/d6e/d9/f8a zero size 2026-03-09T00:03:56.119 INFO:tasks.workunit.client.1.vm06.stdout:9/544: write d1/d3/d50/fab [1022692,101048] 0 2026-03-09T00:03:56.119 INFO:tasks.workunit.client.0.vm03.stdout:6/373: creat d13/d35/f82 x:0 0 0 2026-03-09T00:03:56.119 INFO:tasks.workunit.client.0.vm03.stdout:6/374: chown d13/d1e/d44 1488395 1 2026-03-09T00:03:56.122 INFO:tasks.workunit.client.0.vm03.stdout:7/398: symlink d2/d1f/d40/d67/d6b/l79 0 2026-03-09T00:03:56.123 INFO:tasks.workunit.client.0.vm03.stdout:7/399: write d2/d1f/d42/d43/f4a [1271775,96845] 0 2026-03-09T00:03:56.123 INFO:tasks.workunit.client.0.vm03.stdout:7/400: dread - d2/d1f/d40/d67/f64 zero size 2026-03-09T00:03:56.123 INFO:tasks.workunit.client.0.vm03.stdout:7/401: read - d2/d1f/d42/f59 zero size 2026-03-09T00:03:56.123 INFO:tasks.workunit.client.0.vm03.stdout:7/402: fdatasync d2/d1f/d42/d43/f49 0 2026-03-09T00:03:56.123 INFO:tasks.workunit.client.1.vm06.stdout:1/586: rmdir d6/d21/d2d 39 2026-03-09T00:03:56.124 INFO:tasks.workunit.client.0.vm03.stdout:5/440: getdents d1c/d20/d55/d66/d6b 0 2026-03-09T00:03:56.124 INFO:tasks.workunit.client.1.vm06.stdout:0/666: creat d3/d18/d1f/d39/fe0 x:0 0 0 2026-03-09T00:03:56.124 INFO:tasks.workunit.client.1.vm06.stdout:0/667: write d3/d18/d2c/d2d/d74/d90/fde [867546,7778] 0 2026-03-09T00:03:56.125 INFO:tasks.workunit.client.0.vm03.stdout:9/449: truncate d15/d1c/d36/f3a 282468 0 2026-03-09T00:03:56.126 INFO:tasks.workunit.client.0.vm03.stdout:0/383: rename d2/l30 to d2/da/d36/l8d 0 2026-03-09T00:03:56.126 INFO:tasks.workunit.client.0.vm03.stdout:0/384: chown d2/d1f/l2e 618 1 2026-03-09T00:03:56.133 
INFO:tasks.workunit.client.0.vm03.stdout:1/504: dread d4/d3a/d43/f49 [0,4194304] 0 2026-03-09T00:03:56.138 INFO:tasks.workunit.client.1.vm06.stdout:3/688: symlink d11/d28/d2e/d2f/d36/d8f/lf3 0 2026-03-09T00:03:56.138 INFO:tasks.workunit.client.1.vm06.stdout:3/689: dread - d11/d28/d4d/d89/d90/fa7 zero size 2026-03-09T00:03:56.138 INFO:tasks.workunit.client.1.vm06.stdout:3/690: chown d11/d28/d2e/d2f/l7a 22 1 2026-03-09T00:03:56.138 INFO:tasks.workunit.client.1.vm06.stdout:3/691: fsync d11/d28/d4d/d9b/fe2 0 2026-03-09T00:03:56.143 INFO:tasks.workunit.client.1.vm06.stdout:8/637: link db/d53/d70/d38/d4d/d79/f96 db/d74/d78/fd2 0 2026-03-09T00:03:56.144 INFO:tasks.workunit.client.0.vm03.stdout:8/386: read f6 [686604,65591] 0 2026-03-09T00:03:56.144 INFO:tasks.workunit.client.0.vm03.stdout:8/387: creat d7/df/d1a/d40/f78 x:0 0 0 2026-03-09T00:03:56.145 INFO:tasks.workunit.client.1.vm06.stdout:2/723: rename d7/da/d4e/d57/lcf to d7/d1b/d71/d79/db4/dc1/ld9 0 2026-03-09T00:03:56.145 INFO:tasks.workunit.client.1.vm06.stdout:2/724: readlink d7/da/db/l44 0 2026-03-09T00:03:56.150 INFO:tasks.workunit.client.1.vm06.stdout:6/675: dread d4/d16/f34 [4194304,4194304] 0 2026-03-09T00:03:56.161 INFO:tasks.workunit.client.0.vm03.stdout:6/375: mknod d13/d35/c83 0 2026-03-09T00:03:56.162 INFO:tasks.workunit.client.0.vm03.stdout:5/441: mknod d1c/d20/d55/d4f/d58/d5d/c93 0 2026-03-09T00:03:56.162 INFO:tasks.workunit.client.0.vm03.stdout:9/450: rename d15/d1c/d21/f41 to d15/d1c/d36/d4d/f91 0 2026-03-09T00:03:56.162 INFO:tasks.workunit.client.0.vm03.stdout:9/451: write d15/f23 [1052718,108623] 0 2026-03-09T00:03:56.162 INFO:tasks.workunit.client.0.vm03.stdout:1/505: mkdir d4/d15/dae 0 2026-03-09T00:03:56.162 INFO:tasks.workunit.client.1.vm06.stdout:7/685: symlink d0/df/d1a/d35/lc2 0 2026-03-09T00:03:56.162 INFO:tasks.workunit.client.1.vm06.stdout:7/686: truncate d0/df/d1a/d27/d4c/d40/f5a 943799 0 2026-03-09T00:03:56.162 INFO:tasks.workunit.client.1.vm06.stdout:0/668: truncate d3/d18/d28/d45/f52 238326 0 2026-03-09T00:03:56.162 INFO:tasks.workunit.client.1.vm06.stdout:3/692: symlink d11/d28/d2e/d2f/d5b/d5f/lf4 0 2026-03-09T00:03:56.162 INFO:tasks.workunit.client.1.vm06.stdout:3/693: write d11/d28/d2e/f65 [927380,16527] 0 2026-03-09T00:03:56.162 INFO:tasks.workunit.client.1.vm06.stdout:3/694: creat d11/d3f/ff5 x:0 0 0 2026-03-09T00:03:56.162 INFO:tasks.workunit.client.1.vm06.stdout:3/695: chown d11/f12 10096221 1 2026-03-09T00:03:56.162 INFO:tasks.workunit.client.1.vm06.stdout:5/759: rename d5/d1c/d21/d28/d5e/cad to d5/d1c/d23/d34/d47/c100 0 2026-03-09T00:03:56.162 INFO:tasks.workunit.client.1.vm06.stdout:2/725: mknod d7/d1a/d25/d66/d87/da8/db2/cda 0 2026-03-09T00:03:56.162 INFO:tasks.workunit.client.1.vm06.stdout:2/726: truncate d7/d1b/f3b 5205776 0 2026-03-09T00:03:56.162 INFO:tasks.workunit.client.1.vm06.stdout:2/727: chown d7/d1b/fce 111 1 2026-03-09T00:03:56.171 INFO:tasks.workunit.client.0.vm03.stdout:8/388: unlink d7/f64 0 2026-03-09T00:03:56.171 INFO:tasks.workunit.client.0.vm03.stdout:8/389: fsync f6 0 2026-03-09T00:03:56.171 INFO:tasks.workunit.client.0.vm03.stdout:8/390: creat d7/df/d1a/d40/f79 x:0 0 0 2026-03-09T00:03:56.171 INFO:tasks.workunit.client.1.vm06.stdout:5/760: write d5/d1c/d23/f42 [2139103,99276] 0 2026-03-09T00:03:56.173 INFO:tasks.workunit.client.1.vm06.stdout:6/676: creat d4/d27/d42/dc8/fd2 x:0 0 0 2026-03-09T00:03:56.173 INFO:tasks.workunit.client.1.vm06.stdout:6/677: chown d4/d27/d3e/d57/f65 0 1 2026-03-09T00:03:56.173 INFO:tasks.workunit.client.1.vm06.stdout:0/669: dread d3/f10 [0,4194304] 0 
2026-03-09T00:03:56.175 INFO:tasks.workunit.client.0.vm03.stdout:6/376: creat d13/d35/d69/f84 x:0 0 0 2026-03-09T00:03:56.180 INFO:tasks.workunit.client.1.vm06.stdout:8/638: dread db/d53/d70/d38/f3a [0,4194304] 0 2026-03-09T00:03:56.187 INFO:tasks.workunit.client.0.vm03.stdout:0/385: rename d2/da/dd/d49/d6c/d4b/d55/d6f/l74 to d2/da/d76/d8a/l8e 0 2026-03-09T00:03:56.187 INFO:tasks.workunit.client.1.vm06.stdout:7/687: rename d0/df/d1a/f50 to d0/df/d1a/d27/d4c/d40/d51/d86/fc3 0 2026-03-09T00:03:56.193 INFO:tasks.workunit.client.0.vm03.stdout:5/442: read d1c/f29 [1345197,77338] 0 2026-03-09T00:03:56.199 INFO:tasks.workunit.client.0.vm03.stdout:6/377: creat d13/d35/d72/f85 x:0 0 0 2026-03-09T00:03:56.200 INFO:tasks.workunit.client.1.vm06.stdout:5/761: stat d5/d44/d4b/d92/c5f 0 2026-03-09T00:03:56.205 INFO:tasks.workunit.client.1.vm06.stdout:0/670: link d3/d18/d2c/l32 d3/d18/d1f/d39/d3b/le1 0 2026-03-09T00:03:56.207 INFO:tasks.workunit.client.0.vm03.stdout:0/386: mkdir d2/da/d76/d8a/d8f 0 2026-03-09T00:03:56.208 INFO:tasks.workunit.client.0.vm03.stdout:0/387: readlink d2/da/dd/d49/d6c/l40 0 2026-03-09T00:03:56.210 INFO:tasks.workunit.client.1.vm06.stdout:8/639: creat db/dd/fd3 x:0 0 0 2026-03-09T00:03:56.210 INFO:tasks.workunit.client.1.vm06.stdout:8/640: stat db/dd/d85 0 2026-03-09T00:03:56.212 INFO:tasks.workunit.client.0.vm03.stdout:5/443: mknod d1c/d20/d55/d4f/d58/d73/d76/c94 0 2026-03-09T00:03:56.220 INFO:tasks.workunit.client.0.vm03.stdout:5/444: symlink d1c/d20/d56/l95 0 2026-03-09T00:03:56.235 INFO:tasks.workunit.client.0.vm03.stdout:5/445: write d1c/f1f [2176016,108179] 0 2026-03-09T00:03:56.235 INFO:tasks.workunit.client.1.vm06.stdout:5/762: getdents d5/d1c/d21/d28 0 2026-03-09T00:03:56.235 INFO:tasks.workunit.client.1.vm06.stdout:5/763: mkdir d5/d44/d101 0 2026-03-09T00:03:56.235 INFO:tasks.workunit.client.1.vm06.stdout:8/641: rmdir db/d53/d7c/d8f 39 2026-03-09T00:03:56.235 INFO:tasks.workunit.client.1.vm06.stdout:8/642: creat db/d53/d70/d38/d4d/db1/fd4 x:0 0 0 2026-03-09T00:03:56.237 INFO:tasks.workunit.client.1.vm06.stdout:4/589: dwrite d17/d21/d4c/d50/f69 [0,4194304] 0 2026-03-09T00:03:56.247 INFO:tasks.workunit.client.1.vm06.stdout:8/643: mkdir db/d53/d70/d38/d4d/d79/dd5 0 2026-03-09T00:03:56.251 INFO:tasks.workunit.client.1.vm06.stdout:4/590: mknod d17/d24/cc9 0 2026-03-09T00:03:56.251 INFO:tasks.workunit.client.1.vm06.stdout:4/591: stat d17/d21/f2f 0 2026-03-09T00:03:56.257 INFO:tasks.workunit.client.1.vm06.stdout:2/728: write d7/d1b/d71/d79/db4/dc1/fd2 [3762716,67583] 0 2026-03-09T00:03:56.259 INFO:tasks.workunit.client.1.vm06.stdout:2/729: chown d7/da/ca9 127 1 2026-03-09T00:03:56.262 INFO:tasks.workunit.client.0.vm03.stdout:9/452: fdatasync d15/f23 0 2026-03-09T00:03:56.262 INFO:tasks.workunit.client.0.vm03.stdout:9/453: write d15/d1c/d21/d64/f3d [3326694,26905] 0 2026-03-09T00:03:56.267 INFO:tasks.workunit.client.1.vm06.stdout:9/545: dwrite d1/d4/d6e/d14/d25/f4a [0,4194304] 0 2026-03-09T00:03:56.268 INFO:tasks.workunit.client.1.vm06.stdout:2/730: mkdir d7/ddb 0 2026-03-09T00:03:56.278 INFO:tasks.workunit.client.0.vm03.stdout:9/454: write d15/d1c/d36/d4d/f91 [325576,94268] 0 2026-03-09T00:03:56.280 INFO:tasks.workunit.client.0.vm03.stdout:9/455: symlink d15/d1c/d21/d67/l92 0 2026-03-09T00:03:56.281 INFO:tasks.workunit.client.0.vm03.stdout:9/456: truncate d15/d1c/d36/f5c 51668 0 2026-03-09T00:03:56.282 INFO:tasks.workunit.client.0.vm03.stdout:3/307: dwrite d2/db/d2d/f54 [0,4194304] 0 2026-03-09T00:03:56.290 INFO:tasks.workunit.client.1.vm06.stdout:0/671: dread d3/f7 
[0,4194304] 0 2026-03-09T00:03:56.291 INFO:tasks.workunit.client.0.vm03.stdout:3/308: link d2/db/d2d/c3c d2/db/d40/d51/c5b 0 2026-03-09T00:03:56.291 INFO:tasks.workunit.client.0.vm03.stdout:3/309: creat d2/db/d40/d51/f5c x:0 0 0 2026-03-09T00:03:56.292 INFO:tasks.workunit.client.1.vm06.stdout:0/672: getdents d3/d18 0 2026-03-09T00:03:56.305 INFO:tasks.workunit.client.1.vm06.stdout:0/673: unlink d3/d18/d79/ca5 0 2026-03-09T00:03:56.314 INFO:tasks.workunit.client.0.vm03.stdout:3/310: mkdir d2/db/d3b/d5d 0 2026-03-09T00:03:56.319 INFO:tasks.workunit.client.1.vm06.stdout:6/678: dwrite d4/d27/d42/d4b/fa9 [0,4194304] 0 2026-03-09T00:03:56.329 INFO:tasks.workunit.client.1.vm06.stdout:6/679: mkdir d4/d27/d42/d7e/dac/dd3 0 2026-03-09T00:03:56.329 INFO:tasks.workunit.client.1.vm06.stdout:6/680: chown d4/d27/d42/d7e/c72 48 1 2026-03-09T00:03:56.329 INFO:tasks.workunit.client.1.vm06.stdout:6/681: dread - d4/d27/d42/d4b/fba zero size 2026-03-09T00:03:56.332 INFO:tasks.workunit.client.1.vm06.stdout:6/682: read d4/d27/d3e/d57/f79 [1588291,113295] 0 2026-03-09T00:03:56.334 INFO:tasks.workunit.client.1.vm06.stdout:6/683: mknod d4/d27/d3e/cd4 0 2026-03-09T00:03:56.334 INFO:tasks.workunit.client.1.vm06.stdout:6/684: fsync d4/d27/d42/d4b/f50 0 2026-03-09T00:03:56.334 INFO:tasks.workunit.client.1.vm06.stdout:6/685: creat d4/d27/d42/d52/d7d/fd5 x:0 0 0 2026-03-09T00:03:56.334 INFO:tasks.workunit.client.1.vm06.stdout:6/686: write d4/d27/d3e/d78/fc9 [568033,125218] 0 2026-03-09T00:03:56.344 INFO:tasks.workunit.client.0.vm03.stdout:4/501: dwrite d7/d20/d29/f53 [0,4194304] 0 2026-03-09T00:03:56.344 INFO:tasks.workunit.client.0.vm03.stdout:4/502: stat d7/d27/f31 0 2026-03-09T00:03:56.350 INFO:tasks.workunit.client.1.vm06.stdout:7/688: rmdir d0/df/d1a/d27/d4c/d40/d51/d90/dae 39 2026-03-09T00:03:56.372 INFO:tasks.workunit.client.1.vm06.stdout:7/689: truncate d0/df/d1a/d27/d4c/d40/d5b/f78 1144765 0 2026-03-09T00:03:56.372 INFO:tasks.workunit.client.1.vm06.stdout:9/546: getdents d1/d3/d4f/d91 0 2026-03-09T00:03:56.372 INFO:tasks.workunit.client.1.vm06.stdout:9/547: write d1/d4/d6e/d9/f82 [657915,33154] 0 2026-03-09T00:03:56.372 INFO:tasks.workunit.client.1.vm06.stdout:9/548: creat d1/d3/faf x:0 0 0 2026-03-09T00:03:56.372 INFO:tasks.workunit.client.1.vm06.stdout:9/549: link d1/l9f d1/d4/d6e/d14/lb0 0 2026-03-09T00:03:56.372 INFO:tasks.workunit.client.1.vm06.stdout:9/550: fdatasync d1/d4/d6e/d14/d25/f70 0 2026-03-09T00:03:56.372 INFO:tasks.workunit.client.0.vm03.stdout:4/503: rename d7/d20/d6a/d77/f4a to d7/d20/d29/d4e/f9d 0 2026-03-09T00:03:56.372 INFO:tasks.workunit.client.0.vm03.stdout:4/504: creat d7/d20/d29/d4e/f9e x:0 0 0 2026-03-09T00:03:56.372 INFO:tasks.workunit.client.0.vm03.stdout:4/505: link d7/d20/d29/d38/d3a/f4b d7/d20/d29/d38/d3a/d90/f9f 0 2026-03-09T00:03:56.372 INFO:tasks.workunit.client.0.vm03.stdout:4/506: truncate d7/d20/d29/d54/d58/f6b 3337858 0 2026-03-09T00:03:56.373 INFO:tasks.workunit.client.0.vm03.stdout:4/507: creat d7/d20/d29/fa0 x:0 0 0 2026-03-09T00:03:56.376 INFO:tasks.workunit.client.0.vm03.stdout:4/508: rename d7/d20/d29/d4e/f73 to d7/d20/d6a/d77/d25/fa1 0 2026-03-09T00:03:56.416 INFO:tasks.workunit.client.0.vm03.stdout:2/379: dwrite d8/d17/f27 [0,4194304] 0 2026-03-09T00:03:56.416 INFO:tasks.workunit.client.0.vm03.stdout:6/378: dwrite d13/d1e/f48 [0,4194304] 0 2026-03-09T00:03:56.416 INFO:tasks.workunit.client.1.vm06.stdout:7/690: dread d0/df/d1a/d27/d4c/d40/f5a [0,4194304] 0 2026-03-09T00:03:56.416 INFO:tasks.workunit.client.1.vm06.stdout:7/691: truncate d0/d39/f3e 228265 0 
2026-03-09T00:03:56.422 INFO:tasks.workunit.client.0.vm03.stdout:2/380: dread d8/f9 [0,4194304] 0 2026-03-09T00:03:56.424 INFO:tasks.workunit.client.0.vm03.stdout:6/379: creat d13/d35/d7e/f86 x:0 0 0 2026-03-09T00:03:56.424 INFO:tasks.workunit.client.0.vm03.stdout:2/381: creat d8/d1b/d2a/d6b/f78 x:0 0 0 2026-03-09T00:03:56.424 INFO:tasks.workunit.client.0.vm03.stdout:2/382: write d8/d1b/d24/f2f [5043001,97767] 0 2026-03-09T00:03:56.424 INFO:tasks.workunit.client.0.vm03.stdout:2/383: getdents d8/d26/d5e/d6f 0 2026-03-09T00:03:56.425 INFO:tasks.workunit.client.0.vm03.stdout:6/380: creat d13/d35/d71/f87 x:0 0 0 2026-03-09T00:03:56.425 INFO:tasks.workunit.client.0.vm03.stdout:6/381: chown d13/d35/c3d 623472 1 2026-03-09T00:03:56.426 INFO:tasks.workunit.client.0.vm03.stdout:2/384: rename d8/d17/c37 to d8/d26/c79 0 2026-03-09T00:03:56.429 INFO:tasks.workunit.client.0.vm03.stdout:3/311: write d2/db/f28 [1944064,29899] 0 2026-03-09T00:03:56.431 INFO:tasks.workunit.client.0.vm03.stdout:1/506: dwrite d4/d15/d77/fa9 [0,4194304] 0 2026-03-09T00:03:56.431 INFO:tasks.workunit.client.0.vm03.stdout:1/507: write d4/d15/d86/f9b [963861,96297] 0 2026-03-09T00:03:56.435 INFO:tasks.workunit.client.0.vm03.stdout:1/508: write d4/d5e/f88 [2850993,81853] 0 2026-03-09T00:03:56.435 INFO:tasks.workunit.client.0.vm03.stdout:0/388: dwrite d2/f32 [0,4194304] 0 2026-03-09T00:03:56.435 INFO:tasks.workunit.client.0.vm03.stdout:0/389: write d2/da/dd/d49/f6a [438819,25026] 0 2026-03-09T00:03:56.435 INFO:tasks.workunit.client.0.vm03.stdout:0/390: readlink d2/d1f/l2e 0 2026-03-09T00:03:56.451 INFO:tasks.workunit.client.0.vm03.stdout:1/509: mkdir d4/d3a/d43/daf 0 2026-03-09T00:03:56.467 INFO:tasks.workunit.client.0.vm03.stdout:1/510: truncate d4/d3a/d61/d78/f94 734397 0 2026-03-09T00:03:56.468 INFO:tasks.workunit.client.1.vm06.stdout:5/764: dwrite d5/d1c/d21/d28/d5e/d66/d78/da6/fd4 [0,4194304] 0 2026-03-09T00:03:56.468 INFO:tasks.workunit.client.1.vm06.stdout:5/765: mkdir d5/d1c/d21/d28/d102 0 2026-03-09T00:03:56.468 INFO:tasks.workunit.client.1.vm06.stdout:5/766: symlink d5/d1c/d21/d28/d102/l103 0 2026-03-09T00:03:56.468 INFO:tasks.workunit.client.1.vm06.stdout:5/767: fdatasync d5/d44/d4b/fe1 0 2026-03-09T00:03:56.468 INFO:tasks.workunit.client.0.vm03.stdout:3/312: unlink d2/db/f21 0 2026-03-09T00:03:56.468 INFO:tasks.workunit.client.0.vm03.stdout:0/391: getdents d2/d1f 0 2026-03-09T00:03:56.468 INFO:tasks.workunit.client.0.vm03.stdout:0/392: write d2/da/f7e [169591,112178] 0 2026-03-09T00:03:56.468 INFO:tasks.workunit.client.0.vm03.stdout:0/393: read - d2/da/dd/d49/d6c/f57 zero size 2026-03-09T00:03:56.468 INFO:tasks.workunit.client.0.vm03.stdout:0/394: write d2/f1e [1344910,31366] 0 2026-03-09T00:03:56.468 INFO:tasks.workunit.client.0.vm03.stdout:0/395: symlink d2/d1f/l90 0 2026-03-09T00:03:56.468 INFO:tasks.workunit.client.0.vm03.stdout:0/396: creat d2/da/d1a/f91 x:0 0 0 2026-03-09T00:03:56.468 INFO:tasks.workunit.client.0.vm03.stdout:0/397: mknod d2/da/d4e/c92 0 2026-03-09T00:03:56.468 INFO:tasks.workunit.client.0.vm03.stdout:0/398: symlink d2/da/dd/d6e/l93 0 2026-03-09T00:03:56.468 INFO:tasks.workunit.client.0.vm03.stdout:0/399: symlink d2/da/dd/d49/d6c/d81/l94 0 2026-03-09T00:03:56.469 INFO:tasks.workunit.client.1.vm06.stdout:8/644: dwrite db/dd/fd3 [0,4194304] 0 2026-03-09T00:03:56.469 INFO:tasks.workunit.client.1.vm06.stdout:8/645: chown db/d53/fbd 5060 1 2026-03-09T00:03:56.469 INFO:tasks.workunit.client.1.vm06.stdout:3/696: dwrite d11/d28/d2e/db2/fda [0,4194304] 0 2026-03-09T00:03:56.472 
INFO:tasks.workunit.client.0.vm03.stdout:0/400: truncate d2/f22 7760716 0 2026-03-09T00:03:56.474 INFO:tasks.workunit.client.1.vm06.stdout:8/646: creat db/d1e/d46/d94/fd6 x:0 0 0 2026-03-09T00:03:56.474 INFO:tasks.workunit.client.1.vm06.stdout:8/647: creat db/d74/d78/d98/d9c/fd7 x:0 0 0 2026-03-09T00:03:56.475 INFO:tasks.workunit.client.1.vm06.stdout:3/697: rename d11/d28/d4d/l61 to d11/d28/d4d/d9b/lf6 0 2026-03-09T00:03:56.490 INFO:tasks.workunit.client.1.vm06.stdout:8/648: mknod db/dd/cd8 0 2026-03-09T00:03:56.490 INFO:tasks.workunit.client.1.vm06.stdout:8/649: write db/d74/d78/d98/d9c/fd7 [10024,25487] 0 2026-03-09T00:03:56.490 INFO:tasks.workunit.client.1.vm06.stdout:8/650: fsync db/d1e/f25 0 2026-03-09T00:03:56.490 INFO:tasks.workunit.client.0.vm03.stdout:0/401: dread d2/da/dd/d49/d6c/d4b/f4c [0,4194304] 0 2026-03-09T00:03:56.490 INFO:tasks.workunit.client.0.vm03.stdout:0/402: read - d2/da/dd/d49/d6c/f89 zero size 2026-03-09T00:03:56.490 INFO:tasks.workunit.client.0.vm03.stdout:0/403: symlink d2/da/d1a/l95 0 2026-03-09T00:03:56.563 INFO:tasks.workunit.client.0.vm03.stdout:4/509: dwrite d7/f62 [0,4194304] 0 2026-03-09T00:03:56.563 INFO:tasks.workunit.client.0.vm03.stdout:4/510: readlink d7/d20/d6a/d77/l7a 0 2026-03-09T00:03:56.563 INFO:tasks.workunit.client.0.vm03.stdout:4/511: write d7/d20/d29/d4e/f74 [371619,30902] 0 2026-03-09T00:03:56.563 INFO:tasks.workunit.client.0.vm03.stdout:4/512: fsync d7/d20/d29/d4e/f9d 0 2026-03-09T00:03:56.564 INFO:tasks.workunit.client.1.vm06.stdout:0/674: dwrite d3/d18/d2c/d2d/d31/f5d [0,4194304] 0 2026-03-09T00:03:56.564 INFO:tasks.workunit.client.1.vm06.stdout:0/675: chown d3/d18/d2c/d2d/cd0 132727 1 2026-03-09T00:03:56.565 INFO:tasks.workunit.client.1.vm06.stdout:2/731: dwrite d7/da/d1c/f5f [8388608,4194304] 0 2026-03-09T00:03:56.565 INFO:tasks.workunit.client.1.vm06.stdout:2/732: fsync d7/d1b/fd8 0 2026-03-09T00:03:56.569 INFO:tasks.workunit.client.1.vm06.stdout:2/733: getdents d7/d1a/d25/d97 0 2026-03-09T00:03:56.575 INFO:tasks.workunit.client.1.vm06.stdout:0/676: dread d3/d18/d1f/f5e [0,4194304] 0 2026-03-09T00:03:56.575 INFO:tasks.workunit.client.1.vm06.stdout:0/677: stat d3/d18/d2c/d2d/d74 0 2026-03-09T00:03:56.575 INFO:tasks.workunit.client.1.vm06.stdout:0/678: dread - d3/d18/d28/d45/fd7 zero size 2026-03-09T00:03:56.579 INFO:tasks.workunit.client.1.vm06.stdout:6/687: dwrite d4/f3d [0,4194304] 0 2026-03-09T00:03:56.586 INFO:tasks.workunit.client.1.vm06.stdout:0/679: creat d3/d18/d1f/fe2 x:0 0 0 2026-03-09T00:03:56.593 INFO:tasks.workunit.client.1.vm06.stdout:0/680: write d3/d18/d2c/d2d/d31/f89 [1947387,34542] 0 2026-03-09T00:03:56.598 INFO:tasks.workunit.client.0.vm03.stdout:1/511: dwrite d4/d3a/d61/f65 [4194304,4194304] 0 2026-03-09T00:03:56.598 INFO:tasks.workunit.client.0.vm03.stdout:1/512: fsync d4/d3a/d61/da6/fa7 0 2026-03-09T00:03:56.598 INFO:tasks.workunit.client.1.vm06.stdout:4/592: dwrite d17/d21/d32/f96 [0,4194304] 0 2026-03-09T00:03:56.600 INFO:tasks.workunit.client.1.vm06.stdout:0/681: unlink d3/d18/d2c/d2d/d31/f63 0 2026-03-09T00:03:56.601 INFO:tasks.workunit.client.0.vm03.stdout:6/382: rmdir d13/d35/d71 39 2026-03-09T00:03:56.601 INFO:tasks.workunit.client.0.vm03.stdout:6/383: chown d13/d1e/c64 17199 1 2026-03-09T00:03:56.613 INFO:tasks.workunit.client.1.vm06.stdout:3/698: dwrite d11/d28/d2e/d2f/f92 [0,4194304] 0 2026-03-09T00:03:56.617 INFO:tasks.workunit.client.1.vm06.stdout:4/593: mknod d17/d21/d4c/d66/d68/cca 0 2026-03-09T00:03:56.617 INFO:tasks.workunit.client.1.vm06.stdout:4/594: write f1 [9467696,32861] 0 
2026-03-09T00:03:56.617 INFO:tasks.workunit.client.1.vm06.stdout:4/595: rename d17/d24/d3b to d17/d24/d3b/d75/dcb 22 2026-03-09T00:03:56.617 INFO:tasks.workunit.client.1.vm06.stdout:0/682: mkdir d3/d18/d2c/d2d/d74/daf/de3 0 2026-03-09T00:03:56.619 INFO:tasks.workunit.client.1.vm06.stdout:4/596: dread f14 [0,4194304] 0 2026-03-09T00:03:56.624 INFO:tasks.workunit.client.1.vm06.stdout:4/597: stat d17/f19 0 2026-03-09T00:03:56.624 INFO:tasks.workunit.client.1.vm06.stdout:4/598: chown d17/d24/d49/l48 202 1 2026-03-09T00:03:56.624 INFO:tasks.workunit.client.1.vm06.stdout:3/699: write d11/d28/d4d/d89/d90/dd2/fef [3807463,38380] 0 2026-03-09T00:03:56.624 INFO:tasks.workunit.client.1.vm06.stdout:3/700: chown d11/d28/d4d/cd9 5611178 1 2026-03-09T00:03:56.624 INFO:tasks.workunit.client.0.vm03.stdout:1/513: mknod d4/d3a/d32/d6a/cb0 0 2026-03-09T00:03:56.624 INFO:tasks.workunit.client.0.vm03.stdout:1/514: chown d4/d6/c29 4 1 2026-03-09T00:03:56.636 INFO:tasks.workunit.client.1.vm06.stdout:0/683: mkdir d3/d18/d1f/d39/d49/d60/de4 0 2026-03-09T00:03:56.637 INFO:tasks.workunit.client.1.vm06.stdout:4/599: rmdir d17/d24/d3b/d5e/d6e 39 2026-03-09T00:03:56.638 INFO:tasks.workunit.client.1.vm06.stdout:0/684: mknod d3/d18/d2c/d2d/d74/ce5 0 2026-03-09T00:03:56.643 INFO:tasks.workunit.client.1.vm06.stdout:0/685: dread - d3/d18/d2c/d2d/d74/d7d/d9f/fcb zero size 2026-03-09T00:03:56.643 INFO:tasks.workunit.client.1.vm06.stdout:0/686: unlink d3/d18/d79/c94 0 2026-03-09T00:03:56.643 INFO:tasks.workunit.client.1.vm06.stdout:0/687: creat d3/d18/d2c/d2d/d31/fe6 x:0 0 0 2026-03-09T00:03:56.646 INFO:tasks.workunit.client.1.vm06.stdout:0/688: write d3/d18/d1f/d44/f58 [2553155,119440] 0 2026-03-09T00:03:56.652 INFO:tasks.workunit.client.1.vm06.stdout:0/689: mknod d3/d18/d1f/d44/ce7 0 2026-03-09T00:03:56.652 INFO:tasks.workunit.client.1.vm06.stdout:7/692: dwrite d0/df/d1a/d27/d4c/d40/d5b/f78 [0,4194304] 0 2026-03-09T00:03:56.659 INFO:tasks.workunit.client.1.vm06.stdout:7/693: creat d0/df/d1a/d27/d70/fc4 x:0 0 0 2026-03-09T00:03:56.661 INFO:tasks.workunit.client.1.vm06.stdout:7/694: symlink d0/df/d7b/lc5 0 2026-03-09T00:03:56.666 INFO:tasks.workunit.client.1.vm06.stdout:7/695: creat d0/df/d1a/d3f/d53/fc6 x:0 0 0 2026-03-09T00:03:56.666 INFO:tasks.workunit.client.1.vm06.stdout:7/696: truncate d0/df/f13 1879796 0 2026-03-09T00:03:56.666 INFO:tasks.workunit.client.1.vm06.stdout:0/690: dread d3/f51 [0,4194304] 0 2026-03-09T00:03:56.667 INFO:tasks.workunit.client.1.vm06.stdout:7/697: dread d0/df/d1a/d35/f77 [0,4194304] 0 2026-03-09T00:03:56.667 INFO:tasks.workunit.client.1.vm06.stdout:7/698: creat d0/df/d1a/d27/d70/fc7 x:0 0 0 2026-03-09T00:03:56.670 INFO:tasks.workunit.client.1.vm06.stdout:0/691: dread d3/d18/d2c/d2d/d31/f7b [0,4194304] 0 2026-03-09T00:03:56.670 INFO:tasks.workunit.client.1.vm06.stdout:0/692: truncate d3/d18/d28/f81 58313 0 2026-03-09T00:03:56.677 INFO:tasks.workunit.client.1.vm06.stdout:0/693: read d3/d18/d1f/d44/d6a/f96 [601446,57381] 0 2026-03-09T00:03:56.677 INFO:tasks.workunit.client.1.vm06.stdout:0/694: fsync d3/f1e 0 2026-03-09T00:03:56.679 INFO:tasks.workunit.client.1.vm06.stdout:7/699: unlink d0/df/d1a/d3a/ca1 0 2026-03-09T00:03:56.683 INFO:tasks.workunit.client.1.vm06.stdout:7/700: write d0/df/d17/f7e [1398963,99286] 0 2026-03-09T00:03:56.683 INFO:tasks.workunit.client.0.vm03.stdout:0/404: dwrite d2/d71/f7c [0,4194304] 0 2026-03-09T00:03:56.683 INFO:tasks.workunit.client.0.vm03.stdout:2/385: dwrite d8/d26/d5e/d5f/f69 [0,4194304] 0 2026-03-09T00:03:56.685 INFO:tasks.workunit.client.1.vm06.stdout:9/551: 
dread d1/d3/d2b/f6d [0,4194304] 0 2026-03-09T00:03:56.688 INFO:tasks.workunit.client.1.vm06.stdout:0/695: unlink d3/d18/d1f/d44/fbe 0 2026-03-09T00:03:56.692 INFO:tasks.workunit.client.1.vm06.stdout:0/696: dread d3/d18/d1f/d39/d3b/fcf [0,4194304] 0 2026-03-09T00:03:56.709 INFO:tasks.workunit.client.1.vm06.stdout:7/701: mkdir d0/df/d1a/d3a/d4e/d5e/dc8 0 2026-03-09T00:03:56.709 INFO:tasks.workunit.client.0.vm03.stdout:6/384: dwrite d13/f70 [0,4194304] 0 2026-03-09T00:03:56.709 INFO:tasks.workunit.client.0.vm03.stdout:6/385: readlink d13/l67 0 2026-03-09T00:03:56.709 INFO:tasks.workunit.client.0.vm03.stdout:6/386: chown d13/f6f 0 1 2026-03-09T00:03:56.709 INFO:tasks.workunit.client.0.vm03.stdout:2/386: creat d8/d1b/d2a/f7a x:0 0 0 2026-03-09T00:03:56.713 INFO:tasks.workunit.client.1.vm06.stdout:9/552: dread d1/d4/fe [0,4194304] 0 2026-03-09T00:03:56.713 INFO:tasks.workunit.client.1.vm06.stdout:0/697: creat d3/d18/d1f/d39/d49/d60/fe8 x:0 0 0 2026-03-09T00:03:56.714 INFO:tasks.workunit.client.1.vm06.stdout:0/698: dread d3/d18/d28/f81 [0,4194304] 0 2026-03-09T00:03:56.718 INFO:tasks.workunit.client.1.vm06.stdout:9/553: creat d1/d73/fb1 x:0 0 0 2026-03-09T00:03:56.721 INFO:tasks.workunit.client.0.vm03.stdout:2/387: creat d8/d1b/d6c/f7b x:0 0 0 2026-03-09T00:03:56.721 INFO:tasks.workunit.client.0.vm03.stdout:2/388: write d8/d17/f75 [283066,95175] 0 2026-03-09T00:03:56.721 INFO:tasks.workunit.client.1.vm06.stdout:6/688: dwrite d4/d27/d42/fb8 [0,4194304] 0 2026-03-09T00:03:56.724 INFO:tasks.workunit.client.1.vm06.stdout:2/734: dwrite d7/da/db/de/f53 [0,4194304] 0 2026-03-09T00:03:56.724 INFO:tasks.workunit.client.1.vm06.stdout:0/699: dread d3/d18/d1f/d39/d69/f91 [0,4194304] 0 2026-03-09T00:03:56.726 INFO:tasks.workunit.client.1.vm06.stdout:6/689: write d4/d27/d42/d4b/fa9 [4154260,21071] 0 2026-03-09T00:03:56.738 INFO:tasks.workunit.client.1.vm06.stdout:9/554: creat d1/d4/d6e/d14/fb2 x:0 0 0 2026-03-09T00:03:56.738 INFO:tasks.workunit.client.1.vm06.stdout:9/555: stat d1/d3/d4f/d91/d94/d9e 0 2026-03-09T00:03:56.741 INFO:tasks.workunit.client.1.vm06.stdout:4/600: dwrite d17/f61 [0,4194304] 0 2026-03-09T00:03:56.741 INFO:tasks.workunit.client.1.vm06.stdout:4/601: stat d17/l3c 0 2026-03-09T00:03:56.743 INFO:tasks.workunit.client.1.vm06.stdout:8/651: dread db/dd/f7a [0,4194304] 0 2026-03-09T00:03:56.743 INFO:tasks.workunit.client.1.vm06.stdout:8/652: dread - db/d1e/d46/d94/fd6 zero size 2026-03-09T00:03:56.743 INFO:tasks.workunit.client.1.vm06.stdout:8/653: stat db/d1e/d9b/fcf 0 2026-03-09T00:03:56.746 INFO:tasks.workunit.client.0.vm03.stdout:7/403: sync 2026-03-09T00:03:56.746 INFO:tasks.workunit.client.0.vm03.stdout:7/404: fdatasync d2/f3 0 2026-03-09T00:03:56.749 INFO:tasks.workunit.client.0.vm03.stdout:7/405: mknod d2/d1f/d3a/d24/c7a 0 2026-03-09T00:03:56.749 INFO:tasks.workunit.client.0.vm03.stdout:7/406: chown d2/d4/d1e/f63 3 1 2026-03-09T00:03:56.749 INFO:tasks.workunit.client.1.vm06.stdout:2/735: chown d7/da/d4e/d57/f9f 0 1 2026-03-09T00:03:56.749 INFO:tasks.workunit.client.1.vm06.stdout:2/736: creat d7/d1b/d71/d79/db4/dc1/d86/fdc x:0 0 0 2026-03-09T00:03:56.751 INFO:tasks.workunit.client.1.vm06.stdout:4/602: read d17/f20 [163651,82734] 0 2026-03-09T00:03:56.756 INFO:tasks.workunit.client.1.vm06.stdout:4/603: write d17/f20 [4975116,109623] 0 2026-03-09T00:03:56.756 INFO:tasks.workunit.client.0.vm03.stdout:7/407: dread d2/d1f/d42/d43/f4a [0,4194304] 0 2026-03-09T00:03:56.757 INFO:tasks.workunit.client.1.vm06.stdout:1/587: sync 2026-03-09T00:03:56.757 INFO:tasks.workunit.client.0.vm03.stdout:7/408: 
write d2/d4/d1e/d5e/d6c/d37/f56 [2420157,101817] 0 2026-03-09T00:03:56.757 INFO:tasks.workunit.client.0.vm03.stdout:7/409: write d2/d1f/d42/d46/d54/f77 [94025,79660] 0 2026-03-09T00:03:56.757 INFO:tasks.workunit.client.0.vm03.stdout:7/410: fsync d2/d1f/d42/d43/f49 0 2026-03-09T00:03:56.760 INFO:tasks.workunit.client.0.vm03.stdout:6/387: dwrite d13/d1e/d44/d59/f6e [0,4194304] 0 2026-03-09T00:03:56.771 INFO:tasks.workunit.client.1.vm06.stdout:6/690: symlink d4/d16/d46/ld6 0 2026-03-09T00:03:56.771 INFO:tasks.workunit.client.1.vm06.stdout:6/691: readlink d4/d16/d46/d90/lc7 0 2026-03-09T00:03:56.771 INFO:tasks.workunit.client.1.vm06.stdout:6/692: stat d4/d27/f84 0 2026-03-09T00:03:56.777 INFO:tasks.workunit.client.0.vm03.stdout:6/388: rename d13/d35/d4c/l60 to d13/d1e/d44/d4a/l88 0 2026-03-09T00:03:56.778 INFO:tasks.workunit.client.0.vm03.stdout:6/389: mkdir d13/d35/d74/d89 0 2026-03-09T00:03:56.780 INFO:tasks.workunit.client.0.vm03.stdout:6/390: symlink d13/d35/d7e/l8a 0 2026-03-09T00:03:56.781 INFO:tasks.workunit.client.1.vm06.stdout:1/588: creat d6/d21/d2d/fc5 x:0 0 0 2026-03-09T00:03:56.783 INFO:tasks.workunit.client.1.vm06.stdout:6/693: unlink d4/d27/d42/f6b 0 2026-03-09T00:03:56.790 INFO:tasks.workunit.client.1.vm06.stdout:6/694: chown d4/d27/d42/dc8/fd2 36811 1 2026-03-09T00:03:56.790 INFO:tasks.workunit.client.1.vm06.stdout:6/695: truncate d4/d27/f61 982493 0 2026-03-09T00:03:56.792 INFO:tasks.workunit.client.0.vm03.stdout:0/405: fdatasync d2/da/dd/f24 0 2026-03-09T00:03:56.804 INFO:tasks.workunit.client.0.vm03.stdout:4/513: read f4 [3268832,23298] 0 2026-03-09T00:03:56.804 INFO:tasks.workunit.client.1.vm06.stdout:0/700: rename d3/d18/d2c/d2d/d74/d7d to d3/d18/de9 0 2026-03-09T00:03:56.804 INFO:tasks.workunit.client.1.vm06.stdout:1/589: mknod d6/cc6 0 2026-03-09T00:03:56.811 INFO:tasks.workunit.client.0.vm03.stdout:0/406: link d2/da/dd/d49/d6c/l5f d2/da/dd/d49/d6c/d4b/d55/l96 0 2026-03-09T00:03:56.828 INFO:tasks.workunit.client.1.vm06.stdout:2/737: rename c5 to d7/d1a/d25/d66/cdd 0 2026-03-09T00:03:56.828 INFO:tasks.workunit.client.1.vm06.stdout:2/738: chown d7/d1b/d71/d79/db4/dc1/f5e 256925 1 2026-03-09T00:03:56.828 INFO:tasks.workunit.client.1.vm06.stdout:2/739: chown d7/d1a/d25/d66/d87/f9b 7004010 1 2026-03-09T00:03:56.828 INFO:tasks.workunit.client.1.vm06.stdout:2/740: dread - d7/d1a/d25/d66/d87/fc3 zero size 2026-03-09T00:03:56.829 INFO:tasks.workunit.client.1.vm06.stdout:3/701: dwrite d11/d28/d2e/d2f/d5b/d94/fb3 [0,4194304] 0 2026-03-09T00:03:56.832 INFO:tasks.workunit.client.1.vm06.stdout:0/701: creat d3/d18/d2c/d2d/d74/daf/fea x:0 0 0 2026-03-09T00:03:56.834 INFO:tasks.workunit.client.0.vm03.stdout:4/514: mknod d7/d20/d29/d54/ca2 0 2026-03-09T00:03:56.834 INFO:tasks.workunit.client.0.vm03.stdout:0/407: unlink d2/c37 0 2026-03-09T00:03:56.839 INFO:tasks.workunit.client.0.vm03.stdout:1/515: dwrite d4/d3a/d61/d78/f8e [0,4194304] 0 2026-03-09T00:03:56.841 INFO:tasks.workunit.client.1.vm06.stdout:6/696: rename d4/d27/d3e/d57/c5a to d4/d27/d42/d52/d7d/cd7 0 2026-03-09T00:03:56.841 INFO:tasks.workunit.client.1.vm06.stdout:6/697: read d4/d27/fb6 [294188,26225] 0 2026-03-09T00:03:56.847 INFO:tasks.workunit.client.0.vm03.stdout:4/515: read d7/d20/d29/f2a [454590,73299] 0 2026-03-09T00:03:56.847 INFO:tasks.workunit.client.1.vm06.stdout:9/556: dwrite d1/d3/fa6 [0,4194304] 0 2026-03-09T00:03:56.852 INFO:tasks.workunit.client.0.vm03.stdout:0/408: symlink d2/da/d76/d8a/d8f/l97 0 2026-03-09T00:03:56.854 INFO:tasks.workunit.client.0.vm03.stdout:4/516: write d7/d20/d29/f53 [989645,106495] 0 
2026-03-09T00:03:56.854 INFO:tasks.workunit.client.0.vm03.stdout:4/517: fdatasync d7/fe 0 2026-03-09T00:03:56.854 INFO:tasks.workunit.client.0.vm03.stdout:4/518: dread d7/d20/d35/f68 [0,4194304] 0 2026-03-09T00:03:56.854 INFO:tasks.workunit.client.0.vm03.stdout:4/519: creat d7/d20/d29/d38/d3a/d90/fa3 x:0 0 0 2026-03-09T00:03:56.854 INFO:tasks.workunit.client.0.vm03.stdout:4/520: chown d7/f22 48045711 1 2026-03-09T00:03:56.854 INFO:tasks.workunit.client.1.vm06.stdout:2/741: creat d7/d1a/d96/dc8/fde x:0 0 0 2026-03-09T00:03:56.858 INFO:tasks.workunit.client.1.vm06.stdout:9/557: write d1/d4/d6e/d14/d25/f70 [3037961,39579] 0 2026-03-09T00:03:56.861 INFO:tasks.workunit.client.0.vm03.stdout:0/409: link d2/da/dd/d49/d6c/l64 d2/da/dd/l98 0 2026-03-09T00:03:56.874 INFO:tasks.workunit.client.0.vm03.stdout:4/521: creat d7/d20/d29/fa4 x:0 0 0 2026-03-09T00:03:56.876 INFO:tasks.workunit.client.0.vm03.stdout:4/522: write d7/f7e [952505,49476] 0 2026-03-09T00:03:56.876 INFO:tasks.workunit.client.1.vm06.stdout:0/702: symlink d3/d18/d1f/d44/leb 0 2026-03-09T00:03:56.876 INFO:tasks.workunit.client.1.vm06.stdout:0/703: truncate d3/d18/d1f/d39/d49/f64 4986839 0 2026-03-09T00:03:56.878 INFO:tasks.workunit.client.1.vm06.stdout:1/590: mknod d6/d4c/cc7 0 2026-03-09T00:03:56.878 INFO:tasks.workunit.client.1.vm06.stdout:1/591: fsync d6/d21/d2d/f6c 0 2026-03-09T00:03:56.878 INFO:tasks.workunit.client.1.vm06.stdout:1/592: truncate d6/d21/d2d/d37/f86 5097831 0 2026-03-09T00:03:56.878 INFO:tasks.workunit.client.1.vm06.stdout:1/593: creat d6/d21/fc8 x:0 0 0 2026-03-09T00:03:56.878 INFO:tasks.workunit.client.1.vm06.stdout:1/594: readlink d6/d21/l24 0 2026-03-09T00:03:56.880 INFO:tasks.workunit.client.1.vm06.stdout:6/698: symlink d4/db4/ld8 0 2026-03-09T00:03:56.880 INFO:tasks.workunit.client.1.vm06.stdout:6/699: write d4/d16/d53/d67/f8f [205993,86019] 0 2026-03-09T00:03:56.886 INFO:tasks.workunit.client.1.vm06.stdout:2/742: creat d7/d1b/d71/d79/fdf x:0 0 0 2026-03-09T00:03:56.900 INFO:tasks.workunit.client.1.vm06.stdout:9/558: rename d1/d3/d4f/d52/f6b to d1/d3/d4f/d52/fb3 0 2026-03-09T00:03:56.901 INFO:tasks.workunit.client.0.vm03.stdout:6/391: dwrite d13/f31 [0,4194304] 0 2026-03-09T00:03:56.901 INFO:tasks.workunit.client.0.vm03.stdout:6/392: mknod d13/d35/d74/c8b 0 2026-03-09T00:03:56.901 INFO:tasks.workunit.client.0.vm03.stdout:6/393: creat d13/d1e/f8c x:0 0 0 2026-03-09T00:03:56.901 INFO:tasks.workunit.client.0.vm03.stdout:6/394: mknod d13/d35/d72/c8d 0 2026-03-09T00:03:56.901 INFO:tasks.workunit.client.0.vm03.stdout:6/395: stat d13/c2b 0 2026-03-09T00:03:56.901 INFO:tasks.workunit.client.0.vm03.stdout:6/396: link d13/d1e/d44/d4a/l63 d13/l8e 0 2026-03-09T00:03:56.901 INFO:tasks.workunit.client.1.vm06.stdout:4/604: dwrite d17/d21/d4c/d50/f8c [0,4194304] 0 2026-03-09T00:03:56.901 INFO:tasks.workunit.client.1.vm06.stdout:4/605: dread - d17/d21/d4c/d66/f7b zero size 2026-03-09T00:03:56.901 INFO:tasks.workunit.client.1.vm06.stdout:4/606: chown d17 347709 1 2026-03-09T00:03:56.901 INFO:tasks.workunit.client.1.vm06.stdout:0/704: link d3/d18/d2c/d2d/d31/f4f d3/d18/d1f/d44/d6a/fec 0 2026-03-09T00:03:56.901 INFO:tasks.workunit.client.1.vm06.stdout:0/705: chown d3/d18/d2c/d2d/d31/l42 844056 1 2026-03-09T00:03:56.901 INFO:tasks.workunit.client.1.vm06.stdout:1/595: mkdir d6/d21/d2d/d3b/dc9 0 2026-03-09T00:03:56.901 INFO:tasks.workunit.client.1.vm06.stdout:6/700: rename d4/l43 to d4/d27/d42/da6/ld9 0 2026-03-09T00:03:56.901 INFO:tasks.workunit.client.1.vm06.stdout:6/701: fdatasync d4/d16/d46/fb9 0 2026-03-09T00:03:56.904 
INFO:tasks.workunit.client.1.vm06.stdout:0/706: write d3/d18/d1f/d39/d3b/f57 [2668180,128828] 0 2026-03-09T00:03:56.906 INFO:tasks.workunit.client.1.vm06.stdout:1/596: write d6/d21/d2d/d3b/d87/f9e [291771,127212] 0 2026-03-09T00:03:56.906 INFO:tasks.workunit.client.1.vm06.stdout:1/597: fdatasync d6/d4c/d79/f59 0 2026-03-09T00:03:56.907 INFO:tasks.workunit.client.1.vm06.stdout:4/607: creat d17/d21/d4c/fcc x:0 0 0 2026-03-09T00:03:56.908 INFO:tasks.workunit.client.1.vm06.stdout:9/559: symlink d1/d3/d2b/d58/lb4 0 2026-03-09T00:03:56.908 INFO:tasks.workunit.client.1.vm06.stdout:9/560: fdatasync d1/d4/d6e/f93 0 2026-03-09T00:03:56.908 INFO:tasks.workunit.client.1.vm06.stdout:9/561: creat d1/fb5 x:0 0 0 2026-03-09T00:03:56.909 INFO:tasks.workunit.client.1.vm06.stdout:6/702: symlink d4/d27/lda 0 2026-03-09T00:03:56.917 INFO:tasks.workunit.client.0.vm03.stdout:2/389: dwrite d8/d1b/f1f [0,4194304] 0 2026-03-09T00:03:56.918 INFO:tasks.workunit.client.1.vm06.stdout:0/707: link d3/d18/d1f/d39/cb7 d3/d18/d1f/d44/d6a/d73/ced 0 2026-03-09T00:03:56.922 INFO:tasks.workunit.client.0.vm03.stdout:6/397: write d13/d1e/d44/d4a/f58 [3123158,24290] 0 2026-03-09T00:03:56.922 INFO:tasks.workunit.client.0.vm03.stdout:6/398: truncate d13/f5d 942420 0 2026-03-09T00:03:56.922 INFO:tasks.workunit.client.0.vm03.stdout:6/399: chown d13/l67 106 1 2026-03-09T00:03:56.923 INFO:tasks.workunit.client.1.vm06.stdout:9/562: rename d1/d4/d6e/f7 to d1/d3/d4f/d91/dae/fb6 0 2026-03-09T00:03:56.925 INFO:tasks.workunit.client.1.vm06.stdout:8/654: dwrite db/d53/d70/d38/f72 [0,4194304] 0 2026-03-09T00:03:56.929 INFO:tasks.workunit.client.0.vm03.stdout:4/523: dread d7/f7e [0,4194304] 0 2026-03-09T00:03:56.933 INFO:tasks.workunit.client.0.vm03.stdout:4/524: write d7/d20/f21 [695246,76054] 0 2026-03-09T00:03:56.935 INFO:tasks.workunit.client.1.vm06.stdout:0/708: mknod d3/d18/d2c/cee 0 2026-03-09T00:03:56.940 INFO:tasks.workunit.client.1.vm06.stdout:0/709: write d3/f7 [3708209,91385] 0 2026-03-09T00:03:56.944 INFO:tasks.workunit.client.0.vm03.stdout:8/391: sync 2026-03-09T00:03:56.945 INFO:tasks.workunit.client.0.vm03.stdout:3/313: dwrite d2/db/f28 [4194304,4194304] 0 2026-03-09T00:03:56.945 INFO:tasks.workunit.client.0.vm03.stdout:3/314: chown d2/db/d40/d58 151 1 2026-03-09T00:03:56.946 INFO:tasks.workunit.client.0.vm03.stdout:6/400: mkdir d13/d8f 0 2026-03-09T00:03:56.946 INFO:tasks.workunit.client.1.vm06.stdout:1/598: symlink d6/d4c/lca 0 2026-03-09T00:03:56.947 INFO:tasks.workunit.client.1.vm06.stdout:1/599: dread d6/d4c/d51/fba [0,4194304] 0 2026-03-09T00:03:56.947 INFO:tasks.workunit.client.1.vm06.stdout:1/600: dread - d6/d4c/d71/fbf zero size 2026-03-09T00:03:56.949 INFO:tasks.workunit.client.1.vm06.stdout:9/563: mknod d1/d4/d2f/cb7 0 2026-03-09T00:03:56.949 INFO:tasks.workunit.client.1.vm06.stdout:9/564: fdatasync d1/d4/d6e/d9/f3d 0 2026-03-09T00:03:56.949 INFO:tasks.workunit.client.0.vm03.stdout:4/525: mkdir d7/d6f/da5 0 2026-03-09T00:03:56.949 INFO:tasks.workunit.client.0.vm03.stdout:4/526: readlink d7/l14 0 2026-03-09T00:03:56.952 INFO:tasks.workunit.client.0.vm03.stdout:8/392: rmdir d7/df/d1e/d5a 0 2026-03-09T00:03:56.967 INFO:tasks.workunit.client.0.vm03.stdout:8/393: write d7/df/d6b/f74 [297819,96026] 0 2026-03-09T00:03:56.967 INFO:tasks.workunit.client.0.vm03.stdout:8/394: write d7/df/d1a/d40/f69 [1003760,10] 0 2026-03-09T00:03:56.967 INFO:tasks.workunit.client.0.vm03.stdout:3/315: readlink d2/db/l38 0 2026-03-09T00:03:56.967 INFO:tasks.workunit.client.0.vm03.stdout:3/316: fsync d2/db/d2d/f52 0 2026-03-09T00:03:56.967 
INFO:tasks.workunit.client.0.vm03.stdout:6/401: truncate d13/d1e/f28 2381488 0 2026-03-09T00:03:56.967 INFO:tasks.workunit.client.0.vm03.stdout:4/527: mknod d7/d6f/ca6 0 2026-03-09T00:03:56.967 INFO:tasks.workunit.client.0.vm03.stdout:8/395: creat d7/df/d1a/d40/d58/f7a x:0 0 0 2026-03-09T00:03:56.970 INFO:tasks.workunit.client.0.vm03.stdout:6/402: symlink d13/d35/d4c/d62/l90 0 2026-03-09T00:03:56.972 INFO:tasks.workunit.client.0.vm03.stdout:4/528: chown d7/d20/d29/d78 93 1 2026-03-09T00:03:56.979 INFO:tasks.workunit.client.1.vm06.stdout:8/655: mkdir db/dd/d24/da7/dab/dd9 0 2026-03-09T00:03:56.985 INFO:tasks.workunit.client.0.vm03.stdout:0/410: dwrite d2/f32 [0,4194304] 0 2026-03-09T00:03:56.988 INFO:tasks.workunit.client.0.vm03.stdout:8/396: symlink d7/df/d1e/d38/d60/l7b 0 2026-03-09T00:03:57.005 INFO:tasks.workunit.client.0.vm03.stdout:8/397: dread - d7/df/d1e/f63 zero size 2026-03-09T00:03:57.005 INFO:tasks.workunit.client.0.vm03.stdout:8/398: chown d7/la 21551 1 2026-03-09T00:03:57.006 INFO:tasks.workunit.client.0.vm03.stdout:6/403: truncate d13/d1e/f2d 3551438 0 2026-03-09T00:03:57.006 INFO:tasks.workunit.client.0.vm03.stdout:6/404: truncate d13/d35/f68 1544620 0 2026-03-09T00:03:57.006 INFO:tasks.workunit.client.0.vm03.stdout:0/411: mknod d2/da/c99 0 2026-03-09T00:03:57.006 INFO:tasks.workunit.client.0.vm03.stdout:0/412: chown d2/da/d76/d8a/d8f 7047638 1 2026-03-09T00:03:57.009 INFO:tasks.workunit.client.1.vm06.stdout:0/710: creat d3/d18/d1f/d39/d49/d60/fef x:0 0 0 2026-03-09T00:03:57.009 INFO:tasks.workunit.client.1.vm06.stdout:0/711: creat d3/d18/d2c/d2d/d74/d90/ff0 x:0 0 0 2026-03-09T00:03:57.009 INFO:tasks.workunit.client.1.vm06.stdout:0/712: fsync d3/f1e 0 2026-03-09T00:03:57.009 INFO:tasks.workunit.client.1.vm06.stdout:0/713: fsync d3/d18/d2c/f7e 0 2026-03-09T00:03:57.009 INFO:tasks.workunit.client.1.vm06.stdout:0/714: fsync d3/d18/d2c/f4d 0 2026-03-09T00:03:57.009 INFO:tasks.workunit.client.1.vm06.stdout:9/565: creat d1/d4/d6e/d14/d25/d85/fb8 x:0 0 0 2026-03-09T00:03:57.009 INFO:tasks.workunit.client.1.vm06.stdout:9/566: fdatasync d1/d4/fe 0 2026-03-09T00:03:57.009 INFO:tasks.workunit.client.1.vm06.stdout:9/567: creat d1/da7/fb9 x:0 0 0 2026-03-09T00:03:57.009 INFO:tasks.workunit.client.1.vm06.stdout:0/715: creat d3/d18/de9/d9f/ff1 x:0 0 0 2026-03-09T00:03:57.009 INFO:tasks.workunit.client.0.vm03.stdout:8/399: rename d7/df/d1a/d40/d58/c4b to d7/df/d6b/c7c 0 2026-03-09T00:03:57.009 INFO:tasks.workunit.client.0.vm03.stdout:8/400: fdatasync d7/df/d1e/d3f/f47 0 2026-03-09T00:03:57.010 INFO:tasks.workunit.client.0.vm03.stdout:0/413: mkdir d2/da/dd/d49/d6c/d4b/d55/d9a 0 2026-03-09T00:03:57.011 INFO:tasks.workunit.client.0.vm03.stdout:8/401: unlink d7/df/d1e/d38/d4c/f5f 0 2026-03-09T00:03:57.012 INFO:tasks.workunit.client.0.vm03.stdout:6/405: rename d13/d1e/d44/d4a/f75 to d13/d1e/d44/d4a/d52/f91 0 2026-03-09T00:03:57.012 INFO:tasks.workunit.client.0.vm03.stdout:0/414: symlink d2/da/d1a/l9b 0 2026-03-09T00:03:57.013 INFO:tasks.workunit.client.0.vm03.stdout:8/402: rename d7/df/d1e/f63 to d7/df/d1e/d3f/f7d 0 2026-03-09T00:03:57.013 INFO:tasks.workunit.client.0.vm03.stdout:6/406: creat d13/f92 x:0 0 0 2026-03-09T00:03:57.014 INFO:tasks.workunit.client.0.vm03.stdout:0/415: link d2/f1e d2/da/d4e/f9c 0 2026-03-09T00:03:57.014 INFO:tasks.workunit.client.0.vm03.stdout:0/416: creat d2/da/dd/d49/d6c/f9d x:0 0 0 2026-03-09T00:03:57.015 INFO:tasks.workunit.client.0.vm03.stdout:8/403: mknod d7/df/d1a/c7e 0 2026-03-09T00:03:57.019 INFO:tasks.workunit.client.0.vm03.stdout:8/404: write 
d7/df/d1a/d40/f4d [3369474,58355] 0 2026-03-09T00:03:57.019 INFO:tasks.workunit.client.0.vm03.stdout:8/405: fsync d7/df/f3d 0 2026-03-09T00:03:57.019 INFO:tasks.workunit.client.0.vm03.stdout:8/406: creat d7/df/d1a/d40/d58/f7f x:0 0 0 2026-03-09T00:03:57.020 INFO:tasks.workunit.client.0.vm03.stdout:0/417: symlink d2/da/d76/l9e 0 2026-03-09T00:03:57.025 INFO:tasks.workunit.client.1.vm06.stdout:3/702: write d11/d3f/f71 [451669,124763] 0 2026-03-09T00:03:57.032 INFO:tasks.workunit.client.1.vm06.stdout:4/608: dwrite d17/d21/d4c/d50/f8c [4194304,4194304] 0 2026-03-09T00:03:57.034 INFO:tasks.workunit.client.0.vm03.stdout:1/516: dwrite d4/d3a/d3d/fa2 [0,4194304] 0 2026-03-09T00:03:57.034 INFO:tasks.workunit.client.0.vm03.stdout:1/517: stat d4/d3a/d61/d78/d81/d93/c9d 0 2026-03-09T00:03:57.035 INFO:tasks.workunit.client.1.vm06.stdout:6/703: dwrite d4/d27/d42/d52/d7d/faf [0,4194304] 0 2026-03-09T00:03:57.035 INFO:tasks.workunit.client.1.vm06.stdout:6/704: write d4/d27/d42/d52/f6c [2680709,57240] 0 2026-03-09T00:03:57.035 INFO:tasks.workunit.client.0.vm03.stdout:1/518: dread d4/d3a/d3d/d46/f70 [0,4194304] 0 2026-03-09T00:03:57.037 INFO:tasks.workunit.client.0.vm03.stdout:8/407: link d7/df/d1a/d2b/f72 d7/df/d1a/d40/f80 0 2026-03-09T00:03:57.037 INFO:tasks.workunit.client.0.vm03.stdout:8/408: chown d7/df/d1a/d2b/d62 40912 1 2026-03-09T00:03:57.037 INFO:tasks.workunit.client.0.vm03.stdout:8/409: write d7/f48 [1238092,46337] 0 2026-03-09T00:03:57.043 INFO:tasks.workunit.client.0.vm03.stdout:0/418: rename d2/d1f to d2/da/d1a/d9f 0 2026-03-09T00:03:57.059 INFO:tasks.workunit.client.0.vm03.stdout:0/419: chown d2/da/d36 27194 1 2026-03-09T00:03:57.059 INFO:tasks.workunit.client.0.vm03.stdout:0/420: dread - d2/da/dd/f7b zero size 2026-03-09T00:03:57.059 INFO:tasks.workunit.client.0.vm03.stdout:0/421: dread - d2/da/d1a/f56 zero size 2026-03-09T00:03:57.059 INFO:tasks.workunit.client.0.vm03.stdout:0/422: dread - d2/da/dd/d49/d6c/f89 zero size 2026-03-09T00:03:57.059 INFO:tasks.workunit.client.1.vm06.stdout:4/609: creat d17/d21/d4c/dc2/fcd x:0 0 0 2026-03-09T00:03:57.060 INFO:tasks.workunit.client.0.vm03.stdout:1/519: creat d4/d15/d5c/fb1 x:0 0 0 2026-03-09T00:03:57.060 INFO:tasks.workunit.client.0.vm03.stdout:8/410: mknod d7/df/c81 0 2026-03-09T00:03:57.060 INFO:tasks.workunit.client.0.vm03.stdout:8/411: readlink d7/l13 0 2026-03-09T00:03:57.060 INFO:tasks.workunit.client.0.vm03.stdout:8/412: read - d7/df/d1e/d3f/f47 zero size 2026-03-09T00:03:57.060 INFO:tasks.workunit.client.0.vm03.stdout:8/413: stat d7/df 0 2026-03-09T00:03:57.060 INFO:tasks.workunit.client.0.vm03.stdout:8/414: mknod d7/df/d1a/d2b/c82 0 2026-03-09T00:03:57.060 INFO:tasks.workunit.client.0.vm03.stdout:8/415: creat d7/df/d1e/d38/d60/f83 x:0 0 0 2026-03-09T00:03:57.060 INFO:tasks.workunit.client.0.vm03.stdout:8/416: truncate d7/df/d1a/d40/f5e 44213 0 2026-03-09T00:03:57.060 INFO:tasks.workunit.client.0.vm03.stdout:8/417: write d7/df/f37 [2936134,73400] 0 2026-03-09T00:03:57.060 INFO:tasks.workunit.client.0.vm03.stdout:8/418: chown d7/df/d1e/d38/d60 93699 1 2026-03-09T00:03:57.060 INFO:tasks.workunit.client.0.vm03.stdout:8/419: rename d7/df/d1a/c7e to d7/df/d1a/d40/c84 0 2026-03-09T00:03:57.060 INFO:tasks.workunit.client.0.vm03.stdout:8/420: write d7/df/d1a/d40/f76 [254755,110758] 0 2026-03-09T00:03:57.060 INFO:tasks.workunit.client.0.vm03.stdout:8/421: rename d7/df/d1a/d40/f80 to d7/df/d1e/d38/f85 0 2026-03-09T00:03:57.060 INFO:tasks.workunit.client.0.vm03.stdout:8/422: rename d7/df/c81 to d7/df/d1a/d2b/d62/c86 0 2026-03-09T00:03:57.064 
INFO:tasks.workunit.client.0.vm03.stdout:1/520: write d4/d15/f8a [4125093,30980] 0
2026-03-09T00:03:57.068 INFO:tasks.workunit.client.0.vm03.stdout:1/521: symlink d4/lb2 0
2026-03-09T00:03:57.070 INFO:tasks.workunit.client.0.vm03.stdout:1/522: truncate d4/d15/d5c/d6c/f71 592869 0
2026-03-09T00:03:57.084 INFO:tasks.workunit.client.0.vm03.stdout:6/407: dread d13/d1e/f3e [0,4194304] 0
2026-03-09T00:03:57.085 INFO:tasks.workunit.client.0.vm03.stdout:6/408: fsync d13/d1e/f3e 0
2026-03-09T00:03:57.085 INFO:tasks.workunit.client.0.vm03.stdout:6/409: dread - d13/d1e/d44/d4a/d52/f7a zero size
2026-03-09T00:03:57.085 INFO:tasks.workunit.client.0.vm03.stdout:6/410: fdatasync d13/f3a 0
2026-03-09T00:03:57.086 INFO:tasks.workunit.client.0.vm03.stdout:8/423: dread d7/f25 [0,4194304] 0
2026-03-09T00:03:57.086 INFO:tasks.workunit.client.0.vm03.stdout:8/424: write d7/df/d1e/d38/d60/f6e [214997,8075] 0
2026-03-09T00:03:57.086 INFO:tasks.workunit.client.0.vm03.stdout:8/425: readlink d7/df/d1a/d2b/l35 0
2026-03-09T00:03:57.086 INFO:tasks.workunit.client.0.vm03.stdout:6/411: link d13/d1e/l56 d13/d1e/d44/d59/l93 0
2026-03-09T00:03:57.086 INFO:tasks.workunit.client.0.vm03.stdout:8/426: creat d7/df/f87 x:0 0 0
2026-03-09T00:03:57.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:56 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:57.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:56 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:57.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:56 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:03:57.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:56 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:03:57.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:56 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:57.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:56 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:03:57.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:56 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mgr fail", "who": "vm06.rzcvhn"}]: dispatch
2026-03-09T00:03:57.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:56 vm03.local ceph-mon[52346]: osdmap e42: 6 total, 6 up, 6 in
2026-03-09T00:03:57.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:56 vm03.local ceph-mon[52346]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd='[{"prefix": "mgr fail", "who": "vm06.rzcvhn"}]': finished
2026-03-09T00:03:57.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:56 vm03.local ceph-mon[52346]: mgrmap e26: vm03.yvcons(active, starting, since 0.0500291s)
2026-03-09T00:03:57.089 INFO:tasks.workunit.client.1.vm06.stdout:1/601: dwrite d6/f8c [0,4194304] 0
2026-03-09T00:03:57.090 INFO:tasks.workunit.client.1.vm06.stdout:1/602: creat d6/d4c/d71/d83/fcb x:0 0 0
2026-03-09T00:03:57.123 INFO:tasks.workunit.client.0.vm03.stdout:2/390: dwrite d8/d17/f34 [0,4194304] 0
2026-03-09T00:03:57.125 INFO:tasks.workunit.client.0.vm03.stdout:2/391: write d8/d17/f27 [4322236,12349] 0
2026-03-09T00:03:57.145 INFO:tasks.workunit.client.0.vm03.stdout:8/427: fsync d7/df/d6b/f74 0
2026-03-09T00:03:57.147 INFO:tasks.workunit.client.1.vm06.stdout:3/703: dwrite d11/d28/d2e/d2f/d5b/fea [0,4194304] 0
2026-03-09T00:03:57.151 INFO:tasks.workunit.client.1.vm06.stdout:3/704: readlink d11/lb6 0
2026-03-09T00:03:57.151 INFO:tasks.workunit.client.1.vm06.stdout:3/705: mknod d11/d28/d2e/d2f/d5b/db5/cf7 0
2026-03-09T00:03:57.151 INFO:tasks.workunit.client.1.vm06.stdout:3/706: chown d11/d3f/f4c 3146 1
2026-03-09T00:03:57.151 INFO:tasks.workunit.client.1.vm06.stdout:3/707: symlink d11/d28/d2e/d2f/d5b/db5/lf8 0
2026-03-09T00:03:57.151 INFO:tasks.workunit.client.1.vm06.stdout:3/708: fsync d11/d28/d2e/d2f/d36/f59 0
2026-03-09T00:03:57.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:56 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:57.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:56 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:57.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:56 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:03:57.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:56 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:03:57.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:56 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn'
2026-03-09T00:03:57.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:56 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:03:57.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:56 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "mgr fail", "who": "vm06.rzcvhn"}]: dispatch
2026-03-09T00:03:57.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:56 vm06.local ceph-mon[58395]: osdmap e42: 6 total, 6 up, 6 in
2026-03-09T00:03:57.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:56 vm06.local ceph-mon[58395]: from='mgr.24345 192.168.123.106:0/3295922597' entity='mgr.vm06.rzcvhn' cmd='[{"prefix": "mgr fail", "who": "vm06.rzcvhn"}]': finished
2026-03-09T00:03:57.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:56 vm06.local ceph-mon[58395]: mgrmap e26: vm03.yvcons(active, starting, since 0.0500291s)
2026-03-09T00:03:57.175 INFO:tasks.workunit.client.1.vm06.stdout:4/610: dwrite d17/d21/d32/f96 [0,4194304] 0
2026-03-09T00:03:57.190 INFO:tasks.workunit.client.1.vm06.stdout:8/656: dwrite db/dd/f27 [0,4194304] 0
2026-03-09T00:03:57.190 INFO:tasks.workunit.client.1.vm06.stdout:8/657: truncate db/dd/f67 268619 0
2026-03-09T00:03:57.204 INFO:tasks.workunit.client.1.vm06.stdout:6/705: dwrite d4/d16/d53/f82 [4194304,4194304] 0
2026-03-09T00:03:57.212 INFO:tasks.workunit.client.0.vm03.stdout:1/523: dwrite d4/d15/f18 [4194304,4194304] 0
2026-03-09T00:03:57.212 INFO:tasks.workunit.client.0.vm03.stdout:1/524: mkdir
d4/d3a/d32/d87/db3 0 2026-03-09T00:03:57.213 INFO:tasks.workunit.client.0.vm03.stdout:1/525: mknod d4/d15/d77/d8c/cb4 0 2026-03-09T00:03:57.213 INFO:tasks.workunit.client.0.vm03.stdout:1/526: unlink d4/d3a/d32/da3/f91 0 2026-03-09T00:03:57.213 INFO:tasks.workunit.client.0.vm03.stdout:1/527: write d4/d3a/d32/f4b [4295963,23093] 0 2026-03-09T00:03:57.214 INFO:tasks.workunit.client.1.vm06.stdout:1/603: dwrite d6/d4c/d71/f84 [0,4194304] 0 2026-03-09T00:03:57.214 INFO:tasks.workunit.client.0.vm03.stdout:1/528: mkdir d4/d6/d52/db5 0 2026-03-09T00:03:57.215 INFO:tasks.workunit.client.0.vm03.stdout:1/529: rename d4/d15/c19 to d4/d6/d52/db5/cb6 0 2026-03-09T00:03:57.216 INFO:tasks.workunit.client.0.vm03.stdout:1/530: creat d4/d15/d5c/d6c/fb7 x:0 0 0 2026-03-09T00:03:57.217 INFO:tasks.workunit.client.1.vm06.stdout:1/604: mknod d6/d4c/d71/ccc 0 2026-03-09T00:03:57.219 INFO:tasks.workunit.client.1.vm06.stdout:1/605: mknod d6/d4c/d71/d83/ccd 0 2026-03-09T00:03:57.221 INFO:tasks.workunit.client.0.vm03.stdout:1/531: mknod d4/d3a/d61/cb8 0 2026-03-09T00:03:57.229 INFO:tasks.workunit.client.0.vm03.stdout:1/532: dread d4/f7d [0,4194304] 0 2026-03-09T00:03:57.229 INFO:tasks.workunit.client.0.vm03.stdout:4/529: dwrite d7/d20/d29/d38/d3a/d90/fa3 [0,4194304] 0 2026-03-09T00:03:57.237 INFO:tasks.workunit.client.0.vm03.stdout:2/392: dwrite d8/d1b/d2a/f4c [0,4194304] 0 2026-03-09T00:03:57.237 INFO:tasks.workunit.client.0.vm03.stdout:2/393: chown d8/f5d 4082090 1 2026-03-09T00:03:57.237 INFO:tasks.workunit.client.0.vm03.stdout:4/530: creat d7/fa7 x:0 0 0 2026-03-09T00:03:57.237 INFO:tasks.workunit.client.0.vm03.stdout:1/533: dread d4/d15/f17 [0,4194304] 0 2026-03-09T00:03:57.240 INFO:tasks.workunit.client.0.vm03.stdout:2/394: creat d8/d26/d5e/f7c x:0 0 0 2026-03-09T00:03:57.242 INFO:tasks.workunit.client.0.vm03.stdout:4/531: mknod d7/d20/d6a/ca8 0 2026-03-09T00:03:57.244 INFO:tasks.workunit.client.0.vm03.stdout:4/532: mkdir d7/d20/d29/d38/da9 0 2026-03-09T00:03:57.248 INFO:tasks.workunit.client.0.vm03.stdout:2/395: rmdir d8/d1b/d2a/d61 0 2026-03-09T00:03:57.248 INFO:tasks.workunit.client.0.vm03.stdout:4/533: mknod d7/d20/caa 0 2026-03-09T00:03:57.249 INFO:tasks.workunit.client.0.vm03.stdout:8/428: dwrite d7/df/d1a/d2b/f44 [0,4194304] 0 2026-03-09T00:03:57.250 INFO:tasks.workunit.client.1.vm06.stdout:3/709: dwrite d11/d28/d2e/d2f/d5b/d5f/f81 [0,4194304] 0 2026-03-09T00:03:57.250 INFO:tasks.workunit.client.0.vm03.stdout:2/396: readlink d8/d1b/d2a/d6b/l5b 0 2026-03-09T00:03:57.251 INFO:tasks.workunit.client.0.vm03.stdout:4/534: mknod d7/d20/d29/d54/cab 0 2026-03-09T00:03:57.251 INFO:tasks.workunit.client.0.vm03.stdout:4/535: chown d7/f71 1030094 1 2026-03-09T00:03:57.254 INFO:tasks.workunit.client.0.vm03.stdout:8/429: dread d7/df/d1e/d38/d60/f6e [0,4194304] 0 2026-03-09T00:03:57.254 INFO:tasks.workunit.client.0.vm03.stdout:8/430: chown d7/f18 393373611 1 2026-03-09T00:03:57.254 INFO:tasks.workunit.client.0.vm03.stdout:8/431: truncate d7/df/d1a/d2b/f77 302157 0 2026-03-09T00:03:57.257 INFO:tasks.workunit.client.1.vm06.stdout:3/710: unlink d11/d28/d2e/d2f/d36/f55 0 2026-03-09T00:03:57.271 INFO:tasks.workunit.client.1.vm06.stdout:0/716: rename d3/d18/d1f/d44/d6a to d3/d18/d79/df2 0 2026-03-09T00:03:57.279 INFO:tasks.workunit.client.0.vm03.stdout:2/397: chown d8/c28 16283063 1 2026-03-09T00:03:57.284 INFO:tasks.workunit.client.1.vm06.stdout:4/611: rename d17/d24/d3b/d54/fa5 to d17/d24/fce 0 2026-03-09T00:03:57.284 INFO:tasks.workunit.client.1.vm06.stdout:4/612: truncate d17/d24/d49/d5f/fad 696343 0 2026-03-09T00:03:57.285 
INFO:tasks.workunit.client.0.vm03.stdout:8/432: rename d7/f10 to d7/df/d6b/f88 0 2026-03-09T00:03:57.286 INFO:tasks.workunit.client.1.vm06.stdout:7/702: sync 2026-03-09T00:03:57.286 INFO:tasks.workunit.client.1.vm06.stdout:5/768: sync 2026-03-09T00:03:57.286 INFO:tasks.workunit.client.1.vm06.stdout:5/769: write d5/d1c/d21/f96 [609671,23767] 0 2026-03-09T00:03:57.299 INFO:tasks.workunit.client.1.vm06.stdout:6/706: dwrite d4/d16/d53/f82 [4194304,4194304] 0 2026-03-09T00:03:57.300 INFO:tasks.workunit.client.1.vm06.stdout:0/717: symlink d3/d18/de9/d9f/lf3 0 2026-03-09T00:03:57.300 INFO:tasks.workunit.client.1.vm06.stdout:0/718: chown d3/d18/d1f/d44/lad 307 1 2026-03-09T00:03:57.301 INFO:tasks.workunit.client.0.vm03.stdout:4/536: unlink d7/d20/d29/l8e 0 2026-03-09T00:03:57.304 INFO:tasks.workunit.client.1.vm06.stdout:8/658: rename db/d53/fbd to db/d1e/fda 0 2026-03-09T00:03:57.304 INFO:tasks.workunit.client.1.vm06.stdout:8/659: chown db/d74/d78 62317989 1 2026-03-09T00:03:57.306 INFO:tasks.workunit.client.0.vm03.stdout:8/433: write f3 [1470119,72772] 0 2026-03-09T00:03:57.307 INFO:tasks.workunit.client.0.vm03.stdout:8/434: chown d7/df/d1e/d38 23430639 1 2026-03-09T00:03:57.307 INFO:tasks.workunit.client.1.vm06.stdout:3/711: dwrite d11/d28/d2e/d2f/f99 [0,4194304] 0 2026-03-09T00:03:57.307 INFO:tasks.workunit.client.1.vm06.stdout:3/712: dread - d11/d28/d4d/d89/d90/fa7 zero size 2026-03-09T00:03:57.310 INFO:tasks.workunit.client.0.vm03.stdout:5/446: sync 2026-03-09T00:03:57.310 INFO:tasks.workunit.client.0.vm03.stdout:5/447: stat ff 0 2026-03-09T00:03:57.310 INFO:tasks.workunit.client.0.vm03.stdout:5/448: write d1c/d20/d55/d3b/f57 [4667215,50862] 0 2026-03-09T00:03:57.310 INFO:tasks.workunit.client.1.vm06.stdout:5/770: symlink d5/d44/d84/dc5/de8/l104 0 2026-03-09T00:03:57.310 INFO:tasks.workunit.client.1.vm06.stdout:5/771: write d5/f36 [3700492,112265] 0 2026-03-09T00:03:57.310 INFO:tasks.workunit.client.0.vm03.stdout:8/435: symlink d7/df/d6b/l89 0 2026-03-09T00:03:57.311 INFO:tasks.workunit.client.0.vm03.stdout:8/436: dread d7/df/d1e/d38/f3e [0,4194304] 0 2026-03-09T00:03:57.312 INFO:tasks.workunit.client.1.vm06.stdout:6/707: mkdir d4/d27/d42/d4b/ddb 0 2026-03-09T00:03:57.313 INFO:tasks.workunit.client.0.vm03.stdout:6/412: getdents d13/d1e/d44/d4a 0 2026-03-09T00:03:57.315 INFO:tasks.workunit.client.1.vm06.stdout:5/772: write d5/d44/d4b/d92/f86 [394557,8817] 0 2026-03-09T00:03:57.315 INFO:tasks.workunit.client.1.vm06.stdout:5/773: chown d5/ff 65428097 1 2026-03-09T00:03:57.315 INFO:tasks.workunit.client.0.vm03.stdout:5/449: creat d1c/f96 x:0 0 0 2026-03-09T00:03:57.315 INFO:tasks.workunit.client.0.vm03.stdout:5/450: truncate d1c/d20/d56/d74/f84 569164 0 2026-03-09T00:03:57.315 INFO:tasks.workunit.client.0.vm03.stdout:5/451: chown d1c/f1f 214819678 1 2026-03-09T00:03:57.317 INFO:tasks.workunit.client.1.vm06.stdout:7/703: rename d0/d55/d99/fb9 to d0/df/d1a/d27/d4c/d40/d51/d90/dae/fc9 0 2026-03-09T00:03:57.317 INFO:tasks.workunit.client.1.vm06.stdout:7/704: creat d0/d39/fca x:0 0 0 2026-03-09T00:03:57.317 INFO:tasks.workunit.client.0.vm03.stdout:8/437: symlink d7/df/d1e/d38/d60/l8a 0 2026-03-09T00:03:57.322 INFO:tasks.workunit.client.0.vm03.stdout:6/413: creat d13/d1e/d44/d59/d77/f94 x:0 0 0 2026-03-09T00:03:57.324 INFO:tasks.workunit.client.0.vm03.stdout:5/452: unlink d1c/d20/d55/c48 0 2026-03-09T00:03:57.324 INFO:tasks.workunit.client.0.vm03.stdout:5/453: fdatasync d1c/d20/d55/d66/d70/f80 0 2026-03-09T00:03:57.324 INFO:tasks.workunit.client.0.vm03.stdout:5/454: chown d1c/d20/d55/d43/l79 0 1 
2026-03-09T00:03:57.324 INFO:tasks.workunit.client.0.vm03.stdout:5/455: readlink d1c/d20/d55/d43/l8b 0 2026-03-09T00:03:57.326 INFO:tasks.workunit.client.0.vm03.stdout:8/438: symlink d7/df/d1e/d38/d4c/l8b 0 2026-03-09T00:03:57.330 INFO:tasks.workunit.client.0.vm03.stdout:6/414: link d13/l8e d13/d1e/d44/d4a/d52/l95 0 2026-03-09T00:03:57.339 INFO:tasks.workunit.client.0.vm03.stdout:6/415: chown d13/l38 2558015 1 2026-03-09T00:03:57.339 INFO:tasks.workunit.client.0.vm03.stdout:6/416: creat d13/d1e/d44/d59/d77/f96 x:0 0 0 2026-03-09T00:03:57.339 INFO:tasks.workunit.client.0.vm03.stdout:6/417: truncate d13/d35/f6a 277916 0 2026-03-09T00:03:57.339 INFO:tasks.workunit.client.1.vm06.stdout:3/713: creat d11/d28/d2e/d7e/d83/d87/ff9 x:0 0 0 2026-03-09T00:03:57.340 INFO:tasks.workunit.client.0.vm03.stdout:8/439: write d7/df/d1a/d40/f5e [1546,75260] 0 2026-03-09T00:03:57.340 INFO:tasks.workunit.client.0.vm03.stdout:8/440: truncate d7/df/d1a/f33 371508 0 2026-03-09T00:03:57.340 INFO:tasks.workunit.client.0.vm03.stdout:8/441: write d7/df/d1a/d40/f5e [956110,11155] 0 2026-03-09T00:03:57.340 INFO:tasks.workunit.client.0.vm03.stdout:8/442: creat d7/df/d1a/d40/d58/f8c x:0 0 0 2026-03-09T00:03:57.340 INFO:tasks.workunit.client.0.vm03.stdout:8/443: write d7/df/d1a/f33 [1175221,27142] 0 2026-03-09T00:03:57.344 INFO:tasks.workunit.client.0.vm03.stdout:5/456: unlink d1c/c2d 0 2026-03-09T00:03:57.344 INFO:tasks.workunit.client.0.vm03.stdout:5/457: write d1c/d20/f4e [5094413,88153] 0 2026-03-09T00:03:57.344 INFO:tasks.workunit.client.0.vm03.stdout:5/458: chown d1c/d20/d56/l95 4007 1 2026-03-09T00:03:57.344 INFO:tasks.workunit.client.1.vm06.stdout:6/708: truncate d4/d27/fb6 103939 0 2026-03-09T00:03:57.345 INFO:tasks.workunit.client.1.vm06.stdout:0/719: getdents d3/d18/d1f/d39 0 2026-03-09T00:03:57.351 INFO:tasks.workunit.client.1.vm06.stdout:8/660: rename db/d53/d70/d38/f99 to db/d53/d7c/fdb 0 2026-03-09T00:03:57.355 INFO:tasks.workunit.client.1.vm06.stdout:3/714: dread d11/d28/d2e/d7e/fd3 [0,4194304] 0 2026-03-09T00:03:57.359 INFO:tasks.workunit.client.1.vm06.stdout:7/705: truncate d0/df/d1a/d27/d4c/f32 1512652 0 2026-03-09T00:03:57.359 INFO:tasks.workunit.client.1.vm06.stdout:7/706: creat d0/d55/d99/fcb x:0 0 0 2026-03-09T00:03:57.361 INFO:tasks.workunit.client.0.vm03.stdout:9/457: sync 2026-03-09T00:03:57.365 INFO:tasks.workunit.client.0.vm03.stdout:9/458: dread d15/f44 [0,4194304] 0 2026-03-09T00:03:57.380 INFO:tasks.workunit.client.1.vm06.stdout:0/720: symlink d3/d18/d2c/d2d/d74/daf/lf4 0 2026-03-09T00:03:57.381 INFO:tasks.workunit.client.1.vm06.stdout:0/721: write d3/d18/d1f/d39/d69/f71 [207105,1954] 0 2026-03-09T00:03:57.381 INFO:tasks.workunit.client.1.vm06.stdout:0/722: creat d3/d18/d28/ff5 x:0 0 0 2026-03-09T00:03:57.381 INFO:tasks.workunit.client.1.vm06.stdout:8/661: unlink db/d1e/l2b 0 2026-03-09T00:03:57.381 INFO:tasks.workunit.client.0.vm03.stdout:5/459: mkdir d1c/d20/d97 0 2026-03-09T00:03:57.381 INFO:tasks.workunit.client.0.vm03.stdout:6/418: rename d13/d35/d7e to d13/d35/d71/d97 0 2026-03-09T00:03:57.381 INFO:tasks.workunit.client.0.vm03.stdout:6/419: creat d13/d1e/d44/d59/d77/f98 x:0 0 0 2026-03-09T00:03:57.381 INFO:tasks.workunit.client.0.vm03.stdout:3/317: sync 2026-03-09T00:03:57.381 INFO:tasks.workunit.client.0.vm03.stdout:7/411: sync 2026-03-09T00:03:57.381 INFO:tasks.workunit.client.0.vm03.stdout:7/412: chown d2/d4/d1e/l53 7503882 1 2026-03-09T00:03:57.383 INFO:tasks.workunit.client.1.vm06.stdout:8/662: dread db/d74/faa [0,4194304] 0 2026-03-09T00:03:57.383 
INFO:tasks.workunit.client.1.vm06.stdout:4/613: dwrite d17/d24/f39 [0,4194304] 0 2026-03-09T00:03:57.383 INFO:tasks.workunit.client.1.vm06.stdout:4/614: creat d17/d21/d4c/d66/fcf x:0 0 0 2026-03-09T00:03:57.385 INFO:tasks.workunit.client.1.vm06.stdout:3/715: link d11/d28/d2e/d2f/f78 d11/d28/d2e/d2f/d5b/d5f/d91/ffa 0 2026-03-09T00:03:57.385 INFO:tasks.workunit.client.0.vm03.stdout:0/423: sync 2026-03-09T00:03:57.386 INFO:tasks.workunit.client.0.vm03.stdout:6/420: truncate d13/d1e/f48 2644632 0 2026-03-09T00:03:57.390 INFO:tasks.workunit.client.0.vm03.stdout:8/444: rename d7/df/d1a/f2a to d7/df/d1a/d2b/f8d 0 2026-03-09T00:03:57.391 INFO:tasks.workunit.client.0.vm03.stdout:8/445: write d7/df/d1e/d38/d60/f6e [48729,109557] 0 2026-03-09T00:03:57.391 INFO:tasks.workunit.client.0.vm03.stdout:8/446: write d7/df/d1a/d2b/f77 [766414,6761] 0 2026-03-09T00:03:57.391 INFO:tasks.workunit.client.0.vm03.stdout:8/447: read d7/df/d6b/f88 [386620,117267] 0 2026-03-09T00:03:57.393 INFO:tasks.workunit.client.0.vm03.stdout:3/318: mknod d2/db/d3b/d5d/c5e 0 2026-03-09T00:03:57.394 INFO:tasks.workunit.client.1.vm06.stdout:4/615: write f15 [2381289,129767] 0 2026-03-09T00:03:57.394 INFO:tasks.workunit.client.1.vm06.stdout:4/616: readlink d17/d24/d49/l46 0 2026-03-09T00:03:57.397 INFO:tasks.workunit.client.0.vm03.stdout:4/537: dwrite d7/d20/d29/f2a [4194304,4194304] 0 2026-03-09T00:03:57.397 INFO:tasks.workunit.client.0.vm03.stdout:4/538: dread - d7/f71 zero size 2026-03-09T00:03:57.407 INFO:tasks.workunit.client.1.vm06.stdout:7/707: mkdir d0/df/d1a/d27/d4c/d40/d51/d90/dcc 0 2026-03-09T00:03:57.410 INFO:tasks.workunit.client.1.vm06.stdout:2/743: sync 2026-03-09T00:03:57.410 INFO:tasks.workunit.client.1.vm06.stdout:2/744: chown d7/d1a/d56/lb9 267230528 1 2026-03-09T00:03:57.410 INFO:tasks.workunit.client.1.vm06.stdout:2/745: chown d7/l68 594870518 1 2026-03-09T00:03:57.410 INFO:tasks.workunit.client.1.vm06.stdout:2/746: chown d7/d1a/fd3 1 1 2026-03-09T00:03:57.410 INFO:tasks.workunit.client.1.vm06.stdout:9/568: sync 2026-03-09T00:03:57.411 INFO:tasks.workunit.client.0.vm03.stdout:7/413: mknod d2/d4/d1e/c7b 0 2026-03-09T00:03:57.411 INFO:tasks.workunit.client.0.vm03.stdout:7/414: readlink d2/d1f/l20 0 2026-03-09T00:03:57.416 INFO:tasks.workunit.client.0.vm03.stdout:6/421: creat d13/d35/d4c/f99 x:0 0 0 2026-03-09T00:03:57.421 INFO:tasks.workunit.client.1.vm06.stdout:0/723: chown d3/d18/d1f/c24 284215 1 2026-03-09T00:03:57.434 INFO:tasks.workunit.client.0.vm03.stdout:1/534: dwrite d4/d3a/f2c [0,4194304] 0 2026-03-09T00:03:57.434 INFO:tasks.workunit.client.0.vm03.stdout:8/448: creat d7/df/d1e/d3f/f8e x:0 0 0 2026-03-09T00:03:57.434 INFO:tasks.workunit.client.0.vm03.stdout:8/449: chown d7/f18 1622776 1 2026-03-09T00:03:57.435 INFO:tasks.workunit.client.0.vm03.stdout:3/319: mkdir d2/db/d3b/d5f 0 2026-03-09T00:03:57.435 INFO:tasks.workunit.client.0.vm03.stdout:2/398: dwrite d8/d1b/d24/f2f [0,4194304] 0 2026-03-09T00:03:57.435 INFO:tasks.workunit.client.0.vm03.stdout:2/399: chown d8 26742706 1 2026-03-09T00:03:57.435 INFO:tasks.workunit.client.0.vm03.stdout:2/400: write d8/d1b/d2a/d6b/d50/f54 [872037,72663] 0 2026-03-09T00:03:57.439 INFO:tasks.workunit.client.0.vm03.stdout:4/539: mknod d7/d20/d35/d66/cac 0 2026-03-09T00:03:57.439 INFO:tasks.workunit.client.0.vm03.stdout:4/540: stat d7/d20/d35/c63 0 2026-03-09T00:03:57.441 INFO:tasks.workunit.client.1.vm06.stdout:4/617: symlink d17/d5b/d8f/ld0 0 2026-03-09T00:03:57.441 INFO:tasks.workunit.client.1.vm06.stdout:4/618: truncate f1 13292076 0 2026-03-09T00:03:57.441 
INFO:tasks.workunit.client.1.vm06.stdout:4/619: readlink d17/l1c 0 2026-03-09T00:03:57.445 INFO:tasks.workunit.client.1.vm06.stdout:2/747: mknod d7/d1a/d25/d66/d87/da8/ce0 0 2026-03-09T00:03:57.448 INFO:tasks.workunit.client.1.vm06.stdout:2/748: stat d7/d1b/d71/d79/db4/dc1/f5e 0 2026-03-09T00:03:57.448 INFO:tasks.workunit.client.0.vm03.stdout:1/535: unlink d4/d15/d1a/f92 0 2026-03-09T00:03:57.448 INFO:tasks.workunit.client.0.vm03.stdout:1/536: write d4/d3a/d32/f4b [3426310,19863] 0 2026-03-09T00:03:57.451 INFO:tasks.workunit.client.1.vm06.stdout:4/620: dread d17/d21/d4c/d50/f69 [0,4194304] 0 2026-03-09T00:03:57.455 INFO:tasks.workunit.client.0.vm03.stdout:8/450: unlink d7/df/d1e/d38/d60/f83 0 2026-03-09T00:03:57.456 INFO:tasks.workunit.client.0.vm03.stdout:8/451: creat d7/df/d1e/d3f/f8f x:0 0 0 2026-03-09T00:03:57.456 INFO:tasks.workunit.client.0.vm03.stdout:8/452: truncate d7/df/d6b/f75 302268 0 2026-03-09T00:03:57.456 INFO:tasks.workunit.client.0.vm03.stdout:8/453: fdatasync d7/df/d1a/d40/f78 0 2026-03-09T00:03:57.456 INFO:tasks.workunit.client.0.vm03.stdout:8/454: creat d7/df/d1e/d3f/f90 x:0 0 0 2026-03-09T00:03:57.457 INFO:tasks.workunit.client.0.vm03.stdout:1/537: write d4/fa0 [1298487,44875] 0 2026-03-09T00:03:57.457 INFO:tasks.workunit.client.0.vm03.stdout:1/538: write d4/d3a/d3d/d46/f70 [881271,17756] 0 2026-03-09T00:03:57.457 INFO:tasks.workunit.client.0.vm03.stdout:1/539: write d4/d3a/d32/f4f [93551,87774] 0 2026-03-09T00:03:57.459 INFO:tasks.workunit.client.1.vm06.stdout:9/569: rename d1/d3/d4f/d91/f99 to d1/d3/d50/fba 0 2026-03-09T00:03:57.459 INFO:tasks.workunit.client.1.vm06.stdout:9/570: write d1/d4/fa2 [389343,103275] 0 2026-03-09T00:03:57.459 INFO:tasks.workunit.client.1.vm06.stdout:9/571: dread - d1/d3/f9b zero size 2026-03-09T00:03:57.459 INFO:tasks.workunit.client.1.vm06.stdout:9/572: chown d1/d4/d6e/d9/c1e 57 1 2026-03-09T00:03:57.459 INFO:tasks.workunit.client.1.vm06.stdout:9/573: fsync d1/d4/d6e/fa9 0 2026-03-09T00:03:57.467 INFO:tasks.workunit.client.1.vm06.stdout:0/724: symlink d3/d18/de9/d9f/lf6 0 2026-03-09T00:03:57.481 INFO:tasks.workunit.client.0.vm03.stdout:3/320: creat d2/db/d3b/d5d/f60 x:0 0 0 2026-03-09T00:03:57.482 INFO:tasks.workunit.client.0.vm03.stdout:3/321: fdatasync d2/db/d3b/f3e 0 2026-03-09T00:03:57.482 INFO:tasks.workunit.client.0.vm03.stdout:3/322: dread - d2/db/d40/d51/f5c zero size 2026-03-09T00:03:57.482 INFO:tasks.workunit.client.0.vm03.stdout:3/323: write d2/f1d [2758327,90117] 0 2026-03-09T00:03:57.482 INFO:tasks.workunit.client.0.vm03.stdout:2/401: chown d8/d1b/d2a/l36 208 1 2026-03-09T00:03:57.482 INFO:tasks.workunit.client.0.vm03.stdout:4/541: mknod d7/d6f/da5/cad 0 2026-03-09T00:03:57.482 INFO:tasks.workunit.client.0.vm03.stdout:4/542: dread - d7/d20/d6a/f76 zero size 2026-03-09T00:03:57.482 INFO:tasks.workunit.client.0.vm03.stdout:4/543: read f4 [1990091,120090] 0 2026-03-09T00:03:57.482 INFO:tasks.workunit.client.0.vm03.stdout:9/459: rename d15/d1c/d21/d67 to d15/d1c/d21/d54/d87/d93 0 2026-03-09T00:03:57.483 INFO:tasks.workunit.client.1.vm06.stdout:7/708: getdents d0/df/d1a/d27/d4c/d40/d5b 0 2026-03-09T00:03:57.486 INFO:tasks.workunit.client.1.vm06.stdout:6/709: dwrite d4/d27/d3e/d78/f92 [0,4194304] 0 2026-03-09T00:03:57.488 INFO:tasks.workunit.client.1.vm06.stdout:1/606: sync 2026-03-09T00:03:57.489 INFO:tasks.workunit.client.0.vm03.stdout:4/544: creat d7/d20/d6a/d77/fae x:0 0 0 2026-03-09T00:03:57.489 INFO:tasks.workunit.client.0.vm03.stdout:4/545: write d7/d20/d6a/d77/f82 [3078170,24511] 0 2026-03-09T00:03:57.489 
INFO:tasks.workunit.client.0.vm03.stdout:9/460: symlink d15/d1c/d8d/l94 0 2026-03-09T00:03:57.491 INFO:tasks.workunit.client.1.vm06.stdout:6/710: dread d4/d16/f33 [0,4194304] 0 2026-03-09T00:03:57.494 INFO:tasks.workunit.client.0.vm03.stdout:9/461: dread d15/d1c/d21/d54/d87/d93/f7e [0,4194304] 0 2026-03-09T00:03:57.494 INFO:tasks.workunit.client.0.vm03.stdout:9/462: truncate d15/d1c/d28/f5e 451717 0 2026-03-09T00:03:57.496 INFO:tasks.workunit.client.1.vm06.stdout:4/621: symlink d17/d21/d32/ld1 0 2026-03-09T00:03:57.498 INFO:tasks.workunit.client.0.vm03.stdout:3/324: dread d2/db/f3a [0,4194304] 0 2026-03-09T00:03:57.498 INFO:tasks.workunit.client.0.vm03.stdout:3/325: read - d2/db/d40/d44/f4d zero size 2026-03-09T00:03:57.498 INFO:tasks.workunit.client.0.vm03.stdout:3/326: rename d2 to d2/db/d40/d51/d61 22 2026-03-09T00:03:57.500 INFO:tasks.workunit.client.1.vm06.stdout:8/663: rmdir db/d53/d70/d38 39 2026-03-09T00:03:57.500 INFO:tasks.workunit.client.1.vm06.stdout:8/664: fsync db/dd/d85/d9f/f88 0 2026-03-09T00:03:57.500 INFO:tasks.workunit.client.1.vm06.stdout:8/665: truncate db/d1e/f4f 1664189 0 2026-03-09T00:03:57.504 INFO:tasks.workunit.client.0.vm03.stdout:1/540: write d4/d15/d77/fa9 [6329889,55191] 0 2026-03-09T00:03:57.511 INFO:tasks.workunit.client.0.vm03.stdout:1/541: creat d4/d3a/d32/fb9 x:0 0 0 2026-03-09T00:03:57.511 INFO:tasks.workunit.client.0.vm03.stdout:1/542: creat d4/d3a/d43/fba x:0 0 0 2026-03-09T00:03:57.511 INFO:tasks.workunit.client.0.vm03.stdout:1/543: write d4/d15/f4e [1182104,82742] 0 2026-03-09T00:03:57.511 INFO:tasks.workunit.client.0.vm03.stdout:2/402: getdents d8/d1b/d2a 0 2026-03-09T00:03:57.512 INFO:tasks.workunit.client.0.vm03.stdout:4/546: mknod d7/d20/d29/d54/d58/d85/caf 0 2026-03-09T00:03:57.512 INFO:tasks.workunit.client.1.vm06.stdout:9/574: creat d1/d3/d2b/fbb x:0 0 0 2026-03-09T00:03:57.512 INFO:tasks.workunit.client.1.vm06.stdout:9/575: fdatasync d1/d4/d6e/d14/d25/d85/f28 0 2026-03-09T00:03:57.512 INFO:tasks.workunit.client.1.vm06.stdout:0/725: mknod d3/cf7 0 2026-03-09T00:03:57.517 INFO:tasks.workunit.client.0.vm03.stdout:9/463: symlink d15/d1c/d21/d54/d87/l95 0 2026-03-09T00:03:57.517 INFO:tasks.workunit.client.0.vm03.stdout:3/327: chown d2/db/l22 4159050 1 2026-03-09T00:03:57.517 INFO:tasks.workunit.client.1.vm06.stdout:1/607: rename d6/d4c/d51/c6f to d6/d21/d2d/d3b/d42/cce 0 2026-03-09T00:03:57.520 INFO:tasks.workunit.client.1.vm06.stdout:4/622: symlink d17/d24/d3b/d5e/d6e/ld2 0 2026-03-09T00:03:57.523 INFO:tasks.workunit.client.1.vm06.stdout:4/623: chown d17/f20 64 1 2026-03-09T00:03:57.523 INFO:tasks.workunit.client.1.vm06.stdout:4/624: chown d17/d21/d4c/d50 26208601 1 2026-03-09T00:03:57.524 INFO:tasks.workunit.client.1.vm06.stdout:1/608: dread d6/d4c/d71/d83/f9b [0,4194304] 0 2026-03-09T00:03:57.524 INFO:tasks.workunit.client.1.vm06.stdout:1/609: chown d6/d21/l2c 3662 1 2026-03-09T00:03:57.525 INFO:tasks.workunit.client.0.vm03.stdout:1/544: truncate d4/d15/d1a/f1b 1036805 0 2026-03-09T00:03:57.529 INFO:tasks.workunit.client.1.vm06.stdout:1/610: dread d6/fb [0,4194304] 0 2026-03-09T00:03:57.531 INFO:tasks.workunit.client.0.vm03.stdout:7/415: dwrite d2/f50 [4194304,4194304] 0 2026-03-09T00:03:57.531 INFO:tasks.workunit.client.1.vm06.stdout:4/625: fsync f15 0 2026-03-09T00:03:57.531 INFO:tasks.workunit.client.1.vm06.stdout:4/626: fsync d17/d24/d3b/d54/f80 0 2026-03-09T00:03:57.543 INFO:tasks.workunit.client.0.vm03.stdout:2/403: symlink d8/d1b/d2a/d2e/l7d 0 2026-03-09T00:03:57.549 INFO:tasks.workunit.client.0.vm03.stdout:4/547: mkdir d7/d6f/da5/db0 0 
2026-03-09T00:03:57.549 INFO:tasks.workunit.client.0.vm03.stdout:9/464: symlink d15/d1c/d21/d54/d87/d93/l96 0
2026-03-09T00:03:57.550 INFO:tasks.workunit.client.1.vm06.stdout:8/666: creat db/dd/d24/da7/dab/fdc x:0 0 0
2026-03-09T00:03:57.550 INFO:tasks.workunit.client.1.vm06.stdout:0/726: creat d3/d18/d2c/d2d/d74/daf/de3/ff8 x:0 0 0
2026-03-09T00:03:57.550 INFO:tasks.workunit.client.1.vm06.stdout:0/727: truncate d3/f1b 4767462 0
2026-03-09T00:03:57.550 INFO:tasks.workunit.client.1.vm06.stdout:1/611: mknod d6/d21/d2d/d3b/d42/ccf 0
2026-03-09T00:03:57.550 INFO:tasks.workunit.client.1.vm06.stdout:1/612: fsync d6/d4c/d71/d83/fcb 0
2026-03-09T00:03:57.550 INFO:tasks.workunit.client.1.vm06.stdout:4/627: creat d17/d5b/d8f/fd3 x:0 0 0
2026-03-09T00:03:57.550 INFO:tasks.workunit.client.1.vm06.stdout:4/628: stat d17/d21/d4c/d66/cab 0
2026-03-09T00:03:57.550 INFO:tasks.workunit.client.1.vm06.stdout:4/629: creat d17/d21/d4c/fd4 x:0 0 0
2026-03-09T00:03:57.551 INFO:tasks.workunit.client.0.vm03.stdout:1/545: creat d4/d3a/d61/d78/d81/d9e/fbb x:0 0 0
2026-03-09T00:03:57.556 INFO:tasks.workunit.client.1.vm06.stdout:8/667: rename db/dd/d24/d63/f92 to db/dd/d85/d9f/db7/fdd 0
2026-03-09T00:03:57.556 INFO:tasks.workunit.client.1.vm06.stdout:8/668: fsync db/d53/d70/fa6 0
2026-03-09T00:03:57.556 INFO:tasks.workunit.client.1.vm06.stdout:8/669: fsync db/d53/d70/f71 0
2026-03-09T00:03:57.556 INFO:tasks.workunit.client.1.vm06.stdout:8/670: fsync db/dd/d85/d9f/f88 0
2026-03-09T00:03:57.557 INFO:tasks.workunit.client.1.vm06.stdout:8/671: chown db/dd/d24/da7 107222 1
2026-03-09T00:03:57.557 INFO:tasks.workunit.client.1.vm06.stdout:1/613: mkdir d6/d21/d2d/dd0 0
2026-03-09T00:03:57.559 INFO:tasks.workunit.client.0.vm03.stdout:7/416: mknod d2/d4/d1e/d5e/d6c/d37/c7c 0
2026-03-09T00:03:57.560 INFO:tasks.workunit.client.0.vm03.stdout:2/404: link d8/d1b/d24/c52 d8/d1b/d6c/c7e 0
2026-03-09T00:03:57.560 INFO:tasks.workunit.client.0.vm03.stdout:2/405: chown d8/l19 15194568 1
2026-03-09T00:03:57.564 INFO:tasks.workunit.client.0.vm03.stdout:4/548: mknod d7/d20/cb1 0
2026-03-09T00:03:57.564 INFO:tasks.workunit.client.0.vm03.stdout:9/465: symlink d15/d1c/d21/d75/l97 0
2026-03-09T00:03:57.566 INFO:tasks.workunit.client.0.vm03.stdout:4/549: dread d7/d20/d6a/d77/d25/fa1 [0,4194304] 0
2026-03-09T00:03:57.567 INFO:tasks.workunit.client.1.vm06.stdout:8/672: mknod db/d53/d70/d38/d4d/d79/dd5/cde 0
2026-03-09T00:03:57.568 INFO:tasks.workunit.client.1.vm06.stdout:8/673: dread db/f31 [0,4194304] 0
2026-03-09T00:03:57.568 INFO:tasks.workunit.client.1.vm06.stdout:8/674: truncate db/d1e/d9b/fbc 205610 0
2026-03-09T00:03:57.569 INFO:tasks.workunit.client.1.vm06.stdout:8/675: dread db/d74/faa [0,4194304] 0
2026-03-09T00:03:57.572 INFO:tasks.workunit.client.0.vm03.stdout:0/424: dwrite d2/da/d1a/f80 [0,4194304] 0
2026-03-09T00:03:57.573 INFO:tasks.workunit.client.0.vm03.stdout:5/460: dwrite d1c/d20/d55/d66/d70/f80 [0,4194304] 0
2026-03-09T00:03:57.573 INFO:tasks.workunit.client.0.vm03.stdout:5/461: write d1c/d20/f65 [852444,69529] 0
2026-03-09T00:03:57.573 INFO:tasks.workunit.client.0.vm03.stdout:6/422: dwrite f12 [0,4194304] 0
2026-03-09T00:03:57.575 INFO:tasks.workunit.client.0.vm03.stdout:3/328: write d2/f5 [2228209,36206] 0
2026-03-09T00:03:57.575 INFO:tasks.workunit.client.0.vm03.stdout:3/329: chown d2/db/d40/d51/f5c 216 1
2026-03-09T00:03:57.576 INFO:tasks.workunit.client.0.vm03.stdout:0/425: dread d2/da/d1a/f80 [0,4194304] 0
2026-03-09T00:03:57.576 INFO:tasks.workunit.client.1.vm06.stdout:1/614: truncate d6/d21/d2d/d37/fb5 3706027 0
2026-03-09T00:03:57.577 INFO:tasks.workunit.client.0.vm03.stdout:1/546: mknod d4/d3a/d61/d78/d81/cbc 0
2026-03-09T00:03:57.581 INFO:tasks.workunit.client.0.vm03.stdout:7/417: mkdir d2/d1f/d42/d43/d7d 0
2026-03-09T00:03:57.584 INFO:tasks.workunit.client.0.vm03.stdout:2/406: mknod d8/d26/d5e/c7f 0
2026-03-09T00:03:57.585 INFO:tasks.workunit.client.0.vm03.stdout:7/418: dread d2/d1f/f11 [0,4194304] 0
2026-03-09T00:03:57.585 INFO:tasks.workunit.client.0.vm03.stdout:7/419: chown d2/d1f/d35/l66 39715590 1
2026-03-09T00:03:57.585 INFO:tasks.workunit.client.1.vm06.stdout:3/716: dwrite d11/d28/d2e/f47 [0,4194304] 0
2026-03-09T00:03:57.585 INFO:tasks.workunit.client.1.vm06.stdout:3/717: fdatasync d11/d28/d2e/db2/fda 0
2026-03-09T00:03:57.585 INFO:tasks.workunit.client.1.vm06.stdout:7/709: dwrite d0/df/d1a/d27/d70/fc4 [0,4194304] 0
2026-03-09T00:03:57.586 INFO:tasks.workunit.client.0.vm03.stdout:8/455: dwrite d7/df/d1e/d38/d60/f71 [0,4194304] 0
2026-03-09T00:03:57.586 INFO:tasks.workunit.client.0.vm03.stdout:8/456: write d7/df/d1a/d40/f76 [444903,84886] 0
2026-03-09T00:03:57.586 INFO:tasks.workunit.client.0.vm03.stdout:8/457: truncate d7/df/f55 168178 0
2026-03-09T00:03:57.587 INFO:tasks.workunit.client.1.vm06.stdout:6/711: dwrite d4/d27/d3e/d57/f65 [0,4194304] 0
2026-03-09T00:03:57.592 INFO:tasks.workunit.client.1.vm06.stdout:9/576: dwrite d1/d3/d2b/f33 [0,4194304] 0
2026-03-09T00:03:57.593 INFO:tasks.workunit.client.0.vm03.stdout:9/466: mknod d15/c98 0
2026-03-09T00:03:57.593 INFO:tasks.workunit.client.1.vm06.stdout:6/712: dread d4/d27/f70 [0,4194304] 0
2026-03-09T00:03:57.593 INFO:tasks.workunit.client.1.vm06.stdout:6/713: chown d4/d16/f63 21952995 1
2026-03-09T00:03:57.597 INFO:tasks.workunit.client.0.vm03.stdout:4/550: creat d7/d6f/fb2 x:0 0 0
2026-03-09T00:03:57.601 INFO:tasks.workunit.client.0.vm03.stdout:1/547: dread d4/fa0 [0,4194304] 0
2026-03-09T00:03:57.604 INFO:tasks.workunit.client.1.vm06.stdout:9/577: dread d1/d4/d6e/d14/d25/f4e [0,4194304] 0
2026-03-09T00:03:57.604 INFO:tasks.workunit.client.1.vm06.stdout:9/578: write d1/d3/d4f/d52/fa5 [154645,121626] 0
2026-03-09T00:03:57.604 INFO:tasks.workunit.client.1.vm06.stdout:6/714: dread d4/d27/d3e/f55 [0,4194304] 0
2026-03-09T00:03:57.606 INFO:tasks.workunit.client.1.vm06.stdout:8/676: symlink db/dd/d24/ldf 0
2026-03-09T00:03:57.609 INFO:tasks.workunit.client.0.vm03.stdout:8/458: fsync d7/df/d1a/d40/f5e 0
2026-03-09T00:03:57.609 INFO:tasks.workunit.client.0.vm03.stdout:8/459: fsync d7/df/d1a/d40/d58/f8c 0
2026-03-09T00:03:57.609 INFO:tasks.workunit.client.0.vm03.stdout:8/460: chown d7/df/l70 0 1
2026-03-09T00:03:57.618 INFO:tasks.workunit.client.0.vm03.stdout:5/462: truncate d1c/d20/f25 2797393 0
2026-03-09T00:03:57.620 INFO:tasks.workunit.client.0.vm03.stdout:6/423: rmdir d13/d35/d71 39
2026-03-09T00:03:57.620 INFO:tasks.workunit.client.0.vm03.stdout:3/330: chown d2/db/f15 1210393 1
2026-03-09T00:03:57.620 INFO:tasks.workunit.client.0.vm03.stdout:6/424: read f10 [1658220,55393] 0
2026-03-09T00:03:57.620 INFO:tasks.workunit.client.0.vm03.stdout:6/425: creat d13/d35/d4c/d62/f9a x:0 0 0
2026-03-09T00:03:57.623 INFO:tasks.workunit.client.0.vm03.stdout:0/426: rename d2/da/dd/d49/d6c/f48 to d2/da/dd/d49/d6c/d4b/fa0 0
2026-03-09T00:03:57.626 INFO:tasks.workunit.client.0.vm03.stdout:2/407: mknod d8/d1b/d2a/d6b/d50/c80 0
2026-03-09T00:03:57.627 INFO:tasks.workunit.client.0.vm03.stdout:7/420: mkdir d2/d4/d1e/d5e/d7e 0
2026-03-09T00:03:57.634 INFO:tasks.workunit.client.0.vm03.stdout:9/467: mkdir d15/d1c/d21/d99 0
2026-03-09T00:03:57.641 INFO:tasks.workunit.client.0.vm03.stdout:4/551: mkdir d7/d20/db3 0
2026-03-09T00:03:57.641 INFO:tasks.workunit.client.0.vm03.stdout:4/552: creat d7/d20/d29/d38/fb4 x:0 0 0
2026-03-09T00:03:57.642 INFO:tasks.workunit.client.0.vm03.stdout:4/553: write d7/f7e [1441253,34087] 0
2026-03-09T00:03:57.642 INFO:tasks.workunit.client.0.vm03.stdout:9/468: dread d15/d1c/d28/f5b [0,4194304] 0
2026-03-09T00:03:57.644 INFO:tasks.workunit.client.1.vm06.stdout:9/579: dwrite d1/d4/d6e/fa9 [4194304,4194304] 0
2026-03-09T00:03:57.645 INFO:tasks.workunit.client.0.vm03.stdout:4/554: dread d7/d20/d29/d38/f6e [0,4194304] 0
2026-03-09T00:03:57.646 INFO:tasks.workunit.client.1.vm06.stdout:7/710: dwrite d0/df/d1a/d3a/d4e/d5e/f93 [0,4194304] 0
2026-03-09T00:03:57.646 INFO:tasks.workunit.client.1.vm06.stdout:7/711: dread - d0/df/d1a/d3a/f84 zero size
2026-03-09T00:03:57.646 INFO:tasks.workunit.client.1.vm06.stdout:7/712: truncate d0/df/d1a/d3a/f83 190712 0
2026-03-09T00:03:57.646 INFO:tasks.workunit.client.1.vm06.stdout:7/713: fsync d0/df/d17/f74 0
2026-03-09T00:03:57.650 INFO:tasks.workunit.client.0.vm03.stdout:1/548: creat d4/d15/d77/fbd x:0 0 0
2026-03-09T00:03:57.650 INFO:tasks.workunit.client.0.vm03.stdout:1/549: chown d4/d15/d77/f7c 836485 1
2026-03-09T00:03:57.650 INFO:tasks.workunit.client.0.vm03.stdout:1/550: dread - d4/d3a/d61/da6/fa7 zero size
2026-03-09T00:03:57.653 INFO:tasks.workunit.client.1.vm06.stdout:6/715: mknod d4/d16/d53/cdc 0
2026-03-09T00:03:57.658 INFO:tasks.workunit.client.1.vm06.stdout:8/677: symlink db/le0 0
2026-03-09T00:03:57.691 INFO:tasks.workunit.client.1.vm06.stdout:0/728: rename d3/d18/d79 to d3/d18/d1f/d39/d3b/df9 0
2026-03-09T00:03:57.691 INFO:tasks.workunit.client.0.vm03.stdout:3/331: creat d2/db/d3b/f62 x:0 0 0
2026-03-09T00:03:57.692 INFO:tasks.workunit.client.1.vm06.stdout:9/580: mknod d1/d4/d6e/d14/cbc 0
2026-03-09T00:03:57.692 INFO:tasks.workunit.client.1.vm06.stdout:9/581: stat d1/d3/d4f/d52 0
2026-03-09T00:03:57.692 INFO:tasks.workunit.client.1.vm06.stdout:9/582: truncate d1/d4/d6e/d9/f82 1075185 0
2026-03-09T00:03:57.694 INFO:tasks.workunit.client.0.vm03.stdout:6/426: mknod d13/d35/d71/d97/c9b 0
2026-03-09T00:03:57.696 INFO:tasks.workunit.client.1.vm06.stdout:7/714: mkdir d0/df/d1a/d3a/d4e/d5e/dc8/dcd 0
2026-03-09T00:03:57.696 INFO:tasks.workunit.client.1.vm06.stdout:7/715: read - d0/df/d1a/d27/d4c/fb0 zero size
2026-03-09T00:03:57.696 INFO:tasks.workunit.client.0.vm03.stdout:0/427: creat d2/da/d76/fa1 x:0 0 0
2026-03-09T00:03:57.696 INFO:tasks.workunit.client.0.vm03.stdout:0/428: dread - d2/da/dd/d49/d6c/f57 zero size
2026-03-09T00:03:57.696 INFO:tasks.workunit.client.0.vm03.stdout:0/429: fdatasync f0 0
2026-03-09T00:03:57.696 INFO:tasks.workunit.client.0.vm03.stdout:0/430: truncate d2/da/dd/d49/f87 82504 0
2026-03-09T00:03:57.697 INFO:tasks.workunit.client.1.vm06.stdout:6/716: rmdir d4/d27/d3e/d45 39
2026-03-09T00:03:57.698 INFO:tasks.workunit.client.0.vm03.stdout:5/463: rename d1c/f37 to d1c/d20/d55/d66/d6b/d8f/f98 0
2026-03-09T00:03:57.699 INFO:tasks.workunit.client.0.vm03.stdout:7/421: truncate d2/d4/d1e/d5e/d6c/f3f 3496298 0
2026-03-09T00:03:57.699 INFO:tasks.workunit.client.0.vm03.stdout:7/422: dread - d2/d1f/d40/f72 zero size
2026-03-09T00:03:57.700 INFO:tasks.workunit.client.1.vm06.stdout:8/678: getdents db/d53/d70/d38/d47 0
2026-03-09T00:03:57.701 INFO:tasks.workunit.client.1.vm06.stdout:4/630: rename d17/d24 to d17/d24/d3b/d5e/dd5 22
2026-03-09T00:03:57.701 INFO:tasks.workunit.client.1.vm06.stdout:4/631: write d17/d21/d4c/faf [285534,117295] 0
2026-03-09T00:03:57.708 INFO:tasks.workunit.client.0.vm03.stdout:9/469: creat d15/f9a x:0 0 0
2026-03-09T00:03:57.712 INFO:tasks.workunit.client.1.vm06.stdout:0/729: truncate d3/f51 2234590 0
2026-03-09T00:03:57.712 INFO:tasks.workunit.client.1.vm06.stdout:0/730: dread - d3/d18/de9/d9f/ff1 zero size
2026-03-09T00:03:57.712 INFO:tasks.workunit.client.0.vm03.stdout:4/555: creat d7/d20/d35/fb5 x:0 0 0
2026-03-09T00:03:57.712 INFO:tasks.workunit.client.0.vm03.stdout:1/551: rmdir d4/d3a 39
2026-03-09T00:03:57.712 INFO:tasks.workunit.client.0.vm03.stdout:1/552: fdatasync d4/fb 0
2026-03-09T00:03:57.713 INFO:tasks.workunit.client.1.vm06.stdout:9/583: creat d1/d3/d4f/fbd x:0 0 0
2026-03-09T00:03:57.714 INFO:tasks.workunit.client.1.vm06.stdout:6/717: mkdir d4/d27/d3e/d78/ddd 0
2026-03-09T00:03:57.715 INFO:tasks.workunit.client.0.vm03.stdout:8/461: mkdir d7/df/d1e/d38/d91 0
2026-03-09T00:03:57.719 INFO:tasks.workunit.client.1.vm06.stdout:8/679: creat db/d1e/d46/fe1 x:0 0 0
2026-03-09T00:03:57.719 INFO:tasks.workunit.client.1.vm06.stdout:8/680: creat db/d74/d78/fe2 x:0 0 0
2026-03-09T00:03:57.719 INFO:tasks.workunit.client.1.vm06.stdout:1/615: rename d6/d4c/d71/ccc to d6/d21/d2d/d3b/d87/cd1 0
2026-03-09T00:03:57.720 INFO:tasks.workunit.client.0.vm03.stdout:3/332: link d2/f4e d2/db/d3b/f63 0
2026-03-09T00:03:57.723 INFO:tasks.workunit.client.1.vm06.stdout:8/681: dread db/dd/d24/f6e [0,4194304] 0
2026-03-09T00:03:57.723 INFO:tasks.workunit.client.1.vm06.stdout:4/632: creat d17/d21/d32/fd6 x:0 0 0
2026-03-09T00:03:57.723 INFO:tasks.workunit.client.1.vm06.stdout:4/633: dread - d17/d21/d4c/d66/f7b zero size
2026-03-09T00:03:57.724 INFO:tasks.workunit.client.1.vm06.stdout:6/718: symlink d4/d27/d42/d52/d7d/lde 0
2026-03-09T00:03:57.724 INFO:tasks.workunit.client.0.vm03.stdout:5/464: mknod d1c/d20/c99 0
2026-03-09T00:03:57.725 INFO:tasks.workunit.client.0.vm03.stdout:5/465: write d1c/d20/d55/f34 [814366,83985] 0
2026-03-09T00:03:57.725 INFO:tasks.workunit.client.0.vm03.stdout:5/466: fsync d1c/f1f 0
2026-03-09T00:03:57.728 INFO:tasks.workunit.client.1.vm06.stdout:3/718: rename d11/d3f/c77 to d11/d28/d2e/d2f/cfb 0
2026-03-09T00:03:57.741 INFO:tasks.workunit.client.1.vm06.stdout:3/719: write d11/f27 [191710,3368] 0
2026-03-09T00:03:57.741 INFO:tasks.workunit.client.1.vm06.stdout:3/720: truncate d11/d28/d57/f7b 2415833 0
2026-03-09T00:03:57.742 INFO:tasks.workunit.client.0.vm03.stdout:9/470: mknod d15/d1c/d21/d64/c9b 0
2026-03-09T00:03:57.742 INFO:tasks.workunit.client.0.vm03.stdout:1/553: unlink d4/d6/l5b 0
2026-03-09T00:03:57.742 INFO:tasks.workunit.client.0.vm03.stdout:1/554: read - d4/d15/f7f zero size
2026-03-09T00:03:57.742 INFO:tasks.workunit.client.0.vm03.stdout:5/467: write d1c/d20/d56/d74/f84 [149025,69297] 0
2026-03-09T00:03:57.742 INFO:tasks.workunit.client.0.vm03.stdout:5/468: truncate d1c/d20/d55/d66/d70/f71 638806 0
2026-03-09T00:03:57.742 INFO:tasks.workunit.client.0.vm03.stdout:2/408: rename d8/d17/f2c to d8/d1b/d2a/d6b/f81 0
2026-03-09T00:03:57.742 INFO:tasks.workunit.client.0.vm03.stdout:1/555: creat d4/d3a/d61/d78/d81/d93/fbe x:0 0 0
2026-03-09T00:03:57.742 INFO:tasks.workunit.client.1.vm06.stdout:1/616: symlink d6/d21/d2d/d37/ld2 0
2026-03-09T00:03:57.742 INFO:tasks.workunit.client.1.vm06.stdout:8/682: rmdir db/d53/d70/d38/d4d/d79 39
2026-03-09T00:03:57.742 INFO:tasks.workunit.client.1.vm06.stdout:6/719: truncate d4/d16/f33 3781116 0
2026-03-09T00:03:57.742 INFO:tasks.workunit.client.1.vm06.stdout:6/720: chown d4/d16/d53/c81 0 1
2026-03-09T00:03:57.742 INFO:tasks.workunit.client.1.vm06.stdout:6/721: fsync d4/f38 0
2026-03-09T00:03:57.742 INFO:tasks.workunit.client.1.vm06.stdout:0/731: dread d3/f1e [0,4194304] 0
2026-03-09T00:03:57.742 INFO:tasks.workunit.client.1.vm06.stdout:0/732: write d3/d18/d1f/f5e [312715,20960] 0
2026-03-09T00:03:57.743 INFO:tasks.workunit.client.1.vm06.stdout:0/733: creat d3/d18/d28/d45/ffa x:0 0 0
2026-03-09T00:03:57.743 INFO:tasks.workunit.client.1.vm06.stdout:1/617: creat d6/d21/d2d/d37/fd3 x:0 0 0
2026-03-09T00:03:57.743 INFO:tasks.workunit.client.1.vm06.stdout:1/618: truncate d6/f98 689734 0
2026-03-09T00:03:57.752 INFO:tasks.workunit.client.0.vm03.stdout:7/423: rename d2/d4/d1e/f63 to d2/d1f/d42/f7f 0
2026-03-09T00:03:57.756 INFO:tasks.workunit.client.1.vm06.stdout:5/774: sync
2026-03-09T00:03:57.756 INFO:tasks.workunit.client.1.vm06.stdout:2/749: sync
2026-03-09T00:03:57.757 INFO:tasks.workunit.client.0.vm03.stdout:5/469: read d1c/d20/d55/d3b/f57 [3675289,30832] 0
2026-03-09T00:03:57.758 INFO:tasks.workunit.client.1.vm06.stdout:4/634: getdents d17/d24/d49/d5f 0
2026-03-09T00:03:57.758 INFO:tasks.workunit.client.1.vm06.stdout:4/635: truncate d17/d21/d4c/f90 315205 0
2026-03-09T00:03:57.767 INFO:tasks.workunit.client.0.vm03.stdout:5/470: write d1c/d20/f33 [1148345,112298] 0
2026-03-09T00:03:57.767 INFO:tasks.workunit.client.0.vm03.stdout:5/471: chown d1c/d20/d55/d4f/d58/d73/d76/d91 395 1
2026-03-09T00:03:57.785 INFO:tasks.workunit.client.1.vm06.stdout:5/775: dread d5/f1d [0,4194304] 0
2026-03-09T00:03:57.786 INFO:tasks.workunit.client.0.vm03.stdout:4/556: rename d7/f22 to d7/d6f/da5/fb6 0
2026-03-09T00:03:57.786 INFO:tasks.workunit.client.0.vm03.stdout:4/557: write d7/f1f [723287,75351] 0
2026-03-09T00:03:57.787 INFO:tasks.workunit.client.0.vm03.stdout:5/472: getdents d1c/d20/d55/d4f 0
2026-03-09T00:03:57.788 INFO:tasks.workunit.client.1.vm06.stdout:8/683: mkdir db/dd/de3 0
2026-03-09T00:03:57.788 INFO:tasks.workunit.client.1.vm06.stdout:8/684: chown db/d74/d78/d98/d9c/fc4 90 1
2026-03-09T00:03:57.795 INFO:tasks.workunit.client.1.vm06.stdout:5/776: unlink d5/d1c/f62 0
2026-03-09T00:03:57.795 INFO:tasks.workunit.client.1.vm06.stdout:5/777: chown d5/d1c/f22 435338 1
2026-03-09T00:03:57.795 INFO:tasks.workunit.client.1.vm06.stdout:3/721: rename d11/c5c to d11/cfc 0
2026-03-09T00:03:57.796 INFO:tasks.workunit.client.0.vm03.stdout:4/558: readlink d7/d20/d29/l57 0
2026-03-09T00:03:57.799 INFO:tasks.workunit.client.1.vm06.stdout:5/778: symlink d5/d1c/d21/d28/d5e/d66/l105 0
2026-03-09T00:03:57.804 INFO:tasks.workunit.client.0.vm03.stdout:3/333: write d2/db/f24 [543189,82330] 0
2026-03-09T00:03:57.809 INFO:tasks.workunit.client.0.vm03.stdout:5/473: unlink d1c/d20/d55/d4f/c60 0
2026-03-09T00:03:57.814 INFO:tasks.workunit.client.0.vm03.stdout:5/474: creat d1c/d20/d56/d74/f9a x:0 0 0
2026-03-09T00:03:57.814 INFO:tasks.workunit.client.0.vm03.stdout:5/475: stat fb 0
2026-03-09T00:03:57.814 INFO:tasks.workunit.client.0.vm03.stdout:5/476: creat d1c/d20/d55/f9b x:0 0 0
2026-03-09T00:03:57.814 INFO:tasks.workunit.client.0.vm03.stdout:5/477: chown d1c/d51/f68 13 1
2026-03-09T00:03:57.815 INFO:tasks.workunit.client.1.vm06.stdout:3/722: mkdir d11/d28/d4d/d89/d90/dd2/dfd 0
2026-03-09T00:03:57.815 INFO:tasks.workunit.client.1.vm06.stdout:5/779: creat d5/d44/d4b/d92/d49/f106 x:0 0 0
2026-03-09T00:03:57.815 INFO:tasks.workunit.client.1.vm06.stdout:5/780: stat d5/d44/d84/c8d 0
2026-03-09T00:03:57.822 INFO:tasks.workunit.client.1.vm06.stdout:5/781: mkdir d5/d1c/d68/da2/d107 0
2026-03-09T00:03:57.822 INFO:tasks.workunit.client.0.vm03.stdout:8/462: rename d7/df/d6b to d7/d92 0
2026-03-09T00:03:57.822 INFO:tasks.workunit.client.0.vm03.stdout:8/463: chown d7/df/d1a/d2b/c5d 7967 1
2026-03-09T00:03:57.822 INFO:tasks.workunit.client.0.vm03.stdout:8/464: fsync d7/df/d1a/d2b/f72 0
2026-03-09T00:03:57.826 INFO:tasks.workunit.client.0.vm03.stdout:6/427: dwrite d13/d35/f68 [0,4194304] 0
2026-03-09T00:03:57.826 INFO:tasks.workunit.client.0.vm03.stdout:6/428: write d13/d35/d4c/f4f [146281,48356] 0
2026-03-09T00:03:57.826 INFO:tasks.workunit.client.0.vm03.stdout:6/429: dread - d13/d1e/d44/d4a/d52/f7a zero size
2026-03-09T00:03:57.832 INFO:tasks.workunit.client.0.vm03.stdout:4/559: rename d7/d20/d29/d38/d3a/d90 to d7/d20/d6a/d77/db7 0
2026-03-09T00:03:57.832 INFO:tasks.workunit.client.0.vm03.stdout:4/560: stat d7/d20/d29/fa0 0
2026-03-09T00:03:57.832 INFO:tasks.workunit.client.0.vm03.stdout:4/561: dread - d7/f5d zero size
2026-03-09T00:03:57.832 INFO:tasks.workunit.client.1.vm06.stdout:4/636: write d17/f35 [7963340,38289] 0
2026-03-09T00:03:57.832 INFO:tasks.workunit.client.0.vm03.stdout:8/465: rename d7/df/d1e/d3f/f8e to d7/df/d1a/f93 0
2026-03-09T00:03:57.832 INFO:tasks.workunit.client.0.vm03.stdout:8/466: readlink d7/df/d1a/d2b/l35 0
2026-03-09T00:03:57.834 INFO:tasks.workunit.client.0.vm03.stdout:4/562: link d7/d20/d29/d4e/f9d d7/d20/d6a/d77/d25/fb8 0
2026-03-09T00:03:57.837 INFO:tasks.workunit.client.0.vm03.stdout:8/467: rename d7/d92/c7c to d7/df/d1a/d40/d58/c94 0
2026-03-09T00:03:57.839 INFO:tasks.workunit.client.0.vm03.stdout:4/563: write d7/d20/d35/d66/f69 [987047,100685] 0
2026-03-09T00:03:57.848 INFO:tasks.workunit.client.1.vm06.stdout:4/637: getdents d17/d24/d3b/dbf 0
2026-03-09T00:03:57.848 INFO:tasks.workunit.client.0.vm03.stdout:8/468: mkdir d7/df/d1e/d3f/d95 0
2026-03-09T00:03:57.848 INFO:tasks.workunit.client.0.vm03.stdout:8/469: link d7/df/d1a/d40/c84 d7/df/d1e/c96 0
2026-03-09T00:03:57.848 INFO:tasks.workunit.client.0.vm03.stdout:8/470: creat d7/df/d1e/d38/d4c/f97 x:0 0 0
2026-03-09T00:03:57.848 INFO:tasks.workunit.client.0.vm03.stdout:8/471: mkdir d7/df/d1e/d38/d4c/d98 0
2026-03-09T00:03:57.849 INFO:tasks.workunit.client.0.vm03.stdout:8/472: mknod d7/df/d1e/d3f/d95/c99 0
2026-03-09T00:03:57.851 INFO:tasks.workunit.client.1.vm06.stdout:4/638: read d17/d24/d3b/d75/f9e [203649,34859] 0
2026-03-09T00:03:57.851 INFO:tasks.workunit.client.1.vm06.stdout:4/639: fdatasync d17/d21/d4c/dc2/fcd 0
2026-03-09T00:03:57.851 INFO:tasks.workunit.client.0.vm03.stdout:8/473: symlink d7/df/d1e/d38/d91/l9a 0
2026-03-09T00:03:57.851 INFO:tasks.workunit.client.0.vm03.stdout:8/474: fdatasync d7/f67 0
2026-03-09T00:03:57.851 INFO:tasks.workunit.client.0.vm03.stdout:8/475: write d7/df/d1a/d40/d58/f7a [311990,7058] 0
2026-03-09T00:03:57.851 INFO:tasks.workunit.client.0.vm03.stdout:8/476: write d7/df/d1a/f93 [1031004,37672] 0
2026-03-09T00:03:57.867 INFO:tasks.workunit.client.1.vm06.stdout:4/640: unlink d17/l3c 0
2026-03-09T00:03:57.871 INFO:tasks.workunit.client.1.vm06.stdout:4/641: unlink d17/d21/f5d 0
2026-03-09T00:03:57.874 INFO:tasks.workunit.client.1.vm06.stdout:5/782: dread d5/d1c/d23/d34/d47/f87 [0,4194304] 0
2026-03-09T00:03:57.874 INFO:tasks.workunit.client.1.vm06.stdout:5/783: fsync d5/d1c/d21/d28/f56 0
2026-03-09T00:03:57.878 INFO:tasks.workunit.client.1.vm06.stdout:5/784: link d5/d1c/d21/d28/f3b d5/d44/f108 0
2026-03-09T00:03:57.883 INFO:tasks.workunit.client.1.vm06.stdout:5/785: dread d5/d44/d4b/d92/d49/da0/fda [0,4194304] 0
2026-03-09T00:03:57.883 INFO:tasks.workunit.client.1.vm06.stdout:5/786: write d5/d44/d84/dc5/fd2 [1042254,48197] 0
2026-03-09T00:03:57.883 INFO:tasks.workunit.client.1.vm06.stdout:5/787: dread - d5/d44/d4b/d92/d49/f106 zero size
2026-03-09T00:03:57.884 INFO:tasks.workunit.client.1.vm06.stdout:5/788: truncate d5/f43 865195 0
2026-03-09T00:03:57.888 INFO:tasks.workunit.client.0.vm03.stdout:2/409: dwrite d8/f15 [0,4194304] 0
2026-03-09T00:03:57.898 INFO:tasks.workunit.client.1.vm06.stdout:9/584: dwrite d1/d4/d6e/d14/d25/d85/d49/f69 [0,4194304] 0
2026-03-09T00:03:57.898 INFO:tasks.workunit.client.0.vm03.stdout:2/410: rmdir d8/d1b/d2a/d2e 39
2026-03-09T00:03:57.898 INFO:tasks.workunit.client.0.vm03.stdout:2/411: rename d8/d1b/d24/f41 to d8/d1b/d24/f82 0
2026-03-09T00:03:57.898 INFO:tasks.workunit.client.0.vm03.stdout:2/412: fdatasync d8/d1b/f31 0
2026-03-09T00:03:57.899 INFO:tasks.workunit.client.1.vm06.stdout:1/619: dwrite d6/d4c/fbe [0,4194304] 0
2026-03-09T00:03:57.900 INFO:tasks.workunit.client.1.vm06.stdout:1/620: dread d6/f98 [0,4194304] 0
2026-03-09T00:03:57.900 INFO:tasks.workunit.client.0.vm03.stdout:0/431: dwrite d2/da/dd/d49/d6c/f3b [0,4194304] 0
2026-03-09T00:03:57.900 INFO:tasks.workunit.client.0.vm03.stdout:0/432: rename d2 to d2/da/dd/da2 22
2026-03-09T00:03:57.900 INFO:tasks.workunit.client.0.vm03.stdout:0/433: chown d2/da/dd/d49/d6c/d4b/f88 235728247 1
2026-03-09T00:03:57.903 INFO:tasks.workunit.client.0.vm03.stdout:9/471: dwrite d15/d1c/d36/f78 [0,4194304] 0
2026-03-09T00:03:57.904 INFO:tasks.workunit.client.1.vm06.stdout:1/621: dread d6/f1d [0,4194304] 0
2026-03-09T00:03:57.907 INFO:tasks.workunit.client.0.vm03.stdout:2/413: mknod d8/d26/d5e/d5f/c83 0
2026-03-09T00:03:57.912 INFO:tasks.workunit.client.0.vm03.stdout:2/414: chown d8/d1b/d2a/d2e/l7d 518359696 1
2026-03-09T00:03:57.912 INFO:tasks.workunit.client.0.vm03.stdout:2/415: dread - d8/d1b/d24/f66 zero size
2026-03-09T00:03:57.912 INFO:tasks.workunit.client.0.vm03.stdout:2/416: write d8/d17/f68 [1901333,123484] 0
2026-03-09T00:03:57.917 INFO:tasks.workunit.client.1.vm06.stdout:4/642: getdents d17/d21/d32 0
2026-03-09T00:03:57.917 INFO:tasks.workunit.client.1.vm06.stdout:4/643: chown d17/d24/d3b/d97/fb5 2264826 1
2026-03-09T00:03:57.917 INFO:tasks.workunit.client.1.vm06.stdout:4/644: chown d17/d24/d3b/d97/lc5 11972712 1
2026-03-09T00:03:57.921 INFO:tasks.workunit.client.1.vm06.stdout:4/645: read d17/d24/f31 [2426530,125046] 0
2026-03-09T00:03:57.926 INFO:tasks.workunit.client.0.vm03.stdout:7/424: dwrite d2/d1f/d40/d67/f64 [0,4194304] 0
2026-03-09T00:03:57.928 INFO:tasks.workunit.client.0.vm03.stdout:0/434: rmdir d2/da/d76/d8a 39
2026-03-09T00:03:57.928 INFO:tasks.workunit.client.0.vm03.stdout:0/435: chown d2/da/dd/d49/d6c/d4b/f4c 55066385 1
2026-03-09T00:03:57.932 INFO:tasks.workunit.client.0.vm03.stdout:9/472: write d15/d1c/d21/f25 [917459,121127] 0
2026-03-09T00:03:57.937 INFO:tasks.workunit.client.1.vm06.stdout:4/646: symlink d17/d24/d49/d5f/ld7 0
2026-03-09T00:03:57.939 INFO:tasks.workunit.client.1.vm06.stdout:4/647: dread d17/d24/d49/d5f/f76 [0,4194304] 0
2026-03-09T00:03:57.942 INFO:tasks.workunit.client.0.vm03.stdout:3/334: write d2/db/f10 [6466093,16265] 0
2026-03-09T00:03:57.942 INFO:tasks.workunit.client.0.vm03.stdout:2/417: mknod d8/d1b/d6c/c84 0
2026-03-09T00:03:57.942 INFO:tasks.workunit.client.0.vm03.stdout:2/418: write d8/d17/f27 [5550908,51685] 0
2026-03-09T00:03:57.949 INFO:tasks.workunit.client.1.vm06.stdout:4/648: unlink d17/d21/d32/d92/cbc 0
2026-03-09T00:03:57.954 INFO:tasks.workunit.client.1.vm06.stdout:4/649: creat d17/d5b/dac/fd8 x:0 0 0
2026-03-09T00:03:57.954 INFO:tasks.workunit.client.0.vm03.stdout:6/430: dwrite d13/d1e/d44/d59/f6c [0,4194304] 0
2026-03-09T00:03:57.954 INFO:tasks.workunit.client.0.vm03.stdout:6/431: fsync d13/d1e/d44/d59/f6c 0
2026-03-09T00:03:57.955 INFO:tasks.workunit.client.0.vm03.stdout:5/478: dread d1c/d20/f33 [0,4194304] 0
2026-03-09T00:03:57.961 INFO:tasks.workunit.client.0.vm03.stdout:5/479: read d1c/d20/d55/f61 [2411991,951] 0
2026-03-09T00:03:57.964 INFO:tasks.workunit.client.1.vm06.stdout:4/650: rename d17/d24/d3b/d54 to d17/d21/d4c/d66/dd9 0
2026-03-09T00:03:57.965 INFO:tasks.workunit.client.0.vm03.stdout:7/425: mkdir d2/d4/d1e/d5e/d6c/d80 0
2026-03-09T00:03:57.965 INFO:tasks.workunit.client.0.vm03.stdout:0/436: creat d2/da/d76/d8a/fa3 x:0 0 0
2026-03-09T00:03:57.968 INFO:tasks.workunit.client.1.vm06.stdout:4/651: dread d17/d24/d49/f5a [0,4194304] 0
2026-03-09T00:03:57.968 INFO:tasks.workunit.client.0.vm03.stdout:2/419: creat d8/d26/f85 x:0 0 0
2026-03-09T00:03:57.969 INFO:tasks.workunit.client.1.vm06.stdout:2/750: dwrite d7/d1a/d3c/f4d [0,4194304] 0
2026-03-09T00:03:57.969 INFO:tasks.workunit.client.1.vm06.stdout:2/751: readlink d7/d1a/d25/d97/la1 0
2026-03-09T00:03:57.969 INFO:tasks.workunit.client.1.vm06.stdout:2/752: creat d7/d1b/d71/d79/db4/dc1/d86/fe1 x:0 0 0
2026-03-09T00:03:57.969 INFO:tasks.workunit.client.1.vm06.stdout:2/753: chown d7/da/l2e 952644 1
2026-03-09T00:03:57.970 INFO:tasks.workunit.client.1.vm06.stdout:4/652: symlink d17/d24/d3b/d97/lda 0
2026-03-09T00:03:57.973 INFO:tasks.workunit.client.0.vm03.stdout:7/426: mkdir d2/d1f/d42/d46/d81 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.0.vm03.stdout:3/335: rename d2/fc to d2/db/f64 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.0.vm03.stdout:3/336: readlink d2/db/l38 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.0.vm03.stdout:2/420: creat d8/d1b/d24/f86 x:0 0 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.0.vm03.stdout:2/421: write d8/d17/f1c [5062745,58855] 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.0.vm03.stdout:7/427: unlink d2/d1f/d35/l5c 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.0.vm03.stdout:0/437: unlink d2/da/d1a/d9f/l2e 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.0.vm03.stdout:3/337: chown d2/db/c32 848 1
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.0.vm03.stdout:3/338: chown d2/db/d40/d51/c5b 403 1
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.1.vm06.stdout:2/754: link d7/da/l15 d7/d1a/d96/le2 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.1.vm06.stdout:2/755: write d7/da/db/de/f11 [3847243,53886] 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.1.vm06.stdout:2/756: dread d7/d1b/d71/d79/db4/dc1/f5e [0,4194304] 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.1.vm06.stdout:4/653: mknod d17/d24/d3b/d5e/d6e/db0/cdb 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.1.vm06.stdout:4/654: mknod d17/cdc 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.1.vm06.stdout:4/655: write d17/d21/d4c/f87 [133100,21233] 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.1.vm06.stdout:2/757: getdents d7/da/db/de 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.1.vm06.stdout:4/656: mkdir d17/d24/d3b/d5e/d6e/db0/ddd 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.1.vm06.stdout:2/758: readlink d7/l41 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.1.vm06.stdout:4/657: creat d17/d24/d3b/d5e/d7a/fde x:0 0 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.1.vm06.stdout:4/658: write d17/d21/f38 [1383456,48581] 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.1.vm06.stdout:4/659: stat d17/d5b/dac/fd8 0
2026-03-09T00:03:57.989 INFO:tasks.workunit.client.1.vm06.stdout:2/759: creat d7/d1a/d25/d66/d87/da8/fe3 x:0 0 0
2026-03-09T00:03:57.991 INFO:tasks.workunit.client.0.vm03.stdout:0/438: dread d2/da/d1a/f1c [0,4194304] 0
2026-03-09T00:03:57.995 INFO:tasks.workunit.client.0.vm03.stdout:8/477: dwrite d7/df/d1a/d40/f4d [0,4194304] 0
2026-03-09T00:03:58.000 INFO:tasks.workunit.client.0.vm03.stdout:3/339: read f1 [686325,26329] 0
2026-03-09T00:03:58.001 INFO:tasks.workunit.client.0.vm03.stdout:2/422: link d8/d26/d5e/d5f/f48 d8/d1b/d2a/d6b/f87 0
2026-03-09T00:03:58.001 INFO:tasks.workunit.client.0.vm03.stdout:2/423: write d8/d1b/f3d [1013841,26888] 0
2026-03-09T00:03:58.001 INFO:tasks.workunit.client.0.vm03.stdout:2/424: write d8/d1b/f30 [4913869,98734] 0
2026-03-09T00:03:58.003 INFO:tasks.workunit.client.0.vm03.stdout:0/439: read d2/da/dd/d49/d6c/f3b [560402,111971] 0
2026-03-09T00:03:58.009 INFO:tasks.workunit.client.0.vm03.stdout:6/432: dwrite d13/d1e/f28 [0,4194304] 0
2026-03-09T00:03:58.009 INFO:tasks.workunit.client.0.vm03.stdout:6/433: read d13/d1e/f2d [1723211,59193] 0
2026-03-09T00:03:58.016 INFO:tasks.workunit.client.0.vm03.stdout:8/478: link d7/f67 d7/f9b 0
2026-03-09T00:03:58.024 INFO:tasks.workunit.client.0.vm03.stdout:6/434: symlink d13/d35/d71/d97/l9c 0
2026-03-09T00:03:58.027 INFO:tasks.workunit.client.0.vm03.stdout:6/435: mkdir d13/d35/d74/d89/d9d 0
2026-03-09T00:03:58.027 INFO:tasks.workunit.client.0.vm03.stdout:6/436: truncate d13/d1e/d44/d59/d77/f98 798400 0
2026-03-09T00:03:58.027 INFO:tasks.workunit.client.0.vm03.stdout:6/437: creat d13/d35/f9e x:0 0 0
2026-03-09T00:03:58.030 INFO:tasks.workunit.client.1.vm06.stdout:8/685: dwrite db/d53/d70/d38/d4d/f65 [4194304,4194304] 0
2026-03-09T00:03:58.047 INFO:tasks.workunit.client.1.vm06.stdout:8/686: creat db/dd/d84/fe4 x:0 0 0
2026-03-09T00:03:58.048 INFO:tasks.workunit.client.0.vm03.stdout:6/438: rmdir d13/d35/d4c 39
2026-03-09T00:03:58.048 INFO:tasks.workunit.client.0.vm03.stdout:6/439: chown d13/l2a 754 1
2026-03-09T00:03:58.049 INFO:tasks.workunit.client.0.vm03.stdout:6/440: link d13/d35/d71/f87 d13/d1e/f9f 0
2026-03-09T00:03:58.049 INFO:tasks.workunit.client.1.vm06.stdout:8/687: write db/d1e/f82 [1697161,418] 0
2026-03-09T00:03:58.051 INFO:tasks.workunit.client.1.vm06.stdout:8/688: getdents db/d74/d78 0
2026-03-09T00:03:58.107 INFO:tasks.workunit.client.0.vm03.stdout:9/473: dwrite d15/d1c/d36/d4d/f91 [0,4194304] 0
2026-03-09T00:03:58.107 INFO:tasks.workunit.client.1.vm06.stdout:5/789: dwrite d5/d44/d4b/ffb [0,4194304] 0
2026-03-09T00:03:58.107 INFO:tasks.workunit.client.1.vm06.stdout:5/790: fsync d5/d1c/d23/d34/d47/fbd 0
2026-03-09T00:03:58.108 INFO:tasks.workunit.client.0.vm03.stdout:9/474: rename d15/d1c/d8d to d15/d1c/d9c 0
2026-03-09T00:03:58.112 INFO:tasks.workunit.client.1.vm06.stdout:5/791: rename d5/d1c/d21/d28/d5e/c71 to d5/d1c/d23/d34/c109 0
2026-03-09T00:03:58.129 INFO:tasks.workunit.client.0.vm03.stdout:9/475: dread fc [0,4194304] 0
2026-03-09T00:03:58.130 INFO:tasks.workunit.client.0.vm03.stdout:9/476: write f8 [2245159,13600] 0
2026-03-09T00:03:58.130 INFO:tasks.workunit.client.0.vm03.stdout:9/477: write d15/d1c/d36/f4a [716396,91858] 0
2026-03-09T00:03:58.130 INFO:tasks.workunit.client.0.vm03.stdout:9/478: write d15/d1c/d21/d54/d87/d93/f7e [4247389,92089] 0
2026-03-09T00:03:58.130 INFO:tasks.workunit.client.0.vm03.stdout:9/479: unlink d15/d1c/d28/d6e/c8b 0
2026-03-09T00:03:58.130 INFO:tasks.workunit.client.1.vm06.stdout:5/792: symlink d5/d1c/d68/dec/l10a 0
2026-03-09T00:03:58.130 INFO:tasks.workunit.client.1.vm06.stdout:5/793: readlink d5/d44/d84/laf 0
2026-03-09T00:03:58.130 INFO:tasks.workunit.client.1.vm06.stdout:5/794: dread - d5/d44/d4b/d92/d49/da0/ffc zero size
2026-03-09T00:03:58.130 INFO:tasks.workunit.client.1.vm06.stdout:5/795: link d5/d44/d4b/d92/f46 d5/f10b 0
2026-03-09T00:03:58.130 INFO:tasks.workunit.client.1.vm06.stdout:5/796: creat d5/d1c/d68/da2/d107/f10c x:0 0 0
2026-03-09T00:03:58.130 INFO:tasks.workunit.client.1.vm06.stdout:5/797: creat d5/d1c/d21/d28/d5e/f10d x:0 0 0
2026-03-09T00:03:58.130 INFO:tasks.workunit.client.1.vm06.stdout:5/798: creat d5/d1c/d21/d28/d5e/f10e x:0 0 0
2026-03-09T00:03:58.133 INFO:tasks.workunit.client.1.vm06.stdout:5/799: getdents d5/d44/d4b/d92/d95 0
2026-03-09T00:03:58.137 INFO:tasks.workunit.client.1.vm06.stdout:5/800: dread d5/f8e [0,4194304] 0
2026-03-09T00:03:58.151 INFO:tasks.workunit.client.1.vm06.stdout:5/801: dread d5/d44/d4b/ffb [0,4194304] 0
2026-03-09T00:03:58.153 INFO:tasks.workunit.client.1.vm06.stdout:5/802: mknod d5/d1c/d21/d28/d5e/d66/d78/dc8/c10f 0
2026-03-09T00:03:58.155 INFO:tasks.workunit.client.1.vm06.stdout:5/803: truncate d5/d1c/f22 1518496 0
2026-03-09T00:03:58.155 INFO:tasks.workunit.client.1.vm06.stdout:5/804: write d5/f3d [522036,12060] 0
2026-03-09T00:03:58.156 INFO:tasks.workunit.client.1.vm06.stdout:5/805: dread d5/d44/d4b/d92/f52 [0,4194304] 0
2026-03-09T00:03:58.157 INFO:tasks.workunit.client.1.vm06.stdout:5/806: symlink d5/d1c/d21/d28/l110 0
2026-03-09T00:03:58.165 INFO:tasks.workunit.client.1.vm06.stdout:5/807: write d5/d44/f4a [1959325,42043] 0
2026-03-09T00:03:58.168 INFO:tasks.workunit.client.1.vm06.stdout:9/585: dwrite d1/d3/d2b/f33 [0,4194304] 0
2026-03-09T00:03:58.168 INFO:tasks.workunit.client.1.vm06.stdout:4/660: dwrite d17/d21/d32/f85 [0,4194304] 0
2026-03-09T00:03:58.170 INFO:tasks.workunit.client.0.vm03.stdout:3/340: dwrite d2/db/d3b/d3f/f46 [0,4194304] 0
2026-03-09T00:03:58.172 INFO:tasks.workunit.client.1.vm06.stdout:5/808: truncate d5/d1c/d68/f3f 1323384 0
2026-03-09T00:03:58.172 INFO:tasks.workunit.client.1.vm06.stdout:5/809: creat d5/d1c/d21/f111 x:0 0 0
2026-03-09T00:03:58.172 INFO:tasks.workunit.client.0.vm03.stdout:3/341: dread d2/db/d2d/f2f [0,4194304] 0
2026-03-09T00:03:58.172 INFO:tasks.workunit.client.0.vm03.stdout:3/342: chown d2/f1d 1619 1
2026-03-09T00:03:58.175 INFO:tasks.workunit.client.0.vm03.stdout:2/425: dwrite d8/d1b/d2a/d6b/d50/f54 [0,4194304] 0
2026-03-09T00:03:58.179 INFO:tasks.workunit.client.1.vm06.stdout:9/586: mknod d1/d4/d6e/d14/cbe 0
2026-03-09T00:03:58.181 INFO:tasks.workunit.client.0.vm03.stdout:3/343: chown d2/db/f26 0 1
2026-03-09T00:03:58.185 INFO:tasks.workunit.client.0.vm03.stdout:8/479: dwrite d7/df/d1a/d2b/f5c [0,4194304] 0
2026-03-09T00:03:58.186 INFO:tasks.workunit.client.1.vm06.stdout:6/722: dwrite d4/d27/d3e/d57/f5c [4194304,4194304] 0
2026-03-09T00:03:58.187 INFO:tasks.workunit.client.0.vm03.stdout:2/426: read d8/d26/f5a [433171,93666] 0
2026-03-09T00:03:58.189 INFO:tasks.workunit.client.1.vm06.stdout:8/689: dwrite db/d53/d7c/d8f/fcc [0,4194304] 0
2026-03-09T00:03:58.197 INFO:tasks.workunit.client.1.vm06.stdout:4/661: mkdir d17/d24/d3b/dbf/ddf 0
2026-03-09T00:03:58.224 INFO:tasks.workunit.client.1.vm06.stdout:5/810: mknod d5/d44/d84/dc5/de8/c112 0
2026-03-09T00:03:58.225 INFO:tasks.workunit.client.0.vm03.stdout:8/480: creat d7/f9c x:0 0 0
2026-03-09T00:03:58.230 INFO:tasks.workunit.client.1.vm06.stdout:8/690: mknod db/d1e/d9b/ce5 0
2026-03-09T00:03:58.230 INFO:tasks.workunit.client.1.vm06.stdout:8/691: dread - db/d53/d70/fcb zero size
2026-03-09T00:03:58.230 INFO:tasks.workunit.client.1.vm06.stdout:8/692: fdatasync db/dd/d48/f4e 0
2026-03-09T00:03:58.232 INFO:tasks.workunit.client.1.vm06.stdout:4/662: creat d17/d24/d3b/d5e/d6e/db0/fe0 x:0 0 0
2026-03-09T00:03:58.236 INFO:tasks.workunit.client.1.vm06.stdout:9/587: dwrite d1/d3/d4f/d91/dae/fb6 [0,4194304] 0
2026-03-09T00:03:58.236 INFO:tasks.workunit.client.1.vm06.stdout:9/588: fsync d1/d4/ff 0
2026-03-09T00:03:58.238 INFO:tasks.workunit.client.1.vm06.stdout:4/663: dread d17/f19 [0,4194304] 0
2026-03-09T00:03:58.238 INFO:tasks.workunit.client.1.vm06.stdout:4/664: creat d17/d21/d4c/dc2/fe1 x:0 0 0
2026-03-09T00:03:58.238 INFO:tasks.workunit.client.1.vm06.stdout:4/665: fsync d17/d24/d3b/d5e/d6e/fc0 0
2026-03-09T00:03:58.238 INFO:tasks.workunit.client.1.vm06.stdout:4/666: write d17/d21/d32/fbd [2466193,130501] 0
2026-03-09T00:03:58.238 INFO:tasks.workunit.client.1.vm06.stdout:4/667: write d17/f35 [9312730,113434] 0
2026-03-09T00:03:58.238 INFO:tasks.workunit.client.1.vm06.stdout:4/668: truncate d17/d21/d4c/d66/f7b 393277 0
2026-03-09T00:03:58.238 INFO:tasks.workunit.client.1.vm06.stdout:6/723: dwrite d4/f2a [0,4194304] 0
2026-03-09T00:03:58.239 INFO:tasks.workunit.client.0.vm03.stdout:8/481: truncate d7/df/d1a/d40/f5e 620542 0
2026-03-09T00:03:58.245 INFO:tasks.workunit.client.0.vm03.stdout:2/427: symlink d8/d1b/d2a/d6b/l88 0
2026-03-09T00:03:58.254 INFO:tasks.workunit.client.1.vm06.stdout:5/811: rename d5/d1c/c93 to d5/d44/d4b/c113 0
2026-03-09T00:03:58.257 INFO:tasks.workunit.client.1.vm06.stdout:9/589: symlink d1/lbf 0
2026-03-09T00:03:58.262 INFO:tasks.workunit.client.1.vm06.stdout:4/669: unlink d17/d21/d4c/d66/d68/cca 0
2026-03-09T00:03:58.263 INFO:tasks.workunit.client.1.vm06.stdout:6/724: rmdir d4 39
2026-03-09T00:03:58.263 INFO:tasks.workunit.client.1.vm06.stdout:4/670: dread d17/d24/fce [0,4194304] 0
2026-03-09T00:03:58.280 INFO:tasks.workunit.client.1.vm06.stdout:5/812: unlink d5/d1c/d23/f54 0
2026-03-09T00:03:58.281 INFO:tasks.workunit.client.1.vm06.stdout:5/813: readlink d5/d1c/d21/d28/d5e/l6e 0
2026-03-09T00:03:58.282 INFO:tasks.workunit.client.1.vm06.stdout:9/590: unlink d1/d4/f54 0
2026-03-09T00:03:58.294 INFO:tasks.workunit.client.1.vm06.stdout:9/591: stat d1/d3/d2b/d58/f5f 0
2026-03-09T00:03:58.294 INFO:tasks.workunit.client.1.vm06.stdout:9/592: fsync d1/da7/fb9 0
2026-03-09T00:03:58.294 INFO:tasks.workunit.client.1.vm06.stdout:9/593: creat d1/da7/fc0 x:0 0 0
2026-03-09T00:03:58.294 INFO:tasks.workunit.client.1.vm06.stdout:9/594: write d1/d4/d6e/d9/f8a [909924,123837] 0
2026-03-09T00:03:58.294 INFO:tasks.workunit.client.0.vm03.stdout:0/440: rename d2/da/d1a/d9f to d2/da/d36/da4 0
2026-03-09T00:03:58.294 INFO:tasks.workunit.client.0.vm03.stdout:0/441: dread d2/f22 [4194304,4194304] 0
2026-03-09T00:03:58.294 INFO:tasks.workunit.client.0.vm03.stdout:8/482: dwrite d7/df/d1a/f33 [0,4194304] 0
2026-03-09T00:03:58.304 INFO:tasks.workunit.client.0.vm03.stdout:6/441: rename d13/d1e/f28 to d13/d35/d4c/d62/fa0 0
2026-03-09T00:03:58.304 INFO:tasks.workunit.client.0.vm03.stdout:6/442: fdatasync d13/f5b 0
2026-03-09T00:03:58.304 INFO:tasks.workunit.client.0.vm03.stdout:6/443: truncate d13/d1e/f30 768601 0
2026-03-09T00:03:58.304 INFO:tasks.workunit.client.0.vm03.stdout:6/444: fdatasync d13/d1e/d44/d4a/f58 0
2026-03-09T00:03:58.304 INFO:tasks.workunit.client.0.vm03.stdout:6/445: truncate d13/d35/d4c/d62/f79 774496 0
2026-03-09T00:03:58.307 INFO:tasks.workunit.client.0.vm03.stdout:0/442: link d2/da/d76/l9e d2/da/d76/d8a/la5 0
2026-03-09T00:03:58.311 INFO:tasks.workunit.client.0.vm03.stdout:6/446: write d13/d1e/d44/d4a/d52/f91 [3611922,62808] 0
2026-03-09T00:03:58.314 INFO:tasks.workunit.client.0.vm03.stdout:0/443: unlink d2/da/d1a/f80 0
2026-03-09T00:03:58.320 INFO:tasks.workunit.client.1.vm06.stdout:9/595: getdents d1/d3/d50 0
2026-03-09T00:03:58.320 INFO:tasks.workunit.client.1.vm06.stdout:9/596: stat d1/d3/d4f/d52 0
2026-03-09T00:03:58.320 INFO:tasks.workunit.client.1.vm06.stdout:6/725: rename d4/d27/d42 to d4/d16/d53/ddf 0
2026-03-09T00:03:58.320 INFO:tasks.workunit.client.1.vm06.stdout:7/716: sync
2026-03-09T00:03:58.323 INFO:tasks.workunit.client.0.vm03.stdout:7/428: write d2/d4/d1e/d5e/d6c/f3f [247481,113352] 0
2026-03-09T00:03:58.324 INFO:tasks.workunit.client.0.vm03.stdout:7/429: dread - d2/d1f/d42/d43/f5f zero size
2026-03-09T00:03:58.326 INFO:tasks.workunit.client.1.vm06.stdout:6/726: getdents d4/d16/d53/ddf/d7e/dac 0
2026-03-09T00:03:58.328 INFO:tasks.workunit.client.1.vm06.stdout:6/727: dread d4/d27/d3e/f44 [0,4194304] 0
2026-03-09T00:03:58.329 INFO:tasks.workunit.client.0.vm03.stdout:7/430: link d2/d4/c17 d2/d1f/d40/d67/d6b/c82 0
2026-03-09T00:03:58.329 INFO:tasks.workunit.client.0.vm03.stdout:7/431: write d2/d1f/d42/d46/f5b [922685,3397] 0
2026-03-09T00:03:58.332 INFO:tasks.workunit.client.1.vm06.stdout:6/728: mknod d4/d16/d53/ddf/d7e/dac/dd3/ce0 0
2026-03-09T00:03:58.336 INFO:tasks.workunit.client.1.vm06.stdout:6/729: read d4/d16/f63 [1788279,77363] 0
2026-03-09T00:03:58.340 INFO:tasks.workunit.client.1.vm06.stdout:6/730: rename d4/d16/d53/ddf/da6/fcc to d4/d16/d53/ddf/d7e/dac/fe1 0
2026-03-09T00:03:58.344 INFO:tasks.workunit.client.1.vm06.stdout:6/731: unlink d4/f2a 0
2026-03-09T00:03:58.358 INFO:tasks.workunit.client.0.vm03.stdout:9/480: rmdir d15/d1c/d28 39
2026-03-09T00:03:58.358 INFO:tasks.workunit.client.0.vm03.stdout:9/481: creat d15/d1c/d21/d54/d87/d93/d74/f9d x:0 0 0
2026-03-09T00:03:58.358 INFO:tasks.workunit.client.0.vm03.stdout:9/482: rmdir d15/d1c/d28/d6e 39
2026-03-09T00:03:58.368 INFO:tasks.workunit.client.0.vm03.stdout:9/483: dread d15/d1c/d36/d4d/f91 [0,4194304] 0
2026-03-09T00:03:58.368 INFO:tasks.workunit.client.0.vm03.stdout:9/484: chown d15/d1c/d21/d99 22 1
2026-03-09T00:03:58.369 INFO:tasks.workunit.client.0.vm03.stdout:9/485: rename d15/d1c/d36/f7a to d15/d1c/d36/f9e 0
2026-03-09T00:03:58.369 INFO:tasks.workunit.client.0.vm03.stdout:9/486: truncate d15/d1c/d21/d64/f50 5117337 0
2026-03-09T00:03:58.369 INFO:tasks.workunit.client.0.vm03.stdout:9/487: chown d15/d1c/d21/d54/f73 310117 1
2026-03-09T00:03:58.369 INFO:tasks.workunit.client.0.vm03.stdout:9/488: stat d15/f17 0
2026-03-09T00:03:58.370 INFO:tasks.workunit.client.0.vm03.stdout:9/489: mknod d15/d1c/d21/c9f 0
2026-03-09T00:03:58.371 INFO:tasks.workunit.client.1.vm06.stdout:4/671: dwrite d17/d21/d32/d92/fa4 [0,4194304] 0
2026-03-09T00:03:58.371 INFO:tasks.workunit.client.0.vm03.stdout:9/490: getdents d15/d1c/d21/d75 0
2026-03-09T00:03:58.372 INFO:tasks.workunit.client.0.vm03.stdout:9/491: mknod d15/d1c/ca0 0
2026-03-09T00:03:58.378 INFO:tasks.workunit.client.0.vm03.stdout:9/492: mknod d15/ca1 0
2026-03-09T00:03:58.379 INFO:tasks.workunit.client.0.vm03.stdout:9/493: readlink d15/d1c/d21/d64/l32 0
2026-03-09T00:03:58.385 INFO:tasks.workunit.client.1.vm06.stdout:0/734: sync
2026-03-09T00:03:58.390 INFO:tasks.workunit.client.1.vm06.stdout:0/735: dread d3/d18/d1f/d39/d3b/df9/f7f [0,4194304] 0
2026-03-09T00:03:58.392 INFO:tasks.workunit.client.1.vm06.stdout:0/736: rmdir d3/d18/de9 39
2026-03-09T00:03:58.396 INFO:tasks.workunit.client.0.vm03.stdout:8/483: dwrite d7/d92/f88 [0,4194304] 0
2026-03-09T00:03:58.397 INFO:tasks.workunit.client.0.vm03.stdout:8/484: unlink d7/f18 0
2026-03-09T00:03:58.397 INFO:tasks.workunit.client.0.vm03.stdout:8/485: fdatasync d7/df/d1e/d3f/f8f 0
2026-03-09T00:03:58.397 INFO:tasks.workunit.client.0.vm03.stdout:8/486: readlink d7/df/d1e/d38/d60/l7b 0
2026-03-09T00:03:58.405 INFO:tasks.workunit.client.1.vm06.stdout:3/723: sync
2026-03-09T00:03:58.405 INFO:tasks.workunit.client.1.vm06.stdout:3/724: truncate d11/d28/d2e/d7e/fdc 4795040 0
2026-03-09T00:03:58.407 INFO:tasks.workunit.client.1.vm06.stdout:3/725: truncate d11/d28/fbf 2667532 0
2026-03-09T00:03:58.413 INFO:tasks.workunit.client.1.vm06.stdout:1/622: sync
2026-03-09T00:03:58.428 INFO:tasks.workunit.client.1.vm06.stdout:2/760: sync
2026-03-09T00:03:58.429 INFO:tasks.workunit.client.1.vm06.stdout:1/623: creat d6/d21/fd4 x:0 0 0
2026-03-09T00:03:58.429 INFO:tasks.workunit.client.1.vm06.stdout:1/624: unlink d6/d4c/d71/f4a 0
2026-03-09T00:03:58.429 INFO:tasks.workunit.client.1.vm06.stdout:2/761: creat d7/da/d63/fe4 x:0 0 0
2026-03-09T00:03:58.429 INFO:tasks.workunit.client.1.vm06.stdout:2/762: fsync d7/da/db/f74 0
2026-03-09T00:03:58.429 INFO:tasks.workunit.client.1.vm06.stdout:2/763: mknod d7/d1a/d25/d66/d87/da8/db2/dc9/ce5 0
2026-03-09T00:03:58.432 INFO:tasks.workunit.client.1.vm06.stdout:2/764: dread d7/da/d1c/f5f [0,4194304] 0
2026-03-09T00:03:58.432 INFO:tasks.workunit.client.1.vm06.stdout:2/765: creat d7/d1b/d71/d79/db4/dc1/fe6 x:0 0 0
2026-03-09T00:03:58.432 INFO:tasks.workunit.client.1.vm06.stdout:2/766: creat d7/d1b/fe7 x:0 0 0
2026-03-09T00:03:58.439 INFO:tasks.workunit.client.1.vm06.stdout:0/737: dread d3/d18/d2c/d2d/d74/fbc [0,4194304] 0
2026-03-09T00:03:58.439 INFO:tasks.workunit.client.1.vm06.stdout:0/738: rename d3/d18/l67 to d3/d18/d1f/d39/lfb 0
2026-03-09T00:03:58.440 INFO:tasks.workunit.client.1.vm06.stdout:0/739: mknod d3/d18/d1f/d44/cfc 0
2026-03-09T00:03:58.455 INFO:tasks.workunit.client.0.vm03.stdout:3/344: dwrite f1 [0,4194304] 0
2026-03-09T00:03:58.455 INFO:tasks.workunit.client.0.vm03.stdout:9/494: dwrite d15/d1c/d21/d54/f73 [0,4194304] 0
2026-03-09T00:03:58.457 INFO:tasks.workunit.client.0.vm03.stdout:3/345: mkdir d2/db/d3b/d5f/d65 0
2026-03-09T00:03:58.461 INFO:tasks.workunit.client.0.vm03.stdout:3/346: unlink d2/db/f17 0
2026-03-09T00:03:58.462 INFO:tasks.workunit.client.0.vm03.stdout:3/347: fdatasync d2/db/f13 0
2026-03-09T00:03:58.463 INFO:tasks.workunit.client.1.vm06.stdout:5/814: dwrite d5/d1c/d23/d34/d47/f87 [0,4194304] 0
2026-03-09T00:03:58.463 INFO:tasks.workunit.client.1.vm06.stdout:5/815: truncate d5/d44/d4b/d92/f46 140722 0
2026-03-09T00:03:58.464 INFO:tasks.workunit.client.0.vm03.stdout:3/348: mknod d2/db/c66 0
2026-03-09T00:03:58.467 INFO:tasks.workunit.client.1.vm06.stdout:4/672: dwrite d17/d24/d49/f65 [0,4194304] 0
2026-03-09T00:03:58.468 INFO:tasks.workunit.client.0.vm03.stdout:6/447: dread fb [0,4194304] 0
2026-03-09T00:03:58.472 INFO:tasks.workunit.client.1.vm06.stdout:6/732: dread d4/f6e [0,4194304] 0
2026-03-09T00:03:58.472 INFO:tasks.workunit.client.1.vm06.stdout:6/733: write d4/d27/fb6 [262987,47800] 0
2026-03-09T00:03:58.474 INFO:tasks.workunit.client.0.vm03.stdout:5/480: dwrite d1c/d20/d55/d43/f53 [0,4194304] 0
2026-03-09T00:03:58.474 INFO:tasks.workunit.client.0.vm03.stdout:6/448: symlink d13/d35/d71/d97/la1 0
2026-03-09T00:03:58.474 INFO:tasks.workunit.client.0.vm03.stdout:6/449: write d13/d35/d74/f81 [756226,42902] 0
2026-03-09T00:03:58.488 INFO:tasks.workunit.client.1.vm06.stdout:8/693: sync
2026-03-09T00:03:58.488 INFO:tasks.workunit.client.1.vm06.stdout:8/694: chown db/d1e/d46/d94/lc5 2438 1
2026-03-09T00:03:58.488 INFO:tasks.workunit.client.1.vm06.stdout:8/695: write db/f3f [726015,115453] 0
2026-03-09T00:03:58.488 INFO:tasks.workunit.client.1.vm06.stdout:4/673: rmdir d17/d24/d3b/d5e/d6e 39
2026-03-09T00:03:58.490 INFO:tasks.workunit.client.0.vm03.stdout:5/481: link d1c/d20/d55/d43/c6e d1c/d20/d55/d4f/d58/c9c 0
2026-03-09T00:03:58.492 INFO:tasks.workunit.client.0.vm03.stdout:5/482: rmdir d1c/d20/d55/d66/d6b/d8f 39
2026-03-09T00:03:58.494 INFO:tasks.workunit.client.1.vm06.stdout:5/816: getdents d5/d44/d4b 0
2026-03-09T00:03:58.494 INFO:tasks.workunit.client.0.vm03.stdout:5/483: symlink d1c/d20/d56/d74/l9d 0
2026-03-09T00:03:58.495 INFO:tasks.workunit.client.0.vm03.stdout:5/484: dread d1c/d20/d55/d3b/f3c [0,4194304] 0
2026-03-09T00:03:58.495 INFO:tasks.workunit.client.1.vm06.stdout:5/817: dread d5/d1c/d68/f3f [0,4194304] 0
2026-03-09T00:03:58.505 INFO:tasks.workunit.client.1.vm06.stdout:8/696: getdents db/d74/d78/d98/db6 0
2026-03-09T00:03:58.509 INFO:tasks.workunit.client.0.vm03.stdout:1/556: sync
2026-03-09T00:03:58.509 INFO:tasks.workunit.client.0.vm03.stdout:4/564: sync
2026-03-09T00:03:58.509 INFO:tasks.workunit.client.0.vm03.stdout:4/565: truncate d7/d20/d29/d54/f96 38800 0
2026-03-09T00:03:58.509 INFO:tasks.workunit.client.0.vm03.stdout:0/444: rmdir d2/da/d76/d8a 39
2026-03-09T00:03:58.522 INFO:tasks.workunit.client.0.vm03.stdout:7/432: truncate d2/d1f/d3a/f19 845024 0
2026-03-09T00:03:58.522 INFO:tasks.workunit.client.0.vm03.stdout:7/433: chown d2/d4/d1e/d5e 833459 1
2026-03-09T00:03:58.522 INFO:tasks.workunit.client.0.vm03.stdout:7/434: fdatasync d2/fc 0
2026-03-09T00:03:58.523 INFO:tasks.workunit.client.1.vm06.stdout:8/697: dread f3 [0,4194304] 0
2026-03-09T00:03:58.525 INFO:tasks.workunit.client.1.vm06.stdout:4/674: symlink d17/d5b/d8f/le2 0
2026-03-09T00:03:58.526 INFO:tasks.workunit.client.0.vm03.stdout:5/485: mkdir d1c/d20/d55/d4f/d58/d73/d9e 0
2026-03-09T00:03:58.526 INFO:tasks.workunit.client.0.vm03.stdout:5/486: read d1c/d20/d55/f34 [49854,19340] 0
2026-03-09T00:03:58.530 INFO:tasks.workunit.client.0.vm03.stdout:9/495: fsync d15/d1c/d36/f9e 0
2026-03-09T00:03:58.530 INFO:tasks.workunit.client.0.vm03.stdout:4/566: link d7/l11 d7/d20/db3/lb9 0
2026-03-09T00:03:58.530 INFO:tasks.workunit.client.0.vm03.stdout:4/567: fdatasync d7/d20/d29/d4e/f9d 0
2026-03-09T00:03:58.531 INFO:tasks.workunit.client.1.vm06.stdout:4/675: getdents d17/d24/d3b/d97 0
2026-03-09T00:03:58.535 INFO:tasks.workunit.client.0.vm03.stdout:7/435: rmdir d2/d1f/d42/d43/d7d 0
2026-03-09T00:03:58.535 INFO:tasks.workunit.client.0.vm03.stdout:7/436: readlink d2/d1f/d3a/l65 0
2026-03-09T00:03:58.546 INFO:tasks.workunit.client.0.vm03.stdout:5/487: mknod d1c/d51/d6a/d75/c9f 0
2026-03-09T00:03:58.546 INFO:tasks.workunit.client.1.vm06.stdout:4/676: write d17/d24/d49/f65 [2307148,124189] 0
2026-03-09T00:03:58.550 INFO:tasks.workunit.client.0.vm03.stdout:5/488: write d1c/d20/d55/d43/f53 [3492768,9290] 0
2026-03-09T00:03:58.567 INFO:tasks.workunit.client.0.vm03.stdout:1/557: getdents d4/d3a/d61/da6 0
2026-03-09T00:03:58.569 INFO:tasks.workunit.client.0.vm03.stdout:1/558: chown d4/d3a/d32/d87/l96 228 1
2026-03-09T00:03:58.569 INFO:tasks.workunit.client.0.vm03.stdout:4/568: creat d7/d20/d6a/fba x:0 0 0
2026-03-09T00:03:58.569 INFO:tasks.workunit.client.0.vm03.stdout:1/559: creat d4/d3a/d43/daf/fbf x:0 0 0
2026-03-09T00:03:58.569 INFO:tasks.workunit.client.0.vm03.stdout:1/560: rename d4/d15/d77/fa9 to d4/d15/d5c/d6c/fc0 0
2026-03-09T00:03:58.569 INFO:tasks.workunit.client.0.vm03.stdout:1/561: symlink d4/d6/d52/db5/lc1 0
2026-03-09T00:03:58.571 INFO:tasks.workunit.client.0.vm03.stdout:3/349: dwrite d2/db/d40/d44/f4d [0,4194304] 0
2026-03-09T00:03:58.571 INFO:tasks.workunit.client.0.vm03.stdout:3/350: fdatasync d2/db/d3b/f3e 0
2026-03-09T00:03:58.571 INFO:tasks.workunit.client.0.vm03.stdout:3/351: stat d2/db/d3b/f62 0
2026-03-09T00:03:58.575 INFO:tasks.workunit.client.0.vm03.stdout:1/562: mkdir d4/d3a/d32/dc2 0
2026-03-09T00:03:58.576 INFO:tasks.workunit.client.0.vm03.stdout:1/563: truncate d4/d15/f45 991781 0
2026-03-09T00:03:58.581 INFO:tasks.workunit.client.1.vm06.stdout:4/677: mkdir d17/d21/d4c/d66/de3 0
2026-03-09T00:03:58.585 INFO:tasks.workunit.client.1.vm06.stdout:4/678: unlink d17/d21/d4c/d66/c9f 0
2026-03-09T00:03:58.585 INFO:tasks.workunit.client.1.vm06.stdout:4/679: write f1 [463074,57863] 0
2026-03-09T00:03:58.587 INFO:tasks.workunit.client.0.vm03.stdout:1/564: mkdir d4/d3a/d61/da6/dc3 0
2026-03-09T00:03:58.611 INFO:tasks.workunit.client.0.vm03.stdout:1/565: creat d4/d3a/d8f/fc4 x:0 0 0
2026-03-09T00:03:58.612 INFO:tasks.workunit.client.0.vm03.stdout:1/566: mknod d4/d3a/d61/d78/d81/cc5 0
2026-03-09T00:03:58.612 INFO:tasks.workunit.client.0.vm03.stdout:1/567: creat d4/d6/d52/db5/fc6 x:0 0 0
2026-03-09T00:03:58.612 INFO:tasks.workunit.client.0.vm03.stdout:1/568: rename d4/d15/d1a/c56 to d4/d15/d1a/cc7 0
2026-03-09T00:03:58.612 INFO:tasks.workunit.client.1.vm06.stdout:4/680: rename d17/d24/d3b/d5e/d6e to d17/d24/d49/de4 0
2026-03-09T00:03:58.612 INFO:tasks.workunit.client.1.vm06.stdout:4/681: write d17/d21/d32/fd6 [1032716,55242] 0
2026-03-09T00:03:58.612 INFO:tasks.workunit.client.1.vm06.stdout:4/682: write d17/d24/f3a [87545,62516] 0
2026-03-09T00:03:58.612 INFO:tasks.workunit.client.1.vm06.stdout:4/683: creat d17/d24/d3b/d5e/fe5 x:0 0 0
2026-03-09T00:03:58.612 INFO:tasks.workunit.client.1.vm06.stdout:4/684: mknod d17/d24/d49/de4/db0/ce6 0
2026-03-09T00:03:58.612 INFO:tasks.workunit.client.1.vm06.stdout:4/685: symlink d17/d24/d3b/d97/db7/le7 0
2026-03-09T00:03:58.612 INFO:tasks.workunit.client.1.vm06.stdout:4/686: read d17/d24/d49/f65 [3711381,76375] 0
2026-03-09T00:03:58.612 INFO:tasks.workunit.client.1.vm06.stdout:4/687: rename d17/d24/cc9 to d17/d21/d4c/dc2/ce8 0
2026-03-09T00:03:58.612 INFO:tasks.workunit.client.1.vm06.stdout:4/688: chown d17/d24/d49/c3e 502997 1
2026-03-09T00:03:58.626 INFO:tasks.workunit.client.1.vm06.stdout:9/597: dwrite d1/d3/d2b/f33 [4194304,4194304] 0
2026-03-09T00:03:58.633 INFO:tasks.workunit.client.1.vm06.stdout:9/598: write d1/f16 [841009,58511] 0
2026-03-09T00:03:58.633 INFO:tasks.workunit.client.1.vm06.stdout:9/599: creat d1/d3/d4f/d52/fc1 x:0 0 0
2026-03-09T00:03:58.633 INFO:tasks.workunit.client.1.vm06.stdout:9/600: creat d1/d73/fc2 x:0 0 0
2026-03-09T00:03:58.634 INFO:tasks.workunit.client.0.vm03.stdout:2/428: dwrite d8/d17/f27 [0,4194304] 0
2026-03-09T00:03:58.634 INFO:tasks.workunit.client.0.vm03.stdout:2/429: chown d8/d1b/d2a/d2e/c4e 6603 1
2026-03-09T00:03:58.634 INFO:tasks.workunit.client.0.vm03.stdout:2/430: chown d8/d1b/d2a/d6b/d50/f54 1414815 1
2026-03-09T00:03:58.639 INFO:tasks.workunit.client.0.vm03.stdout:2/431: rename d8/d1b/d24/f66 to d8/d1b/d2a/d6b/f89 0
2026-03-09T00:03:58.639 INFO:tasks.workunit.client.0.vm03.stdout:2/432: truncate d8/d26/d5e/f64 250524 0
2026-03-09T00:03:58.649 INFO:tasks.workunit.client.0.vm03.stdout:2/433: mkdir d8/d1b/d2a/d6b/d50/d8a 0
2026-03-09T00:03:58.650 INFO:tasks.workunit.client.0.vm03.stdout:2/434: fdatasync d8/d17/f1a 0
2026-03-09T00:03:58.652 INFO:tasks.workunit.client.0.vm03.stdout:2/435: chown d8/le 63205 1
2026-03-09T00:03:58.653 INFO:tasks.workunit.client.0.vm03.stdout:2/436: creat d8/d1b/d2a/d6b/f8b x:0 0 0
2026-03-09T00:03:58.657 INFO:tasks.workunit.client.0.vm03.stdout:2/437: dread d8/d1b/f47 [0,4194304] 0
2026-03-09T00:03:58.660 INFO:tasks.workunit.client.0.vm03.stdout:2/438: creat d8/d1b/d2a/d56/f8c x:0 0 0
2026-03-09T00:03:58.663 INFO:tasks.workunit.client.0.vm03.stdout:2/439: fdatasync f7 0
2026-03-09T00:03:58.665 INFO:tasks.workunit.client.0.vm03.stdout:6/450: dwrite d13/f3a [0,4194304] 0
2026-03-09T00:03:58.669 INFO:tasks.workunit.client.0.vm03.stdout:6/451: unlink d13/c41 0
2026-03-09T00:03:58.686 INFO:tasks.workunit.client.0.vm03.stdout:8/487: dwrite d7/df/d1a/d40/f76 [0,4194304] 0
2026-03-09T00:03:58.686 INFO:tasks.workunit.client.1.vm06.stdout:1/625: dwrite d6/f19 [0,4194304] 0
2026-03-09T00:03:58.686 INFO:tasks.workunit.client.1.vm06.stdout:1/626: write d6/d21/fc1 [325127,1474] 0
2026-03-09T00:03:58.686 INFO:tasks.workunit.client.1.vm06.stdout:1/627: creat d6/d4c/d79/fd5 x:0 0 0
2026-03-09T00:03:58.691 INFO:tasks.workunit.client.1.vm06.stdout:5/818: dwrite d5/d44/d4b/f91 [0,4194304] 0
2026-03-09T00:03:58.693 INFO:tasks.workunit.client.0.vm03.stdout:8/488: dread f6 [0,4194304] 0
2026-03-09T00:03:58.695 INFO:tasks.workunit.client.1.vm06.stdout:1/628: rmdir d6/d21/d2d/dd0 0
2026-03-09T00:03:58.708 INFO:tasks.workunit.client.0.vm03.stdout:8/489: mkdir d7/df/d1a/d40/d9d 0
2026-03-09T00:03:58.708 INFO:tasks.workunit.client.0.vm03.stdout:8/490: rmdir d7/df/d1e/d38/d91 39
2026-03-09T00:03:58.708 INFO:tasks.workunit.client.1.vm06.stdout:5/819: creat d5/d44/d101/f114 x:0 0 0
2026-03-09T00:03:58.708 INFO:tasks.workunit.client.1.vm06.stdout:5/820: write d5/d1c/d21/d28/d5e/d66/d78/fc1 [1010119,45839] 0
2026-03-09T00:03:58.708 INFO:tasks.workunit.client.1.vm06.stdout:1/629: symlink d6/d4c/d51/ld6 0
2026-03-09T00:03:58.708 INFO:tasks.workunit.client.1.vm06.stdout:1/630: stat d6/d21/l60 0
2026-03-09T00:03:58.708 INFO:tasks.workunit.client.1.vm06.stdout:1/631: mkdir d6/d21/d2d/d37/d6d/dd7 0
2026-03-09T00:03:58.708 INFO:tasks.workunit.client.0.vm03.stdout:2/440: fdatasync d8/d17/f68 0
2026-03-09T00:03:58.710 INFO:tasks.workunit.client.1.vm06.stdout:1/632: mkdir d6/d21/d2d/d3b/d87/d9d/dd8 0
2026-03-09T00:03:58.710 INFO:tasks.workunit.client.1.vm06.stdout:1/633: fdatasync d6/d63/f6a 0
2026-03-09T00:03:58.712 INFO:tasks.workunit.client.1.vm06.stdout:1/634: mknod d6/d21/d2d/d3b/d42/cd9 0
2026-03-09T00:03:58.720 INFO:tasks.workunit.client.1.vm06.stdout:8/698: dwrite db/dd/f1c [0,4194304] 0
2026-03-09T00:03:58.720 INFO:tasks.workunit.client.1.vm06.stdout:8/699: creat db/dd/d24/dac/fe6 x:0 0 0
2026-03-09T00:03:58.724 INFO:tasks.workunit.client.0.vm03.stdout:2/441: write d8/f11 [2069406,94690] 0
2026-03-09T00:03:58.744 INFO:tasks.workunit.client.0.vm03.stdout:2/442: creat d8/d1b/f8d x:0 0 0
2026-03-09T00:03:58.744 INFO:tasks.workunit.client.0.vm03.stdout:2/443: mknod d8/d26/d5e/d6f/c8e 0
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.0.vm03.stdout:2/444: chown d8/d1b/d2a/d2e 6 1
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.0.vm03.stdout:2/445: read - d8/d1b/d2a/d56/f8c zero size
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.0.vm03.stdout:2/446: mkdir d8/d1b/d8f 0
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.0.vm03.stdout:2/447: write d8/d1b/d2a/f2d [175727,9410] 0
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.0.vm03.stdout:2/448: fsync d8/d1b/d2a/d6b/d50/f54 0
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.0.vm03.stdout:2/449: creat d8/d1b/d6c/f90 x:0 0 0
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.0.vm03.stdout:2/450: creat d8/d1b/d2a/d6b/d50/f91 x:0 0 0
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.0.vm03.stdout:2/451: fsync d8/d1b/f30 0
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.0.vm03.stdout:2/452: creat d8/d1b/d2a/d6b/f92 x:0 0 0
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.0.vm03.stdout:0/445: dwrite d2/fe [0,4194304] 0
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.1.vm06.stdout:6/734: write d4/d27/f31 [474169,47844] 0
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.1.vm06.stdout:8/700: unlink db/d53/d5c/l6b 0
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.1.vm06.stdout:8/701: chown db/dd/d85 197764 1
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.1.vm06.stdout:6/735: creat d4/d16/d53/d67/fe2 x:0 0 0
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.1.vm06.stdout:8/702: rename db/f1d to db/dd/de3/fe7 0
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.1.vm06.stdout:8/703: dread - db/d53/d70/d38/d4d/db1/fd4 zero size
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.1.vm06.stdout:6/736: link d4/d16/d53/fb7 d4/d16/d53/ddf/d52/fe3 0
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.1.vm06.stdout:8/704: mkdir db/dd/d85/d9f/db7/de8 0
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.1.vm06.stdout:8/705: write db/d53/d70/d38/d4d/d79/fc1 [614206,83907] 0
2026-03-09T00:03:58.745 INFO:tasks.workunit.client.1.vm06.stdout:8/706: creat db/dd/d24/d63/fe9 x:0 0 0
2026-03-09T00:03:58.750 INFO:tasks.workunit.client.0.vm03.stdout:9/496: dwrite d15/d1c/d36/d4d/f5d [0,4194304] 0
2026-03-09T00:03:58.751 INFO:tasks.workunit.client.0.vm03.stdout:9/497: read d15/f23 [597741,61493] 0
2026-03-09T00:03:58.752 INFO:tasks.workunit.client.0.vm03.stdout:2/453: mknod d8/d26/c93 0
2026-03-09T00:03:58.752 INFO:tasks.workunit.client.0.vm03.stdout:2/454: chown d8/d17/f1d 1277 1
2026-03-09T00:03:58.753 INFO:tasks.workunit.client.0.vm03.stdout:2/455: dread d8/d17/f1d [0,4194304] 0
2026-03-09T00:03:58.756 INFO:tasks.workunit.client.0.vm03.stdout:0/446: rename d2/da/dd/d49/d6c/d4b/d55/d9a to d2/da/dd/d49/d6c/da6 0
2026-03-09T00:03:58.767 INFO:tasks.workunit.client.0.vm03.stdout:9/498: mkdir d15/d1c/d28/d6e/da2 0
2026-03-09T00:03:58.767 INFO:tasks.workunit.client.0.vm03.stdout:9/499: creat d15/d7f/fa3 x:0 0 0
2026-03-09T00:03:58.767 INFO:tasks.workunit.client.0.vm03.stdout:2/456: creat d8/d1b/d2a/d2e/f94 x:0 0 0
2026-03-09T00:03:58.767 INFO:tasks.workunit.client.0.vm03.stdout:0/447: rmdir d2/da/dd/d49/d6c/d4b/d55/d6f 39
2026-03-09T00:03:58.767 INFO:tasks.workunit.client.0.vm03.stdout:9/500: creat d15/d1c/d21/d75/fa4 x:0 0 0
2026-03-09T00:03:58.767 INFO:tasks.workunit.client.0.vm03.stdout:2/457: mkdir d8/d26/d5e/d5f/d95 0
2026-03-09T00:03:58.767 INFO:tasks.workunit.client.0.vm03.stdout:2/458: chown d8/d1b/d2a/d6b/d50/d8a 9 1
2026-03-09T00:03:58.767 INFO:tasks.workunit.client.0.vm03.stdout:0/448: mknod d2/d5a/ca7 0
2026-03-09T00:03:58.767 INFO:tasks.workunit.client.0.vm03.stdout:9/501: rmdir d15/d1c/d21/d99 0
2026-03-09T00:03:58.767 INFO:tasks.workunit.client.0.vm03.stdout:9/502: chown d15/d7f 1944554690 1
2026-03-09T00:03:58.767 INFO:tasks.workunit.client.0.vm03.stdout:0/449: symlink d2/da/d76/d8a/la8 0
2026-03-09T00:03:58.774 INFO:tasks.workunit.client.1.vm06.stdout:9/601: dwrite d1/d4/d6e/d14/d25/d85/f90 [0,4194304] 0
2026-03-09T00:03:58.783 INFO:tasks.workunit.client.0.vm03.stdout:7/437: dwrite d2/d1f/d42/d43/f74 [0,4194304] 0
2026-03-09T00:03:58.783 INFO:tasks.workunit.client.0.vm03.stdout:7/438: write d2/d1f/d42/f7f [478853,717] 0
2026-03-09T00:03:58.783 INFO:tasks.workunit.client.0.vm03.stdout:7/439: fdatasync d2/d4/f2e 0
2026-03-09T00:03:58.783 INFO:tasks.workunit.client.0.vm03.stdout:7/440: dread - d2/d1f/d40/f72 zero size
2026-03-09T00:03:58.789 INFO:tasks.workunit.client.1.vm06.stdout:9/602: creat d1/d4/d6e/d9/fc3 x:0 0 0
2026-03-09T00:03:58.796 INFO:tasks.workunit.client.1.vm06.stdout:9/603: creat d1/d3/d4f/d91/fc4 x:0 0 0
2026-03-09T00:03:58.799 INFO:tasks.workunit.client.1.vm06.stdout:9/604: write d1/d4/d6e/fa9 [7950053,44589] 0
2026-03-09T00:03:58.808 INFO:tasks.workunit.client.1.vm06.stdout:0/740: dwrite d3/d18/f14 [0,4194304] 0
2026-03-09T00:03:58.810 INFO:tasks.workunit.client.1.vm06.stdout:0/741: chown d3/d18/d2c/f7e 2 1
2026-03-09T00:03:58.810 INFO:tasks.workunit.client.0.vm03.stdout:3/352: dwrite d2/db/d2d/f52 [0,4194304] 0
2026-03-09T00:03:58.810 INFO:tasks.workunit.client.0.vm03.stdout:3/353: fsync d2/db/d2d/f52 0
2026-03-09T00:03:58.810 INFO:tasks.workunit.client.1.vm06.stdout:5/821: dread d5/d1c/d23/d34/fb2 [0,4194304] 0
2026-03-09T00:03:58.815 INFO:tasks.workunit.client.1.vm06.stdout:0/742: symlink d3/d18/d2c/d2d/d8c/lfd 0
2026-03-09T00:03:58.815 INFO:tasks.workunit.client.1.vm06.stdout:0/743: dread - d3/d18/d1f/d39/f6e zero size
2026-03-09T00:03:58.821 INFO:tasks.workunit.client.0.vm03.stdout:4/569: dwrite d7/d20/d6a/d77/db7/fa3 [0,4194304] 0
2026-03-09T00:03:58.821 INFO:tasks.workunit.client.0.vm03.stdout:4/570: chown d7/d20/d29/d54/l61 0 1
2026-03-09T00:03:58.822 INFO:tasks.workunit.client.1.vm06.stdout:5/822: rmdir d5/d1c/d23 39
2026-03-09T00:03:58.825 INFO:tasks.workunit.client.1.vm06.stdout:9/605: write d1/d4/d6e/fa4 [286681,56848] 0
2026-03-09T00:03:58.828 INFO:tasks.workunit.client.1.vm06.stdout:2/767: dwrite d7/da/d63/fe4 [0,4194304] 0
2026-03-09T00:03:58.837 INFO:tasks.workunit.client.1.vm06.stdout:0/744: getdents d3/d18/d2c/d2d/d74/da8 0
2026-03-09T00:03:58.848 INFO:tasks.workunit.client.1.vm06.stdout:5/823: unlink d5/d1c/d21/d28/d5e/d66/f8a 0
2026-03-09T00:03:58.848 INFO:tasks.workunit.client.1.vm06.stdout:5/824: fsync d5/d44/f81 0
2026-03-09T00:03:58.848 INFO:tasks.workunit.client.1.vm06.stdout:9/606: mknod d1/d4/d6e/cc5 0
2026-03-09T00:03:58.848 INFO:tasks.workunit.client.0.vm03.stdout:0/450: dread d2/d71/f7c [0,4194304] 0
2026-03-09T00:03:58.848 INFO:tasks.workunit.client.0.vm03.stdout:0/451: chown d2/da/d1a/l46 604542471 1
2026-03-09T00:03:58.848 INFO:tasks.workunit.client.0.vm03.stdout:0/452: creat d2/da/dd/d49/fa9 x:0 0 0
2026-03-09T00:03:58.848 INFO:tasks.workunit.client.0.vm03.stdout:0/453: creat d2/da/d4e/faa x:0 0 0
2026-03-09T00:03:58.849 INFO:tasks.workunit.client.0.vm03.stdout:0/454: stat d2 0
2026-03-09T00:03:58.849 INFO:tasks.workunit.client.0.vm03.stdout:0/455: creat d2/da/dd/d6e/fab x:0 0 0
2026-03-09T00:03:58.851 INFO:tasks.workunit.client.1.vm06.stdout:0/745: read d3/d18/d1f/d39/d49/f4b [3545907,123364] 0
2026-03-09T00:03:58.852 INFO:tasks.workunit.client.0.vm03.stdout:2/459: dread d8/d17/f1c [0,4194304] 0
2026-03-09T00:03:58.852 INFO:tasks.workunit.client.0.vm03.stdout:2/460: creat d8/d1b/d2a/d56/f96 x:0 0 0
2026-03-09T00:03:58.852 INFO:tasks.workunit.client.0.vm03.stdout:2/461: dread - d8/d1b/d6c/f7b zero size
2026-03-09T00:03:58.853 INFO:tasks.workunit.client.0.vm03.stdout:2/462: stat d8/d1b/d2a/d6b/c4d 0
2026-03-09T00:03:58.853 INFO:tasks.workunit.client.0.vm03.stdout:2/463: chown f7 286 1
2026-03-09T00:03:58.860 INFO:tasks.workunit.client.1.vm06.stdout:2/768: symlink d7/d1a/d3c/le8 0
2026-03-09T00:03:58.868 INFO:tasks.workunit.client.1.vm06.stdout:5/825: mkdir d5/d1c/d68/dec/d115 0
2026-03-09T00:03:58.868 INFO:tasks.workunit.client.1.vm06.stdout:9/607: mknod d1/d3/d4f/cc6 0
2026-03-09T00:03:58.868 INFO:tasks.workunit.client.1.vm06.stdout:0/746: rmdir d3/d18/d1f/d39/d49/d60/de4 0
2026-03-09T00:03:58.868 INFO:tasks.workunit.client.1.vm06.stdout:0/747: creat d3/d18/d2c/d2d/d74/daf/de3/ffe x:0 0 0
2026-03-09T00:03:58.868 INFO:tasks.workunit.client.1.vm06.stdout:0/748: chown d3/d18/c20 0 1
2026-03-09T00:03:58.868 INFO:tasks.workunit.client.1.vm06.stdout:0/749: fdatasync d3/d18/d1f/d44/f7c 0
2026-03-09T00:03:58.868 INFO:tasks.workunit.client.1.vm06.stdout:0/750: chown d3/d18/d2c/d2d/d31 4135050 1
2026-03-09T00:03:58.868 INFO:tasks.workunit.client.1.vm06.stdout:9/608: rename d1/d4/d6e/f93 to d1/d3/d4f/d91/fc7 0
2026-03-09T00:03:58.868 INFO:tasks.workunit.client.1.vm06.stdout:9/609: unlink d1/d3/d50/c57 0
2026-03-09T00:03:58.868 INFO:tasks.workunit.client.1.vm06.stdout:9/610: rename d1/d4/d6e/d9/f40 to d1/da7/fc8 0
2026-03-09T00:03:58.870 INFO:tasks.workunit.client.0.vm03.stdout:9/503: dwrite d15/d1c/d36/f86 [0,4194304] 0
2026-03-09T00:03:58.870 INFO:tasks.workunit.client.0.vm03.stdout:9/504: chown d15/l2a 6822923 1
2026-03-09T00:03:58.871 INFO:tasks.workunit.client.1.vm06.stdout:5/826: write d5/d1c/d23/f4f [4907987,75012] 0
2026-03-09T00:03:58.874 INFO:tasks.workunit.client.0.vm03.stdout:8/491: dwrite d7/df/d1a/d40/f5e [0,4194304] 0
2026-03-09T00:03:58.874 INFO:tasks.workunit.client.0.vm03.stdout:8/492: chown d7/d92/f75 680025 1
2026-03-09T00:03:58.874 INFO:tasks.workunit.client.0.vm03.stdout:8/493: readlink d7/df/d1e/d38/d60/l8a 0
2026-03-09T00:03:58.886 INFO:tasks.workunit.client.0.vm03.stdout:7/441: dwrite d2/f73 [0,4194304] 0
2026-03-09T00:03:58.886 INFO:tasks.workunit.client.0.vm03.stdout:7/442: truncate d2/d1f/f3b 4348630 0
2026-03-09T00:03:58.888 INFO:tasks.workunit.client.1.vm06.stdout:9/611: rmdir d1/d4/d6e/d14 39
2026-03-09T00:03:58.888 INFO:tasks.workunit.client.1.vm06.stdout:9/612: fdatasync d1/d4/d6e/d9/f8a 0
2026-03-09T00:03:58.890 INFO:tasks.workunit.client.1.vm06.stdout:2/769: dread d7/d1b/f3b [0,4194304] 0
2026-03-09T00:03:58.896 INFO:tasks.workunit.client.1.vm06.stdout:2/770: dread - d7/d1b/d71/d79/db4/dc1/d86/fe1 zero size
2026-03-09T00:03:58.902 INFO:tasks.workunit.client.0.vm03.stdout:8/494: dread d7/df/f2c [0,4194304] 0
2026-03-09T00:03:58.908 INFO:tasks.workunit.client.1.vm06.stdout:2/771: write d7/d1a/d25/fa3 [3393802,48526] 0
2026-03-09T00:03:58.909 INFO:tasks.workunit.client.1.vm06.stdout:9/613: unlink d1/d4/d6e/d14/d25/d85/c77 0
2026-03-09T00:03:58.909 INFO:tasks.workunit.client.1.vm06.stdout:9/614: creat d1/d3/d4f/d91/fc9 x:0 0 0
2026-03-09T00:03:58.909 INFO:tasks.workunit.client.1.vm06.stdout:9/615: readlink d1/d3/d4f/d52/l92 0
2026-03-09T00:03:58.911 INFO:tasks.workunit.client.0.vm03.stdout:9/505: rmdir d15/d1c/d28/d6e 39
2026-03-09T00:03:58.931 INFO:tasks.workunit.client.1.vm06.stdout:2/772: rename d7/da/d4e/d57/d9d/fd7 to d7/d1b/da5/dca/fe9 0
2026-03-09T00:03:58.932 INFO:tasks.workunit.client.1.vm06.stdout:9/616: creat d1/d3/d4f/d91/d94/d9e/fca x:0 0 0
2026-03-09T00:03:58.932 INFO:tasks.workunit.client.1.vm06.stdout:9/617: chown d1/d4/d6e/d14/d25/c80 53327 1
2026-03-09T00:03:58.932 INFO:tasks.workunit.client.0.vm03.stdout:8/495: symlink d7/d92/l9e 0
2026-03-09T00:03:58.932 INFO:tasks.workunit.client.0.vm03.stdout:8/496: creat d7/df/d1a/d2b/f9f x:0 0 0
2026-03-09T00:03:58.932 INFO:tasks.workunit.client.0.vm03.stdout:9/506: symlink d15/d1c/d21/d64/la5 0
2026-03-09T00:03:58.932 INFO:tasks.workunit.client.1.vm06.stdout:9/618: symlink d1/d3/d2b/d58/lcb 0
2026-03-09T00:03:58.934 INFO:tasks.workunit.client.0.vm03.stdout:3/354: dwrite d2/db/d3b/f3e [0,4194304] 0
2026-03-09T00:03:58.939 INFO:tasks.workunit.client.0.vm03.stdout:4/571: dwrite d7/fa7 [0,4194304] 0
2026-03-09T00:03:58.942 INFO:tasks.workunit.client.1.vm06.stdout:4/689: rmdir d17/d21/d4c/d66 39
2026-03-09T00:03:58.953 INFO:tasks.workunit.client.1.vm06.stdout:4/690: rename f15 to d17/d24/d49/de4/fe9 0
2026-03-09T00:03:58.961 INFO:tasks.workunit.client.0.vm03.stdout:8/497: read d7/df/f55 [165167,10515] 0
2026-03-09T00:03:58.961 INFO:tasks.workunit.client.0.vm03.stdout:8/498: truncate d7/df/d1e/d3f/f47 188945 0
2026-03-09T00:03:58.962 INFO:tasks.workunit.client.0.vm03.stdout:8/499: mknod d7/df/ca0 0
2026-03-09T00:03:58.962 INFO:tasks.workunit.client.0.vm03.stdout:8/500: chown d7/l13 0 1
2026-03-09T00:03:58.962 INFO:tasks.workunit.client.0.vm03.stdout:8/501: chown d7/df/d1a/d2b/f44 251578 1
2026-03-09T00:03:58.964 INFO:tasks.workunit.client.0.vm03.stdout:8/502: symlink d7/df/d1a/d2b/d62/la1 0
2026-03-09T00:03:58.965 INFO:tasks.workunit.client.0.vm03.stdout:8/503: link d7/df/d1e/d38/d60/l7b d7/df/d1e/d38/d4c/la2 0
2026-03-09T00:03:58.972 INFO:tasks.workunit.client.1.vm06.stdout:8/707: write db/dd/f67 [134104,128968] 0
2026-03-09T00:03:58.980 INFO:tasks.workunit.client.1.vm06.stdout:8/708: chown db/dd/d24/da7/dab 56 1
2026-03-09T00:03:58.981 INFO:tasks.workunit.client.1.vm06.stdout:8/709: truncate db/d1e/f2e 2141816 0
2026-03-09T00:03:58.991 INFO:tasks.workunit.client.1.vm06.stdout:5/827: dwrite d5/d1c/d21/d28/d5e/d66/d78/dc8/f7a [0,4194304] 0
2026-03-09T00:03:58.993 INFO:tasks.workunit.client.1.vm06.stdout:4/691: write d17/d21/d32/f96 [795414,32688] 0
2026-03-09T00:03:58.999 INFO:tasks.workunit.client.0.vm03.stdout:2/464: dwrite d8/d1b/d2a/f33 [0,4194304] 0
2026-03-09T00:03:59.015 INFO:tasks.workunit.client.0.vm03.stdout:2/465: fsync d8/d1b/d2a/d6b/f8b 0
2026-03-09T00:03:59.015 INFO:tasks.workunit.client.0.vm03.stdout:2/466: read f7 [73162,69566] 0
2026-03-09T00:03:59.015 INFO:tasks.workunit.client.1.vm06.stdout:5/828: write d5/d1c/d21/d28/d5e/d66/d78/dc8/f7a [152998,49833] 0
2026-03-09T00:03:59.015 INFO:tasks.workunit.client.1.vm06.stdout:4/692: mkdir d17/d24/d3b/dbf/dea 0
2026-03-09T00:03:59.015 INFO:tasks.workunit.client.1.vm06.stdout:4/693: unlink d17/d24/d3b/d5e/c99 0
2026-03-09T00:03:59.015 INFO:tasks.workunit.client.1.vm06.stdout:4/694: chown d17/d21/d4c/d66/d68 111120 1
2026-03-09T00:03:59.015 INFO:tasks.workunit.client.1.vm06.stdout:4/695: chown d17/d24/d3b/dbf/ddf 8449 1
2026-03-09T00:03:59.015 INFO:tasks.workunit.client.1.vm06.stdout:4/696: dread - d17/d21/fb8 zero size
2026-03-09T00:03:59.015 INFO:tasks.workunit.client.1.vm06.stdout:4/697: chown d17/d5b/d8f/fd3 0 1
2026-03-09T00:03:59.020 INFO:tasks.workunit.client.1.vm06.stdout:5/829: write d5/d1c/d21/d28/d5e/d66/d78/da6/fd4 [190318,13975] 0
2026-03-09T00:03:59.021 INFO:tasks.workunit.client.1.vm06.stdout:4/698: unlink d17/c1a 0
2026-03-09T00:03:59.025 INFO:tasks.workunit.client.1.vm06.stdout:5/830: creat d5/d44/d84/dc5/de8/f116 x:0 0 0
2026-03-09T00:03:59.030 INFO:tasks.workunit.client.1.vm06.stdout:5/831: write d5/d1c/d23/d34/d47/f87 [2064801,25459] 0
2026-03-09T00:03:59.031 INFO:tasks.workunit.client.1.vm06.stdout:4/699: getdents d17/d24 0
2026-03-09T00:03:59.037 INFO:tasks.workunit.client.1.vm06.stdout:4/700: mknod d17/d5b/d8f/ceb 0
2026-03-09T00:03:59.037 INFO:tasks.workunit.client.1.vm06.stdout:4/701: creat d17/d5b/dac/fec x:0 0 0
2026-03-09T00:03:59.049 INFO:tasks.workunit.client.0.vm03.stdout:2/467: rename d8/d17 to d8/d26/d5e/d6f/d97 0
2026-03-09T00:03:59.091 INFO:tasks.workunit.client.0.vm03.stdout:2/468: creat d8/d26/d5e/d6f/f98 x:0 0 0
2026-03-09T00:03:59.091 INFO:tasks.workunit.client.0.vm03.stdout:2/469: mknod d8/d1b/d2a/d2e/c99 0
2026-03-09T00:03:59.091 INFO:tasks.workunit.client.0.vm03.stdout:2/470: truncate f6 2214515 0
2026-03-09T00:03:59.091 INFO:tasks.workunit.client.0.vm03.stdout:2/471: chown d8/fd 530 1
2026-03-09T00:03:59.091 INFO:tasks.workunit.client.0.vm03.stdout:2/472: mkdir d8/d1b/d2a/d2e/d9a 0
2026-03-09T00:03:59.091 INFO:tasks.workunit.client.0.vm03.stdout:2/473: creat d8/f9b x:0 0 0
2026-03-09T00:03:59.091 INFO:tasks.workunit.client.0.vm03.stdout:2/474: fdatasync d8/d26/d5e/f64 0
2026-03-09T00:03:59.091 INFO:tasks.workunit.client.0.vm03.stdout:2/475: chown d8/d1b/d2a/d6b/f87 77510366 1
2026-03-09T00:03:59.091 INFO:tasks.workunit.client.0.vm03.stdout:2/476: rename d8/d26/d5e/d6f/d97/f3c to d8/d26/d5e/d5f/f9c 0
2026-03-09T00:03:59.096 INFO:tasks.workunit.client.1.vm06.stdout:2/773: dwrite d7/d1b/fd8 [0,4194304] 0
2026-03-09T00:03:59.104 INFO:tasks.workunit.client.1.vm06.stdout:2/774: creat d7/d1a/d25/d97/fea x:0 0 0
2026-03-09T00:03:59.104 INFO:tasks.workunit.client.1.vm06.stdout:2/775: fsync d7/d1a/d56/f50 0
2026-03-09T00:03:59.105 INFO:tasks.workunit.client.0.vm03.stdout:7/443: dwrite d2/d1f/d42/d43/f4a [0,4194304] 0
2026-03-09T00:03:59.105 INFO:tasks.workunit.client.0.vm03.stdout:7/444: read d2/d4/f2e [610378,124619] 0
2026-03-09T00:03:59.105 INFO:tasks.workunit.client.0.vm03.stdout:7/445: readlink d2/d1f/d3a/d24/l76 0
2026-03-09T00:03:59.107 INFO:tasks.workunit.client.0.vm03.stdout:7/446: mknod d2/d4/d1e/d78/c83 0
2026-03-09T00:03:59.128 INFO:tasks.workunit.client.1.vm06.stdout:2/776: dread d7/d1b/f5c [4194304,4194304] 0
2026-03-09T00:03:59.145 INFO:tasks.workunit.client.1.vm06.stdout:8/710: dwrite db/dd/fd3 [0,4194304] 0
2026-03-09T00:03:59.146 INFO:tasks.workunit.client.0.vm03.stdout:9/507: dwrite d15/d1c/d36/f4a [0,4194304] 0
2026-03-09T00:03:59.146 INFO:tasks.workunit.client.0.vm03.stdout:9/508: creat d15/d1c/d21/d75/fa6 x:0 0 0
2026-03-09T00:03:59.147 INFO:tasks.workunit.client.1.vm06.stdout:9/619: dwrite d1/d4/d6e/d14/d25/d85/f28 [0,4194304] 0
2026-03-09T00:03:59.147 INFO:tasks.workunit.client.1.vm06.stdout:9/620: write d1/fb5 [805459,104423] 0
2026-03-09T00:03:59.171 INFO:tasks.workunit.client.1.vm06.stdout:2/777: mkdir d7/da/d1c/deb 0
2026-03-09T00:03:59.175 INFO:tasks.workunit.client.0.vm03.stdout:4/572: dwrite d7/d20/d29/d38/f6e [0,4194304] 0
2026-03-09T00:03:59.196 INFO:tasks.workunit.client.1.vm06.stdout:2/778: rename d7/d1b/d71/d79/db4/dc1/d86/l95 to d7/d1b/da5/dca/lec 0
2026-03-09T00:03:59.198 INFO:tasks.workunit.client.1.vm06.stdout:2/779: symlink d7/d1a/d96/led 0
2026-03-09T00:03:59.198 INFO:tasks.workunit.client.1.vm06.stdout:2/780: fsync d7/d1a/d25/fbd 0
2026-03-09T00:03:59.201 INFO:tasks.workunit.client.0.vm03.stdout:2/477: dwrite d8/d1b/f31 [0,4194304] 0
2026-03-09T00:03:59.201 INFO:tasks.workunit.client.0.vm03.stdout:2/478: fdatasync d8/d1b/f30 0
2026-03-09T00:03:59.201 INFO:tasks.workunit.client.1.vm06.stdout:8/711: dwrite db/dd/d48/f68 [4194304,4194304] 0
2026-03-09T00:03:59.212 INFO:tasks.workunit.client.0.vm03.stdout:7/447: dwrite d2/d4/fb [4194304,4194304] 0
2026-03-09T00:03:59.212 INFO:tasks.workunit.client.0.vm03.stdout:7/448: dread - d2/d1f/d42/d43/f5f zero size 2026-03-09T00:03:59.212 INFO:tasks.workunit.client.1.vm06.stdout:2/781: write d7/da/d4e/d57/d9d/fbe [126829,43962] 0 2026-03-09T00:03:59.212 INFO:tasks.workunit.client.1.vm06.stdout:2/782: creat d7/da/d63/fee x:0 0 0 2026-03-09T00:03:59.212 INFO:tasks.workunit.client.1.vm06.stdout:2/783: write d7/da/fbf [89263,34043] 0 2026-03-09T00:03:59.213 INFO:tasks.workunit.client.0.vm03.stdout:4/573: dwrite d7/d20/d29/d54/f96 [0,4194304] 0 2026-03-09T00:03:59.216 INFO:tasks.workunit.client.1.vm06.stdout:5/832: rmdir d5/d1c/d21/d28/d5e 39 2026-03-09T00:03:59.216 INFO:tasks.workunit.client.1.vm06.stdout:5/833: chown d5/d1c/d68/fb4 1558213 1 2026-03-09T00:03:59.218 INFO:tasks.workunit.client.1.vm06.stdout:2/784: dread d7/da/f18 [0,4194304] 0 2026-03-09T00:03:59.228 INFO:tasks.workunit.client.0.vm03.stdout:2/479: creat d8/d1b/d2a/d6b/f9d x:0 0 0 2026-03-09T00:03:59.230 INFO:tasks.workunit.client.1.vm06.stdout:5/834: write d5/d44/d4b/d92/d49/da0/ffd [549140,126556] 0 2026-03-09T00:03:59.243 INFO:tasks.workunit.client.1.vm06.stdout:3/726: sync 2026-03-09T00:03:59.243 INFO:tasks.workunit.client.1.vm06.stdout:7/717: sync 2026-03-09T00:03:59.250 INFO:tasks.workunit.client.0.vm03.stdout:2/480: dread d8/d1b/d2a/d6b/f87 [0,4194304] 0 2026-03-09T00:03:59.256 INFO:tasks.workunit.client.0.vm03.stdout:2/481: dread - d8/f9b zero size 2026-03-09T00:03:59.256 INFO:tasks.workunit.client.0.vm03.stdout:2/482: link d8/d1b/d2a/d56/f96 d8/d1b/d2a/d2e/f9e 0 2026-03-09T00:03:59.256 INFO:tasks.workunit.client.0.vm03.stdout:2/483: mknod d8/d26/d5e/d6f/d97/c9f 0 2026-03-09T00:03:59.290 INFO:tasks.workunit.client.1.vm06.stdout:8/712: dwrite db/d53/d70/d38/f5b [0,4194304] 0 2026-03-09T00:03:59.295 INFO:tasks.workunit.client.1.vm06.stdout:8/713: mknod db/d74/d87/cea 0 2026-03-09T00:03:59.295 INFO:tasks.workunit.client.1.vm06.stdout:8/714: write db/d74/d78/fbf [4255555,79719] 0 2026-03-09T00:03:59.300 INFO:tasks.workunit.client.1.vm06.stdout:8/715: read db/fd0 [763383,21959] 0 2026-03-09T00:03:59.302 INFO:tasks.workunit.client.1.vm06.stdout:2/785: dwrite d7/d1b/f22 [0,4194304] 0 2026-03-09T00:03:59.304 INFO:tasks.workunit.client.1.vm06.stdout:8/716: unlink db/f16 0 2026-03-09T00:03:59.304 INFO:tasks.workunit.client.1.vm06.stdout:8/717: readlink db/d53/d5c/laf 0 2026-03-09T00:03:59.304 INFO:tasks.workunit.client.1.vm06.stdout:8/718: write db/f3f [738252,31509] 0 2026-03-09T00:03:59.304 INFO:tasks.workunit.client.1.vm06.stdout:8/719: write db/d53/d7c/fa0 [1356652,99238] 0 2026-03-09T00:03:59.305 INFO:tasks.workunit.client.1.vm06.stdout:5/835: dwrite d5/d1c/d23/d34/d47/fbd [0,4194304] 0 2026-03-09T00:03:59.314 INFO:tasks.workunit.client.1.vm06.stdout:8/720: truncate db/d53/d70/f91 1522645 0 2026-03-09T00:03:59.314 INFO:tasks.workunit.client.1.vm06.stdout:8/721: write db/d53/d6d/d7b/f9a [505706,107622] 0 2026-03-09T00:03:59.314 INFO:tasks.workunit.client.1.vm06.stdout:8/722: stat db/d1e/f82 0 2026-03-09T00:03:59.317 INFO:tasks.workunit.client.1.vm06.stdout:5/836: link d5/f14 d5/d1c/d23/d34/d47/dcf/f117 0 2026-03-09T00:03:59.318 INFO:tasks.workunit.client.1.vm06.stdout:8/723: mknod db/dd/d24/ceb 0 2026-03-09T00:03:59.318 INFO:tasks.workunit.client.1.vm06.stdout:8/724: creat db/dd/d24/d63/fec x:0 0 0 2026-03-09T00:03:59.318 INFO:tasks.workunit.client.1.vm06.stdout:8/725: fsync db/f55 0 2026-03-09T00:03:59.319 INFO:tasks.workunit.client.1.vm06.stdout:5/837: getdents d5/d44 0 2026-03-09T00:03:59.319 
INFO:tasks.workunit.client.1.vm06.stdout:5/838: chown d5/d1c/d21/d28 51616 1 2026-03-09T00:03:59.321 INFO:tasks.workunit.client.1.vm06.stdout:8/726: link db/d1e/f82 db/d53/d70/d38/d4d/d79/fed 0 2026-03-09T00:03:59.321 INFO:tasks.workunit.client.1.vm06.stdout:5/839: creat d5/d44/d84/f118 x:0 0 0 2026-03-09T00:03:59.322 INFO:tasks.workunit.client.1.vm06.stdout:8/727: symlink db/dd/d24/da7/dab/lee 0 2026-03-09T00:03:59.322 INFO:tasks.workunit.client.1.vm06.stdout:8/728: readlink db/d1e/l3b 0 2026-03-09T00:03:59.322 INFO:tasks.workunit.client.1.vm06.stdout:8/729: read - db/d1e/fda zero size 2026-03-09T00:03:59.323 INFO:tasks.workunit.client.1.vm06.stdout:5/840: link d5/d1c/d23/d34/cb7 d5/d1c/d23/d34/d47/ddd/c119 0 2026-03-09T00:03:59.325 INFO:tasks.workunit.client.1.vm06.stdout:5/841: dread d5/d44/d4b/d92/d95/fb0 [0,4194304] 0 2026-03-09T00:03:59.330 INFO:tasks.workunit.client.1.vm06.stdout:5/842: symlink d5/d44/d4b/d92/l11a 0 2026-03-09T00:03:59.332 INFO:tasks.workunit.client.1.vm06.stdout:5/843: mkdir d5/d1c/d21/d28/d5e/d66/dab/d11b 0 2026-03-09T00:03:59.339 INFO:tasks.workunit.client.1.vm06.stdout:1/635: sync 2026-03-09T00:03:59.340 INFO:tasks.workunit.client.1.vm06.stdout:1/636: symlink d6/d63/lda 0 2026-03-09T00:03:59.341 INFO:tasks.workunit.client.1.vm06.stdout:1/637: creat d6/d21/d2d/d3b/d87/d9d/dd8/fdb x:0 0 0 2026-03-09T00:03:59.341 INFO:tasks.workunit.client.1.vm06.stdout:1/638: chown d6/d4c/d71/d83/cb8 922162 1 2026-03-09T00:03:59.345 INFO:tasks.workunit.client.1.vm06.stdout:1/639: creat d6/db0/fdc x:0 0 0 2026-03-09T00:03:59.355 INFO:tasks.workunit.client.1.vm06.stdout:8/730: dwrite db/d1e/f52 [0,4194304] 0 2026-03-09T00:03:59.367 INFO:tasks.workunit.client.1.vm06.stdout:8/731: mknod db/dd/d24/cef 0 2026-03-09T00:03:59.371 INFO:tasks.workunit.client.1.vm06.stdout:8/732: truncate db/dd/d24/f33 1961671 0 2026-03-09T00:03:59.391 INFO:tasks.workunit.client.1.vm06.stdout:1/640: dwrite d6/d4c/d79/fb2 [0,4194304] 0 2026-03-09T00:03:59.391 INFO:tasks.workunit.client.1.vm06.stdout:1/641: read - d6/d4c/d79/fd5 zero size 2026-03-09T00:03:59.393 INFO:tasks.workunit.client.1.vm06.stdout:8/733: dread db/d53/d7c/fa0 [0,4194304] 0 2026-03-09T00:03:59.394 INFO:tasks.workunit.client.1.vm06.stdout:1/642: creat d6/d21/d2d/d3b/d87/d9d/fdd x:0 0 0 2026-03-09T00:03:59.394 INFO:tasks.workunit.client.1.vm06.stdout:1/643: write d6/d21/fc8 [916787,1400] 0 2026-03-09T00:03:59.394 INFO:tasks.workunit.client.1.vm06.stdout:8/734: getdents db/dd/d85 0 2026-03-09T00:03:59.398 INFO:tasks.workunit.client.1.vm06.stdout:8/735: dread db/dd/d48/f68 [0,4194304] 0 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: Active manager daemon vm03.yvcons restarted 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: Activating manager daemon vm03.yvcons 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.yvcons/crt"}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.? 
192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.yvcons/key"}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: osdmap e43: 6 total, 6 up, 6 in 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: mgrmap e27: vm03.yvcons(active, starting, since 0.0594546s) 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.sejksk"}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.ralade"}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.ixduim"}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vlrwtl"}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.sejksk"}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.ralade"}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.ixduim"}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vlrwtl"}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr metadata", "who": "vm03.yvcons", "id": "vm03.yvcons"}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 
vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T00:03:59.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:03:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T00:03:59.427 INFO:tasks.workunit.client.1.vm06.stdout:8/736: dwrite db/d53/d7c/f95 [0,4194304] 0 2026-03-09T00:03:59.427 INFO:tasks.workunit.client.1.vm06.stdout:8/737: chown db/d1e/d46/d94 170655691 1 2026-03-09T00:03:59.430 INFO:tasks.workunit.client.1.vm06.stdout:8/738: creat db/d74/d78/d98/db6/ff0 x:0 0 0 2026-03-09T00:03:59.434 INFO:tasks.workunit.client.1.vm06.stdout:8/739: dread db/d74/d78/d98/fbb [0,4194304] 0 2026-03-09T00:03:59.434 INFO:tasks.workunit.client.1.vm06.stdout:8/740: dread - db/d74/d78/d98/db6/ff0 zero size 2026-03-09T00:03:59.440 INFO:tasks.workunit.client.1.vm06.stdout:8/741: write db/d1e/f51 [1833807,111828] 0 2026-03-09T00:03:59.442 INFO:tasks.workunit.client.1.vm06.stdout:8/742: mkdir db/dd/d84/df1 0 2026-03-09T00:03:59.442 INFO:tasks.workunit.client.1.vm06.stdout:8/743: fdatasync db/f31 0 2026-03-09T00:03:59.445 INFO:tasks.workunit.client.1.vm06.stdout:8/744: creat db/d1e/ff2 x:0 0 0 2026-03-09T00:03:59.448 INFO:tasks.workunit.client.0.vm03.stdout:7/449: symlink d2/d4/d1e/d5e/d6c/l84 0 2026-03-09T00:03:59.448 INFO:tasks.workunit.client.0.vm03.stdout:7/450: stat d2/ce 0 2026-03-09T00:03:59.448 INFO:tasks.workunit.client.0.vm03.stdout:7/451: truncate d2/d4/f2e 1240082 0 2026-03-09T00:03:59.448 INFO:tasks.workunit.client.1.vm06.stdout:8/745: getdents db/d1e/d9b 0 2026-03-09T00:03:59.451 INFO:tasks.workunit.client.0.vm03.stdout:9/509: rename d15/f8f to d15/d1c/d28/fa7 0 2026-03-09T00:03:59.452 INFO:tasks.workunit.client.0.vm03.stdout:9/510: creat d15/d1c/d36/fa8 x:0 0 0 2026-03-09T00:03:59.452 
INFO:tasks.workunit.client.0.vm03.stdout:9/511: creat d15/d1c/d28/d6e/fa9 x:0 0 0 2026-03-09T00:03:59.453 INFO:tasks.workunit.client.0.vm03.stdout:9/512: unlink d15/d1c/d28/f39 0 2026-03-09T00:03:59.454 INFO:tasks.workunit.client.0.vm03.stdout:9/513: creat d15/d1c/d28/faa x:0 0 0 2026-03-09T00:03:59.455 INFO:tasks.workunit.client.0.vm03.stdout:9/514: mkdir d15/d1c/d21/d54/dab 0 2026-03-09T00:03:59.455 INFO:tasks.workunit.client.0.vm03.stdout:9/515: creat d15/d1c/d21/d64/fac x:0 0 0 2026-03-09T00:03:59.455 INFO:tasks.workunit.client.0.vm03.stdout:9/516: creat d15/d1c/d36/d4d/fad x:0 0 0 2026-03-09T00:03:59.455 INFO:tasks.workunit.client.0.vm03.stdout:9/517: fdatasync d15/d1c/d21/d54/f65 0 2026-03-09T00:03:59.460 INFO:tasks.workunit.client.0.vm03.stdout:9/518: dread d15/f26 [0,4194304] 0 2026-03-09T00:03:59.460 INFO:tasks.workunit.client.0.vm03.stdout:9/519: dread d15/d1c/d36/f3a [0,4194304] 0 2026-03-09T00:03:59.462 INFO:tasks.workunit.client.0.vm03.stdout:9/520: symlink d15/d1c/d28/d6e/lae 0 2026-03-09T00:03:59.462 INFO:tasks.workunit.client.0.vm03.stdout:9/521: dread - d15/d1c/d28/faa zero size 2026-03-09T00:03:59.462 INFO:tasks.workunit.client.0.vm03.stdout:9/522: dread - d15/d1c/d21/d64/fac zero size 2026-03-09T00:03:59.463 INFO:tasks.workunit.client.0.vm03.stdout:9/523: creat d15/d1c/d21/d54/dab/faf x:0 0 0 2026-03-09T00:03:59.491 INFO:tasks.workunit.client.0.vm03.stdout:9/524: dwrite d15/d1c/d36/f5c [0,4194304] 0 2026-03-09T00:03:59.493 INFO:tasks.workunit.client.0.vm03.stdout:9/525: link d15/d1c/d28/d6e/lae d15/d77/lb0 0 2026-03-09T00:03:59.510 INFO:tasks.workunit.client.1.vm06.stdout:9/621: creat d1/d3/d4f/d91/fcc x:0 0 0 2026-03-09T00:03:59.510 INFO:tasks.workunit.client.1.vm06.stdout:9/622: fsync d1/d3/f9b 0 2026-03-09T00:03:59.512 INFO:tasks.workunit.client.1.vm06.stdout:9/623: creat d1/d3/d4f/d91/d94/fcd x:0 0 0 2026-03-09T00:03:59.512 INFO:tasks.workunit.client.1.vm06.stdout:9/624: chown d1/d3/d50/cac 50983 1 2026-03-09T00:03:59.512 INFO:tasks.workunit.client.1.vm06.stdout:9/625: write d1/f78 [995895,55229] 0 2026-03-09T00:03:59.524 INFO:tasks.workunit.client.1.vm06.stdout:8/746: write db/d53/d70/f91 [607130,103589] 0 2026-03-09T00:03:59.524 INFO:tasks.workunit.client.1.vm06.stdout:8/747: readlink db/d53/d5c/laf 0 2026-03-09T00:03:59.524 INFO:tasks.workunit.client.1.vm06.stdout:8/748: fsync db/d74/d78/fe2 0 2026-03-09T00:03:59.527 INFO:tasks.workunit.client.1.vm06.stdout:8/749: symlink db/dd/d85/d9f/lf3 0 2026-03-09T00:03:59.527 INFO:tasks.workunit.client.1.vm06.stdout:8/750: write db/d1e/f34 [2273189,30617] 0 2026-03-09T00:03:59.527 INFO:tasks.workunit.client.1.vm06.stdout:8/751: readlink db/d1e/d46/d94/lc5 0 2026-03-09T00:03:59.528 INFO:tasks.workunit.client.1.vm06.stdout:8/752: symlink db/d53/d7c/d8f/lf4 0 2026-03-09T00:03:59.528 INFO:tasks.workunit.client.1.vm06.stdout:8/753: write db/d1e/f2e [678021,21794] 0 2026-03-09T00:03:59.531 INFO:tasks.workunit.client.1.vm06.stdout:8/754: mkdir db/d1e/d9b/df5 0 2026-03-09T00:03:59.531 INFO:tasks.workunit.client.1.vm06.stdout:8/755: write db/dd/d84/fe4 [306626,103996] 0 2026-03-09T00:03:59.535 INFO:tasks.workunit.client.1.vm06.stdout:8/756: dread db/d53/d70/f54 [0,4194304] 0 2026-03-09T00:03:59.565 INFO:tasks.workunit.client.1.vm06.stdout:8/757: dwrite db/fd0 [0,4194304] 0 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: Active manager daemon vm03.yvcons restarted 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: 
Activating manager daemon vm03.yvcons 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.yvcons/crt"}]: dispatch 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm03.yvcons/key"}]: dispatch 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: osdmap e43: 6 total, 6 up, 6 in 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: mgrmap e27: vm03.yvcons(active, starting, since 0.0594546s) 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.sejksk"}]: dispatch 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.ralade"}]: dispatch 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.ixduim"}]: dispatch 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vlrwtl"}]: dispatch 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.sejksk"}]: dispatch 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.ralade"}]: dispatch 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.ixduim"}]: dispatch 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vlrwtl"}]: dispatch 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' 
entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr metadata", "who": "vm03.yvcons", "id": "vm03.yvcons"}]: dispatch 2026-03-09T00:03:59.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T00:03:59.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T00:03:59.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T00:03:59.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:03:59.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:03:59.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:03:59.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T00:03:59.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T00:03:59.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:03:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T00:03:59.609 INFO:tasks.workunit.client.0.vm03.stdout:5/489: sync 2026-03-09T00:03:59.609 INFO:tasks.workunit.client.0.vm03.stdout:6/452: sync 2026-03-09T00:03:59.609 INFO:tasks.workunit.client.0.vm03.stdout:0/456: sync 2026-03-09T00:03:59.609 INFO:tasks.workunit.client.0.vm03.stdout:1/569: sync 2026-03-09T00:03:59.609 INFO:tasks.workunit.client.0.vm03.stdout:3/355: sync 2026-03-09T00:03:59.610 INFO:tasks.workunit.client.0.vm03.stdout:8/504: sync 2026-03-09T00:03:59.610 INFO:tasks.workunit.client.0.vm03.stdout:8/505: readlink d7/df/l70 0 2026-03-09T00:03:59.610 INFO:tasks.workunit.client.0.vm03.stdout:2/484: sync 2026-03-09T00:03:59.613 INFO:tasks.workunit.client.0.vm03.stdout:6/453: symlink d13/d1e/d44/d59/la2 0 2026-03-09T00:03:59.613 INFO:tasks.workunit.client.0.vm03.stdout:6/454: truncate d13/f1d 2160888 0 2026-03-09T00:03:59.613 
INFO:tasks.workunit.client.0.vm03.stdout:6/455: write d13/d35/d4c/d62/f9a [383339,4064] 0 2026-03-09T00:03:59.616 INFO:tasks.workunit.client.0.vm03.stdout:2/485: read d8/d26/d5e/d6f/d97/f75 [257941,10125] 0 2026-03-09T00:03:59.616 INFO:tasks.workunit.client.0.vm03.stdout:2/486: write d8/d26/f85 [190297,74269] 0 2026-03-09T00:03:59.616 INFO:tasks.workunit.client.0.vm03.stdout:0/457: rmdir d2/da/d76/d8a 39 2026-03-09T00:03:59.617 INFO:tasks.workunit.client.0.vm03.stdout:8/506: mkdir d7/df/d1a/d40/d9d/da3 0 2026-03-09T00:03:59.620 INFO:tasks.workunit.client.0.vm03.stdout:3/356: rmdir d2/db/d3b/d5d 39 2026-03-09T00:03:59.623 INFO:tasks.workunit.client.0.vm03.stdout:3/357: chown d2/db/c35 706721010 1 2026-03-09T00:03:59.623 INFO:tasks.workunit.client.0.vm03.stdout:3/358: chown d2/db/d3b 447952 1 2026-03-09T00:03:59.623 INFO:tasks.workunit.client.0.vm03.stdout:8/507: mknod d7/df/d1e/d38/d60/ca4 0 2026-03-09T00:03:59.628 INFO:tasks.workunit.client.0.vm03.stdout:0/458: dread d2/da/dd/d49/d6c/f52 [0,4194304] 0 2026-03-09T00:03:59.633 INFO:tasks.workunit.client.0.vm03.stdout:3/359: write d2/f16 [1258495,78999] 0 2026-03-09T00:03:59.633 INFO:tasks.workunit.client.0.vm03.stdout:3/360: write d2/db/d2d/f45 [534869,88453] 0 2026-03-09T00:03:59.686 INFO:tasks.workunit.client.1.vm06.stdout:3/727: unlink d11/d28/c3d 0 2026-03-09T00:03:59.686 INFO:tasks.workunit.client.0.vm03.stdout:4/574: rename d7/d20/d29/d38/c93 to d7/d6f/da5/db0/cbb 0 2026-03-09T00:03:59.687 INFO:tasks.workunit.client.1.vm06.stdout:8/758: dwrite db/d74/faa [0,4194304] 0 2026-03-09T00:03:59.687 INFO:tasks.workunit.client.1.vm06.stdout:3/728: mkdir d11/d28/d2e/d2f/dfe 0 2026-03-09T00:03:59.689 INFO:tasks.workunit.client.0.vm03.stdout:4/575: symlink d7/d20/d29/d78/lbc 0 2026-03-09T00:03:59.689 INFO:tasks.workunit.client.0.vm03.stdout:9/526: rename d15/d1c/d28/f5e to d15/d1c/d36/fb1 0 2026-03-09T00:03:59.690 INFO:tasks.workunit.client.0.vm03.stdout:9/527: chown d15/d1c/d21/d54/d87/d93/d74 230 1 2026-03-09T00:03:59.690 INFO:tasks.workunit.client.1.vm06.stdout:8/759: truncate db/d74/d78/fd2 3786406 0 2026-03-09T00:03:59.690 INFO:tasks.workunit.client.1.vm06.stdout:3/729: mkdir d11/d28/d2e/dff 0 2026-03-09T00:03:59.690 INFO:tasks.workunit.client.1.vm06.stdout:8/760: fdatasync db/dd/d24/d63/fe9 0 2026-03-09T00:03:59.691 INFO:tasks.workunit.client.1.vm06.stdout:8/761: mknod db/dd/d85/d9f/db7/cf6 0 2026-03-09T00:03:59.691 INFO:tasks.workunit.client.1.vm06.stdout:8/762: mknod db/d53/d70/d38/d4d/db1/cf7 0 2026-03-09T00:03:59.691 INFO:tasks.workunit.client.1.vm06.stdout:8/763: chown db/f3f 18792775 1 2026-03-09T00:03:59.695 INFO:tasks.workunit.client.0.vm03.stdout:9/528: dread f8 [0,4194304] 0 2026-03-09T00:03:59.695 INFO:tasks.workunit.client.0.vm03.stdout:9/529: readlink d15/l59 0 2026-03-09T00:03:59.695 INFO:tasks.workunit.client.0.vm03.stdout:9/530: truncate d15/d1c/d21/d54/dab/faf 984965 0 2026-03-09T00:03:59.707 INFO:tasks.workunit.client.1.vm06.stdout:8/764: symlink db/dd/d24/da7/lf8 0 2026-03-09T00:03:59.742 INFO:tasks.workunit.client.0.vm03.stdout:2/487: dwrite d8/d1b/d2a/f4c [4194304,4194304] 0 2026-03-09T00:03:59.743 INFO:tasks.workunit.client.0.vm03.stdout:2/488: symlink d8/d1b/d8f/la0 0 2026-03-09T00:03:59.756 INFO:tasks.workunit.client.0.vm03.stdout:6/456: dwrite d13/d1e/d44/d59/d77/f94 [0,4194304] 0 2026-03-09T00:03:59.756 INFO:tasks.workunit.client.0.vm03.stdout:6/457: chown d13/d1e/d44 1833 1 2026-03-09T00:03:59.757 INFO:tasks.workunit.client.0.vm03.stdout:6/458: rmdir d13/d35/d69 39 2026-03-09T00:03:59.757 
INFO:tasks.workunit.client.0.vm03.stdout:6/459: chown d13/d35/c7f 7093 1 2026-03-09T00:03:59.757 INFO:tasks.workunit.client.0.vm03.stdout:7/452: dwrite d2/d1f/f3b [4194304,4194304] 0 2026-03-09T00:03:59.757 INFO:tasks.workunit.client.0.vm03.stdout:6/460: rmdir d13/d35/d71 39 2026-03-09T00:03:59.758 INFO:tasks.workunit.client.0.vm03.stdout:7/453: mkdir d2/d4/d1e/d85 0 2026-03-09T00:03:59.759 INFO:tasks.workunit.client.0.vm03.stdout:7/454: symlink d2/d4/d1e/d78/l86 0 2026-03-09T00:03:59.759 INFO:tasks.workunit.client.0.vm03.stdout:7/455: dread - d2/d1f/d40/f72 zero size 2026-03-09T00:03:59.759 INFO:tasks.workunit.client.0.vm03.stdout:7/456: rmdir d2/d4/d1e/d5e/d6c/d37/d39 39 2026-03-09T00:03:59.759 INFO:tasks.workunit.client.0.vm03.stdout:7/457: write d2/d4/d1e/d5e/d6c/d37/f56 [4938800,19145] 0 2026-03-09T00:03:59.759 INFO:tasks.workunit.client.0.vm03.stdout:7/458: fdatasync d2/d1f/f62 0 2026-03-09T00:03:59.760 INFO:tasks.workunit.client.0.vm03.stdout:7/459: symlink d2/d1f/d3a/l87 0 2026-03-09T00:03:59.760 INFO:tasks.workunit.client.0.vm03.stdout:7/460: read - d2/d1f/d40/d67/f70 zero size 2026-03-09T00:03:59.762 INFO:tasks.workunit.client.0.vm03.stdout:8/508: dwrite d7/df/f37 [4194304,4194304] 0 2026-03-09T00:03:59.762 INFO:tasks.workunit.client.0.vm03.stdout:8/509: dread - d7/df/f87 zero size 2026-03-09T00:03:59.766 INFO:tasks.workunit.client.0.vm03.stdout:7/461: dread d2/d1f/d3a/f1a [0,4194304] 0 2026-03-09T00:03:59.766 INFO:tasks.workunit.client.0.vm03.stdout:8/510: dread d7/df/d1a/f33 [0,4194304] 0 2026-03-09T00:03:59.767 INFO:tasks.workunit.client.0.vm03.stdout:8/511: creat d7/df/d1e/d38/d91/fa5 x:0 0 0 2026-03-09T00:03:59.770 INFO:tasks.workunit.client.0.vm03.stdout:7/462: dread d2/d1f/f3b [4194304,4194304] 0 2026-03-09T00:03:59.774 INFO:tasks.workunit.client.0.vm03.stdout:7/463: mkdir d2/d4/d1e/d5e/d6c/d88 0 2026-03-09T00:03:59.777 INFO:tasks.workunit.client.0.vm03.stdout:7/464: dread d2/d4/f13 [0,4194304] 0 2026-03-09T00:03:59.777 INFO:tasks.workunit.client.0.vm03.stdout:6/461: dread d13/d35/d4c/f4f [0,4194304] 0 2026-03-09T00:03:59.777 INFO:tasks.workunit.client.0.vm03.stdout:6/462: readlink d13/d1e/d44/l53 0 2026-03-09T00:03:59.778 INFO:tasks.workunit.client.1.vm06.stdout:1/644: creat d6/d4c/fde x:0 0 0 2026-03-09T00:03:59.778 INFO:tasks.workunit.client.1.vm06.stdout:1/645: stat d6/d21/d2d 0 2026-03-09T00:03:59.779 INFO:tasks.workunit.client.0.vm03.stdout:7/465: symlink d2/l89 0 2026-03-09T00:03:59.780 INFO:tasks.workunit.client.0.vm03.stdout:6/463: symlink d13/d35/d4c/la3 0 2026-03-09T00:03:59.782 INFO:tasks.workunit.client.0.vm03.stdout:7/466: link d2/f3 d2/d1f/d40/d67/f8a 0 2026-03-09T00:03:59.784 INFO:tasks.workunit.client.0.vm03.stdout:7/467: symlink d2/l8b 0 2026-03-09T00:03:59.784 INFO:tasks.workunit.client.0.vm03.stdout:7/468: dread - d2/d1f/d40/f72 zero size 2026-03-09T00:03:59.791 INFO:tasks.workunit.client.1.vm06.stdout:2/786: rename d7/d1b/d71/d79/db4/dc1/fbb to d7/da/fef 0 2026-03-09T00:03:59.792 INFO:tasks.workunit.client.1.vm06.stdout:0/751: sync 2026-03-09T00:03:59.792 INFO:tasks.workunit.client.1.vm06.stdout:6/737: sync 2026-03-09T00:03:59.792 INFO:tasks.workunit.client.1.vm06.stdout:0/752: chown d3/d18/d2c/f7e 3395 1 2026-03-09T00:03:59.792 INFO:tasks.workunit.client.1.vm06.stdout:0/753: chown d3/d18/d1f/d39/d3b/df9/df2/fec 4325326 1 2026-03-09T00:03:59.792 INFO:tasks.workunit.client.1.vm06.stdout:0/754: write d3/d18/d1f/d39/d49/d60/f92 [295425,14922] 0 2026-03-09T00:03:59.792 INFO:tasks.workunit.client.0.vm03.stdout:3/361: creat d2/db/f67 x:0 0 0 
2026-03-09T00:03:59.796 INFO:tasks.workunit.client.0.vm03.stdout:3/362: write d2/db/f25 [73064,107800] 0 2026-03-09T00:03:59.810 INFO:tasks.workunit.client.1.vm06.stdout:2/787: mknod d7/d1a/d3c/cf0 0 2026-03-09T00:03:59.810 INFO:tasks.workunit.client.1.vm06.stdout:6/738: readlink d4/d27/d3e/l6a 0 2026-03-09T00:03:59.810 INFO:tasks.workunit.client.1.vm06.stdout:2/788: mkdir d7/d1a/d39/df1 0 2026-03-09T00:03:59.810 INFO:tasks.workunit.client.1.vm06.stdout:6/739: getdents d4/d8d 0 2026-03-09T00:03:59.810 INFO:tasks.workunit.client.1.vm06.stdout:6/740: fsync d4/d16/d53/f82 0 2026-03-09T00:03:59.810 INFO:tasks.workunit.client.0.vm03.stdout:3/363: mkdir d2/db/d40/d44/d68 0 2026-03-09T00:03:59.810 INFO:tasks.workunit.client.0.vm03.stdout:3/364: rename d2/db/d2d/f2f to d2/db/d3b/d3f/f69 0 2026-03-09T00:03:59.810 INFO:tasks.workunit.client.0.vm03.stdout:3/365: mkdir d2/db/d6a 0 2026-03-09T00:03:59.810 INFO:tasks.workunit.client.0.vm03.stdout:3/366: fdatasync d2/f5 0 2026-03-09T00:03:59.810 INFO:tasks.workunit.client.0.vm03.stdout:3/367: unlink d2/f4b 0 2026-03-09T00:03:59.810 INFO:tasks.workunit.client.0.vm03.stdout:3/368: creat d2/db/d40/f6b x:0 0 0 2026-03-09T00:03:59.815 INFO:tasks.workunit.client.1.vm06.stdout:6/741: dread d4/d27/d3e/f41 [0,4194304] 0 2026-03-09T00:03:59.815 INFO:tasks.workunit.client.1.vm06.stdout:6/742: mknod d4/d16/d53/d67/ce4 0 2026-03-09T00:03:59.816 INFO:tasks.workunit.client.1.vm06.stdout:6/743: symlink d4/d8d/le5 0 2026-03-09T00:03:59.820 INFO:tasks.workunit.client.0.vm03.stdout:4/576: dwrite d7/d20/d6a/fba [0,4194304] 0 2026-03-09T00:03:59.827 INFO:tasks.workunit.client.1.vm06.stdout:6/744: rmdir d4/d16/d46/d90 39 2026-03-09T00:03:59.835 INFO:tasks.workunit.client.1.vm06.stdout:6/745: mknod d4/d16/d46/ce6 0 2026-03-09T00:03:59.836 INFO:tasks.workunit.client.0.vm03.stdout:5/490: dwrite fb [4194304,4194304] 0 2026-03-09T00:03:59.836 INFO:tasks.workunit.client.0.vm03.stdout:1/570: dwrite d4/d3a/d61/f75 [0,4194304] 0 2026-03-09T00:03:59.836 INFO:tasks.workunit.client.0.vm03.stdout:5/491: creat d1c/d20/d55/d4f/d58/fa0 x:0 0 0 2026-03-09T00:03:59.836 INFO:tasks.workunit.client.0.vm03.stdout:1/571: mknod d4/d6/d52/db5/cc8 0 2026-03-09T00:03:59.836 INFO:tasks.workunit.client.0.vm03.stdout:5/492: mkdir d1c/d20/d56/da1 0 2026-03-09T00:03:59.836 INFO:tasks.workunit.client.0.vm03.stdout:1/572: symlink d4/d6/d52/lc9 0 2026-03-09T00:03:59.836 INFO:tasks.workunit.client.0.vm03.stdout:5/493: creat d1c/d20/d55/d4f/d58/d73/d76/d91/fa2 x:0 0 0 2026-03-09T00:03:59.836 INFO:tasks.workunit.client.0.vm03.stdout:5/494: creat d1c/d20/fa3 x:0 0 0 2026-03-09T00:03:59.836 INFO:tasks.workunit.client.0.vm03.stdout:5/495: chown d1c/d20/d55/l40 7009 1 2026-03-09T00:03:59.836 INFO:tasks.workunit.client.0.vm03.stdout:5/496: write d1c/d20/d55/d4f/d58/d5d/f64 [582392,119343] 0 2026-03-09T00:03:59.836 INFO:tasks.workunit.client.0.vm03.stdout:5/497: chown d1c/d20/c23 1514008 1 2026-03-09T00:03:59.836 INFO:tasks.workunit.client.0.vm03.stdout:5/498: chown d1c/d20/d55 19006 1 2026-03-09T00:03:59.836 INFO:tasks.workunit.client.0.vm03.stdout:5/499: truncate d1c/d20/d55/f7d 953764 0 2026-03-09T00:03:59.836 INFO:tasks.workunit.client.0.vm03.stdout:5/500: getdents d1c/d20/d55/d4f/d58/d73/d76/d8e 0 2026-03-09T00:03:59.837 INFO:tasks.workunit.client.0.vm03.stdout:1/573: dread d4/d3a/f48 [0,4194304] 0 2026-03-09T00:03:59.838 INFO:tasks.workunit.client.0.vm03.stdout:1/574: symlink d4/d6/lca 0 2026-03-09T00:03:59.846 INFO:tasks.workunit.client.1.vm06.stdout:0/755: write d3/d18/d1f/d39/d69/fb4 [549861,120095] 0 
2026-03-09T00:03:59.846 INFO:tasks.workunit.client.1.vm06.stdout:0/756: mknod d3/d18/d1f/d39/d49/d60/cff 0 2026-03-09T00:03:59.846 INFO:tasks.workunit.client.1.vm06.stdout:0/757: creat d3/d18/de9/f100 x:0 0 0 2026-03-09T00:03:59.849 INFO:tasks.workunit.client.0.vm03.stdout:5/501: dread d1c/d20/f39 [0,4194304] 0 2026-03-09T00:03:59.853 INFO:tasks.workunit.client.0.vm03.stdout:5/502: write d1c/d20/d55/f3d [581620,68597] 0 2026-03-09T00:03:59.859 INFO:tasks.workunit.client.0.vm03.stdout:5/503: symlink d1c/d20/d55/d4f/d58/d73/d9e/la4 0 2026-03-09T00:03:59.859 INFO:tasks.workunit.client.0.vm03.stdout:5/504: truncate d1c/d20/d56/d74/f9a 953043 0 2026-03-09T00:03:59.860 INFO:tasks.workunit.client.0.vm03.stdout:5/505: mkdir d1c/d20/d55/d4f/d58/d73/d9e/da5 0 2026-03-09T00:03:59.860 INFO:tasks.workunit.client.0.vm03.stdout:5/506: chown d1c/d20/d55/d66/d6b/d8f/f98 9 1 2026-03-09T00:03:59.860 INFO:tasks.workunit.client.0.vm03.stdout:5/507: chown d1c/d20/d55/d3b/c7b 2455133 1 2026-03-09T00:03:59.860 INFO:tasks.workunit.client.0.vm03.stdout:5/508: truncate d1c/d20/d56/d74/f84 70807 0 2026-03-09T00:03:59.860 INFO:tasks.workunit.client.0.vm03.stdout:5/509: write d1c/d20/f39 [4786868,79925] 0 2026-03-09T00:03:59.861 INFO:tasks.workunit.client.0.vm03.stdout:5/510: write d1c/d20/d55/d43/f4d [786978,18364] 0 2026-03-09T00:03:59.861 INFO:tasks.workunit.client.0.vm03.stdout:5/511: creat d1c/d20/d55/d4f/d58/fa6 x:0 0 0 2026-03-09T00:03:59.865 INFO:tasks.workunit.client.0.vm03.stdout:5/512: mkdir d1c/d20/d55/d43/da7 0 2026-03-09T00:03:59.892 INFO:tasks.workunit.client.0.vm03.stdout:9/531: dwrite d15/d1c/d28/fa7 [0,4194304] 0 2026-03-09T00:03:59.930 INFO:tasks.workunit.client.0.vm03.stdout:6/464: dwrite d13/d35/f82 [0,4194304] 0 2026-03-09T00:03:59.930 INFO:tasks.workunit.client.0.vm03.stdout:7/469: dwrite d2/d1f/d3a/f5d [0,4194304] 0 2026-03-09T00:03:59.931 INFO:tasks.workunit.client.0.vm03.stdout:8/512: dwrite d7/df/f53 [0,4194304] 0 2026-03-09T00:03:59.934 INFO:tasks.workunit.client.0.vm03.stdout:6/465: truncate d13/d35/f6a 432404 0 2026-03-09T00:03:59.934 INFO:tasks.workunit.client.0.vm03.stdout:0/459: dwrite d2/da/d1a/f56 [0,4194304] 0 2026-03-09T00:03:59.966 INFO:tasks.workunit.client.0.vm03.stdout:7/470: mkdir d2/d4/d8c 0 2026-03-09T00:03:59.968 INFO:tasks.workunit.client.0.vm03.stdout:8/513: creat d7/df/d1a/d40/d9d/da3/fa6 x:0 0 0 2026-03-09T00:03:59.968 INFO:tasks.workunit.client.0.vm03.stdout:8/514: getdents d7/df/d1e/d38/d4c/d98 0 2026-03-09T00:03:59.968 INFO:tasks.workunit.client.0.vm03.stdout:8/515: fdatasync d7/df/d1a/f1c 0 2026-03-09T00:03:59.975 INFO:tasks.workunit.client.0.vm03.stdout:6/466: dread f8 [0,4194304] 0 2026-03-09T00:03:59.979 INFO:tasks.workunit.client.0.vm03.stdout:0/460: getdents d2/da/dd/d49/d6c/d81 0 2026-03-09T00:03:59.979 INFO:tasks.workunit.client.0.vm03.stdout:0/461: unlink d2/da/dd/d49/d6c/l4a 0 2026-03-09T00:03:59.981 INFO:tasks.workunit.client.1.vm06.stdout:9/626: dwrite d1/d3/f23 [0,4194304] 0 2026-03-09T00:03:59.988 INFO:tasks.workunit.client.0.vm03.stdout:6/467: getdents d13/d1e/d44/d4a/d52 0 2026-03-09T00:03:59.988 INFO:tasks.workunit.client.0.vm03.stdout:6/468: readlink d13/d1e/l56 0 2026-03-09T00:03:59.990 INFO:tasks.workunit.client.0.vm03.stdout:8/516: dread d7/df/d1a/d2b/f44 [0,4194304] 0 2026-03-09T00:03:59.992 INFO:tasks.workunit.client.0.vm03.stdout:0/462: symlink d2/da/dd/d49/d6c/d81/lac 0 2026-03-09T00:03:59.996 INFO:tasks.workunit.client.0.vm03.stdout:8/517: creat d7/df/d1e/d38/d4c/d98/fa7 x:0 0 0 2026-03-09T00:04:00.000 
INFO:tasks.workunit.client.0.vm03.stdout:0/463: mkdir d2/da/dd/d49/d6c/d4b/d55/d6f/dad 0 2026-03-09T00:04:00.004 INFO:tasks.workunit.client.0.vm03.stdout:8/518: write d7/df/d1a/d40/f4d [4002465,40912] 0 2026-03-09T00:04:00.004 INFO:tasks.workunit.client.0.vm03.stdout:8/519: chown d7/df/d1a/f93 97 1 2026-03-09T00:04:00.004 INFO:tasks.workunit.client.0.vm03.stdout:8/520: dread - d7/df/d1e/d38/d4c/f97 zero size 2026-03-09T00:04:00.004 INFO:tasks.workunit.client.0.vm03.stdout:8/521: fdatasync d7/df/d1a/d2b/f8d 0 2026-03-09T00:04:00.004 INFO:tasks.workunit.client.0.vm03.stdout:8/522: fsync d7/df/d1a/d40/f78 0 2026-03-09T00:04:00.004 INFO:tasks.workunit.client.0.vm03.stdout:8/523: truncate d7/df/d1a/f4f 199460 0 2026-03-09T00:04:00.008 INFO:tasks.workunit.client.0.vm03.stdout:0/464: symlink d2/d71/lae 0 2026-03-09T00:04:00.011 INFO:tasks.workunit.client.0.vm03.stdout:8/524: mknod d7/df/d1a/d2b/ca8 0 2026-03-09T00:04:00.014 INFO:tasks.workunit.client.0.vm03.stdout:5/513: dwrite d1c/d20/f3e [0,4194304] 0 2026-03-09T00:04:00.014 INFO:tasks.workunit.client.0.vm03.stdout:0/465: mkdir d2/da/dd/d49/d6c/d4b/daf 0 2026-03-09T00:04:00.014 INFO:tasks.workunit.client.0.vm03.stdout:0/466: creat d2/da/d1a/fb0 x:0 0 0 2026-03-09T00:04:00.015 INFO:tasks.workunit.client.1.vm06.stdout:1/646: dwrite d6/d21/d2d/d3b/d87/d9d/fa3 [0,4194304] 0 2026-03-09T00:04:00.015 INFO:tasks.workunit.client.1.vm06.stdout:1/647: read - d6/d4c/fde zero size 2026-03-09T00:04:00.015 INFO:tasks.workunit.client.1.vm06.stdout:1/648: write d6/d4c/d71/f45 [1160087,94236] 0 2026-03-09T00:04:00.015 INFO:tasks.workunit.client.1.vm06.stdout:1/649: read d6/d21/d2d/d3b/d87/f9e [3758250,44629] 0 2026-03-09T00:04:00.015 INFO:tasks.workunit.client.1.vm06.stdout:1/650: write d6/d4c/f90 [1225752,5222] 0 2026-03-09T00:04:00.017 INFO:tasks.workunit.client.0.vm03.stdout:8/525: mkdir d7/df/d1a/d40/d9d/da9 0 2026-03-09T00:04:00.017 INFO:tasks.workunit.client.0.vm03.stdout:8/526: write d7/df/f29 [281512,46524] 0 2026-03-09T00:04:00.017 INFO:tasks.workunit.client.0.vm03.stdout:8/527: readlink d7/df/d1e/d38/d4c/l8b 0 2026-03-09T00:04:00.034 INFO:tasks.workunit.client.0.vm03.stdout:1/575: dwrite d4/d15/f8a [0,4194304] 0 2026-03-09T00:04:00.038 INFO:tasks.workunit.client.0.vm03.stdout:0/467: truncate d2/da/d36/da4/f43 747796 0 2026-03-09T00:04:00.043 INFO:tasks.workunit.client.0.vm03.stdout:4/577: dwrite d7/d20/f34 [0,4194304] 0 2026-03-09T00:04:00.043 INFO:tasks.workunit.client.1.vm06.stdout:9/627: dwrite d1/d4/d6e/d9/fc3 [0,4194304] 0 2026-03-09T00:04:00.044 INFO:tasks.workunit.client.0.vm03.stdout:1/576: truncate d4/d3a/d61/d78/f8e 3199584 0 2026-03-09T00:04:00.046 INFO:tasks.workunit.client.0.vm03.stdout:9/532: creat d15/d1c/fb2 x:0 0 0 2026-03-09T00:04:00.046 INFO:tasks.workunit.client.0.vm03.stdout:9/533: creat d15/d7f/fb3 x:0 0 0 2026-03-09T00:04:00.046 INFO:tasks.workunit.client.0.vm03.stdout:9/534: chown d15/d1c/d36/d4d/c56 422 1 2026-03-09T00:04:00.046 INFO:tasks.workunit.client.0.vm03.stdout:9/535: chown d15/d1c/d21/f25 3991003 1 2026-03-09T00:04:00.047 INFO:tasks.workunit.client.0.vm03.stdout:9/536: dread d15/d1c/d28/d6e/f7c [0,4194304] 0 2026-03-09T00:04:00.054 INFO:tasks.workunit.client.1.vm06.stdout:5/844: rename d5/d1c/d21/d28/f33 to d5/d44/f11c 0 2026-03-09T00:04:00.057 INFO:tasks.workunit.client.1.vm06.stdout:4/702: sync 2026-03-09T00:04:00.066 INFO:tasks.workunit.client.0.vm03.stdout:4/578: creat d7/d20/d29/fbd x:0 0 0 2026-03-09T00:04:00.070 INFO:tasks.workunit.client.1.vm06.stdout:8/765: rename db/d53/d70/d38/c6c to db/d53/d6d/d7b/cf9 0 
2026-03-09T00:04:00.070 INFO:tasks.workunit.client.1.vm06.stdout:8/766: chown db/d53/d70/d38/d4d/d79/dd5 532766535 1 2026-03-09T00:04:00.071 INFO:tasks.workunit.client.0.vm03.stdout:1/577: unlink d4/d6/f8 0 2026-03-09T00:04:00.071 INFO:tasks.workunit.client.0.vm03.stdout:1/578: write d4/d15/d77/fbd [36266,78185] 0 2026-03-09T00:04:00.073 INFO:tasks.workunit.client.1.vm06.stdout:9/628: getdents d1/d4/d6e/d9 0 2026-03-09T00:04:00.078 INFO:tasks.workunit.client.1.vm06.stdout:7/718: sync 2026-03-09T00:04:00.079 INFO:tasks.workunit.client.1.vm06.stdout:5/845: truncate d5/f43 621299 0 2026-03-09T00:04:00.081 INFO:tasks.workunit.client.0.vm03.stdout:0/468: dwrite d2/da/dd/d49/d6c/f5c [0,4194304] 0 2026-03-09T00:04:00.084 INFO:tasks.workunit.client.0.vm03.stdout:4/579: rename d7/d20/c2b to d7/d20/d6a/d77/d25/cbe 0 2026-03-09T00:04:00.089 INFO:tasks.workunit.client.1.vm06.stdout:6/746: rename d4/d16/d53/ddf/d7e/c7c to d4/d16/d53/ddf/ce7 0 2026-03-09T00:04:00.093 INFO:tasks.workunit.client.0.vm03.stdout:1/579: mkdir d4/d15/dae/dcb 0 2026-03-09T00:04:00.094 INFO:tasks.workunit.client.1.vm06.stdout:9/629: rmdir d1/d3/d4f/d91/dae 39 2026-03-09T00:04:00.099 INFO:tasks.workunit.client.1.vm06.stdout:7/719: creat d0/df/d1a/d27/d4c/d40/d51/d90/dcc/fce x:0 0 0 2026-03-09T00:04:00.099 INFO:tasks.workunit.client.1.vm06.stdout:7/720: fsync d0/df/d1a/d3a/f3c 0 2026-03-09T00:04:00.103 INFO:tasks.workunit.client.0.vm03.stdout:9/537: dwrite d15/d1c/d21/d75/fa4 [0,4194304] 0 2026-03-09T00:04:00.111 INFO:tasks.workunit.client.1.vm06.stdout:5/846: mknod d5/d44/d4b/d92/d49/da0/c11d 0 2026-03-09T00:04:00.113 INFO:tasks.workunit.client.1.vm06.stdout:5/847: dread d5/d44/f81 [0,4194304] 0 2026-03-09T00:04:00.113 INFO:tasks.workunit.client.1.vm06.stdout:5/848: dread - d5/d1c/d21/d28/d5e/f10d zero size 2026-03-09T00:04:00.120 INFO:tasks.workunit.client.1.vm06.stdout:1/651: rename d6/d4c/fde to d6/db0/fdf 0 2026-03-09T00:04:00.127 INFO:tasks.workunit.client.1.vm06.stdout:6/747: unlink d4/c14 0 2026-03-09T00:04:00.131 INFO:tasks.workunit.client.0.vm03.stdout:0/469: mknod d2/da/d1a/cb1 0 2026-03-09T00:04:00.133 INFO:tasks.workunit.client.1.vm06.stdout:8/767: dwrite db/d1e/fda [0,4194304] 0 2026-03-09T00:04:00.139 INFO:tasks.workunit.client.1.vm06.stdout:9/630: symlink d1/d4/d6e/d14/d25/lce 0 2026-03-09T00:04:00.145 INFO:tasks.workunit.client.1.vm06.stdout:8/768: write db/dd/f27 [4188809,118087] 0 2026-03-09T00:04:00.147 INFO:tasks.workunit.client.1.vm06.stdout:2/789: dwrite d7/d1a/d25/d66/f84 [0,4194304] 0 2026-03-09T00:04:00.161 INFO:tasks.workunit.client.1.vm06.stdout:7/721: mknod d0/ccf 0 2026-03-09T00:04:00.162 INFO:tasks.workunit.client.0.vm03.stdout:4/580: rename d7/d20/d29/d38/fb4 to d7/d20/fbf 0 2026-03-09T00:04:00.162 INFO:tasks.workunit.client.0.vm03.stdout:4/581: chown d7/d20/db3 75 1 2026-03-09T00:04:00.162 INFO:tasks.workunit.client.0.vm03.stdout:4/582: write d7/d20/d29/fa0 [238425,29312] 0 2026-03-09T00:04:00.162 INFO:tasks.workunit.client.0.vm03.stdout:4/583: write d7/d20/d6a/d77/d25/fb8 [1636714,95973] 0 2026-03-09T00:04:00.165 INFO:tasks.workunit.client.0.vm03.stdout:1/580: symlink d4/d15/lcc 0 2026-03-09T00:04:00.165 INFO:tasks.workunit.client.0.vm03.stdout:1/581: write d4/d3a/d3d/d46/f5d [476505,22686] 0 2026-03-09T00:04:00.176 INFO:tasks.workunit.client.0.vm03.stdout:0/470: dwrite f0 [0,4194304] 0 2026-03-09T00:04:00.176 INFO:tasks.workunit.client.0.vm03.stdout:0/471: creat d2/da/d76/fb2 x:0 0 0 2026-03-09T00:04:00.178 INFO:tasks.workunit.client.0.vm03.stdout:9/538: symlink d15/d1c/d28/lb4 0 
2026-03-09T00:04:00.182 INFO:tasks.workunit.client.1.vm06.stdout:9/631: mkdir d1/d73/dcf 0 2026-03-09T00:04:00.186 INFO:tasks.workunit.client.1.vm06.stdout:0/758: sync 2026-03-09T00:04:00.186 INFO:tasks.workunit.client.1.vm06.stdout:3/730: sync 2026-03-09T00:04:00.187 INFO:tasks.workunit.client.0.vm03.stdout:3/369: sync 2026-03-09T00:04:00.187 INFO:tasks.workunit.client.0.vm03.stdout:2/489: sync 2026-03-09T00:04:00.189 INFO:tasks.workunit.client.0.vm03.stdout:4/584: mknod d7/d20/d29/d54/cc0 0 2026-03-09T00:04:00.190 INFO:tasks.workunit.client.0.vm03.stdout:4/585: write d7/d20/d29/fa0 [14882,7955] 0 2026-03-09T00:04:00.194 INFO:tasks.workunit.client.0.vm03.stdout:1/582: link d4/d3a/f48 d4/d3a/d32/da3/fcd 0 2026-03-09T00:04:00.194 INFO:tasks.workunit.client.0.vm03.stdout:4/586: write d7/f62 [2728033,117981] 0 2026-03-09T00:04:00.195 INFO:tasks.workunit.client.1.vm06.stdout:2/790: dwrite d7/da/db/de/f32 [0,4194304] 0 2026-03-09T00:04:00.202 INFO:tasks.workunit.client.1.vm06.stdout:7/722: mknod d0/df/d17/dba/cd0 0 2026-03-09T00:04:00.202 INFO:tasks.workunit.client.1.vm06.stdout:7/723: truncate d0/df/d1a/d3a/d4e/fa4 552834 0 2026-03-09T00:04:00.202 INFO:tasks.workunit.client.1.vm06.stdout:7/724: write d0/df/d7b/fc0 [709720,30113] 0 2026-03-09T00:04:00.204 INFO:tasks.workunit.client.1.vm06.stdout:7/725: dread d0/f4f [0,4194304] 0 2026-03-09T00:04:00.207 INFO:tasks.workunit.client.1.vm06.stdout:1/652: mkdir d6/d4c/de0 0 2026-03-09T00:04:00.207 INFO:tasks.workunit.client.1.vm06.stdout:1/653: truncate d6/d21/d2d/d3b/d87/d9d/dd8/fdb 170755 0 2026-03-09T00:04:00.207 INFO:tasks.workunit.client.1.vm06.stdout:1/654: chown d6/d21/d2d/d3b/d42/cd9 115080 1 2026-03-09T00:04:00.207 INFO:tasks.workunit.client.0.vm03.stdout:6/469: sync 2026-03-09T00:04:00.207 INFO:tasks.workunit.client.0.vm03.stdout:6/470: getdents d13/d1e/d44/d4a/d52 0 2026-03-09T00:04:00.207 INFO:tasks.workunit.client.0.vm03.stdout:0/472: fdatasync d2/da/d36/f58 0 2026-03-09T00:04:00.208 INFO:tasks.workunit.client.1.vm06.stdout:3/731: dread d11/d28/d4d/f6e [0,4194304] 0 2026-03-09T00:04:00.220 INFO:tasks.workunit.client.0.vm03.stdout:9/539: mkdir d15/d1c/d21/db5 0 2026-03-09T00:04:00.222 INFO:tasks.workunit.client.1.vm06.stdout:6/748: mknod d4/d27/d3e/d45/ce8 0 2026-03-09T00:04:00.233 INFO:tasks.workunit.client.1.vm06.stdout:9/632: mknod d1/d4/d6e/d14/d25/d85/cd0 0 2026-03-09T00:04:00.234 INFO:tasks.workunit.client.0.vm03.stdout:8/528: sync 2026-03-09T00:04:00.234 INFO:tasks.workunit.client.0.vm03.stdout:8/529: read f6 [871196,74102] 0 2026-03-09T00:04:00.234 INFO:tasks.workunit.client.0.vm03.stdout:8/530: chown d7/f67 0 1 2026-03-09T00:04:00.235 INFO:tasks.workunit.client.0.vm03.stdout:3/370: rename d2/db/d40/f6b to d2/db/d3b/f6c 0 2026-03-09T00:04:00.235 INFO:tasks.workunit.client.0.vm03.stdout:3/371: dread - d2/db/d3b/d5d/f60 zero size 2026-03-09T00:04:00.235 INFO:tasks.workunit.client.0.vm03.stdout:3/372: dread - d2/db/d3b/f6c zero size 2026-03-09T00:04:00.235 INFO:tasks.workunit.client.0.vm03.stdout:3/373: chown d2/db/d40/f4a 3 1 2026-03-09T00:04:00.242 INFO:tasks.workunit.client.1.vm06.stdout:4/703: sync 2026-03-09T00:04:00.242 INFO:tasks.workunit.client.1.vm06.stdout:0/759: mknod d3/d18/d1f/d39/c101 0 2026-03-09T00:04:00.242 INFO:tasks.workunit.client.0.vm03.stdout:2/490: symlink d8/d1b/d24/la1 0 2026-03-09T00:04:00.242 INFO:tasks.workunit.client.1.vm06.stdout:5/849: write d5/d1c/d23/f5b [1272812,119043] 0 2026-03-09T00:04:00.247 INFO:tasks.workunit.client.1.vm06.stdout:0/760: dread d3/d18/d1f/d39/d3b/f66 [0,4194304] 0 
2026-03-09T00:04:00.253 INFO:tasks.workunit.client.0.vm03.stdout:1/583: mkdir d4/d15/d77/dce 0 2026-03-09T00:04:00.253 INFO:tasks.workunit.client.0.vm03.stdout:5/514: sync 2026-03-09T00:04:00.253 INFO:tasks.workunit.client.0.vm03.stdout:1/584: chown d4/d3a/d43/daf 4 1 2026-03-09T00:04:00.253 INFO:tasks.workunit.client.0.vm03.stdout:7/471: sync 2026-03-09T00:04:00.253 INFO:tasks.workunit.client.1.vm06.stdout:5/850: dread d5/d1c/d23/f4f [4194304,4194304] 0 2026-03-09T00:04:00.253 INFO:tasks.workunit.client.1.vm06.stdout:5/851: chown d5/d1c/d23/l2c 282971 1 2026-03-09T00:04:00.253 INFO:tasks.workunit.client.1.vm06.stdout:5/852: dread - d5/d1c/d21/ff3 zero size 2026-03-09T00:04:00.256 INFO:tasks.workunit.client.0.vm03.stdout:4/587: mknod d7/d20/d29/d78/cc1 0 2026-03-09T00:04:00.256 INFO:tasks.workunit.client.0.vm03.stdout:4/588: readlink d7/l6d 0 2026-03-09T00:04:00.265 INFO:tasks.workunit.client.0.vm03.stdout:1/585: dread d4/d3a/d61/d78/f8e [0,4194304] 0 2026-03-09T00:04:00.274 INFO:tasks.workunit.client.0.vm03.stdout:0/473: symlink d2/da/dd/d49/d6c/d4b/d55/lb3 0 2026-03-09T00:04:00.278 INFO:tasks.workunit.client.0.vm03.stdout:4/589: read d7/d20/d29/d38/f6e [1287199,44730] 0 2026-03-09T00:04:00.283 INFO:tasks.workunit.client.1.vm06.stdout:7/726: mknod d0/df/d1a/d27/d4c/d40/cd1 0 2026-03-09T00:04:00.283 INFO:tasks.workunit.client.1.vm06.stdout:7/727: chown d0/df/d1a/d35/d62/f87 3171 1 2026-03-09T00:04:00.283 INFO:tasks.workunit.client.1.vm06.stdout:7/728: stat d0/df/d1a/d35/f77 0 2026-03-09T00:04:00.286 INFO:tasks.workunit.client.0.vm03.stdout:8/531: rename d7/df/d1a/d2b/f8d to d7/df/d1a/d2b/d62/faa 0 2026-03-09T00:04:00.286 INFO:tasks.workunit.client.0.vm03.stdout:8/532: truncate f6 1433340 0 2026-03-09T00:04:00.295 INFO:tasks.workunit.client.0.vm03.stdout:2/491: symlink d8/d1b/d2a/d56/la2 0 2026-03-09T00:04:00.300 INFO:tasks.workunit.client.0.vm03.stdout:0/474: mkdir d2/da/dd/d49/d6c/db4 0 2026-03-09T00:04:00.303 INFO:tasks.workunit.client.0.vm03.stdout:4/590: symlink d7/lc2 0 2026-03-09T00:04:00.303 INFO:tasks.workunit.client.1.vm06.stdout:3/732: mkdir d11/d28/d2e/db2/d100 0 2026-03-09T00:04:00.311 INFO:tasks.workunit.client.0.vm03.stdout:2/492: dread d8/d1b/d2a/f4c [0,4194304] 0 2026-03-09T00:04:00.316 INFO:tasks.workunit.client.1.vm06.stdout:9/633: symlink d1/d3/d4f/d91/d94/d9e/ld1 0 2026-03-09T00:04:00.327 INFO:tasks.workunit.client.0.vm03.stdout:2/493: write d8/d1b/f1f [2996443,68702] 0 2026-03-09T00:04:00.327 INFO:tasks.workunit.client.0.vm03.stdout:0/475: mkdir d2/da/dd/d49/d6c/d81/db5 0 2026-03-09T00:04:00.329 INFO:tasks.workunit.client.0.vm03.stdout:4/591: symlink d7/d20/db3/lc3 0 2026-03-09T00:04:00.329 INFO:tasks.workunit.client.1.vm06.stdout:2/791: rmdir d7/d1b/d71/d79 39 2026-03-09T00:04:00.330 INFO:tasks.workunit.client.1.vm06.stdout:9/634: dread d1/d3/d4f/d91/d94/f95 [0,4194304] 0 2026-03-09T00:04:00.330 INFO:tasks.workunit.client.1.vm06.stdout:9/635: chown d1/d3/d4f/d91/fc9 121930 1 2026-03-09T00:04:00.330 INFO:tasks.workunit.client.1.vm06.stdout:9/636: write d1/d4/f39 [4567034,59045] 0 2026-03-09T00:04:00.330 INFO:tasks.workunit.client.1.vm06.stdout:7/729: truncate d0/df/d1a/f44 7084947 0 2026-03-09T00:04:00.330 INFO:tasks.workunit.client.1.vm06.stdout:7/730: stat d0/df/fb8 0 2026-03-09T00:04:00.333 INFO:tasks.workunit.client.1.vm06.stdout:4/704: getdents d17/d5b/d8f 0 2026-03-09T00:04:00.334 INFO:tasks.workunit.client.0.vm03.stdout:2/494: creat d8/d1b/d2a/d6b/d50/d8a/fa3 x:0 0 0 2026-03-09T00:04:00.334 INFO:tasks.workunit.client.0.vm03.stdout:2/495: readlink d8/d1b/l29 0 
2026-03-09T00:04:00.340 INFO:tasks.workunit.client.0.vm03.stdout:4/592: mkdir d7/d27/dc4 0
2026-03-09T00:04:00.340 INFO:tasks.workunit.client.0.vm03.stdout:4/593: fsync d7/d6f/f9b 0
2026-03-09T00:04:00.340 INFO:tasks.workunit.client.0.vm03.stdout:4/594: creat d7/d27/fc5 x:0 0 0
2026-03-09T00:04:00.341 INFO:tasks.workunit.client.0.vm03.stdout:3/374: dwrite d2/db/f13 [0,4194304] 0
2026-03-09T00:04:00.343 INFO:tasks.workunit.client.1.vm06.stdout:1/655: dread d6/d21/d2d/d3b/d42/f9a [0,4194304] 0
2026-03-09T00:04:00.343 INFO:tasks.workunit.client.1.vm06.stdout:1/656: write d6/d21/d2d/d3b/d87/d9d/dd8/fdb [971616,43132] 0
2026-03-09T00:04:00.344 INFO:tasks.workunit.client.0.vm03.stdout:2/496: dread d8/d26/f5a [0,4194304] 0
2026-03-09T00:04:00.345 INFO:tasks.workunit.client.0.vm03.stdout:0/476: dread d2/da/dd/d49/d6c/f41 [0,4194304] 0
2026-03-09T00:04:00.345 INFO:tasks.workunit.client.0.vm03.stdout:0/477: truncate d2/da/d36/f58 1822375 0
2026-03-09T00:04:00.345 INFO:tasks.workunit.client.0.vm03.stdout:0/478: stat d2/da/d1a/l9b 0
2026-03-09T00:04:00.349 INFO:tasks.workunit.client.0.vm03.stdout:0/479: write d2/da/f2d [1845458,62365] 0
2026-03-09T00:04:00.360 INFO:tasks.workunit.client.1.vm06.stdout:2/792: creat d7/d1b/d71/d79/db4/dc1/d86/ff2 x:0 0 0
2026-03-09T00:04:00.375 INFO:tasks.workunit.client.1.vm06.stdout:2/793: fsync d7/d1b/da5/dca/fe9 0
2026-03-09T00:04:00.375 INFO:tasks.workunit.client.1.vm06.stdout:7/731: mkdir d0/df/d7b/dd2 0
2026-03-09T00:04:00.375 INFO:tasks.workunit.client.1.vm06.stdout:7/732: chown d0/df/d1a/d27/d4c/d40/d51/d86/fbd 128 1
2026-03-09T00:04:00.375 INFO:tasks.workunit.client.1.vm06.stdout:4/705: mknod d17/d21/d4c/ced 0
2026-03-09T00:04:00.375 INFO:tasks.workunit.client.1.vm06.stdout:1/657: getdents d6 0
2026-03-09T00:04:00.375 INFO:tasks.workunit.client.0.vm03.stdout:3/375: write d2/db/d2d/f52 [1286023,1603] 0
2026-03-09T00:04:00.375 INFO:tasks.workunit.client.0.vm03.stdout:3/376: write d2/db/f25 [1122003,22546] 0
2026-03-09T00:04:00.375 INFO:tasks.workunit.client.0.vm03.stdout:3/377: write d2/db/d3b/d5d/f60 [786371,29890] 0
2026-03-09T00:04:00.376 INFO:tasks.workunit.client.0.vm03.stdout:3/378: fdatasync d2/f9 0
2026-03-09T00:04:00.376 INFO:tasks.workunit.client.0.vm03.stdout:2/497: link d8/f9b d8/d1b/d2a/d56/fa4 0
2026-03-09T00:04:00.378 INFO:tasks.workunit.client.1.vm06.stdout:7/733: creat d0/df/d1a/d3f/fd3 x:0 0 0
2026-03-09T00:04:00.380 INFO:tasks.workunit.client.0.vm03.stdout:0/480: creat d2/da/dd/d49/d6c/d81/db5/fb6 x:0 0 0
2026-03-09T00:04:00.382 INFO:tasks.workunit.client.0.vm03.stdout:6/471: dwrite d13/d1e/d44/d59/d77/f96 [0,4194304] 0
2026-03-09T00:04:00.382 INFO:tasks.workunit.client.1.vm06.stdout:5/853: write d5/d1c/d21/d28/d5e/d66/d78/f7c [148538,20420] 0
2026-03-09T00:04:00.383 INFO:tasks.workunit.client.1.vm06.stdout:6/749: dwrite d4/d16/d53/ddf/d52/fe3 [0,4194304] 0
2026-03-09T00:04:00.383 INFO:tasks.workunit.client.1.vm06.stdout:6/750: readlink d4/d16/d53/ddf/d4b/l73 0
2026-03-09T00:04:00.389 INFO:tasks.workunit.client.1.vm06.stdout:4/706: creat d17/d24/d3b/dbf/dea/fee x:0 0 0
2026-03-09T00:04:00.401 INFO:tasks.workunit.client.1.vm06.stdout:4/707: write d17/d5b/f77 [987549,117660] 0
2026-03-09T00:04:00.401 INFO:tasks.workunit.client.0.vm03.stdout:2/498: rmdir d8/d1b/d8f 39
2026-03-09T00:04:00.401 INFO:tasks.workunit.client.0.vm03.stdout:2/499: chown d8/d1b/d2a/d2e/l51 7988 1
2026-03-09T00:04:00.401 INFO:tasks.workunit.client.0.vm03.stdout:2/500: write d8/d1b/d2a/d6b/f78 [178484,4267] 0
2026-03-09T00:04:00.401 INFO:tasks.workunit.client.0.vm03.stdout:4/595: dread d7/d20/d29/d38/d3a/f4b [0,4194304] 0
2026-03-09T00:04:00.401 INFO:tasks.workunit.client.1.vm06.stdout:7/734: link d0/df/d1a/d27/d4c/d40/d51/d86/fbd d0/df/d17/dba/fd4 0
2026-03-09T00:04:00.401 INFO:tasks.workunit.client.1.vm06.stdout:6/751: getdents d4/d16/d53/ddf/d7e 0
2026-03-09T00:04:00.401 INFO:tasks.workunit.client.1.vm06.stdout:6/752: readlink d4/d27/d3e/l6a 0
2026-03-09T00:04:00.401 INFO:tasks.workunit.client.1.vm06.stdout:1/658: rename d6/d21/d2d/d3b/d42/f4e to d6/d4c/fe1 0
2026-03-09T00:04:00.401 INFO:tasks.workunit.client.1.vm06.stdout:1/659: fdatasync d6/d4c/d79/fa4 0
2026-03-09T00:04:00.401 INFO:tasks.workunit.client.1.vm06.stdout:1/660: fsync d6/d21/f2e 0
2026-03-09T00:04:00.401 INFO:tasks.workunit.client.1.vm06.stdout:1/661: stat d6/d4c/f8e 0
2026-03-09T00:04:00.401 INFO:tasks.workunit.client.1.vm06.stdout:1/662: dread - d6/d21/d2d/fc5 zero size
2026-03-09T00:04:00.405 INFO:tasks.workunit.client.1.vm06.stdout:5/854: dread d5/d1c/d21/d28/d5e/d66/d78/dc8/daa/fe2 [0,4194304] 0
2026-03-09T00:04:00.406 INFO:tasks.workunit.client.0.vm03.stdout:0/481: write d2/da/dd/d49/d6c/f52 [1703130,69822] 0
2026-03-09T00:04:00.406 INFO:tasks.workunit.client.0.vm03.stdout:0/482: chown d2/da/d76 109297631 1
2026-03-09T00:04:00.406 INFO:tasks.workunit.client.0.vm03.stdout:0/483: chown d2/da/d1a/f3a 57304 1
2026-03-09T00:04:00.410 INFO:tasks.workunit.client.0.vm03.stdout:8/533: dwrite d7/f67 [0,4194304] 0
2026-03-09T00:04:00.414 INFO:tasks.workunit.client.0.vm03.stdout:6/472: dread d13/d35/d4c/d62/fa0 [0,4194304] 0
2026-03-09T00:04:00.416 INFO:tasks.workunit.client.1.vm06.stdout:4/708: dread d17/d21/d32/d92/fa4 [0,4194304] 0
2026-03-09T00:04:00.416 INFO:tasks.workunit.client.1.vm06.stdout:7/735: rename d0/df/d1a/d27/d4c/d40/d51/d86/ca9 to d0/df/d1a/d27/d4c/d40/d51/d90/dcc/cd5 0
2026-03-09T00:04:00.416 INFO:tasks.workunit.client.1.vm06.stdout:7/736: stat d0/df/d1a/d3a/d4e/fa4 0
2026-03-09T00:04:00.416 INFO:tasks.workunit.client.1.vm06.stdout:7/737: write d0/df/d1a/f25 [8906587,86126] 0
2026-03-09T00:04:00.426 INFO:tasks.workunit.client.0.vm03.stdout:7/472: dwrite d2/f50 [4194304,4194304] 0
2026-03-09T00:04:00.428 INFO:tasks.workunit.client.0.vm03.stdout:4/596: write d7/d20/d6a/d77/d25/f7f [195741,102425] 0
2026-03-09T00:04:00.432 INFO:tasks.workunit.client.1.vm06.stdout:5/855: getdents d5/d1c/d21/d28 0
2026-03-09T00:04:00.437 INFO:tasks.workunit.client.0.vm03.stdout:1/586: dwrite d4/d6/f33 [0,4194304] 0
2026-03-09T00:04:00.437 INFO:tasks.workunit.client.0.vm03.stdout:1/587: truncate d4/d15/d5c/fb1 89894 0
2026-03-09T00:04:00.437 INFO:tasks.workunit.client.1.vm06.stdout:8/769: sync
2026-03-09T00:04:00.437 INFO:tasks.workunit.client.1.vm06.stdout:8/770: write db/d53/d70/d38/d4d/d79/fd1 [680758,87028] 0
2026-03-09T00:04:00.437 INFO:tasks.workunit.client.1.vm06.stdout:8/771: stat db/d1e/f25 0
2026-03-09T00:04:00.437 INFO:tasks.workunit.client.1.vm06.stdout:4/709: creat d17/d24/d49/de4/db0/ddd/fef x:0 0 0
2026-03-09T00:04:00.441 INFO:tasks.workunit.client.1.vm06.stdout:0/761: dwrite d3/d18/d1f/d39/d3b/df9/f7f [4194304,4194304] 0
2026-03-09T00:04:00.444 INFO:tasks.workunit.client.1.vm06.stdout:7/738: creat d0/df/d1a/d27/d70/fd6 x:0 0 0
2026-03-09T00:04:00.447 INFO:tasks.workunit.client.1.vm06.stdout:7/739: chown d0/df/d1a/d3a/d4e/d5e/f6f 2274 1
2026-03-09T00:04:00.447 INFO:tasks.workunit.client.1.vm06.stdout:7/740: readlink d0/df/d1a/d3a/lb5 0
2026-03-09T00:04:00.450 INFO:tasks.workunit.client.1.vm06.stdout:1/663: rmdir d6/db0 39
2026-03-09T00:04:00.463 INFO:tasks.workunit.client.0.vm03.stdout:3/379: dwrite d2/db/d2d/f36 [0,4194304] 0
2026-03-09T00:04:00.464 INFO:tasks.workunit.client.0.vm03.stdout:3/380: truncate d2/db/d40/d51/f5a 849394 0
2026-03-09T00:04:00.467 INFO:tasks.workunit.client.0.vm03.stdout:2/501: mkdir d8/d1b/d24/da5 0
2026-03-09T00:04:00.477 INFO:tasks.workunit.client.1.vm06.stdout:5/856: truncate d5/d1c/d21/f73 3670452 0
2026-03-09T00:04:00.477 INFO:tasks.workunit.client.1.vm06.stdout:5/857: fdatasync d5/d1c/d21/d28/d5e/d66/d78/dc8/f90 0
2026-03-09T00:04:00.477 INFO:tasks.workunit.client.1.vm06.stdout:8/772: rename db/le0 to db/dd/d84/lfa 0
2026-03-09T00:04:00.484 INFO:tasks.workunit.client.0.vm03.stdout:3/381: write f1 [3665564,31638] 0
2026-03-09T00:04:00.485 INFO:tasks.workunit.client.0.vm03.stdout:3/382: chown d2/db/d40/d44/d68 0 1
2026-03-09T00:04:00.491 INFO:tasks.workunit.client.0.vm03.stdout:5/515: dwrite d1c/f29 [0,4194304] 0
2026-03-09T00:04:00.491 INFO:tasks.workunit.client.0.vm03.stdout:5/516: dread - d1c/d20/d55/d4f/d58/fa6 zero size
2026-03-09T00:04:00.491 INFO:tasks.workunit.client.0.vm03.stdout:5/517: write d1c/d20/d55/f52 [1015743,79867] 0
2026-03-09T00:04:00.504 INFO:tasks.workunit.client.1.vm06.stdout:4/710: symlink d17/d24/d3b/dbf/lf0 0
2026-03-09T00:04:00.504 INFO:tasks.workunit.client.1.vm06.stdout:4/711: dread - d17/d24/d49/de4/fc0 zero size
2026-03-09T00:04:00.508 INFO:tasks.workunit.client.0.vm03.stdout:6/473: rmdir d13/d1e/d44/d4a 39
2026-03-09T00:04:00.508 INFO:tasks.workunit.client.0.vm03.stdout:6/474: fsync d13/f5d 0
2026-03-09T00:04:00.512 INFO:tasks.workunit.client.1.vm06.stdout:1/664: mknod d6/d21/d2d/d3b/d87/ce2 0
2026-03-09T00:04:00.512 INFO:tasks.workunit.client.1.vm06.stdout:0/762: dread d3/d18/d3c/fa0 [0,4194304] 0
2026-03-09T00:04:00.514 INFO:tasks.workunit.client.0.vm03.stdout:7/473: mkdir d2/d1f/d42/d46/d54/d8d 0
2026-03-09T00:04:00.522 INFO:tasks.workunit.client.0.vm03.stdout:4/597: getdents d7/d20/d6a 0
2026-03-09T00:04:00.525 INFO:tasks.workunit.client.1.vm06.stdout:3/733: dwrite d11/d28/ff0 [0,4194304] 0
2026-03-09T00:04:00.527 INFO:tasks.workunit.client.1.vm06.stdout:4/712: unlink d17/d24/d49/de4/fe9 0
2026-03-09T00:04:00.527 INFO:tasks.workunit.client.0.vm03.stdout:5/518: dread d1c/f4c [0,4194304] 0
2026-03-09T00:04:00.528 INFO:tasks.workunit.client.1.vm06.stdout:1/665: dread d6/d21/f3d [0,4194304] 0
2026-03-09T00:04:00.529 INFO:tasks.workunit.client.1.vm06.stdout:1/666: truncate d6/d4c/d71/d83/f9b 611881 0
2026-03-09T00:04:00.535 INFO:tasks.workunit.client.1.vm06.stdout:1/667: dread d6/f25 [0,4194304] 0
2026-03-09T00:04:00.545 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:00 vm06.local ceph-mon[58395]: Manager daemon vm03.yvcons is now available
2026-03-09T00:04:00.545 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:00 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:04:00.545 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:00 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.yvcons/mirror_snapshot_schedule"}]: dispatch
2026-03-09T00:04:00.545 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:00 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:04:00.545 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:00 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.yvcons/trash_purge_schedule"}]: dispatch
2026-03-09T00:04:00.545 INFO:tasks.workunit.client.0.vm03.stdout:1/588: rmdir d4/d3a 39
2026-03-09T00:04:00.545 INFO:tasks.workunit.client.0.vm03.stdout:1/589: fsync d4/f1e 0
2026-03-09T00:04:00.545 INFO:tasks.workunit.client.0.vm03.stdout:2/502: mknod d8/d1b/d24/ca6 0
2026-03-09T00:04:00.545 INFO:tasks.workunit.client.1.vm06.stdout:0/763: rmdir d3/d18/d2c 39
2026-03-09T00:04:00.558 INFO:tasks.workunit.client.0.vm03.stdout:3/383: mkdir d2/db/d3b/d5d/d6d 0
2026-03-09T00:04:00.558 INFO:tasks.workunit.client.1.vm06.stdout:7/741: dwrite d0/df/d1a/d35/f61 [4194304,4194304] 0
2026-03-09T00:04:00.558 INFO:tasks.workunit.client.1.vm06.stdout:7/742: fdatasync d0/df/f8a 0
2026-03-09T00:04:00.558 INFO:tasks.workunit.client.1.vm06.stdout:7/743: write d0/df/d1a/f72 [1148014,96147] 0
2026-03-09T00:04:00.562 INFO:tasks.workunit.client.1.vm06.stdout:3/734: truncate d11/f1a 5397456 0
2026-03-09T00:04:00.564 INFO:tasks.workunit.client.1.vm06.stdout:3/735: chown d11/d28/d2e/d2f/fa3 367358 1
2026-03-09T00:04:00.564 INFO:tasks.workunit.client.1.vm06.stdout:3/736: write d11/d28/d2e/d2f/d36/f59 [2084785,29083] 0
2026-03-09T00:04:00.573 INFO:tasks.workunit.client.0.vm03.stdout:1/590: dread d4/f1e [0,4194304] 0
2026-03-09T00:04:00.588 INFO:tasks.workunit.client.1.vm06.stdout:7/744: creat d0/df/d1a/d27/d4c/d40/d5b/fd7 x:0 0 0
2026-03-09T00:04:00.588 INFO:tasks.workunit.client.1.vm06.stdout:7/745: chown d0/df/d1a/d27/d4c/d40/fa5 1124827 1
2026-03-09T00:04:00.596 INFO:tasks.workunit.client.1.vm06.stdout:3/737: unlink d11/d28/d2e/d2f/l39 0
2026-03-09T00:04:00.600 INFO:tasks.workunit.client.1.vm06.stdout:3/738: dread d11/d28/d2e/d2f/d5b/d94/fb3 [0,4194304] 0
2026-03-09T00:04:00.602 INFO:tasks.workunit.client.1.vm06.stdout:2/794: dwrite d7/da/d1c/f70 [0,4194304] 0
2026-03-09T00:04:00.604 INFO:tasks.workunit.client.0.vm03.stdout:7/474: dwrite d2/d1f/d42/d43/f5f [0,4194304] 0
2026-03-09T00:04:00.604 INFO:tasks.workunit.client.0.vm03.stdout:7/475: getdents d2/d1f/d42/d46/d54/d8d 0
2026-03-09T00:04:00.607 INFO:tasks.workunit.client.0.vm03.stdout:0/484: dwrite d2/fb [0,4194304] 0
2026-03-09T00:04:00.612 INFO:tasks.workunit.client.1.vm06.stdout:5/858: dwrite d5/d1c/d23/f42 [4194304,4194304] 0
2026-03-09T00:04:00.613 INFO:tasks.workunit.client.1.vm06.stdout:0/764: symlink d3/d18/d1f/l102 0
2026-03-09T00:04:00.618 INFO:tasks.workunit.client.1.vm06.stdout:3/739: dread d11/d28/d2e/d7e/d83/fe8 [0,4194304] 0
2026-03-09T00:04:00.618 INFO:tasks.workunit.client.0.vm03.stdout:4/598: dwrite d7/d20/f3d [0,4194304] 0
2026-03-09T00:04:00.618 INFO:tasks.workunit.client.0.vm03.stdout:4/599: write d7/d20/d35/fb5 [1023261,71698] 0
2026-03-09T00:04:00.618 INFO:tasks.workunit.client.0.vm03.stdout:4/600: fdatasync d7/d20/d29/d4e/f9d 0
2026-03-09T00:04:00.618 INFO:tasks.workunit.client.0.vm03.stdout:4/601: write d7/d20/d29/fa4 [840145,130170] 0
2026-03-09T00:04:00.619 INFO:tasks.workunit.client.1.vm06.stdout:4/713: dread d17/d5b/f64 [0,4194304] 0
2026-03-09T00:04:00.619 INFO:tasks.workunit.client.1.vm06.stdout:5/859: write d5/d44/d4b/d92/f52 [32440,104214] 0
2026-03-09T00:04:00.619 INFO:tasks.workunit.client.0.vm03.stdout:4/602: creat d7/d20/d29/d54/fc6 x:0 0 0
2026-03-09T00:04:00.624 INFO:tasks.workunit.client.1.vm06.stdout:7/746: mknod d0/df/d7b/dd2/cd8 0
2026-03-09T00:04:00.629 INFO:tasks.workunit.client.1.vm06.stdout:3/740: dread d11/d3f/f54 [0,4194304] 0
2026-03-09T00:04:00.629 INFO:tasks.workunit.client.1.vm06.stdout:3/741: write d11/d28/d2e/d2f/d36/f75 [1676676,115279] 0
2026-03-09T00:04:00.636 INFO:tasks.workunit.client.0.vm03.stdout:8/534: dwrite d7/df/f37 [0,4194304] 0
2026-03-09T00:04:00.636 INFO:tasks.workunit.client.0.vm03.stdout:8/535: write d7/df/d1e/f3a [930571,63519] 0
2026-03-09T00:04:00.636 INFO:tasks.workunit.client.0.vm03.stdout:8/536: write d7/d92/f75 [748175,14816] 0
2026-03-09T00:04:00.640 INFO:tasks.workunit.client.0.vm03.stdout:6/475: symlink d13/d35/d4c/la4 0
2026-03-09T00:04:00.651 INFO:tasks.workunit.client.1.vm06.stdout:2/795: rmdir d7/d1b/d71/d79/db4 39
2026-03-09T00:04:00.651 INFO:tasks.workunit.client.0.vm03.stdout:2/503: mknod d8/d26/d5e/ca7 0
2026-03-09T00:04:00.651 INFO:tasks.workunit.client.0.vm03.stdout:1/591: symlink d4/d3a/d61/d78/d81/lcf 0
2026-03-09T00:04:00.651 INFO:tasks.workunit.client.0.vm03.stdout:1/592: chown d4/d3a/d8f/fc4 12531 1
2026-03-09T00:04:00.651 INFO:tasks.workunit.client.0.vm03.stdout:1/593: dread - d4/d3a/d43/f5a zero size
2026-03-09T00:04:00.653 INFO:tasks.workunit.client.0.vm03.stdout:5/519: dwrite d1c/d20/d55/f46 [0,4194304] 0
2026-03-09T00:04:00.661 INFO:tasks.workunit.client.0.vm03.stdout:7/476: mkdir d2/d4/d1e/d5e/d6c/d8e 0
2026-03-09T00:04:00.663 INFO:tasks.workunit.client.0.vm03.stdout:1/594: write d4/fa0 [226934,62793] 0
2026-03-09T00:04:00.663 INFO:tasks.workunit.client.0.vm03.stdout:1/595: chown d4/d3a/d61/d78/d81/cbc 7377 1
2026-03-09T00:04:00.677 INFO:tasks.workunit.client.0.vm03.stdout:9/540: sync
2026-03-09T00:04:00.684 INFO:tasks.workunit.client.0.vm03.stdout:4/603: symlink d7/d6f/da5/lc7 0
2026-03-09T00:04:00.688 INFO:tasks.workunit.client.0.vm03.stdout:6/476: truncate d13/d1e/d44/d59/d77/f94 3966891 0
2026-03-09T00:04:00.688 INFO:tasks.workunit.client.0.vm03.stdout:6/477: fdatasync fb 0
2026-03-09T00:04:00.688 INFO:tasks.workunit.client.0.vm03.stdout:6/478: truncate d13/d1e/f9f 58110 0
2026-03-09T00:04:00.692 INFO:tasks.workunit.client.0.vm03.stdout:2/504: mkdir d8/d1b/d24/da5/da8 0
2026-03-09T00:04:00.693 INFO:tasks.workunit.client.1.vm06.stdout:0/765: rename d3/d18/d28/ff5 to d3/d18/d28/f103 0
2026-03-09T00:04:00.693 INFO:tasks.workunit.client.1.vm06.stdout:0/766: write d3/d18/d1f/d39/d3b/df9/df2/f96 [300303,63680] 0
2026-03-09T00:04:00.697 INFO:tasks.workunit.client.0.vm03.stdout:5/520: dread d1c/d20/d55/d3b/f45 [0,4194304] 0
2026-03-09T00:04:00.699 INFO:tasks.workunit.client.0.vm03.stdout:9/541: read d15/d1c/d28/fa7 [286754,102160] 0
2026-03-09T00:04:00.708 INFO:tasks.workunit.client.0.vm03.stdout:9/542: write d15/f2c [726045,26578] 0
2026-03-09T00:04:00.717 INFO:tasks.workunit.client.0.vm03.stdout:0/485: dwrite d2/f32 [4194304,4194304] 0
2026-03-09T00:04:00.720 INFO:tasks.workunit.client.1.vm06.stdout:3/742: creat d11/d28/d2e/db2/f101 x:0 0 0
2026-03-09T00:04:00.728 INFO:tasks.workunit.client.0.vm03.stdout:1/596: mknod d4/d15/d86/cd0 0
2026-03-09T00:04:00.733 INFO:tasks.workunit.client.0.vm03.stdout:4/604: link d7/d20/d6a/d77/db7/f91 d7/d20/d6a/d77/fc8 0
2026-03-09T00:04:00.738 INFO:tasks.workunit.client.0.vm03.stdout:6/479: mkdir d13/d35/d71/d97/da5 0
2026-03-09T00:04:00.749 INFO:tasks.workunit.client.1.vm06.stdout:4/714: dwrite d17/d24/d3b/d5e/fe5 [0,4194304] 0
2026-03-09T00:04:00.749 INFO:tasks.workunit.client.1.vm06.stdout:5/860: rename d5/d44/d4b to d5/d1c/d68/dec/d115/d11e 0
2026-03-09T00:04:00.749 INFO:tasks.workunit.client.1.vm06.stdout:5/861: readlink d5/d1c/d68/dec/d115/d11e/d92/l5d 0
2026-03-09T00:04:00.751 INFO:tasks.workunit.client.1.vm06.stdout:3/743: creat d11/d28/d2e/d2f/d5b/d94/f102 x:0 0 0
2026-03-09T00:04:00.752 INFO:tasks.workunit.client.1.vm06.stdout:3/744: stat d11/d28/d2e/d2f/d5b/ddb/df1 0
2026-03-09T00:04:00.755 INFO:tasks.workunit.client.0.vm03.stdout:5/521: dwrite fb [0,4194304] 0
2026-03-09T00:04:00.758 INFO:tasks.workunit.client.0.vm03.stdout:0/486: truncate d2/da/dd/f24 1606 0
2026-03-09T00:04:00.758 INFO:tasks.workunit.client.0.vm03.stdout:0/487: write d2/da/dd/d49/d6c/f41 [1276079,15309] 0
2026-03-09T00:04:00.763 INFO:tasks.workunit.client.0.vm03.stdout:4/605: mkdir d7/d27/dc9 0
2026-03-09T00:04:00.763 INFO:tasks.workunit.client.0.vm03.stdout:6/480: link d13/d35/d71/d97/la1 d13/d1e/d44/d59/la6 0
2026-03-09T00:04:00.765 INFO:tasks.workunit.client.0.vm03.stdout:0/488: dread d2/da/dd/f38 [0,4194304] 0
2026-03-09T00:04:00.765 INFO:tasks.workunit.client.0.vm03.stdout:0/489: creat d2/da/d1a/fb7 x:0 0 0
2026-03-09T00:04:00.766 INFO:tasks.workunit.client.0.vm03.stdout:2/505: symlink d8/la9 0
2026-03-09T00:04:00.766 INFO:tasks.workunit.client.0.vm03.stdout:2/506: truncate d8/d26/d5e/d6f/d97/f6e 665759 0
2026-03-09T00:04:00.769 INFO:tasks.workunit.client.0.vm03.stdout:7/477: dwrite d2/d1f/d3a/f1a [4194304,4194304] 0
2026-03-09T00:04:00.771 INFO:tasks.workunit.client.1.vm06.stdout:4/715: mkdir d17/d24/d3b/d97/db7/df1 0
2026-03-09T00:04:00.771 INFO:tasks.workunit.client.1.vm06.stdout:4/716: write d17/d5b/f77 [3768304,69407] 0
2026-03-09T00:04:00.771 INFO:tasks.workunit.client.1.vm06.stdout:4/717: chown d17/d24/d3b/d5e/fe5 61643625 1
2026-03-09T00:04:00.771 INFO:tasks.workunit.client.1.vm06.stdout:4/718: read - d17/d21/d4c/d66/f9a zero size
2026-03-09T00:04:00.771 INFO:tasks.workunit.client.1.vm06.stdout:4/719: fdatasync d17/d24/d3b/dbf/fc1 0
2026-03-09T00:04:00.771 INFO:tasks.workunit.client.1.vm06.stdout:4/720: chown d17/d24/d3b/d97 6 1
2026-03-09T00:04:00.775 INFO:tasks.workunit.client.0.vm03.stdout:7/478: dread d2/d1f/d3a/f5d [0,4194304] 0
2026-03-09T00:04:00.782 INFO:tasks.workunit.client.0.vm03.stdout:9/543: dread d15/d1c/d36/fb1 [0,4194304] 0
2026-03-09T00:04:00.782 INFO:tasks.workunit.client.0.vm03.stdout:9/544: read d15/d1c/d28/f2f [135647,76795] 0
2026-03-09T00:04:00.786 INFO:tasks.workunit.client.0.vm03.stdout:5/522: mknod d1c/d20/d55/ca8 0
2026-03-09T00:04:00.799 INFO:tasks.workunit.client.1.vm06.stdout:7/747: rename d0/df/d1a/f72 to d0/df/d7b/fd9 0
2026-03-09T00:04:00.804 INFO:tasks.workunit.client.1.vm06.stdout:2/796: dwrite d7/f8 [0,4194304] 0
2026-03-09T00:04:00.814 INFO:tasks.workunit.client.1.vm06.stdout:2/797: chown d7/d1a/f30 0 1
2026-03-09T00:04:00.814 INFO:tasks.workunit.client.0.vm03.stdout:6/481: rename d13/d35/f6a to d13/d35/d74/d89/d9d/fa7 0
2026-03-09T00:04:00.814 INFO:tasks.workunit.client.0.vm03.stdout:6/482: dread f8 [0,4194304] 0
2026-03-09T00:04:00.816 INFO:tasks.workunit.client.0.vm03.stdout:4/606: creat d7/d20/d29/d38/fca x:0 0 0
2026-03-09T00:04:00.817 INFO:tasks.workunit.client.0.vm03.stdout:4/607: chown d7/d20/d6a/d77/db7/f91 10 1
2026-03-09T00:04:00.818 INFO:tasks.workunit.client.1.vm06.stdout:5/862: read d5/d1c/f22 [1046574,7771] 0
2026-03-09T00:04:00.827 INFO:tasks.workunit.client.1.vm06.stdout:4/721: dwrite d17/d21/d4c/d66/fa2 [0,4194304] 0
2026-03-09T00:04:00.834 INFO:tasks.workunit.client.1.vm06.stdout:0/767: dwrite d3/d18/d1f/d39/d3b/df9/fba [0,4194304] 0
2026-03-09T00:04:00.834 INFO:tasks.workunit.client.1.vm06.stdout:5/863: read d5/d1c/d68/dec/d115/d11e/ff9 [1797622,70621] 0
2026-03-09T00:04:00.834 INFO:tasks.workunit.client.1.vm06.stdout:5/864: readlink d5/d1c/l58 0
2026-03-09T00:04:00.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:00 vm03.local ceph-mon[52346]: Manager daemon vm03.yvcons is now available
2026-03-09T00:04:00.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:00 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:04:00.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:00 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.yvcons/mirror_snapshot_schedule"}]: dispatch
2026-03-09T00:04:00.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:00 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:04:00.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:00 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.yvcons/trash_purge_schedule"}]: dispatch
2026-03-09T00:04:00.841 INFO:tasks.workunit.client.0.vm03.stdout:9/545: dwrite d15/f7b [0,4194304] 0
2026-03-09T00:04:00.841 INFO:tasks.workunit.client.0.vm03.stdout:9/546: write d15/d1c/d28/f2f [4536171,116240] 0
2026-03-09T00:04:00.841 INFO:tasks.workunit.client.1.vm06.stdout:5/865: dread d5/d44/f108 [0,4194304] 0
2026-03-09T00:04:00.844 INFO:tasks.workunit.client.1.vm06.stdout:3/745: creat d11/d28/d2e/db2/d100/f103 x:0 0 0
2026-03-09T00:04:00.860 INFO:tasks.workunit.client.0.vm03.stdout:0/490: mkdir d2/da/d76/d8a/d8f/db8 0
2026-03-09T00:04:00.866 INFO:tasks.workunit.client.1.vm06.stdout:7/748: mknod d0/df/d1a/d27/d4c/d40/d51/d90/dae/cda 0
2026-03-09T00:04:00.868 INFO:tasks.workunit.client.0.vm03.stdout:2/507: unlink d8/d1b/d2a/d6b/f78 0
2026-03-09T00:04:00.878 INFO:tasks.workunit.client.1.vm06.stdout:2/798: truncate d7/d1b/f22 1632902 0
2026-03-09T00:04:00.882 INFO:tasks.workunit.client.0.vm03.stdout:9/547: dwrite d15/d1c/d28/faa [0,4194304] 0
2026-03-09T00:04:00.883 INFO:tasks.workunit.client.0.vm03.stdout:7/479: link d2/d1f/d42/d43/f4a d2/d1f/d42/d46/d81/f8f 0
2026-03-09T00:04:00.885 INFO:tasks.workunit.client.0.vm03.stdout:7/480: dread d2/d4/f2e [0,4194304] 0
2026-03-09T00:04:00.896 INFO:tasks.workunit.client.1.vm06.stdout:9/637: sync
2026-03-09T00:04:00.897 INFO:tasks.workunit.client.1.vm06.stdout:4/722: creat d17/d24/d49/d5f/db2/ff2 x:0 0 0
2026-03-09T00:04:00.915 INFO:tasks.workunit.client.1.vm06.stdout:0/768: symlink d3/d18/d28/d45/l104 0
2026-03-09T00:04:00.920 INFO:tasks.workunit.client.1.vm06.stdout:5/866: mkdir d5/d1c/d68/da2/d11f 0
2026-03-09T00:04:00.924 INFO:tasks.workunit.client.1.vm06.stdout:3/746: mkdir d11/d28/d2e/d7e/d104 0
2026-03-09T00:04:00.924 INFO:tasks.workunit.client.1.vm06.stdout:3/747: creat d11/d28/d2e/d2f/f105 x:0 0 0
2026-03-09T00:04:00.924 INFO:tasks.workunit.client.1.vm06.stdout:3/748: chown d11/d3f/f4c 1629634 1
2026-03-09T00:04:00.927 INFO:tasks.workunit.client.1.vm06.stdout:7/749: unlink d0/df/f13 0
2026-03-09T00:04:00.927 INFO:tasks.workunit.client.1.vm06.stdout:7/750: truncate d0/df/d1a/d27/d4c/d40/d51/d90/dae/fc9 36245 0
2026-03-09T00:04:00.929 INFO:tasks.workunit.client.1.vm06.stdout:2/799: creat d7/da/d93/ff3 x:0 0 0
2026-03-09T00:04:00.929 INFO:tasks.workunit.client.1.vm06.stdout:2/800: dread - d7/d1b/d71/d79/db4/dc1/fe6 zero size
2026-03-09T00:04:00.929 INFO:tasks.workunit.client.1.vm06.stdout:2/801: read - d7/da/d63/fee zero size
2026-03-09T00:04:00.931 INFO:tasks.workunit.client.1.vm06.stdout:8/773: sync
2026-03-09T00:04:00.931 INFO:tasks.workunit.client.1.vm06.stdout:1/668: sync
2026-03-09T00:04:00.931 INFO:tasks.workunit.client.1.vm06.stdout:6/753: sync
2026-03-09T00:04:00.931 INFO:tasks.workunit.client.1.vm06.stdout:1/669: write d6/d21/d2d/d3b/d42/fb4 [2032009,74090] 0
2026-03-09T00:04:00.931 INFO:tasks.workunit.client.1.vm06.stdout:6/754: read d4/d16/d53/ddf/d7e/dac/fe1 [532404,101366] 0
2026-03-09T00:04:00.933 INFO:tasks.workunit.client.0.vm03.stdout:6/483: mknod d13/d35/d71/d97/da5/ca8 0
2026-03-09T00:04:00.941 INFO:tasks.workunit.client.0.vm03.stdout:4/608: symlink d7/d27/dc9/lcb 0
2026-03-09T00:04:00.941 INFO:tasks.workunit.client.0.vm03.stdout:4/609: fsync d7/d20/d6a/d77/db7/f9f 0
2026-03-09T00:04:00.943 INFO:tasks.workunit.client.1.vm06.stdout:4/723: unlink d17/d24/d3b/d97/fb5 0
2026-03-09T00:04:00.943 INFO:tasks.workunit.client.1.vm06.stdout:4/724: write d17/d21/d32/fd6 [1464931,10627] 0
2026-03-09T00:04:00.944 INFO:tasks.workunit.client.0.vm03.stdout:7/481: dwrite d2/f73 [0,4194304] 0
2026-03-09T00:04:00.946 INFO:tasks.workunit.client.1.vm06.stdout:9/638: dwrite d1/d3/d4f/d91/dae/fb6 [0,4194304] 0
2026-03-09T00:04:00.957 INFO:tasks.workunit.client.1.vm06.stdout:0/769: symlink d3/d18/d2c/d2d/d74/daf/de3/l105 0
2026-03-09T00:04:00.958 INFO:tasks.workunit.client.1.vm06.stdout:0/770: read d3/d18/d2c/d2d/d31/f4f [696264,38846] 0
2026-03-09T00:04:00.959 INFO:tasks.workunit.client.1.vm06.stdout:5/867: creat d5/d1c/d21/f120 x:0 0 0
2026-03-09T00:04:00.965 INFO:tasks.workunit.client.1.vm06.stdout:3/749: symlink d11/d28/d2e/dff/l106 0
2026-03-09T00:04:00.967 INFO:tasks.workunit.client.1.vm06.stdout:7/751: mknod d0/df/d1a/d22/cdb 0
2026-03-09T00:04:00.969 INFO:tasks.workunit.client.1.vm06.stdout:2/802: link d7/da/db/f74 d7/d1a/d39/df1/ff4 0
2026-03-09T00:04:00.969 INFO:tasks.workunit.client.1.vm06.stdout:2/803: write d7/da/db/f98 [792146,98285] 0
2026-03-09T00:04:00.969 INFO:tasks.workunit.client.1.vm06.stdout:2/804: readlink d7/d1a/d25/d97/lac 0
2026-03-09T00:04:00.980 INFO:tasks.workunit.client.1.vm06.stdout:8/774: truncate db/d1e/f2e 1203168 0
2026-03-09T00:04:00.982 INFO:tasks.workunit.client.0.vm03.stdout:5/523: mknod d1c/d20/d55/d43/ca9 0
2026-03-09T00:04:00.985 INFO:tasks.workunit.client.1.vm06.stdout:8/775: read db/d53/d70/f54 [1966654,95711] 0
2026-03-09T00:04:00.986 INFO:tasks.workunit.client.1.vm06.stdout:8/776: fsync db/d53/d6d/d7b/f8a 0
2026-03-09T00:04:00.986 INFO:tasks.workunit.client.1.vm06.stdout:1/670: dwrite d6/d21/d2d/d37/f77 [4194304,4194304] 0
2026-03-09T00:04:01.001 INFO:tasks.workunit.client.1.vm06.stdout:6/755: mknod d4/d16/d53/ddf/d4b/ddb/ce9 0
2026-03-09T00:04:01.001 INFO:tasks.workunit.client.1.vm06.stdout:6/756: dread - d4/d16/fcf zero size
2026-03-09T00:04:01.001 INFO:tasks.workunit.client.1.vm06.stdout:6/757: dread - d4/d16/d53/ddf/fa3 zero size
2026-03-09T00:04:01.001 INFO:tasks.workunit.client.1.vm06.stdout:6/758: read d4/f12 [2428720,87439] 0
2026-03-09T00:04:01.002 INFO:tasks.workunit.client.0.vm03.stdout:3/384: sync
2026-03-09T00:04:01.002 INFO:tasks.workunit.client.0.vm03.stdout:3/385: chown d2/db/d56 1470 1
2026-03-09T00:04:01.005 INFO:tasks.workunit.client.1.vm06.stdout:4/725: chown d17/d24/d49/l33 90150628 1
2026-03-09T00:04:01.008 INFO:tasks.workunit.client.0.vm03.stdout:6/484: dread d13/f1d [0,4194304] 0
2026-03-09T00:04:01.012 INFO:tasks.workunit.client.0.vm03.stdout:4/610: truncate d7/f28 854562 0
2026-03-09T00:04:01.013 INFO:tasks.workunit.client.1.vm06.stdout:5/868: creat d5/d44/f121 x:0 0 0
2026-03-09T00:04:01.013 INFO:tasks.workunit.client.1.vm06.stdout:5/869: fdatasync d5/d1c/d21/d28/d5e/d66/dab/fe3 0
2026-03-09T00:04:01.013 INFO:tasks.workunit.client.1.vm06.stdout:5/870: fdatasync d5/d1c/d68/f8c 0
2026-03-09T00:04:01.013 INFO:tasks.workunit.client.1.vm06.stdout:5/871: chown d5/d1c/d21 266437902 1
2026-03-09T00:04:01.029 INFO:tasks.workunit.client.1.vm06.stdout:3/750: rename d11/d28/d2e/f38 to d11/d28/d2e/d2f/d5b/d5f/d91/f107 0
2026-03-09T00:04:01.029 INFO:tasks.workunit.client.1.vm06.stdout:3/751: truncate d11/d28/d4d/d89/fbe 64690 0
2026-03-09T00:04:01.032 INFO:tasks.workunit.client.1.vm06.stdout:9/639: dwrite d1/f16 [0,4194304] 0
2026-03-09T00:04:01.042 INFO:tasks.workunit.client.1.vm06.stdout:8/777: creat db/dd/d85/d9f/db7/ffb x:0 0 0
2026-03-09T00:04:01.045 INFO:tasks.workunit.client.0.vm03.stdout:5/524: unlink fb 0
2026-03-09T00:04:01.045 INFO:tasks.workunit.client.0.vm03.stdout:5/525: read - d1c/d20/fa3 zero size
2026-03-09T00:04:01.048 INFO:tasks.workunit.client.0.vm03.stdout:5/526: write d1c/d20/d55/f46 [5042693,22163] 0
2026-03-09T00:04:01.055 INFO:tasks.workunit.client.0.vm03.stdout:1/597: sync
2026-03-09T00:04:01.058 INFO:tasks.workunit.client.0.vm03.stdout:8/537: sync
2026-03-09T00:04:01.058 INFO:tasks.workunit.client.0.vm03.stdout:8/538: write d7/df/d1a/d40/d58/f57 [598176,32926] 0
2026-03-09T00:04:01.066 INFO:tasks.workunit.client.0.vm03.stdout:3/386: truncate f1 1093005 0
2026-03-09T00:04:01.069 INFO:tasks.workunit.client.0.vm03.stdout:3/387: write d2/db/f25 [1294687,36809] 0
2026-03-09T00:04:01.071 INFO:tasks.workunit.client.0.vm03.stdout:7/482: dwrite d2/d1f/d3a/f19 [0,4194304] 0
2026-03-09T00:04:01.072 INFO:tasks.workunit.client.1.vm06.stdout:6/759: unlink d4/d16/d53/ddf/fa3 0
2026-03-09T00:04:01.073 INFO:tasks.workunit.client.1.vm06.stdout:6/760: truncate d4/d16/d53/f82 9222724 0
2026-03-09T00:04:01.074 INFO:tasks.workunit.client.0.vm03.stdout:6/485: rename d13/f14 to d13/fa9 0
2026-03-09T00:04:01.074 INFO:tasks.workunit.client.0.vm03.stdout:6/486: write d13/d35/d4c/f99 [953417,21031] 0
2026-03-09T00:04:01.074 INFO:tasks.workunit.client.0.vm03.stdout:6/487: dread - d13/f92 zero size
2026-03-09T00:04:01.076 INFO:tasks.workunit.client.1.vm06.stdout:9/640: dread d1/d3/d4f/f74 [0,4194304] 0
2026-03-09T00:04:01.076 INFO:tasks.workunit.client.1.vm06.stdout:9/641: chown d1/d4/d6e/d14 112614 1
2026-03-09T00:04:01.078 INFO:tasks.workunit.client.1.vm06.stdout:9/642: write d1/f45 [129345,54724] 0
2026-03-09T00:04:01.078 INFO:tasks.workunit.client.1.vm06.stdout:9/643: chown d1/d4/d6e/d9/l79 3500 1
2026-03-09T00:04:01.080 INFO:tasks.workunit.client.1.vm06.stdout:4/726: getdents d17/d24/d3b/d97/db7 0
2026-03-09T00:04:01.113 INFO:tasks.workunit.client.1.vm06.stdout:9/644: dread d1/d4/d6e/d9/f82 [0,4194304] 0
2026-03-09T00:04:01.114 INFO:tasks.workunit.client.1.vm06.stdout:9/645: fdatasync d1/d3/d2b/d58/f5f 0
2026-03-09T00:04:01.114 INFO:tasks.workunit.client.1.vm06.stdout:9/646: write d1/d4/f39 [3265866,121950] 0
2026-03-09T00:04:01.116 INFO:tasks.workunit.client.0.vm03.stdout:3/388: truncate d2/db/d40/d44/f4d 3777617 0
2026-03-09T00:04:01.116 INFO:tasks.workunit.client.0.vm03.stdout:3/389: fsync d2/f16 0
2026-03-09T00:04:01.129 INFO:tasks.workunit.client.1.vm06.stdout:4/727: mkdir d17/d21/d4c/d66/d68/dbe/df3 0
2026-03-09T00:04:01.134 INFO:tasks.workunit.client.1.vm06.stdout:9/647: getdents d1/d4/d2f 0
2026-03-09T00:04:01.134 INFO:tasks.workunit.client.1.vm06.stdout:9/648: write d1/d3/d4f/f71 [34941,47012] 0
2026-03-09T00:04:01.134 INFO:tasks.workunit.client.1.vm06.stdout:6/761: truncate d4/d27/d3e/d78/fc9 3986153 0
2026-03-09T00:04:01.134 INFO:tasks.workunit.client.1.vm06.stdout:7/752: rename d0/df/d1a/d35/d62 to d0/df/d1a/d3a/d4e/d5e/ddc 0
2026-03-09T00:04:01.134 INFO:tasks.workunit.client.1.vm06.stdout:4/728: creat d17/d21/d4c/d50/ff4 x:0 0 0
2026-03-09T00:04:01.136 INFO:tasks.workunit.client.1.vm06.stdout:9/649: truncate d1/d4/d6e/d9/f10 405005 0
2026-03-09T00:04:01.137 INFO:tasks.workunit.client.1.vm06.stdout:6/762: dread d4/d27/d3e/f41 [4194304,4194304] 0
2026-03-09T00:04:01.141 INFO:tasks.workunit.client.1.vm06.stdout:7/753: symlink d0/df/d1a/d27/d4c/d40/d51/d90/ldd 0
2026-03-09T00:04:01.141 INFO:tasks.workunit.client.1.vm06.stdout:4/729: write d17/d24/d49/de4/db0/fe0 [758716,69710] 0
2026-03-09T00:04:01.141 INFO:tasks.workunit.client.1.vm06.stdout:4/730: fsync d17/d5b/f77 0
2026-03-09T00:04:01.141 INFO:tasks.workunit.client.1.vm06.stdout:4/731: write d17/d21/d4c/d50/f9c [1757245,29767] 0
2026-03-09T00:04:01.141 INFO:tasks.workunit.client.1.vm06.stdout:5/872: rename d5/d1c/d21/d28/d5e/d66/d78/da6/lf6 to d5/d1c/d68/dec/l122 0
2026-03-09T00:04:01.142 INFO:tasks.workunit.client.1.vm06.stdout:7/754: dread d0/df/d17/f2d [0,4194304] 0
2026-03-09T00:04:01.142 INFO:tasks.workunit.client.1.vm06.stdout:7/755: write d0/df/d1a/d27/d4c/d40/f41 [1707727,29683] 0
2026-03-09T00:04:01.142 INFO:tasks.workunit.client.1.vm06.stdout:7/756: readlink d0/df/d1a/d22/l2e 0
2026-03-09T00:04:01.150 INFO:tasks.workunit.client.1.vm06.stdout:9/650: creat d1/d3/d4f/fd2 x:0 0 0
2026-03-09T00:04:01.151 INFO:tasks.workunit.client.1.vm06.stdout:8/778: dwrite db/d74/f8e [0,4194304] 0
2026-03-09T00:04:01.151 INFO:tasks.workunit.client.1.vm06.stdout:8/779: read db/d53/d70/f71 [787842,5317] 0
2026-03-09T00:04:01.151 INFO:tasks.workunit.client.1.vm06.stdout:0/771: dwrite d3/d18/d1f/d39/d3b/df9/fba [0,4194304] 0
2026-03-09T00:04:01.161 INFO:tasks.workunit.client.1.vm06.stdout:6/763: stat d4/d16/d53/ddf/da6/ld9 0
2026-03-09T00:04:01.165 INFO:tasks.workunit.client.0.vm03.stdout:4/611: rmdir d7 39
2026-03-09T00:04:01.165 INFO:tasks.workunit.client.0.vm03.stdout:0/491: sync
2026-03-09T00:04:01.173 INFO:tasks.workunit.client.1.vm06.stdout:4/732: mkdir d17/d24/d3b/dbf/ddf/df5 0
2026-03-09T00:04:01.173 INFO:tasks.workunit.client.0.vm03.stdout:2/508: sync
2026-03-09T00:04:01.174 INFO:tasks.workunit.client.0.vm03.stdout:9/548: sync
2026-03-09T00:04:01.177 INFO:tasks.workunit.client.1.vm06.stdout:3/752: dwrite d11/d28/d2e/d2f/d36/f4a [0,4194304] 0
2026-03-09T00:04:01.179 INFO:tasks.workunit.client.1.vm06.stdout:3/753: write d11/d28/d2e/d2f/d36/faf [650576,89664] 0
2026-03-09T00:04:01.179 INFO:tasks.workunit.client.1.vm06.stdout:3/754: chown d11/d28/l97 254977 1
2026-03-09T00:04:01.179 INFO:tasks.workunit.client.1.vm06.stdout:3/755: creat d11/d28/d2e/db2/dc2/f108 x:0 0 0
2026-03-09T00:04:01.179 INFO:tasks.workunit.client.1.vm06.stdout:3/756: chown d11/d28/d2e/d2f/d5b/d94 110335 1
2026-03-09T00:04:01.180 INFO:tasks.workunit.client.1.vm06.stdout:3/757: truncate d11/d3f/f96 563633 0
2026-03-09T00:04:01.186 INFO:tasks.workunit.client.1.vm06.stdout:5/873: mknod d5/d44/d84/c123 0
2026-03-09T00:04:01.186 INFO:tasks.workunit.client.1.vm06.stdout:5/874: fsync d5/d1c/d23/d34/d47/ddd/fe9 0
2026-03-09T00:04:01.186 INFO:tasks.workunit.client.1.vm06.stdout:5/875: chown d5/d44/f108 807298019 1
2026-03-09T00:04:01.186 INFO:tasks.workunit.client.1.vm06.stdout:5/876: stat d5/d44/f4a 0
2026-03-09T00:04:01.188 INFO:tasks.workunit.client.0.vm03.stdout:4/612: mknod d7/d27/dc9/ccc 0
2026-03-09T00:04:01.197 INFO:tasks.workunit.client.0.vm03.stdout:0/492: symlink d2/da/dd/d6e/lb9 0
2026-03-09T00:04:01.197 INFO:tasks.workunit.client.1.vm06.stdout:7/757: rename d0/df/d1a/d3a/d4e/c82 to d0/d55/d85/cde 0
2026-03-09T00:04:01.200 INFO:tasks.workunit.client.1.vm06.stdout:9/651: link d1/d3/f11 d1/d3/d4f/d91/d94/fd3 0
2026-03-09T00:04:01.201 INFO:tasks.workunit.client.1.vm06.stdout:7/758: dread d0/df/d1a/d27/d4c/d40/d5b/faf [0,4194304] 0
2026-03-09T00:04:01.201 INFO:tasks.workunit.client.1.vm06.stdout:7/759: creat d0/df/fdf x:0 0 0
2026-03-09T00:04:01.202 INFO:tasks.workunit.client.0.vm03.stdout:4/613: unlink d7/d6f/da5/fb6 0
2026-03-09T00:04:01.205 INFO:tasks.workunit.client.0.vm03.stdout:0/493: mkdir d2/da/dd/d49/d6c/d81/db5/dba 0
2026-03-09T00:04:01.206 INFO:tasks.workunit.client.0.vm03.stdout:7/483: dwrite d2/d1f/d40/d67/f64 [0,4194304] 0
2026-03-09T00:04:01.206 INFO:tasks.workunit.client.0.vm03.stdout:7/484: rename d2/d4/d1e/d5e to d2/d4/d1e/d5e/d6c/d8e/d90 22
2026-03-09T00:04:01.206 INFO:tasks.workunit.client.0.vm03.stdout:7/485: write d2/d4/d1e/d5e/d6c/f3f [15261,101932] 0
2026-03-09T00:04:01.206 INFO:tasks.workunit.client.0.vm03.stdout:7/486: chown d2/d1f/d42 1848140196 1
2026-03-09T00:04:01.209 INFO:tasks.workunit.client.1.vm06.stdout:1/671: dwrite d6/d4c/d71/d83/f9b [0,4194304] 0
2026-03-09T00:04:01.210 INFO:tasks.workunit.client.1.vm06.stdout:8/780: link db/d1e/f82 db/dd/d85/d9f/ffc 0
2026-03-09T00:04:01.218 INFO:tasks.workunit.client.0.vm03.stdout:7/487: dread d2/d4/fb [0,4194304] 0
2026-03-09T00:04:01.218 INFO:tasks.workunit.client.0.vm03.stdout:4/614: symlink d7/d20/d29/d4e/lcd 0
2026-03-09T00:04:01.218 INFO:tasks.workunit.client.0.vm03.stdout:0/494: truncate d2/da/dd/d49/d6c/d4b/f67 790292 0
2026-03-09T00:04:01.221 INFO:tasks.workunit.client.0.vm03.stdout:4/615: write d7/d20/d6a/d77/db7/fa3 [518048,57854] 0
2026-03-09T00:04:01.221 INFO:tasks.workunit.client.0.vm03.stdout:4/616: read - d7/f5d zero size
2026-03-09T00:04:01.224 INFO:tasks.workunit.client.0.vm03.stdout:4/617: getdents d7/d20/d6a/d77/db7 0
2026-03-09T00:04:01.230 INFO:tasks.workunit.client.0.vm03.stdout:0/495: symlink d2/da/d36/lbb 0
2026-03-09T00:04:01.230 INFO:tasks.workunit.client.0.vm03.stdout:0/496: readlink d2/da/dd/d49/d6c/d4b/d55/lb3 0
2026-03-09T00:04:01.230 INFO:tasks.workunit.client.0.vm03.stdout:0/497: creat d2/da/dd/d49/d6c/d81/db5/dba/fbc x:0 0 0
2026-03-09T00:04:01.231 INFO:tasks.workunit.client.0.vm03.stdout:5/527: dwrite d1c/d20/d55/d66/f83 [0,4194304] 0
2026-03-09T00:04:01.231 INFO:tasks.workunit.client.0.vm03.stdout:5/528: getdents d1c/d20/d55/d66/d6b 0
2026-03-09T00:04:01.234 INFO:tasks.workunit.client.1.vm06.stdout:9/652: write d1/d4/d2f/fa0 [3063177,74073] 0
2026-03-09T00:04:01.235 INFO:tasks.workunit.client.0.vm03.stdout:1/598: dwrite d4/d3a/f26 [0,4194304] 0
2026-03-09T00:04:01.235 INFO:tasks.workunit.client.1.vm06.stdout:9/653: fsync d1/d4/d2f/f43 0
2026-03-09T00:04:01.235 INFO:tasks.workunit.client.1.vm06.stdout:9/654: chown d1/d3/c83 1962166260 1
2026-03-09T00:04:01.235 INFO:tasks.workunit.client.1.vm06.stdout:9/655: dread - d1/d4/d6e/d14/fb2 zero size
2026-03-09T00:04:01.236 INFO:tasks.workunit.client.0.vm03.stdout:4/618: write d7/d20/d6a/fba [720231,13051] 0
2026-03-09T00:04:01.236 INFO:tasks.workunit.client.0.vm03.stdout:4/619: stat d7/d27/dc4 0
2026-03-09T00:04:01.239 INFO:tasks.workunit.client.1.vm06.stdout:9/656: dread d1/d3/fa6 [0,4194304] 0
2026-03-09T00:04:01.240 INFO:tasks.workunit.client.0.vm03.stdout:7/488: dread d2/d1f/d42/d46/f5b [0,4194304] 0
2026-03-09T00:04:01.243 INFO:tasks.workunit.client.0.vm03.stdout:7/489: dread d2/d4/d1e/d5e/d6c/d37/f56 [0,4194304] 0
2026-03-09T00:04:01.246 INFO:tasks.workunit.client.0.vm03.stdout:0/498: rename d2/da/d36/f58 to d2/d5a/fbd 0
2026-03-09T00:04:01.246 INFO:tasks.workunit.client.0.vm03.stdout:0/499: dread - d2/da/d1a/fb0 zero size
2026-03-09T00:04:01.248 INFO:tasks.workunit.client.0.vm03.stdout:2/509: dwrite d8/d1b/d2a/d6b/f89 [0,4194304] 0
2026-03-09T00:04:01.248 INFO:tasks.workunit.client.0.vm03.stdout:2/510: write d8/d1b/f71 [867692,27154] 0
2026-03-09T00:04:01.248 INFO:tasks.workunit.client.0.vm03.stdout:2/511: write d8/d1b/f71 [441243,65740] 0
2026-03-09T00:04:01.248 INFO:tasks.workunit.client.0.vm03.stdout:2/512: dread - d8/d1b/d2a/d6b/f9d zero size
2026-03-09T00:04:01.248 INFO:tasks.workunit.client.0.vm03.stdout:2/513: chown d8/d26/d5e/d6f/d97/f34 0 1
2026-03-09T00:04:01.248 INFO:tasks.workunit.client.0.vm03.stdout:2/514: chown d8/d1b/d24/c4a 10459332 1
2026-03-09T00:04:01.250 INFO:tasks.workunit.client.1.vm06.stdout:0/772: dwrite d3/d18/d1f/d39/d3b/df9/fc1 [0,4194304] 0
2026-03-09T00:04:01.250 INFO:tasks.workunit.client.0.vm03.stdout:9/549: dwrite d15/d1c/d28/f55 [4194304,4194304] 0
2026-03-09T00:04:01.256 INFO:tasks.workunit.client.0.vm03.stdout:9/550: dread d15/d1c/d21/d64/f3d [0,4194304] 0
2026-03-09T00:04:01.256 INFO:tasks.workunit.client.0.vm03.stdout:9/551: stat d15/d1c/d21/d54/dab 0
2026-03-09T00:04:01.257 INFO:tasks.workunit.client.1.vm06.stdout:6/764: mkdir d4/d27/d3e/d45/dea 0
2026-03-09T00:04:01.264 INFO:tasks.workunit.client.0.vm03.stdout:9/552: dread f8 [0,4194304] 0
2026-03-09T00:04:01.269 INFO:tasks.workunit.client.0.vm03.stdout:9/553: chown d15/d1c/d21/db5 126351 1
2026-03-09T00:04:01.269 INFO:tasks.workunit.client.1.vm06.stdout:4/733: symlink d17/d24/d3b/dbf/ddf/lf6 0
2026-03-09T00:04:01.274 INFO:tasks.workunit.client.1.vm06.stdout:3/758: read d11/f12 [2356205,122224] 0
2026-03-09T00:04:01.274 INFO:tasks.workunit.client.1.vm06.stdout:3/759: write d11/d28/d2e/d2f/f78 [171719,128332] 0
2026-03-09T00:04:01.275 INFO:tasks.workunit.client.0.vm03.stdout:1/599: symlink d4/d6/ld1 0
2026-03-09T00:04:01.275 INFO:tasks.workunit.client.0.vm03.stdout:1/600: creat d4/d15/d5c/fd2 x:0 0 0
2026-03-09T00:04:01.278 INFO:tasks.workunit.client.0.vm03.stdout:3/390: write d2/db/d40/d44/f4d [4320835,79146] 0
2026-03-09T00:04:01.284 INFO:tasks.workunit.client.1.vm06.stdout:5/877: creat d5/d1c/d21/d28/d5e/d66/d78/da6/f124 x:0 0 0
2026-03-09T00:04:01.284 INFO:tasks.workunit.client.1.vm06.stdout:0/773: dwrite d3/d18/d1f/d39/d3b/df9/fca [0,4194304] 0
2026-03-09T00:04:01.284 INFO:tasks.workunit.client.1.vm06.stdout:0/774: fdatasync d3/d18/d1f/d39/d3b/f66 0
2026-03-09T00:04:01.291 INFO:tasks.workunit.client.1.vm06.stdout:7/760: mkdir d0/df/d1a/d27/d4c/d40/d51/d90/dae/de0 0
2026-03-09T00:04:01.291 INFO:tasks.workunit.client.1.vm06.stdout:7/761: fsync d0/df/d1a/d3a/f84 0
2026-03-09T00:04:01.291 INFO:tasks.workunit.client.1.vm06.stdout:7/762: write d0/df/d1a/d27/d4c/d40/d51/d86/fc3 [4291676,125718] 0
2026-03-09T00:04:01.291 INFO:tasks.workunit.client.1.vm06.stdout:7/763: write d0/df/d1a/d35/f77 [1947625,57456] 0
2026-03-09T00:04:01.305 INFO:tasks.workunit.client.0.vm03.stdout:4/620: link d7/d20/d6a/d77/d25/f7f d7/d20/fce 0
2026-03-09T00:04:01.308 INFO:tasks.workunit.client.1.vm06.stdout:8/781: truncate db/d74/d78/d98/fbb 1975638 0
2026-03-09T00:04:01.308 INFO:tasks.workunit.client.1.vm06.stdout:8/782: chown db/d74/d87/cea 133835 1
2026-03-09T00:04:01.308 INFO:tasks.workunit.client.1.vm06.stdout:8/783: chown db/dd/d85/d9f/lf3 2 1
2026-03-09T00:04:01.318 INFO:tasks.workunit.client.1.vm06.stdout:9/657: unlink d1/d3/d4f/f71 0
2026-03-09T00:04:01.328 INFO:tasks.workunit.client.0.vm03.stdout:1/601: dwrite d4/d5e/f82 [4194304,4194304] 0
2026-03-09T00:04:01.328 INFO:tasks.workunit.client.0.vm03.stdout:1/602: creat d4/d5e/fd3 x:0 0 0
2026-03-09T00:04:01.331 INFO:tasks.workunit.client.0.vm03.stdout:2/515: symlink d8/d1b/laa 0
2026-03-09T00:04:01.337 INFO:tasks.workunit.client.0.vm03.stdout:2/516: write d8/d1b/d2a/d6b/d50/f54 [1430717,89621] 0
2026-03-09T00:04:01.340 INFO:tasks.workunit.client.0.vm03.stdout:9/554: mkdir d15/db6 0
2026-03-09T00:04:01.341 INFO:tasks.workunit.client.0.vm03.stdout:9/555: truncate d15/d1c/d21/d54/dab/faf 1177484 0
2026-03-09T00:04:01.358 INFO:tasks.workunit.client.0.vm03.stdout:5/529: rmdir d1c/d20/d55/d3b 39
2026-03-09T00:04:01.362 INFO:tasks.workunit.client.1.vm06.stdout:6/765: symlink d4/d8d/leb 0
2026-03-09T00:04:01.362 INFO:tasks.workunit.client.1.vm06.stdout:6/766: stat d4/d16/d53/ddf/d7e/dac/dcd 0
2026-03-09T00:04:01.362 INFO:tasks.workunit.client.1.vm06.stdout:6/767: fsync d4/d16/d53/ddf/d52/f6c 0
2026-03-09T00:04:01.362 INFO:tasks.workunit.client.1.vm06.stdout:6/768: chown d4/d16/d53/fb7 20859 1
2026-03-09T00:04:01.370 INFO:tasks.workunit.client.1.vm06.stdout:4/734: mknod d17/d21/d4c/d66/dd9/cf7 0
2026-03-09T00:04:01.372 INFO:tasks.workunit.client.0.vm03.stdout:3/391: readlink d2/l39 0
2026-03-09T00:04:01.374 INFO:tasks.workunit.client.1.vm06.stdout:3/760: mknod d11/d28/d2e/d7e/d83/d87/c109 0
2026-03-09T00:04:01.374 INFO:tasks.workunit.client.1.vm06.stdout:3/761: stat d11/d28/d2e/d2f/d5b/l8c 0
2026-03-09T00:04:01.374 INFO:tasks.workunit.client.1.vm06.stdout:3/762: write d11/f27 [1036288,58000] 0
2026-03-09T00:04:01.377 INFO:tasks.workunit.client.0.vm03.stdout:3/392: write d2/db/d2d/f54 [25574,38698] 0
2026-03-09T00:04:01.382 INFO:tasks.workunit.client.0.vm03.stdout:4/621: mkdir d7/d6f/dcf 0
2026-03-09T00:04:01.392 INFO:tasks.workunit.client.0.vm03.stdout:2/517: dwrite d8/f5d [4194304,4194304] 0
2026-03-09T00:04:01.392 INFO:tasks.workunit.client.0.vm03.stdout:2/518: stat d8/d1b/d2a 0
2026-03-09T00:04:01.396 INFO:tasks.workunit.client.0.vm03.stdout:0/500: dwrite d2/da/dd/d49/d6c/d4b/f67 [0,4194304] 0
2026-03-09T00:04:01.402 INFO:tasks.workunit.client.0.vm03.stdout:2/519: dread d8/d1b/d2a/f2d [0,4194304] 0
2026-03-09T00:04:01.402 INFO:tasks.workunit.client.0.vm03.stdout:2/520: dread f7 [0,4194304] 0
2026-03-09T00:04:01.402 INFO:tasks.workunit.client.0.vm03.stdout:2/521: stat d8/d1b/l29 0
2026-03-09T00:04:01.402 INFO:tasks.workunit.client.1.vm06.stdout:5/878: creat d5/d1c/d21/d28/d5e/d66/d78/dd5/f125 x:0 0 0
2026-03-09T00:04:01.412 INFO:tasks.workunit.client.0.vm03.stdout:7/490: rename d2/d1f/d40 to d2/d1f/d42/d91 0
2026-03-09T00:04:01.415 INFO:tasks.workunit.client.0.vm03.stdout:7/491: dread d2/d1f/d42/f47 [0,4194304] 0
2026-03-09T00:04:01.417 INFO:tasks.workunit.client.0.vm03.stdout:4/622: dwrite d7/d20/d6a/d77/d25/fb8 [0,4194304] 0
2026-03-09T00:04:01.448 INFO:tasks.workunit.client.0.vm03.stdout:4/623: dwrite d7/d27/f89 [0,4194304] 0
2026-03-09T00:04:01.448 INFO:tasks.workunit.client.0.vm03.stdout:4/624: stat d7/d20/d6a/d77/fc8 0
2026-03-09T00:04:01.448 INFO:tasks.workunit.client.0.vm03.stdout:4/625: write d7/f5d [605277,108211] 0
2026-03-09T00:04:01.448 INFO:tasks.workunit.client.0.vm03.stdout:4/626: write d7/f5d [1038511,33892] 0
2026-03-09T00:04:01.448 INFO:tasks.workunit.client.0.vm03.stdout:4/627: chown d7/d20/c59 25466634 1
2026-03-09T00:04:01.454 INFO:tasks.workunit.client.0.vm03.stdout:9/556: symlink d15/d1c/d21/d54/dab/lb7 0
2026-03-09T00:04:01.455 INFO:tasks.workunit.client.0.vm03.stdout:9/557: read d15/d1c/d36/f5c [2553877,56122] 0
2026-03-09T00:04:01.455 INFO:tasks.workunit.client.0.vm03.stdout:9/558: write d15/f9a [472786,4994] 0
2026-03-09T00:04:01.476 INFO:tasks.workunit.client.1.vm06.stdout:7/764: mknod d0/df/ce1 0
2026-03-09T00:04:01.476 INFO:tasks.workunit.client.1.vm06.stdout:7/765: readlink d0/d39/la0 0
2026-03-09T00:04:01.476 INFO:tasks.workunit.client.0.vm03.stdout:3/393: creat d2/db/d3b/d5d/d6d/f6e x:0 0 0
2026-03-09T00:04:01.477 INFO:tasks.workunit.client.0.vm03.stdout:3/394: dread d2/db/f15 [0,4194304] 0
2026-03-09T00:04:01.477 INFO:tasks.workunit.client.0.vm03.stdout:3/395: fsync f1 0
2026-03-09T00:04:01.480 INFO:tasks.workunit.client.0.vm03.stdout:0/501: symlink d2/d71/lbe 0
2026-03-09T00:04:01.481 INFO:tasks.workunit.client.1.vm06.stdout:7/766: dread d0/df/d1a/d3a/d4e/d5e/f73 [0,4194304] 0
2026-03-09T00:04:01.492 INFO:tasks.workunit.client.0.vm03.stdout:6/488: sync
2026-03-09T00:04:01.493 INFO:tasks.workunit.client.0.vm03.stdout:8/539: sync
2026-03-09T00:04:01.504 INFO:tasks.workunit.client.0.vm03.stdout:2/522: mknod d8/d1b/d8f/cab 0
2026-03-09T00:04:01.504 INFO:tasks.workunit.client.0.vm03.stdout:2/523: chown d8/d26/d5e/d5f/d95 9 1
2026-03-09T00:04:01.504 INFO:tasks.workunit.client.0.vm03.stdout:2/524: dread - d8/d1b/d2a/d56/f8c zero size
2026-03-09T00:04:01.517 INFO:tasks.workunit.client.1.vm06.stdout:8/784: mknod db/d74/d87/cfd 0
2026-03-09T00:04:01.525 INFO:tasks.workunit.client.0.vm03.stdout:7/492: symlink d2/d4/d1e/d85/l92 0
2026-03-09T00:04:01.529 INFO:tasks.workunit.client.1.vm06.stdout:9/658: rmdir d1/d4/d6e/d14/d25 39
2026-03-09T00:04:01.529 INFO:tasks.workunit.client.1.vm06.stdout:6/769: mknod d4/d27/d3e/d78/cec 0
2026-03-09T00:04:01.535 INFO:tasks.workunit.client.0.vm03.stdout:9/559: dwrite d15/d1c/d21/f25 [0,4194304] 0
2026-03-09T00:04:01.542 INFO:tasks.workunit.client.1.vm06.stdout:4/735: creat d17/d21/d4c/d66/ff8 x:0 0 0
2026-03-09T00:04:01.544 INFO:tasks.workunit.client.1.vm06.stdout:4/736: dread - d17/d21/d4c/fcc zero size
2026-03-09T00:04:01.548 INFO:tasks.workunit.client.1.vm06.stdout:2/805: sync
2026-03-09T00:04:01.548 INFO:tasks.workunit.client.1.vm06.stdout:2/806: creat d7/da/d1c/ff5 x:0 0 0
2026-03-09T00:04:01.553 INFO:tasks.workunit.client.1.vm06.stdout:4/737: read d17/d21/d4c/d50/f8c [3734952,59518] 0
2026-03-09T00:04:01.553 INFO:tasks.workunit.client.0.vm03.stdout:1/603: getdents d4/d3a/d32/d87 0
2026-03-09T00:04:01.557 INFO:tasks.workunit.client.1.vm06.stdout:3/763: symlink d11/d28/d2e/l10a 0
2026-03-09T00:04:01.574 INFO:tasks.workunit.client.1.vm06.stdout:5/879: creat d5/d1c/d23/d34/d47/ddd/dd9/f126 x:0 0 0
2026-03-09T00:04:01.574 INFO:tasks.workunit.client.1.vm06.stdout:9/659: symlink d1/d3/d50/ld4 0
2026-03-09T00:04:01.574 INFO:tasks.workunit.client.0.vm03.stdout:4/628: mkdir d7/d20/d29/d4e/dd0 0
2026-03-09T00:04:01.574 INFO:tasks.workunit.client.0.vm03.stdout:5/530: getdents d1c/d20/d55/d4f/d58/d73/d76 0
2026-03-09T00:04:01.574 INFO:tasks.workunit.client.0.vm03.stdout:3/396: creat d2/db/d2d/d55/f6f x:0 0 0
2026-03-09T00:04:01.574 INFO:tasks.workunit.client.0.vm03.stdout:6/489: rmdir d13/d1e 39
2026-03-09T00:04:01.574 INFO:tasks.workunit.client.0.vm03.stdout:1/604: truncate d4/d3a/d3d/fa2 1449348 0
2026-03-09T00:04:01.574 INFO:tasks.workunit.client.0.vm03.stdout:1/605: fdatasync d4/d5e/f82 0
2026-03-09T00:04:01.574 INFO:tasks.workunit.client.0.vm03.stdout:7/493: creat d2/d4/d1e/d5e/d6c/d80/f93 x:0 0 0
2026-03-09T00:04:01.577 INFO:tasks.workunit.client.1.vm06.stdout:5/880: dread d5/d1c/d68/fc7 [0,4194304] 0
2026-03-09T00:04:01.581 INFO:tasks.workunit.client.1.vm06.stdout:6/770: dread d4/d27/d3e/f55 [0,4194304] 0
2026-03-09T00:04:01.582 INFO:tasks.workunit.client.0.vm03.stdout:4/629: dread d7/d20/d35/fb5 [0,4194304] 0
2026-03-09T00:04:01.582 INFO:tasks.workunit.client.0.vm03.stdout:4/630: write d7/d20/d6a/d77/fae [770357,64690] 0
2026-03-09T00:04:01.584 INFO:tasks.workunit.client.0.vm03.stdout:0/502: dwrite d2/da/dd/d49/fa9 [0,4194304] 0
2026-03-09T00:04:01.584 INFO:tasks.workunit.client.0.vm03.stdout:0/503: readlink d2/da/d76/d8a/l8e 0
2026-03-09T00:04:01.584 INFO:tasks.workunit.client.0.vm03.stdout:0/504: stat d2/da/d1a/l9b 0
2026-03-09T00:04:01.602 INFO:tasks.workunit.client.0.vm03.stdout:8/540: dwrite d7/df/d1e/d3f/f7d [0,4194304] 0
2026-03-09T00:04:01.602 INFO:tasks.workunit.client.0.vm03.stdout:8/541: chown d7/df/d1e/d38/d91/fa5 55682279 1
2026-03-09T00:04:01.609 INFO:tasks.workunit.client.0.vm03.stdout:5/531: rename d1c/d20/d55/d3b/f57 to d1c/d20/d55/d4f/d58/d5d/faa 0
2026-03-09T00:04:01.624 INFO:tasks.workunit.client.1.vm06.stdout:1/672: dwrite d6/f25 [4194304,4194304] 0
2026-03-09T00:04:01.624 INFO:tasks.workunit.client.0.vm03.stdout:3/397: mkdir d2/db/d6a/d70 0
2026-03-09T00:04:01.625 INFO:tasks.workunit.client.1.vm06.stdout:4/738: creat d17/d5b/ff9 x:0 0 0
2026-03-09T00:04:01.627 INFO:tasks.workunit.client.0.vm03.stdout:1/606: link d4/d3a/c24 d4/d15/d77/cd4 0
2026-03-09T00:04:01.632 INFO:tasks.workunit.client.1.vm06.stdout:3/764: rename d11/d28/d4d/fc7 to d11/d28/d4d/d89/d90/f10b 0
2026-03-09T00:04:01.636 INFO:tasks.workunit.client.1.vm06.stdout:3/765: creat d11/d28/d2e/d7e/d83/d87/f10c x:0 0 0
2026-03-09T00:04:01.636 INFO:tasks.workunit.client.1.vm06.stdout:0/775: link d3/d18/d1f/l8f d3/d18/d1f/d39/d49/d60/l106 0
2026-03-09T00:04:01.636 INFO:tasks.workunit.client.1.vm06.stdout:0/776: write d3/d18/d1f/d39/d49/d60/fef [868303,44197] 0
2026-03-09T00:04:01.637 INFO:tasks.workunit.client.1.vm06.stdout:9/660: mknod d1/d3/d50/cd5 0
2026-03-09T00:04:01.637 INFO:tasks.workunit.client.1.vm06.stdout:9/661: chown d1/d3/d2b/l2e 1787 1
2026-03-09T00:04:01.642 INFO:tasks.workunit.client.1.vm06.stdout:4/739: dread d17/d24/d3b/f4a [0,4194304] 0
2026-03-09T00:04:01.649 INFO:tasks.workunit.client.1.vm06.stdout:8/785: dwrite db/d74/d78/fe2 [0,4194304] 0
2026-03-09T00:04:01.656 INFO:tasks.workunit.client.1.vm06.stdout:8/786: write db/d53/d70/f75 [3836202,105886] 0
2026-03-09T00:04:01.658 INFO:tasks.workunit.client.1.vm06.stdout:5/881: creat d5/d1c/d23/d34/d47/f127 x:0 0 0
2026-03-09T00:04:01.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:01 vm06.local ceph-mon[58395]: mgrmap e28: vm03.yvcons(active, since 1.20804s)
2026-03-09T00:04:01.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:01 vm06.local ceph-mon[58395]: pgmap v3: 65 pgs: 65 active+clean; 2.8 GiB data, 9.5 GiB used, 110 GiB / 120 GiB avail
2026-03-09T00:04:01.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:01 vm06.local ceph-mon[58395]: Standby manager daemon vm06.rzcvhn started
2026-03-09T00:04:01.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:01 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.106:0/1737499333' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/crt"}]: dispatch
2026-03-09T00:04:01.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:01 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.106:0/1737499333' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-09T00:04:01.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:01 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.106:0/1737499333' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/key"}]: dispatch
2026-03-09T00:04:01.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:01 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.106:0/1737499333' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-09T00:04:01.671 INFO:tasks.workunit.client.0.vm03.stdout:4/631: dwrite d7/d6f/f9b [0,4194304] 0
2026-03-09T00:04:01.675 INFO:tasks.workunit.client.1.vm06.stdout:1/673: mknod d6/d21/da6/ce3 0
2026-03-09T00:04:01.675 INFO:tasks.workunit.client.1.vm06.stdout:1/674: chown d6/d4c/d71/d83/lac 458 1
2026-03-09T00:04:01.688 INFO:tasks.workunit.client.1.vm06.stdout:6/771: dwrite d4/fc [0,4194304] 0
2026-03-09T00:04:01.692 INFO:tasks.workunit.client.1.vm06.stdout:2/807: truncate d7/da/d4e/d57/f7a 1712132 0
2026-03-09T00:04:01.692 INFO:tasks.workunit.client.1.vm06.stdout:2/808: chown d7/d1a/d56/f50 3460 1
2026-03-09T00:04:01.692 INFO:tasks.workunit.client.1.vm06.stdout:2/809: stat d7/d1a/d25/d66/d87/da8/db2 0
2026-03-09T00:04:01.692 INFO:tasks.workunit.client.1.vm06.stdout:2/810: write d7/da/d1c/ff5 [305912,17958] 0
2026-03-09T00:04:01.697 INFO:tasks.workunit.client.1.vm06.stdout:3/766: write d11/f5a [1468110,28494] 0
2026-03-09T00:04:01.702 INFO:tasks.workunit.client.1.vm06.stdout:9/662: link d1/d4/d6e/d14/d25/d85/f90 d1/d4/fd6 0
2026-03-09T00:04:01.702 INFO:tasks.workunit.client.1.vm06.stdout:4/740: symlink d17/d21/lfa 0
2026-03-09T00:04:01.702 INFO:tasks.workunit.client.1.vm06.stdout:4/741: fsync d17/d5b/f83 0
2026-03-09T00:04:01.702 INFO:tasks.workunit.client.1.vm06.stdout:4/742: truncate d17/d21/d4c/d66/ff8 728919 0
2026-03-09T00:04:01.702 INFO:tasks.workunit.client.0.vm03.stdout:0/505: rename d2/da/f1b to d2/fbf 0
2026-03-09T00:04:01.702 INFO:tasks.workunit.client.0.vm03.stdout:0/506: dread - d2/da/d1a/f91 zero size
2026-03-09T00:04:01.702 INFO:tasks.workunit.client.0.vm03.stdout:0/507: chown d2/da/d1a/l5e 56062 1
2026-03-09T00:04:01.702 INFO:tasks.workunit.client.0.vm03.stdout:0/508: truncate d2/da/dd/d49/d6c/d81/db5/dba/fbc 416049 0
2026-03-09T00:04:01.702 INFO:tasks.workunit.client.0.vm03.stdout:0/509: fsync d2/d71/f7c 0
2026-03-09T00:04:01.702 INFO:tasks.workunit.client.0.vm03.stdout:0/510: getdents d2/da/dd/d49/d6c/da6 0
2026-03-09T00:04:01.702 INFO:tasks.workunit.client.0.vm03.stdout:0/511: getdents d2/da/dd/d49/d6c/d4b/d55/d6f/dad 0
2026-03-09T00:04:01.702 INFO:tasks.workunit.client.0.vm03.stdout:0/512: write d2/da/d76/fb2 [11128,90640] 0
2026-03-09T00:04:01.702 INFO:tasks.workunit.client.0.vm03.stdout:0/513: write d2/f7f [261957,59749] 0
2026-03-09T00:04:01.702 INFO:tasks.workunit.client.0.vm03.stdout:5/532: symlink d1c/d51/d6a/lab 0
2026-03-09T00:04:01.702 INFO:tasks.workunit.client.0.vm03.stdout:1/607: creat d4/d3a/d32/d87/fd5 x:0 0 0
2026-03-09T00:04:01.707 INFO:tasks.workunit.client.1.vm06.stdout:2/811: link d7/d1a/d3c/f4d d7/da/db/ff6 0
2026-03-09T00:04:01.707 INFO:tasks.workunit.client.1.vm06.stdout:2/812: readlink d7/d1a/d25/d66/d87/da8/lc0 0
2026-03-09T00:04:01.707 INFO:tasks.workunit.client.1.vm06.stdout:9/663: rename d1/d3/d4f/d52/l59 to d1/d4/d6e/d14/d25/d85/d49/ld7 0
2026-03-09T00:04:01.712 INFO:tasks.workunit.client.0.vm03.stdout:5/533: mkdir d1c/d20/d55/dac 0
2026-03-09T00:04:01.713 INFO:tasks.workunit.client.0.vm03.stdout:1/608: mknod d4/d15/dae/dcb/cd6 0
2026-03-09T00:04:01.713 INFO:tasks.workunit.client.0.vm03.stdout:1/609: fdatasync d4/d15/d5c/f6f 0
2026-03-09T00:04:01.713 INFO:tasks.workunit.client.0.vm03.stdout:0/514: mknod d2/da/d36/cc0 0
2026-03-09T00:04:01.713 INFO:tasks.workunit.client.0.vm03.stdout:5/534: write fe [591719,124955] 0
2026-03-09T00:04:01.713 INFO:tasks.workunit.client.0.vm03.stdout:5/535: truncate d1c/d20/d55/f9b 596397 0
2026-03-09T00:04:01.714 INFO:tasks.workunit.client.0.vm03.stdout:1/610: unlink d4/d15/d5c/fd2 0
2026-03-09T00:04:01.714 INFO:tasks.workunit.client.0.vm03.stdout:1/611: write d4/d15/f18 [8519568,71123] 0
2026-03-09T00:04:01.714 INFO:tasks.workunit.client.0.vm03.stdout:1/612: fdatasync d4/d3a/d32/f53 0
2026-03-09T00:04:01.717 INFO:tasks.workunit.client.0.vm03.stdout:5/536: mknod d1c/d20/d55/d4f/d58/d73/d9e/cad 0
2026-03-09T00:04:01.717 INFO:tasks.workunit.client.0.vm03.stdout:0/515: getdents d2/d71 0
2026-03-09T00:04:01.717 INFO:tasks.workunit.client.0.vm03.stdout:0/516: truncate d2/da/f4f 1581270 0
2026-03-09T00:04:01.717 INFO:tasks.workunit.client.0.vm03.stdout:0/517: truncate d2/da/dd/d49/d6c/d4b/d55/f83 755367 0
2026-03-09T00:04:01.719 INFO:tasks.workunit.client.0.vm03.stdout:5/537: stat d1c/d51/c8a 0
2026-03-09T00:04:01.721 INFO:tasks.workunit.client.0.vm03.stdout:2/525: dwrite d8/d1b/f71 [0,4194304] 0
2026-03-09T00:04:01.725 INFO:tasks.workunit.client.0.vm03.stdout:0/518: mkdir d2/da/dd/d49/d6c/d4b/dc1 0
2026-03-09T00:04:01.726 INFO:tasks.workunit.client.0.vm03.stdout:5/538: creat d1c/d20/d55/d4f/d58/d73/d9e/fae x:0 0 0
2026-03-09T00:04:01.726 INFO:tasks.workunit.client.0.vm03.stdout:5/539: creat d1c/d51/d6a/d75/faf x:0 0 0
2026-03-09T00:04:01.732 INFO:tasks.workunit.client.0.vm03.stdout:5/540: read d1c/d20/d55/d4f/d58/d5d/faa [4616838,57942] 0
2026-03-09T00:04:01.734 INFO:tasks.workunit.client.0.vm03.stdout:0/519: dread d2/da/dd/d49/d6c/d4b/fa0 [0,4194304] 0
2026-03-09T00:04:01.737 INFO:tasks.workunit.client.0.vm03.stdout:5/541: write d1c/d51/f68 [3233831,104509] 0
2026-03-09T00:04:01.743 INFO:tasks.workunit.client.0.vm03.stdout:2/526: rename d8/d1b/d6c/c84 to d8/d26/d5e/d6f/d97/cac 0
2026-03-09T00:04:01.761 INFO:tasks.workunit.client.0.vm03.stdout:2/527: dread d8/d26/d5e/d6f/d97/f27 [4194304,4194304] 0
2026-03-09T00:04:01.761 INFO:tasks.workunit.client.0.vm03.stdout:2/528: creat d8/d1b/d2a/d2e/fad x:0 0 0
2026-03-09T00:04:01.763 INFO:tasks.workunit.client.1.vm06.stdout:2/813: write d7/da/db/de/f49 [3199136,65981] 0
2026-03-09T00:04:01.765 INFO:tasks.workunit.client.1.vm06.stdout:2/814: unlink d7/d1b/d71/d79/db4/dc1/f5e 0
2026-03-09T00:04:01.767 INFO:tasks.workunit.client.1.vm06.stdout:2/815: rename d7/da/d4e/d57/cb8 to d7/da/cf7 0
2026-03-09T00:04:01.769 INFO:tasks.workunit.client.1.vm06.stdout:2/816: mknod d7/d1a/d25/d66/cf8 0
2026-03-09T00:04:01.769 INFO:tasks.workunit.client.1.vm06.stdout:2/817: stat d7/d1a/d25/fae 0
2026-03-09T00:04:01.800 INFO:tasks.workunit.client.1.vm06.stdout:0/777: dwrite d3/d18/de9/fbd [0,4194304] 0
2026-03-09T00:04:01.800 INFO:tasks.workunit.client.1.vm06.stdout:0/778: creat d3/d18/d2c/d2d/d74/d90/f107 x:0 0 0
2026-03-09T00:04:01.800 INFO:tasks.workunit.client.1.vm06.stdout:0/779: chown d3/d18/d2c/d2d/d74 1355544262 1
2026-03-09T00:04:01.800 INFO:tasks.workunit.client.1.vm06.stdout:0/780: chown d3/f29 56782632 1
2026-03-09T00:04:01.802 INFO:tasks.workunit.client.1.vm06.stdout:0/781: rename d3/d18/d2c/f4e to d3/d18/d2c/d2d/d31/f108 0
2026-03-09T00:04:01.802 INFO:tasks.workunit.client.1.vm06.stdout:0/782: chown d3/d18/d3c/l3f 0 1
2026-03-09T00:04:01.804 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:01 vm03.local ceph-mon[52346]: mgrmap e28: vm03.yvcons(active, since 1.20804s)
2026-03-09T00:04:01.804 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:01 vm03.local ceph-mon[52346]: pgmap v3: 65 pgs: 65 active+clean; 2.8 GiB data, 9.5 GiB used, 110 GiB / 120 GiB avail
2026-03-09T00:04:01.804 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:01 vm03.local ceph-mon[52346]: Standby manager daemon vm06.rzcvhn started
2026-03-09T00:04:01.804 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:01 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.106:0/1737499333' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/crt"}]: dispatch
2026-03-09T00:04:01.804 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:01 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.106:0/1737499333' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-09T00:04:01.804 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:01 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.106:0/1737499333' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/key"}]: dispatch
2026-03-09T00:04:01.804 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:01 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.106:0/1737499333' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-09T00:04:01.813 INFO:tasks.workunit.client.1.vm06.stdout:0/783: dread d3/d18/d1f/f5e [0,4194304] 0
2026-03-09T00:04:01.820 INFO:tasks.workunit.client.1.vm06.stdout:0/784: rmdir d3/d18/d1f/d39/d49 39
2026-03-09T00:04:01.820 INFO:tasks.workunit.client.1.vm06.stdout:0/785: mkdir d3/d18/d2c/d2d/d74/da8/d109 0
2026-03-09T00:04:01.820 INFO:tasks.workunit.client.1.vm06.stdout:0/786: mkdir d3/d18/d2c/d2d/d8c/d10a 0
2026-03-09T00:04:01.822 INFO:tasks.workunit.client.1.vm06.stdout:0/787: unlink d3/d18/d28/d45/f48 0
2026-03-09T00:04:01.829 INFO:tasks.workunit.client.1.vm06.stdout:0/788: write d3/d18/d3c/f87 [1686071,16655] 0
2026-03-09T00:04:01.829 INFO:tasks.workunit.client.1.vm06.stdout:0/789: chown d3/lda 143020865 1
2026-03-09T00:04:01.831 INFO:tasks.workunit.client.1.vm06.stdout:7/767: dwrite d0/df/d1a/d27/d4c/f32 [0,4194304] 0
2026-03-09T00:04:01.834 INFO:tasks.workunit.client.1.vm06.stdout:0/790: creat d3/d18/d1f/d39/f10b x:0 0 0
2026-03-09T00:04:01.851 INFO:tasks.workunit.client.1.vm06.stdout:0/791: truncate d3/d18/d1f/d39/d3b/f57 4119531 0
2026-03-09T00:04:01.851 INFO:tasks.workunit.client.1.vm06.stdout:0/792: fsync d3/d18/d1f/d39/d3b/df9/df2/d73/fb8 0
2026-03-09T00:04:01.851 INFO:tasks.workunit.client.1.vm06.stdout:0/793: chown d3/d18/d2c/d2d/d74/daf/de3 91311 1
2026-03-09T00:04:01.851 INFO:tasks.workunit.client.1.vm06.stdout:0/794: readlink d3/d18/d2c/d2d/d74/lb0 0
2026-03-09T00:04:01.856 INFO:tasks.workunit.client.1.vm06.stdout:4/743: dwrite d17/d21/d4c/d50/f8c [8388608,4194304] 0
2026-03-09T00:04:01.856 INFO:tasks.workunit.client.1.vm06.stdout:4/744: truncate d17/d21/d4c/d50/ff4 371899 0
2026-03-09T00:04:01.856 INFO:tasks.workunit.client.1.vm06.stdout:4/745: write d17/d24/d49/d5f/db2/ff2 [226117,30449] 0
2026-03-09T00:04:01.856 INFO:tasks.workunit.client.1.vm06.stdout:4/746: creat d17/d21/d4c/d66/d68/ffb x:0 0 0
2026-03-09T00:04:01.856 INFO:tasks.workunit.client.1.vm06.stdout:4/747: chown d17/d24/d3b/d5e/d7a 346964570 1
2026-03-09T00:04:01.859 INFO:tasks.workunit.client.1.vm06.stdout:4/748: getdents d17/d24/d3b/d5e 0
2026-03-09T00:04:01.873 INFO:tasks.workunit.client.0.vm03.stdout:4/632: dwrite d7/d20/fbf [0,4194304] 0
2026-03-09T00:04:01.873 INFO:tasks.workunit.client.0.vm03.stdout:4/633: chown d7/d20/d29/d78/lbc 4881 1
2026-03-09T00:04:01.881 INFO:tasks.workunit.client.0.vm03.stdout:4/634: creat d7/d20/d29/d38/fd1 x:0 0 0
2026-03-09T00:04:01.885 INFO:tasks.workunit.client.0.vm03.stdout:2/529: dwrite d8/d1b/f1e [0,4194304] 0
2026-03-09T00:04:01.891 INFO:tasks.workunit.client.1.vm06.stdout:6/772: dwrite d4/d16/d53/ddf/f75 [0,4194304] 0
2026-03-09T00:04:01.899 INFO:tasks.workunit.client.1.vm06.stdout:1/675: dwrite d6/d4c/d71/d83/f9b [0,4194304] 0
2026-03-09T00:04:01.910 INFO:tasks.workunit.client.0.vm03.stdout:2/530: truncate d8/d26/d5e/d6f/d97/f1d 1517920 0
2026-03-09T00:04:01.911 INFO:tasks.workunit.client.0.vm03.stdout:2/531: fdatasync d8/d26/d5e/d5f/f48 0
2026-03-09T00:04:01.911 INFO:tasks.workunit.client.1.vm06.stdout:6/773: creat d4/d16/d53/ddf/da6/dbb/fed x:0 0 0
2026-03-09T00:04:01.911 INFO:tasks.workunit.client.1.vm06.stdout:6/774: fsync d4/d16/d53/ddf/d7e/dac/fe1 0
2026-03-09T00:04:01.918 INFO:tasks.workunit.client.0.vm03.stdout:2/532: creat d8/d26/d5e/d6f/d97/fae x:0 0 0
2026-03-09T00:04:01.919 INFO:tasks.workunit.client.1.vm06.stdout:1/676: creat d6/d4c/fe4 x:0 0 0
2026-03-09T00:04:01.919 INFO:tasks.workunit.client.1.vm06.stdout:3/767: dwrite
d11/d28/d2e/d2f/fec [0,4194304] 0 2026-03-09T00:04:01.919 INFO:tasks.workunit.client.1.vm06.stdout:3/768: readlink d11/d28/d2e/d2f/d5b/d5f/la5 0 2026-03-09T00:04:01.920 INFO:tasks.workunit.client.1.vm06.stdout:1/677: truncate d6/d21/d2d/d3b/d42/f9a 1542840 0 2026-03-09T00:04:01.920 INFO:tasks.workunit.client.1.vm06.stdout:1/678: fdatasync d6/d4c/d71/fbf 0 2026-03-09T00:04:01.925 INFO:tasks.workunit.client.1.vm06.stdout:3/769: read d11/d28/d2e/d2f/d5b/fea [570584,32507] 0 2026-03-09T00:04:01.929 INFO:tasks.workunit.client.0.vm03.stdout:8/542: write d7/f25 [1116814,66637] 0 2026-03-09T00:04:01.956 INFO:tasks.workunit.client.0.vm03.stdout:1/613: dwrite d4/d15/d77/fbd [0,4194304] 0 2026-03-09T00:04:01.957 INFO:tasks.workunit.client.0.vm03.stdout:8/543: rename d7/df/d1a/d40/f79 to d7/df/d1e/d38/d91/fab 0 2026-03-09T00:04:01.958 INFO:tasks.workunit.client.0.vm03.stdout:7/494: sync 2026-03-09T00:04:01.959 INFO:tasks.workunit.client.0.vm03.stdout:6/490: sync 2026-03-09T00:04:01.960 INFO:tasks.workunit.client.0.vm03.stdout:3/398: sync 2026-03-09T00:04:01.960 INFO:tasks.workunit.client.0.vm03.stdout:9/560: sync 2026-03-09T00:04:01.960 INFO:tasks.workunit.client.0.vm03.stdout:9/561: chown d15/d1c/d21/d54 0 1 2026-03-09T00:04:01.960 INFO:tasks.workunit.client.0.vm03.stdout:9/562: chown d15/d1c/c90 91 1 2026-03-09T00:04:01.974 INFO:tasks.workunit.client.0.vm03.stdout:8/544: mknod d7/df/d1e/d38/d60/cac 0 2026-03-09T00:04:01.975 INFO:tasks.workunit.client.0.vm03.stdout:7/495: symlink d2/d1f/d3a/d24/l94 0 2026-03-09T00:04:01.978 INFO:tasks.workunit.client.0.vm03.stdout:6/491: rename d13/l29 to d13/d35/d72/laa 0 2026-03-09T00:04:01.991 INFO:tasks.workunit.client.0.vm03.stdout:3/399: symlink d2/db/d6a/l71 0 2026-03-09T00:04:01.992 INFO:tasks.workunit.client.0.vm03.stdout:3/400: write d2/db/d3b/d5d/d6d/f6e [835368,44191] 0 2026-03-09T00:04:01.992 INFO:tasks.workunit.client.0.vm03.stdout:8/545: rmdir d7/df/d1e/d3f/d95 39 2026-03-09T00:04:01.992 INFO:tasks.workunit.client.1.vm06.stdout:6/775: mknod d4/d27/d3e/d57/cee 0 2026-03-09T00:04:01.992 INFO:tasks.workunit.client.1.vm06.stdout:6/776: read d4/d16/d53/ddf/da6/fc6 [1844588,130492] 0 2026-03-09T00:04:01.998 INFO:tasks.workunit.client.1.vm06.stdout:1/679: getdents d6/d21/d2d/d3b/d42 0 2026-03-09T00:04:01.998 INFO:tasks.workunit.client.1.vm06.stdout:1/680: write d6/d4c/d71/fab [39195,102717] 0 2026-03-09T00:04:01.998 INFO:tasks.workunit.client.0.vm03.stdout:6/492: mknod d13/d1e/d44/d4a/cab 0 2026-03-09T00:04:01.998 INFO:tasks.workunit.client.0.vm03.stdout:6/493: write d13/d1e/f3f [788707,27834] 0 2026-03-09T00:04:01.998 INFO:tasks.workunit.client.0.vm03.stdout:7/496: unlink d2/d4/d1e/d5e/d6c/d37/c7c 0 2026-03-09T00:04:02.003 INFO:tasks.workunit.client.0.vm03.stdout:3/401: mkdir d2/db/d3b/d5d/d6d/d72 0 2026-03-09T00:04:02.003 INFO:tasks.workunit.client.0.vm03.stdout:3/402: write d2/f8 [4857094,82004] 0 2026-03-09T00:04:02.003 INFO:tasks.workunit.client.0.vm03.stdout:3/403: write d2/db/d3b/f4f [331102,67074] 0 2026-03-09T00:04:02.004 INFO:tasks.workunit.client.1.vm06.stdout:2/818: dwrite d7/d1a/d25/fa3 [0,4194304] 0 2026-03-09T00:04:02.010 INFO:tasks.workunit.client.1.vm06.stdout:2/819: write d7/d1b/f37 [257260,89155] 0 2026-03-09T00:04:02.012 INFO:tasks.workunit.client.0.vm03.stdout:5/542: dwrite d1c/d20/d55/f9b [0,4194304] 0 2026-03-09T00:04:02.012 INFO:tasks.workunit.client.1.vm06.stdout:6/777: creat d4/d27/fef x:0 0 0 2026-03-09T00:04:02.012 INFO:tasks.workunit.client.1.vm06.stdout:6/778: fdatasync d4/f6e 0 2026-03-09T00:04:02.013 
INFO:tasks.workunit.client.1.vm06.stdout:0/795: dread d3/d18/d2c/f4d [0,4194304] 0 2026-03-09T00:04:02.013 INFO:tasks.workunit.client.1.vm06.stdout:0/796: creat d3/d18/de9/f10c x:0 0 0 2026-03-09T00:04:02.014 INFO:tasks.workunit.client.0.vm03.stdout:3/404: read d2/db/d3b/f4f [152257,91889] 0 2026-03-09T00:04:02.014 INFO:tasks.workunit.client.0.vm03.stdout:2/533: write d8/fb [1626941,87324] 0 2026-03-09T00:04:02.016 INFO:tasks.workunit.client.0.vm03.stdout:7/497: creat d2/d1f/d42/d91/d67/f95 x:0 0 0 2026-03-09T00:04:02.016 INFO:tasks.workunit.client.0.vm03.stdout:7/498: fsync d2/d4/d1e/d5e/d6c/d80/f93 0 2026-03-09T00:04:02.016 INFO:tasks.workunit.client.0.vm03.stdout:6/494: read d13/f6f [3783351,109828] 0 2026-03-09T00:04:02.016 INFO:tasks.workunit.client.0.vm03.stdout:6/495: fsync fb 0 2026-03-09T00:04:02.023 INFO:tasks.workunit.client.1.vm06.stdout:1/681: creat d6/d21/da6/fe5 x:0 0 0 2026-03-09T00:04:02.025 INFO:tasks.workunit.client.1.vm06.stdout:1/682: chown d6/d21/d2d/d37/d6d/cc0 29 1 2026-03-09T00:04:02.025 INFO:tasks.workunit.client.1.vm06.stdout:1/683: creat d6/d21/d2d/d3b/d87/d9d/dd8/fe6 x:0 0 0 2026-03-09T00:04:02.025 INFO:tasks.workunit.client.1.vm06.stdout:1/684: write d6/f8c [4616846,74332] 0 2026-03-09T00:04:02.025 INFO:tasks.workunit.client.1.vm06.stdout:1/685: fsync d6/f25 0 2026-03-09T00:04:02.027 INFO:tasks.workunit.client.1.vm06.stdout:3/770: dwrite d11/d28/f6b [0,4194304] 0 2026-03-09T00:04:02.027 INFO:tasks.workunit.client.1.vm06.stdout:4/749: dwrite d17/d21/d4c/d50/f60 [4194304,4194304] 0 2026-03-09T00:04:02.036 INFO:tasks.workunit.client.1.vm06.stdout:2/820: mknod d7/d1b/cf9 0 2026-03-09T00:04:02.038 INFO:tasks.workunit.client.0.vm03.stdout:3/405: mknod d2/db/d3b/d5f/d65/c73 0 2026-03-09T00:04:02.039 INFO:tasks.workunit.client.1.vm06.stdout:6/779: creat d4/ff0 x:0 0 0 2026-03-09T00:04:02.043 INFO:tasks.workunit.client.0.vm03.stdout:7/499: write d2/d1f/d35/f5a [115993,81510] 0 2026-03-09T00:04:02.043 INFO:tasks.workunit.client.0.vm03.stdout:0/520: dwrite d2/da/dd/d49/d6c/f57 [0,4194304] 0 2026-03-09T00:04:02.044 INFO:tasks.workunit.client.1.vm06.stdout:2/821: dread d7/d1a/d25/fa3 [0,4194304] 0 2026-03-09T00:04:02.045 INFO:tasks.workunit.client.0.vm03.stdout:1/614: dwrite d4/d3a/d32/f4f [0,4194304] 0 2026-03-09T00:04:02.048 INFO:tasks.workunit.client.1.vm06.stdout:0/797: mkdir d3/d18/d2c/d2d/d74/daf/d10d 0 2026-03-09T00:04:02.052 INFO:tasks.workunit.client.0.vm03.stdout:4/635: dwrite d7/d20/f3d [0,4194304] 0 2026-03-09T00:04:02.052 INFO:tasks.workunit.client.0.vm03.stdout:4/636: chown d7/d6f/l8a 0 1 2026-03-09T00:04:02.052 INFO:tasks.workunit.client.0.vm03.stdout:6/496: dread d13/d1e/f30 [0,4194304] 0 2026-03-09T00:04:02.054 INFO:tasks.workunit.client.1.vm06.stdout:7/768: dwrite d0/df/d1a/d27/d4c/d40/f5a [0,4194304] 0 2026-03-09T00:04:02.055 INFO:tasks.workunit.client.0.vm03.stdout:8/546: dwrite d7/df/d1a/d40/d58/f7f [0,4194304] 0 2026-03-09T00:04:02.057 INFO:tasks.workunit.client.0.vm03.stdout:4/637: dread d7/d20/f3d [0,4194304] 0 2026-03-09T00:04:02.057 INFO:tasks.workunit.client.0.vm03.stdout:3/406: dread d2/db/d3b/d5d/d6d/f6e [0,4194304] 0 2026-03-09T00:04:02.057 INFO:tasks.workunit.client.0.vm03.stdout:3/407: readlink d2/db/l22 0 2026-03-09T00:04:02.071 INFO:tasks.workunit.client.0.vm03.stdout:4/638: dread d7/d20/d35/d66/f69 [0,4194304] 0 2026-03-09T00:04:02.081 INFO:tasks.workunit.client.0.vm03.stdout:8/547: dread d7/f67 [0,4194304] 0 2026-03-09T00:04:02.103 INFO:tasks.workunit.client.1.vm06.stdout:1/686: creat d6/d21/fe7 x:0 0 0 2026-03-09T00:04:02.116 
INFO:tasks.workunit.client.0.vm03.stdout:1/615: dwrite d4/d6/f33 [4194304,4194304] 0 2026-03-09T00:04:02.116 INFO:tasks.workunit.client.0.vm03.stdout:1/616: creat d4/d3a/d3d/d46/fd7 x:0 0 0 2026-03-09T00:04:02.121 INFO:tasks.workunit.client.0.vm03.stdout:0/521: creat d2/da/dd/d49/d6c/da6/fc2 x:0 0 0 2026-03-09T00:04:02.160 INFO:tasks.workunit.client.1.vm06.stdout:4/750: unlink d17/d21/d4c/dc2/fe1 0 2026-03-09T00:04:02.160 INFO:tasks.workunit.client.1.vm06.stdout:6/780: link d4/d16/d53/cdc d4/d16/d53/ddf/d4b/cf1 0 2026-03-09T00:04:02.160 INFO:tasks.workunit.client.1.vm06.stdout:6/781: chown d4/d16/d53/ddf/d7e/dac/cb1 1400 1 2026-03-09T00:04:02.160 INFO:tasks.workunit.client.1.vm06.stdout:6/782: read d4/d16/d53/ddf/d7e/dac/fe1 [412991,5537] 0 2026-03-09T00:04:02.160 INFO:tasks.workunit.client.1.vm06.stdout:2/822: unlink d7/d1a/l2a 0 2026-03-09T00:04:02.160 INFO:tasks.workunit.client.1.vm06.stdout:0/798: truncate d3/f1c 908020 0 2026-03-09T00:04:02.184 INFO:tasks.workunit.client.0.vm03.stdout:3/408: link d2/db/d3b/d5f/d65/c73 d2/db/d3b/d5d/c74 0 2026-03-09T00:04:02.213 INFO:tasks.workunit.client.0.vm03.stdout:4/639: creat d7/d27/dc9/fd2 x:0 0 0 2026-03-09T00:04:02.213 INFO:tasks.workunit.client.0.vm03.stdout:4/640: chown d7/d20/d29/d4e/c97 84999 1 2026-03-09T00:04:02.216 INFO:tasks.workunit.client.0.vm03.stdout:8/548: mkdir d7/df/d1e/dad 0 2026-03-09T00:04:02.216 INFO:tasks.workunit.client.0.vm03.stdout:8/549: read d7/df/d1a/f1c [3349668,103780] 0 2026-03-09T00:04:02.216 INFO:tasks.workunit.client.0.vm03.stdout:8/550: write d7/df/f31 [163107,112525] 0 2026-03-09T00:04:02.216 INFO:tasks.workunit.client.0.vm03.stdout:8/551: chown d7/df/d1e/f24 27230832 1 2026-03-09T00:04:02.216 INFO:tasks.workunit.client.0.vm03.stdout:8/552: truncate d7/df/d1e/d38/d91/fa5 278622 0 2026-03-09T00:04:02.226 INFO:tasks.workunit.client.1.vm06.stdout:7/769: mkdir d0/df/d1a/d27/d70/d9b/de2 0 2026-03-09T00:04:02.252 INFO:tasks.workunit.client.0.vm03.stdout:3/409: dwrite d2/db/f26 [0,4194304] 0 2026-03-09T00:04:02.256 INFO:tasks.workunit.client.0.vm03.stdout:7/500: rename d2/d4/d1e/d5e/d6c to d2/d1f/d42/d46/d81/d96 0 2026-03-09T00:04:02.260 INFO:tasks.workunit.client.0.vm03.stdout:7/501: write d2/d4/fb [7998310,63337] 0 2026-03-09T00:04:02.276 INFO:tasks.workunit.client.0.vm03.stdout:8/553: dwrite d7/f25 [0,4194304] 0 2026-03-09T00:04:02.276 INFO:tasks.workunit.client.0.vm03.stdout:8/554: chown d7/df/d1a/d40/c84 150208 1 2026-03-09T00:04:02.280 INFO:tasks.workunit.client.0.vm03.stdout:0/522: symlink d2/da/dd/d49/d6c/da6/lc3 0 2026-03-09T00:04:02.280 INFO:tasks.workunit.client.0.vm03.stdout:0/523: chown d2/da/dd/d49/d6c/l5f 27771 1 2026-03-09T00:04:02.284 INFO:tasks.workunit.client.0.vm03.stdout:8/555: dread d7/f25 [0,4194304] 0 2026-03-09T00:04:02.291 INFO:tasks.workunit.client.1.vm06.stdout:3/771: rmdir d11/d28 39 2026-03-09T00:04:02.293 INFO:tasks.workunit.client.0.vm03.stdout:3/410: symlink d2/db/d40/l75 0 2026-03-09T00:04:02.293 INFO:tasks.workunit.client.0.vm03.stdout:1/617: getdents d4/d15/d86 0 2026-03-09T00:04:02.297 INFO:tasks.workunit.client.0.vm03.stdout:7/502: creat d2/d4/d1e/f97 x:0 0 0 2026-03-09T00:04:02.297 INFO:tasks.workunit.client.0.vm03.stdout:8/556: rmdir d7/df/d1a 39 2026-03-09T00:04:02.301 INFO:tasks.workunit.client.1.vm06.stdout:2/823: symlink d7/ddb/lfa 0 2026-03-09T00:04:02.306 INFO:tasks.workunit.client.0.vm03.stdout:7/503: dread d2/d1f/d42/d46/d81/f8f [0,4194304] 0 2026-03-09T00:04:02.306 INFO:tasks.workunit.client.0.vm03.stdout:7/504: dread d2/d4/f2e [0,4194304] 0 2026-03-09T00:04:02.308 
INFO:tasks.workunit.client.1.vm06.stdout:0/799: getdents d3/d18/d1f/d39 0 2026-03-09T00:04:02.312 INFO:tasks.workunit.client.0.vm03.stdout:1/618: mkdir d4/d3a/d61/d78/dd8 0 2026-03-09T00:04:02.315 INFO:tasks.workunit.client.1.vm06.stdout:2/824: rename d7/da/d1c/f5f to d7/da/d4e/d57/ffb 0 2026-03-09T00:04:02.315 INFO:tasks.workunit.client.1.vm06.stdout:0/800: symlink d3/d18/de9/l10e 0 2026-03-09T00:04:02.318 INFO:tasks.workunit.client.1.vm06.stdout:2/825: dread d7/d1a/d25/d66/f6b [0,4194304] 0 2026-03-09T00:04:02.318 INFO:tasks.workunit.client.1.vm06.stdout:2/826: write d7/d1a/d25/d66/d87/fc3 [764812,72163] 0 2026-03-09T00:04:02.318 INFO:tasks.workunit.client.1.vm06.stdout:2/827: chown c1 424964856 1 2026-03-09T00:04:02.319 INFO:tasks.workunit.client.0.vm03.stdout:7/505: mknod d2/d1f/d42/d46/d81/d96/d37/d39/c98 0 2026-03-09T00:04:02.319 INFO:tasks.workunit.client.0.vm03.stdout:1/619: mkdir d4/d15/d77/dce/dd9 0 2026-03-09T00:04:02.320 INFO:tasks.workunit.client.1.vm06.stdout:3/772: rename d11/d28/d2e/d2f/cfb to d11/d28/d4d/d89/c10d 0 2026-03-09T00:04:02.320 INFO:tasks.workunit.client.1.vm06.stdout:3/773: chown d11/d28/c2d 2 1 2026-03-09T00:04:02.323 INFO:tasks.workunit.client.1.vm06.stdout:2/828: rmdir d7/d1a/d25/d66/d87/da8/db2/dc9 39 2026-03-09T00:04:02.324 INFO:tasks.workunit.client.1.vm06.stdout:3/774: mknod d11/d28/d2e/d2f/d5b/d5f/db1/c10e 0 2026-03-09T00:04:02.326 INFO:tasks.workunit.client.1.vm06.stdout:2/829: rmdir d7/d1a/d96/dc8 39 2026-03-09T00:04:02.326 INFO:tasks.workunit.client.1.vm06.stdout:0/801: rename d3/d18/de9/d9f to d3/d10f 0 2026-03-09T00:04:02.333 INFO:tasks.workunit.client.0.vm03.stdout:1/620: dread d4/d15/d5c/f6f [0,4194304] 0 2026-03-09T00:04:02.336 INFO:tasks.workunit.client.1.vm06.stdout:2/830: symlink d7/d1a/d39/lfc 0 2026-03-09T00:04:02.343 INFO:tasks.workunit.client.1.vm06.stdout:0/802: rename d3/d18/d28 to d3/d18/d2c/d2d/d74/dc7/d110 0 2026-03-09T00:04:02.343 INFO:tasks.workunit.client.1.vm06.stdout:0/803: chown d3/d18/f82 231247304 1 2026-03-09T00:04:02.343 INFO:tasks.workunit.client.1.vm06.stdout:0/804: fdatasync d3/d18/de9/f10c 0 2026-03-09T00:04:02.343 INFO:tasks.workunit.client.1.vm06.stdout:0/805: truncate d3/d18/d1f/d39/d3b/f66 14155 0 2026-03-09T00:04:02.343 INFO:tasks.workunit.client.1.vm06.stdout:0/806: readlink d3/d18/d1f/d39/d49/d60/l106 0 2026-03-09T00:04:02.343 INFO:tasks.workunit.client.1.vm06.stdout:2/831: mknod d7/d1b/da5/daa/cfd 0 2026-03-09T00:04:02.343 INFO:tasks.workunit.client.0.vm03.stdout:1/621: mknod d4/d3a/d32/cda 0 2026-03-09T00:04:02.344 INFO:tasks.workunit.client.1.vm06.stdout:2/832: rename d7/d1a/d25/d66/d87/da8 to d7/da/d63/d81/dfe 0 2026-03-09T00:04:02.348 INFO:tasks.workunit.client.1.vm06.stdout:2/833: mknod d7/d1b/d71/d79/db4/dc1/d86/cff 0 2026-03-09T00:04:02.352 INFO:tasks.workunit.client.0.vm03.stdout:1/622: rename d4/d3a/d43/f57 to d4/d3a/d3d/d98/fdb 0 2026-03-09T00:04:02.353 INFO:tasks.workunit.client.1.vm06.stdout:2/834: dread d7/d1b/f22 [0,4194304] 0 2026-03-09T00:04:02.353 INFO:tasks.workunit.client.1.vm06.stdout:2/835: mknod d7/da/d1c/c100 0 2026-03-09T00:04:02.356 INFO:tasks.workunit.client.1.vm06.stdout:2/836: getdents d7/d1a/d25/d97 0 2026-03-09T00:04:02.356 INFO:tasks.workunit.client.1.vm06.stdout:2/837: write d7/d1a/d56/f50 [231578,72291] 0 2026-03-09T00:04:02.359 INFO:tasks.workunit.client.1.vm06.stdout:2/838: mknod d7/ddb/c101 0 2026-03-09T00:04:02.365 INFO:tasks.workunit.client.1.vm06.stdout:2/839: readlink d7/da/db/l44 0 2026-03-09T00:04:02.365 INFO:tasks.workunit.client.1.vm06.stdout:2/840: mkdir 
d7/da/d63/d81/dfe/db2/d102 0 2026-03-09T00:04:02.366 INFO:tasks.workunit.client.1.vm06.stdout:2/841: mknod d7/d1a/d56/c103 0 2026-03-09T00:04:02.366 INFO:tasks.workunit.client.1.vm06.stdout:7/770: dwrite d0/df/d17/f7e [4194304,4194304] 0 2026-03-09T00:04:02.366 INFO:tasks.workunit.client.1.vm06.stdout:7/771: stat d0/df/d1a/d27/d4c/d40/d5b/faf 0 2026-03-09T00:04:02.366 INFO:tasks.workunit.client.1.vm06.stdout:7/772: chown d0/df/d1a/d27/d4c/d40/d51/d90/c95 2281 1 2026-03-09T00:04:02.366 INFO:tasks.workunit.client.1.vm06.stdout:7/773: stat d0/df/d1a/d3a/d4e/d5e/ddc/f87 0 2026-03-09T00:04:02.369 INFO:tasks.workunit.client.1.vm06.stdout:7/774: truncate d0/df/d1a/d27/f66 831652 0 2026-03-09T00:04:02.369 INFO:tasks.workunit.client.1.vm06.stdout:7/775: mkdir d0/df/d1a/d22/de3 0 2026-03-09T00:04:02.370 INFO:tasks.workunit.client.1.vm06.stdout:7/776: getdents d0/df/d1a/d35 0 2026-03-09T00:04:02.380 INFO:tasks.workunit.client.0.vm03.stdout:1/623: dread d4/d15/d86/fad [0,4194304] 0 2026-03-09T00:04:02.380 INFO:tasks.workunit.client.0.vm03.stdout:1/624: write d4/d3a/d61/da6/fa7 [242769,128636] 0 2026-03-09T00:04:02.385 INFO:tasks.workunit.client.0.vm03.stdout:1/625: unlink d4/d3a/d3d/f64 0 2026-03-09T00:04:02.385 INFO:tasks.workunit.client.0.vm03.stdout:1/626: readlink d4/l7 0 2026-03-09T00:04:02.391 INFO:tasks.workunit.client.0.vm03.stdout:1/627: symlink d4/d3a/d61/d78/ldc 0 2026-03-09T00:04:02.392 INFO:tasks.workunit.client.1.vm06.stdout:6/783: dwrite d4/d16/d46/d90/fd0 [0,4194304] 0 2026-03-09T00:04:02.392 INFO:tasks.workunit.client.0.vm03.stdout:1/628: dread d4/d15/d1a/f1b [0,4194304] 0 2026-03-09T00:04:02.394 INFO:tasks.workunit.client.0.vm03.stdout:1/629: rmdir d4/d3a 39 2026-03-09T00:04:02.396 INFO:tasks.workunit.client.0.vm03.stdout:3/411: dwrite d2/db/f26 [0,4194304] 0 2026-03-09T00:04:02.396 INFO:tasks.workunit.client.0.vm03.stdout:3/412: stat d2/db/d3b/f3e 0 2026-03-09T00:04:02.398 INFO:tasks.workunit.client.0.vm03.stdout:1/630: dread d4/d15/d1a/f1b [0,4194304] 0 2026-03-09T00:04:02.400 INFO:tasks.workunit.client.1.vm06.stdout:4/751: dwrite d17/f20 [4194304,4194304] 0 2026-03-09T00:04:02.400 INFO:tasks.workunit.client.1.vm06.stdout:4/752: chown d17/d21/d4c/d66 53421 1 2026-03-09T00:04:02.406 INFO:tasks.workunit.client.0.vm03.stdout:3/413: mknod d2/db/d2d/c76 0 2026-03-09T00:04:02.409 INFO:tasks.workunit.client.0.vm03.stdout:1/631: link d4/d15/d5c/f74 d4/d3a/d32/dc2/fdd 0 2026-03-09T00:04:02.410 INFO:tasks.workunit.client.1.vm06.stdout:4/753: write d17/d24/d49/f65 [4705579,8227] 0 2026-03-09T00:04:02.410 INFO:tasks.workunit.client.1.vm06.stdout:4/754: write d17/d21/d4c/fd4 [507544,29702] 0 2026-03-09T00:04:02.410 INFO:tasks.workunit.client.0.vm03.stdout:3/414: mknod d2/db/d6a/c77 0 2026-03-09T00:04:02.411 INFO:tasks.workunit.client.1.vm06.stdout:4/755: getdents d17/d21 0 2026-03-09T00:04:02.411 INFO:tasks.workunit.client.0.vm03.stdout:1/632: creat d4/d15/dae/dcb/fde x:0 0 0 2026-03-09T00:04:02.413 INFO:tasks.workunit.client.0.vm03.stdout:1/633: dread d4/d3a/d61/d78/f8e [0,4194304] 0 2026-03-09T00:04:02.417 INFO:tasks.workunit.client.1.vm06.stdout:4/756: mkdir d17/d24/d3b/dbf/ddf/dfc 0 2026-03-09T00:04:02.420 INFO:tasks.workunit.client.1.vm06.stdout:4/757: readlink d17/d24/l44 0 2026-03-09T00:04:02.420 INFO:tasks.workunit.client.1.vm06.stdout:4/758: unlink d17/d5b/dac/fd8 0 2026-03-09T00:04:02.424 INFO:tasks.workunit.client.0.vm03.stdout:8/557: dwrite d7/df/d1a/f33 [0,4194304] 0 2026-03-09T00:04:02.424 INFO:tasks.workunit.client.0.vm03.stdout:1/634: dread d4/d15/d5c/d6c/fc0 [4194304,4194304] 0 
2026-03-09T00:04:02.433 INFO:tasks.workunit.client.0.vm03.stdout:7/506: dwrite d2/d1f/d42/d46/d81/d96/d37/f56 [0,4194304] 0 2026-03-09T00:04:02.433 INFO:tasks.workunit.client.0.vm03.stdout:7/507: chown d2/d1f/d42/d91/d67/f95 173650030 1 2026-03-09T00:04:02.434 INFO:tasks.workunit.client.0.vm03.stdout:1/635: mknod d4/d3a/d32/dc2/cdf 0 2026-03-09T00:04:02.435 INFO:tasks.workunit.client.0.vm03.stdout:7/508: dread d2/d1f/d42/d46/d81/d96/d37/f56 [4194304,4194304] 0 2026-03-09T00:04:02.440 INFO:tasks.workunit.client.0.vm03.stdout:7/509: rename d2/d1f/d42/d46/d81/d96/l84 to d2/d4/d1e/d5e/d7e/l99 0 2026-03-09T00:04:02.440 INFO:tasks.workunit.client.1.vm06.stdout:6/784: dwrite d4/d16/d53/ddf/d4b/fba [0,4194304] 0 2026-03-09T00:04:02.440 INFO:tasks.workunit.client.1.vm06.stdout:6/785: fsync d4/d27/d3e/d57/f65 0 2026-03-09T00:04:02.441 INFO:tasks.workunit.client.0.vm03.stdout:7/510: mkdir d2/d1f/d35/d9a 0 2026-03-09T00:04:02.442 INFO:tasks.workunit.client.1.vm06.stdout:2/842: dwrite d7/da/d1c/f70 [0,4194304] 0 2026-03-09T00:04:02.442 INFO:tasks.workunit.client.1.vm06.stdout:2/843: stat d7/da/d1c/ff5 0 2026-03-09T00:04:02.443 INFO:tasks.workunit.client.0.vm03.stdout:7/511: link d2/d1f/d42/d91/d67/f70 d2/d1f/d42/d46/d54/f9b 0 2026-03-09T00:04:02.447 INFO:tasks.workunit.client.1.vm06.stdout:6/786: mkdir d4/d16/d53/df2 0 2026-03-09T00:04:02.448 INFO:tasks.workunit.client.1.vm06.stdout:6/787: creat d4/d16/d53/ddf/d7e/dac/dcd/ff3 x:0 0 0 2026-03-09T00:04:02.448 INFO:tasks.workunit.client.1.vm06.stdout:6/788: readlink d4/d27/l4c 0 2026-03-09T00:04:02.449 INFO:tasks.workunit.client.1.vm06.stdout:6/789: link d4/d16/c9a d4/d27/d3e/cf4 0 2026-03-09T00:04:02.449 INFO:tasks.workunit.client.1.vm06.stdout:6/790: creat d4/d27/d3e/d57/ff5 x:0 0 0 2026-03-09T00:04:02.458 INFO:tasks.workunit.client.1.vm06.stdout:5/882: sync 2026-03-09T00:04:02.458 INFO:tasks.workunit.client.1.vm06.stdout:8/787: sync 2026-03-09T00:04:02.458 INFO:tasks.workunit.client.1.vm06.stdout:5/883: fsync d5/d1c/d21/d28/d5e/f10d 0 2026-03-09T00:04:02.459 INFO:tasks.workunit.client.0.vm03.stdout:9/563: sync 2026-03-09T00:04:02.459 INFO:tasks.workunit.client.0.vm03.stdout:9/564: creat d15/d1c/fb8 x:0 0 0 2026-03-09T00:04:02.459 INFO:tasks.workunit.client.0.vm03.stdout:9/565: fsync d15/d1c/d28/f29 0 2026-03-09T00:04:02.459 INFO:tasks.workunit.client.0.vm03.stdout:5/543: sync 2026-03-09T00:04:02.461 INFO:tasks.workunit.client.1.vm06.stdout:9/664: sync 2026-03-09T00:04:02.462 INFO:tasks.workunit.client.1.vm06.stdout:1/687: sync 2026-03-09T00:04:02.464 INFO:tasks.workunit.client.1.vm06.stdout:9/665: dread d1/d4/d6e/f5d [0,4194304] 0 2026-03-09T00:04:02.471 INFO:tasks.workunit.client.0.vm03.stdout:9/566: unlink d15/d1c/d21/d64/c37 0 2026-03-09T00:04:02.471 INFO:tasks.workunit.client.0.vm03.stdout:9/567: creat d15/d1c/d36/d4d/fb9 x:0 0 0 2026-03-09T00:04:02.471 INFO:tasks.workunit.client.0.vm03.stdout:9/568: fsync d15/d1c/d36/fb1 0 2026-03-09T00:04:02.471 INFO:tasks.workunit.client.0.vm03.stdout:9/569: write d15/d1c/d36/f9e [841239,113508] 0 2026-03-09T00:04:02.472 INFO:tasks.workunit.client.1.vm06.stdout:1/688: mkdir d6/d4c/d51/db3/de8 0 2026-03-09T00:04:02.472 INFO:tasks.workunit.client.1.vm06.stdout:1/689: write d6/d21/d2d/d3b/d87/f9e [4986766,21104] 0 2026-03-09T00:04:02.472 INFO:tasks.workunit.client.0.vm03.stdout:5/544: mkdir d1c/d20/d55/db0 0 2026-03-09T00:04:02.476 INFO:tasks.workunit.client.1.vm06.stdout:9/666: symlink d1/d3/d2b/d58/ld8 0 2026-03-09T00:04:02.483 INFO:tasks.workunit.client.1.vm06.stdout:0/807: dread d3/d18/d2c/d2d/d31/f7b 
[0,4194304] 0 2026-03-09T00:04:02.483 INFO:tasks.workunit.client.1.vm06.stdout:9/667: mknod d1/d3/d4f/d91/cd9 0 2026-03-09T00:04:02.483 INFO:tasks.workunit.client.1.vm06.stdout:9/668: rmdir d1/d4/d6e 39 2026-03-09T00:04:02.483 INFO:tasks.workunit.client.0.vm03.stdout:9/570: rename d15/d1c/f3c to d15/d1c/d21/d54/d87/d93/fba 0 2026-03-09T00:04:02.484 INFO:tasks.workunit.client.1.vm06.stdout:1/690: write d6/d21/d2d/d3b/d87/f8d [4007658,42518] 0 2026-03-09T00:04:02.487 INFO:tasks.workunit.client.1.vm06.stdout:9/669: mknod d1/d3/d4f/d91/d94/d9e/cda 0 2026-03-09T00:04:02.493 INFO:tasks.workunit.client.1.vm06.stdout:1/691: link d6/f1b d6/d21/d2d/fe9 0 2026-03-09T00:04:02.500 INFO:tasks.workunit.client.1.vm06.stdout:0/808: dread d3/d18/d1f/d39/d49/f50 [0,4194304] 0 2026-03-09T00:04:02.500 INFO:tasks.workunit.client.1.vm06.stdout:0/809: symlink d3/d18/d2c/d2d/d74/da8/d109/l111 0 2026-03-09T00:04:02.501 INFO:tasks.workunit.client.1.vm06.stdout:0/810: creat d3/d18/f112 x:0 0 0 2026-03-09T00:04:02.508 INFO:tasks.workunit.client.1.vm06.stdout:0/811: write d3/d18/d2c/d2d/d31/f5d [739017,84358] 0 2026-03-09T00:04:02.513 INFO:tasks.workunit.client.0.vm03.stdout:9/571: dread d15/d1c/d36/f5c [0,4194304] 0 2026-03-09T00:04:02.513 INFO:tasks.workunit.client.1.vm06.stdout:0/812: link d3/d18/d1f/f4a d3/d18/d1f/d39/d49/d60/f113 0 2026-03-09T00:04:02.517 INFO:tasks.workunit.client.0.vm03.stdout:4/641: dread d7/d20/d6a/d77/db7/f9a [0,4194304] 0 2026-03-09T00:04:02.517 INFO:tasks.workunit.client.0.vm03.stdout:4/642: readlink d7/d20/d29/d4e/l64 0 2026-03-09T00:04:02.519 INFO:tasks.workunit.client.0.vm03.stdout:4/643: write d7/d20/d29/d38/d3a/f4b [1425992,127067] 0 2026-03-09T00:04:02.524 INFO:tasks.workunit.client.0.vm03.stdout:9/572: creat d15/d77/fbb x:0 0 0 2026-03-09T00:04:02.524 INFO:tasks.workunit.client.0.vm03.stdout:4/644: truncate d7/d20/d35/fb5 116043 0 2026-03-09T00:04:02.524 INFO:tasks.workunit.client.0.vm03.stdout:4/645: stat d7/d20/d29/f53 0 2026-03-09T00:04:02.541 INFO:tasks.workunit.client.1.vm06.stdout:2/844: dread d7/d1b/d31/fab [0,4194304] 0 2026-03-09T00:04:02.549 INFO:tasks.workunit.client.0.vm03.stdout:9/573: link d15/d1c/d21/d64/f3d d15/d1c/d21/d75/fbc 0 2026-03-09T00:04:02.549 INFO:tasks.workunit.client.0.vm03.stdout:9/574: chown d15/d1c/d28/l42 4344 1 2026-03-09T00:04:02.558 INFO:tasks.workunit.client.0.vm03.stdout:9/575: dread d15/d1c/d21/d54/d87/d93/fba [0,4194304] 0 2026-03-09T00:04:02.558 INFO:tasks.workunit.client.0.vm03.stdout:9/576: creat d15/d77/fbd x:0 0 0 2026-03-09T00:04:02.567 INFO:tasks.workunit.client.1.vm06.stdout:2/845: write d7/da/db/de/f11 [3778477,102122] 0 2026-03-09T00:04:02.570 INFO:tasks.workunit.client.1.vm06.stdout:2/846: link d7/da/db/de/f49 d7/d1b/d31/f104 0 2026-03-09T00:04:02.587 INFO:tasks.workunit.client.0.vm03.stdout:1/636: read d4/d15/d5c/d6c/f71 [1378842,93481] 0 2026-03-09T00:04:02.590 INFO:tasks.workunit.client.0.vm03.stdout:8/558: dwrite d7/df/d1a/f33 [0,4194304] 0 2026-03-09T00:04:02.601 INFO:tasks.workunit.client.0.vm03.stdout:6/497: sync 2026-03-09T00:04:02.601 INFO:tasks.workunit.client.0.vm03.stdout:2/534: sync 2026-03-09T00:04:02.601 INFO:tasks.workunit.client.0.vm03.stdout:6/498: write d13/fa9 [1053167,112954] 0 2026-03-09T00:04:02.608 INFO:tasks.workunit.client.0.vm03.stdout:2/535: rename d8/d26/d5e/d5f/f9c to d8/d26/d5e/d5f/d95/faf 0 2026-03-09T00:04:02.608 INFO:tasks.workunit.client.0.vm03.stdout:2/536: stat d8/d26/c93 0 2026-03-09T00:04:02.613 INFO:tasks.workunit.client.0.vm03.stdout:7/512: dwrite d2/f50 [0,4194304] 0 2026-03-09T00:04:02.616 
INFO:tasks.workunit.client.0.vm03.stdout:6/499: creat d13/d35/d69/fac x:0 0 0 2026-03-09T00:04:02.617 INFO:tasks.workunit.client.1.vm06.stdout:6/791: dwrite d4/f36 [0,4194304] 0 2026-03-09T00:04:02.624 INFO:tasks.workunit.client.1.vm06.stdout:2/847: write d7/da/d4e/d57/ffb [8078589,12566] 0 2026-03-09T00:04:02.630 INFO:tasks.workunit.client.1.vm06.stdout:2/848: fdatasync d7/da/db/f74 0 2026-03-09T00:04:02.630 INFO:tasks.workunit.client.1.vm06.stdout:2/849: dread d7/d1a/d25/fa3 [0,4194304] 0 2026-03-09T00:04:02.630 INFO:tasks.workunit.client.1.vm06.stdout:2/850: chown d7/d1a/d25/fa3 64260 1 2026-03-09T00:04:02.631 INFO:tasks.workunit.client.0.vm03.stdout:7/513: dread d2/d1f/d42/d91/d67/f64 [0,4194304] 0 2026-03-09T00:04:02.634 INFO:tasks.workunit.client.0.vm03.stdout:2/537: mknod d8/d1b/d2a/d2e/cb0 0 2026-03-09T00:04:02.644 INFO:tasks.workunit.client.0.vm03.stdout:1/637: dread d4/d15/f8a [0,4194304] 0 2026-03-09T00:04:02.648 INFO:tasks.workunit.client.1.vm06.stdout:6/792: dread d4/d27/d3e/f44 [0,4194304] 0 2026-03-09T00:04:02.651 INFO:tasks.workunit.client.0.vm03.stdout:1/638: dread d4/d3a/d61/d78/f8e [0,4194304] 0 2026-03-09T00:04:02.651 INFO:tasks.workunit.client.0.vm03.stdout:1/639: readlink d4/lb2 0 2026-03-09T00:04:02.651 INFO:tasks.workunit.client.0.vm03.stdout:6/500: rename d13/d1e/d44/d4a/d52/f91 to d13/d35/d71/d97/da5/fad 0 2026-03-09T00:04:02.651 INFO:tasks.workunit.client.0.vm03.stdout:1/640: dread d4/d3a/d32/d87/fa5 [0,4194304] 0 2026-03-09T00:04:02.652 INFO:tasks.workunit.client.0.vm03.stdout:6/501: write f8 [5469716,4384] 0 2026-03-09T00:04:02.653 INFO:tasks.workunit.client.1.vm06.stdout:8/788: dwrite db/f17 [0,4194304] 0 2026-03-09T00:04:02.653 INFO:tasks.workunit.client.1.vm06.stdout:4/759: dwrite d17/d24/d49/f5a [0,4194304] 0 2026-03-09T00:04:02.653 INFO:tasks.workunit.client.1.vm06.stdout:4/760: dread - d17/d24/d49/de4/fc0 zero size 2026-03-09T00:04:02.654 INFO:tasks.workunit.client.0.vm03.stdout:8/559: write d7/df/d1a/d2b/f44 [1096194,4032] 0 2026-03-09T00:04:02.654 INFO:tasks.workunit.client.0.vm03.stdout:8/560: chown d7/f67 664 1 2026-03-09T00:04:02.654 INFO:tasks.workunit.client.0.vm03.stdout:5/545: dwrite d1c/d20/d55/f5a [0,4194304] 0 2026-03-09T00:04:02.661 INFO:tasks.workunit.client.1.vm06.stdout:9/670: dwrite d1/d4/d6e/d9/fc3 [0,4194304] 0 2026-03-09T00:04:02.661 INFO:tasks.workunit.client.1.vm06.stdout:9/671: write d1/d4/f39 [2079676,92397] 0 2026-03-09T00:04:02.664 INFO:tasks.workunit.client.0.vm03.stdout:4/646: dwrite d7/d20/d6a/d77/fc8 [0,4194304] 0 2026-03-09T00:04:02.667 INFO:tasks.workunit.client.1.vm06.stdout:2/851: mkdir d7/d1a/d89/d105 0 2026-03-09T00:04:02.667 INFO:tasks.workunit.client.1.vm06.stdout:2/852: write d7/d1a/d25/d97/fea [806616,95574] 0 2026-03-09T00:04:02.667 INFO:tasks.workunit.client.1.vm06.stdout:2/853: read d7/da/f18 [3924566,112811] 0 2026-03-09T00:04:02.672 INFO:tasks.workunit.client.0.vm03.stdout:2/538: mkdir d8/d26/d5e/db1 0 2026-03-09T00:04:02.686 INFO:tasks.workunit.client.1.vm06.stdout:1/692: dwrite d6/d63/f6a [0,4194304] 0 2026-03-09T00:04:02.686 INFO:tasks.workunit.client.1.vm06.stdout:1/693: stat d6/d21/d2d 0 2026-03-09T00:04:02.692 INFO:tasks.workunit.client.1.vm06.stdout:1/694: read d6/f19 [4184120,65449] 0 2026-03-09T00:04:02.692 INFO:tasks.workunit.client.1.vm06.stdout:1/695: truncate d6/f34 1411870 0 2026-03-09T00:04:02.692 INFO:tasks.workunit.client.1.vm06.stdout:1/696: dread - d6/d4c/fe4 zero size 2026-03-09T00:04:02.701 INFO:tasks.workunit.client.0.vm03.stdout:0/524: sync 2026-03-09T00:04:02.701 
INFO:tasks.workunit.client.0.vm03.stdout:0/525: creat d2/da/d1a/fc4 x:0 0 0 2026-03-09T00:04:02.701 INFO:tasks.workunit.client.0.vm03.stdout:0/526: chown d2/da/dd/d49/d6c/d4b/d55/f78 2743 1 2026-03-09T00:04:02.701 INFO:tasks.workunit.client.0.vm03.stdout:0/527: write d2/da/dd/d49/f69 [320806,37176] 0 2026-03-09T00:04:02.701 INFO:tasks.workunit.client.0.vm03.stdout:1/641: truncate d4/d3a/f26 907511 0 2026-03-09T00:04:02.701 INFO:tasks.workunit.client.0.vm03.stdout:1/642: truncate d4/d15/d5c/fb1 600360 0 2026-03-09T00:04:02.702 INFO:tasks.workunit.client.0.vm03.stdout:3/415: rmdir d2/db/d2d 39 2026-03-09T00:04:02.702 INFO:tasks.workunit.client.0.vm03.stdout:3/416: creat d2/db/d40/f78 x:0 0 0 2026-03-09T00:04:02.702 INFO:tasks.workunit.client.1.vm06.stdout:7/777: rmdir d0/df 39 2026-03-09T00:04:02.704 INFO:tasks.workunit.client.0.vm03.stdout:8/561: link d7/df/d1e/d38/d4c/c50 d7/df/d1e/d3f/cae 0 2026-03-09T00:04:02.710 INFO:tasks.workunit.client.0.vm03.stdout:2/539: getdents d8/d1b/d8f 0 2026-03-09T00:04:02.710 INFO:tasks.workunit.client.0.vm03.stdout:2/540: write d8/f15 [4857023,120344] 0 2026-03-09T00:04:02.711 INFO:tasks.workunit.client.1.vm06.stdout:9/672: rmdir d1/d4/d6e/d14/d25/d85/d49 39 2026-03-09T00:04:02.711 INFO:tasks.workunit.client.1.vm06.stdout:2/854: truncate d7/da/db/f6e 1651317 0 2026-03-09T00:04:02.711 INFO:tasks.workunit.client.1.vm06.stdout:2/855: chown d7/d1a/l72 11581 1 2026-03-09T00:04:02.711 INFO:tasks.workunit.client.1.vm06.stdout:2/856: truncate d7/da/d63/fee 1028222 0 2026-03-09T00:04:02.711 INFO:tasks.workunit.client.1.vm06.stdout:6/793: symlink d4/d16/d53/ddf/d4b/ddb/lf6 0 2026-03-09T00:04:02.711 INFO:tasks.workunit.client.1.vm06.stdout:8/789: unlink db/d53/d70/d38/d4d/d79/fd1 0 2026-03-09T00:04:02.711 INFO:tasks.workunit.client.1.vm06.stdout:8/790: write db/d74/d78/d98/db6/ff0 [388202,25781] 0 2026-03-09T00:04:02.712 INFO:tasks.workunit.client.1.vm06.stdout:4/761: rename d17/d21/d4c/d66/d68/dbe/df3 to d17/d5b/d8f/dfd 0 2026-03-09T00:04:02.712 INFO:tasks.workunit.client.1.vm06.stdout:4/762: chown d17/d24/l4f 2851976 1 2026-03-09T00:04:02.712 INFO:tasks.workunit.client.1.vm06.stdout:4/763: creat d17/d21/d4c/d50/ffe x:0 0 0 2026-03-09T00:04:02.713 INFO:tasks.workunit.client.0.vm03.stdout:6/502: dread d13/d35/d74/d89/d9d/fa7 [0,4194304] 0 2026-03-09T00:04:02.714 INFO:tasks.workunit.client.0.vm03.stdout:6/503: dread d13/d35/d4c/f4f [0,4194304] 0 2026-03-09T00:04:02.715 INFO:tasks.workunit.client.0.vm03.stdout:6/504: dread d13/f1d [0,4194304] 0 2026-03-09T00:04:02.722 INFO:tasks.workunit.client.1.vm06.stdout:9/673: write d1/d3/d4f/d91/fc7 [4093957,11446] 0 2026-03-09T00:04:02.723 INFO:tasks.workunit.client.1.vm06.stdout:9/674: chown d1/d3/d50/fba 37354654 1 2026-03-09T00:04:02.728 INFO:tasks.workunit.client.1.vm06.stdout:1/697: creat d6/d4c/d71/fea x:0 0 0 2026-03-09T00:04:02.733 INFO:tasks.workunit.client.1.vm06.stdout:7/778: mkdir d0/df/d17/dba/de4 0 2026-03-09T00:04:02.733 INFO:tasks.workunit.client.1.vm06.stdout:7/779: fdatasync d0/df/d17/f1f 0 2026-03-09T00:04:02.734 INFO:tasks.workunit.client.1.vm06.stdout:7/780: dread d0/df/d1a/d27/d4c/d40/fa5 [0,4194304] 0 2026-03-09T00:04:02.734 INFO:tasks.workunit.client.1.vm06.stdout:7/781: chown d0/df/d1a/d27/d4c/d40/d51/d90/c95 28 1 2026-03-09T00:04:02.734 INFO:tasks.workunit.client.1.vm06.stdout:7/782: readlink d0/df/d1a/d27/d4c/d40/d51/d86/l91 0 2026-03-09T00:04:02.736 INFO:tasks.workunit.client.0.vm03.stdout:3/417: link d2/db/c35 d2/db/d3b/c79 0 2026-03-09T00:04:02.736 INFO:tasks.workunit.client.0.vm03.stdout:8/562: 
getdents d7/d92 0 2026-03-09T00:04:02.738 INFO:tasks.workunit.client.0.vm03.stdout:7/514: rename d2/d1f/d42/d43 to d2/d1f/d35/d9a/d9c 0 2026-03-09T00:04:02.739 INFO:tasks.workunit.client.0.vm03.stdout:9/577: dwrite d15/d1c/d36/f3a [0,4194304] 0 2026-03-09T00:04:02.759 INFO:tasks.workunit.client.0.vm03.stdout:6/505: mkdir d13/d35/d4c/dae 0 2026-03-09T00:04:02.759 INFO:tasks.workunit.client.0.vm03.stdout:3/418: creat d2/db/d3b/d5d/d6d/d72/f7a x:0 0 0 2026-03-09T00:04:02.759 INFO:tasks.workunit.client.0.vm03.stdout:3/419: fdatasync d2/db/f10 0 2026-03-09T00:04:02.759 INFO:tasks.workunit.client.0.vm03.stdout:3/420: dread - d2/db/d3b/f6c zero size 2026-03-09T00:04:02.764 INFO:tasks.workunit.client.1.vm06.stdout:6/794: symlink d4/d16/d53/ddf/d7e/lf7 0 2026-03-09T00:04:02.764 INFO:tasks.workunit.client.1.vm06.stdout:2/857: mknod d7/d1a/d39/df1/c106 0 2026-03-09T00:04:02.764 INFO:tasks.workunit.client.1.vm06.stdout:2/858: stat d7/d1a/d3c/f4d 0 2026-03-09T00:04:02.764 INFO:tasks.workunit.client.1.vm06.stdout:2/859: write d7/d1b/d71/d79/db4/dc1/d86/f8a [1650249,58329] 0 2026-03-09T00:04:02.765 INFO:tasks.workunit.client.1.vm06.stdout:4/764: link f14 d17/d5b/fff 0 2026-03-09T00:04:02.767 INFO:tasks.workunit.client.0.vm03.stdout:2/541: dread d8/d1b/d24/f2f [0,4194304] 0 2026-03-09T00:04:02.774 INFO:tasks.workunit.client.1.vm06.stdout:1/698: write d6/f25 [920615,8626] 0 2026-03-09T00:04:02.774 INFO:tasks.workunit.client.1.vm06.stdout:1/699: stat d6/d63/lda 0 2026-03-09T00:04:02.781 INFO:tasks.workunit.client.0.vm03.stdout:4/647: dwrite d7/d20/d29/d38/f8f [0,4194304] 0 2026-03-09T00:04:02.784 INFO:tasks.workunit.client.1.vm06.stdout:0/813: truncate d3/fa 233579 0 2026-03-09T00:04:02.786 INFO:tasks.workunit.client.0.vm03.stdout:9/578: symlink d15/d1c/d21/d54/d87/lbe 0 2026-03-09T00:04:02.790 INFO:tasks.workunit.client.1.vm06.stdout:9/675: mkdir d1/d3/d4f/d91/ddb 0 2026-03-09T00:04:02.795 INFO:tasks.workunit.client.1.vm06.stdout:2/860: mknod d7/d1b/d71/d79/db4/dc1/d86/c107 0 2026-03-09T00:04:02.798 INFO:tasks.workunit.client.1.vm06.stdout:1/700: creat d6/d4c/feb x:0 0 0 2026-03-09T00:04:02.798 INFO:tasks.workunit.client.0.vm03.stdout:4/648: truncate d7/f28 538536 0 2026-03-09T00:04:02.809 INFO:tasks.workunit.client.0.vm03.stdout:4/649: dread - d7/d20/d29/d4e/f9e zero size 2026-03-09T00:04:02.809 INFO:tasks.workunit.client.0.vm03.stdout:4/650: chown d7/d20/d29/d38/fca 12950 1 2026-03-09T00:04:02.809 INFO:tasks.workunit.client.1.vm06.stdout:9/676: mkdir d1/d3/ddc 0 2026-03-09T00:04:02.809 INFO:tasks.workunit.client.1.vm06.stdout:9/677: dread - d1/d4/f9c zero size 2026-03-09T00:04:02.809 INFO:tasks.workunit.client.1.vm06.stdout:9/678: stat d1/d4/d6e/d14/fb2 0 2026-03-09T00:04:02.810 INFO:tasks.workunit.client.1.vm06.stdout:9/679: write d1/d4/d6e/d9/f87 [351163,6369] 0 2026-03-09T00:04:02.810 INFO:tasks.workunit.client.1.vm06.stdout:1/701: mknod d6/d4c/d51/cec 0 2026-03-09T00:04:02.810 INFO:tasks.workunit.client.1.vm06.stdout:9/680: symlink d1/d3/d4f/d91/ddb/ldd 0 2026-03-09T00:04:02.810 INFO:tasks.workunit.client.1.vm06.stdout:9/681: truncate d1/d3/d2b/d58/f86 907523 0 2026-03-09T00:04:02.810 INFO:tasks.workunit.client.0.vm03.stdout:9/579: getdents d15/d1c 0 2026-03-09T00:04:02.810 INFO:tasks.workunit.client.0.vm03.stdout:5/546: rename d1c/c49 to d1c/d20/d55/d43/cb1 0 2026-03-09T00:04:02.819 INFO:tasks.workunit.client.0.vm03.stdout:4/651: rmdir d7 39 2026-03-09T00:04:02.819 INFO:tasks.workunit.client.0.vm03.stdout:9/580: creat d15/d1c/d21/d54/d87/d93/fbf x:0 0 0 2026-03-09T00:04:02.819 
INFO:tasks.workunit.client.0.vm03.stdout:2/542: rename d8/d1b/d2a/d6b/f89 to d8/d1b/d24/fb2 0 2026-03-09T00:04:02.819 INFO:tasks.workunit.client.0.vm03.stdout:5/547: mkdir d1c/d20/d55/d66/db2 0 2026-03-09T00:04:02.819 INFO:tasks.workunit.client.0.vm03.stdout:5/548: write fe [4112081,130206] 0 2026-03-09T00:04:02.819 INFO:tasks.workunit.client.0.vm03.stdout:4/652: mknod d7/d20/d29/d38/d3a/cd3 0 2026-03-09T00:04:02.819 INFO:tasks.workunit.client.0.vm03.stdout:9/581: creat d15/d1c/d28/d6e/da2/fc0 x:0 0 0 2026-03-09T00:04:02.819 INFO:tasks.workunit.client.0.vm03.stdout:2/543: mknod d8/d26/d5e/d5f/cb3 0 2026-03-09T00:04:02.820 INFO:tasks.workunit.client.0.vm03.stdout:2/544: dread - d8/d26/d5e/d6f/d97/fae zero size 2026-03-09T00:04:02.820 INFO:tasks.workunit.client.1.vm06.stdout:9/682: creat d1/d3/ddc/fde x:0 0 0 2026-03-09T00:04:02.820 INFO:tasks.workunit.client.0.vm03.stdout:5/549: dread d1c/d20/f65 [0,4194304] 0 2026-03-09T00:04:02.820 INFO:tasks.workunit.client.0.vm03.stdout:5/550: chown d1c/c1d 3 1 2026-03-09T00:04:02.820 INFO:tasks.workunit.client.0.vm03.stdout:4/653: mknod d7/d20/d29/d54/d58/d85/cd4 0 2026-03-09T00:04:02.820 INFO:tasks.workunit.client.0.vm03.stdout:4/654: chown d7/d20/d35 7 1 2026-03-09T00:04:02.820 INFO:tasks.workunit.client.0.vm03.stdout:2/545: symlink d8/d1b/d2a/d2e/d9a/lb4 0 2026-03-09T00:04:02.820 INFO:tasks.workunit.client.0.vm03.stdout:5/551: link d1c/d51/d6a/d75/faf d1c/d20/d97/fb3 0 2026-03-09T00:04:02.820 INFO:tasks.workunit.client.0.vm03.stdout:5/552: chown d1c/d20/d55/d4f/d58/d5d/faa 70450741 1 2026-03-09T00:04:02.820 INFO:tasks.workunit.client.0.vm03.stdout:5/553: getdents d1c/d20/d55/d43/da7 0 2026-03-09T00:04:02.820 INFO:tasks.workunit.client.0.vm03.stdout:5/554: getdents d1c/d20/d55/d4f/d58/d73/d76 0 2026-03-09T00:04:02.820 INFO:tasks.workunit.client.0.vm03.stdout:4/655: unlink d7/d20/d29/d38/d3a/f4b 0 2026-03-09T00:04:02.830 INFO:tasks.workunit.client.0.vm03.stdout:4/656: symlink d7/d20/d29/d38/ld5 0 2026-03-09T00:04:02.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:02 vm03.local ceph-mon[52346]: pgmap v4: 65 pgs: 65 active+clean; 2.8 GiB data, 9.5 GiB used, 110 GiB / 120 GiB avail 2026-03-09T00:04:02.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:02 vm03.local ceph-mon[52346]: mgrmap e29: vm03.yvcons(active, since 2s), standbys: vm06.rzcvhn 2026-03-09T00:04:02.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:02 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr metadata", "who": "vm06.rzcvhn", "id": "vm06.rzcvhn"}]: dispatch 2026-03-09T00:04:02.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:02 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:02.882 INFO:tasks.workunit.client.1.vm06.stdout:1/702: dread d6/d21/d2d/f5b [0,4194304] 0 2026-03-09T00:04:02.882 INFO:tasks.workunit.client.1.vm06.stdout:1/703: symlink d6/dc4/led 0 2026-03-09T00:04:02.885 INFO:tasks.workunit.client.1.vm06.stdout:1/704: link d6/d21/d2d/d3b/fa2 d6/d4c/d79/fee 0 2026-03-09T00:04:02.885 INFO:tasks.workunit.client.1.vm06.stdout:1/705: chown d6/d4c/l97 205577269 1 2026-03-09T00:04:02.909 INFO:tasks.workunit.client.0.vm03.stdout:3/421: dwrite d2/f5 [0,4194304] 0 2026-03-09T00:04:02.910 INFO:tasks.workunit.client.0.vm03.stdout:3/422: dread - d2/db/d3b/d5d/d6d/d72/f7a zero size 2026-03-09T00:04:02.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:02 vm06.local ceph-mon[58395]: pgmap v4: 65 pgs: 65 
active+clean; 2.8 GiB data, 9.5 GiB used, 110 GiB / 120 GiB avail 2026-03-09T00:04:02.937 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:02 vm06.local ceph-mon[58395]: mgrmap e29: vm03.yvcons(active, since 2s), standbys: vm06.rzcvhn 2026-03-09T00:04:02.937 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:02 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr metadata", "who": "vm06.rzcvhn", "id": "vm06.rzcvhn"}]: dispatch 2026-03-09T00:04:02.937 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:02 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:02.938 INFO:tasks.workunit.client.0.vm03.stdout:1/643: dwrite d4/d5e/fd3 [0,4194304] 0 2026-03-09T00:04:02.942 INFO:tasks.workunit.client.0.vm03.stdout:1/644: dread d4/d15/d77/f7c [0,4194304] 0 2026-03-09T00:04:02.942 INFO:tasks.workunit.client.1.vm06.stdout:8/791: dwrite db/dd/d24/fba [0,4194304] 0 2026-03-09T00:04:02.945 INFO:tasks.workunit.client.0.vm03.stdout:7/515: dwrite d2/d1f/d42/f47 [0,4194304] 0 2026-03-09T00:04:02.945 INFO:tasks.workunit.client.0.vm03.stdout:7/516: write d2/d1f/f3b [3087556,51444] 0 2026-03-09T00:04:02.951 INFO:tasks.workunit.client.0.vm03.stdout:1/645: read d4/d3a/f26 [629789,85336] 0 2026-03-09T00:04:02.957 INFO:tasks.workunit.client.0.vm03.stdout:0/528: fsync d2/da/d1a/fc4 0 2026-03-09T00:04:02.965 INFO:tasks.workunit.client.0.vm03.stdout:7/517: symlink d2/d1f/d42/d46/d81/d96/d80/l9d 0 2026-03-09T00:04:02.967 INFO:tasks.workunit.client.0.vm03.stdout:1/646: symlink d4/d3a/d61/d78/d81/le0 0 2026-03-09T00:04:02.967 INFO:tasks.workunit.client.0.vm03.stdout:1/647: write d4/d3a/d32/f4b [2920854,59021] 0 2026-03-09T00:04:02.967 INFO:tasks.workunit.client.1.vm06.stdout:0/814: dwrite d3/d18/d2c/d2d/d31/f4f [0,4194304] 0 2026-03-09T00:04:02.967 INFO:tasks.workunit.client.1.vm06.stdout:0/815: stat d3/d18/d2c/d2d/d74/dc7/d110/d45/l104 0 2026-03-09T00:04:02.967 INFO:tasks.workunit.client.1.vm06.stdout:0/816: chown d3/d18/d1f/d39/d49/d60/fe8 49762470 1 2026-03-09T00:04:02.969 INFO:tasks.workunit.client.0.vm03.stdout:0/529: rmdir d2/da 39 2026-03-09T00:04:02.971 INFO:tasks.workunit.client.0.vm03.stdout:0/530: rename d2/da/dd/d49/f87 to d2/da/dd/d49/d6c/d81/fc5 0 2026-03-09T00:04:02.971 INFO:tasks.workunit.client.0.vm03.stdout:0/531: truncate d2/da/dd/f75 1021813 0 2026-03-09T00:04:02.972 INFO:tasks.workunit.client.1.vm06.stdout:0/817: symlink d3/d18/de9/l114 0 2026-03-09T00:04:02.978 INFO:tasks.workunit.client.1.vm06.stdout:0/818: mkdir d3/d18/d1f/d39/d49/d115 0 2026-03-09T00:04:02.978 INFO:tasks.workunit.client.1.vm06.stdout:0/819: fsync d3/d18/d2c/d2d/d74/dc7/d110/f81 0 2026-03-09T00:04:02.979 INFO:tasks.workunit.client.1.vm06.stdout:2/861: dread d7/da/db/f98 [0,4194304] 0 2026-03-09T00:04:02.986 INFO:tasks.workunit.client.1.vm06.stdout:2/862: unlink d7/da/db/l44 0 2026-03-09T00:04:02.986 INFO:tasks.workunit.client.1.vm06.stdout:2/863: dread - d7/d1b/d71/d79/fdf zero size 2026-03-09T00:04:02.986 INFO:tasks.workunit.client.1.vm06.stdout:2/864: chown d7/d1a/d3c/ld5 6110142 1 2026-03-09T00:04:02.986 INFO:tasks.workunit.client.1.vm06.stdout:2/865: creat d7/d1b/f108 x:0 0 0 2026-03-09T00:04:02.989 INFO:tasks.workunit.client.0.vm03.stdout:5/555: dwrite d1c/d20/d56/d74/f9a [0,4194304] 0 2026-03-09T00:04:02.989 INFO:tasks.workunit.client.0.vm03.stdout:5/556: chown d1c/d20/fa3 121800428 1 2026-03-09T00:04:02.991 INFO:tasks.workunit.client.1.vm06.stdout:5/884: sync 2026-03-09T00:04:02.991 
INFO:tasks.workunit.client.1.vm06.stdout:3/775: sync 2026-03-09T00:04:02.991 INFO:tasks.workunit.client.1.vm06.stdout:3/776: readlink d11/d28/d2e/d2f/d36/d8f/lcb 0 2026-03-09T00:04:02.992 INFO:tasks.workunit.client.0.vm03.stdout:3/423: rmdir d2 39 2026-03-09T00:04:02.992 INFO:tasks.workunit.client.0.vm03.stdout:3/424: rename d2/db to d2/db/d40/d7b 22 2026-03-09T00:04:02.992 INFO:tasks.workunit.client.0.vm03.stdout:3/425: fdatasync d2/db/d40/f4a 0 2026-03-09T00:04:02.992 INFO:tasks.workunit.client.0.vm03.stdout:3/426: chown d2/db/f10 21226 1 2026-03-09T00:04:02.993 INFO:tasks.workunit.client.1.vm06.stdout:0/820: dread d3/d18/d1f/d39/d3b/fcf [0,4194304] 0 2026-03-09T00:04:02.993 INFO:tasks.workunit.client.1.vm06.stdout:0/821: readlink d3/d18/d2c/d2d/l99 0 2026-03-09T00:04:02.999 INFO:tasks.workunit.client.0.vm03.stdout:7/518: dread d2/d1f/d42/d46/d54/f77 [0,4194304] 0 2026-03-09T00:04:03.001 INFO:tasks.workunit.client.1.vm06.stdout:2/866: mkdir d7/da/db/d109 0 2026-03-09T00:04:03.001 INFO:tasks.workunit.client.1.vm06.stdout:2/867: readlink d7/d1a/d3c/l42 0 2026-03-09T00:04:03.002 INFO:tasks.workunit.client.1.vm06.stdout:5/885: creat d5/d1c/d21/d28/d102/f128 x:0 0 0 2026-03-09T00:04:03.005 INFO:tasks.workunit.client.1.vm06.stdout:3/777: creat d11/d28/d2e/d2f/dc1/f10f x:0 0 0 2026-03-09T00:04:03.005 INFO:tasks.workunit.client.1.vm06.stdout:3/778: readlink d11/d28/d2e/l9e 0 2026-03-09T00:04:03.006 INFO:tasks.workunit.client.1.vm06.stdout:0/822: mkdir d3/d18/d1f/d39/d69/d116 0 2026-03-09T00:04:03.006 INFO:tasks.workunit.client.1.vm06.stdout:0/823: readlink d3/d10f/lf3 0 2026-03-09T00:04:03.006 INFO:tasks.workunit.client.1.vm06.stdout:0/824: dread - d3/d18/d2c/d2d/d74/dc7/d110/d45/ffa zero size 2026-03-09T00:04:03.007 INFO:tasks.workunit.client.1.vm06.stdout:2/868: rename d7/d1a/d25/d66/d87/f9b to d7/d1b/d71/d79/db4/dc1/f10a 0 2026-03-09T00:04:03.007 INFO:tasks.workunit.client.1.vm06.stdout:2/869: fsync d7/da/db/f6e 0 2026-03-09T00:04:03.021 INFO:tasks.workunit.client.1.vm06.stdout:3/779: symlink d11/d28/d4d/d9b/l110 0 2026-03-09T00:04:03.021 INFO:tasks.workunit.client.1.vm06.stdout:3/780: dread d11/d3f/f4c [0,4194304] 0 2026-03-09T00:04:03.022 INFO:tasks.workunit.client.1.vm06.stdout:0/825: mkdir d3/d18/d2c/d2d/d74/d90/d117 0 2026-03-09T00:04:03.027 INFO:tasks.workunit.client.0.vm03.stdout:5/557: chown f15 1614 1 2026-03-09T00:04:03.034 INFO:tasks.workunit.client.0.vm03.stdout:5/558: chown d1c/d20/d55/f7d 0 1 2026-03-09T00:04:03.034 INFO:tasks.workunit.client.0.vm03.stdout:7/519: mknod d2/d1f/d42/d46/d54/d8d/c9e 0 2026-03-09T00:04:03.034 INFO:tasks.workunit.client.0.vm03.stdout:5/559: mkdir d1c/d20/d56/db4 0 2026-03-09T00:04:03.035 INFO:tasks.workunit.client.0.vm03.stdout:5/560: rmdir d1c/d20/d55/d4f/d58/d73/d76 39 2026-03-09T00:04:03.035 INFO:tasks.workunit.client.1.vm06.stdout:3/781: symlink d11/d28/d2e/db2/d100/l111 0 2026-03-09T00:04:03.035 INFO:tasks.workunit.client.1.vm06.stdout:1/706: dwrite d6/d21/d2d/d37/f8b [0,4194304] 0 2026-03-09T00:04:03.035 INFO:tasks.workunit.client.1.vm06.stdout:0/826: symlink d3/d18/d1f/d44/l118 0 2026-03-09T00:04:03.035 INFO:tasks.workunit.client.1.vm06.stdout:3/782: mkdir d11/d28/d4d/d89/d90/d112 0 2026-03-09T00:04:03.035 INFO:tasks.workunit.client.1.vm06.stdout:3/783: chown d11/d28/d2e/d2f/dc1/f10f 329 1 2026-03-09T00:04:03.035 INFO:tasks.workunit.client.1.vm06.stdout:1/707: mkdir d6/d21/def 0 2026-03-09T00:04:03.036 INFO:tasks.workunit.client.0.vm03.stdout:9/582: dwrite d15/d1c/d21/f71 [0,4194304] 0 2026-03-09T00:04:03.038 
INFO:tasks.workunit.client.1.vm06.stdout:3/784: unlink d11/f27 0 2026-03-09T00:04:03.041 INFO:tasks.workunit.client.1.vm06.stdout:4/765: dwrite d17/d21/fa6 [0,4194304] 0 2026-03-09T00:04:03.041 INFO:tasks.workunit.client.1.vm06.stdout:0/827: mkdir d3/d18/d1f/d119 0 2026-03-09T00:04:03.041 INFO:tasks.workunit.client.1.vm06.stdout:1/708: mkdir d6/d21/d2d/d3b/d42/df0 0 2026-03-09T00:04:03.041 INFO:tasks.workunit.client.1.vm06.stdout:6/795: dwrite d4/d16/d46/fc4 [0,4194304] 0 2026-03-09T00:04:03.041 INFO:tasks.workunit.client.1.vm06.stdout:6/796: truncate d4/d16/d53/ddf/d52/d7d/f9d 5071112 0 2026-03-09T00:04:03.046 INFO:tasks.workunit.client.1.vm06.stdout:6/797: dread d4/d27/d3e/d57/f79 [0,4194304] 0 2026-03-09T00:04:03.046 INFO:tasks.workunit.client.0.vm03.stdout:9/583: dread d15/d1c/d36/f9e [0,4194304] 0 2026-03-09T00:04:03.046 INFO:tasks.workunit.client.0.vm03.stdout:9/584: dread - d15/d1c/d36/f72 zero size 2026-03-09T00:04:03.047 INFO:tasks.workunit.client.1.vm06.stdout:3/785: mknod d11/d28/d2e/d2f/d5b/d5f/db1/c113 0 2026-03-09T00:04:03.048 INFO:tasks.workunit.client.1.vm06.stdout:4/766: mknod d17/c100 0 2026-03-09T00:04:03.053 INFO:tasks.workunit.client.1.vm06.stdout:0/828: mkdir d3/d18/d2c/d2d/d74/da8/d11a 0 2026-03-09T00:04:03.054 INFO:tasks.workunit.client.1.vm06.stdout:3/786: unlink d11/d28/d2e/d2f/f78 0 2026-03-09T00:04:03.060 INFO:tasks.workunit.client.1.vm06.stdout:3/787: creat d11/d28/d2e/db2/dc2/f114 x:0 0 0 2026-03-09T00:04:03.068 INFO:tasks.workunit.client.1.vm06.stdout:3/788: creat d11/d28/d4d/d89/f115 x:0 0 0 2026-03-09T00:04:03.068 INFO:tasks.workunit.client.1.vm06.stdout:7/783: write d0/df/d1a/d27/d4c/d40/f67 [1006171,125256] 0 2026-03-09T00:04:03.068 INFO:tasks.workunit.client.1.vm06.stdout:3/789: rename d11/f1a to d11/d28/d2e/db2/f116 0 2026-03-09T00:04:03.071 INFO:tasks.workunit.client.0.vm03.stdout:4/657: dwrite d7/d27/dc9/fd2 [0,4194304] 0 2026-03-09T00:04:03.077 INFO:tasks.workunit.client.1.vm06.stdout:9/683: dwrite d1/d73/f8f [0,4194304] 0 2026-03-09T00:04:03.077 INFO:tasks.workunit.client.1.vm06.stdout:7/784: mknod d0/df/d1a/d27/d4c/d40/ce5 0 2026-03-09T00:04:03.077 INFO:tasks.workunit.client.1.vm06.stdout:7/785: chown d0/df/d1a/d3a/d4e 243828231 1 2026-03-09T00:04:03.077 INFO:tasks.workunit.client.1.vm06.stdout:0/829: dread d3/d18/d1f/d39/d3b/df9/df2/f96 [0,4194304] 0 2026-03-09T00:04:03.077 INFO:tasks.workunit.client.1.vm06.stdout:0/830: write d3/f7 [4415027,94744] 0 2026-03-09T00:04:03.077 INFO:tasks.workunit.client.1.vm06.stdout:8/792: dread db/d74/d78/d98/db6/ff0 [0,4194304] 0 2026-03-09T00:04:03.093 INFO:tasks.workunit.client.0.vm03.stdout:2/546: getdents d8/d1b/d24 0 2026-03-09T00:04:03.097 INFO:tasks.workunit.client.1.vm06.stdout:5/886: dwrite d5/d1c/d21/d28/d5e/d66/d78/da6/f124 [0,4194304] 0 2026-03-09T00:04:03.098 INFO:tasks.workunit.client.1.vm06.stdout:5/887: readlink d5/d44/l74 0 2026-03-09T00:04:03.103 INFO:tasks.workunit.client.0.vm03.stdout:7/520: write d2/d4/f2e [832432,127541] 0 2026-03-09T00:04:03.115 INFO:tasks.workunit.client.0.vm03.stdout:0/532: dwrite d2/f59 [0,4194304] 0 2026-03-09T00:04:03.115 INFO:tasks.workunit.client.0.vm03.stdout:0/533: truncate d2/da/d1a/f84 416240 0 2026-03-09T00:04:03.115 INFO:tasks.workunit.client.0.vm03.stdout:0/534: fsync d2/da/dd/d49/d6c/d4b/f4c 0 2026-03-09T00:04:03.118 INFO:tasks.workunit.client.1.vm06.stdout:7/786: creat d0/df/d1a/fe6 x:0 0 0 2026-03-09T00:04:03.118 INFO:tasks.workunit.client.1.vm06.stdout:7/787: dread d0/df/d1a/d27/d4c/d40/fa5 [0,4194304] 0 2026-03-09T00:04:03.119 
INFO:tasks.workunit.client.0.vm03.stdout:2/547: creat d8/d1b/d24/da5/fb5 x:0 0 0 2026-03-09T00:04:03.120 INFO:tasks.workunit.client.0.vm03.stdout:2/548: read d8/fd [360770,42203] 0 2026-03-09T00:04:03.120 INFO:tasks.workunit.client.0.vm03.stdout:2/549: creat d8/d1b/d2a/d56/fb6 x:0 0 0 2026-03-09T00:04:03.124 INFO:tasks.workunit.client.0.vm03.stdout:0/535: chown d2/da/d36/da4 102868946 1 2026-03-09T00:04:03.137 INFO:tasks.workunit.client.0.vm03.stdout:0/536: chown d2/da/dd/d49/d6c/d81/l94 1462 1 2026-03-09T00:04:03.138 INFO:tasks.workunit.client.1.vm06.stdout:8/793: rename db/d53/d7c/d8f/lce to db/d53/d70/d38/d47/lfe 0 2026-03-09T00:04:03.138 INFO:tasks.workunit.client.0.vm03.stdout:3/427: dread d2/db/f64 [0,4194304] 0 2026-03-09T00:04:03.138 INFO:tasks.workunit.client.0.vm03.stdout:6/506: dwrite d13/f5d [0,4194304] 0 2026-03-09T00:04:03.138 INFO:tasks.workunit.client.1.vm06.stdout:4/767: dread d17/d24/f29 [0,4194304] 0 2026-03-09T00:04:03.152 INFO:tasks.workunit.client.1.vm06.stdout:2/870: dwrite d7/d1b/f22 [0,4194304] 0 2026-03-09T00:04:03.161 INFO:tasks.workunit.client.1.vm06.stdout:3/790: mkdir d11/d117 0 2026-03-09T00:04:03.167 INFO:tasks.workunit.client.0.vm03.stdout:7/521: dwrite d2/d1f/d42/d91/f72 [0,4194304] 0 2026-03-09T00:04:03.168 INFO:tasks.workunit.client.0.vm03.stdout:4/658: getdents d7/d6f/da5 0 2026-03-09T00:04:03.169 INFO:tasks.workunit.client.1.vm06.stdout:0/831: dread d3/f7 [0,4194304] 0 2026-03-09T00:04:03.169 INFO:tasks.workunit.client.1.vm06.stdout:0/832: creat d3/d18/d2c/d2d/d74/daf/f11b x:0 0 0 2026-03-09T00:04:03.169 INFO:tasks.workunit.client.1.vm06.stdout:0/833: stat d3/fa 0 2026-03-09T00:04:03.174 INFO:tasks.workunit.client.0.vm03.stdout:7/522: read d2/d1f/d42/d46/d81/f8f [1591831,100381] 0 2026-03-09T00:04:03.174 INFO:tasks.workunit.client.0.vm03.stdout:7/523: write d2/d1f/d42/d91/d67/f95 [924738,83065] 0 2026-03-09T00:04:03.179 INFO:tasks.workunit.client.0.vm03.stdout:7/524: write d2/d1f/d35/d9a/d9c/f74 [2677342,31951] 0 2026-03-09T00:04:03.186 INFO:tasks.workunit.client.0.vm03.stdout:7/525: chown d2/d1f/d42/d46/d81/d96/d80/f93 1 1 2026-03-09T00:04:03.186 INFO:tasks.workunit.client.0.vm03.stdout:7/526: fsync d2/d4/fb 0 2026-03-09T00:04:03.186 INFO:tasks.workunit.client.1.vm06.stdout:9/684: truncate d1/d4/d6e/d9/f10 429072 0 2026-03-09T00:04:03.194 INFO:tasks.workunit.client.1.vm06.stdout:5/888: dwrite d5/d44/d84/dc5/fd2 [0,4194304] 0 2026-03-09T00:04:03.195 INFO:tasks.workunit.client.0.vm03.stdout:2/550: truncate d8/d26/d5e/d6f/d97/f68 4194463 0 2026-03-09T00:04:03.195 INFO:tasks.workunit.client.0.vm03.stdout:9/585: dwrite d15/d1c/d21/d54/d87/d93/fbf [0,4194304] 0 2026-03-09T00:04:03.197 INFO:tasks.workunit.client.0.vm03.stdout:0/537: creat d2/da/dd/d49/d6c/d4b/daf/fc6 x:0 0 0 2026-03-09T00:04:03.202 INFO:tasks.workunit.client.0.vm03.stdout:9/586: write fc [3703,72920] 0 2026-03-09T00:04:03.202 INFO:tasks.workunit.client.0.vm03.stdout:3/428: creat d2/db/d3b/d3f/f7c x:0 0 0 2026-03-09T00:04:03.208 INFO:tasks.workunit.client.1.vm06.stdout:7/788: truncate d0/df/d1a/d22/f2c 7586975 0 2026-03-09T00:04:03.220 INFO:tasks.workunit.client.1.vm06.stdout:8/794: truncate db/d53/d70/f75 3989928 0 2026-03-09T00:04:03.221 INFO:tasks.workunit.client.0.vm03.stdout:6/507: dwrite f8 [0,4194304] 0 2026-03-09T00:04:03.226 INFO:tasks.workunit.client.1.vm06.stdout:0/834: dwrite d3/f1a [0,4194304] 0 2026-03-09T00:04:03.226 INFO:tasks.workunit.client.1.vm06.stdout:0/835: fsync d3/f1c 0 2026-03-09T00:04:03.252 INFO:tasks.workunit.client.1.vm06.stdout:4/768: creat 
d17/d21/d4c/d66/d68/dbe/f101 x:0 0 0 2026-03-09T00:04:03.256 INFO:tasks.workunit.client.1.vm06.stdout:2/871: creat d7/da/d63/d81/dfe/db2/dc9/f10b x:0 0 0 2026-03-09T00:04:03.259 INFO:tasks.workunit.client.1.vm06.stdout:3/791: truncate d11/d28/d2e/d2f/d36/f4a 6504902 0 2026-03-09T00:04:03.260 INFO:tasks.workunit.client.1.vm06.stdout:3/792: dread d11/f48 [0,4194304] 0 2026-03-09T00:04:03.271 INFO:tasks.workunit.client.0.vm03.stdout:7/527: dwrite d2/d1f/d42/d46/d81/d96/d80/f93 [0,4194304] 0 2026-03-09T00:04:03.278 INFO:tasks.workunit.client.1.vm06.stdout:5/889: creat d5/d1c/d68/dec/d115/d11e/d92/f129 x:0 0 0 2026-03-09T00:04:03.281 INFO:tasks.workunit.client.1.vm06.stdout:5/890: readlink d5/l1a 0 2026-03-09T00:04:03.282 INFO:tasks.workunit.client.1.vm06.stdout:5/891: chown d5/cee 1820008 1 2026-03-09T00:04:03.283 INFO:tasks.workunit.client.0.vm03.stdout:6/508: dwrite d13/d1e/d44/d4a/f58 [0,4194304] 0 2026-03-09T00:04:03.283 INFO:tasks.workunit.client.0.vm03.stdout:6/509: write d13/d1e/f3e [1298366,10305] 0 2026-03-09T00:04:03.283 INFO:tasks.workunit.client.0.vm03.stdout:6/510: chown d13/c39 12 1 2026-03-09T00:04:03.283 INFO:tasks.workunit.client.0.vm03.stdout:6/511: fsync d13/d35/d71/d97/f86 0 2026-03-09T00:04:03.283 INFO:tasks.workunit.client.1.vm06.stdout:0/836: dwrite d3/d18/d2c/f4d [0,4194304] 0 2026-03-09T00:04:03.313 INFO:tasks.workunit.client.0.vm03.stdout:5/561: dwrite d1c/d20/d55/d3b/f45 [0,4194304] 0 2026-03-09T00:04:03.314 INFO:tasks.workunit.client.0.vm03.stdout:9/587: unlink f8 0 2026-03-09T00:04:03.320 INFO:tasks.workunit.client.0.vm03.stdout:3/429: rename d2/db/d2d/f37 to d2/db/d56/f7d 0 2026-03-09T00:04:03.346 INFO:tasks.workunit.client.1.vm06.stdout:8/795: link db/d53/d70/d38/d4d/f65 db/d74/d78/d98/db6/fff 0 2026-03-09T00:04:03.353 INFO:tasks.workunit.client.0.vm03.stdout:7/528: symlink d2/d1f/d35/d9a/d9c/l9f 0 2026-03-09T00:04:03.360 INFO:tasks.workunit.client.0.vm03.stdout:4/659: getdents d7/d20/d29/d38 0 2026-03-09T00:04:03.361 INFO:tasks.workunit.client.0.vm03.stdout:4/660: dread d7/f15 [0,4194304] 0 2026-03-09T00:04:03.379 INFO:tasks.workunit.client.0.vm03.stdout:6/512: link d13/d1e/c64 d13/d35/d4c/caf 0 2026-03-09T00:04:03.382 INFO:tasks.workunit.client.0.vm03.stdout:4/661: read d7/d27/f2c [5467183,94748] 0 2026-03-09T00:04:03.383 INFO:tasks.workunit.client.0.vm03.stdout:8/563: sync 2026-03-09T00:04:03.395 INFO:tasks.workunit.client.1.vm06.stdout:4/769: truncate d17/d24/d3b/d5e/fe5 3535464 0 2026-03-09T00:04:03.395 INFO:tasks.workunit.client.1.vm06.stdout:4/770: chown d17/d21/d4c/d66/dd9/f7e 5550792 1 2026-03-09T00:04:03.402 INFO:tasks.workunit.client.1.vm06.stdout:2/872: mknod d7/da/db/c10c 0 2026-03-09T00:04:03.403 INFO:tasks.workunit.client.0.vm03.stdout:0/538: truncate d2/f1e 910037 0 2026-03-09T00:04:03.420 INFO:tasks.workunit.client.0.vm03.stdout:9/588: mkdir d15/d1c/d28/dc1 0 2026-03-09T00:04:03.420 INFO:tasks.workunit.client.0.vm03.stdout:9/589: chown d15/d1c/d21/d75/fa6 2758 1 2026-03-09T00:04:03.437 INFO:tasks.workunit.client.1.vm06.stdout:4/771: dwrite d17/d24/f2c [0,4194304] 0 2026-03-09T00:04:03.437 INFO:tasks.workunit.client.1.vm06.stdout:4/772: readlink d17/d5b/d8f/le2 0 2026-03-09T00:04:03.446 INFO:tasks.workunit.client.1.vm06.stdout:0/837: creat d3/d18/d2c/d2d/d74/dc7/d110/d45/f11c x:0 0 0 2026-03-09T00:04:03.455 INFO:tasks.workunit.client.1.vm06.stdout:4/773: dread d17/d21/d4c/d50/f8c [4194304,4194304] 0 2026-03-09T00:04:03.456 INFO:tasks.workunit.client.0.vm03.stdout:3/430: creat d2/db/f7e x:0 0 0 2026-03-09T00:04:03.464 
INFO:tasks.workunit.client.0.vm03.stdout:8/564: dwrite d7/f25 [0,4194304] 0 2026-03-09T00:04:03.470 INFO:tasks.workunit.client.0.vm03.stdout:5/562: rename d1c/d20/d55/d3b to d1c/d20/d55/d4f/d58/db5 0 2026-03-09T00:04:03.470 INFO:tasks.workunit.client.0.vm03.stdout:5/563: creat d1c/d20/d55/d4f/d58/d5d/fb6 x:0 0 0 2026-03-09T00:04:03.483 INFO:tasks.workunit.client.0.vm03.stdout:7/529: link d2/d4/d1e/f97 d2/d1f/d42/d91/d67/fa0 0 2026-03-09T00:04:03.487 INFO:tasks.workunit.client.0.vm03.stdout:6/513: rmdir d13/d35/d4c/dae 0 2026-03-09T00:04:03.488 INFO:tasks.workunit.client.0.vm03.stdout:4/662: rmdir d7/d20/d29/d38 39 2026-03-09T00:04:03.489 INFO:tasks.workunit.client.0.vm03.stdout:6/514: dread d13/f17 [0,4194304] 0 2026-03-09T00:04:03.490 INFO:tasks.workunit.client.1.vm06.stdout:7/789: dwrite d0/df/d1a/d3a/f83 [0,4194304] 0 2026-03-09T00:04:03.491 INFO:tasks.workunit.client.0.vm03.stdout:3/431: getdents d2/db/d3b/d5f 0 2026-03-09T00:04:03.493 INFO:tasks.workunit.client.0.vm03.stdout:8/565: symlink d7/laf 0 2026-03-09T00:04:03.495 INFO:tasks.workunit.client.1.vm06.stdout:3/793: mkdir d11/d28/d2e/d2f/d5b/d94/d118 0 2026-03-09T00:04:03.496 INFO:tasks.workunit.client.0.vm03.stdout:8/566: dread d7/d92/f75 [0,4194304] 0 2026-03-09T00:04:03.498 INFO:tasks.workunit.client.1.vm06.stdout:9/685: rename d1/d3/d4f/d91/d94/d9e to d1/d3/d4f/d91/d94/ddf 0 2026-03-09T00:04:03.499 INFO:tasks.workunit.client.0.vm03.stdout:7/530: dread d2/d1f/d3a/f1a [4194304,4194304] 0 2026-03-09T00:04:03.500 INFO:tasks.workunit.client.0.vm03.stdout:9/590: dwrite d15/d1c/d21/d64/fac [0,4194304] 0 2026-03-09T00:04:03.500 INFO:tasks.workunit.client.0.vm03.stdout:9/591: dread - d15/d1c/d21/d54/d87/d93/d74/f9d zero size 2026-03-09T00:04:03.504 INFO:tasks.workunit.client.0.vm03.stdout:8/567: dread d7/df/d1a/d2b/d62/faa [0,4194304] 0 2026-03-09T00:04:03.512 INFO:tasks.workunit.client.1.vm06.stdout:4/774: truncate d17/d21/f2f 1263585 0 2026-03-09T00:04:03.514 INFO:tasks.workunit.client.0.vm03.stdout:5/564: getdents d1c/d67 0 2026-03-09T00:04:03.521 INFO:tasks.workunit.client.0.vm03.stdout:3/432: getdents d2/db/d3b/d5d 0 2026-03-09T00:04:03.521 INFO:tasks.workunit.client.0.vm03.stdout:6/515: creat d13/d35/d71/fb0 x:0 0 0 2026-03-09T00:04:03.521 INFO:tasks.workunit.client.0.vm03.stdout:5/565: dread d1c/d20/d55/f3d [0,4194304] 0 2026-03-09T00:04:03.530 INFO:tasks.workunit.client.0.vm03.stdout:7/531: creat d2/d1f/d42/d46/d81/d96/d37/d39/d6e/fa1 x:0 0 0 2026-03-09T00:04:03.535 INFO:tasks.workunit.client.0.vm03.stdout:6/516: mkdir d13/d35/d71/d97/da5/db1 0 2026-03-09T00:04:03.537 INFO:tasks.workunit.client.0.vm03.stdout:7/532: mkdir d2/d1f/d42/d46/d81/d96/da2 0 2026-03-09T00:04:03.538 INFO:tasks.workunit.client.1.vm06.stdout:4/775: mkdir d17/d21/d32/d102 0 2026-03-09T00:04:03.538 INFO:tasks.workunit.client.1.vm06.stdout:3/794: creat d11/d28/d4d/d89/d90/dd2/dfd/f119 x:0 0 0 2026-03-09T00:04:03.538 INFO:tasks.workunit.client.1.vm06.stdout:3/795: chown d11/d28/d2e/d2f/f79 381 1 2026-03-09T00:04:03.540 INFO:tasks.workunit.client.1.vm06.stdout:7/790: rmdir d0/d55/d99 39 2026-03-09T00:04:03.541 INFO:tasks.workunit.client.0.vm03.stdout:7/533: mknod d2/d4/ca3 0 2026-03-09T00:04:03.542 INFO:tasks.workunit.client.1.vm06.stdout:8/796: rename db/d53/d7c to db/d74/d87/d100 0 2026-03-09T00:04:03.543 INFO:tasks.workunit.client.0.vm03.stdout:6/517: link d13/d1e/d44/d4a/l63 d13/d1e/d44/d59/lb2 0 2026-03-09T00:04:03.546 INFO:tasks.workunit.client.1.vm06.stdout:9/686: rmdir d1/d4/d6e/d14/d25/d85 39 2026-03-09T00:04:03.546 
INFO:tasks.workunit.client.1.vm06.stdout:4/776: symlink d17/d24/d49/de4/db0/ddd/l103 0 2026-03-09T00:04:03.546 INFO:tasks.workunit.client.1.vm06.stdout:3/796: link d11/d28/l30 d11/d28/d2e/db2/l11a 0 2026-03-09T00:04:03.546 INFO:tasks.workunit.client.1.vm06.stdout:3/797: chown d11/d28/d2e/d7e/fd3 10630 1 2026-03-09T00:04:03.548 INFO:tasks.workunit.client.1.vm06.stdout:7/791: creat d0/df/d1a/d27/d4c/d40/d51/d90/dae/de0/fe7 x:0 0 0 2026-03-09T00:04:03.549 INFO:tasks.workunit.client.0.vm03.stdout:6/518: mkdir d13/d35/d74/d89/db3 0 2026-03-09T00:04:03.549 INFO:tasks.workunit.client.0.vm03.stdout:6/519: chown d13/d35/d72/l80 7564 1 2026-03-09T00:04:03.551 INFO:tasks.workunit.client.1.vm06.stdout:4/777: truncate d17/d21/d32/fbd 3322652 0 2026-03-09T00:04:03.551 INFO:tasks.workunit.client.1.vm06.stdout:3/798: truncate d11/d28/d2e/d2f/d5b/d94/fa1 556290 0 2026-03-09T00:04:03.552 INFO:tasks.workunit.client.0.vm03.stdout:5/566: write d1c/d20/d55/d66/d70/f80 [3554530,42529] 0 2026-03-09T00:04:03.552 INFO:tasks.workunit.client.1.vm06.stdout:7/792: dread d0/df/d1a/d3a/d4e/d5e/f6f [0,4194304] 0 2026-03-09T00:04:03.553 INFO:tasks.workunit.client.1.vm06.stdout:7/793: fsync d0/df/d1a/d3f/fd3 0 2026-03-09T00:04:03.553 INFO:tasks.workunit.client.1.vm06.stdout:7/794: readlink d0/df/d1a/d22/lad 0 2026-03-09T00:04:03.553 INFO:tasks.workunit.client.0.vm03.stdout:5/567: truncate d1c/d20/d55/d4f/d58/d5d/faa 4056047 0 2026-03-09T00:04:03.556 INFO:tasks.workunit.client.1.vm06.stdout:8/797: rename db/dd/d85/d9f to db/d74/d78/d98/db6/dc7/d101 0 2026-03-09T00:04:03.557 INFO:tasks.workunit.client.1.vm06.stdout:4/778: truncate f14 541213 0 2026-03-09T00:04:03.557 INFO:tasks.workunit.client.1.vm06.stdout:4/779: truncate d17/d24/d49/de4/fc0 569921 0 2026-03-09T00:04:03.557 INFO:tasks.workunit.client.1.vm06.stdout:4/780: chown d17/d21/d4c/ced 1165 1 2026-03-09T00:04:03.557 INFO:tasks.workunit.client.1.vm06.stdout:3/799: write d11/d28/d2e/d2f/d5b/d5f/d91/fce [945306,94647] 0 2026-03-09T00:04:03.557 INFO:tasks.workunit.client.1.vm06.stdout:3/800: chown d11/d28/d2e/d2f/d36/d8f/fca 63376 1 2026-03-09T00:04:03.557 INFO:tasks.workunit.client.1.vm06.stdout:3/801: chown d11/d28/d2e/d2f/ldf 63522685 1 2026-03-09T00:04:03.560 INFO:tasks.workunit.client.0.vm03.stdout:4/663: dwrite d7/d27/f31 [0,4194304] 0 2026-03-09T00:04:03.563 INFO:tasks.workunit.client.1.vm06.stdout:0/838: dwrite d3/f7 [0,4194304] 0 2026-03-09T00:04:03.576 INFO:tasks.workunit.client.1.vm06.stdout:8/798: mknod db/d1e/d46/c102 0 2026-03-09T00:04:03.577 INFO:tasks.workunit.client.1.vm06.stdout:2/873: rename d7/da/fef to d7/d1a/d25/f10d 0 2026-03-09T00:04:03.577 INFO:tasks.workunit.client.0.vm03.stdout:4/664: mknod d7/d20/d29/d38/da9/cd6 0 2026-03-09T00:04:03.577 INFO:tasks.workunit.client.0.vm03.stdout:4/665: stat d7/d20/d6a/d77/l7a 0 2026-03-09T00:04:03.577 INFO:tasks.workunit.client.0.vm03.stdout:4/666: write f4 [7217401,74098] 0 2026-03-09T00:04:03.577 INFO:tasks.workunit.client.0.vm03.stdout:4/667: rmdir d7/d20/d35/d66 39 2026-03-09T00:04:03.577 INFO:tasks.workunit.client.0.vm03.stdout:4/668: creat d7/d20/d29/fd7 x:0 0 0 2026-03-09T00:04:03.580 INFO:tasks.workunit.client.0.vm03.stdout:4/669: dread d7/d20/d6a/d77/db7/f9a [0,4194304] 0 2026-03-09T00:04:03.580 INFO:tasks.workunit.client.0.vm03.stdout:4/670: readlink d7/d20/d6a/d77/l7a 0 2026-03-09T00:04:03.580 INFO:tasks.workunit.client.0.vm03.stdout:4/671: chown d7/d20/d35/d66 633467 1 2026-03-09T00:04:03.582 INFO:tasks.workunit.client.1.vm06.stdout:4/781: mknod d17/d21/d4c/c104 0 2026-03-09T00:04:03.583 
INFO:tasks.workunit.client.1.vm06.stdout:4/782: dread d17/d24/d49/d5f/f6b [0,4194304] 0 2026-03-09T00:04:03.583 INFO:tasks.workunit.client.1.vm06.stdout:4/783: truncate d17/d21/f4b 3175611 0 2026-03-09T00:04:03.583 INFO:tasks.workunit.client.1.vm06.stdout:4/784: creat d17/d21/d4c/d66/f105 x:0 0 0 2026-03-09T00:04:03.583 INFO:tasks.workunit.client.1.vm06.stdout:4/785: fsync d17/d24/d3b/f4a 0 2026-03-09T00:04:03.583 INFO:tasks.workunit.client.1.vm06.stdout:4/786: stat d17/d21/d4c/d66/fa2 0 2026-03-09T00:04:03.584 INFO:tasks.workunit.client.0.vm03.stdout:4/672: unlink d7/d20/d29/d38/ld5 0 2026-03-09T00:04:03.588 INFO:tasks.workunit.client.1.vm06.stdout:2/874: mknod d7/d1a/c10e 0 2026-03-09T00:04:03.602 INFO:tasks.workunit.client.1.vm06.stdout:9/687: rename d1/d3/d4f/d52/l7c to d1/d3/d4f/d91/dae/le0 0 2026-03-09T00:04:03.602 INFO:tasks.workunit.client.1.vm06.stdout:4/787: creat d17/d24/d3b/d97/db7/f106 x:0 0 0 2026-03-09T00:04:03.602 INFO:tasks.workunit.client.1.vm06.stdout:4/788: creat d17/d24/d3b/d97/db7/f107 x:0 0 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:2/875: unlink d7/d1b/d71/d79/db4/dc1/d86/f8a 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:2/876: creat d7/d1b/d31/f10f x:0 0 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:3/802: rename d11/d28/d2e/d2f/d5b/d94/d118 to d11/d28/d2e/d2f/d5b/d94/ddd/d11b 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:9/688: mknod d1/d3/d2b/ce1 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:4/789: symlink d17/d21/d4c/d66/de3/l108 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:2/877: creat d7/da/d63/d81/dfe/db2/d102/f110 x:0 0 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:2/878: chown d7/d1b/d71/d79/db4/dc1/d86/fe1 17136 1 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:2/879: chown d7/da/d63/d81/dfe/db2/dc9/f10b 0 1 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:8/799: rename db/d53/d70/f71 to db/dd/d24/db0/f103 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:8/800: write db/dd/d24/d63/fe9 [935639,43335] 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:9/689: symlink d1/d4/d6e/d14/d25/d85/le2 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:4/790: rmdir d17/d24/d3b/dbf 39 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:8/801: rename db/d1e/fda to db/d1e/d9b/df5/f104 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:9/690: mkdir d1/d3/d4f/d52/de3 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:2/880: rename d7/d1b/f108 to d7/d1b/d71/d79/db4/dc1/d86/f111 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:2/881: chown d7/d1b/d71/d79/db4/dc1/fd2 48569944 1 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:8/802: rmdir db/d53/d70/d38/d4d/d79 39 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:8/803: getdents db/d74/d78/d98/db6/dc7/d101/db7/de8 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:8/804: symlink db/l105 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:8/805: truncate db/d74/d87/d100/d8f/fcc 3667202 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:8/806: fdatasync db/d53/d70/d38/f3a 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:8/807: write db/d53/d70/d38/f72 [613097,122434] 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:8/808: 
read db/d74/d87/d100/fa0 [818232,3564] 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:4/791: rename d17/l86 to d17/d24/d49/l109 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:8/809: creat db/d74/d78/d98/db6/dc7/d101/db7/de8/f106 x:0 0 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:4/792: link d17/d24/d49/de4/db0/ddd/l103 d17/d21/d4c/d66/de3/l10a 0 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:4/793: dread - d17/d21/fb8 zero size 2026-03-09T00:04:03.604 INFO:tasks.workunit.client.1.vm06.stdout:4/794: write d17/d5b/d8f/fd3 [1043501,117316] 0 2026-03-09T00:04:03.608 INFO:tasks.workunit.client.1.vm06.stdout:9/691: dread d1/d73/f8f [4194304,4194304] 0 2026-03-09T00:04:03.614 INFO:tasks.workunit.client.1.vm06.stdout:2/882: dread d7/da/d1c/f70 [0,4194304] 0 2026-03-09T00:04:03.614 INFO:tasks.workunit.client.1.vm06.stdout:2/883: stat d7/d1b/d31/f104 0 2026-03-09T00:04:03.616 INFO:tasks.workunit.client.0.vm03.stdout:8/568: dwrite d7/df/d1a/d40/f4d [0,4194304] 0 2026-03-09T00:04:03.621 INFO:tasks.workunit.client.1.vm06.stdout:9/692: dread d1/d4/d6e/d9/fc3 [0,4194304] 0 2026-03-09T00:04:03.624 INFO:tasks.workunit.client.1.vm06.stdout:4/795: mkdir d17/d21/d4c/d66/de3/d10b 0 2026-03-09T00:04:03.644 INFO:tasks.workunit.client.1.vm06.stdout:2/884: symlink d7/d1a/d25/d66/l112 0 2026-03-09T00:04:03.645 INFO:tasks.workunit.client.1.vm06.stdout:4/796: rename d17/d21/lfa to d17/d24/d3b/dbf/ddf/df5/l10c 0 2026-03-09T00:04:03.647 INFO:tasks.workunit.client.0.vm03.stdout:8/569: rmdir d7/df/d1e/d38/d91 39 2026-03-09T00:04:03.647 INFO:tasks.workunit.client.0.vm03.stdout:8/570: chown d7/df/d1a/d2b/c82 361 1 2026-03-09T00:04:03.648 INFO:tasks.workunit.client.1.vm06.stdout:4/797: creat d17/d24/d3b/d97/db7/df1/f10d x:0 0 0 2026-03-09T00:04:03.653 INFO:tasks.workunit.client.1.vm06.stdout:4/798: rename d17/d21/d4c/d66/d68/ffb to d17/d24/d3b/d5e/f10e 0 2026-03-09T00:04:03.655 INFO:tasks.workunit.client.1.vm06.stdout:4/799: mknod d17/d24/d3b/d97/db7/c10f 0 2026-03-09T00:04:03.658 INFO:tasks.workunit.client.1.vm06.stdout:4/800: symlink d17/d21/d4c/d50/l110 0 2026-03-09T00:04:03.659 INFO:tasks.workunit.client.1.vm06.stdout:4/801: creat d17/d24/d3b/dbf/ddf/dfc/f111 x:0 0 0 2026-03-09T00:04:03.659 INFO:tasks.workunit.client.1.vm06.stdout:4/802: creat d17/d5b/dac/f112 x:0 0 0 2026-03-09T00:04:03.666 INFO:tasks.workunit.client.1.vm06.stdout:4/803: dread d17/d24/d49/f2a [0,4194304] 0 2026-03-09T00:04:03.666 INFO:tasks.workunit.client.1.vm06.stdout:4/804: dread - d17/d21/d4c/d66/dd9/f7e zero size 2026-03-09T00:04:03.670 INFO:tasks.workunit.client.1.vm06.stdout:8/810: dwrite db/d1e/f82 [0,4194304] 0 2026-03-09T00:04:03.670 INFO:tasks.workunit.client.1.vm06.stdout:8/811: chown db/d1e/l3b 26113827 1 2026-03-09T00:04:03.670 INFO:tasks.workunit.client.1.vm06.stdout:8/812: chown db/d1e/f34 764 1 2026-03-09T00:04:03.670 INFO:tasks.workunit.client.1.vm06.stdout:8/813: fsync db/dd/d24/dac/fc6 0 2026-03-09T00:04:03.683 INFO:tasks.workunit.client.1.vm06.stdout:4/805: rename d17/d21/d4c/f90 to d17/d24/d3b/f113 0 2026-03-09T00:04:03.689 INFO:tasks.workunit.client.1.vm06.stdout:4/806: chown d17/d24/d49/d5f/db2/cc6 2 1 2026-03-09T00:04:03.690 INFO:tasks.workunit.client.0.vm03.stdout:4/673: dwrite d7/d20/d35/f68 [4194304,4194304] 0 2026-03-09T00:04:03.690 INFO:tasks.workunit.client.0.vm03.stdout:4/674: stat d7/d20/cb1 0 2026-03-09T00:04:03.690 INFO:tasks.workunit.client.0.vm03.stdout:4/675: stat d7/d20/fce 0 2026-03-09T00:04:03.691 
INFO:tasks.workunit.client.1.vm06.stdout:4/807: creat d17/d24/d3b/dbf/ddf/f114 x:0 0 0 2026-03-09T00:04:03.692 INFO:tasks.workunit.client.1.vm06.stdout:4/808: symlink d17/d24/d49/de4/db0/l115 0 2026-03-09T00:04:03.692 INFO:tasks.workunit.client.1.vm06.stdout:4/809: creat d17/d24/d3b/dbf/ddf/dfc/f116 x:0 0 0 2026-03-09T00:04:03.694 INFO:tasks.workunit.client.1.vm06.stdout:4/810: write d17/d24/f3a [1084163,127214] 0 2026-03-09T00:04:03.695 INFO:tasks.workunit.client.1.vm06.stdout:4/811: truncate d17/d5b/f77 3310743 0 2026-03-09T00:04:03.696 INFO:tasks.workunit.client.1.vm06.stdout:0/839: dwrite d3/d18/d2c/d2d/d74/dc7/d110/d45/fa9 [4194304,4194304] 0 2026-03-09T00:04:03.700 INFO:tasks.workunit.client.0.vm03.stdout:4/676: getdents d7/d20/d35 0 2026-03-09T00:04:03.710 INFO:tasks.workunit.client.0.vm03.stdout:4/677: mknod d7/d20/d29/d78/cd8 0 2026-03-09T00:04:03.710 INFO:tasks.workunit.client.1.vm06.stdout:0/840: creat d3/d18/d2c/d2d/d74/da8/d109/f11d x:0 0 0 2026-03-09T00:04:03.710 INFO:tasks.workunit.client.1.vm06.stdout:0/841: unlink d3/d18/d1f/c24 0 2026-03-09T00:04:03.710 INFO:tasks.workunit.client.1.vm06.stdout:0/842: rename d3/cf7 to d3/d18/d2c/d2d/d8c/c11e 0 2026-03-09T00:04:03.717 INFO:tasks.workunit.client.1.vm06.stdout:0/843: symlink d3/d18/d2c/d2d/l11f 0 2026-03-09T00:04:03.722 INFO:tasks.workunit.client.1.vm06.stdout:2/885: dwrite d7/da/d4e/d57/fc5 [0,4194304] 0 2026-03-09T00:04:03.722 INFO:tasks.workunit.client.1.vm06.stdout:2/886: fdatasync d7/d1a/d25/d66/d87/fc3 0 2026-03-09T00:04:03.724 INFO:tasks.workunit.client.1.vm06.stdout:2/887: dread d7/d1a/d25/d66/d87/fc3 [0,4194304] 0 2026-03-09T00:04:03.728 INFO:tasks.workunit.client.1.vm06.stdout:2/888: creat d7/d1b/d71/d79/db4/f113 x:0 0 0 2026-03-09T00:04:03.728 INFO:tasks.workunit.client.1.vm06.stdout:2/889: chown d7/da/d93 45 1 2026-03-09T00:04:03.728 INFO:tasks.workunit.client.1.vm06.stdout:2/890: truncate d7/da/d63/d81/dfe/db2/dc9/f10b 427926 0 2026-03-09T00:04:03.728 INFO:tasks.workunit.client.1.vm06.stdout:2/891: truncate d7/d1b/d71/d79/db4/dc1/fe6 267501 0 2026-03-09T00:04:03.730 INFO:tasks.workunit.client.1.vm06.stdout:2/892: symlink d7/da/d4e/d57/d9d/l114 0 2026-03-09T00:04:03.732 INFO:tasks.workunit.client.1.vm06.stdout:2/893: link d7/d1b/d31/f10f d7/da/d4e/d57/f115 0 2026-03-09T00:04:03.734 INFO:tasks.workunit.client.1.vm06.stdout:2/894: creat d7/d1b/d71/d79/f116 x:0 0 0 2026-03-09T00:04:03.740 INFO:tasks.workunit.client.0.vm03.stdout:5/568: dwrite d1c/d20/d55/d4f/d58/d73/d9e/fae [0,4194304] 0 2026-03-09T00:04:03.740 INFO:tasks.workunit.client.0.vm03.stdout:5/569: write fe [1196598,109744] 0 2026-03-09T00:04:03.748 INFO:tasks.workunit.client.0.vm03.stdout:4/678: dread d7/d20/d6a/d77/d25/fa1 [0,4194304] 0 2026-03-09T00:04:03.760 INFO:tasks.workunit.client.0.vm03.stdout:5/570: dread d1c/d20/d55/f5a [0,4194304] 0 2026-03-09T00:04:03.760 INFO:tasks.workunit.client.0.vm03.stdout:5/571: write d1c/f1f [1493233,124206] 0 2026-03-09T00:04:03.760 INFO:tasks.workunit.client.0.vm03.stdout:4/679: dread d7/d20/f70 [0,4194304] 0 2026-03-09T00:04:03.760 INFO:tasks.workunit.client.0.vm03.stdout:4/680: write d7/d20/d29/d4e/f74 [694660,74452] 0 2026-03-09T00:04:03.760 INFO:tasks.workunit.client.0.vm03.stdout:5/572: creat d1c/d20/d56/db4/fb7 x:0 0 0 2026-03-09T00:04:03.760 INFO:tasks.workunit.client.0.vm03.stdout:5/573: mknod d1c/d20/d55/d4f/d58/d73/cb8 0 2026-03-09T00:04:03.763 INFO:tasks.workunit.client.0.vm03.stdout:8/571: dwrite d7/df/f29 [0,4194304] 0 2026-03-09T00:04:03.764 INFO:tasks.workunit.client.0.vm03.stdout:8/572: fdatasync 
d7/df/d1a/f33 0 2026-03-09T00:04:03.767 INFO:tasks.workunit.client.1.vm06.stdout:9/693: dwrite d1/d3/fa6 [4194304,4194304] 0 2026-03-09T00:04:03.772 INFO:tasks.workunit.client.0.vm03.stdout:3/433: dwrite d2/db/d3b/f4f [0,4194304] 0 2026-03-09T00:04:03.774 INFO:tasks.workunit.client.0.vm03.stdout:2/551: sync 2026-03-09T00:04:03.774 INFO:tasks.workunit.client.0.vm03.stdout:1/648: sync 2026-03-09T00:04:03.782 INFO:tasks.workunit.client.1.vm06.stdout:9/694: write d1/d4/d6e/f5d [2174689,34930] 0 2026-03-09T00:04:03.790 INFO:tasks.workunit.client.1.vm06.stdout:9/695: fdatasync d1/d4/f39 0 2026-03-09T00:04:03.802 INFO:tasks.workunit.client.1.vm06.stdout:8/814: dwrite db/dd/d48/f7f [0,4194304] 0 2026-03-09T00:04:03.812 INFO:tasks.workunit.client.1.vm06.stdout:1/709: sync 2026-03-09T00:04:03.812 INFO:tasks.workunit.client.1.vm06.stdout:6/798: sync 2026-03-09T00:04:03.819 INFO:tasks.workunit.client.1.vm06.stdout:7/795: dread d0/df/d1a/d22/f2c [0,4194304] 0 2026-03-09T00:04:03.820 INFO:tasks.workunit.client.1.vm06.stdout:7/796: chown d0/df/d1a/d27/d4c/d40/d51/d86/cbf 0 1 2026-03-09T00:04:03.820 INFO:tasks.workunit.client.0.vm03.stdout:2/552: dread d8/d1b/d24/f82 [0,4194304] 0 2026-03-09T00:04:03.827 INFO:tasks.workunit.client.1.vm06.stdout:6/799: write d4/d27/d3e/d45/f4d [1864848,53969] 0 2026-03-09T00:04:03.831 INFO:tasks.workunit.client.1.vm06.stdout:9/696: rename d1/d4/d6e/d14/f37 to d1/d3/d2b/d58/fe4 0 2026-03-09T00:04:03.834 INFO:tasks.workunit.client.0.vm03.stdout:4/681: dwrite d7/d20/d6a/d77/d25/f7f [0,4194304] 0 2026-03-09T00:04:03.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:03.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:03.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:03.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T00:04:03.842 INFO:tasks.workunit.client.0.vm03.stdout:0/539: dread d2/da/dd/d49/d6c/d4b/f88 [0,4194304] 0 2026-03-09T00:04:03.853 INFO:tasks.workunit.client.1.vm06.stdout:5/892: dwrite d5/d1c/d21/f73 [0,4194304] 0 2026-03-09T00:04:03.854 INFO:tasks.workunit.client.1.vm06.stdout:9/697: dread d1/d3/d4f/d52/f5e [4194304,4194304] 0 2026-03-09T00:04:03.854 INFO:tasks.workunit.client.1.vm06.stdout:9/698: chown d1/d4/d6e/d14/cbe 478608 1 2026-03-09T00:04:03.855 INFO:tasks.workunit.client.1.vm06.stdout:3/803: getdents d11/d28/d4d/d89/d90/dd2/dfd 0 2026-03-09T00:04:03.855 INFO:tasks.workunit.client.1.vm06.stdout:4/812: dwrite d17/d21/d4c/fcc [0,4194304] 0 2026-03-09T00:04:03.855 INFO:tasks.workunit.client.1.vm06.stdout:4/813: read - d17/d21/d4c/d50/ffe zero size 2026-03-09T00:04:03.855 INFO:tasks.workunit.client.1.vm06.stdout:4/814: fsync d17/d21/d32/f96 0 2026-03-09T00:04:03.869 INFO:tasks.workunit.client.0.vm03.stdout:7/534: rename d2/d1f/d42 to d2/d1f/d3a/d24/da4 0 2026-03-09T00:04:03.876 INFO:tasks.workunit.client.0.vm03.stdout:7/535: fsync d2/f3 0 2026-03-09T00:04:03.876 INFO:tasks.workunit.client.0.vm03.stdout:7/536: write d2/d1f/d3a/f5d [5109572,36980] 0 
2026-03-09T00:04:03.876 INFO:tasks.workunit.client.1.vm06.stdout:5/893: dread d5/d1c/d21/d28/d5e/d66/d78/fc1 [0,4194304] 0 2026-03-09T00:04:03.876 INFO:tasks.workunit.client.1.vm06.stdout:8/815: getdents db/d53/d70/d38/d47 0 2026-03-09T00:04:03.876 INFO:tasks.workunit.client.1.vm06.stdout:8/816: readlink db/l26 0 2026-03-09T00:04:03.876 INFO:tasks.workunit.client.1.vm06.stdout:5/894: chown d5/d1c/d68/dec/d115/d11e/d92/f4e 70999765 1 2026-03-09T00:04:03.876 INFO:tasks.workunit.client.1.vm06.stdout:5/895: fdatasync d5/d1c/d21/f96 0 2026-03-09T00:04:03.876 INFO:tasks.workunit.client.1.vm06.stdout:8/817: truncate db/d53/d6d/d7b/f8a 1014571 0 2026-03-09T00:04:03.879 INFO:tasks.workunit.client.1.vm06.stdout:4/815: dread d17/d24/f39 [0,4194304] 0 2026-03-09T00:04:03.879 INFO:tasks.workunit.client.1.vm06.stdout:4/816: truncate d17/d5b/dac/f112 434496 0 2026-03-09T00:04:03.893 INFO:tasks.workunit.client.0.vm03.stdout:3/434: creat d2/db/d40/d58/f7f x:0 0 0 2026-03-09T00:04:03.898 INFO:tasks.workunit.client.1.vm06.stdout:6/800: mkdir d4/d16/d46/df8 0 2026-03-09T00:04:03.917 INFO:tasks.workunit.client.1.vm06.stdout:5/896: truncate d5/f1d 2533160 0 2026-03-09T00:04:03.919 INFO:tasks.workunit.client.1.vm06.stdout:3/804: dread d11/d28/d2e/d7e/fdc [0,4194304] 0 2026-03-09T00:04:03.920 INFO:tasks.workunit.client.1.vm06.stdout:6/801: symlink d4/d27/d3e/d45/lf9 0 2026-03-09T00:04:03.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:03.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:03.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:03.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T00:04:03.924 INFO:tasks.workunit.client.0.vm03.stdout:0/540: symlink d2/da/d76/d8a/d8f/db8/lc7 0 2026-03-09T00:04:03.924 INFO:tasks.workunit.client.0.vm03.stdout:0/541: read d2/ff [1536810,112654] 0 2026-03-09T00:04:03.929 INFO:tasks.workunit.client.0.vm03.stdout:1/649: dwrite d4/d3a/d43/f49 [4194304,4194304] 0 2026-03-09T00:04:03.933 INFO:tasks.workunit.client.0.vm03.stdout:8/573: getdents d7/df/d1e/d3f 0 2026-03-09T00:04:03.933 INFO:tasks.workunit.client.0.vm03.stdout:8/574: fdatasync d7/df/d1e/f24 0 2026-03-09T00:04:03.935 INFO:tasks.workunit.client.0.vm03.stdout:7/537: creat d2/d4/d1e/d78/fa5 x:0 0 0 2026-03-09T00:04:03.937 INFO:tasks.workunit.client.0.vm03.stdout:3/435: creat d2/db/f80 x:0 0 0 2026-03-09T00:04:03.938 INFO:tasks.workunit.client.1.vm06.stdout:5/897: symlink d5/d1c/d68/dec/d115/d11e/l12a 0 2026-03-09T00:04:03.939 INFO:tasks.workunit.client.1.vm06.stdout:3/805: truncate d11/d28/d2e/d2f/d5b/d5f/f60 1117957 0 2026-03-09T00:04:03.949 INFO:tasks.workunit.client.0.vm03.stdout:2/553: mknod d8/d1b/d2a/d6b/cb7 0 2026-03-09T00:04:03.949 INFO:tasks.workunit.client.0.vm03.stdout:2/554: fdatasync d8/d1b/d2a/d6b/f9d 0 2026-03-09T00:04:03.949 INFO:tasks.workunit.client.0.vm03.stdout:8/575: unlink d7/df/d1a/d40/d9d/da3/fa6 0 2026-03-09T00:04:03.949 INFO:tasks.workunit.client.0.vm03.stdout:0/542: truncate d2/da/dd/d49/d6c/d4b/d55/f78 
15457 0 2026-03-09T00:04:03.956 INFO:tasks.workunit.client.0.vm03.stdout:7/538: mknod d2/d1f/d3a/d24/da4/d91/d67/d6b/ca6 0 2026-03-09T00:04:03.956 INFO:tasks.workunit.client.0.vm03.stdout:7/539: fsync d2/d1f/d3a/d24/da4/d46/d54/f77 0 2026-03-09T00:04:03.956 INFO:tasks.workunit.client.0.vm03.stdout:2/555: dread d8/d1b/f1f [0,4194304] 0 2026-03-09T00:04:03.956 INFO:tasks.workunit.client.0.vm03.stdout:2/556: stat d8/d1b/d2a/d6b/f92 0 2026-03-09T00:04:03.962 INFO:tasks.workunit.client.0.vm03.stdout:0/543: creat d2/da/fc8 x:0 0 0 2026-03-09T00:04:03.962 INFO:tasks.workunit.client.0.vm03.stdout:0/544: stat d2/da/fc8 0 2026-03-09T00:04:03.963 INFO:tasks.workunit.client.0.vm03.stdout:8/576: rename d7/c23 to d7/cb0 0 2026-03-09T00:04:03.964 INFO:tasks.workunit.client.1.vm06.stdout:9/699: dwrite d1/d3/ddc/fde [0,4194304] 0 2026-03-09T00:04:03.964 INFO:tasks.workunit.client.1.vm06.stdout:9/700: readlink d1/d4/l42 0 2026-03-09T00:04:03.964 INFO:tasks.workunit.client.1.vm06.stdout:9/701: chown d1/d3/d2b/d58/lb4 77522 1 2026-03-09T00:04:03.967 INFO:tasks.workunit.client.1.vm06.stdout:5/898: write d5/d1c/d23/d34/fb2 [520249,45010] 0 2026-03-09T00:04:03.968 INFO:tasks.workunit.client.0.vm03.stdout:1/650: write d4/d15/f6d [3813069,21216] 0 2026-03-09T00:04:03.968 INFO:tasks.workunit.client.0.vm03.stdout:7/540: rename d2/d4/d1e/d5e/d7e/l99 to d2/d4/d1e/d85/la7 0 2026-03-09T00:04:03.972 INFO:tasks.workunit.client.0.vm03.stdout:8/577: dread d7/df/d1e/d38/d91/fa5 [0,4194304] 0 2026-03-09T00:04:03.972 INFO:tasks.workunit.client.0.vm03.stdout:8/578: chown d7/laf 0 1 2026-03-09T00:04:03.972 INFO:tasks.workunit.client.1.vm06.stdout:5/899: dread d5/fe7 [0,4194304] 0 2026-03-09T00:04:03.978 INFO:tasks.workunit.client.1.vm06.stdout:1/710: dwrite d6/d4c/d71/fea [0,4194304] 0 2026-03-09T00:04:03.978 INFO:tasks.workunit.client.0.vm03.stdout:7/541: dread d2/fc [0,4194304] 0 2026-03-09T00:04:03.978 INFO:tasks.workunit.client.0.vm03.stdout:7/542: fsync d2/d1f/d3a/d24/da4/d46/d54/f9b 0 2026-03-09T00:04:03.978 INFO:tasks.workunit.client.0.vm03.stdout:7/543: creat d2/d4/d1e/fa8 x:0 0 0 2026-03-09T00:04:03.979 INFO:tasks.workunit.client.0.vm03.stdout:2/557: link d8/d26/d5e/ca7 d8/d26/d5e/cb8 0 2026-03-09T00:04:03.979 INFO:tasks.workunit.client.0.vm03.stdout:4/682: dwrite d7/d27/f52 [0,4194304] 0 2026-03-09T00:04:03.979 INFO:tasks.workunit.client.0.vm03.stdout:4/683: chown d7/l14 48968 1 2026-03-09T00:04:03.979 INFO:tasks.workunit.client.1.vm06.stdout:4/817: dwrite d17/f61 [0,4194304] 0 2026-03-09T00:04:03.983 INFO:tasks.workunit.client.0.vm03.stdout:0/545: unlink d2/da/dd/l35 0 2026-03-09T00:04:03.983 INFO:tasks.workunit.client.0.vm03.stdout:0/546: write d2/da/dd/f7b [274495,96287] 0 2026-03-09T00:04:03.983 INFO:tasks.workunit.client.1.vm06.stdout:8/818: dwrite db/dd/d24/f33 [0,4194304] 0 2026-03-09T00:04:03.998 INFO:tasks.workunit.client.0.vm03.stdout:8/579: symlink d7/df/d1a/lb1 0 2026-03-09T00:04:03.998 INFO:tasks.workunit.client.0.vm03.stdout:4/684: write d7/d20/fce [1191121,62737] 0 2026-03-09T00:04:04.016 INFO:tasks.workunit.client.1.vm06.stdout:9/702: truncate d1/d4/d6e/d14/d25/d85/f5a 86677 0 2026-03-09T00:04:04.018 INFO:tasks.workunit.client.1.vm06.stdout:2/895: dwrite d7/da/db/f98 [0,4194304] 0 2026-03-09T00:04:04.025 INFO:tasks.workunit.client.1.vm06.stdout:5/900: symlink d5/d1c/d21/d28/d102/l12b 0 2026-03-09T00:04:04.025 INFO:tasks.workunit.client.1.vm06.stdout:5/901: write d5/d1c/d21/d28/d5e/d66/d78/da6/fd4 [553547,40000] 0 2026-03-09T00:04:04.027 INFO:tasks.workunit.client.0.vm03.stdout:7/544: rename 
d2/d1f/d3a/l87 to d2/d4/d8c/la9 0 2026-03-09T00:04:04.027 INFO:tasks.workunit.client.0.vm03.stdout:7/545: chown d2/d1f/d3a/d24/c7a 18 1 2026-03-09T00:04:04.028 INFO:tasks.workunit.client.0.vm03.stdout:2/558: truncate d8/d1b/f71 3240607 0 2026-03-09T00:04:04.029 INFO:tasks.workunit.client.0.vm03.stdout:0/547: symlink d2/da/dd/d49/d6c/d4b/lc9 0 2026-03-09T00:04:04.029 INFO:tasks.workunit.client.0.vm03.stdout:0/548: write d2/da/d4e/faa [415011,6413] 0 2026-03-09T00:04:04.029 INFO:tasks.workunit.client.0.vm03.stdout:0/549: readlink d2/l62 0 2026-03-09T00:04:04.029 INFO:tasks.workunit.client.0.vm03.stdout:0/550: write d2/da/dd/f7b [1066417,46136] 0 2026-03-09T00:04:04.029 INFO:tasks.workunit.client.0.vm03.stdout:0/551: creat d2/da/fca x:0 0 0 2026-03-09T00:04:04.029 INFO:tasks.workunit.client.0.vm03.stdout:0/552: truncate d2/da/dd/d49/d6c/f9d 98877 0 2026-03-09T00:04:04.030 INFO:tasks.workunit.client.1.vm06.stdout:4/818: mknod d17/d24/c117 0 2026-03-09T00:04:04.030 INFO:tasks.workunit.client.0.vm03.stdout:4/685: symlink d7/d20/ld9 0 2026-03-09T00:04:04.030 INFO:tasks.workunit.client.0.vm03.stdout:4/686: write f4 [3865107,36719] 0 2026-03-09T00:04:04.032 INFO:tasks.workunit.client.0.vm03.stdout:7/546: mkdir d2/d1f/d3a/d24/da4/d91/daa 0 2026-03-09T00:04:04.032 INFO:tasks.workunit.client.0.vm03.stdout:7/547: chown d2/l8b 35 1 2026-03-09T00:04:04.032 INFO:tasks.workunit.client.1.vm06.stdout:9/703: mkdir d1/d3/d4f/d52/de3/de5 0 2026-03-09T00:04:04.032 INFO:tasks.workunit.client.1.vm06.stdout:9/704: fsync d1/f1c 0 2026-03-09T00:04:04.034 INFO:tasks.workunit.client.0.vm03.stdout:2/559: rename d8/l19 to d8/d26/d5e/d5f/lb9 0 2026-03-09T00:04:04.035 INFO:tasks.workunit.client.0.vm03.stdout:0/553: link d2/da/d1a/f56 d2/da/dd/d49/fcb 0 2026-03-09T00:04:04.036 INFO:tasks.workunit.client.1.vm06.stdout:2/896: unlink d7/da/d1c/f70 0 2026-03-09T00:04:04.037 INFO:tasks.workunit.client.0.vm03.stdout:4/687: link d7/d20/d6a/d77/d25/l4c d7/d20/d6a/d77/lda 0 2026-03-09T00:04:04.037 INFO:tasks.workunit.client.0.vm03.stdout:4/688: read - d7/d20/d29/d38/fd1 zero size 2026-03-09T00:04:04.043 INFO:tasks.workunit.client.0.vm03.stdout:7/548: mknod d2/d1f/d3a/d24/da4/d46/d81/d96/d8e/cab 0 2026-03-09T00:04:04.043 INFO:tasks.workunit.client.0.vm03.stdout:0/554: creat d2/da/dd/d49/d6c/d4b/d55/d6f/dad/fcc x:0 0 0 2026-03-09T00:04:04.043 INFO:tasks.workunit.client.0.vm03.stdout:2/560: write d8/f5d [2515855,20758] 0 2026-03-09T00:04:04.044 INFO:tasks.workunit.client.1.vm06.stdout:5/902: rmdir d5/d1c/d21/d28/d5e/d66/d78/dc8 39 2026-03-09T00:04:04.044 INFO:tasks.workunit.client.1.vm06.stdout:4/819: creat d17/d21/d4c/d66/de3/d10b/f118 x:0 0 0 2026-03-09T00:04:04.044 INFO:tasks.workunit.client.1.vm06.stdout:4/820: creat d17/d24/d3b/dbf/ddf/dfc/f119 x:0 0 0 2026-03-09T00:04:04.055 INFO:tasks.workunit.client.1.vm06.stdout:9/705: rename d1/d4/d6e/d14/d25/f4e to d1/d3/d50/fe6 0 2026-03-09T00:04:04.055 INFO:tasks.workunit.client.1.vm06.stdout:2/897: mknod d7/da/d63/d81/dfe/c117 0 2026-03-09T00:04:04.057 INFO:tasks.workunit.client.0.vm03.stdout:0/555: creat d2/fcd x:0 0 0 2026-03-09T00:04:04.058 INFO:tasks.workunit.client.1.vm06.stdout:1/711: dwrite d6/db0/fdc [0,4194304] 0 2026-03-09T00:04:04.066 INFO:tasks.workunit.client.0.vm03.stdout:2/561: creat d8/d1b/d24/da5/da8/fba x:0 0 0 2026-03-09T00:04:04.066 INFO:tasks.workunit.client.0.vm03.stdout:2/562: write d8/d1b/d2a/d6b/f8b [799142,69950] 0 2026-03-09T00:04:04.068 INFO:tasks.workunit.client.0.vm03.stdout:0/556: symlink d2/da/dd/d49/d6c/d4b/daf/lce 0 2026-03-09T00:04:04.068 
INFO:tasks.workunit.client.1.vm06.stdout:9/706: unlink d1/d4/d6e/fa9 0 2026-03-09T00:04:04.070 INFO:tasks.workunit.client.0.vm03.stdout:0/557: mkdir d2/da/dd/d49/d6c/da6/dcf 0 2026-03-09T00:04:04.071 INFO:tasks.workunit.client.1.vm06.stdout:1/712: rmdir d6/d21/d2d/d3b/d87/d9d/dd8 39 2026-03-09T00:04:04.072 INFO:tasks.workunit.client.1.vm06.stdout:1/713: symlink d6/d21/def/lf1 0 2026-03-09T00:04:04.073 INFO:tasks.workunit.client.1.vm06.stdout:1/714: link d6/db0/fdc d6/d21/ff2 0 2026-03-09T00:04:04.073 INFO:tasks.workunit.client.1.vm06.stdout:1/715: truncate d6/d21/da6/fe5 660168 0 2026-03-09T00:04:04.073 INFO:tasks.workunit.client.1.vm06.stdout:1/716: fdatasync d6/d21/fe7 0 2026-03-09T00:04:04.077 INFO:tasks.workunit.client.0.vm03.stdout:0/558: dread d2/da/f2d [0,4194304] 0 2026-03-09T00:04:04.081 INFO:tasks.workunit.client.0.vm03.stdout:0/559: unlink d2/da/f2d 0 2026-03-09T00:04:04.084 INFO:tasks.workunit.client.1.vm06.stdout:1/717: read d6/d21/d2d/d37/f86 [963001,18525] 0 2026-03-09T00:04:04.085 INFO:tasks.workunit.client.1.vm06.stdout:1/718: creat d6/d21/d2d/d37/ff3 x:0 0 0 2026-03-09T00:04:04.090 INFO:tasks.workunit.client.1.vm06.stdout:7/797: dwrite d0/df/d1a/d27/d4c/d40/d5b/fd7 [0,4194304] 0 2026-03-09T00:04:04.098 INFO:tasks.workunit.client.1.vm06.stdout:4/821: dread d17/d24/f3a [0,4194304] 0 2026-03-09T00:04:04.099 INFO:tasks.workunit.client.0.vm03.stdout:0/560: write d2/da/dd/d49/d6c/f57 [2756439,46064] 0 2026-03-09T00:04:04.100 INFO:tasks.workunit.client.1.vm06.stdout:8/819: dwrite db/d74/d78/fe2 [0,4194304] 0 2026-03-09T00:04:04.102 INFO:tasks.workunit.client.1.vm06.stdout:7/798: mkdir d0/df/d1a/d3f/de8 0 2026-03-09T00:04:04.156 INFO:tasks.workunit.client.1.vm06.stdout:2/898: dwrite d7/d1a/d25/d66/f84 [0,4194304] 0 2026-03-09T00:04:04.156 INFO:tasks.workunit.client.0.vm03.stdout:8/580: dwrite d7/f9c [0,4194304] 0 2026-03-09T00:04:04.156 INFO:tasks.workunit.client.0.vm03.stdout:8/581: creat d7/df/d1e/d38/d4c/fb2 x:0 0 0 2026-03-09T00:04:04.156 INFO:tasks.workunit.client.0.vm03.stdout:8/582: truncate d7/f11 1298870 0 2026-03-09T00:04:04.156 INFO:tasks.workunit.client.0.vm03.stdout:8/583: fdatasync d7/df/d1a/d40/d58/f7f 0 2026-03-09T00:04:04.170 INFO:tasks.workunit.client.0.vm03.stdout:4/689: dwrite d7/d6f/f9b [0,4194304] 0 2026-03-09T00:04:04.173 INFO:tasks.workunit.client.0.vm03.stdout:4/690: unlink d7/d20/d6a/fba 0 2026-03-09T00:04:04.200 INFO:tasks.workunit.client.0.vm03.stdout:7/549: dwrite d2/d1f/d3a/d24/da4/d46/d54/f9b [0,4194304] 0 2026-03-09T00:04:04.206 INFO:tasks.workunit.client.0.vm03.stdout:7/550: read d2/d1f/d3a/d24/da4/d46/d81/d96/d37/f56 [2573834,16814] 0 2026-03-09T00:04:04.212 INFO:tasks.workunit.client.1.vm06.stdout:5/903: dwrite d5/d1c/d21/d28/d5e/d66/d78/f7c [0,4194304] 0 2026-03-09T00:04:04.213 INFO:tasks.workunit.client.1.vm06.stdout:5/904: chown d5/d1c/d21 994943819 1 2026-03-09T00:04:04.224 INFO:tasks.workunit.client.0.vm03.stdout:2/563: dwrite d8/d26/d5e/d6f/d97/f1d [0,4194304] 0 2026-03-09T00:04:04.225 INFO:tasks.workunit.client.0.vm03.stdout:2/564: dread d8/d26/f85 [0,4194304] 0 2026-03-09T00:04:04.228 INFO:tasks.workunit.client.1.vm06.stdout:7/799: dwrite d0/df/d1a/d3a/d4e/d5e/ddc/f81 [0,4194304] 0 2026-03-09T00:04:04.229 INFO:tasks.workunit.client.0.vm03.stdout:4/691: dwrite d7/f81 [0,4194304] 0 2026-03-09T00:04:04.235 INFO:tasks.workunit.client.0.vm03.stdout:4/692: read d7/fa7 [2033356,55110] 0 2026-03-09T00:04:04.265 INFO:tasks.workunit.client.0.vm03.stdout:2/565: creat d8/d1b/d2a/fbb x:0 0 0 2026-03-09T00:04:04.286 
INFO:tasks.workunit.client.0.vm03.stdout:4/693: creat d7/d20/d29/d38/fdb x:0 0 0 2026-03-09T00:04:04.293 INFO:tasks.workunit.client.0.vm03.stdout:2/566: rmdir d8/d1b/d2a/d6b 39 2026-03-09T00:04:04.294 INFO:tasks.workunit.client.1.vm06.stdout:7/800: stat d0/df/d1a/d27/d4c/d40/d51/d90/la2 0 2026-03-09T00:04:04.294 INFO:tasks.workunit.client.1.vm06.stdout:7/801: readlink d0/d55/d85/lb7 0 2026-03-09T00:04:04.298 INFO:tasks.workunit.client.0.vm03.stdout:2/567: dread d8/f11 [0,4194304] 0 2026-03-09T00:04:04.299 INFO:tasks.workunit.client.0.vm03.stdout:2/568: fsync d8/f59 0 2026-03-09T00:04:04.299 INFO:tasks.workunit.client.0.vm03.stdout:2/569: fsync d8/d1b/f8d 0 2026-03-09T00:04:04.299 INFO:tasks.workunit.client.1.vm06.stdout:7/802: symlink d0/df/d1a/d35/le9 0 2026-03-09T00:04:04.299 INFO:tasks.workunit.client.1.vm06.stdout:7/803: chown d0/fb1 11 1 2026-03-09T00:04:04.299 INFO:tasks.workunit.client.0.vm03.stdout:2/570: creat d8/d1b/d6c/fbc x:0 0 0 2026-03-09T00:04:04.303 INFO:tasks.workunit.client.1.vm06.stdout:7/804: truncate d0/df/d1a/d27/f66 18192 0 2026-03-09T00:04:04.304 INFO:tasks.workunit.client.1.vm06.stdout:7/805: readlink d0/df/d1a/d3a/l54 0 2026-03-09T00:04:04.310 INFO:tasks.workunit.client.0.vm03.stdout:2/571: symlink d8/d1b/d2a/d6b/d50/d8a/lbd 0 2026-03-09T00:04:04.310 INFO:tasks.workunit.client.0.vm03.stdout:2/572: stat d8/d1b/d2a/d2e/c40 0 2026-03-09T00:04:04.310 INFO:tasks.workunit.client.0.vm03.stdout:2/573: fdatasync d8/d1b/d2a/d2e/fad 0 2026-03-09T00:04:04.310 INFO:tasks.workunit.client.0.vm03.stdout:2/574: chown d8/d1b/d2a/c3a 13 1 2026-03-09T00:04:04.316 INFO:tasks.workunit.client.1.vm06.stdout:7/806: link d0/df/d1a/d3a/l54 d0/df/d1a/d3a/d4e/d5e/dc8/dcd/lea 0 2026-03-09T00:04:04.316 INFO:tasks.workunit.client.1.vm06.stdout:7/807: chown d0/df/d1a/d27/d4c/d40/d5b/fd7 3447530 1 2026-03-09T00:04:04.319 INFO:tasks.workunit.client.0.vm03.stdout:9/592: sync 2026-03-09T00:04:04.324 INFO:tasks.workunit.client.0.vm03.stdout:9/593: creat d15/d1c/d21/d64/fc2 x:0 0 0 2026-03-09T00:04:04.324 INFO:tasks.workunit.client.1.vm06.stdout:7/808: write d0/df/d1a/d3a/d4e/d5e/f6f [816929,93736] 0 2026-03-09T00:04:04.331 INFO:tasks.workunit.client.0.vm03.stdout:9/594: mknod d15/d1c/cc3 0 2026-03-09T00:04:04.335 INFO:tasks.workunit.client.1.vm06.stdout:0/844: dwrite d3/d18/d1f/f26 [0,4194304] 0 2026-03-09T00:04:04.335 INFO:tasks.workunit.client.1.vm06.stdout:0/845: chown d3/d18/d2c/d2d/d74/lce 55985960 1 2026-03-09T00:04:04.335 INFO:tasks.workunit.client.1.vm06.stdout:0/846: chown d3/d18/d2c/d2d/d74/dc7/d110/cc6 1753586 1 2026-03-09T00:04:04.335 INFO:tasks.workunit.client.1.vm06.stdout:0/847: write d3/d18/d1f/d39/d3b/fcf [732136,12815] 0 2026-03-09T00:04:04.335 INFO:tasks.workunit.client.1.vm06.stdout:0/848: chown d3/d18/d2c/d2d/d31/c80 247 1 2026-03-09T00:04:04.373 INFO:tasks.workunit.client.1.vm06.stdout:3/806: write d11/d28/d57/f7b [2523067,117619] 0 2026-03-09T00:04:04.418 INFO:tasks.workunit.client.0.vm03.stdout:2/575: dwrite d8/d1b/d2a/d56/f57 [0,4194304] 0 2026-03-09T00:04:04.419 INFO:tasks.workunit.client.0.vm03.stdout:2/576: fsync d8/d26/d5e/d6f/d97/f34 0 2026-03-09T00:04:04.424 INFO:tasks.workunit.client.0.vm03.stdout:9/595: dwrite fb [0,4194304] 0 2026-03-09T00:04:04.446 INFO:tasks.workunit.client.1.vm06.stdout:7/809: dwrite d0/df/d1a/d27/d70/fc7 [0,4194304] 0 2026-03-09T00:04:04.451 INFO:tasks.workunit.client.0.vm03.stdout:6/520: sync 2026-03-09T00:04:04.451 INFO:tasks.workunit.client.0.vm03.stdout:9/596: dread d15/f17 [0,4194304] 0 2026-03-09T00:04:04.483 
INFO:tasks.workunit.client.1.vm06.stdout:0/849: dwrite d3/d18/d1f/f26 [0,4194304] 0 2026-03-09T00:04:04.559 INFO:tasks.workunit.client.0.vm03.stdout:0/561: dwrite d2/da/dd/f75 [0,4194304] 0 2026-03-09T00:04:04.559 INFO:tasks.workunit.client.0.vm03.stdout:0/562: read d2/da/d1a/f25 [562194,89718] 0 2026-03-09T00:04:04.600 INFO:tasks.workunit.client.0.vm03.stdout:0/563: dwrite d2/da/d4e/faa [0,4194304] 0 2026-03-09T00:04:04.600 INFO:tasks.workunit.client.0.vm03.stdout:0/564: write d2/da/dd/d49/d6c/d4b/d55/d6f/dad/fcc [89377,49327] 0 2026-03-09T00:04:04.607 INFO:tasks.workunit.client.0.vm03.stdout:0/565: mknod d2/da/dd/d49/d6c/da6/dcf/cd0 0 2026-03-09T00:04:04.607 INFO:tasks.workunit.client.0.vm03.stdout:0/566: readlink d2/da/dd/d49/d6c/d81/l94 0 2026-03-09T00:04:04.613 INFO:tasks.workunit.client.1.vm06.stdout:3/807: link d11/d28/d2e/d2f/l70 d11/d28/d2e/d2f/d36/l11c 0 2026-03-09T00:04:04.636 INFO:tasks.workunit.client.0.vm03.stdout:0/567: creat d2/d5a/fd1 x:0 0 0 2026-03-09T00:04:04.651 INFO:tasks.workunit.client.0.vm03.stdout:0/568: symlink d2/da/d76/d8a/d8f/db8/ld2 0 2026-03-09T00:04:04.666 INFO:tasks.workunit.client.0.vm03.stdout:6/521: mkdir d13/d35/d4c/db4 0 2026-03-09T00:04:04.666 INFO:tasks.workunit.client.0.vm03.stdout:6/522: chown d13/d35/d4c/la4 30758 1 2026-03-09T00:04:04.666 INFO:tasks.workunit.client.0.vm03.stdout:6/523: rmdir d13/d35/d4c/db4 0 2026-03-09T00:04:04.666 INFO:tasks.workunit.client.0.vm03.stdout:6/524: truncate d13/d1e/f30 437239 0 2026-03-09T00:04:04.666 INFO:tasks.workunit.client.0.vm03.stdout:6/525: readlink d13/d35/l37 0 2026-03-09T00:04:04.667 INFO:tasks.workunit.client.0.vm03.stdout:6/526: chown d13/d1e/c23 36400 1 2026-03-09T00:04:04.667 INFO:tasks.workunit.client.0.vm03.stdout:6/527: readlink d13/d1e/l56 0 2026-03-09T00:04:04.708 INFO:tasks.workunit.client.0.vm03.stdout:6/528: dwrite d13/f17 [0,4194304] 0 2026-03-09T00:04:04.755 INFO:tasks.workunit.client.1.vm06.stdout:2/899: link d7/d1a/d3c/ld5 d7/da/d4e/d57/l118 0 2026-03-09T00:04:04.764 INFO:tasks.workunit.client.1.vm06.stdout:2/900: getdents d7/da 0 2026-03-09T00:04:04.765 INFO:tasks.workunit.client.1.vm06.stdout:2/901: mkdir d7/da/d63/d81/dfe/db2/d102/d119 0 2026-03-09T00:04:04.765 INFO:tasks.workunit.client.1.vm06.stdout:2/902: readlink d7/da/d1c/l9c 0 2026-03-09T00:04:04.765 INFO:tasks.workunit.client.1.vm06.stdout:2/903: write d7/d1b/d71/d79/db4/dc1/d86/fdc [553407,54784] 0 2026-03-09T00:04:04.811 INFO:tasks.workunit.client.1.vm06.stdout:2/904: dwrite d7/d1a/d25/d66/f84 [0,4194304] 0 2026-03-09T00:04:04.812 INFO:tasks.workunit.client.1.vm06.stdout:2/905: mkdir d7/d1a/d25/d66/d87/d11a 0 2026-03-09T00:04:04.857 INFO:tasks.workunit.client.1.vm06.stdout:2/906: dwrite d7/d1a/fd3 [0,4194304] 0 2026-03-09T00:04:04.938 INFO:tasks.workunit.client.1.vm06.stdout:3/808: getdents d11/d28/d2e/d2f/d36 0 2026-03-09T00:04:04.943 INFO:tasks.workunit.client.1.vm06.stdout:3/809: read d11/d28/d2e/f32 [3864586,67936] 0 2026-03-09T00:04:04.944 INFO:tasks.workunit.client.1.vm06.stdout:3/810: rename d11/d28/d4d/d89/d90/dd2/dfd/f119 to d11/d28/d2e/d2f/d5b/ddb/f11d 0 2026-03-09T00:04:04.955 INFO:tasks.workunit.client.1.vm06.stdout:3/811: creat d11/d28/d4d/d89/d90/dd2/dfd/f11e x:0 0 0 2026-03-09T00:04:04.968 INFO:tasks.workunit.client.0.vm03.stdout:6/529: mkdir d13/d35/db5 0 2026-03-09T00:04:04.968 INFO:tasks.workunit.client.1.vm06.stdout:2/907: mknod d7/d1a/d25/c11b 0 2026-03-09T00:04:04.969 INFO:tasks.workunit.client.1.vm06.stdout:1/719: sync 2026-03-09T00:04:04.969 INFO:tasks.workunit.client.1.vm06.stdout:6/802: sync 
2026-03-09T00:04:04.969 INFO:tasks.workunit.client.1.vm06.stdout:9/707: sync 2026-03-09T00:04:04.969 INFO:tasks.workunit.client.1.vm06.stdout:5/905: sync 2026-03-09T00:04:04.969 INFO:tasks.workunit.client.1.vm06.stdout:9/708: chown d1/d4 98209 1 2026-03-09T00:04:04.969 INFO:tasks.workunit.client.1.vm06.stdout:4/822: sync 2026-03-09T00:04:04.969 INFO:tasks.workunit.client.1.vm06.stdout:4/823: chown d17/d5b/d8f/c95 20279 1 2026-03-09T00:04:04.969 INFO:tasks.workunit.client.1.vm06.stdout:4/824: creat d17/d24/d3b/d5e/d7a/f11a x:0 0 0 2026-03-09T00:04:04.969 INFO:tasks.workunit.client.1.vm06.stdout:4/825: chown d17/d24/ca0 51 1 2026-03-09T00:04:04.969 INFO:tasks.workunit.client.1.vm06.stdout:4/826: readlink d17/d24/l67 0 2026-03-09T00:04:04.969 INFO:tasks.workunit.client.1.vm06.stdout:8/820: sync 2026-03-09T00:04:04.977 INFO:tasks.workunit.client.1.vm06.stdout:2/908: mkdir d7/d1a/d56/d11c 0 2026-03-09T00:04:04.977 INFO:tasks.workunit.client.1.vm06.stdout:1/720: mknod d6/d21/cf4 0 2026-03-09T00:04:04.978 INFO:tasks.workunit.client.1.vm06.stdout:4/827: creat d17/d21/f11b x:0 0 0 2026-03-09T00:04:04.978 INFO:tasks.workunit.client.1.vm06.stdout:4/828: chown d17/cdc 1936044 1 2026-03-09T00:04:04.979 INFO:tasks.workunit.client.1.vm06.stdout:2/909: mknod d7/da/c11d 0 2026-03-09T00:04:04.979 INFO:tasks.workunit.client.1.vm06.stdout:8/821: creat db/d53/d70/d38/d4d/d79/f107 x:0 0 0 2026-03-09T00:04:04.983 INFO:tasks.workunit.client.1.vm06.stdout:8/822: write db/d53/d70/d38/f5b [2874090,64694] 0 2026-03-09T00:04:04.984 INFO:tasks.workunit.client.0.vm03.stdout:8/584: rename d7/d92 to d7/df/d1a/d40/db3 0 2026-03-09T00:04:04.986 INFO:tasks.workunit.client.1.vm06.stdout:8/823: creat db/d1e/d9b/df5/f108 x:0 0 0 2026-03-09T00:04:04.986 INFO:tasks.workunit.client.1.vm06.stdout:8/824: write db/d1e/ff2 [395235,70677] 0 2026-03-09T00:04:04.991 INFO:tasks.workunit.client.0.vm03.stdout:4/694: rename d7/d20/d29/d38/d3a to d7/d20/d29/d38/da9/ddc 0 2026-03-09T00:04:04.991 INFO:tasks.workunit.client.0.vm03.stdout:4/695: rmdir d7/d20/d6a/d77/db7 39 2026-03-09T00:04:04.992 INFO:tasks.workunit.client.1.vm06.stdout:8/825: getdents db/dd/d48 0 2026-03-09T00:04:05.002 INFO:tasks.workunit.client.0.vm03.stdout:4/696: symlink d7/d20/d29/d78/ldd 0 2026-03-09T00:04:05.002 INFO:tasks.workunit.client.0.vm03.stdout:4/697: fdatasync d7/d20/d29/fa0 0 2026-03-09T00:04:05.002 INFO:tasks.workunit.client.1.vm06.stdout:8/826: symlink db/d1e/l109 0 2026-03-09T00:04:05.002 INFO:tasks.workunit.client.1.vm06.stdout:8/827: chown db/d1e/d46/fe1 9 1 2026-03-09T00:04:05.002 INFO:tasks.workunit.client.1.vm06.stdout:8/828: chown db/d53/d70/d38/c90 29 1 2026-03-09T00:04:05.002 INFO:tasks.workunit.client.1.vm06.stdout:8/829: write db/dd/d24/dac/fe6 [10475,36753] 0 2026-03-09T00:04:05.006 INFO:tasks.workunit.client.0.vm03.stdout:4/698: write d7/d20/d29/d54/d58/f6b [211189,119] 0 2026-03-09T00:04:05.007 INFO:tasks.workunit.client.0.vm03.stdout:6/530: write d13/f17 [2674592,1562] 0 2026-03-09T00:04:05.027 INFO:tasks.workunit.client.0.vm03.stdout:9/597: rename d15/d1c/d21/d54/d87/d93/d74 to d15/d1c/d36/d4d/dc4 0 2026-03-09T00:04:05.028 INFO:tasks.workunit.client.0.vm03.stdout:4/699: mkdir d7/d20/d6a/dde 0 2026-03-09T00:04:05.028 INFO:tasks.workunit.client.0.vm03.stdout:4/700: dread - d7/d27/fc5 zero size 2026-03-09T00:04:05.029 INFO:tasks.workunit.client.0.vm03.stdout:4/701: rmdir d7/d20/d29/d38/da9 39 2026-03-09T00:04:05.036 INFO:tasks.workunit.client.1.vm06.stdout:4/829: read d17/d21/d4c/d50/f69 [3316366,9509] 0 2026-03-09T00:04:05.038 
INFO:tasks.workunit.client.1.vm06.stdout:4/830: creat d17/d24/d3b/d5e/d7a/f11c x:0 0 0 2026-03-09T00:04:05.038 INFO:tasks.workunit.client.1.vm06.stdout:4/831: readlink d17/d21/d4c/d50/l53 0 2026-03-09T00:04:05.043 INFO:tasks.workunit.client.1.vm06.stdout:4/832: symlink d17/d24/d3b/d97/db7/l11d 0 2026-03-09T00:04:05.066 INFO:tasks.workunit.client.1.vm06.stdout:5/906: link d5/d1c/d68/dec/d115/d11e/c80 d5/c12c 0 2026-03-09T00:04:05.066 INFO:tasks.workunit.client.1.vm06.stdout:5/907: dread - d5/d1c/d21/d28/d5e/d66/d78/dc8/f90 zero size 2026-03-09T00:04:05.070 INFO:tasks.workunit.client.0.vm03.stdout:7/551: link d2/d4/c17 d2/d4/cac 0 2026-03-09T00:04:05.072 INFO:tasks.workunit.client.0.vm03.stdout:6/531: dwrite d13/d1e/f9f [0,4194304] 0 2026-03-09T00:04:05.072 INFO:tasks.workunit.client.1.vm06.stdout:3/812: rename d11/d28/d2e/d2f/d5b/d5f/la5 to d11/d28/d2e/d2f/d5b/d5f/db1/l11f 0 2026-03-09T00:04:05.072 INFO:tasks.workunit.client.1.vm06.stdout:3/813: stat d11/d28/d2e/d2f/d5b 0 2026-03-09T00:04:05.074 INFO:tasks.workunit.client.1.vm06.stdout:9/709: dwrite d1/d4/d6e/d14/d25/d85/f28 [4194304,4194304] 0 2026-03-09T00:04:05.075 INFO:tasks.workunit.client.0.vm03.stdout:7/552: getdents d2/d1f 0 2026-03-09T00:04:05.075 INFO:tasks.workunit.client.0.vm03.stdout:7/553: fdatasync d2/f4d 0 2026-03-09T00:04:05.077 INFO:tasks.workunit.client.1.vm06.stdout:3/814: read d11/d28/f3a [534020,15525] 0 2026-03-09T00:04:05.087 INFO:tasks.workunit.client.1.vm06.stdout:6/803: rename d4/d27/d3e/d78/fbc to d4/d16/d53/df2/ffa 0 2026-03-09T00:04:05.087 INFO:tasks.workunit.client.1.vm06.stdout:6/804: chown d4/d16/d46/d90 20291673 1 2026-03-09T00:04:05.087 INFO:tasks.workunit.client.1.vm06.stdout:6/805: chown d4/d16/d53/ddf/d52/l88 453707 1 2026-03-09T00:04:05.087 INFO:tasks.workunit.client.1.vm06.stdout:6/806: write d4/d16/d53/ddf/dc8/fd2 [786091,80504] 0 2026-03-09T00:04:05.088 INFO:tasks.workunit.client.1.vm06.stdout:6/807: write d4/d16/d53/f5f [132199,82296] 0 2026-03-09T00:04:05.090 INFO:tasks.workunit.client.1.vm06.stdout:9/710: creat d1/d4/d6e/d14/fe7 x:0 0 0 2026-03-09T00:04:05.090 INFO:tasks.workunit.client.1.vm06.stdout:9/711: dread - d1/d3/d50/fba zero size 2026-03-09T00:04:05.090 INFO:tasks.workunit.client.1.vm06.stdout:9/712: chown d1/d3/d2b/d58/fe4 245 1 2026-03-09T00:04:05.090 INFO:tasks.workunit.client.1.vm06.stdout:3/815: stat d11/d28/d4d/d89/cae 0 2026-03-09T00:04:05.091 INFO:tasks.workunit.client.0.vm03.stdout:7/554: unlink d2/d1f/d3a/d24/da4/f59 0 2026-03-09T00:04:05.095 INFO:tasks.workunit.client.1.vm06.stdout:1/721: rename d6/d21/d2d/d37/ff3 to d6/d4c/d79/ff5 0 2026-03-09T00:04:05.095 INFO:tasks.workunit.client.0.vm03.stdout:7/555: truncate d2/d1f/d3a/d24/da4/d91/f72 4042623 0 2026-03-09T00:04:05.097 INFO:tasks.workunit.client.1.vm06.stdout:3/816: symlink d11/d28/d2e/d2f/d36/d8f/l120 0 2026-03-09T00:04:05.100 INFO:tasks.workunit.client.0.vm03.stdout:7/556: stat d2/d4/c17 0 2026-03-09T00:04:05.100 INFO:tasks.workunit.client.1.vm06.stdout:2/910: rename d7/d1b/d71/d79/db4/dc1/d86/fdc to d7/d1a/d39/df1/f11e 0 2026-03-09T00:04:05.101 INFO:tasks.workunit.client.1.vm06.stdout:3/817: getdents d11/d28/d2e/d7e/d83/d87 0 2026-03-09T00:04:05.101 INFO:tasks.workunit.client.1.vm06.stdout:3/818: chown d11/d117 1 1 2026-03-09T00:04:05.102 INFO:tasks.workunit.client.1.vm06.stdout:1/722: read d6/db0/fdc [3399629,96811] 0 2026-03-09T00:04:05.102 INFO:tasks.workunit.client.1.vm06.stdout:1/723: chown d6/d21/f7b 736061241 1 2026-03-09T00:04:05.102 INFO:tasks.workunit.client.1.vm06.stdout:1/724: truncate d6/d4c/d71/d83/f9b 
4796419 0 2026-03-09T00:04:05.104 INFO:tasks.workunit.client.1.vm06.stdout:3/819: write d11/d28/d2e/d2f/d5b/d5f/d91/ffa [508187,75738] 0 2026-03-09T00:04:05.104 INFO:tasks.workunit.client.1.vm06.stdout:3/820: write d11/d28/d2e/d2f/f3e [2173480,87510] 0 2026-03-09T00:04:05.104 INFO:tasks.workunit.client.1.vm06.stdout:3/821: write d11/d28/d2e/d7e/d83/d87/ff9 [339360,58132] 0 2026-03-09T00:04:05.104 INFO:tasks.workunit.client.1.vm06.stdout:3/822: write d11/d28/d2e/db2/f101 [925751,99502] 0 2026-03-09T00:04:05.105 INFO:tasks.workunit.client.1.vm06.stdout:1/725: truncate d6/d21/fc1 179834 0 2026-03-09T00:04:05.108 INFO:tasks.workunit.client.1.vm06.stdout:1/726: creat d6/d21/d2d/d37/d6d/dd7/ff6 x:0 0 0 2026-03-09T00:04:05.108 INFO:tasks.workunit.client.1.vm06.stdout:1/727: write d6/d21/ff2 [16316,74697] 0 2026-03-09T00:04:05.108 INFO:tasks.workunit.client.1.vm06.stdout:1/728: write d6/d21/f55 [1221879,117484] 0 2026-03-09T00:04:05.108 INFO:tasks.workunit.client.1.vm06.stdout:1/729: dread - d6/d21/d2d/d3b/d87/d9d/dd8/fe6 zero size 2026-03-09T00:04:05.113 INFO:tasks.workunit.client.1.vm06.stdout:1/730: dread d6/d21/f3d [0,4194304] 0 2026-03-09T00:04:05.113 INFO:tasks.workunit.client.0.vm03.stdout:4/702: dwrite d7/d27/dc9/fd2 [4194304,4194304] 0 2026-03-09T00:04:05.123 INFO:tasks.workunit.client.0.vm03.stdout:4/703: mknod d7/d20/d6a/d77/d25/cdf 0 2026-03-09T00:04:05.123 INFO:tasks.workunit.client.0.vm03.stdout:4/704: dread - d7/d20/d29/d38/fca zero size 2026-03-09T00:04:05.125 INFO:tasks.workunit.client.1.vm06.stdout:1/731: dread d6/d4c/fbe [0,4194304] 0 2026-03-09T00:04:05.129 INFO:tasks.workunit.client.1.vm06.stdout:1/732: dread d6/d21/d2d/d37/f86 [0,4194304] 0 2026-03-09T00:04:05.130 INFO:tasks.workunit.client.0.vm03.stdout:4/705: creat d7/d27/dc4/fe0 x:0 0 0 2026-03-09T00:04:05.137 INFO:tasks.workunit.client.0.vm03.stdout:4/706: rename d7/d27/dc4/fe0 to d7/d20/d29/d4e/fe1 0 2026-03-09T00:04:05.137 INFO:tasks.workunit.client.0.vm03.stdout:4/707: truncate d7/d20/d29/d4e/f74 1771365 0 2026-03-09T00:04:05.137 INFO:tasks.workunit.client.0.vm03.stdout:4/708: dread - d7/d20/d29/d4e/f4f zero size 2026-03-09T00:04:05.142 INFO:tasks.workunit.client.1.vm06.stdout:1/733: symlink d6/d21/d2d/lf7 0 2026-03-09T00:04:05.142 INFO:tasks.workunit.client.0.vm03.stdout:4/709: write d7/d27/f89 [1321082,101637] 0 2026-03-09T00:04:05.144 INFO:tasks.workunit.client.0.vm03.stdout:0/569: creat d2/fd3 x:0 0 0 2026-03-09T00:04:05.158 INFO:tasks.workunit.client.1.vm06.stdout:1/734: creat d6/d21/d2d/d3b/d87/ff8 x:0 0 0 2026-03-09T00:04:05.158 INFO:tasks.workunit.client.1.vm06.stdout:1/735: chown d6/db0/fdc 54157 1 2026-03-09T00:04:05.158 INFO:tasks.workunit.client.1.vm06.stdout:1/736: write d6/d4c/d79/fee [804500,18753] 0 2026-03-09T00:04:05.160 INFO:tasks.workunit.client.1.vm06.stdout:1/737: mknod d6/d21/d2d/d3b/d87/d9d/cf9 0 2026-03-09T00:04:05.160 INFO:tasks.workunit.client.1.vm06.stdout:1/738: write d6/d4c/d71/fbf [910188,27778] 0 2026-03-09T00:04:05.164 INFO:tasks.workunit.client.1.vm06.stdout:1/739: dread d6/d63/f82 [0,4194304] 0 2026-03-09T00:04:05.164 INFO:tasks.workunit.client.1.vm06.stdout:1/740: readlink d6/l3f 0 2026-03-09T00:04:05.167 INFO:tasks.workunit.client.1.vm06.stdout:1/741: rmdir d6/d4c/de0 0 2026-03-09T00:04:05.201 INFO:tasks.workunit.client.1.vm06.stdout:6/808: dwrite d4/f12 [0,4194304] 0 2026-03-09T00:04:05.202 INFO:tasks.workunit.client.1.vm06.stdout:9/713: dwrite d1/f78 [0,4194304] 0 2026-03-09T00:04:05.202 INFO:tasks.workunit.client.1.vm06.stdout:9/714: chown d1/d4/d6e/d14/f1a 3792234 1 
2026-03-09T00:04:05.202 INFO:tasks.workunit.client.1.vm06.stdout:9/715: fdatasync d1/d3/d4f/d91/d94/f95 0
2026-03-09T00:04:05.205 INFO:tasks.workunit.client.1.vm06.stdout:9/716: dread d1/d4/d6e/d14/d25/d85/d49/f56 [0,4194304] 0
2026-03-09T00:04:05.211 INFO:tasks.workunit.client.1.vm06.stdout:0/850: link d3/d18/d2c/d2d/d74/da8/d109/l111 d3/d18/d2c/l120 0
2026-03-09T00:04:05.212 INFO:tasks.workunit.client.0.vm03.stdout:0/570: rename d2/da/dd/d49/d6c/f5c to d2/da/dd/d49/d6c/d4b/d55/d6f/fd4 0
2026-03-09T00:04:05.218 INFO:tasks.workunit.client.0.vm03.stdout:0/571: read d2/da/dd/f7b [739866,64074] 0
2026-03-09T00:04:05.222 INFO:tasks.workunit.client.0.vm03.stdout:4/710: dwrite d7/d20/f21 [0,4194304] 0
2026-03-09T00:04:05.233 INFO:tasks.workunit.client.1.vm06.stdout:9/717: mkdir d1/d3/d4f/d91/de8 0
2026-03-09T00:04:05.235 INFO:tasks.workunit.client.1.vm06.stdout:0/851: creat d3/d10f/f121 x:0 0 0
2026-03-09T00:04:05.236 INFO:tasks.workunit.client.1.vm06.stdout:9/718: mkdir d1/d3/d4f/d91/dae/de9 0
2026-03-09T00:04:05.237 INFO:tasks.workunit.client.1.vm06.stdout:7/810: symlink d0/leb 0
2026-03-09T00:04:05.237 INFO:tasks.workunit.client.0.vm03.stdout:9/598: rmdir d15/d1c 39
2026-03-09T00:04:05.238 INFO:tasks.workunit.client.1.vm06.stdout:0/852: mkdir d3/d18/d2c/d2d/d74/dc7/d122 0
2026-03-09T00:04:05.238 INFO:tasks.workunit.client.1.vm06.stdout:0/853: chown d3/d10f/lf3 165989 1
2026-03-09T00:04:05.244 INFO:tasks.workunit.client.1.vm06.stdout:6/809: dwrite d4/d27/d3e/f55 [0,4194304] 0
2026-03-09T00:04:05.244 INFO:tasks.workunit.client.1.vm06.stdout:6/810: creat d4/d16/d53/ddf/da6/dbb/ffb x:0 0 0
2026-03-09T00:04:05.244 INFO:tasks.workunit.client.1.vm06.stdout:6/811: write d4/d16/fcf [73862,26817] 0
2026-03-09T00:04:05.245 INFO:tasks.workunit.client.1.vm06.stdout:9/719: link d1/d4/ff d1/da7/fea 0
2026-03-09T00:04:05.248 INFO:tasks.workunit.client.1.vm06.stdout:6/812: dread d4/d16/f33 [0,4194304] 0
2026-03-09T00:04:05.248 INFO:tasks.workunit.client.1.vm06.stdout:6/813: dread - d4/d27/fef zero size
2026-03-09T00:04:05.260 INFO:tasks.workunit.client.1.vm06.stdout:6/814: dread d4/d16/d53/d67/f8f [0,4194304] 0
2026-03-09T00:04:05.262 INFO:tasks.workunit.client.0.vm03.stdout:0/572: rmdir d2/d5a 39
2026-03-09T00:04:05.280 INFO:tasks.workunit.client.0.vm03.stdout:3/436: sync
2026-03-09T00:04:05.280 INFO:tasks.workunit.client.0.vm03.stdout:5/574: sync
2026-03-09T00:04:05.280 INFO:tasks.workunit.client.0.vm03.stdout:1/651: sync
2026-03-09T00:04:05.283 INFO:tasks.workunit.client.1.vm06.stdout:7/811: mkdir d0/df/d1a/dec 0
2026-03-09T00:04:05.283 INFO:tasks.workunit.client.1.vm06.stdout:7/812: write d0/df/d1a/d35/f94 [19164,45620] 0
2026-03-09T00:04:05.283 INFO:tasks.workunit.client.1.vm06.stdout:7/813: fsync d0/df/d1a/d22/f9e 0
2026-03-09T00:04:05.283 INFO:tasks.workunit.client.1.vm06.stdout:7/814: read d0/df/d1a/d22/f2c [283105,27827] 0
2026-03-09T00:04:05.289 INFO:tasks.workunit.client.0.vm03.stdout:9/599: link d15/d1c/d21/d54/l5a d15/d1c/d9c/lc5 0
2026-03-09T00:04:05.294 INFO:tasks.workunit.client.1.vm06.stdout:8/830: rename db/d1e/d9b to db/d74/d87/d100/d10a 0
2026-03-09T00:04:05.294 INFO:tasks.workunit.client.1.vm06.stdout:8/831: write db/dd/d24/da7/fc2 [27584,46748] 0
2026-03-09T00:04:05.294 INFO:tasks.workunit.client.1.vm06.stdout:8/832: write db/d1e/f4f [1914581,119461] 0
2026-03-09T00:04:05.297 INFO:tasks.workunit.client.0.vm03.stdout:0/573: dwrite d2/da/dd/d49/d6c/f52 [0,4194304] 0
2026-03-09T00:04:05.297 INFO:tasks.workunit.client.0.vm03.stdout:0/574: fdatasync d2/ff 0
2026-03-09T00:04:05.298 INFO:tasks.workunit.client.0.vm03.stdout:0/575: dread d2/f7f [0,4194304] 0
2026-03-09T00:04:05.298 INFO:tasks.workunit.client.0.vm03.stdout:0/576: fsync d2/da/dd/f75 0
2026-03-09T00:04:05.304 INFO:tasks.workunit.client.0.vm03.stdout:0/577: dread d2/f22 [0,4194304] 0
2026-03-09T00:04:05.307 INFO:tasks.workunit.client.1.vm06.stdout:0/854: getdents d3/d18/d2c/d2d/d74/daf 0
2026-03-09T00:04:05.316 INFO:tasks.workunit.client.0.vm03.stdout:4/711: unlink d7/d20/d6a/d77/db7/f91 0
2026-03-09T00:04:05.334 INFO:tasks.workunit.client.0.vm03.stdout:1/652: mkdir d4/d3a/d61/d78/d81/d9e/de1 0
2026-03-09T00:04:05.334 INFO:tasks.workunit.client.0.vm03.stdout:1/653: chown d4/d15/f8a 0 1
2026-03-09T00:04:05.344 INFO:tasks.workunit.client.0.vm03.stdout:3/437: rename d2/db/d40/d51/c5b to d2/db/d56/c81 0
2026-03-09T00:04:05.357 INFO:tasks.workunit.client.0.vm03.stdout:0/578: creat d2/da/d1a/fd5 x:0 0 0
2026-03-09T00:04:05.359 INFO:tasks.workunit.client.1.vm06.stdout:9/720: read d1/d4/d6e/d14/d25/d85/f5a [68600,63398] 0
2026-03-09T00:04:05.360 INFO:tasks.workunit.client.1.vm06.stdout:5/908: write d5/d1c/d68/dec/d115/d11e/d92/d49/f83 [1701261,48104] 0
2026-03-09T00:04:05.360 INFO:tasks.workunit.client.1.vm06.stdout:5/909: readlink d5/l1e 0
2026-03-09T00:04:05.361 INFO:tasks.workunit.client.0.vm03.stdout:1/654: mkdir d4/de2 0
2026-03-09T00:04:05.366 INFO:tasks.workunit.client.0.vm03.stdout:2/577: sync
2026-03-09T00:04:05.370 INFO:tasks.workunit.client.1.vm06.stdout:6/815: symlink d4/d27/d3e/d57/lfc 0
2026-03-09T00:04:05.370 INFO:tasks.workunit.client.1.vm06.stdout:6/816: chown d4/c20 2261337 1
2026-03-09T00:04:05.376 INFO:tasks.workunit.client.0.vm03.stdout:3/438: rename d2/db/c66 to d2/db/d3b/d5d/c82 0
2026-03-09T00:04:05.376 INFO:tasks.workunit.client.0.vm03.stdout:3/439: fsync d2/db/d40/f78 0
2026-03-09T00:04:05.379 INFO:tasks.workunit.client.1.vm06.stdout:7/815: truncate d0/df/d1a/d27/d4c/d40/fa5 1081257 0
2026-03-09T00:04:05.379 INFO:tasks.workunit.client.1.vm06.stdout:7/816: fdatasync d0/df/d1a/d27/d4c/d40/d51/d90/dae/fc9 0
2026-03-09T00:04:05.382 INFO:tasks.workunit.client.0.vm03.stdout:0/579: symlink d2/da/dd/d49/d6c/da6/dcf/ld6 0
2026-03-09T00:04:05.388 INFO:tasks.workunit.client.1.vm06.stdout:2/911: rename d7/da/db/f98 to d7/da/d93/f11f 0
2026-03-09T00:04:05.389 INFO:tasks.workunit.client.0.vm03.stdout:8/585: dwrite d7/df/f55 [0,4194304] 0
2026-03-09T00:04:05.402 INFO:tasks.workunit.client.0.vm03.stdout:1/655: symlink d4/d15/d1a/le3 0
2026-03-09T00:04:05.411 INFO:tasks.workunit.client.1.vm06.stdout:8/833: mknod db/d74/d78/d98/db6/dc7/c10b 0
2026-03-09T00:04:05.418 INFO:tasks.workunit.client.0.vm03.stdout:3/440: creat d2/db/d6a/f83 x:0 0 0
2026-03-09T00:04:05.418 INFO:tasks.workunit.client.0.vm03.stdout:3/441: chown d2/db/d3b/d3f/c49 182817 1
2026-03-09T00:04:05.424 INFO:tasks.workunit.client.1.vm06.stdout:9/721: mknod d1/d3/d50/ceb 0
2026-03-09T00:04:05.424 INFO:tasks.workunit.client.1.vm06.stdout:9/722: chown d1/d4/d6e/d14/lb0 117992295 1
2026-03-09T00:04:05.427 INFO:tasks.workunit.client.1.vm06.stdout:5/910: creat d5/d1c/d21/d28/d5e/d66/d78/dc8/daa/f12d x:0 0 0
2026-03-09T00:04:05.427 INFO:tasks.workunit.client.0.vm03.stdout:5/575: rename d1c/d51/l6c to d1c/d20/d55/d66/d6b/d8f/lb9 0
2026-03-09T00:04:05.427 INFO:tasks.workunit.client.0.vm03.stdout:5/576: truncate d1c/d20/d56/db4/fb7 212110 0
2026-03-09T00:04:05.433 INFO:tasks.workunit.client.0.vm03.stdout:8/586: unlink d7/df/d1a/c5b 0
2026-03-09T00:04:05.448 INFO:tasks.workunit.client.0.vm03.stdout:2/578: getdents d8/d26/d5e/d6f/d97 0
2026-03-09T00:04:05.449 INFO:tasks.workunit.client.0.vm03.stdout:5/577: mknod d1c/d20/d55/d66/d6b/cba 0
2026-03-09T00:04:05.449 INFO:tasks.workunit.client.0.vm03.stdout:5/578: fsync d1c/d20/d55/d4f/d58/fa0 0
2026-03-09T00:04:05.449 INFO:tasks.workunit.client.1.vm06.stdout:7/817: symlink d0/df/d1a/d27/d70/led 0
2026-03-09T00:04:05.449 INFO:tasks.workunit.client.1.vm06.stdout:3/823: rename d11/d28/d2e/d2f/f49 to d11/d28/d2e/dff/f121 0
2026-03-09T00:04:05.449 INFO:tasks.workunit.client.1.vm06.stdout:8/834: creat db/d74/d87/d100/d10a/f10c x:0 0 0
2026-03-09T00:04:05.451 INFO:tasks.workunit.client.1.vm06.stdout:0/855: dwrite d3/d18/d2c/d2d/d74/dc7/d110/f86 [0,4194304] 0
2026-03-09T00:04:05.453 INFO:tasks.workunit.client.0.vm03.stdout:2/579: dread d8/d26/d5e/d6f/d97/f27 [0,4194304] 0
2026-03-09T00:04:05.455 INFO:tasks.workunit.client.0.vm03.stdout:8/587: write d7/df/d1a/d40/f76 [1628860,72885] 0
2026-03-09T00:04:05.455 INFO:tasks.workunit.client.0.vm03.stdout:4/712: dwrite d7/fa7 [0,4194304] 0
2026-03-09T00:04:05.456 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:05 vm03.local ceph-mon[52346]: [09/Mar/2026:00:04:03] ENGINE Bus STARTING
2026-03-09T00:04:05.456 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:05 vm03.local ceph-mon[52346]: [09/Mar/2026:00:04:03] ENGINE Serving on http://192.168.123.103:8765
2026-03-09T00:04:05.456 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:05 vm03.local ceph-mon[52346]: [09/Mar/2026:00:04:04] ENGINE Serving on https://192.168.123.103:7150
2026-03-09T00:04:05.456 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:05 vm03.local ceph-mon[52346]: [09/Mar/2026:00:04:04] ENGINE Bus STARTED
2026-03-09T00:04:05.456 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:05 vm03.local ceph-mon[52346]: [09/Mar/2026:00:04:04] ENGINE Client ('192.168.123.103', 54524) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
2026-03-09T00:04:05.457 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:05 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:05.457 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:05 vm03.local ceph-mon[52346]: mgrmap e30: vm03.yvcons(active, since 4s), standbys: vm06.rzcvhn
2026-03-09T00:04:05.457 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:05 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:05.457 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:05 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:05.457 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:05 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:05.486 INFO:tasks.workunit.client.0.vm03.stdout:3/442: rename d2/db/l22 to d2/db/d56/l84 0
2026-03-09T00:04:05.488 INFO:tasks.workunit.client.0.vm03.stdout:3/443: dread d2/db/d3b/d5d/f60 [0,4194304] 0
2026-03-09T00:04:05.488 INFO:tasks.workunit.client.0.vm03.stdout:3/444: fsync d2/db/d3b/d3f/f7c 0
2026-03-09T00:04:05.490 INFO:tasks.workunit.client.0.vm03.stdout:0/580: dwrite d2/da/dd/f11 [4194304,4194304] 0
2026-03-09T00:04:05.490 INFO:tasks.workunit.client.1.vm06.stdout:6/817: dwrite d4/f12 [8388608,4194304] 0
2026-03-09T00:04:05.490 INFO:tasks.workunit.client.1.vm06.stdout:6/818: chown d4/d27/d3e/d45/f4d 4 1
2026-03-09T00:04:05.495 INFO:tasks.workunit.client.1.vm06.stdout:9/723: link d1/d3/d4f/d52/f5e d1/d4/d6e/d14/d25/d85/d49/fec 0
2026-03-09T00:04:05.508 INFO:tasks.workunit.client.0.vm03.stdout:8/588: creat d7/df/d1e/d3f/d95/fb4 x:0 0 0
2026-03-09T00:04:05.508 INFO:tasks.workunit.client.1.vm06.stdout:5/911: mkdir d5/d1c/d68/dec/d115/d11e/d92/d95/d12e 0
2026-03-09T00:04:05.508 INFO:tasks.workunit.client.1.vm06.stdout:7/818: symlink d0/df/d1a/d35/lee 0
2026-03-09T00:04:05.508 INFO:tasks.workunit.client.1.vm06.stdout:1/742: rmdir d6/d4c 39
2026-03-09T00:04:05.508 INFO:tasks.workunit.client.1.vm06.stdout:8/835: unlink db/d53/d6d/fa2 0
2026-03-09T00:04:05.511 INFO:tasks.workunit.client.1.vm06.stdout:1/743: dread d6/d63/f99 [0,4194304] 0
2026-03-09T00:04:05.520 INFO:tasks.workunit.client.1.vm06.stdout:2/912: dwrite d7/d1b/da5/dca/fe9 [0,4194304] 0
2026-03-09T00:04:05.520 INFO:tasks.workunit.client.0.vm03.stdout:5/579: dwrite d1c/d20/f4e [4194304,4194304] 0
2026-03-09T00:04:05.521 INFO:tasks.workunit.client.1.vm06.stdout:2/913: write d7/d1a/d25/d66/d87/fc3 [44449,101744] 0
2026-03-09T00:04:05.521 INFO:tasks.workunit.client.1.vm06.stdout:2/914: chown d7/da/d4e/d57 0 1
2026-03-09T00:04:05.523 INFO:tasks.workunit.client.0.vm03.stdout:4/713: mkdir d7/d20/d6a/d77/d25/de2 0
2026-03-09T00:04:05.524 INFO:tasks.workunit.client.1.vm06.stdout:3/824: dwrite d11/d28/d4d/d89/d90/dd2/fef [4194304,4194304] 0
2026-03-09T00:04:05.544 INFO:tasks.workunit.client.1.vm06.stdout:0/856: rename d3/d18/d2c/d2d/d74/fbc to d3/d18/d1f/d119/f123 0
2026-03-09T00:04:05.544 INFO:tasks.workunit.client.1.vm06.stdout:0/857: stat d3/d18/d2c/d2d/d74/daf/lf4 0
2026-03-09T00:04:05.550 INFO:tasks.workunit.client.0.vm03.stdout:2/580: dwrite d8/d26/d5e/d6f/d97/f1c [0,4194304] 0
2026-03-09T00:04:05.567 INFO:tasks.workunit.client.0.vm03.stdout:3/445: unlink d2/db/d3b/f4f 0
2026-03-09T00:04:05.569 INFO:tasks.workunit.client.1.vm06.stdout:6/819: stat d4/d16/d53/ddf/ce7 0
2026-03-09T00:04:05.574 INFO:tasks.workunit.client.0.vm03.stdout:0/581: mknod d2/da/d76/d8a/cd7 0
2026-03-09T00:04:05.574 INFO:tasks.workunit.client.0.vm03.stdout:0/582: creat d2/da/d1a/fd8 x:0 0 0
2026-03-09T00:04:05.575 INFO:tasks.workunit.client.1.vm06.stdout:9/724: truncate d1/d4/f24 2146836 0
2026-03-09T00:04:05.583 INFO:tasks.workunit.client.0.vm03.stdout:8/589: creat d7/df/d1a/d40/fb5 x:0 0 0
2026-03-09T00:04:05.596 INFO:tasks.workunit.client.0.vm03.stdout:8/590: stat d7/df/f31 0
2026-03-09T00:04:05.596 INFO:tasks.workunit.client.0.vm03.stdout:7/557: sync
2026-03-09T00:04:05.596 INFO:tasks.workunit.client.0.vm03.stdout:6/532: sync
2026-03-09T00:04:05.596 INFO:tasks.workunit.client.1.vm06.stdout:5/912: mkdir d5/d1c/d21/d28/d12f 0
2026-03-09T00:04:05.597 INFO:tasks.workunit.client.1.vm06.stdout:7/819: creat d0/df/d1a/d3f/d53/fef x:0 0 0
2026-03-09T00:04:05.597 INFO:tasks.workunit.client.1.vm06.stdout:7/820: write d0/df/d1a/d27/d4c/d40/d51/d90/dcc/fce [658878,77179] 0
2026-03-09T00:04:05.597 INFO:tasks.workunit.client.1.vm06.stdout:4/833: sync
2026-03-09T00:04:05.600 INFO:tasks.workunit.client.1.vm06.stdout:5/913: dread d5/d1c/d68/dec/d115/d11e/d92/f86 [0,4194304] 0
2026-03-09T00:04:05.600 INFO:tasks.workunit.client.0.vm03.stdout:8/591: dread d7/df/d1a/d40/d58/f7f [0,4194304] 0
2026-03-09T00:04:05.604 INFO:tasks.workunit.client.1.vm06.stdout:5/914: dread d5/f36 [0,4194304] 0
2026-03-09T00:04:05.605 INFO:tasks.workunit.client.1.vm06.stdout:8/836: symlink db/d74/d87/d100/l10d 0
2026-03-09T00:04:05.609 INFO:tasks.workunit.client.0.vm03.stdout:2/581: getdents d8/d1b/d2a/d2e/d9a 0
2026-03-09T00:04:05.619 INFO:tasks.workunit.client.1.vm06.stdout:3/825: dwrite d11/d28/d57/f7b [0,4194304] 0
2026-03-09T00:04:05.619 INFO:tasks.workunit.client.1.vm06.stdout:3/826: chown d11/d28/d2e/d7e/d104 3406 1
2026-03-09T00:04:05.619 INFO:tasks.workunit.client.1.vm06.stdout:3/827: fsync d11/d28/d2e/d2f/d36/faf 0
2026-03-09T00:04:05.619 INFO:tasks.workunit.client.0.vm03.stdout:7/558: dread d2/d1f/d3a/d24/da4/d91/d67/f8a [0,4194304] 0
2026-03-09T00:04:05.619 INFO:tasks.workunit.client.0.vm03.stdout:7/559: fdatasync d2/d1f/d3a/d24/da4/f7f 0
2026-03-09T00:04:05.621 INFO:tasks.workunit.client.0.vm03.stdout:5/580: dwrite d1c/d20/d97/fb3 [0,4194304] 0
2026-03-09T00:04:05.623 INFO:tasks.workunit.client.0.vm03.stdout:4/714: dwrite d7/d20/d29/d38/da9/ddc/f65 [0,4194304] 0
2026-03-09T00:04:05.625 INFO:tasks.workunit.client.0.vm03.stdout:3/446: dwrite d2/db/d3b/f63 [0,4194304] 0
2026-03-09T00:04:05.631 INFO:tasks.workunit.client.0.vm03.stdout:3/447: dread d2/db/f3a [0,4194304] 0
2026-03-09T00:04:05.631 INFO:tasks.workunit.client.0.vm03.stdout:3/448: dread - d2/db/d40/d51/f57 zero size
2026-03-09T00:04:05.635 INFO:tasks.workunit.client.1.vm06.stdout:1/744: rename d6/d4c/d79/c61 to d6/d21/d2d/d3b/d87/cfa 0
2026-03-09T00:04:05.640 INFO:tasks.workunit.client.1.vm06.stdout:2/915: fsync d7/d1a/d25/d66/d87/fc3 0
2026-03-09T00:04:05.641 INFO:tasks.workunit.client.0.vm03.stdout:0/583: rmdir d2/da/dd/d49/d6c/d4b 39
2026-03-09T00:04:05.644 INFO:tasks.workunit.client.1.vm06.stdout:0/858: mknod d3/d18/d1f/d44/c124 0
2026-03-09T00:04:05.654 INFO:tasks.workunit.client.0.vm03.stdout:6/533: rename f8 to d13/d35/fb6 0
2026-03-09T00:04:05.654 INFO:tasks.workunit.client.0.vm03.stdout:6/534: creat d13/d35/d72/fb7 x:0 0 0
2026-03-09T00:04:05.654 INFO:tasks.workunit.client.0.vm03.stdout:6/535: truncate d13/f5b 259184 0
2026-03-09T00:04:05.654 INFO:tasks.workunit.client.0.vm03.stdout:6/536: chown d13/d35/c76 183548 1
2026-03-09T00:04:05.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:05 vm06.local ceph-mon[58395]: [09/Mar/2026:00:04:03] ENGINE Bus STARTING
2026-03-09T00:04:05.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:05 vm06.local ceph-mon[58395]: [09/Mar/2026:00:04:03] ENGINE Serving on http://192.168.123.103:8765
2026-03-09T00:04:05.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:05 vm06.local ceph-mon[58395]: [09/Mar/2026:00:04:04] ENGINE Serving on https://192.168.123.103:7150
2026-03-09T00:04:05.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:05 vm06.local ceph-mon[58395]: [09/Mar/2026:00:04:04] ENGINE Bus STARTED
2026-03-09T00:04:05.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:05 vm06.local ceph-mon[58395]: [09/Mar/2026:00:04:04] ENGINE Client ('192.168.123.103', 54524) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
2026-03-09T00:04:05.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:05 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:05.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:05 vm06.local ceph-mon[58395]: mgrmap e30: vm03.yvcons(active, since 4s), standbys: vm06.rzcvhn
2026-03-09T00:04:05.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:05 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:05.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:05 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:05.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:05 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:05.671 INFO:tasks.workunit.client.0.vm03.stdout:2/582: symlink d8/d1b/d6c/lbe 0
2026-03-09T00:04:05.671 INFO:tasks.workunit.client.0.vm03.stdout:2/583: truncate d8/d26/d5e/f7c 781282 0
2026-03-09T00:04:05.671 INFO:tasks.workunit.client.0.vm03.stdout:2/584: chown d8/d26/d5e/f64 1659 1
2026-03-09T00:04:05.673 INFO:tasks.workunit.client.1.vm06.stdout:9/725: read d1/d4/d6e/d9/f10 [283209,4618] 0
2026-03-09T00:04:05.675 INFO:tasks.workunit.client.1.vm06.stdout:7/821: getdents d0/df/d1a/d3a/d4e/d5e 0
2026-03-09T00:04:05.675 INFO:tasks.workunit.client.0.vm03.stdout:2/585: write d8/d1b/f31 [2678844,3164] 0
2026-03-09T00:04:05.680 INFO:tasks.workunit.client.0.vm03.stdout:7/560: dwrite d2/f4d [0,4194304] 0
2026-03-09T00:04:05.682 INFO:tasks.workunit.client.0.vm03.stdout:7/561: dread d2/d1f/d3a/d24/da4/d91/d67/f95 [0,4194304] 0
2026-03-09T00:04:05.692 INFO:tasks.workunit.client.1.vm06.stdout:1/745: dwrite d6/d21/d2d/d3b/d87/d9d/dd8/fdb [0,4194304] 0
2026-03-09T00:04:05.699 INFO:tasks.workunit.client.1.vm06.stdout:5/915: unlink d5/d44/l74 0
2026-03-09T00:04:05.699 INFO:tasks.workunit.client.1.vm06.stdout:5/916: read - d5/d1c/d21/f111 zero size
2026-03-09T00:04:05.699 INFO:tasks.workunit.client.1.vm06.stdout:4/834: link d17/d24/c93 d17/d21/d4c/c11e 0
2026-03-09T00:04:05.699 INFO:tasks.workunit.client.0.vm03.stdout:9/600: sync
2026-03-09T00:04:05.701 INFO:tasks.workunit.client.1.vm06.stdout:5/917: dread d5/d1c/d23/d34/fb2 [0,4194304] 0
2026-03-09T00:04:05.701 INFO:tasks.workunit.client.1.vm06.stdout:5/918: truncate d5/d1c/d68/dec/d115/d11e/fa9 5167630 0
2026-03-09T00:04:05.706 INFO:tasks.workunit.client.1.vm06.stdout:8/837: mkdir db/d74/d78/d98/db6/dc7/d10e 0
2026-03-09T00:04:05.706 INFO:tasks.workunit.client.1.vm06.stdout:8/838: read db/dd/f27 [1222245,111100] 0
2026-03-09T00:04:05.706 INFO:tasks.workunit.client.1.vm06.stdout:8/839: dread - db/dd/d24/da7/dab/fdc zero size
2026-03-09T00:04:05.710 INFO:tasks.workunit.client.1.vm06.stdout:3/828: unlink d11/d28/d2e/d2f/d5b/ddb/f11d 0
2026-03-09T00:04:05.710 INFO:tasks.workunit.client.1.vm06.stdout:3/829: dread - d11/d28/d2e/db2/dc2/fd1 zero size
2026-03-09T00:04:05.714 INFO:tasks.workunit.client.0.vm03.stdout:3/449: readlink d2/l19 0
2026-03-09T00:04:05.716 INFO:tasks.workunit.client.1.vm06.stdout:2/916: truncate d7/d1b/fd8 3915188 0
2026-03-09T00:04:05.720 INFO:tasks.workunit.client.1.vm06.stdout:9/726: dwrite d1/d3/d2b/f6d [0,4194304] 0
2026-03-09T00:04:05.724 INFO:tasks.workunit.client.1.vm06.stdout:4/835: dwrite d17/f35 [8388608,4194304] 0
2026-03-09T00:04:05.734 INFO:tasks.workunit.client.0.vm03.stdout:3/450: dread d2/f1d [0,4194304] 0
2026-03-09T00:04:05.740 INFO:tasks.workunit.client.0.vm03.stdout:0/584: creat d2/da/d76/fd9 x:0 0 0
2026-03-09T00:04:05.740 INFO:tasks.workunit.client.0.vm03.stdout:0/585: write d2/f22 [3665073,86648] 0
2026-03-09T00:04:05.740 INFO:tasks.workunit.client.0.vm03.stdout:0/586: write d2/da/dd/d49/d6c/f89 [290636,2656] 0
2026-03-09T00:04:05.745 INFO:tasks.workunit.client.1.vm06.stdout:0/859: creat d3/d18/d2c/d2d/d8c/d10a/f125 x:0 0 0
2026-03-09T00:04:05.757 INFO:tasks.workunit.client.0.vm03.stdout:4/715: rename d7/d20/d29/d4e/fe1 to d7/d6f/da5/fe3 0
2026-03-09T00:04:05.763 INFO:tasks.workunit.client.1.vm06.stdout:6/820: getdents d4/d27/d3e/d57 0
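Both mons record the same sequence in their journals here: the ceph-mgr web server (CherryPy/cheroot) comes up (ENGINE Bus STARTING, Serving on ports 8765 and 7150, Bus STARTED), one client drops its TLS handshake, and mgrmap e30 shows vm03.yvcons active with vm06.rzcvhn standing by, the expected picture while this suite staggers the mgr upgrade. A small sketch for pulling just those signal lines out of a captured log; the prefixes matched are the ones visible above, and the filter itself is illustrative, not a teuthology tool:

import sys

# Keep only the mgr-related signal buried in the mon journals:
# cheroot "ENGINE ..." bus/serving events and "mgrmap eNN:" epochs.
KEYWORDS = ("ENGINE", "mgrmap")

def mgr_events(lines):
    for line in lines:
        if "journalctl@ceph.mon" in line and any(k in line for k in KEYWORDS):
            yield line.rstrip("\n")

if __name__ == "__main__":
    for event in mgr_events(sys.stdin):
        print(event)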
2026-03-09T00:04:05.767 INFO:tasks.workunit.client.1.vm06.stdout:7/822: unlink d0/leb 0
2026-03-09T00:04:05.767 INFO:tasks.workunit.client.1.vm06.stdout:7/823: stat d0/df/d1a/d3a/d4e/d5e/f6f 0
2026-03-09T00:04:05.784 INFO:tasks.workunit.client.0.vm03.stdout:6/537: unlink d13/d35/c76 0
2026-03-09T00:04:05.785 INFO:tasks.workunit.client.0.vm03.stdout:1/656: sync
2026-03-09T00:04:05.785 INFO:tasks.workunit.client.0.vm03.stdout:1/657: dread - d4/d3a/d32/fb9 zero size
2026-03-09T00:04:05.785 INFO:tasks.workunit.client.0.vm03.stdout:1/658: fsync d4/d15/d77/fbd 0
2026-03-09T00:04:05.798 INFO:tasks.workunit.client.0.vm03.stdout:2/586: symlink d8/d26/d5e/d6f/d97/lbf 0
2026-03-09T00:04:05.798 INFO:tasks.workunit.client.0.vm03.stdout:2/587: chown d8/c76 0 1
2026-03-09T00:04:05.798 INFO:tasks.workunit.client.0.vm03.stdout:2/588: fdatasync d8/d1b/d24/fb2 0
2026-03-09T00:04:05.802 INFO:tasks.workunit.client.0.vm03.stdout:5/581: mknod d1c/d20/d55/d43/cbb 0
2026-03-09T00:04:05.807 INFO:tasks.workunit.client.1.vm06.stdout:1/746: symlink d6/d21/d2d/d3b/dc9/lfb 0
2026-03-09T00:04:05.807 INFO:tasks.workunit.client.1.vm06.stdout:5/919: symlink d5/d1c/d68/dec/l130 0
2026-03-09T00:04:05.807 INFO:tasks.workunit.client.1.vm06.stdout:5/920: chown d5/c6 0 1
2026-03-09T00:04:05.807 INFO:tasks.workunit.client.1.vm06.stdout:5/921: stat d5/d1c/d68/da2/d11f 0
2026-03-09T00:04:05.807 INFO:tasks.workunit.client.1.vm06.stdout:3/830: creat d11/d28/f122 x:0 0 0
2026-03-09T00:04:05.807 INFO:tasks.workunit.client.1.vm06.stdout:8/840: mkdir db/d74/d78/d98/db6/dc7/d10f 0
2026-03-09T00:04:05.807 INFO:tasks.workunit.client.1.vm06.stdout:8/841: dread - db/d53/d70/d38/fa8 zero size
2026-03-09T00:04:05.812 INFO:tasks.workunit.client.0.vm03.stdout:3/451: dwrite d2/f16 [0,4194304] 0
2026-03-09T00:04:05.820 INFO:tasks.workunit.client.1.vm06.stdout:2/917: getdents d7/d1b/da5/dca 0
2026-03-09T00:04:05.820 INFO:tasks.workunit.client.1.vm06.stdout:2/918: truncate d7/da/d93/ff3 168336 0
2026-03-09T00:04:05.820 INFO:tasks.workunit.client.1.vm06.stdout:2/919: write d7/d1a/d89/fb7 [1403104,4572] 0
2026-03-09T00:04:05.821 INFO:tasks.workunit.client.0.vm03.stdout:0/587: truncate d2/da/dd/f75 4060028 0
2026-03-09T00:04:05.823 INFO:tasks.workunit.client.0.vm03.stdout:3/452: dread d2/db/d56/f7d [0,4194304] 0
2026-03-09T00:04:05.823 INFO:tasks.workunit.client.0.vm03.stdout:0/588: write d2/da/dd/d49/d6c/d81/db5/dba/fbc [165146,114146] 0
2026-03-09T00:04:05.831 INFO:tasks.workunit.client.0.vm03.stdout:7/562: rename d2/d1f/d35/d9a to d2/d1f/d3a/d24/da4/d46/d54/d8d/dad 0
2026-03-09T00:04:05.832 INFO:tasks.workunit.client.0.vm03.stdout:3/453: write d2/f4e [916563,79160] 0
2026-03-09T00:04:05.835 INFO:tasks.workunit.client.1.vm06.stdout:9/727: truncate d1/d4/d6e/d14/d25/d85/f90 2105845 0
2026-03-09T00:04:05.838 INFO:tasks.workunit.client.0.vm03.stdout:6/538: mknod d13/d35/d71/cb8 0
2026-03-09T00:04:05.844 INFO:tasks.workunit.client.0.vm03.stdout:4/716: creat d7/d20/d6a/fe4 x:0 0 0
2026-03-09T00:04:05.844 INFO:tasks.workunit.client.0.vm03.stdout:5/582: creat d1c/d20/d55/fbc x:0 0 0
2026-03-09T00:04:05.844 INFO:tasks.workunit.client.0.vm03.stdout:5/583: chown d1c/d20/d55/d66/c78 15095334 1
2026-03-09T00:04:05.844 INFO:tasks.workunit.client.0.vm03.stdout:5/584: stat d1c/f1e 0
2026-03-09T00:04:05.848 INFO:tasks.workunit.client.1.vm06.stdout:7/824: rename d0/d55/d99/faa to d0/df/d1a/d35/ff0 0
2026-03-09T00:04:05.850 INFO:tasks.workunit.client.1.vm06.stdout:5/922: link d5/d1c/d68/dec/d115/d11e/d92/f40 d5/d1c/d23/d34/f131 0
2026-03-09T00:04:05.851 INFO:tasks.workunit.client.1.vm06.stdout:0/860: dwrite d3/d18/d1f/d39/d69/f71 [0,4194304] 0
2026-03-09T00:04:05.851 INFO:tasks.workunit.client.0.vm03.stdout:9/601: rmdir d15/d1c/d21/d54/dab 39
2026-03-09T00:04:05.851 INFO:tasks.workunit.client.0.vm03.stdout:9/602: chown d15/d1c/d21/d64/c35 19 1
2026-03-09T00:04:05.856 INFO:tasks.workunit.client.1.vm06.stdout:3/831: unlink d11/d28/d2e/d2f/d36/d8f/lf3 0
2026-03-09T00:04:05.856 INFO:tasks.workunit.client.1.vm06.stdout:3/832: chown d11/d28/d2e/l6d 0 1
2026-03-09T00:04:05.861 INFO:tasks.workunit.client.0.vm03.stdout:9/603: write fd [1761628,67753] 0
2026-03-09T00:04:05.862 INFO:tasks.workunit.client.0.vm03.stdout:7/563: link d2/d1f/d3a/d24/da4/d91/d67/f8a d2/d4/d1e/fae 0
2026-03-09T00:04:05.863 INFO:tasks.workunit.client.1.vm06.stdout:8/842: rmdir db 39
2026-03-09T00:04:05.875 INFO:tasks.workunit.client.1.vm06.stdout:8/843: dread db/d53/d70/f75 [0,4194304] 0
2026-03-09T00:04:05.876 INFO:tasks.workunit.client.0.vm03.stdout:3/454: creat d2/db/d3b/d5d/d6d/f85 x:0 0 0
2026-03-09T00:04:05.877 INFO:tasks.workunit.client.1.vm06.stdout:6/821: dwrite d4/d16/d46/f89 [0,4194304] 0
2026-03-09T00:04:05.879 INFO:tasks.workunit.client.0.vm03.stdout:6/539: unlink d13/d35/d71/d97/f86 0
2026-03-09T00:04:05.879 INFO:tasks.workunit.client.0.vm03.stdout:6/540: write d13/d35/d72/fb7 [787785,83490] 0
2026-03-09T00:04:05.893 INFO:tasks.workunit.client.0.vm03.stdout:2/589: rmdir d8/d1b/d2a/d2e 39
2026-03-09T00:04:05.893 INFO:tasks.workunit.client.0.vm03.stdout:2/590: creat d8/d1b/d2a/d6b/d50/d8a/fc0 x:0 0 0
2026-03-09T00:04:05.893 INFO:tasks.workunit.client.0.vm03.stdout:2/591: write d8/d26/d5e/d6f/f98 [503891,34331] 0
2026-03-09T00:04:05.893 INFO:tasks.workunit.client.1.vm06.stdout:4/836: dwrite d17/d21/d32/fbd [0,4194304] 0
2026-03-09T00:04:05.896 INFO:tasks.workunit.client.0.vm03.stdout:1/659: dwrite d4/d3a/d3d/f58 [0,4194304] 0
2026-03-09T00:04:05.897 INFO:tasks.workunit.client.0.vm03.stdout:1/660: fdatasync d4/d6/f90 0
2026-03-09T00:04:05.902 INFO:tasks.workunit.client.0.vm03.stdout:9/604: symlink d15/d1c/d21/d64/lc6 0
2026-03-09T00:04:05.902 INFO:tasks.workunit.client.0.vm03.stdout:9/605: stat d15/d1c/d21/d64/f3d 0
2026-03-09T00:04:05.902 INFO:tasks.workunit.client.1.vm06.stdout:7/825: symlink d0/df/d1a/d3f/de8/lf1 0
2026-03-09T00:04:05.905 INFO:tasks.workunit.client.0.vm03.stdout:1/661: dread d4/d3a/d32/f4f [0,4194304] 0
2026-03-09T00:04:05.905 INFO:tasks.workunit.client.0.vm03.stdout:1/662: getdents d4/d3a/d61/da6/dc3 0
2026-03-09T00:04:05.905 INFO:tasks.workunit.client.0.vm03.stdout:1/663: readlink d4/d15/d1a/lab 0
2026-03-09T00:04:05.905 INFO:tasks.workunit.client.0.vm03.stdout:1/664: creat d4/d15/d77/fe4 x:0 0 0
2026-03-09T00:04:05.907 INFO:tasks.workunit.client.1.vm06.stdout:2/920: dread d7/da/d4e/d57/f7a [0,4194304] 0
2026-03-09T00:04:05.907 INFO:tasks.workunit.client.0.vm03.stdout:0/589: rename d2/da/dd/d49/d6c/d81 to d2/da/dd/d49/d6c/da6/dda 0
2026-03-09T00:04:05.922 INFO:tasks.workunit.client.1.vm06.stdout:5/923: getdents d5/d1c/d23/d34/d47/dcf 0
2026-03-09T00:04:05.925 INFO:tasks.workunit.client.0.vm03.stdout:6/541: read d13/d1e/d44/d59/d77/f96 [1992758,123853] 0
2026-03-09T00:04:05.925 INFO:tasks.workunit.client.0.vm03.stdout:6/542: write d13/d1e/f2d [3745712,36024] 0
2026-03-09T00:04:05.925 INFO:tasks.workunit.client.0.vm03.stdout:8/592: dwrite d7/df/d1a/f4f [0,4194304] 0
2026-03-09T00:04:05.928 INFO:tasks.workunit.client.0.vm03.stdout:3/455: creat d2/db/d3b/d5d/d6d/d72/f86 x:0 0 0
2026-03-09T00:04:05.939 INFO:tasks.workunit.client.0.vm03.stdout:3/456: readlink d2/db/d40/d44/l53 0
2026-03-09T00:04:05.939 INFO:tasks.workunit.client.0.vm03.stdout:3/457: write d2/db/f14 [8686306,28980] 0
2026-03-09T00:04:05.939 INFO:tasks.workunit.client.0.vm03.stdout:2/592: symlink d8/d26/d5e/d5f/lc1 0
2026-03-09T00:04:05.940 INFO:tasks.workunit.client.0.vm03.stdout:2/593: fsync d8/d26/d5e/d6f/d97/f27 0
2026-03-09T00:04:05.940 INFO:tasks.workunit.client.0.vm03.stdout:2/594: chown d8/d26/d5e/cb8 108841 1
2026-03-09T00:04:05.940 INFO:tasks.workunit.client.0.vm03.stdout:2/595: chown d8/d1b/d2a/d6b/d50/f63 16 1
2026-03-09T00:04:05.940 INFO:tasks.workunit.client.1.vm06.stdout:0/861: creat d3/d18/d1f/d39/d3b/df9/df2/d73/f126 x:0 0 0
2026-03-09T00:04:05.940 INFO:tasks.workunit.client.1.vm06.stdout:0/862: dread - d3/d18/d2c/d2d/d74/da8/d109/f11d zero size
2026-03-09T00:04:05.943 INFO:tasks.workunit.client.0.vm03.stdout:2/596: write d8/d1b/d2a/f4c [7874383,26368] 0
2026-03-09T00:04:05.947 INFO:tasks.workunit.client.0.vm03.stdout:4/717: unlink d7/d20/d6a/d77/fae 0
2026-03-09T00:04:05.952 INFO:tasks.workunit.client.1.vm06.stdout:8/844: unlink db/d1e/f25 0
2026-03-09T00:04:05.952 INFO:tasks.workunit.client.1.vm06.stdout:8/845: stat db/d74/d78/d98/db6/dc7/d101/db7/de8/f106 0
2026-03-09T00:04:05.956 INFO:tasks.workunit.client.1.vm06.stdout:9/728: dwrite d1/d3/f5c [4194304,4194304] 0
2026-03-09T00:04:05.963 INFO:tasks.workunit.client.1.vm06.stdout:8/846: write db/d74/d87/d100/f95 [3708540,28978] 0
2026-03-09T00:04:05.979 INFO:tasks.workunit.client.1.vm06.stdout:6/822: symlink d4/d16/d53/ddf/da6/lfd 0
2026-03-09T00:04:05.979 INFO:tasks.workunit.client.0.vm03.stdout:1/665: mkdir d4/d15/de5 0
2026-03-09T00:04:05.980 INFO:tasks.workunit.client.0.vm03.stdout:1/666: write d4/d3a/d61/d78/f8e [4191666,103151] 0
2026-03-09T00:04:05.980 INFO:tasks.workunit.client.0.vm03.stdout:1/667: write d4/d15/d86/fad [4973947,108473] 0
2026-03-09T00:04:05.984 INFO:tasks.workunit.client.0.vm03.stdout:0/590: rename d2/f1e to d2/da/dd/d49/d6c/d4b/fdb 0
2026-03-09T00:04:05.985 INFO:tasks.workunit.client.1.vm06.stdout:4/837: link d17/d24/d3b/f113 d17/d21/d4c/dc2/f11f 0
2026-03-09T00:04:05.985 INFO:tasks.workunit.client.1.vm06.stdout:4/838: readlink d17/d24/d3b/dbf/ddf/lf6 0
2026-03-09T00:04:05.985 INFO:tasks.workunit.client.1.vm06.stdout:4/839: chown d17/d21/d4c/d66/dd9 31930 1
2026-03-09T00:04:05.985 INFO:tasks.workunit.client.1.vm06.stdout:4/840: dread - d17/d5b/dac/fec zero size
2026-03-09T00:04:05.985 INFO:tasks.workunit.client.1.vm06.stdout:4/841: stat d17/f20 0
2026-03-09T00:04:05.985 INFO:tasks.workunit.client.1.vm06.stdout:4/842: write d17/d21/fb8 [77049,126436] 0
2026-03-09T00:04:05.985 INFO:tasks.workunit.client.0.vm03.stdout:6/543: mknod d13/d35/d72/cb9 0
2026-03-09T00:04:05.987 INFO:tasks.workunit.client.0.vm03.stdout:3/458: mkdir d2/db/d40/d44/d87 0
2026-03-09T00:04:05.992 INFO:tasks.workunit.client.0.vm03.stdout:2/597: mknod d8/d74/cc2 0
2026-03-09T00:04:05.992 INFO:tasks.workunit.client.0.vm03.stdout:2/598: write d8/d1b/d6c/f90 [418481,68658] 0
2026-03-09T00:04:05.992 INFO:tasks.workunit.client.1.vm06.stdout:5/924: creat d5/d1c/d68/dec/d115/d11e/d92/d95/d12e/f132 x:0 0 0
2026-03-09T00:04:05.992 INFO:tasks.workunit.client.1.vm06.stdout:0/863: truncate d3/d18/d1f/d39/d49/d60/f113 751566 0
2026-03-09T00:04:05.998 INFO:tasks.workunit.client.0.vm03.stdout:1/668: mkdir d4/d3a/de6 0
2026-03-09T00:04:05.998 INFO:tasks.workunit.client.0.vm03.stdout:1/669: truncate d4/d3a/d61/d78/f79 400732 0
2026-03-09T00:04:06.001 INFO:tasks.workunit.client.0.vm03.stdout:2/599: dread d8/f5d [0,4194304] 0
2026-03-09T00:04:06.001 INFO:tasks.workunit.client.0.vm03.stdout:2/600: read - d8/d1b/f8d zero size
2026-03-09T00:04:06.001 INFO:tasks.workunit.client.0.vm03.stdout:2/601: write d8/d1b/d2a/f2d [1018890,81624] 0
2026-03-09T00:04:06.001 INFO:tasks.workunit.client.0.vm03.stdout:2/602: chown d8/d1b/f1f 11 1
2026-03-09T00:04:06.001 INFO:tasks.workunit.client.0.vm03.stdout:2/603: write d8/d1b/d2a/d56/fa4 [581999,50246] 0
2026-03-09T00:04:06.001 INFO:tasks.workunit.client.0.vm03.stdout:2/604: creat d8/d1b/d2a/d6b/d50/d8a/fc3 x:0 0 0
2026-03-09T00:04:06.007 INFO:tasks.workunit.client.0.vm03.stdout:0/591: symlink d2/da/d76/d8a/d8f/ldc 0
2026-03-09T00:04:06.007 INFO:tasks.workunit.client.0.vm03.stdout:0/592: creat d2/fdd x:0 0 0
2026-03-09T00:04:06.007 INFO:tasks.workunit.client.0.vm03.stdout:0/593: chown d2/da/dd/d49/d6c/l73 43185918 1
2026-03-09T00:04:06.007 INFO:tasks.workunit.client.0.vm03.stdout:0/594: read - d2/da/dd/d49/d6c/d4b/daf/fc6 zero size
2026-03-09T00:04:06.017 INFO:tasks.workunit.client.0.vm03.stdout:7/564: dwrite d2/d1f/d3a/d24/da4/d46/d81/d96/d37/f56 [4194304,4194304] 0
2026-03-09T00:04:06.030 INFO:tasks.workunit.client.0.vm03.stdout:9/606: rename d15/d7f/fa3 to d15/d1c/d21/fc7 0
2026-03-09T00:04:06.033 INFO:tasks.workunit.client.0.vm03.stdout:6/544: creat d13/d35/d71/d97/da5/db1/fba x:0 0 0
2026-03-09T00:04:06.038 INFO:tasks.workunit.client.0.vm03.stdout:1/670: symlink d4/d3a/d3d/d46/le7 0
2026-03-09T00:04:06.038 INFO:tasks.workunit.client.0.vm03.stdout:1/671: write d4/d3a/d3d/d46/f5d [878360,109169] 0
2026-03-09T00:04:06.047 INFO:tasks.workunit.client.1.vm06.stdout:4/843: mknod d17/d24/d3b/dbf/ddf/dfc/c120 0
2026-03-09T00:04:06.049 INFO:tasks.workunit.client.1.vm06.stdout:5/925: creat d5/d1c/d68/dec/d115/f133 x:0 0 0
2026-03-09T00:04:06.049 INFO:tasks.workunit.client.1.vm06.stdout:5/926: fdatasync d5/fae 0
2026-03-09T00:04:06.050 INFO:tasks.workunit.client.0.vm03.stdout:0/595: rename d2/d5a/ca7 to d2/da/cde 0
2026-03-09T00:04:06.050 INFO:tasks.workunit.client.0.vm03.stdout:0/596: truncate d2/da/d4e/f9c 1898699 0
2026-03-09T00:04:06.051 INFO:tasks.workunit.client.1.vm06.stdout:0/864: mkdir d3/d18/d2c/d2d/d74/daf/d10d/d127 0
2026-03-09T00:04:06.051 INFO:tasks.workunit.client.0.vm03.stdout:1/672: symlink d4/d3a/d3d/le8 0
2026-03-09T00:04:06.053 INFO:tasks.workunit.client.0.vm03.stdout:1/673: read d4/d15/d86/f9b [962151,107358] 0
2026-03-09T00:04:06.053 INFO:tasks.workunit.client.1.vm06.stdout:9/729: getdents d1/d3/d2b/d58 0
2026-03-09T00:04:06.054 INFO:tasks.workunit.client.0.vm03.stdout:1/674: truncate d4/d3a/d32/d6a/f76 1487539 0
2026-03-09T00:04:06.055 INFO:tasks.workunit.client.0.vm03.stdout:0/597: mkdir d2/da/d36/ddf 0
2026-03-09T00:04:06.063 INFO:tasks.workunit.client.1.vm06.stdout:7/826: dwrite d0/df/d1a/d27/d4c/d40/d51/d86/fbd [0,4194304] 0
2026-03-09T00:04:06.064 INFO:tasks.workunit.client.0.vm03.stdout:9/607: dread d15/d1c/d21/d54/d87/d93/fba [0,4194304] 0
2026-03-09T00:04:06.064 INFO:tasks.workunit.client.0.vm03.stdout:9/608: fsync d15/d1c/d36/f6d 0
2026-03-09T00:04:06.065 INFO:tasks.workunit.client.0.vm03.stdout:6/545: dread d13/d1e/f30 [0,4194304] 0
2026-03-09T00:04:06.073 INFO:tasks.workunit.client.1.vm06.stdout:2/921: dwrite d7/da/d63/d81/dfe/db2/dc9/f10b [0,4194304] 0
2026-03-09T00:04:06.090 INFO:tasks.workunit.client.1.vm06.stdout:2/922: fsync d7/d1b/d71/d79/fdf 0
2026-03-09T00:04:06.091 INFO:tasks.workunit.client.1.vm06.stdout:6/823: rmdir d4/d16/d53/d67 39
2026-03-09T00:04:06.091 INFO:tasks.workunit.client.0.vm03.stdout:9/609: dread fb [0,4194304] 0
2026-03-09T00:04:06.091 INFO:tasks.workunit.client.0.vm03.stdout:5/585: dwrite d1c/d20/d55/d4f/f69 [0,4194304] 0
2026-03-09T00:04:06.091 INFO:tasks.workunit.client.0.vm03.stdout:1/675: rename d4/l10 to d4/d3a/d8f/le9 0
2026-03-09T00:04:06.091 INFO:tasks.workunit.client.0.vm03.stdout:1/676: chown d4/d6 31676202 1
2026-03-09T00:04:06.091 INFO:tasks.workunit.client.0.vm03.stdout:1/677: fsync f2 0
2026-03-09T00:04:06.091 INFO:tasks.workunit.client.0.vm03.stdout:9/610: symlink d15/d1c/d21/d54/d87/d93/lc8 0
2026-03-09T00:04:06.091 INFO:tasks.workunit.client.0.vm03.stdout:5/586: symlink d1c/d20/d55/d4f/d58/db5/lbd 0
2026-03-09T00:04:06.096 INFO:tasks.workunit.client.1.vm06.stdout:4/844: mkdir d17/d24/d3b/d97/db7/d121 0
2026-03-09T00:04:06.096 INFO:tasks.workunit.client.1.vm06.stdout:4/845: readlink d17/d24/d3b/d97/lc5 0
2026-03-09T00:04:06.096 INFO:tasks.workunit.client.0.vm03.stdout:9/611: creat d15/d1c/d9c/fc9 x:0 0 0
2026-03-09T00:04:06.097 INFO:tasks.workunit.client.1.vm06.stdout:0/865: mknod d3/d18/d2c/d2d/d74/d90/c128 0
2026-03-09T00:04:06.098 INFO:tasks.workunit.client.0.vm03.stdout:5/587: truncate d1c/d20/d55/d4f/d58/d5d/faa 2639930 0
2026-03-09T00:04:06.098 INFO:tasks.workunit.client.0.vm03.stdout:5/588: dread - d1c/d20/fa3 zero size
2026-03-09T00:04:06.099 INFO:tasks.workunit.client.0.vm03.stdout:9/612: rmdir d15/d1c/d28/dc1 0
2026-03-09T00:04:06.103 INFO:tasks.workunit.client.0.vm03.stdout:9/613: creat d15/d1c/d28/d6e/da2/fca x:0 0 0
2026-03-09T00:04:06.103 INFO:tasks.workunit.client.0.vm03.stdout:9/614: chown d15/d1c/d21/d64/f50 16397174 1
2026-03-09T00:04:06.103 INFO:tasks.workunit.client.0.vm03.stdout:5/589: mknod d1c/d20/d56/da1/cbe 0
2026-03-09T00:04:06.103 INFO:tasks.workunit.client.0.vm03.stdout:9/615: rmdir d15/d1c/d21/d54 39
2026-03-09T00:04:06.105 INFO:tasks.workunit.client.1.vm06.stdout:4/846: dread d17/d21/d4c/d50/ff4 [0,4194304] 0
2026-03-09T00:04:06.105 INFO:tasks.workunit.client.1.vm06.stdout:4/847: creat d17/d24/d3b/dbf/dea/f122 x:0 0 0
2026-03-09T00:04:06.107 INFO:tasks.workunit.client.1.vm06.stdout:0/866: write d3/fa [255642,91609] 0
2026-03-09T00:04:06.112 INFO:tasks.workunit.client.0.vm03.stdout:9/616: dread f11 [0,4194304] 0
2026-03-09T00:04:06.113 INFO:tasks.workunit.client.1.vm06.stdout:2/923: link d7/da/d63/d81/dfe/db2/cda d7/d1b/d71/d79/c120 0
2026-03-09T00:04:06.113 INFO:tasks.workunit.client.1.vm06.stdout:2/924: stat d7/da/db/de/f49 0
2026-03-09T00:04:06.113 INFO:tasks.workunit.client.1.vm06.stdout:2/925: stat d7/da/d4e/d57/fc5 0
2026-03-09T00:04:06.115 INFO:tasks.workunit.client.1.vm06.stdout:6/824: unlink d4/d27/lda 0
2026-03-09T00:04:06.116 INFO:tasks.workunit.client.0.vm03.stdout:5/590: getdents d1c/d51/d6a/d75 0
2026-03-09T00:04:06.116 INFO:tasks.workunit.client.0.vm03.stdout:9/617: dread d15/d1c/d21/d64/f50 [0,4194304] 0
2026-03-09T00:04:06.116 INFO:tasks.workunit.client.0.vm03.stdout:9/618: write d15/d1c/d28/f2f [1354687,108221] 0
2026-03-09T00:04:06.125 INFO:tasks.workunit.client.0.vm03.stdout:9/619: getdents d15/d1c/d36 0
2026-03-09T00:04:06.142 INFO:tasks.workunit.client.0.vm03.stdout:9/620: rename d15/l59 to d15/d1c/d21/d54/d87/lcb 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.0.vm03.stdout:9/621: chown d15/d1c/d28/d6e/da2/fca 615563 1
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.0.vm03.stdout:9/622: fsync fb 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.0.vm03.stdout:9/623: chown d15/d1c/d21/d54/d87 20764 1
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.0.vm03.stdout:9/624: symlink d15/d1c/d28/lcc 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.0.vm03.stdout:9/625: getdents d15/d1c/d21/db5 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.0.vm03.stdout:9/626: creat d15/d1c/d21/fcd x:0 0 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.0.vm03.stdout:9/627: chown d15/d1c/d21/f4c 24 1
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.1.vm06.stdout:4/848: mknod d17/d24/d49/de4/db0/c123 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.1.vm06.stdout:0/867: creat d3/f129 x:0 0 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.1.vm06.stdout:2/926: creat d7/d1a/d89/f121 x:0 0 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.1.vm06.stdout:6/825: rmdir d4/db4 39
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.1.vm06.stdout:4/849: mknod d17/c124 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.1.vm06.stdout:4/850: write d17/d21/d4c/fd4 [1268699,719] 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.1.vm06.stdout:0/868: rmdir d3/d18/d2c/d2d/d74/dc7/d122 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.1.vm06.stdout:0/869: stat d3/d18/lb6 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.1.vm06.stdout:2/927: creat d7/da/d63/d81/f122 x:0 0 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.1.vm06.stdout:0/870: rmdir d3/d18/d2c/d2d/d74/da8/d11a 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.1.vm06.stdout:2/928: unlink d7/d1b/d71/d79/fdf 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.1.vm06.stdout:2/929: fdatasync d7/d1a/d96/fba 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.1.vm06.stdout:0/871: truncate d3/d18/d1f/d39/d49/f64 2678142 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.1.vm06.stdout:0/872: unlink d3/f1b 0
2026-03-09T00:04:06.143 INFO:tasks.workunit.client.1.vm06.stdout:2/930: rmdir d7/d1b/d71/d79/db4/dc1 39
2026-03-09T00:04:06.146 INFO:tasks.workunit.client.1.vm06.stdout:1/747: dwrite d6/db0/fdc [0,4194304] 0
2026-03-09T00:04:06.166 INFO:tasks.workunit.client.0.vm03.stdout:3/459: dwrite d2/db/f28 [8388608,4194304] 0
2026-03-09T00:04:06.166 INFO:tasks.workunit.client.0.vm03.stdout:3/460: stat d2/db/d40/d58 0
2026-03-09T00:04:06.168 INFO:tasks.workunit.client.0.vm03.stdout:3/461: mkdir d2/db/d40/d88 0
2026-03-09T00:04:06.169 INFO:tasks.workunit.client.1.vm06.stdout:2/931: creat d7/d1a/d39/df1/f123 x:0 0 0
2026-03-09T00:04:06.169 INFO:tasks.workunit.client.1.vm06.stdout:2/932: getdents d7/d1a/d89/d105 0
2026-03-09T00:04:06.169 INFO:tasks.workunit.client.1.vm06.stdout:2/933: chown d7/d1b/d31/l38 26 1
2026-03-09T00:04:06.187 INFO:tasks.workunit.client.0.vm03.stdout:4/718: dwrite d7/d20/d29/d38/da9/ddc/f7c [0,4194304] 0
2026-03-09T00:04:06.187 INFO:tasks.workunit.client.0.vm03.stdout:4/719: chown d7/d20/d6a/f76 539966 1
2026-03-09T00:04:06.187 INFO:tasks.workunit.client.1.vm06.stdout:6/826: dread d4/f36 [8388608,4194304] 0
2026-03-09T00:04:06.189 INFO:tasks.workunit.client.0.vm03.stdout:4/720: creat d7/d20/d29/fe5 x:0 0 0
2026-03-09T00:04:06.189 INFO:tasks.workunit.client.1.vm06.stdout:6/827: symlink d4/d16/d53/lfe 0
2026-03-09T00:04:06.189 INFO:tasks.workunit.client.1.vm06.stdout:6/828: read d4/d27/d3e/d78/fc9 [2359823,92935] 0
2026-03-09T00:04:06.189 INFO:tasks.workunit.client.1.vm06.stdout:6/829: mknod d4/d8d/cff 0
2026-03-09T00:04:06.241 INFO:tasks.workunit.client.1.vm06.stdout:8/847: dwrite db/d1e/d46/f4b [4194304,4194304] 0
2026-03-09T00:04:06.242 INFO:tasks.workunit.client.1.vm06.stdout:8/848: stat db/dd/f40 0
2026-03-09T00:04:06.244 INFO:tasks.workunit.client.1.vm06.stdout:8/849: creat db/d53/d6d/f110 x:0 0 0
2026-03-09T00:04:06.245 INFO:tasks.workunit.client.1.vm06.stdout:8/850: write db/d53/f76 [1148841,111354] 0
2026-03-09T00:04:06.246 INFO:tasks.workunit.client.1.vm06.stdout:8/851: readlink db/d74/d87/d100/d8f/lf4 0
2026-03-09T00:04:06.246 INFO:tasks.workunit.client.1.vm06.stdout:8/852: write db/d74/d78/d98/db6/dc7/d101/db7/ffb [150914,33691] 0
2026-03-09T00:04:06.253 INFO:tasks.workunit.client.1.vm06.stdout:4/851: write d17/d24/d49/f62 [3161843,12624] 0
2026-03-09T00:04:06.253 INFO:tasks.workunit.client.0.vm03.stdout:2/605: dwrite d8/d1b/d24/da5/da8/fba [0,4194304] 0
2026-03-09T00:04:06.285 INFO:tasks.workunit.client.1.vm06.stdout:5/927: dwrite d5/d1c/d68/dec/d115/d11e/d92/d49/fc2 [0,4194304] 0
2026-03-09T00:04:06.285 INFO:tasks.workunit.client.1.vm06.stdout:5/928: fsync d5/f3d 0
2026-03-09T00:04:06.287 INFO:tasks.workunit.client.0.vm03.stdout:8/593: dwrite d7/df/d1a/d40/f78 [0,4194304] 0
2026-03-09T00:04:06.287 INFO:tasks.workunit.client.0.vm03.stdout:8/594: dread - d7/df/d1a/d40/fb5 zero size
2026-03-09T00:04:06.288 INFO:tasks.workunit.client.0.vm03.stdout:8/595: rmdir d7/df/d1e/d38 39
2026-03-09T00:04:06.293 INFO:tasks.workunit.client.0.vm03.stdout:8/596: truncate d7/f11 1704917 0
2026-03-09T00:04:06.293 INFO:tasks.workunit.client.0.vm03.stdout:8/597: creat d7/df/d1a/d40/d58/fb6 x:0 0 0
2026-03-09T00:04:06.302 INFO:tasks.workunit.client.0.vm03.stdout:8/598: rmdir d7/df/d1a/d40/db3 39
2026-03-09T00:04:06.304 INFO:tasks.workunit.client.0.vm03.stdout:8/599: symlink d7/df/d1a/d40/db3/lb7 0
2026-03-09T00:04:06.304 INFO:tasks.workunit.client.0.vm03.stdout:8/600: truncate d7/df/d1e/d38/d60/f6e 852330 0
2026-03-09T00:04:06.304 INFO:tasks.workunit.client.0.vm03.stdout:8/601: dread - d7/df/d1e/d3f/f8f zero size
2026-03-09T00:04:06.305 INFO:tasks.workunit.client.0.vm03.stdout:8/602: symlink d7/lb8 0
2026-03-09T00:04:06.305 INFO:tasks.workunit.client.0.vm03.stdout:8/603: dread - d7/df/d1a/d2b/f72 zero size
2026-03-09T00:04:06.329 INFO:tasks.workunit.client.0.vm03.stdout:2/606: write d8/d1b/d2a/f33 [3071138,30542] 0
2026-03-09T00:04:06.329 INFO:tasks.workunit.client.0.vm03.stdout:0/598: dwrite d2/da/dd/d49/d6c/da6/dda/fc5 [0,4194304] 0
2026-03-09T00:04:06.330 INFO:tasks.workunit.client.0.vm03.stdout:0/599: mknod d2/ce0 0
2026-03-09T00:04:06.334 INFO:tasks.workunit.client.0.vm03.stdout:0/600: rmdir d2/da/dd/d6e 39
2026-03-09T00:04:06.334 INFO:tasks.workunit.client.0.vm03.stdout:0/601: chown d2/da/c10 233 1
2026-03-09T00:04:06.334 INFO:tasks.workunit.client.0.vm03.stdout:0/602: dread - d2/da/d1a/fb7 zero size
2026-03-09T00:04:06.352 INFO:tasks.workunit.client.1.vm06.stdout:2/934: write d7/da/db/de/f53 [810373,102727] 0
2026-03-09T00:04:06.352 INFO:tasks.workunit.client.1.vm06.stdout:7/827: dread d0/df/d1a/d27/d4c/d40/d5b/fd7 [0,4194304] 0
2026-03-09T00:04:06.352 INFO:tasks.workunit.client.1.vm06.stdout:2/935: readlink d7/d1a/d3c/ld5 0
2026-03-09T00:04:06.352 INFO:tasks.workunit.client.1.vm06.stdout:2/936: dread - d7/d1b/d71/d79/f116 zero size
2026-03-09T00:04:06.354 INFO:tasks.workunit.client.1.vm06.stdout:7/828: symlink d0/df/d1a/d27/d70/d9b/de2/lf2 0
2026-03-09T00:04:06.358 INFO:tasks.workunit.client.1.vm06.stdout:7/829: creat d0/df/d7b/ff3 x:0 0 0
2026-03-09T00:04:06.358 INFO:tasks.workunit.client.1.vm06.stdout:7/830: chown d0/df/d17/dba/de4 52 1
2026-03-09T00:04:06.358 INFO:tasks.workunit.client.1.vm06.stdout:7/831: creat d0/df/d1a/d27/d4c/d40/d51/d90/dae/de0/ff4 x:0 0 0
2026-03-09T00:04:06.358 INFO:tasks.workunit.client.1.vm06.stdout:7/832: write d0/df/d1a/d35/f77 [2399283,19889] 0
2026-03-09T00:04:06.365 INFO:tasks.workunit.client.0.vm03.stdout:8/604: dread d7/df/d1a/d40/f5e [0,4194304] 0
2026-03-09T00:04:06.365 INFO:tasks.workunit.client.0.vm03.stdout:8/605: chown d7/df/f55 1026934 1
2026-03-09T00:04:06.365 INFO:tasks.workunit.client.0.vm03.stdout:8/606: fdatasync d7/df/f31 0
2026-03-09T00:04:06.375 INFO:tasks.workunit.client.1.vm06.stdout:4/852: dwrite d17/d24/d3b/d5e/f6f [0,4194304] 0
2026-03-09T00:04:06.376 INFO:tasks.workunit.client.0.vm03.stdout:2/607: dread d8/d1b/d24/da5/da8/fba [0,4194304] 0
2026-03-09T00:04:06.376 INFO:tasks.workunit.client.0.vm03.stdout:2/608: write f6 [3242211,90228] 0
2026-03-09T00:04:06.376 INFO:tasks.workunit.client.0.vm03.stdout:8/607: rename d7/df/d1e/d38/d4c/c51 to d7/df/d1e/cb9 0
2026-03-09T00:04:06.386 INFO:tasks.workunit.client.0.vm03.stdout:7/565: dwrite d2/d1f/d3a/d24/da4/d46/d54/f77 [0,4194304] 0
2026-03-09T00:04:06.386 INFO:tasks.workunit.client.0.vm03.stdout:7/566: dread - d2/d4/d1e/f97 zero size
2026-03-09T00:04:06.386 INFO:tasks.workunit.client.0.vm03.stdout:7/567: write d2/d1f/f62 [1077878,50297] 0
2026-03-09T00:04:06.390 INFO:tasks.workunit.client.1.vm06.stdout:0/873: dwrite d3/d18/d1f/d39/d49/f4b [0,4194304] 0
2026-03-09T00:04:06.392 INFO:tasks.workunit.client.0.vm03.stdout:9/628: write d15/d1c/d28/faa [3649846,129997] 0
2026-03-09T00:04:06.395 INFO:tasks.workunit.client.0.vm03.stdout:2/609: rmdir d8/d1b/d8f 39
2026-03-09T00:04:06.396 INFO:tasks.workunit.client.0.vm03.stdout:2/610: truncate d8/d1b/d24/f82 4704539 0
2026-03-09T00:04:06.396 INFO:tasks.workunit.client.0.vm03.stdout:2/611: write d8/d1b/d2a/d6b/f92 [443290,75050] 0
2026-03-09T00:04:06.397 INFO:tasks.workunit.client.1.vm06.stdout:0/874: creat d3/d18/d1f/d39/d3b/df9/df2/f12a x:0 0 0
2026-03-09T00:04:06.401 INFO:tasks.workunit.client.0.vm03.stdout:1/678: dwrite f0 [0,4194304] 0
2026-03-09T00:04:06.402 INFO:tasks.workunit.client.0.vm03.stdout:6/546: dwrite d13/d35/d72/f85 [0,4194304] 0
2026-03-09T00:04:06.402 INFO:tasks.workunit.client.0.vm03.stdout:6/547: chown d13/d1e/f21 22859507 1
2026-03-09T00:04:06.403 INFO:tasks.workunit.client.1.vm06.stdout:0/875: write d3/d18/f14 [2164071,68806] 0
2026-03-09T00:04:06.403 INFO:tasks.workunit.client.1.vm06.stdout:1/748: dwrite d6/d4c/fc3 [0,4194304] 0
2026-03-09T00:04:06.419 INFO:tasks.workunit.client.0.vm03.stdout:6/548: dread d13/d1e/d44/d59/d77/f94 [0,4194304] 0
2026-03-09T00:04:06.423 INFO:tasks.workunit.client.1.vm06.stdout:1/749: rename d6/d4c/d51/db3 to d6/d21/dfc 0
2026-03-09T00:04:06.423 INFO:tasks.workunit.client.1.vm06.stdout:1/750: fdatasync d6/d4c/feb 0
2026-03-09T00:04:06.423 INFO:tasks.workunit.client.0.vm03.stdout:9/629: symlink d15/db6/lce 0
2026-03-09T00:04:06.427 INFO:tasks.workunit.client.0.vm03.stdout:1/679: mknod d4/d3a/d32/dc2/cea 0
2026-03-09T00:04:06.429 INFO:tasks.workunit.client.1.vm06.stdout:0/876: symlink d3/d18/d2c/d2d/d74/daf/de3/l12b 0
2026-03-09T00:04:06.432 INFO:tasks.workunit.client.0.vm03.stdout:6/549: mknod d13/d35/d74/cbb 0
2026-03-09T00:04:06.432 INFO:tasks.workunit.client.0.vm03.stdout:6/550: write d13/f55 [436385,28123] 0
2026-03-09T00:04:06.434 INFO:tasks.workunit.client.0.vm03.stdout:6/551: getdents d13/d1e/d44/d4a/d52 0
2026-03-09T00:04:06.435 INFO:tasks.workunit.client.0.vm03.stdout:6/552: link d13/d35/d72/laa d13/d1e/d44/lbc 0
2026-03-09T00:04:06.436 INFO:tasks.workunit.client.0.vm03.stdout:6/553: getdents d13/d35/d69 0
2026-03-09T00:04:06.436 INFO:tasks.workunit.client.0.vm03.stdout:6/554: write d13/d35/d72/fb7 [989246,91124] 0
2026-03-09T00:04:06.436 INFO:tasks.workunit.client.0.vm03.stdout:6/555: chown d13/d35/fb6 3389 1
2026-03-09T00:04:06.437 INFO:tasks.workunit.client.0.vm03.stdout:6/556: mknod d13/d35/d71/d97/da5/cbd 0
2026-03-09T00:04:06.438 INFO:tasks.workunit.client.0.vm03.stdout:6/557: write fb [6742294,83946] 0
2026-03-09T00:04:06.439 INFO:tasks.workunit.client.0.vm03.stdout:6/558: symlink d13/d35/db5/lbe 0
2026-03-09T00:04:06.450 INFO:tasks.workunit.client.1.vm06.stdout:1/751: read d6/d63/f9c [379415,110477] 0
2026-03-09T00:04:06.473 INFO:tasks.workunit.client.1.vm06.stdout:9/730: dwrite d1/da7/fc0 [0,4194304] 0
2026-03-09T00:04:06.482 INFO:tasks.workunit.client.1.vm06.stdout:9/731: creat d1/d3/d4f/d52/de3/de5/fed x:0 0 0
2026-03-09T00:04:06.488 INFO:tasks.workunit.client.1.vm06.stdout:9/732: symlink d1/d3/d4f/d52/lee 0
2026-03-09T00:04:06.530 INFO:tasks.workunit.client.1.vm06.stdout:6/830: dwrite d4/d16/d53/df2/ffa [0,4194304] 0
2026-03-09T00:04:06.537 INFO:tasks.workunit.client.1.vm06.stdout:5/929: dwrite d5/d1c/d21/d28/d5e/d66/d78/fc1 [0,4194304] 0
2026-03-09T00:04:06.546 INFO:tasks.workunit.client.1.vm06.stdout:5/930: creat d5/db1/dcc/f134 x:0 0 0
2026-03-09T00:04:06.547 INFO:tasks.workunit.client.1.vm06.stdout:6/831: write d4/d16/d53/ddf/d4b/f83 [685696,19275] 0
2026-03-09T00:04:06.549 INFO:tasks.workunit.client.1.vm06.stdout:5/931: symlink d5/d1c/d68/dec/d115/d11e/d92/d95/l135 0
2026-03-09T00:04:06.549 INFO:tasks.workunit.client.1.vm06.stdout:5/932: chown d5/d1c 10957 1
2026-03-09T00:04:06.552 INFO:tasks.workunit.client.1.vm06.stdout:6/832: read d4/d16/d46/d90/fd0 [3457578,36259] 0
2026-03-09T00:04:06.553 INFO:tasks.workunit.client.1.vm06.stdout:5/933: creat d5/d1c/d68/dec/d115/d11e/d92/d95/f136 x:0 0 0
2026-03-09T00:04:06.555 INFO:tasks.workunit.client.1.vm06.stdout:6/833: mknod d4/d8d/c100 0
2026-03-09T00:04:06.566 INFO:tasks.workunit.client.1.vm06.stdout:4/853: dwrite d17/d24/d3b/d5e/f6f [0,4194304] 0
2026-03-09T00:04:06.566 INFO:tasks.workunit.client.1.vm06.stdout:2/937: dwrite d7/d1b/d71/fcb [0,4194304] 0
2026-03-09T00:04:06.567 INFO:tasks.workunit.client.0.vm03.stdout:3/462: dwrite d2/db/f28 [4194304,4194304] 0
2026-03-09T00:04:06.569 INFO:tasks.workunit.client.0.vm03.stdout:8/608: dwrite d7/df/d1e/d3f/f59 [0,4194304] 0
2026-03-09T00:04:06.571 INFO:tasks.workunit.client.0.vm03.stdout:7/568: dwrite d2/d1f/d3a/f5d [0,4194304] 0
2026-03-09T00:04:06.576 INFO:tasks.workunit.client.0.vm03.stdout:7/569: chown d2/d1f/d3a/d24/da4/d46/d81/d96/d8e/cab 126617963 1
2026-03-09T00:04:06.576 INFO:tasks.workunit.client.0.vm03.stdout:7/570: read d2/d1f/d3a/d24/da4/d46/d81/d96/f3f [1725233,45238] 0
2026-03-09T00:04:06.584 INFO:tasks.workunit.client.0.vm03.stdout:3/463: dread d2/db/d3b/f63 [0,4194304] 0
2026-03-09T00:04:06.584 INFO:tasks.workunit.client.0.vm03.stdout:3/464: getdents d2/db/d6a/d70 0
2026-03-09T00:04:06.586 INFO:tasks.workunit.client.1.vm06.stdout:4/854: rename d17/d21/d32/d102 to d17/d24/d3b/d75/d125 0
2026-03-09T00:04:06.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:06 vm03.local ceph-mon[52346]: pgmap v6: 65 pgs: 65 active+clean; 2.8 GiB data, 9.5 GiB used, 110 GiB / 120 GiB avail
2026-03-09T00:04:06.595 INFO:tasks.workunit.client.0.vm03.stdout:7/571: truncate d2/fc 1313289 0
2026-03-09T00:04:06.595 INFO:tasks.workunit.client.0.vm03.stdout:7/572: stat d2/d1f/d3a/d24/da4/d46/d81/d96/d37/d39/d6e/fa1 0
2026-03-09T00:04:06.595 INFO:tasks.workunit.client.1.vm06.stdout:4/855: mkdir d17/d24/d3b/d97/d126 0
2026-03-09T00:04:06.602 INFO:tasks.workunit.client.1.vm06.stdout:4/856: mkdir d17/d24/d3b/d5e/d127 0
2026-03-09T00:04:06.602 INFO:tasks.workunit.client.1.vm06.stdout:4/857: dread - d17/d21/d4c/d66/dd9/f7e zero size
2026-03-09T00:04:06.602 INFO:tasks.workunit.client.0.vm03.stdout:3/465: creat d2/db/d40/d88/f89 x:0 0 0
2026-03-09T00:04:06.603 INFO:tasks.workunit.client.1.vm06.stdout:2/938: rename d7/da/d1c/c43 to d7/d1b/da5/daa/c124 0
2026-03-09T00:04:06.603 INFO:tasks.workunit.client.1.vm06.stdout:2/939: dread - d7/d1b/d71/d79/f116 zero size
2026-03-09T00:04:06.603 INFO:tasks.workunit.client.1.vm06.stdout:2/940: chown d7/da/d63/d81/dfe/db2 7174827 1
2026-03-09T00:04:06.617 INFO:tasks.workunit.client.1.vm06.stdout:2/941: unlink d7/da/db/c58 0
2026-03-09T00:04:06.618 INFO:tasks.workunit.client.1.vm06.stdout:2/942: dread d7/da/fbf [0,4194304] 0
2026-03-09T00:04:06.620 INFO:tasks.workunit.client.0.vm03.stdout:9/630: dwrite d15/d1c/d21/fc7 [0,4194304] 0
2026-03-09T00:04:06.621 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:06 vm06.local ceph-mon[58395]: pgmap v6: 65 pgs: 65 active+clean; 2.8 GiB data, 9.5 GiB used, 110 GiB / 120 GiB avail
2026-03-09T00:04:06.634 INFO:tasks.workunit.client.1.vm06.stdout:2/943: dread d7/da/d1c/ff5 [0,4194304] 0
2026-03-09T00:04:06.637 INFO:tasks.workunit.client.1.vm06.stdout:2/944: symlink d7/d1b/d71/d79/l125 0
2026-03-09T00:04:06.637 INFO:tasks.workunit.client.1.vm06.stdout:2/945: chown d7/da/d63/c7f 107994835 1
2026-03-09T00:04:06.638 INFO:tasks.workunit.client.0.vm03.stdout:7/573: mkdir d2/d4/d1e/d5e/daf 0
2026-03-09T00:04:06.644 INFO:tasks.workunit.client.0.vm03.stdout:3/466: creat d2/f8a x:0 0 0
2026-03-09T00:04:06.644 INFO:tasks.workunit.client.1.vm06.stdout:2/946: creat d7/da/db/d109/f126 x:0 0 0
2026-03-09T00:04:06.644 INFO:tasks.workunit.client.1.vm06.stdout:2/947: write d7/d1a/d39/df1/f11e [452234,119866] 0
2026-03-09T00:04:06.644 INFO:tasks.workunit.client.1.vm06.stdout:2/948: chown d7/lcd 347 1
2026-03-09T00:04:06.651 INFO:tasks.workunit.client.0.vm03.stdout:2/612: dwrite d8/d1b/f3d [0,4194304] 0
2026-03-09T00:04:06.656 INFO:tasks.workunit.client.1.vm06.stdout:2/949: creat d7/d1b/d71/d79/db4/f127 x:0 0 0
2026-03-09T00:04:06.661 INFO:tasks.workunit.client.1.vm06.stdout:7/833: dwrite d0/df/d1a/d27/d4c/fb0 [0,4194304] 0
2026-03-09T00:04:06.661 INFO:tasks.workunit.client.0.vm03.stdout:9/631: mkdir d15/d1c/d21/d54/d87/d93/dcf 0
2026-03-09T00:04:06.663 INFO:tasks.workunit.client.1.vm06.stdout:1/752: dwrite d6/d21/f2e [0,4194304] 0
2026-03-09T00:04:06.663 INFO:tasks.workunit.client.0.vm03.stdout:1/680: dwrite d4/d3a/d32/fb9 [0,4194304] 0
2026-03-09T00:04:06.664 INFO:tasks.workunit.client.1.vm06.stdout:0/877: dwrite d3/f1c [0,4194304] 0
2026-03-09T00:04:06.669 INFO:tasks.workunit.client.1.vm06.stdout:0/878: dread d3/d18/d1f/d39/d49/d60/fb3 [0,4194304] 0
2026-03-09T00:04:06.674 INFO:tasks.workunit.client.1.vm06.stdout:2/950: mknod d7/d1a/d39/df1/c128 0
2026-03-09T00:04:06.676 INFO:tasks.workunit.client.1.vm06.stdout:9/733: dwrite d1/d4/d6e/d14/d25/f70 [0,4194304] 0
2026-03-09T00:04:06.677 INFO:tasks.workunit.client.1.vm06.stdout:9/734: write d1/d3/d4f/d52/fa5 [1313940,1690] 0
2026-03-09T00:04:06.677 INFO:tasks.workunit.client.1.vm06.stdout:9/735: dread d1/d4/d6e/f2c [0,4194304] 0
2026-03-09T00:04:06.682 INFO:tasks.workunit.client.1.vm06.stdout:6/834: dwrite d4/d27/d3e/f41 [0,4194304] 0
2026-03-09T00:04:06.683 INFO:tasks.workunit.client.0.vm03.stdout:2/613: truncate d8/d1b/d24/f38 91591 0
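The pgmap lines interleaved above ("pgmap v6: 65 pgs: 65 active+clean; 2.8 GiB data, 9.5 GiB used, 110 GiB / 120 GiB avail") are the cluster's heartbeat while the workload runs; all PGs stay active+clean through this window. A sketch that lifts the usage figures out of such a line, assuming this exact summary format (the helper and its field names are illustrative):

import re

PGMAP = re.compile(
    r"pgmap v(\d+): (\d+) pgs: .*?; "
    r"([\d.]+ \S+) data, ([\d.]+ \S+) used, ([\d.]+ \S+) / ([\d.]+ \S+) avail")

def parse_pgmap(line):
    """Pull the epoch, pg count and space figures from a pgmap status line."""
    m = PGMAP.search(line)
    if m is None:
        return None
    version, pgs, data, used, avail, total = m.groups()
    return {"version": int(version), "pgs": int(pgs),
            "data": data, "used": used, "avail": avail, "total": total}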
2026-03-09T00:04:06.683 INFO:tasks.workunit.client.0.vm03.stdout:2/614: chown d8/d26/d5e/d6f 2338968 1
2026-03-09T00:04:06.691 INFO:tasks.workunit.client.0.vm03.stdout:6/559: dwrite f2 [0,4194304] 0
2026-03-09T00:04:06.694 INFO:tasks.workunit.client.1.vm06.stdout:7/834: mknod d0/df/d1a/d27/d4c/d40/d51/d90/dae/cf5 0
2026-03-09T00:04:06.694 INFO:tasks.workunit.client.1.vm06.stdout:1/753: mknod d6/dc4/cfd 0
2026-03-09T00:04:06.694 INFO:tasks.workunit.client.1.vm06.stdout:7/835: readlink d0/df/d1a/d3a/l54 0
2026-03-09T00:04:06.694 INFO:tasks.workunit.client.0.vm03.stdout:9/632: symlink d15/d1c/d21/d54/d87/ld0 0
2026-03-09T00:04:06.694 INFO:tasks.workunit.client.0.vm03.stdout:9/633: readlink d15/l19 0
2026-03-09T00:04:06.694 INFO:tasks.workunit.client.0.vm03.stdout:9/634: dread - d15/d1c/d28/d6e/fa9 zero size
2026-03-09T00:04:06.697 INFO:tasks.workunit.client.0.vm03.stdout:6/560: dread d13/d1e/d44/d4a/f58 [0,4194304] 0
2026-03-09T00:04:06.697 INFO:tasks.workunit.client.0.vm03.stdout:6/561: chown d13/d1e/d44/d4a/f58 3982388 1
2026-03-09T00:04:06.697 INFO:tasks.workunit.client.0.vm03.stdout:6/562: fdatasync d13/d1e/d44/d59/f6e 0
2026-03-09T00:04:06.697 INFO:tasks.workunit.client.0.vm03.stdout:6/563: write d13/d35/d71/fb0 [889862,130973] 0
2026-03-09T00:04:06.698 INFO:tasks.workunit.client.0.vm03.stdout:0/603: truncate d2/da/dd/d49/d6c/d4b/fdb 1120737 0
2026-03-09T00:04:06.701 INFO:tasks.workunit.client.0.vm03.stdout:6/564: write d13/d1e/f9f [1971759,164] 0
2026-03-09T00:04:06.701 INFO:tasks.workunit.client.0.vm03.stdout:6/565: readlink d13/d1e/d44/l46 0
2026-03-09T00:04:06.712 INFO:tasks.workunit.client.0.vm03.stdout:8/609: rename d7/df/d1e to d7/df/d1a/d40/db3/dba 0
2026-03-09T00:04:06.713 INFO:tasks.workunit.client.0.vm03.stdout:8/610: fdatasync d7/df/d1a/d40/db3/dba/d3f/d95/fb4 0
2026-03-09T00:04:06.714 INFO:tasks.workunit.client.1.vm06.stdout:0/879: creat d3/d18/d2c/d2d/d74/dc7/f12c x:0 0 0
2026-03-09T00:04:06.714 INFO:tasks.workunit.client.0.vm03.stdout:1/681: unlink d4/l14 0
2026-03-09T00:04:06.714 INFO:tasks.workunit.client.0.vm03.stdout:1/682: read d4/d3a/d32/f53 [437658,128023] 0
2026-03-09T00:04:06.716 INFO:tasks.workunit.client.0.vm03.stdout:7/574: rmdir d2/d1f/d35 39
2026-03-09T00:04:06.716 INFO:tasks.workunit.client.1.vm06.stdout:2/951: getdents d7/da/d4e/d57 0
2026-03-09T00:04:06.716 INFO:tasks.workunit.client.1.vm06.stdout:2/952: chown d7/da/db/la7 541 1
2026-03-09T00:04:06.721 INFO:tasks.workunit.client.1.vm06.stdout:9/736: rename d1/caa to d1/d3/d4f/d52/de3/cef 0
2026-03-09T00:04:06.723 INFO:tasks.workunit.client.1.vm06.stdout:2/953: write d7/d1b/da5/daa/fad [539235,113394] 0
2026-03-09T00:04:06.723 INFO:tasks.workunit.client.0.vm03.stdout:8/611: write d7/df/d1a/d40/db3/dba/d38/d60/f71 [4106093,29165] 0
2026-03-09T00:04:06.723 INFO:tasks.workunit.client.0.vm03.stdout:8/612: chown d7/df/d1a/f33 0 1
2026-03-09T00:04:06.763 INFO:tasks.workunit.client.0.vm03.stdout:9/635: truncate d15/d1c/d21/d54/d87/d93/fba 463949 0
2026-03-09T00:04:06.764 INFO:tasks.workunit.client.0.vm03.stdout:9/636: fdatasync d15/d1c/d36/d4d/fad 0
2026-03-09T00:04:06.766 INFO:tasks.workunit.client.0.vm03.stdout:6/566: dwrite d13/d1e/d44/f49 [0,4194304] 0
2026-03-09T00:04:06.766 INFO:tasks.workunit.client.0.vm03.stdout:6/567: stat d13/d1e/d44/d59/f6e 0
2026-03-09T00:04:06.766 INFO:tasks.workunit.client.0.vm03.stdout:6/568: readlink d13/d1e/d44/d4a/l6b 0
2026-03-09T00:04:06.766 INFO:tasks.workunit.client.0.vm03.stdout:6/569: fsync f12 0
2026-03-09T00:04:06.766 INFO:tasks.workunit.client.1.vm06.stdout:2/954: dwrite d7/d1a/d25/d66/fa6 [0,4194304] 0
2026-03-09T00:04:06.771 INFO:tasks.workunit.client.0.vm03.stdout:6/570: dread d13/f3a [0,4194304] 0
2026-03-09T00:04:06.772 INFO:tasks.workunit.client.0.vm03.stdout:2/615: dwrite d8/f5d [4194304,4194304] 0
2026-03-09T00:04:06.772 INFO:tasks.workunit.client.1.vm06.stdout:2/955: write d7/da/d4e/d57/f9f [10410049,24865] 0
2026-03-09T00:04:06.775 INFO:tasks.workunit.client.0.vm03.stdout:3/467: rename d2/f8 to d2/db/d2d/f8b 0
2026-03-09T00:04:06.775 INFO:tasks.workunit.client.0.vm03.stdout:3/468: chown d2/db/d2d/d55/f6f 8423 1
2026-03-09T00:04:06.778 INFO:tasks.workunit.client.1.vm06.stdout:1/754: creat d6/d4c/d51/ffe x:0 0 0
2026-03-09T00:04:06.779 INFO:tasks.workunit.client.1.vm06.stdout:3/833: sync
2026-03-09T00:04:06.780 INFO:tasks.workunit.client.0.vm03.stdout:1/683: mkdir d4/d3a/d61/d78/d81/deb 0
2026-03-09T00:04:06.785 INFO:tasks.workunit.client.1.vm06.stdout:7/836: creat d0/df/d1a/d27/d4c/d40/d51/d86/ff6 x:0 0 0
2026-03-09T00:04:06.789 INFO:tasks.workunit.client.1.vm06.stdout:0/880: rmdir d3/d18/d1f/d39/d3b/df9 39
2026-03-09T00:04:06.789 INFO:tasks.workunit.client.1.vm06.stdout:0/881: write d3/d18/d1f/d44/f7c [4293665,114881] 0
2026-03-09T00:04:06.789 INFO:tasks.workunit.client.0.vm03.stdout:7/575: creat d2/d1f/d3a/d24/da4/d46/d81/d96/d37/fb0 x:0 0 0
2026-03-09T00:04:06.789 INFO:tasks.workunit.client.0.vm03.stdout:7/576: read - d2/d4/d1e/fa8 zero size
2026-03-09T00:04:06.791 INFO:tasks.workunit.client.0.vm03.stdout:8/613: mknod d7/df/d1a/cbb 0
2026-03-09T00:04:06.796 INFO:tasks.workunit.client.0.vm03.stdout:7/577: dread d2/d1f/d3a/f1a [0,4194304] 0
2026-03-09T00:04:06.802 INFO:tasks.workunit.client.1.vm06.stdout:9/737: creat d1/d3/d4f/d52/de3/ff0 x:0 0 0
2026-03-09T00:04:06.803 INFO:tasks.workunit.client.0.vm03.stdout:7/578: write d2/d1f/d3a/f5d [3264786,87071] 0
2026-03-09T00:04:06.803 INFO:tasks.workunit.client.0.vm03.stdout:7/579: readlink d2/d4/d1e/l21 0
2026-03-09T00:04:06.803 INFO:tasks.workunit.client.0.vm03.stdout:9/637: mknod d15/d1c/d21/d54/d87/d93/cd1 0
2026-03-09T00:04:06.803 INFO:tasks.workunit.client.0.vm03.stdout:9/638: creat d15/d1c/d36/d4d/fd2 x:0 0 0
2026-03-09T00:04:06.806 INFO:tasks.workunit.client.1.vm06.stdout:6/835: rename d4/d27/d3e/d57 to d4/d16/d53/ddf/d7e/dac/dd3/d101 0
2026-03-09T00:04:06.808 INFO:tasks.workunit.client.1.vm06.stdout:6/836: write d4/d27/f70 [452459,40696] 0
2026-03-09T00:04:06.808 INFO:tasks.workunit.client.1.vm06.stdout:6/837: write d4/fae [642082,20429] 0
2026-03-09T00:04:06.808 INFO:tasks.workunit.client.1.vm06.stdout:6/838: stat d4/d16/d53/ddf/da6/dbb/fed 0
2026-03-09T00:04:06.809 INFO:tasks.workunit.client.0.vm03.stdout:6/571: mkdir d13/d1e/d44/d4a/d52/dbf 0
2026-03-09T00:04:06.810 INFO:tasks.workunit.client.1.vm06.stdout:2/956: symlink d7/da/d93/l129 0
2026-03-09T00:04:06.811 INFO:tasks.workunit.client.1.vm06.stdout:1/755: link d6/d21/d2d/f3c d6/d21/dfc/fff 0
2026-03-09T00:04:06.814 INFO:tasks.workunit.client.1.vm06.stdout:3/834: truncate d11/d28/d2e/dff/f121 2538631 0
2026-03-09T00:04:06.814 INFO:tasks.workunit.client.1.vm06.stdout:3/835: chown d11/d28/d2e/d2f/d36/d8f 7576927 1
2026-03-09T00:04:06.815 INFO:tasks.workunit.client.1.vm06.stdout:3/836: chown d11/d28/d2e/d2f/dc1/cd4 472812 1
2026-03-09T00:04:06.815 INFO:tasks.workunit.client.0.vm03.stdout:3/469: unlink d2/db/d56/c81 0
2026-03-09T00:04:06.815 INFO:tasks.workunit.client.0.vm03.stdout:3/470: stat d2/db/d40/d51/f57 0
2026-03-09T00:04:06.818 INFO:tasks.workunit.client.1.vm06.stdout:7/837: rmdir d0/df/d1a/d27/d4c/d40/d51/d90/dae 39
INFO:tasks.workunit.client.0.vm03.stdout:2/616: dread d8/f3e [0,4194304] 0 2026-03-09T00:04:06.833 INFO:tasks.workunit.client.1.vm06.stdout:3/837: write d11/d28/d2e/d2f/d36/f4e [1420031,51320] 0 2026-03-09T00:04:06.834 INFO:tasks.workunit.client.1.vm06.stdout:7/838: dread d0/df/d1a/d27/d4c/d40/f67 [0,4194304] 0 2026-03-09T00:04:06.835 INFO:tasks.workunit.client.1.vm06.stdout:0/882: mkdir d3/d18/d2c/d2d/d74/dc7/d110/d12d 0 2026-03-09T00:04:06.837 INFO:tasks.workunit.client.0.vm03.stdout:1/684: creat d4/d15/d86/fec x:0 0 0 2026-03-09T00:04:06.837 INFO:tasks.workunit.client.0.vm03.stdout:1/685: stat d4/d15/d77/f7c 0 2026-03-09T00:04:06.838 INFO:tasks.workunit.client.0.vm03.stdout:1/686: write d4/d3a/d3d/fa2 [386841,116491] 0 2026-03-09T00:04:06.849 INFO:tasks.workunit.client.0.vm03.stdout:7/580: mknod d2/d1f/d3a/d24/da4/d91/daa/cb1 0 2026-03-09T00:04:06.849 INFO:tasks.workunit.client.1.vm06.stdout:9/738: getdents d1/d4/d6e 0 2026-03-09T00:04:06.849 INFO:tasks.workunit.client.1.vm06.stdout:2/957: truncate d7/da/db/f74 403441 0 2026-03-09T00:04:06.851 INFO:tasks.workunit.client.1.vm06.stdout:5/934: dwrite d5/d1c/d68/fb4 [0,4194304] 0 2026-03-09T00:04:06.854 INFO:tasks.workunit.client.0.vm03.stdout:7/581: read d2/d4/f13 [4092546,107020] 0 2026-03-09T00:04:06.855 INFO:tasks.workunit.client.0.vm03.stdout:9/639: symlink d15/db6/ld3 0 2026-03-09T00:04:06.855 INFO:tasks.workunit.client.0.vm03.stdout:9/640: fdatasync d15/f26 0 2026-03-09T00:04:06.856 INFO:tasks.workunit.client.1.vm06.stdout:9/739: dread d1/d73/f8f [4194304,4194304] 0 2026-03-09T00:04:06.856 INFO:tasks.workunit.client.1.vm06.stdout:9/740: readlink d1/d4/d6e/d9/l98 0 2026-03-09T00:04:06.858 INFO:tasks.workunit.client.1.vm06.stdout:1/756: mknod d6/d21/d2d/d37/dbc/c100 0 2026-03-09T00:04:06.858 INFO:tasks.workunit.client.1.vm06.stdout:1/757: chown d6/d21/fe7 98745 1 2026-03-09T00:04:06.859 INFO:tasks.workunit.client.0.vm03.stdout:6/572: getdents d13/d1e/d44/d4a 0 2026-03-09T00:04:06.862 INFO:tasks.workunit.client.0.vm03.stdout:9/641: write d15/f7b [2186385,114202] 0 2026-03-09T00:04:06.864 INFO:tasks.workunit.client.1.vm06.stdout:9/741: dread d1/d4/d2f/f43 [0,4194304] 0 2026-03-09T00:04:06.874 INFO:tasks.workunit.client.0.vm03.stdout:3/471: dwrite d2/f9 [0,4194304] 0 2026-03-09T00:04:06.876 INFO:tasks.workunit.client.0.vm03.stdout:0/604: rename d2/da/c99 to d2/da/d36/ce1 0 2026-03-09T00:04:06.876 INFO:tasks.workunit.client.0.vm03.stdout:0/605: chown d2/da/dd/d49/d6c/d4b/daf/fc6 5138 1 2026-03-09T00:04:06.879 INFO:tasks.workunit.client.1.vm06.stdout:6/839: dwrite d4/d27/f4e [0,4194304] 0 2026-03-09T00:04:06.879 INFO:tasks.workunit.client.1.vm06.stdout:6/840: getdents d4/d27/d3e/d45/dea 0 2026-03-09T00:04:06.879 INFO:tasks.workunit.client.0.vm03.stdout:3/472: dread d2/f4e [0,4194304] 0 2026-03-09T00:04:06.884 INFO:tasks.workunit.client.0.vm03.stdout:0/606: dread d2/da/dd/f24 [0,4194304] 0 2026-03-09T00:04:06.895 INFO:tasks.workunit.client.0.vm03.stdout:2/617: mknod d8/d1b/d2a/d56/cc4 0 2026-03-09T00:04:06.900 INFO:tasks.workunit.client.0.vm03.stdout:0/607: dread d2/da/dd/f75 [0,4194304] 0 2026-03-09T00:04:06.903 INFO:tasks.workunit.client.1.vm06.stdout:3/838: rename d11/d28/d2e/db2/dc2/f114 to d11/d28/d2e/dff/f123 0 2026-03-09T00:04:06.909 INFO:tasks.workunit.client.1.vm06.stdout:9/742: dwrite d1/f78 [0,4194304] 0 2026-03-09T00:04:06.909 INFO:tasks.workunit.client.1.vm06.stdout:7/839: mknod d0/df/d1a/d3f/d53/cf7 0 2026-03-09T00:04:06.909 INFO:tasks.workunit.client.0.vm03.stdout:4/721: sync 2026-03-09T00:04:06.911 
INFO:tasks.workunit.client.0.vm03.stdout:1/687: rmdir d4/d3a 39 2026-03-09T00:04:06.911 INFO:tasks.workunit.client.0.vm03.stdout:1/688: write d4/d3a/d3d/d46/f70 [429845,115036] 0 2026-03-09T00:04:06.911 INFO:tasks.workunit.client.0.vm03.stdout:1/689: chown d4/d15/d5c/f5f 30 1 2026-03-09T00:04:06.922 INFO:tasks.workunit.client.1.vm06.stdout:0/883: mkdir d3/d12e 0 2026-03-09T00:04:06.926 INFO:tasks.workunit.client.1.vm06.stdout:0/884: dread d3/fa [0,4194304] 0 2026-03-09T00:04:06.933 INFO:tasks.workunit.client.0.vm03.stdout:4/722: read d7/d20/d29/d38/da9/ddc/f50 [1952967,73341] 0 2026-03-09T00:04:06.933 INFO:tasks.workunit.client.0.vm03.stdout:8/614: getdents d7/df/d1a/d40/d9d/da3 0 2026-03-09T00:04:06.933 INFO:tasks.workunit.client.0.vm03.stdout:8/615: creat d7/df/d1a/d40/d58/fbc x:0 0 0 2026-03-09T00:04:06.938 INFO:tasks.workunit.client.0.vm03.stdout:7/582: mkdir d2/d1f/d3a/d24/da4/d46/d81/d96/d8e/db2 0 2026-03-09T00:04:06.939 INFO:tasks.workunit.client.0.vm03.stdout:5/591: sync 2026-03-09T00:04:06.939 INFO:tasks.workunit.client.0.vm03.stdout:6/573: rmdir d13/d35/d74/d89/d9d 39 2026-03-09T00:04:06.956 INFO:tasks.workunit.client.1.vm06.stdout:5/935: rename d5/d1c/d68/dec/d115/d11e/d92/d49/c94 to d5/d1c/d68/dec/d115/d11e/d92/d49/c137 0 2026-03-09T00:04:06.956 INFO:tasks.workunit.client.1.vm06.stdout:5/936: stat d5/d1c 0 2026-03-09T00:04:06.956 INFO:tasks.workunit.client.1.vm06.stdout:5/937: write d5/d1c/d68/dec/d115/d11e/d92/d49/da0/ffc [549719,36587] 0 2026-03-09T00:04:06.959 INFO:tasks.workunit.client.0.vm03.stdout:2/618: mkdir d8/d26/d5e/dc5 0 2026-03-09T00:04:06.961 INFO:tasks.workunit.client.1.vm06.stdout:1/758: creat d6/d8f/f101 x:0 0 0 2026-03-09T00:04:06.963 INFO:tasks.workunit.client.1.vm06.stdout:6/841: rmdir d4/d16/d53/ddf/d4b/ddb 39 2026-03-09T00:04:06.967 INFO:tasks.workunit.client.1.vm06.stdout:8/853: sync 2026-03-09T00:04:06.968 INFO:tasks.workunit.client.1.vm06.stdout:4/858: sync 2026-03-09T00:04:06.971 INFO:tasks.workunit.client.1.vm06.stdout:2/958: rename d7/fc2 to d7/d1b/d71/f12a 0 2026-03-09T00:04:06.977 INFO:tasks.workunit.client.0.vm03.stdout:2/619: dread d8/d1b/f31 [0,4194304] 0 2026-03-09T00:04:06.978 INFO:tasks.workunit.client.1.vm06.stdout:6/842: dread d4/d27/d3e/d78/f92 [0,4194304] 0 2026-03-09T00:04:06.990 INFO:tasks.workunit.client.1.vm06.stdout:8/854: dread db/d1e/f82 [0,4194304] 0 2026-03-09T00:04:06.994 INFO:tasks.workunit.client.0.vm03.stdout:1/690: dwrite d4/d3a/d32/dc2/fdd [0,4194304] 0 2026-03-09T00:04:06.998 INFO:tasks.workunit.client.0.vm03.stdout:1/691: write d4/d6/f20 [1446195,769] 0 2026-03-09T00:04:07.006 INFO:tasks.workunit.client.1.vm06.stdout:3/839: mknod d11/d28/d4d/d89/d90/dd2/c124 0 2026-03-09T00:04:07.006 INFO:tasks.workunit.client.1.vm06.stdout:3/840: readlink d11/d28/l97 0 2026-03-09T00:04:07.014 INFO:tasks.workunit.client.1.vm06.stdout:9/743: symlink d1/d73/dcf/lf1 0 2026-03-09T00:04:07.015 INFO:tasks.workunit.client.0.vm03.stdout:4/723: write d7/d20/d6a/d77/f82 [980192,91690] 0 2026-03-09T00:04:07.019 INFO:tasks.workunit.client.1.vm06.stdout:7/840: getdents d0/df/d1a/d35 0 2026-03-09T00:04:07.019 INFO:tasks.workunit.client.0.vm03.stdout:6/574: dwrite d13/d1e/d44/d59/d77/f96 [0,4194304] 0 2026-03-09T00:04:07.023 INFO:tasks.workunit.client.1.vm06.stdout:0/885: rmdir d3/d18/d2c/d2d/d74/daf/d10d/d127 0 2026-03-09T00:04:07.028 INFO:tasks.workunit.client.1.vm06.stdout:5/938: unlink d5/d1c/d23/d34/c109 0 2026-03-09T00:04:07.029 INFO:tasks.workunit.client.0.vm03.stdout:8/616: unlink d7/df/d1a/d40/db3/dba/d38/d91/l9a 0 2026-03-09T00:04:07.030 
INFO:tasks.workunit.client.0.vm03.stdout:7/583: creat d2/d1f/d3a/d24/da4/d91/daa/fb3 x:0 0 0 2026-03-09T00:04:07.030 INFO:tasks.workunit.client.0.vm03.stdout:7/584: chown d2/d1f/l20 9102 1 2026-03-09T00:04:07.030 INFO:tasks.workunit.client.1.vm06.stdout:1/759: creat d6/dc4/f102 x:0 0 0 2026-03-09T00:04:07.030 INFO:tasks.workunit.client.1.vm06.stdout:1/760: fsync d6/d4c/d79/fee 0 2026-03-09T00:04:07.030 INFO:tasks.workunit.client.1.vm06.stdout:4/859: mknod d17/d24/d3b/d97/db7/df1/c128 0 2026-03-09T00:04:07.030 INFO:tasks.workunit.client.1.vm06.stdout:2/959: creat d7/d1a/f12b x:0 0 0 2026-03-09T00:04:07.030 INFO:tasks.workunit.client.1.vm06.stdout:2/960: chown d7/da/d4e/d57/l118 16300735 1 2026-03-09T00:04:07.030 INFO:tasks.workunit.client.1.vm06.stdout:1/761: write d6/d21/d2d/fe9 [2581133,29582] 0 2026-03-09T00:04:07.030 INFO:tasks.workunit.client.1.vm06.stdout:1/762: creat d6/d8f/f103 x:0 0 0 2026-03-09T00:04:07.041 INFO:tasks.workunit.client.1.vm06.stdout:8/855: mknod db/c111 0 2026-03-09T00:04:07.050 INFO:tasks.workunit.client.1.vm06.stdout:3/841: mkdir d11/d28/d125 0 2026-03-09T00:04:07.051 INFO:tasks.workunit.client.0.vm03.stdout:9/642: getdents d15/d1c/d28/d6e/da2 0 2026-03-09T00:04:07.063 INFO:tasks.workunit.client.1.vm06.stdout:9/744: creat d1/d73/dcf/ff2 x:0 0 0 2026-03-09T00:04:07.064 INFO:tasks.workunit.client.0.vm03.stdout:0/608: symlink d2/da/d36/da4/le2 0 2026-03-09T00:04:07.066 INFO:tasks.workunit.client.0.vm03.stdout:2/620: dwrite d8/d1b/d2a/d6b/f87 [0,4194304] 0 2026-03-09T00:04:07.070 INFO:tasks.workunit.client.1.vm06.stdout:7/841: symlink d0/df/d1a/d3f/lf8 0 2026-03-09T00:04:07.076 INFO:tasks.workunit.client.1.vm06.stdout:7/842: readlink d0/df/d1a/d35/lc2 0 2026-03-09T00:04:07.079 INFO:tasks.workunit.client.0.vm03.stdout:9/643: dread d15/f26 [0,4194304] 0 2026-03-09T00:04:07.079 INFO:tasks.workunit.client.0.vm03.stdout:9/644: creat d15/d1c/d28/d6e/fd4 x:0 0 0 2026-03-09T00:04:07.079 INFO:tasks.workunit.client.0.vm03.stdout:9/645: truncate d15/d1c/d21/d54/d87/d93/f7e 4950539 0 2026-03-09T00:04:07.080 INFO:tasks.workunit.client.0.vm03.stdout:4/724: mkdir d7/de6 0 2026-03-09T00:04:07.084 INFO:tasks.workunit.client.1.vm06.stdout:0/886: unlink d3/d18/d1f/d44/leb 0 2026-03-09T00:04:07.090 INFO:tasks.workunit.client.0.vm03.stdout:6/575: symlink d13/d1e/d44/d4a/d52/lc0 0 2026-03-09T00:04:07.090 INFO:tasks.workunit.client.0.vm03.stdout:6/576: fsync d13/f1a 0 2026-03-09T00:04:07.096 INFO:tasks.workunit.client.1.vm06.stdout:5/939: unlink d5/d1c/d21/d28/d5e/d66/d78/dc8/daa/fe2 0 2026-03-09T00:04:07.100 INFO:tasks.workunit.client.0.vm03.stdout:8/617: mknod d7/df/d1a/d40/db3/dba/d38/d4c/d98/cbd 0 2026-03-09T00:04:07.105 INFO:tasks.workunit.client.0.vm03.stdout:7/585: creat d2/d4/d1e/d5e/d7e/fb4 x:0 0 0 2026-03-09T00:04:07.110 INFO:tasks.workunit.client.1.vm06.stdout:2/961: creat d7/da/d63/d81/dfe/db2/d102/d119/f12c x:0 0 0 2026-03-09T00:04:07.113 INFO:tasks.workunit.client.0.vm03.stdout:0/609: rmdir d2/da 39 2026-03-09T00:04:07.121 INFO:tasks.workunit.client.0.vm03.stdout:1/692: dwrite d4/d3a/d32/f4f [0,4194304] 0 2026-03-09T00:04:07.122 INFO:tasks.workunit.client.1.vm06.stdout:2/962: symlink d7/da/d63/d81/dfe/db2/l12d 0 2026-03-09T00:04:07.124 INFO:tasks.workunit.client.0.vm03.stdout:9/646: rename d15/f9a to d15/d1c/fd5 0 2026-03-09T00:04:07.125 INFO:tasks.workunit.client.0.vm03.stdout:6/577: rmdir d13/d35/d71 39 2026-03-09T00:04:07.125 INFO:tasks.workunit.client.0.vm03.stdout:6/578: fdatasync d13/f70 0 2026-03-09T00:04:07.126 INFO:tasks.workunit.client.0.vm03.stdout:8/618: dread 
d7/df/d1a/d40/db3/dba/d38/d60/f6e [0,4194304] 0 2026-03-09T00:04:07.127 INFO:tasks.workunit.client.0.vm03.stdout:0/610: truncate d2/da/dd/d49/d6c/d4b/d55/f83 1479475 0 2026-03-09T00:04:07.129 INFO:tasks.workunit.client.0.vm03.stdout:0/611: write d2/d71/f77 [960073,82816] 0 2026-03-09T00:04:07.129 INFO:tasks.workunit.client.0.vm03.stdout:2/621: mkdir d8/d1b/d2a/d6b/dc6 0 2026-03-09T00:04:07.129 INFO:tasks.workunit.client.0.vm03.stdout:2/622: chown d8/d26/d5e/d5f/lb9 88 1 2026-03-09T00:04:07.137 INFO:tasks.workunit.client.0.vm03.stdout:5/592: dwrite f12 [0,4194304] 0 2026-03-09T00:04:07.139 INFO:tasks.workunit.client.0.vm03.stdout:1/693: mknod d4/d15/d77/d8c/ced 0 2026-03-09T00:04:07.148 INFO:tasks.workunit.client.0.vm03.stdout:1/694: fdatasync d4/d5e/f82 0 2026-03-09T00:04:07.148 INFO:tasks.workunit.client.0.vm03.stdout:9/647: unlink d15/d7f/fb3 0 2026-03-09T00:04:07.149 INFO:tasks.workunit.client.0.vm03.stdout:6/579: mkdir d13/d35/d71/d97/da5/dc1 0 2026-03-09T00:04:07.149 INFO:tasks.workunit.client.0.vm03.stdout:5/593: mknod d1c/d20/d55/d4f/d58/d73/cbf 0 2026-03-09T00:04:07.149 INFO:tasks.workunit.client.0.vm03.stdout:9/648: creat d15/d1c/d21/d54/d87/fd6 x:0 0 0 2026-03-09T00:04:07.149 INFO:tasks.workunit.client.0.vm03.stdout:6/580: mknod d13/d8f/cc2 0 2026-03-09T00:04:07.149 INFO:tasks.workunit.client.0.vm03.stdout:5/594: mkdir d1c/d20/dc0 0 2026-03-09T00:04:07.149 INFO:tasks.workunit.client.0.vm03.stdout:5/595: truncate d1c/d20/d56/db4/fb7 382874 0 2026-03-09T00:04:07.168 INFO:tasks.workunit.client.0.vm03.stdout:1/695: rename d4/d3a/d61/d78/d81 to d4/d3a/d3d/d98/dee 0 2026-03-09T00:04:07.168 INFO:tasks.workunit.client.0.vm03.stdout:1/696: fdatasync d4/d3a/d61/f75 0 2026-03-09T00:04:07.168 INFO:tasks.workunit.client.0.vm03.stdout:9/649: getdents d15/d1c/d36/d4d/dc4 0 2026-03-09T00:04:07.168 INFO:tasks.workunit.client.0.vm03.stdout:9/650: fsync d15/d1c/d36/fa8 0 2026-03-09T00:04:07.168 INFO:tasks.workunit.client.0.vm03.stdout:9/651: chown d15/d1c/d21/d54/d87/d93/l92 407928 1 2026-03-09T00:04:07.170 INFO:tasks.workunit.client.0.vm03.stdout:1/697: mkdir d4/d3a/d3d/d98/dee/d9e/def 0 2026-03-09T00:04:07.170 INFO:tasks.workunit.client.0.vm03.stdout:1/698: fsync d4/d15/d1a/f1d 0 2026-03-09T00:04:07.172 INFO:tasks.workunit.client.0.vm03.stdout:9/652: symlink d15/d1c/d21/d54/dab/ld7 0 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:1/699: mknod d4/d15/d77/dce/cf0 0 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:1/700: creat d4/d3a/d8f/ff1 x:0 0 0 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:1/701: chown d4/d3a/d32/dc2/cdf 67911576 1 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:1/702: fdatasync d4/d3a/d43/daf/fbf 0 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:5/596: rename d1c/d20/d55/f52 to d1c/d20/d55/d4f/fc1 0 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:5/597: creat d1c/d51/d6a/fc2 x:0 0 0 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:5/598: fsync d1c/d20/d55/d4f/f69 0 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:1/703: rename d4/d3a/d32/d87/fa5 to d4/d6/d52/db5/ff2 0 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:7/586: dread d2/d1f/d3a/d24/da4/d46/d81/d96/d37/f4c [0,4194304] 0 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:7/587: fsync d2/f50 0 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:7/588: fsync d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/d9c/f74 0 2026-03-09T00:04:07.192 
INFO:tasks.workunit.client.0.vm03.stdout:7/589: read - d2/d4/d1e/d78/fa5 zero size 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:7/590: chown d2/d1f/d3a/d24/da4/d91/d67/f95 597223 1 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:1/704: rmdir d4/d15/dae 39 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:1/705: write d4/d3a/d61/d78/f94 [1473883,34958] 0 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:1/706: write d4/d6/f6e [904247,19729] 0 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:7/591: symlink d2/d4/d1e/d5e/lb5 0 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:1/707: mkdir d4/d15/d77/dce/dd9/df3 0 2026-03-09T00:04:07.192 INFO:tasks.workunit.client.0.vm03.stdout:7/592: creat d2/d4/d1e/d85/fb6 x:0 0 0 2026-03-09T00:04:07.193 INFO:tasks.workunit.client.0.vm03.stdout:7/593: getdents d2/d4 0 2026-03-09T00:04:07.195 INFO:tasks.workunit.client.1.vm06.stdout:8/856: dwrite db/d74/d78/fd2 [0,4194304] 0 2026-03-09T00:04:07.196 INFO:tasks.workunit.client.0.vm03.stdout:9/653: write d15/d1c/d36/f86 [504196,10282] 0 2026-03-09T00:04:07.206 INFO:tasks.workunit.client.0.vm03.stdout:7/594: write d2/d1f/d3a/d24/da4/d46/d81/d96/d37/f4c [141911,103089] 0 2026-03-09T00:04:07.206 INFO:tasks.workunit.client.0.vm03.stdout:7/595: write d2/d1f/d3a/d24/da4/f47 [4663509,13232] 0 2026-03-09T00:04:07.217 INFO:tasks.workunit.client.0.vm03.stdout:5/599: dread d1c/d20/d55/f46 [4194304,4194304] 0 2026-03-09T00:04:07.218 INFO:tasks.workunit.client.0.vm03.stdout:5/600: creat d1c/d20/dc0/fc3 x:0 0 0 2026-03-09T00:04:07.218 INFO:tasks.workunit.client.0.vm03.stdout:5/601: readlink d1c/d20/d55/l35 0 2026-03-09T00:04:07.218 INFO:tasks.workunit.client.0.vm03.stdout:5/602: creat d1c/fc4 x:0 0 0 2026-03-09T00:04:07.233 INFO:tasks.workunit.client.1.vm06.stdout:0/887: dwrite d3/d18/d2c/d2d/d8c/fb5 [0,4194304] 0 2026-03-09T00:04:07.239 INFO:tasks.workunit.client.1.vm06.stdout:1/763: dwrite d6/d21/d2d/d37/f86 [0,4194304] 0 2026-03-09T00:04:07.239 INFO:tasks.workunit.client.1.vm06.stdout:1/764: dread d6/d21/f2e [4194304,4194304] 0 2026-03-09T00:04:07.239 INFO:tasks.workunit.client.1.vm06.stdout:1/765: truncate d6/d21/d2d/d3b/d42/f9a 1650068 0 2026-03-09T00:04:07.243 INFO:tasks.workunit.client.1.vm06.stdout:1/766: creat d6/d21/dfc/de8/f104 x:0 0 0 2026-03-09T00:04:07.245 INFO:tasks.workunit.client.1.vm06.stdout:1/767: creat d6/dc4/f105 x:0 0 0 2026-03-09T00:04:07.245 INFO:tasks.workunit.client.1.vm06.stdout:1/768: truncate d6/d21/d2d/d37/f8b 2112408 0 2026-03-09T00:04:07.250 INFO:tasks.workunit.client.1.vm06.stdout:1/769: mkdir d6/d21/d2d/d3b/d42/df0/d106 0 2026-03-09T00:04:07.250 INFO:tasks.workunit.client.1.vm06.stdout:1/770: chown d6/f98 6787402 1 2026-03-09T00:04:07.250 INFO:tasks.workunit.client.1.vm06.stdout:1/771: dread - d6/dc4/f105 zero size 2026-03-09T00:04:07.251 INFO:tasks.workunit.client.1.vm06.stdout:1/772: truncate d6/d4c/d71/d83/f9b 639368 0 2026-03-09T00:04:07.251 INFO:tasks.workunit.client.1.vm06.stdout:1/773: chown d6/d63/f82 612153 1 2026-03-09T00:04:07.273 INFO:tasks.workunit.client.0.vm03.stdout:8/619: dwrite d7/df/d1a/d40/db3/dba/d3f/f8f [0,4194304] 0 2026-03-09T00:04:07.273 INFO:tasks.workunit.client.0.vm03.stdout:8/620: truncate d7/df/d1a/d40/d58/fbc 796175 0 2026-03-09T00:04:07.273 INFO:tasks.workunit.client.0.vm03.stdout:8/621: dread - d7/df/d1a/d40/db3/dba/d3f/d95/fb4 zero size 2026-03-09T00:04:07.273 INFO:tasks.workunit.client.0.vm03.stdout:8/622: chown d7/la 7 1 2026-03-09T00:04:07.275 
INFO:tasks.workunit.client.0.vm03.stdout:8/623: getdents d7/df 0 2026-03-09T00:04:07.275 INFO:tasks.workunit.client.0.vm03.stdout:8/624: dread - d7/df/d1a/d2b/f72 zero size 2026-03-09T00:04:07.275 INFO:tasks.workunit.client.0.vm03.stdout:8/625: write d7/df/d1a/d40/d58/f57 [797668,61153] 0 2026-03-09T00:04:07.276 INFO:tasks.workunit.client.0.vm03.stdout:8/626: rename d7/df/d1a/d40/db3/dba/d3f/d95/c99 to d7/df/d1a/d40/db3/dba/dad/cbe 0 2026-03-09T00:04:07.279 INFO:tasks.workunit.client.0.vm03.stdout:8/627: creat d7/df/d1a/d40/db3/dba/d38/d91/fbf x:0 0 0 2026-03-09T00:04:07.281 INFO:tasks.workunit.client.1.vm06.stdout:7/843: dwrite d0/df/d1a/d27/d70/fc4 [0,4194304] 0 2026-03-09T00:04:07.282 INFO:tasks.workunit.client.1.vm06.stdout:7/844: rename d0/d55/d99/fcb to d0/df/d1a/d27/d4c/ff9 0 2026-03-09T00:04:07.282 INFO:tasks.workunit.client.1.vm06.stdout:7/845: creat d0/df/d1a/d3f/de8/ffa x:0 0 0 2026-03-09T00:04:07.288 INFO:tasks.workunit.client.1.vm06.stdout:7/846: rename d0/df/d1a/d3a/d4e/d5e/cbb to d0/df/d1a/d27/d4c/d40/d51/cfb 0 2026-03-09T00:04:07.323 INFO:tasks.workunit.client.0.vm03.stdout:9/654: dwrite d15/d1c/d9c/fc9 [0,4194304] 0 2026-03-09T00:04:07.325 INFO:tasks.workunit.client.0.vm03.stdout:9/655: mknod d15/d1c/d36/d4d/cd8 0 2026-03-09T00:04:07.325 INFO:tasks.workunit.client.0.vm03.stdout:9/656: creat d15/d1c/d36/d4d/fd9 x:0 0 0 2026-03-09T00:04:07.344 INFO:tasks.workunit.client.0.vm03.stdout:9/657: write d15/d1c/d36/f4a [1344630,129300] 0 2026-03-09T00:04:07.355 INFO:tasks.workunit.client.0.vm03.stdout:2/623: dwrite d8/d1b/d2a/f33 [4194304,4194304] 0 2026-03-09T00:04:07.356 INFO:tasks.workunit.client.0.vm03.stdout:4/725: dwrite d7/d20/d6a/d77/db7/f9f [0,4194304] 0 2026-03-09T00:04:07.356 INFO:tasks.workunit.client.0.vm03.stdout:4/726: fdatasync d7/d20/d29/d38/da9/ddc/f7c 0 2026-03-09T00:04:07.359 INFO:tasks.workunit.client.0.vm03.stdout:6/581: dwrite d13/d35/d69/fac [0,4194304] 0 2026-03-09T00:04:07.360 INFO:tasks.workunit.client.1.vm06.stdout:3/842: dwrite d11/d28/d4d/f6e [0,4194304] 0 2026-03-09T00:04:07.360 INFO:tasks.workunit.client.1.vm06.stdout:3/843: dread - d11/d28/d2e/d2f/f64 zero size 2026-03-09T00:04:07.367 INFO:tasks.workunit.client.1.vm06.stdout:3/844: creat d11/d28/d2e/dff/f126 x:0 0 0 2026-03-09T00:04:07.369 INFO:tasks.workunit.client.1.vm06.stdout:3/845: stat d11/d28/d2e/d2f/d36/f4a 0 2026-03-09T00:04:07.370 INFO:tasks.workunit.client.0.vm03.stdout:6/582: rename d13/d35/d74/f81 to d13/d1e/fc3 0 2026-03-09T00:04:07.372 INFO:tasks.workunit.client.1.vm06.stdout:3/846: link d11/f24 d11/d28/d2e/d2f/d5b/db5/f127 0 2026-03-09T00:04:07.372 INFO:tasks.workunit.client.1.vm06.stdout:3/847: fsync d11/d28/d2e/db2/f101 0 2026-03-09T00:04:07.376 INFO:tasks.workunit.client.1.vm06.stdout:3/848: dread d11/d28/f3a [0,4194304] 0 2026-03-09T00:04:07.377 INFO:tasks.workunit.client.1.vm06.stdout:3/849: symlink d11/d28/d4d/d89/d90/dd2/dfd/l128 0 2026-03-09T00:04:07.378 INFO:tasks.workunit.client.1.vm06.stdout:3/850: mkdir d11/d3f/d129 0 2026-03-09T00:04:07.383 INFO:tasks.workunit.client.0.vm03.stdout:6/583: dread d13/d1e/f21 [0,4194304] 0 2026-03-09T00:04:07.384 INFO:tasks.workunit.client.0.vm03.stdout:6/584: mkdir d13/dc4 0 2026-03-09T00:04:07.384 INFO:tasks.workunit.client.0.vm03.stdout:6/585: write d13/d35/f9e [258017,32476] 0 2026-03-09T00:04:07.384 INFO:tasks.workunit.client.0.vm03.stdout:6/586: creat d13/d35/d74/fc5 x:0 0 0 2026-03-09T00:04:07.384 INFO:tasks.workunit.client.0.vm03.stdout:6/587: chown d13/d8f/cc2 951741 1 2026-03-09T00:04:07.395 
INFO:tasks.workunit.client.0.vm03.stdout:0/612: dwrite d2/da/dd/d49/d6c/d4b/d55/f60 [0,4194304] 0 2026-03-09T00:04:07.400 INFO:tasks.workunit.client.0.vm03.stdout:0/613: stat d2/da/dd/d49/d6c/d4b/l50 0 2026-03-09T00:04:07.400 INFO:tasks.workunit.client.0.vm03.stdout:0/614: rmdir d2/d5a 39 2026-03-09T00:04:07.400 INFO:tasks.workunit.client.0.vm03.stdout:0/615: rename d2/da/dd/c18 to d2/da/d36/da4/ce3 0 2026-03-09T00:04:07.400 INFO:tasks.workunit.client.0.vm03.stdout:0/616: write d2/da/dd/d49/d6c/d4b/f88 [1707336,21961] 0 2026-03-09T00:04:07.403 INFO:tasks.workunit.client.0.vm03.stdout:0/617: dread d2/da/d1a/f25 [0,4194304] 0 2026-03-09T00:04:07.425 INFO:tasks.workunit.client.0.vm03.stdout:8/628: getdents d7/df/d1a/d40/d58 0 2026-03-09T00:04:07.425 INFO:tasks.workunit.client.0.vm03.stdout:8/629: symlink d7/df/d1a/d40/db3/dba/lc0 0 2026-03-09T00:04:07.425 INFO:tasks.workunit.client.0.vm03.stdout:8/630: fsync d7/df/d1a/d40/d58/fbc 0 2026-03-09T00:04:07.426 INFO:tasks.workunit.client.0.vm03.stdout:8/631: truncate d7/df/d1a/d40/db3/dba/d3f/f47 1156638 0 2026-03-09T00:04:07.426 INFO:tasks.workunit.client.0.vm03.stdout:8/632: stat d7/df/d1a/d40/db3/dba/d38/d60/cac 0 2026-03-09T00:04:07.426 INFO:tasks.workunit.client.0.vm03.stdout:8/633: chown d7/df/d1a/d40/db3/dba/d38 309095507 1 2026-03-09T00:04:07.427 INFO:tasks.workunit.client.0.vm03.stdout:8/634: creat d7/df/d1a/d40/db3/dba/d38/d4c/fc1 x:0 0 0 2026-03-09T00:04:07.427 INFO:tasks.workunit.client.0.vm03.stdout:8/635: write f6 [1682479,113297] 0 2026-03-09T00:04:07.427 INFO:tasks.workunit.client.0.vm03.stdout:8/636: fsync f3 0 2026-03-09T00:04:07.429 INFO:tasks.workunit.client.0.vm03.stdout:8/637: mkdir d7/df/d1a/d40/db3/dba/d38/d91/dc2 0 2026-03-09T00:04:07.431 INFO:tasks.workunit.client.0.vm03.stdout:8/638: mkdir d7/df/d1a/d40/db3/dba/dc3 0 2026-03-09T00:04:07.450 INFO:tasks.workunit.client.1.vm06.stdout:1/774: dwrite d6/d4c/d71/fea [0,4194304] 0 2026-03-09T00:04:07.450 INFO:tasks.workunit.client.1.vm06.stdout:1/775: write d6/d21/fd4 [386071,14691] 0 2026-03-09T00:04:07.450 INFO:tasks.workunit.client.1.vm06.stdout:1/776: read - d6/d21/d2d/d3b/d87/d9d/dd8/fe6 zero size 2026-03-09T00:04:07.470 INFO:tasks.workunit.client.1.vm06.stdout:9/745: dwrite d1/d4/fe [0,4194304] 0 2026-03-09T00:04:07.472 INFO:tasks.workunit.client.1.vm06.stdout:9/746: truncate d1/d3/d4f/d52/f5e 4556807 0 2026-03-09T00:04:07.472 INFO:tasks.workunit.client.1.vm06.stdout:9/747: fsync d1/d3/d4f/d52/de3/de5/fed 0 2026-03-09T00:04:07.472 INFO:tasks.workunit.client.1.vm06.stdout:9/748: mknod d1/d3/d4f/d52/de3/cf3 0 2026-03-09T00:04:07.496 INFO:tasks.workunit.client.0.vm03.stdout:2/624: dwrite d8/d1b/d24/da5/da8/fba [0,4194304] 0 2026-03-09T00:04:07.497 INFO:tasks.workunit.client.1.vm06.stdout:0/888: dwrite d3/d18/d2c/d2d/d74/dc7/d110/f103 [0,4194304] 0 2026-03-09T00:04:07.497 INFO:tasks.workunit.client.1.vm06.stdout:0/889: fsync d3/d18/d1f/d39/fb1 0 2026-03-09T00:04:07.497 INFO:tasks.workunit.client.1.vm06.stdout:0/890: creat d3/d18/d1f/d39/d3b/f12f x:0 0 0 2026-03-09T00:04:07.497 INFO:tasks.workunit.client.1.vm06.stdout:0/891: write d3/d18/de9/f100 [348924,119831] 0 2026-03-09T00:04:07.497 INFO:tasks.workunit.client.0.vm03.stdout:1/708: dwrite d4/d15/d77/f7a [0,4194304] 0 2026-03-09T00:04:07.498 INFO:tasks.workunit.client.0.vm03.stdout:2/625: link d8/d1b/d2a/d6b/d50/f54 d8/d74/fc7 0 2026-03-09T00:04:07.502 INFO:tasks.workunit.client.0.vm03.stdout:4/727: dwrite d7/d20/d29/fa4 [0,4194304] 0 2026-03-09T00:04:07.508 INFO:tasks.workunit.client.0.vm03.stdout:2/626: rename 
d8/d1b/d2a/d2e/f94 to d8/d1b/d2a/d6b/d50/fc8 0 2026-03-09T00:04:07.513 INFO:tasks.workunit.client.1.vm06.stdout:0/892: rename d3/d18/d1f/d39/d49/d60/f113 to d3/d18/d1f/f130 0 2026-03-09T00:04:07.513 INFO:tasks.workunit.client.1.vm06.stdout:0/893: write d3/d18/d2c/d2d/d31/f5d [1463859,14731] 0 2026-03-09T00:04:07.514 INFO:tasks.workunit.client.0.vm03.stdout:0/618: dwrite d2/da/dd/d49/d6c/d4b/f4c [0,4194304] 0 2026-03-09T00:04:07.516 INFO:tasks.workunit.client.1.vm06.stdout:0/894: creat d3/d18/d2c/d2d/d74/daf/d10d/f131 x:0 0 0 2026-03-09T00:04:07.522 INFO:tasks.workunit.client.0.vm03.stdout:3/473: sync 2026-03-09T00:04:07.522 INFO:tasks.workunit.client.0.vm03.stdout:2/627: dread d8/f59 [0,4194304] 0 2026-03-09T00:04:07.522 INFO:tasks.workunit.client.0.vm03.stdout:0/619: symlink d2/da/dd/d6e/le4 0 2026-03-09T00:04:07.523 INFO:tasks.workunit.client.0.vm03.stdout:3/474: creat d2/f8c x:0 0 0 2026-03-09T00:04:07.523 INFO:tasks.workunit.client.0.vm03.stdout:3/475: write d2/db/d3b/d5d/d6d/d72/f7a [550388,23109] 0 2026-03-09T00:04:07.523 INFO:tasks.workunit.client.0.vm03.stdout:3/476: write d2/db/d3b/d5d/d6d/d72/f7a [1182219,11380] 0 2026-03-09T00:04:07.524 INFO:tasks.workunit.client.0.vm03.stdout:2/628: mkdir d8/d1b/d24/da5/dc9 0 2026-03-09T00:04:07.524 INFO:tasks.workunit.client.0.vm03.stdout:2/629: readlink d8/d1b/d2a/d2e/l7d 0 2026-03-09T00:04:07.525 INFO:tasks.workunit.client.0.vm03.stdout:0/620: creat d2/da/d36/fe5 x:0 0 0 2026-03-09T00:04:07.526 INFO:tasks.workunit.client.0.vm03.stdout:3/477: creat d2/db/d3b/d5d/f8d x:0 0 0 2026-03-09T00:04:07.530 INFO:tasks.workunit.client.0.vm03.stdout:3/478: dread - d2/db/f7e zero size 2026-03-09T00:04:07.530 INFO:tasks.workunit.client.0.vm03.stdout:2/630: rename d8/d26/d5e/d5f/c83 to d8/d1b/d6c/cca 0 2026-03-09T00:04:07.530 INFO:tasks.workunit.client.0.vm03.stdout:2/631: write d8/f9 [1414820,77036] 0 2026-03-09T00:04:07.530 INFO:tasks.workunit.client.0.vm03.stdout:2/632: fdatasync d8/d1b/f30 0 2026-03-09T00:04:07.538 INFO:tasks.workunit.client.0.vm03.stdout:8/639: dwrite d7/df/d1a/d40/db3/dba/d38/f3e [0,4194304] 0 2026-03-09T00:04:07.538 INFO:tasks.workunit.client.0.vm03.stdout:8/640: dread - d7/df/d1a/d40/db3/dba/d38/f85 zero size 2026-03-09T00:04:07.545 INFO:tasks.workunit.client.1.vm06.stdout:0/895: dread d3/d18/d2c/d2d/f46 [0,4194304] 0 2026-03-09T00:04:07.549 INFO:tasks.workunit.client.1.vm06.stdout:0/896: link d3/f19 d3/d18/d2c/d2d/d74/dc7/f132 0 2026-03-09T00:04:07.555 INFO:tasks.workunit.client.1.vm06.stdout:0/897: fdatasync d3/d18/d1f/d39/d3b/fbb 0 2026-03-09T00:04:07.555 INFO:tasks.workunit.client.0.vm03.stdout:5/603: rmdir d1c/d51/d6a 39 2026-03-09T00:04:07.555 INFO:tasks.workunit.client.0.vm03.stdout:5/604: symlink d1c/d20/d55/d4f/d58/d73/d76/d91/lc5 0 2026-03-09T00:04:07.555 INFO:tasks.workunit.client.0.vm03.stdout:5/605: chown d1c/d20/d55/db0 0 1 2026-03-09T00:04:07.555 INFO:tasks.workunit.client.1.vm06.stdout:0/898: mknod d3/d18/d1f/d119/c133 0 2026-03-09T00:04:07.557 INFO:tasks.workunit.client.0.vm03.stdout:5/606: mkdir d1c/d20/d55/d66/dc6 0 2026-03-09T00:04:07.558 INFO:tasks.workunit.client.1.vm06.stdout:0/899: symlink d3/d18/d2c/d2d/d74/daf/l134 0 2026-03-09T00:04:07.561 INFO:tasks.workunit.client.1.vm06.stdout:0/900: unlink d3/d18/f82 0 2026-03-09T00:04:07.565 INFO:tasks.workunit.client.0.vm03.stdout:7/596: dwrite d2/d1f/d3a/d24/da4/d46/d81/d96/d37/f4c [0,4194304] 0 2026-03-09T00:04:07.569 INFO:tasks.workunit.client.1.vm06.stdout:0/901: rmdir d3/d18/d2c/d2d/d74/daf 39 2026-03-09T00:04:07.569 
INFO:tasks.workunit.client.0.vm03.stdout:5/607: rmdir d1c/d20/dc0 39 2026-03-09T00:04:07.571 INFO:tasks.workunit.client.0.vm03.stdout:5/608: mkdir d1c/d20/d55/db0/dc7 0 2026-03-09T00:04:07.582 INFO:tasks.workunit.client.0.vm03.stdout:5/609: creat d1c/d20/d56/fc8 x:0 0 0 2026-03-09T00:04:07.589 INFO:tasks.workunit.client.0.vm03.stdout:6/588: fsync d13/d1e/fc3 0 2026-03-09T00:04:07.599 INFO:tasks.workunit.client.0.vm03.stdout:1/709: dwrite d4/d3a/d3d/d98/fdb [0,4194304] 0 2026-03-09T00:04:07.602 INFO:tasks.workunit.client.0.vm03.stdout:1/710: read d4/d3a/d61/d78/f94 [384854,23574] 0 2026-03-09T00:04:07.602 INFO:tasks.workunit.client.0.vm03.stdout:5/610: symlink d1c/d20/d55/db0/lc9 0 2026-03-09T00:04:07.605 INFO:tasks.workunit.client.0.vm03.stdout:1/711: truncate d4/d3a/d32/f4b 781270 0 2026-03-09T00:04:07.607 INFO:tasks.workunit.client.0.vm03.stdout:5/611: write d1c/d20/d55/d4f/f69 [3241006,106119] 0 2026-03-09T00:04:07.631 INFO:tasks.workunit.client.0.vm03.stdout:5/612: rmdir d1c/d20/dc0 39 2026-03-09T00:04:07.631 INFO:tasks.workunit.client.0.vm03.stdout:5/613: mkdir d1c/d20/d55/d66/d6b/d8f/dca 0 2026-03-09T00:04:07.631 INFO:tasks.workunit.client.0.vm03.stdout:5/614: rename d1c/d20/d55/d4f/d58/d73/c90 to d1c/d51/d6a/d75/ccb 0 2026-03-09T00:04:07.633 INFO:tasks.workunit.client.1.vm06.stdout:8/857: sync 2026-03-09T00:04:07.633 INFO:tasks.workunit.client.1.vm06.stdout:5/940: sync 2026-03-09T00:04:07.633 INFO:tasks.workunit.client.1.vm06.stdout:5/941: rename d5 to d5/d1c/d23/d34/d47/ddd/dd9/d138 22 2026-03-09T00:04:07.633 INFO:tasks.workunit.client.1.vm06.stdout:6/843: sync 2026-03-09T00:04:07.634 INFO:tasks.workunit.client.1.vm06.stdout:4/860: sync 2026-03-09T00:04:07.634 INFO:tasks.workunit.client.1.vm06.stdout:7/847: sync 2026-03-09T00:04:07.634 INFO:tasks.workunit.client.1.vm06.stdout:2/963: sync 2026-03-09T00:04:07.636 INFO:tasks.workunit.client.1.vm06.stdout:5/942: truncate d5/d1c/d21/d28/d5e/d66/d78/da6/fef 4005782 0 2026-03-09T00:04:07.638 INFO:tasks.workunit.client.0.vm03.stdout:3/479: dwrite d2/f4e [0,4194304] 0 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.0.vm03.stdout:3/480: symlink d2/db/d40/l8e 0 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.0.vm03.stdout:5/615: dread d1c/f1f [0,4194304] 0 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.0.vm03.stdout:5/616: rename d1c/d20/d55/d4f/d58 to d1c/d20/d55/d4f/d58/d73/d76/d91/dcc 22 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.1.vm06.stdout:4/861: dread d17/d24/d3b/f113 [0,4194304] 0 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.1.vm06.stdout:8/858: dread db/dd/d24/db0/f103 [0,4194304] 0 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.1.vm06.stdout:8/859: chown db/d1e/d46/d94/la1 3 1 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.1.vm06.stdout:5/943: dread d5/d1c/d21/d28/f57 [0,4194304] 0 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.1.vm06.stdout:9/749: write d1/d4/fd6 [1550422,30812] 0 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.1.vm06.stdout:1/777: dwrite d6/db0/fdc [0,4194304] 0 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.1.vm06.stdout:1/778: chown d6/d21/d2d/d3b/d42/f9a 715615 1 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.1.vm06.stdout:7/848: symlink d0/df/d1a/d3a/d4e/d5e/lfc 0 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.1.vm06.stdout:4/862: mknod d17/d24/d3b/c129 0 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.1.vm06.stdout:8/860: mkdir db/dd/d85/d112 0 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.1.vm06.stdout:5/944: creat 
d5/d1c/d68/dec/d115/d11e/d92/d49/f139 x:0 0 0 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.1.vm06.stdout:8/861: creat db/dd/d24/d63/f113 x:0 0 0 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.1.vm06.stdout:9/750: link d1/da7/fb9 d1/d3/d4f/d91/de8/ff4 0 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.1.vm06.stdout:1/779: symlink d6/d21/d2d/d37/l107 0 2026-03-09T00:04:07.656 INFO:tasks.workunit.client.1.vm06.stdout:4/863: symlink d17/d24/d3b/d97/db7/d121/l12a 0 2026-03-09T00:04:07.660 INFO:tasks.workunit.client.0.vm03.stdout:5/617: creat d1c/d20/d55/db0/fcd x:0 0 0 2026-03-09T00:04:07.661 INFO:tasks.workunit.client.1.vm06.stdout:7/849: dread d0/df/d1a/d27/d4c/d40/f5a [0,4194304] 0 2026-03-09T00:04:07.661 INFO:tasks.workunit.client.0.vm03.stdout:5/618: unlink d1c/d67/c82 0 2026-03-09T00:04:07.661 INFO:tasks.workunit.client.0.vm03.stdout:5/619: truncate d1c/d20/d55/d4f/d58/d5d/f8d 472817 0 2026-03-09T00:04:07.661 INFO:tasks.workunit.client.1.vm06.stdout:4/864: read d17/d21/d32/fbd [3485664,65738] 0 2026-03-09T00:04:07.661 INFO:tasks.workunit.client.1.vm06.stdout:4/865: write d17/d24/d49/de4/db0/ddd/fef [837703,114846] 0 2026-03-09T00:04:07.662 INFO:tasks.workunit.client.1.vm06.stdout:2/964: write d7/d1a/d25/d66/fa6 [5698173,122483] 0 2026-03-09T00:04:07.663 INFO:tasks.workunit.client.1.vm06.stdout:7/850: dread d0/df/d1a/d27/d4c/d40/fa5 [0,4194304] 0 2026-03-09T00:04:07.668 INFO:tasks.workunit.client.1.vm06.stdout:5/945: creat d5/d1c/d68/da2/f13a x:0 0 0 2026-03-09T00:04:07.668 INFO:tasks.workunit.client.1.vm06.stdout:5/946: creat d5/d1c/d68/da2/f13b x:0 0 0 2026-03-09T00:04:07.668 INFO:tasks.workunit.client.1.vm06.stdout:1/780: creat d6/d21/d2d/f108 x:0 0 0 2026-03-09T00:04:07.668 INFO:tasks.workunit.client.1.vm06.stdout:1/781: dread - d6/d4c/feb zero size 2026-03-09T00:04:07.668 INFO:tasks.workunit.client.1.vm06.stdout:8/862: creat db/f114 x:0 0 0 2026-03-09T00:04:07.668 INFO:tasks.workunit.client.1.vm06.stdout:9/751: mkdir d1/d4/df5 0 2026-03-09T00:04:07.668 INFO:tasks.workunit.client.1.vm06.stdout:9/752: chown d1/d3/ddc/fde 1 1 2026-03-09T00:04:07.668 INFO:tasks.workunit.client.1.vm06.stdout:9/753: creat d1/d3/d4f/d91/d94/ff6 x:0 0 0 2026-03-09T00:04:07.668 INFO:tasks.workunit.client.1.vm06.stdout:9/754: readlink d1/d4/d6e/d14/d25/d85/d49/l88 0 2026-03-09T00:04:07.668 INFO:tasks.workunit.client.1.vm06.stdout:4/866: symlink d17/d24/d49/de4/db0/l12b 0 2026-03-09T00:04:07.668 INFO:tasks.workunit.client.1.vm06.stdout:7/851: getdents d0/df/d1a/d27/d4c/d40/d51 0 2026-03-09T00:04:07.668 INFO:tasks.workunit.client.1.vm06.stdout:7/852: fsync d0/df/d17/dba/fd4 0 2026-03-09T00:04:07.668 INFO:tasks.workunit.client.1.vm06.stdout:7/853: write d0/df/d17/f74 [900396,31458] 0 2026-03-09T00:04:07.673 INFO:tasks.workunit.client.1.vm06.stdout:1/782: creat d6/d21/d2d/d3b/d87/d9d/dd8/f109 x:0 0 0 2026-03-09T00:04:07.673 INFO:tasks.workunit.client.1.vm06.stdout:1/783: chown d6/d21/d2d/d37/f78 228534233 1 2026-03-09T00:04:07.673 INFO:tasks.workunit.client.1.vm06.stdout:1/784: write d6/d4c/fe4 [154639,21173] 0 2026-03-09T00:04:07.676 INFO:tasks.workunit.client.1.vm06.stdout:8/863: rename db/d53/d6d/d7b/f8a to db/d74/d78/d98/d9c/f115 0 2026-03-09T00:04:07.676 INFO:tasks.workunit.client.1.vm06.stdout:5/947: dread d5/d44/d84/dc5/fd2 [0,4194304] 0 2026-03-09T00:04:07.676 INFO:tasks.workunit.client.1.vm06.stdout:8/864: creat db/d1e/d46/d94/f116 x:0 0 0 2026-03-09T00:04:07.676 INFO:tasks.workunit.client.1.vm06.stdout:8/865: fdatasync db/d74/d78/d98/db6/ff0 0 2026-03-09T00:04:07.684 
INFO:tasks.workunit.client.1.vm06.stdout:5/948: creat d5/d1c/d68/dec/d115/d11e/d92/f13c x:0 0 0 2026-03-09T00:04:07.685 INFO:tasks.workunit.client.1.vm06.stdout:7/854: symlink d0/df/d1a/d35/lfd 0 2026-03-09T00:04:07.686 INFO:tasks.workunit.client.1.vm06.stdout:8/866: unlink db/d53/d70/d38/d4d/db1/fd4 0 2026-03-09T00:04:07.686 INFO:tasks.workunit.client.1.vm06.stdout:8/867: write db/d74/d78/fc9 [419447,125851] 0 2026-03-09T00:04:07.687 INFO:tasks.workunit.client.1.vm06.stdout:4/867: rmdir d17/d24/d3b/d97/db7/df1 39 2026-03-09T00:04:07.687 INFO:tasks.workunit.client.1.vm06.stdout:4/868: stat d17/d21/f11b 0 2026-03-09T00:04:07.689 INFO:tasks.workunit.client.1.vm06.stdout:7/855: stat d0/d55/d85/cde 0 2026-03-09T00:04:07.689 INFO:tasks.workunit.client.1.vm06.stdout:5/949: mkdir d5/d1c/d68/da2/d11f/d13d 0 2026-03-09T00:04:07.693 INFO:tasks.workunit.client.1.vm06.stdout:4/869: mkdir d17/d24/d3b/d97/db7/d12c 0 2026-03-09T00:04:07.698 INFO:tasks.workunit.client.1.vm06.stdout:7/856: dread d0/df/f8a [0,4194304] 0 2026-03-09T00:04:07.698 INFO:tasks.workunit.client.1.vm06.stdout:7/857: read - d0/df/d1a/d3f/f7d zero size 2026-03-09T00:04:07.701 INFO:tasks.workunit.client.1.vm06.stdout:3/851: dwrite d11/d28/d2e/d2f/d5b/d94/fa1 [0,4194304] 0 2026-03-09T00:04:07.703 INFO:tasks.workunit.client.1.vm06.stdout:8/868: dread db/dd/d84/fe4 [0,4194304] 0 2026-03-09T00:04:07.703 INFO:tasks.workunit.client.1.vm06.stdout:8/869: creat db/d53/d70/d38/d4d/d79/f117 x:0 0 0 2026-03-09T00:04:07.704 INFO:tasks.workunit.client.1.vm06.stdout:9/755: dread d1/d4/f39 [0,4194304] 0 2026-03-09T00:04:07.704 INFO:tasks.workunit.client.1.vm06.stdout:9/756: chown d1/da7/fc0 92851236 1 2026-03-09T00:04:07.704 INFO:tasks.workunit.client.1.vm06.stdout:9/757: truncate d1/d3/d50/fba 164690 0 2026-03-09T00:04:07.704 INFO:tasks.workunit.client.1.vm06.stdout:9/758: dread - d1/d73/fb1 zero size 2026-03-09T00:04:07.704 INFO:tasks.workunit.client.0.vm03.stdout:3/481: write d2/f4e [499390,25410] 0 2026-03-09T00:04:07.712 INFO:tasks.workunit.client.1.vm06.stdout:5/950: symlink d5/d1c/d21/d28/d5e/d66/dab/d11b/l13e 0 2026-03-09T00:04:07.715 INFO:tasks.workunit.client.1.vm06.stdout:7/858: mkdir d0/df/d1a/d35/dfe 0 2026-03-09T00:04:07.735 INFO:tasks.workunit.client.1.vm06.stdout:7/859: chown d0/df/d1a/d3a/d4e 245808 1 2026-03-09T00:04:07.735 INFO:tasks.workunit.client.1.vm06.stdout:7/860: chown d0/df/d1a/d27/d4c 13 1 2026-03-09T00:04:07.735 INFO:tasks.workunit.client.1.vm06.stdout:7/861: stat d0/df/d1a/d27/d4c/ff9 0 2026-03-09T00:04:07.735 INFO:tasks.workunit.client.1.vm06.stdout:7/862: dread - d0/df/d1a/d3f/f7d zero size 2026-03-09T00:04:07.735 INFO:tasks.workunit.client.1.vm06.stdout:7/863: write d0/df/d1a/d22/f9e [84352,119034] 0 2026-03-09T00:04:07.735 INFO:tasks.workunit.client.1.vm06.stdout:3/852: rmdir d11/d28/d2e/d2f/d5b/d5f/d91 39 2026-03-09T00:04:07.735 INFO:tasks.workunit.client.1.vm06.stdout:3/853: readlink d11/d28/d2e/d2f/d36/d8f/lcb 0 2026-03-09T00:04:07.735 INFO:tasks.workunit.client.1.vm06.stdout:8/870: symlink db/d74/d78/d98/db6/dc7/d10e/l118 0 2026-03-09T00:04:07.735 INFO:tasks.workunit.client.1.vm06.stdout:8/871: creat db/d53/d5c/f119 x:0 0 0 2026-03-09T00:04:07.736 INFO:tasks.workunit.client.1.vm06.stdout:5/951: mkdir d5/d1c/d21/d28/d5e/d66/d78/dc8/daa/df0/d13f 0 2026-03-09T00:04:07.736 INFO:tasks.workunit.client.1.vm06.stdout:7/864: creat d0/d55/d99/db2/fff x:0 0 0 2026-03-09T00:04:07.736 INFO:tasks.workunit.client.1.vm06.stdout:7/865: chown d0/df/d1a/d27/d4c/d40/d51/d90/dae/fc1 10834 1 2026-03-09T00:04:07.736 
INFO:tasks.workunit.client.1.vm06.stdout:8/872: creat db/d53/d70/d38/d4d/d79/dd5/f11a x:0 0 0 2026-03-09T00:04:07.736 INFO:tasks.workunit.client.1.vm06.stdout:3/854: link d11/d28/d2e/db2/dc2/cd7 d11/d3f/c12a 0 2026-03-09T00:04:07.736 INFO:tasks.workunit.client.1.vm06.stdout:3/855: read d11/d28/d2e/d2f/d36/fb7 [852286,107939] 0 2026-03-09T00:04:07.739 INFO:tasks.workunit.client.1.vm06.stdout:9/759: read d1/d3/d4f/d52/fa5 [491707,109651] 0 2026-03-09T00:04:07.739 INFO:tasks.workunit.client.1.vm06.stdout:9/760: fdatasync d1/d3/faf 0 2026-03-09T00:04:07.740 INFO:tasks.workunit.client.1.vm06.stdout:3/856: truncate d11/d28/d2e/f32 531570 0 2026-03-09T00:04:07.750 INFO:tasks.workunit.client.1.vm06.stdout:3/857: creat d11/d28/d125/f12b x:0 0 0 2026-03-09T00:04:07.750 INFO:tasks.workunit.client.1.vm06.stdout:3/858: dread d11/d28/d4d/d89/d90/fba [0,4194304] 0 2026-03-09T00:04:07.750 INFO:tasks.workunit.client.1.vm06.stdout:3/859: chown d11/f3c 2836 1 2026-03-09T00:04:07.752 INFO:tasks.workunit.client.1.vm06.stdout:3/860: symlink d11/d28/d2e/d2f/d5b/l12c 0 2026-03-09T00:04:07.752 INFO:tasks.workunit.client.1.vm06.stdout:3/861: chown d11/d28/d2e/d2f/d5b/db5 104 1 2026-03-09T00:04:07.752 INFO:tasks.workunit.client.1.vm06.stdout:3/862: fdatasync d11/f5a 0 2026-03-09T00:04:07.753 INFO:tasks.workunit.client.1.vm06.stdout:3/863: read - d11/d28/d2e/db2/dc2/f108 zero size 2026-03-09T00:04:07.753 INFO:tasks.workunit.client.1.vm06.stdout:3/864: mknod d11/d28/d2e/d7e/d83/d87/c12d 0 2026-03-09T00:04:07.755 INFO:tasks.workunit.client.1.vm06.stdout:3/865: truncate d11/d28/d2e/d2f/f3e 2772175 0 2026-03-09T00:04:07.756 INFO:tasks.workunit.client.1.vm06.stdout:3/866: mkdir d11/d28/d2e/d2f/d36/d8f/d12e 0 2026-03-09T00:04:07.765 INFO:tasks.workunit.client.0.vm03.stdout:8/641: dwrite d7/df/d1a/d40/f76 [0,4194304] 0 2026-03-09T00:04:07.765 INFO:tasks.workunit.client.0.vm03.stdout:0/621: fsync d2/da/dd/d49/d6c/d4b/f88 0 2026-03-09T00:04:07.766 INFO:tasks.workunit.client.1.vm06.stdout:6/844: dwrite d4/d16/d53/ddf/d4b/f95 [0,4194304] 0 2026-03-09T00:04:07.766 INFO:tasks.workunit.client.1.vm06.stdout:6/845: fsync d4/f12 0 2026-03-09T00:04:07.766 INFO:tasks.workunit.client.1.vm06.stdout:6/846: fsync d4/d27/d3e/d45/f4d 0 2026-03-09T00:04:07.766 INFO:tasks.workunit.client.1.vm06.stdout:6/847: chown d4/d16/d53/d67/ce4 22 1 2026-03-09T00:04:07.766 INFO:tasks.workunit.client.1.vm06.stdout:6/848: write d4/d16/d53/ddf/d4b/f83 [3193704,83949] 0 2026-03-09T00:04:07.771 INFO:tasks.workunit.client.0.vm03.stdout:8/642: link d7/f11 d7/df/d1a/fc4 0 2026-03-09T00:04:07.777 INFO:tasks.workunit.client.0.vm03.stdout:0/622: unlink d2/da/d36/ce1 0 2026-03-09T00:04:07.779 INFO:tasks.workunit.client.0.vm03.stdout:1/712: dwrite d4/d6/d52/f9a [0,4194304] 0 2026-03-09T00:04:07.785 INFO:tasks.workunit.client.0.vm03.stdout:8/643: unlink d7/df/f2c 0 2026-03-09T00:04:07.800 INFO:tasks.workunit.client.0.vm03.stdout:0/623: rename d2/da/dd/d49/d6c/d4b/d55/f5b to d2/da/d76/d8a/d8f/db8/fe6 0 2026-03-09T00:04:07.800 INFO:tasks.workunit.client.0.vm03.stdout:0/624: symlink d2/da/d76/d8a/d8f/db8/le7 0 2026-03-09T00:04:07.800 INFO:tasks.workunit.client.0.vm03.stdout:0/625: dread - d2/da/d1a/fd5 zero size 2026-03-09T00:04:07.800 INFO:tasks.workunit.client.0.vm03.stdout:8/644: symlink d7/df/d1a/d40/db3/dba/d38/lc5 0 2026-03-09T00:04:07.800 INFO:tasks.workunit.client.0.vm03.stdout:0/626: mkdir d2/da/dd/d49/d6c/d4b/d55/d6f/dad/de8 0 2026-03-09T00:04:07.800 INFO:tasks.workunit.client.0.vm03.stdout:8/645: getdents d7 0 2026-03-09T00:04:07.800 
INFO:tasks.workunit.client.0.vm03.stdout:8/646: readlink d7/df/d1a/lb1 0 2026-03-09T00:04:07.802 INFO:tasks.workunit.client.0.vm03.stdout:2/633: dwrite d8/d1b/d6c/f7b [0,4194304] 0 2026-03-09T00:04:07.803 INFO:tasks.workunit.client.0.vm03.stdout:0/627: stat d2/da/l13 0 2026-03-09T00:04:07.804 INFO:tasks.workunit.client.0.vm03.stdout:8/647: dread d7/df/d1a/d40/db3/dba/d3f/f47 [0,4194304] 0 2026-03-09T00:04:07.814 INFO:tasks.workunit.client.0.vm03.stdout:2/634: symlink d8/d1b/d24/da5/da8/lcb 0 2026-03-09T00:04:07.827 INFO:tasks.workunit.client.0.vm03.stdout:0/628: truncate d2/da/dd/d49/f69 333563 0 2026-03-09T00:04:07.827 INFO:tasks.workunit.client.0.vm03.stdout:2/635: getdents d8/d1b 0 2026-03-09T00:04:07.827 INFO:tasks.workunit.client.0.vm03.stdout:0/629: getdents d2/da/d76/d8a 0 2026-03-09T00:04:07.828 INFO:tasks.workunit.client.0.vm03.stdout:2/636: dread d8/d1b/d2a/d6b/f92 [0,4194304] 0 2026-03-09T00:04:07.829 INFO:tasks.workunit.client.0.vm03.stdout:2/637: mknod d8/d1b/d2a/d2e/d9a/ccc 0 2026-03-09T00:04:07.829 INFO:tasks.workunit.client.0.vm03.stdout:2/638: fdatasync d8/d1b/d2a/d6b/d50/d8a/fc3 0 2026-03-09T00:04:07.829 INFO:tasks.workunit.client.0.vm03.stdout:2/639: write d8/d26/d5e/d5f/f60 [5217407,1514] 0 2026-03-09T00:04:07.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:07 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:07.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:07 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:07.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:07 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T00:04:07.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:07 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:04:07.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:07 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:04:07.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:07 vm03.local ceph-mon[52346]: Updating vm03:/etc/ceph/ceph.conf 2026-03-09T00:04:07.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:07 vm03.local ceph-mon[52346]: Updating vm06:/etc/ceph/ceph.conf 2026-03-09T00:04:07.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:07 vm03.local ceph-mon[52346]: Updating vm03:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf 2026-03-09T00:04:07.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:07 vm03.local ceph-mon[52346]: Updating vm06:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf 2026-03-09T00:04:07.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:07 vm03.local ceph-mon[52346]: Updating vm03:/etc/ceph/ceph.client.admin.keyring 2026-03-09T00:04:07.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:07 vm03.local ceph-mon[52346]: Updating vm06:/etc/ceph/ceph.client.admin.keyring 2026-03-09T00:04:07.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:07 vm03.local ceph-mon[52346]: Updating vm03:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.client.admin.keyring 
2026-03-09T00:04:07.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:07 vm03.local ceph-mon[52346]: Updating vm06:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.client.admin.keyring 2026-03-09T00:04:07.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:07 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:07.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:07 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:07.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:07 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:07.843 INFO:tasks.workunit.client.0.vm03.stdout:4/728: dwrite d7/d20/f34 [4194304,4194304] 0 2026-03-09T00:04:07.850 INFO:tasks.workunit.client.1.vm06.stdout:2/965: dwrite d7/da/d63/fe4 [4194304,4194304] 0 2026-03-09T00:04:07.850 INFO:tasks.workunit.client.1.vm06.stdout:2/966: read d7/da/db/f6e [1561685,38638] 0 2026-03-09T00:04:07.850 INFO:tasks.workunit.client.1.vm06.stdout:2/967: chown d7/da/d4e/d57/f115 398197 1 2026-03-09T00:04:07.850 INFO:tasks.workunit.client.1.vm06.stdout:2/968: stat d7/d1a/d96 0 2026-03-09T00:04:07.851 INFO:tasks.workunit.client.0.vm03.stdout:0/630: dread d2/da/dd/f38 [0,4194304] 0 2026-03-09T00:04:07.859 INFO:tasks.workunit.client.0.vm03.stdout:4/729: rename d7/d20/d29/d78/lbc to d7/d20/d35/d66/le7 0 2026-03-09T00:04:07.882 INFO:tasks.workunit.client.0.vm03.stdout:0/631: truncate d2/d71/f77 303917 0 2026-03-09T00:04:07.882 INFO:tasks.workunit.client.0.vm03.stdout:0/632: truncate d2/da/d76/fb2 903043 0 2026-03-09T00:04:07.882 INFO:tasks.workunit.client.0.vm03.stdout:0/633: mknod d2/d5a/ce9 0 2026-03-09T00:04:07.890 INFO:tasks.workunit.client.1.vm06.stdout:4/870: dwrite d17/d21/d4c/d66/dd9/f7e [0,4194304] 0 2026-03-09T00:04:07.894 INFO:tasks.workunit.client.1.vm06.stdout:4/871: dread d17/d21/d4c/fd4 [0,4194304] 0 2026-03-09T00:04:07.894 INFO:tasks.workunit.client.1.vm06.stdout:4/872: creat d17/d21/d4c/dc2/f12d x:0 0 0 2026-03-09T00:04:07.900 INFO:tasks.workunit.client.1.vm06.stdout:0/902: dwrite d3/d18/d1f/d39/d49/f4b [4194304,4194304] 0 2026-03-09T00:04:07.902 INFO:tasks.workunit.client.1.vm06.stdout:3/867: dwrite d11/f1d [0,4194304] 0 2026-03-09T00:04:07.907 INFO:tasks.workunit.client.1.vm06.stdout:4/873: symlink d17/d24/d3b/d5e/d127/l12e 0 2026-03-09T00:04:07.915 INFO:tasks.workunit.client.1.vm06.stdout:0/903: creat d3/d18/d1f/f135 x:0 0 0 2026-03-09T00:04:07.915 INFO:tasks.workunit.client.1.vm06.stdout:0/904: write d3/d18/d1f/d39/d3b/df9/df2/f12a [398388,77596] 0 2026-03-09T00:04:07.919 INFO:tasks.workunit.client.1.vm06.stdout:0/905: write d3/d18/de9/fbd [1466703,86884] 0 2026-03-09T00:04:07.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:07 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:07.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:07 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:07.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:07 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T00:04:07.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:07 vm06.local ceph-mon[58395]: 
from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:04:07.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:07 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:04:07.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:07 vm06.local ceph-mon[58395]: Updating vm03:/etc/ceph/ceph.conf 2026-03-09T00:04:07.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:07 vm06.local ceph-mon[58395]: Updating vm06:/etc/ceph/ceph.conf 2026-03-09T00:04:07.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:07 vm06.local ceph-mon[58395]: Updating vm03:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf 2026-03-09T00:04:07.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:07 vm06.local ceph-mon[58395]: Updating vm06:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf 2026-03-09T00:04:07.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:07 vm06.local ceph-mon[58395]: Updating vm03:/etc/ceph/ceph.client.admin.keyring 2026-03-09T00:04:07.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:07 vm06.local ceph-mon[58395]: Updating vm06:/etc/ceph/ceph.client.admin.keyring 2026-03-09T00:04:07.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:07 vm06.local ceph-mon[58395]: Updating vm03:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.client.admin.keyring 2026-03-09T00:04:07.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:07 vm06.local ceph-mon[58395]: Updating vm06:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.client.admin.keyring 2026-03-09T00:04:07.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:07 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:07.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:07 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:07.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:07 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:07.923 INFO:tasks.workunit.client.1.vm06.stdout:3/868: link d11/d28/d2e/d2f/dc1/f10f d11/d28/f12f 0 2026-03-09T00:04:07.923 INFO:tasks.workunit.client.1.vm06.stdout:4/874: mkdir d17/d21/d4c/d50/d12f 0 2026-03-09T00:04:07.937 INFO:tasks.workunit.client.1.vm06.stdout:4/875: link d17/d24/d3b/d5e/f6f d17/d21/d4c/d66/d68/dbe/f130 0 2026-03-09T00:04:07.937 INFO:tasks.workunit.client.1.vm06.stdout:4/876: read - d17/d5b/ff9 zero size 2026-03-09T00:04:07.937 INFO:tasks.workunit.client.1.vm06.stdout:4/877: chown d17/d21/d4c/c11e 187259428 1 2026-03-09T00:04:07.939 INFO:tasks.workunit.client.1.vm06.stdout:4/878: dread d17/d24/fce [0,4194304] 0 2026-03-09T00:04:07.939 INFO:tasks.workunit.client.1.vm06.stdout:4/879: dread - d17/d24/d3b/d5e/d7a/fde zero size 2026-03-09T00:04:07.939 INFO:tasks.workunit.client.1.vm06.stdout:4/880: fdatasync d17/d21/d4c/d50/f60 0 2026-03-09T00:04:07.940 INFO:tasks.workunit.client.1.vm06.stdout:4/881: mknod d17/d21/d4c/d66/de3/c131 0 2026-03-09T00:04:07.942 INFO:tasks.workunit.client.1.vm06.stdout:4/882: rename d17/d24/d3b/dbf/ddf/dfc/c120 to d17/d24/d49/d5f/c132 0 2026-03-09T00:04:07.957 INFO:tasks.workunit.client.0.vm03.stdout:1/713: dwrite d4/d3a/f4d [0,4194304] 0 
2026-03-09T00:04:07.961 INFO:tasks.workunit.client.1.vm06.stdout:7/866: dwrite d0/df/d1a/d27/d70/fc4 [0,4194304] 0
2026-03-09T00:04:07.963 INFO:tasks.workunit.client.0.vm03.stdout:1/714: dread d4/d15/f18 [0,4194304] 0
2026-03-09T00:04:07.963 INFO:tasks.workunit.client.0.vm03.stdout:1/715: readlink d4/d3a/d3d/le8 0
2026-03-09T00:04:07.969 INFO:tasks.workunit.client.0.vm03.stdout:1/716: rename d4/l7 to d4/d15/d1a/lf4 0
2026-03-09T00:04:07.988 INFO:tasks.workunit.client.0.vm03.stdout:1/717: chown d4/d3a/d32/dc2/cea 18573308 1
2026-03-09T00:04:07.988 INFO:tasks.workunit.client.0.vm03.stdout:0/634: getdents d2/da/d36 0
2026-03-09T00:04:07.989 INFO:tasks.workunit.client.1.vm06.stdout:3/869: dread d11/d28/d2e/f47 [4194304,4194304] 0
2026-03-09T00:04:07.989 INFO:tasks.workunit.client.1.vm06.stdout:7/867: creat d0/df/d17/dba/de4/f100 x:0 0 0
2026-03-09T00:04:07.989 INFO:tasks.workunit.client.1.vm06.stdout:3/870: creat d11/d28/d2e/d2f/d5b/db5/f130 x:0 0 0
2026-03-09T00:04:07.992 INFO:tasks.workunit.client.1.vm06.stdout:3/871: read d11/d28/d4d/f6e [2620542,54123] 0
2026-03-09T00:04:07.993 INFO:tasks.workunit.client.0.vm03.stdout:1/718: dread d4/f12 [0,4194304] 0
2026-03-09T00:04:07.996 INFO:tasks.workunit.client.0.vm03.stdout:6/589: dwrite d13/d1e/f3e [0,4194304] 0
2026-03-09T00:04:07.999 INFO:tasks.workunit.client.0.vm03.stdout:0/635: symlink d2/da/dd/d49/d6c/da6/dda/lea 0
2026-03-09T00:04:08.005 INFO:tasks.workunit.client.0.vm03.stdout:1/719: rmdir d4/d3a/d3d/d98/dee/d9e/de1 0
2026-03-09T00:04:08.009 INFO:tasks.workunit.client.0.vm03.stdout:6/590: rename d13/f55 to d13/d35/db5/fc6 0
2026-03-09T00:04:08.012 INFO:tasks.workunit.client.0.vm03.stdout:5/620: dwrite f15 [0,4194304] 0
2026-03-09T00:04:08.012 INFO:tasks.workunit.client.0.vm03.stdout:5/621: chown d1c/d20/d56/d74/l7c 39 1
2026-03-09T00:04:08.019 INFO:tasks.workunit.client.1.vm06.stdout:0/906: dwrite d3/d18/d1f/d39/fb1 [0,4194304] 0
2026-03-09T00:04:08.041 INFO:tasks.workunit.client.0.vm03.stdout:5/622: symlink d1c/d20/d55/d4f/d58/d73/d76/d91/lce 0
2026-03-09T00:04:08.045 INFO:tasks.workunit.client.1.vm06.stdout:0/907: mknod d3/d18/d2c/d2d/d74/c136 0
2026-03-09T00:04:08.046 INFO:tasks.workunit.client.1.vm06.stdout:0/908: mknod d3/d18/d2c/d2d/c137 0
2026-03-09T00:04:08.048 INFO:tasks.workunit.client.0.vm03.stdout:5/623: getdents d1c/d20/d55/d4f/d58/d73 0
2026-03-09T00:04:08.048 INFO:tasks.workunit.client.0.vm03.stdout:5/624: readlink d1c/d51/d6a/lab 0
2026-03-09T00:04:08.048 INFO:tasks.workunit.client.0.vm03.stdout:5/625: write d1c/d51/d6a/d75/f77 [771001,35242] 0
2026-03-09T00:04:08.050 INFO:tasks.workunit.client.1.vm06.stdout:0/909: creat d3/d18/d1f/d39/d49/d115/f138 x:0 0 0
2026-03-09T00:04:08.062 INFO:tasks.workunit.client.1.vm06.stdout:0/910: fdatasync d3/d18/d2c/d2d/f40 0
2026-03-09T00:04:08.062 INFO:tasks.workunit.client.1.vm06.stdout:0/911: readlink d3/d18/d2c/d2d/d74/dc7/d110/d45/l62 0
2026-03-09T00:04:08.062 INFO:tasks.workunit.client.0.vm03.stdout:5/626: creat d1c/d20/d55/d4f/d58/d73/d9e/da5/fcf x:0 0 0
2026-03-09T00:04:08.062 INFO:tasks.workunit.client.0.vm03.stdout:5/627: unlink d1c/d20/d55/d4f/c5c 0
2026-03-09T00:04:08.063 INFO:tasks.workunit.client.0.vm03.stdout:5/628: rename d1c/d20/f3e to d1c/d20/d55/d4f/d58/d5d/fd0 0
2026-03-09T00:04:08.074 INFO:tasks.workunit.client.0.vm03.stdout:5/629: dread fe [0,4194304] 0
2026-03-09T00:04:08.089 INFO:tasks.workunit.client.0.vm03.stdout:5/630: stat d1c/d20/l4b 0
2026-03-09T00:04:08.090 INFO:tasks.workunit.client.1.vm06.stdout:9/761: dwrite d1/d73/fc2 [0,4194304] 0
2026-03-09T00:04:08.090 INFO:tasks.workunit.client.1.vm06.stdout:1/785: dwrite d6/d21/d2d/d37/d6d/dd7/ff6 [0,4194304] 0
2026-03-09T00:04:08.090 INFO:tasks.workunit.client.0.vm03.stdout:5/631: dread d1c/d20/d55/d66/d6b/d8f/f98 [0,4194304] 0
2026-03-09T00:04:08.090 INFO:tasks.workunit.client.0.vm03.stdout:5/632: fsync fe 0
2026-03-09T00:04:08.090 INFO:tasks.workunit.client.0.vm03.stdout:5/633: link d1c/d20/d55/f3d d1c/d20/d55/d4f/d58/d73/d9e/fd1 0
2026-03-09T00:04:08.090 INFO:tasks.workunit.client.1.vm06.stdout:2/969: dwrite d7/d1a/d89/f121 [0,4194304] 0
2026-03-09T00:04:08.092 INFO:tasks.workunit.client.0.vm03.stdout:4/730: dwrite d7/d20/d29/d54/d58/f6b [0,4194304] 0
2026-03-09T00:04:08.103 INFO:tasks.workunit.client.1.vm06.stdout:7/868: dwrite d0/df/d1a/d35/f94 [0,4194304] 0
2026-03-09T00:04:08.104 INFO:tasks.workunit.client.0.vm03.stdout:4/731: dread d7/f1c [0,4194304] 0
2026-03-09T00:04:08.105 INFO:tasks.workunit.client.0.vm03.stdout:4/732: chown d7/d20/ld9 244973 1
2026-03-09T00:04:08.110 INFO:tasks.workunit.client.1.vm06.stdout:8/873: dwrite db/d74/d87/d100/d10a/df5/f108 [0,4194304] 0
2026-03-09T00:04:08.117 INFO:tasks.workunit.client.1.vm06.stdout:2/970: getdents d7/ddb 0
2026-03-09T00:04:08.125 INFO:tasks.workunit.client.1.vm06.stdout:1/786: rmdir d6/d21/d2d/d3b/d87 39
2026-03-09T00:04:08.125 INFO:tasks.workunit.client.1.vm06.stdout:8/874: mknod db/dd/d24/da7/dab/c11b 0
2026-03-09T00:04:08.125 INFO:tasks.workunit.client.1.vm06.stdout:8/875: chown db/dd/l18 107 1
2026-03-09T00:04:08.126 INFO:tasks.workunit.client.1.vm06.stdout:8/876: creat db/d74/d78/d98/db6/dc7/d101/db7/f11c x:0 0 0
2026-03-09T00:04:08.128 INFO:tasks.workunit.client.1.vm06.stdout:8/877: creat db/d74/d78/d98/db6/dc7/f11d x:0 0 0
2026-03-09T00:04:08.128 INFO:tasks.workunit.client.1.vm06.stdout:8/878: readlink db/d74/d87/d100/l10d 0
2026-03-09T00:04:08.130 INFO:tasks.workunit.client.1.vm06.stdout:8/879: getdents db/d53/d70 0
2026-03-09T00:04:08.130 INFO:tasks.workunit.client.1.vm06.stdout:8/880: write db/dd/d24/dac/fe6 [545862,49513] 0
2026-03-09T00:04:08.130 INFO:tasks.workunit.client.1.vm06.stdout:8/881: read - db/d74/d87/fca zero size
2026-03-09T00:04:08.130 INFO:tasks.workunit.client.1.vm06.stdout:8/882: readlink db/d53/d5c/l6a 0
2026-03-09T00:04:08.132 INFO:tasks.workunit.client.1.vm06.stdout:8/883: creat db/dd/d24/da7/f11e x:0 0 0
2026-03-09T00:04:08.132 INFO:tasks.workunit.client.1.vm06.stdout:8/884: readlink db/d53/d5c/l6a 0
2026-03-09T00:04:08.134 INFO:tasks.workunit.client.1.vm06.stdout:8/885: creat db/d53/d70/d38/d4d/f11f x:0 0 0
2026-03-09T00:04:08.197 INFO:tasks.workunit.client.0.vm03.stdout:1/720: dwrite d4/d3a/d61/d78/f94 [0,4194304] 0
2026-03-09T00:04:08.235 INFO:tasks.workunit.client.0.vm03.stdout:6/591: dwrite d13/d1e/d44/d59/f6c [0,4194304] 0
2026-03-09T00:04:08.251 INFO:tasks.workunit.client.0.vm03.stdout:0/636: dwrite d2/da/d36/da4/f3f [0,4194304] 0
2026-03-09T00:04:08.252 INFO:tasks.workunit.client.0.vm03.stdout:6/592: dread d13/d35/d71/fb0 [0,4194304] 0
2026-03-09T00:04:08.252 INFO:tasks.workunit.client.0.vm03.stdout:6/593: fdatasync d13/d1e/f3e 0
2026-03-09T00:04:08.252 INFO:tasks.workunit.client.0.vm03.stdout:6/594: read d13/d35/db5/fc6 [60085,105462] 0
2026-03-09T00:04:08.253 INFO:tasks.workunit.client.0.vm03.stdout:2/640: dwrite d8/f5d [0,4194304] 0
2026-03-09T00:04:08.253 INFO:tasks.workunit.client.0.vm03.stdout:6/595: getdents d13/d1e/d44/d4a/d52 0
2026-03-09T00:04:08.256 INFO:tasks.workunit.client.0.vm03.stdout:0/637: dread d2/da/dd/d49/d6c/d4b/f4c [0,4194304] 0
2026-03-09T00:04:08.256 INFO:tasks.workunit.client.0.vm03.stdout:2/641: mknod d8/d1b/d8f/ccd 0
2026-03-09T00:04:08.259 INFO:tasks.workunit.client.1.vm06.stdout:5/952: dwrite d5/d1c/d21/f120 [0,4194304] 0
2026-03-09T00:04:08.260 INFO:tasks.workunit.client.0.vm03.stdout:6/596: symlink d13/d35/lc7 0
2026-03-09T00:04:08.260 INFO:tasks.workunit.client.0.vm03.stdout:6/597: readlink d13/d35/d71/d97/l8a 0
2026-03-09T00:04:08.260 INFO:tasks.workunit.client.0.vm03.stdout:6/598: chown d13/d35/d4c/c5e 0 1
2026-03-09T00:04:08.260 INFO:tasks.workunit.client.0.vm03.stdout:6/599: stat d13/d1e/f21 0
2026-03-09T00:04:08.260 INFO:tasks.workunit.client.0.vm03.stdout:6/600: write d13/f92 [649591,51785] 0
2026-03-09T00:04:08.260 INFO:tasks.workunit.client.0.vm03.stdout:8/648: dwrite d7/df/f55 [0,4194304] 0
2026-03-09T00:04:08.260 INFO:tasks.workunit.client.1.vm06.stdout:2/971: dwrite d7/d1b/d71/f12a [0,4194304] 0
2026-03-09T00:04:08.261 INFO:tasks.workunit.client.1.vm06.stdout:2/972: chown d7/da/c11d 0 1
2026-03-09T00:04:08.261 INFO:tasks.workunit.client.0.vm03.stdout:0/638: link d2/da/dd/d6e/lb9 d2/leb 0
2026-03-09T00:04:08.264 INFO:tasks.workunit.client.1.vm06.stdout:5/953: dread d5/d1c/d21/d28/d5e/d66/d78/da6/f124 [0,4194304] 0
2026-03-09T00:04:08.268 INFO:tasks.workunit.client.0.vm03.stdout:2/642: getdents d8/d26/d5e/d5f 0
2026-03-09T00:04:08.274 INFO:tasks.workunit.client.1.vm06.stdout:7/869: dwrite d0/df/d1a/d27/d4c/d40/f67 [0,4194304] 0
2026-03-09T00:04:08.277 INFO:tasks.workunit.client.1.vm06.stdout:2/973: unlink d7/d1b/d31/l38 0
2026-03-09T00:04:08.277 INFO:tasks.workunit.client.1.vm06.stdout:2/974: dread - d7/d1b/d71/d79/db4/dc1/d86/fe1 zero size
2026-03-09T00:04:08.277 INFO:tasks.workunit.client.1.vm06.stdout:2/975: creat d7/d1b/d71/d79/f12e x:0 0 0
2026-03-09T00:04:08.278 INFO:tasks.workunit.client.1.vm06.stdout:2/976: write d7/d1b/f37 [393996,60903] 0
2026-03-09T00:04:08.278 INFO:tasks.workunit.client.1.vm06.stdout:2/977: dread - d7/d1a/d56/fd4 zero size
2026-03-09T00:04:08.278 INFO:tasks.workunit.client.1.vm06.stdout:2/978: chown d7/d1b/d71/d79/db4/dc1/l8f 11 1
2026-03-09T00:04:08.278 INFO:tasks.workunit.client.1.vm06.stdout:2/979: fsync d7/da/d1c/ff5 0
2026-03-09T00:04:08.286 INFO:tasks.workunit.client.1.vm06.stdout:3/872: dwrite d11/d28/d4d/d89/d90/fba [0,4194304] 0
2026-03-09T00:04:08.296 INFO:tasks.workunit.client.0.vm03.stdout:9/658: sync
2026-03-09T00:04:08.296 INFO:tasks.workunit.client.0.vm03.stdout:6/601: rename d13/d1e/d44/d59/lb2 to d13/d35/d74/d89/db3/lc8 0
2026-03-09T00:04:08.305 INFO:tasks.workunit.client.1.vm06.stdout:5/954: symlink d5/d44/d84/l140 0
2026-03-09T00:04:08.305 INFO:tasks.workunit.client.1.vm06.stdout:5/955: write d5/d1c/f2d [479696,77636] 0
2026-03-09T00:04:08.305 INFO:tasks.workunit.client.1.vm06.stdout:5/956: stat d5/d1c/d68/dec/d115/d11e/f91 0
2026-03-09T00:04:08.305 INFO:tasks.workunit.client.1.vm06.stdout:5/957: dread - d5/d1c/d68/da2/d107/f10c zero size
2026-03-09T00:04:08.305 INFO:tasks.workunit.client.1.vm06.stdout:0/912: write d3/d18/d1f/f4a [297075,114593] 0
2026-03-09T00:04:08.306 INFO:tasks.workunit.client.0.vm03.stdout:8/649: creat d7/df/d1a/d40/db3/dba/dad/fc6 x:0 0 0
2026-03-09T00:04:08.310 INFO:tasks.workunit.client.1.vm06.stdout:9/762: write d1/d4/d6e/d9/f10 [646630,16777] 0
2026-03-09T00:04:08.310 INFO:tasks.workunit.client.1.vm06.stdout:9/763: readlink d1/d4/d6e/d9/l79 0
2026-03-09T00:04:08.310 INFO:tasks.workunit.client.1.vm06.stdout:9/764: dread - d1/d3/d4f/fbd zero size
2026-03-09T00:04:08.310 INFO:tasks.workunit.client.1.vm06.stdout:9/765: creat d1/d3/d4f/d91/ff7 x:0 0 0
2026-03-09T00:04:08.310 INFO:tasks.workunit.client.1.vm06.stdout:9/766: chown d1/d3/d4f/d52/f5e 662 1
2026-03-09T00:04:08.311 INFO:tasks.workunit.client.1.vm06.stdout:7/870: creat d0/df/d7b/dd2/f101 x:0 0 0
2026-03-09T00:04:08.314 INFO:tasks.workunit.client.0.vm03.stdout:0/639: truncate d2/da/d36/da4/f43 559095 0
2026-03-09T00:04:08.315 INFO:tasks.workunit.client.1.vm06.stdout:2/980: symlink d7/d1a/d25/l12f 0
2026-03-09T00:04:08.317 INFO:tasks.workunit.client.1.vm06.stdout:2/981: truncate d7/d1b/d71/d79/db4/f127 375285 0
2026-03-09T00:04:08.317 INFO:tasks.workunit.client.1.vm06.stdout:1/787: dwrite d6/fb [0,4194304] 0
2026-03-09T00:04:08.323 INFO:tasks.workunit.client.0.vm03.stdout:0/640: symlink d2/da/dd/d49/d6c/d4b/d55/d6f/lec 0
2026-03-09T00:04:08.326 INFO:tasks.workunit.client.1.vm06.stdout:0/913: rmdir d3/d18/d2c 39
2026-03-09T00:04:08.328 INFO:tasks.workunit.client.1.vm06.stdout:9/767: mknod d1/d4/d2f/cf8 0
2026-03-09T00:04:08.328 INFO:tasks.workunit.client.1.vm06.stdout:9/768: chown d1/d4/d6e/d9/f3d 1810 1
2026-03-09T00:04:08.328 INFO:tasks.workunit.client.1.vm06.stdout:9/769: chown d1/d3/d4f/d91/d94/ff6 2 1
2026-03-09T00:04:08.328 INFO:tasks.workunit.client.1.vm06.stdout:9/770: truncate d1/d4/d6e/d9/f3d 1288017 0
2026-03-09T00:04:08.329 INFO:tasks.workunit.client.1.vm06.stdout:7/871: mkdir d0/d55/d99/d102 0
2026-03-09T00:04:08.331 INFO:tasks.workunit.client.1.vm06.stdout:1/788: symlink d6/d4c/d79/l10a 0
2026-03-09T00:04:08.331 INFO:tasks.workunit.client.1.vm06.stdout:9/771: mknod d1/d4/d2f/cf9 0
2026-03-09T00:04:08.333 INFO:tasks.workunit.client.1.vm06.stdout:7/872: truncate d0/df/d1a/d27/d4c/d40/d51/d86/fc3 4224946 0
2026-03-09T00:04:08.333 INFO:tasks.workunit.client.1.vm06.stdout:7/873: readlink d0/df/d1a/d3a/l54 0
2026-03-09T00:04:08.334 INFO:tasks.workunit.client.1.vm06.stdout:1/789: mknod d6/db0/c10b 0
2026-03-09T00:04:08.335 INFO:tasks.workunit.client.1.vm06.stdout:9/772: creat d1/d4/d6e/ffa x:0 0 0
2026-03-09T00:04:08.336 INFO:tasks.workunit.client.1.vm06.stdout:7/874: creat d0/df/d1a/d3a/d4e/f103 x:0 0 0
2026-03-09T00:04:08.339 INFO:tasks.workunit.client.1.vm06.stdout:1/790: mkdir d6/d4c/d79/d10c 0
2026-03-09T00:04:08.341 INFO:tasks.workunit.client.1.vm06.stdout:5/958: dread d5/d1c/d21/d28/d5e/d66/d78/dc8/daa/ff1 [0,4194304] 0
2026-03-09T00:04:08.341 INFO:tasks.workunit.client.1.vm06.stdout:5/959: truncate d5/db1/dcc/f134 273651 0
2026-03-09T00:04:08.347 INFO:tasks.workunit.client.0.vm03.stdout:9/659: dread d15/d1c/d21/f25 [0,4194304] 0
2026-03-09T00:04:08.347 INFO:tasks.workunit.client.0.vm03.stdout:9/660: write d15/f23 [475361,124529] 0
2026-03-09T00:04:08.347 INFO:tasks.workunit.client.0.vm03.stdout:9/661: read d15/d1c/d36/f86 [3903224,108631] 0
2026-03-09T00:04:08.347 INFO:tasks.workunit.client.1.vm06.stdout:0/914: dread d3/d18/d1f/d39/d3b/df9/fba [0,4194304] 0
2026-03-09T00:04:08.349 INFO:tasks.workunit.client.1.vm06.stdout:9/773: rmdir d1/d4/d6e/d14 39
2026-03-09T00:04:08.350 INFO:tasks.workunit.client.1.vm06.stdout:5/960: mknod d5/d44/d84/dc5/c141 0
2026-03-09T00:04:08.353 INFO:tasks.workunit.client.0.vm03.stdout:9/662: mkdir d15/d1c/d28/dda 0
2026-03-09T00:04:08.366 INFO:tasks.workunit.client.0.vm03.stdout:3/482: sync
2026-03-09T00:04:08.366 INFO:tasks.workunit.client.0.vm03.stdout:9/663: creat d15/d1c/d21/d54/fdb x:0 0 0
2026-03-09T00:04:08.366 INFO:tasks.workunit.client.1.vm06.stdout:5/961: rename d5/d1c/d68/da2/d107 to d5/d44/d84/d142 0
2026-03-09T00:04:08.366 INFO:tasks.workunit.client.1.vm06.stdout:5/962: write d5/d1c/d68/da2/fa4 [1026253,126622] 0
2026-03-09T00:04:08.366 INFO:tasks.workunit.client.1.vm06.stdout:5/963: mkdir d5/d1c/d21/d28/d5e/d66/d78/dc8/daa/df0/d143 0
2026-03-09T00:04:08.381 INFO:tasks.workunit.client.0.vm03.stdout:1/721: dwrite d4/d15/f8a [0,4194304] 0
2026-03-09T00:04:08.391 INFO:tasks.workunit.client.0.vm03.stdout:1/722: getdents d4/d15/d1a 0
2026-03-09T00:04:08.397 INFO:tasks.workunit.client.0.vm03.stdout:1/723: mknod d4/d3a/d61/d78/cf5 0
2026-03-09T00:04:08.400 INFO:tasks.workunit.client.0.vm03.stdout:1/724: unlink d4/d3a/f48 0
2026-03-09T00:04:08.400 INFO:tasks.workunit.client.0.vm03.stdout:1/725: write d4/d15/d1a/f1d [5400184,50450] 0
2026-03-09T00:04:08.405 INFO:tasks.workunit.client.0.vm03.stdout:6/602: dwrite d13/d35/d4c/f99 [0,4194304] 0
2026-03-09T00:04:08.405 INFO:tasks.workunit.client.0.vm03.stdout:6/603: readlink d13/d1e/d44/d59/l93 0
2026-03-09T00:04:08.405 INFO:tasks.workunit.client.0.vm03.stdout:6/604: write d13/d1e/d44/d4a/d52/f7a [360310,101845] 0
2026-03-09T00:04:08.410 INFO:tasks.workunit.client.0.vm03.stdout:6/605: dread d13/d1e/d44/f49 [0,4194304] 0
2026-03-09T00:04:08.426 INFO:tasks.workunit.client.0.vm03.stdout:6/606: mkdir d13/dc4/dc9 0
2026-03-09T00:04:08.427 INFO:tasks.workunit.client.0.vm03.stdout:6/607: dread d13/f3a [0,4194304] 0
2026-03-09T00:04:08.427 INFO:tasks.workunit.client.0.vm03.stdout:6/608: mkdir d13/d35/d72/dca 0
2026-03-09T00:04:08.432 INFO:tasks.workunit.client.1.vm06.stdout:2/982: dwrite d7/d1a/d25/d66/f8d [0,4194304] 0
2026-03-09T00:04:08.437 INFO:tasks.workunit.client.0.vm03.stdout:6/609: dread d13/f31 [0,4194304] 0
2026-03-09T00:04:08.442 INFO:tasks.workunit.client.0.vm03.stdout:6/610: fdatasync d13/f31 0
2026-03-09T00:04:08.443 INFO:tasks.workunit.client.0.vm03.stdout:1/726: read d4/d3a/d3d/f58 [2812563,60728] 0
2026-03-09T00:04:08.446 INFO:tasks.workunit.client.0.vm03.stdout:1/727: rename d4/d15/d86 to d4/d15/d77/dce/df6 0
2026-03-09T00:04:08.446 INFO:tasks.workunit.client.0.vm03.stdout:1/728: chown d4/d3a/d32/l80 47049081 1
2026-03-09T00:04:08.463 INFO:tasks.workunit.client.1.vm06.stdout:9/774: dwrite d1/da7/fb9 [0,4194304] 0
2026-03-09T00:04:08.467 INFO:tasks.workunit.client.1.vm06.stdout:9/775: link d1/d4/d2f/fa0 d1/da7/ffb 0
2026-03-09T00:04:08.473 INFO:tasks.workunit.client.0.vm03.stdout:2/643: dwrite d8/d1b/f47 [0,4194304] 0
2026-03-09T00:04:08.473 INFO:tasks.workunit.client.1.vm06.stdout:9/776: unlink d1/d4/d6e/d14/d25/la8 0
2026-03-09T00:04:08.474 INFO:tasks.workunit.client.1.vm06.stdout:9/777: mkdir d1/da7/dfc 0
2026-03-09T00:04:08.475 INFO:tasks.workunit.client.1.vm06.stdout:9/778: rmdir d1/d3/d4f/d52/de3/de5 39
2026-03-09T00:04:08.475 INFO:tasks.workunit.client.1.vm06.stdout:9/779: symlink d1/d3/d4f/d91/lfd 0
2026-03-09T00:04:08.475 INFO:tasks.workunit.client.1.vm06.stdout:9/780: mkdir d1/d3/d4f/d91/d94/ddf/dfe 0
2026-03-09T00:04:08.478 INFO:tasks.workunit.client.0.vm03.stdout:7/597: sync
2026-03-09T00:04:08.478 INFO:tasks.workunit.client.0.vm03.stdout:7/598: read d2/d4/fb [5608165,50373] 0
2026-03-09T00:04:08.480 INFO:tasks.workunit.client.0.vm03.stdout:2/644: link d8/d1b/d2a/d2e/d9a/ccc d8/d1b/d2a/d2e/cce 0
2026-03-09T00:04:08.483 INFO:tasks.workunit.client.0.vm03.stdout:2/645: rename d8/d1b/d24/da5/da8/fba to d8/d1b/d2a/d6b/fcf 0
2026-03-09T00:04:08.483 INFO:tasks.workunit.client.0.vm03.stdout:2/646: write d8/d1b/d24/da5/fb5 [1009179,65528] 0
2026-03-09T00:04:08.484 INFO:tasks.workunit.client.0.vm03.stdout:7/599: write d2/d1f/d3a/d24/da4/d91/f72 [398612,113718] 0
2026-03-09T00:04:08.485 INFO:tasks.workunit.client.0.vm03.stdout:2/647: creat d8/d26/d5e/db1/fd0 x:0 0 0
2026-03-09T00:04:08.488 INFO:tasks.workunit.client.0.vm03.stdout:2/648: unlink d8/d1b/c67 0
2026-03-09T00:04:08.501 INFO:tasks.workunit.client.0.vm03.stdout:2/649: rename d8/d1b/d2a/d6b/f81 to d8/d26/d5e/dc5/fd1 0
2026-03-09T00:04:08.506 INFO:tasks.workunit.client.0.vm03.stdout:7/600: dread d2/d1f/d3a/f5d [0,4194304] 0
2026-03-09T00:04:08.508 INFO:tasks.workunit.client.1.vm06.stdout:5/964: dwrite d5/d1c/d21/d28/d5e/d66/d78/dc8/daa/f12d [0,4194304] 0
2026-03-09T00:04:08.509 INFO:tasks.workunit.client.0.vm03.stdout:6/611: dwrite d13/d1e/d44/d59/f6c [0,4194304] 0
2026-03-09T00:04:08.519 INFO:tasks.workunit.client.0.vm03.stdout:7/601: write d2/d1f/d3a/d24/da4/d91/d67/f64 [1778330,5865] 0
2026-03-09T00:04:08.521 INFO:tasks.workunit.client.0.vm03.stdout:2/650: write d8/d26/d5e/dc5/fd1 [2908473,129463] 0
2026-03-09T00:04:08.526 INFO:tasks.workunit.client.0.vm03.stdout:7/602: truncate d2/f73 2995987 0
2026-03-09T00:04:08.536 INFO:tasks.workunit.client.1.vm06.stdout:7/875: write d0/df/d1a/d27/f66 [853687,71951] 0
2026-03-09T00:04:08.536 INFO:tasks.workunit.client.1.vm06.stdout:7/876: write d0/df/d1a/d27/d4c/d40/fa5 [1195656,71216] 0
2026-03-09T00:04:08.536 INFO:tasks.workunit.client.1.vm06.stdout:7/877: truncate d0/df/d1a/d35/ff0 823472 0
2026-03-09T00:04:08.536 INFO:tasks.workunit.client.1.vm06.stdout:7/878: stat d0/df/d1a/d27/d4c/fb0 0
2026-03-09T00:04:08.536 INFO:tasks.workunit.client.1.vm06.stdout:1/791: write d6/d21/d2d/d3b/d42/f9a [2255736,19408] 0
2026-03-09T00:04:08.550 INFO:tasks.workunit.client.0.vm03.stdout:2/651: link d8/d1b/d6c/f7b d8/d1b/d2a/d6b/dc6/fd2 0
2026-03-09T00:04:08.550 INFO:tasks.workunit.client.0.vm03.stdout:2/652: truncate d8/d1b/d2a/d6b/d50/d8a/fa3 248386 0
2026-03-09T00:04:08.568 INFO:tasks.workunit.client.1.vm06.stdout:7/879: symlink d0/d55/d99/d102/l104 0
2026-03-09T00:04:08.569 INFO:tasks.workunit.client.0.vm03.stdout:2/653: mknod d8/d26/d5e/cd3 0
2026-03-09T00:04:08.569 INFO:tasks.workunit.client.1.vm06.stdout:1/792: rmdir d6/db0 39
2026-03-09T00:04:08.570 INFO:tasks.workunit.client.0.vm03.stdout:2/654: rename d8/d26/d5e/d6f/f98 to d8/d1b/d2a/d6b/d50/d8a/fd4 0
2026-03-09T00:04:08.571 INFO:tasks.workunit.client.1.vm06.stdout:7/880: mknod d0/df/d17/dba/c105 0
2026-03-09T00:04:08.572 INFO:tasks.workunit.client.0.vm03.stdout:2/655: creat d8/d1b/d2a/d6b/fd5 x:0 0 0
2026-03-09T00:04:08.572 INFO:tasks.workunit.client.0.vm03.stdout:4/733: sync
2026-03-09T00:04:08.572 INFO:tasks.workunit.client.0.vm03.stdout:5/634: sync
2026-03-09T00:04:08.576 INFO:tasks.workunit.client.1.vm06.stdout:1/793: rename d6/d21/d2d/d37/f78 to d6/d21/dfc/f10d 0
2026-03-09T00:04:08.595 INFO:tasks.workunit.client.1.vm06.stdout:1/794: chown d6/d21/d2d/d37/d6d/dd7/ff6 1 1
2026-03-09T00:04:08.595 INFO:tasks.workunit.client.1.vm06.stdout:7/881: creat d0/df/d1a/d35/f106 x:0 0 0
2026-03-09T00:04:08.595 INFO:tasks.workunit.client.0.vm03.stdout:4/734: mkdir d7/d6f/dcf/de8 0
2026-03-09T00:04:08.595 INFO:tasks.workunit.client.0.vm03.stdout:4/735: creat d7/d27/dc9/fe9 x:0 0 0
2026-03-09T00:04:08.595 INFO:tasks.workunit.client.0.vm03.stdout:4/736: rename d7/d20/d29 to d7/d20/d6a/dea 0
2026-03-09T00:04:08.595 INFO:tasks.workunit.client.0.vm03.stdout:4/737: stat d7/d20/d6a/dea/d54/l61 0
2026-03-09T00:04:08.595 INFO:tasks.workunit.client.0.vm03.stdout:4/738: fsync d7/d20/d6a/dea/fa4 0
2026-03-09T00:04:08.598 INFO:tasks.workunit.client.0.vm03.stdout:4/739: dread d7/d20/d6a/dea/d4e/f9d [0,4194304] 0
2026-03-09T00:04:08.608 INFO:tasks.workunit.client.0.vm03.stdout:4/740: creat d7/d20/d35/d66/feb x:0 0 0
2026-03-09T00:04:08.610 INFO:tasks.workunit.client.0.vm03.stdout:4/741: creat d7/d20/d6a/dde/fec x:0 0 0
2026-03-09T00:04:08.618 INFO:tasks.workunit.client.0.vm03.stdout:4/742: dread d7/f1f [0,4194304] 0
2026-03-09T00:04:08.622 INFO:tasks.workunit.client.0.vm03.stdout:4/743: write d7/d20/d6a/dea/d54/fc6 [855559,9745] 0
2026-03-09T00:04:08.622 INFO:tasks.workunit.client.0.vm03.stdout:4/744: chown d7/d20/d6a/dea/d54 32 1
2026-03-09T00:04:08.638 INFO:tasks.workunit.client.1.vm06.stdout:3/873: dwrite d11/f24 [4194304,4194304] 0
2026-03-09T00:04:08.640 INFO:tasks.workunit.client.1.vm06.stdout:3/874: dread d11/f5a [0,4194304] 0
2026-03-09T00:04:08.640 INFO:tasks.workunit.client.1.vm06.stdout:3/875: write d11/d28/d2e/d2f/d5b/d5f/d91/fce [1281844,103904] 0
2026-03-09T00:04:08.645 INFO:tasks.workunit.client.0.vm03.stdout:4/745: write d7/d20/f34 [7364536,42638] 0
2026-03-09T00:04:08.665 INFO:tasks.workunit.client.0.vm03.stdout:4/746: mknod d7/d27/ced 0
2026-03-09T00:04:08.665 INFO:tasks.workunit.client.0.vm03.stdout:4/747: mkdir d7/d6f/dcf/de8/dee 0
2026-03-09T00:04:08.665 INFO:tasks.workunit.client.0.vm03.stdout:4/748: creat d7/d27/dc4/fef x:0 0 0
2026-03-09T00:04:08.665 INFO:tasks.workunit.client.0.vm03.stdout:4/749: unlink d7/d20/d6a/d77/d25/f3e 0
2026-03-09T00:04:08.665 INFO:tasks.workunit.client.0.vm03.stdout:4/750: read d7/f15 [1220846,20986] 0
2026-03-09T00:04:08.667 INFO:tasks.workunit.client.1.vm06.stdout:9/781: dwrite d1/d4/d6e/d14/d25/f6f [0,4194304] 0
2026-03-09T00:04:08.674 INFO:tasks.workunit.client.1.vm06.stdout:9/782: dread d1/d4/d6e/d9/fc3 [0,4194304] 0
2026-03-09T00:04:08.692 INFO:tasks.workunit.client.0.vm03.stdout:0/641: dwrite d2/da/dd/f38 [4194304,4194304] 0
2026-03-09T00:04:08.697 INFO:tasks.workunit.client.0.vm03.stdout:0/642: rmdir d2 39
2026-03-09T00:04:08.697 INFO:tasks.workunit.client.0.vm03.stdout:0/643: chown d2/l86 4151 1
2026-03-09T00:04:08.698 INFO:tasks.workunit.client.0.vm03.stdout:0/644: mknod d2/da/dd/d49/ced 0
2026-03-09T00:04:08.704 INFO:tasks.workunit.client.0.vm03.stdout:0/645: link d2/da/dd/d49/d6c/d4b/fa0 d2/da/dd/d6e/fee 0
2026-03-09T00:04:08.704 INFO:tasks.workunit.client.0.vm03.stdout:0/646: rmdir d2/d5a 39
2026-03-09T00:04:08.704 INFO:tasks.workunit.client.0.vm03.stdout:0/647: getdents d2/da/dd 0
2026-03-09T00:04:08.704 INFO:tasks.workunit.client.0.vm03.stdout:0/648: chown d2/da/dd/d49/d6c/d4b/d55/lb3 262630065 1
2026-03-09T00:04:08.704 INFO:tasks.workunit.client.0.vm03.stdout:0/649: symlink d2/da/d76/d8a/d8f/lef 0
2026-03-09T00:04:08.704 INFO:tasks.workunit.client.0.vm03.stdout:0/650: fdatasync d2/f7f 0
2026-03-09T00:04:08.704 INFO:tasks.workunit.client.0.vm03.stdout:0/651: fsync d2/da/d36/da4/f3f 0
2026-03-09T00:04:08.704 INFO:tasks.workunit.client.0.vm03.stdout:0/652: chown d2/da/dd/d49/d6c/da6/dda/db5/dba/fbc 101 1
2026-03-09T00:04:08.704 INFO:tasks.workunit.client.0.vm03.stdout:0/653: creat d2/da/dd/d49/d6c/d4b/ff0 x:0 0 0
2026-03-09T00:04:08.704 INFO:tasks.workunit.client.0.vm03.stdout:0/654: unlink d2/da/fc8 0
2026-03-09T00:04:08.704 INFO:tasks.workunit.client.0.vm03.stdout:0/655: creat d2/da/dd/d49/d6c/da6/dda/db5/ff1 x:0 0 0
2026-03-09T00:04:08.704 INFO:tasks.workunit.client.0.vm03.stdout:0/656: rmdir d2/da/dd/d49/d6c/db4 0
2026-03-09T00:04:08.704 INFO:tasks.workunit.client.0.vm03.stdout:0/657: creat d2/da/dd/d49/ff2 x:0 0 0
2026-03-09T00:04:08.704 INFO:tasks.workunit.client.0.vm03.stdout:0/658: symlink d2/da/dd/d49/d6c/d4b/d55/d6f/lf3 0
2026-03-09T00:04:08.704 INFO:tasks.workunit.client.0.vm03.stdout:0/659: chown d2/da/d76/d8a/d8f/ldc 229433200 1
2026-03-09T00:04:08.705 INFO:tasks.workunit.client.0.vm03.stdout:0/660: symlink d2/da/d36/ddf/lf4 0
2026-03-09T00:04:08.705 INFO:tasks.workunit.client.0.vm03.stdout:0/661: dread - d2/da/d76/d8a/d8f/db8/fe6 zero size
2026-03-09T00:04:08.705 INFO:tasks.workunit.client.0.vm03.stdout:5/635: dwrite f12 [0,4194304] 0
2026-03-09T00:04:08.706 INFO:tasks.workunit.client.0.vm03.stdout:0/662: rmdir d2/da/dd/d49/d6c 39
2026-03-09T00:04:08.706 INFO:tasks.workunit.client.0.vm03.stdout:0/663: fdatasync d2/da/d1a/fd5 0
2026-03-09T00:04:08.707 INFO:tasks.workunit.client.0.vm03.stdout:8/650: dwrite d7/df/d1a/d40/db3/dba/d38/d60/f6e [0,4194304] 0
2026-03-09T00:04:08.707 INFO:tasks.workunit.client.0.vm03.stdout:7/603: dwrite d2/d1f/d35/f5a [0,4194304] 0
2026-03-09T00:04:08.707 INFO:tasks.workunit.client.0.vm03.stdout:3/483: dwrite d2/db/d40/d51/f5a [0,4194304] 0
2026-03-09T00:04:08.709 INFO:tasks.workunit.client.1.vm06.stdout:2/983: dwrite d7/da/d1c/ff5 [0,4194304] 0
2026-03-09T00:04:08.709 INFO:tasks.workunit.client.0.vm03.stdout:5/636: creat d1c/d51/d6a/d75/fd2 x:0 0 0
2026-03-09T00:04:08.713 INFO:tasks.workunit.client.0.vm03.stdout:4/751: write d7/d20/d6a/dea/d38/f8f [2284834,14528] 0
2026-03-09T00:04:08.719 INFO:tasks.workunit.client.0.vm03.stdout:8/651: unlink d7/df/d1a/d40/db3/dba/d38/d4c/l6a 0
2026-03-09T00:04:08.727 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:08 vm03.local ceph-mon[52346]: pgmap v7: 65 pgs: 65 active+clean; 3.1 GiB data, 10 GiB used, 109 GiB / 120 GiB avail; 86 MiB/s rd, 104 MiB/s wr, 179 op/s
2026-03-09T00:04:08.727 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:08 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:08.727 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:08 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:08.727 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:08 vm03.local ceph-mon[52346]: Reconfiguring prometheus.vm03 (dependencies changed)...
2026-03-09T00:04:08.727 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:08 vm03.local ceph-mon[52346]: Reconfiguring daemon prometheus.vm03 on vm03
2026-03-09T00:04:08.730 INFO:tasks.workunit.client.1.vm06.stdout:0/915: dwrite d3/d18/f59 [0,4194304] 0
2026-03-09T00:04:08.738 INFO:tasks.workunit.client.1.vm06.stdout:2/984: link d7/d1b/f37 d7/d1b/d71/d79/db4/dc1/f130 0
2026-03-09T00:04:08.740 INFO:tasks.workunit.client.1.vm06.stdout:2/985: chown d7/d1b/f46 459 1
2026-03-09T00:04:08.740 INFO:tasks.workunit.client.1.vm06.stdout:2/986: chown d7/da/d63/d81/dfe/db2/lbc 29695555 1
2026-03-09T00:04:08.740 INFO:tasks.workunit.client.1.vm06.stdout:2/987: write d7/da/db/de/f60 [706566,92277] 0
2026-03-09T00:04:08.742 INFO:tasks.workunit.client.0.vm03.stdout:2/656: dwrite d8/d1b/f47 [0,4194304] 0
2026-03-09T00:04:08.747 INFO:tasks.workunit.client.1.vm06.stdout:0/916: getdents d3/d18/d2c/d2d/d74/dc7 0
2026-03-09T00:04:08.747 INFO:tasks.workunit.client.1.vm06.stdout:0/917: dread - d3/d18/d2c/d2d/d74/d90/fac zero size
2026-03-09T00:04:08.748 INFO:tasks.workunit.client.0.vm03.stdout:5/637: symlink d1c/d20/d55/d66/d6b/d8f/dca/ld3 0
2026-03-09T00:04:08.750 INFO:tasks.workunit.client.0.vm03.stdout:0/664: rename d2/da/d36/da4/c31 to d2/da/dd/d49/d6c/d4b/d55/d6f/cf5 0
2026-03-09T00:04:08.751 INFO:tasks.workunit.client.1.vm06.stdout:6/849: sync
2026-03-09T00:04:08.755 INFO:tasks.workunit.client.1.vm06.stdout:8/886: sync
2026-03-09T00:04:08.755 INFO:tasks.workunit.client.1.vm06.stdout:4/883: sync
2026-03-09T00:04:08.755 INFO:tasks.workunit.client.1.vm06.stdout:0/918: unlink d3/d18/d1f/d39/d49/f50 0
2026-03-09T00:04:08.755 INFO:tasks.workunit.client.1.vm06.stdout:0/919: truncate d3/d18/d1f/d39/d3b/fa4 396652 0
2026-03-09T00:04:08.755 INFO:tasks.workunit.client.1.vm06.stdout:6/850: mkdir d4/d16/d53/ddf/d52/d102 0
2026-03-09T00:04:08.755 INFO:tasks.workunit.client.1.vm06.stdout:6/851: creat d4/d27/f103 x:0 0 0
2026-03-09T00:04:08.760 INFO:tasks.workunit.client.0.vm03.stdout:9/664: dwrite d15/d1c/d28/d6e/fa9 [0,4194304] 0
2026-03-09T00:04:08.760 INFO:tasks.workunit.client.0.vm03.stdout:9/665: write d15/d1c/d36/d4d/dc4/f9d [166746,53639] 0
2026-03-09T00:04:08.771 INFO:tasks.workunit.client.0.vm03.stdout:1/729: dwrite d4/d6/f6e [0,4194304] 0
2026-03-09T00:04:08.783 INFO:tasks.workunit.client.0.vm03.stdout:0/665: write d2/fe [776961,74093] 0
2026-03-09T00:04:08.784 INFO:tasks.workunit.client.1.vm06.stdout:8/887: creat db/dd/d84/df1/f120 x:0 0 0
2026-03-09T00:04:08.784 INFO:tasks.workunit.client.1.vm06.stdout:8/888: chown db/d74/d87/d100/d10a/fbc 245 1
2026-03-09T00:04:08.784 INFO:tasks.workunit.client.1.vm06.stdout:5/965: dwrite d5/d1c/d21/f3c [4194304,4194304] 0
2026-03-09T00:04:08.786 INFO:tasks.workunit.client.0.vm03.stdout:2/657: creat d8/d1b/d8f/fd6 x:0 0 0
2026-03-09T00:04:08.788 INFO:tasks.workunit.client.0.vm03.stdout:7/604: write d2/f73 [547069,92101] 0
2026-03-09T00:04:08.799 INFO:tasks.workunit.client.0.vm03.stdout:3/484: mknod d2/c8f 0
2026-03-09T00:04:08.801 INFO:tasks.workunit.client.0.vm03.stdout:8/652: dwrite d7/df/d1a/f2e [0,4194304] 0
2026-03-09T00:04:08.801 INFO:tasks.workunit.client.0.vm03.stdout:8/653: stat d7/df/d1a/d40/d9d/da9 0
2026-03-09T00:04:08.801 INFO:tasks.workunit.client.0.vm03.stdout:8/654: chown d7/df/f37 2 1
2026-03-09T00:04:08.804 INFO:tasks.workunit.client.1.vm06.stdout:6/852: rename d4/d16/d53/ddf/d7e/dac/dd3/d101/lce to d4/d16/d53/ddf/d52/d102/l104 0
2026-03-09T00:04:08.810 INFO:tasks.workunit.client.1.vm06.stdout:7/882: dread d0/df/d17/f38 [4194304,4194304] 0
2026-03-09T00:04:08.810 INFO:tasks.workunit.client.1.vm06.stdout:7/883: chown d0/df/d1a/d27/d4c/d40/d51/d90/dae/cf5 88410453 1
2026-03-09T00:04:08.811 INFO:tasks.workunit.client.1.vm06.stdout:7/884: dread d0/df/d17/f2d [0,4194304] 0
2026-03-09T00:04:08.824 INFO:tasks.workunit.client.0.vm03.stdout:5/638: rename d1c/f30 to d1c/d20/dc0/fd4 0
2026-03-09T00:04:08.825 INFO:tasks.workunit.client.1.vm06.stdout:7/885: dread d0/df/d1a/d27/d4c/d40/d51/d90/dae/fc9 [0,4194304] 0
2026-03-09T00:04:08.825 INFO:tasks.workunit.client.1.vm06.stdout:7/886: chown d0/df/d1a/d27/d70/fc7 1 1
2026-03-09T00:04:08.825 INFO:tasks.workunit.client.1.vm06.stdout:7/887: chown d0/df/d17/f2d 509794534 1
2026-03-09T00:04:08.825 INFO:tasks.workunit.client.1.vm06.stdout:5/966: link d5/d1c/d23/d34/d47/f127 d5/d1c/d68/da2/f144 0
2026-03-09T00:04:08.825 INFO:tasks.workunit.client.1.vm06.stdout:4/884: rmdir d17/d24/d3b/d75 39
2026-03-09T00:04:08.825 INFO:tasks.workunit.client.1.vm06.stdout:8/889: rename db/d74/d78/d98/db6/dc7/d101/ca5 to db/d74/d78/d98/db6/dc7/d10f/c121 0
2026-03-09T00:04:08.827 INFO:tasks.workunit.client.0.vm03.stdout:0/666: stat d2/da/dd/l98 0
2026-03-09T00:04:08.829 INFO:tasks.workunit.client.1.vm06.stdout:5/967: rename d5/f1d to d5/d1c/d68/dec/d115/d11e/d92/d95/d12e/f145 0
2026-03-09T00:04:08.829 INFO:tasks.workunit.client.1.vm06.stdout:5/968: dread - d5/d1c/d68/da2/f13b zero size
2026-03-09T00:04:08.829 INFO:tasks.workunit.client.1.vm06.stdout:5/969: fsync d5/d1c/d68/dec/d115/d11e/d92/d49/fc2 0
2026-03-09T00:04:08.829 INFO:tasks.workunit.client.1.vm06.stdout:7/888: write d0/df/d1a/f25 [2267664,98500] 0
2026-03-09T00:04:08.830 INFO:tasks.workunit.client.0.vm03.stdout:2/658: unlink d8/c28 0
2026-03-09T00:04:08.830 INFO:tasks.workunit.client.0.vm03.stdout:2/659: fdatasync d8/d1b/d2a/d6b/fd5 0
2026-03-09T00:04:08.831 INFO:tasks.workunit.client.1.vm06.stdout:6/853: getdents d4/d16/d53/ddf/da6/dbb 0
2026-03-09T00:04:08.838 INFO:tasks.workunit.client.0.vm03.stdout:3/485: link d2/db/d2d/f45 d2/db/d3b/d5f/d65/f90 0
2026-03-09T00:04:08.840 INFO:tasks.workunit.client.1.vm06.stdout:8/890: dread db/d53/d6d/d7b/f9a [0,4194304] 0
2026-03-09T00:04:08.842 INFO:tasks.workunit.client.1.vm06.stdout:1/795: dwrite d6/d21/d2d/d37/d6d/dd7/ff6 [0,4194304] 0
2026-03-09T00:04:08.842 INFO:tasks.workunit.client.1.vm06.stdout:1/796: fsync d6/f1b 0
2026-03-09T00:04:08.846 INFO:tasks.workunit.client.1.vm06.stdout:1/797: write d6/d21/ff2 [140351,61007] 0
2026-03-09T00:04:08.856 INFO:tasks.workunit.client.1.vm06.stdout:7/889: link d0/df/d1a/d27/f60 d0/df/d1a/d27/d70/d9b/f107 0
2026-03-09T00:04:08.857 INFO:tasks.workunit.client.0.vm03.stdout:5/639: link d1c/d20/d55/d4f/d58/db5/f6f d1c/d20/d55/d4f/d58/d73/d76/fd5 0
2026-03-09T00:04:08.860 INFO:tasks.workunit.client.1.vm06.stdout:8/891: creat db/dd/de3/f122 x:0 0 0
2026-03-09T00:04:08.861 INFO:tasks.workunit.client.1.vm06.stdout:8/892: readlink db/dd/l18 0
2026-03-09T00:04:08.861 INFO:tasks.workunit.client.1.vm06.stdout:8/893: read - db/d74/d87/d100/d10a/fcf zero size
2026-03-09T00:04:08.865 INFO:tasks.workunit.client.1.vm06.stdout:8/894: mknod db/dd/d48/c123 0
2026-03-09T00:04:08.867 INFO:tasks.workunit.client.0.vm03.stdout:1/730: rmdir d4/d3a/d43 39
2026-03-09T00:04:08.867 INFO:tasks.workunit.client.0.vm03.stdout:1/731: chown d4/d3a/d61/d78/l99 2975091 1
2026-03-09T00:04:08.867 INFO:tasks.workunit.client.0.vm03.stdout:1/732: chown d4/d3a/d32/d87/db3 0 1
2026-03-09T00:04:08.871 INFO:tasks.workunit.client.0.vm03.stdout:3/486: link d2/l19 d2/db/d3b/l91 0
2026-03-09T00:04:08.874 INFO:tasks.workunit.client.0.vm03.stdout:1/733: dread d4/d6/d52/db5/ff2 [0,4194304] 0
2026-03-09T00:04:08.876 INFO:tasks.workunit.client.0.vm03.stdout:3/487: dread d2/f9 [0,4194304] 0
2026-03-09T00:04:08.886 INFO:tasks.workunit.client.0.vm03.stdout:6/612: dwrite d13/d35/d4c/d62/f9a [0,4194304] 0
2026-03-09T00:04:08.887 INFO:tasks.workunit.client.1.vm06.stdout:8/895: read db/f17 [3219129,55469] 0
2026-03-09T00:04:08.888 INFO:tasks.workunit.client.1.vm06.stdout:2/988: dwrite d7/d1a/d89/fb7 [0,4194304] 0
2026-03-09T00:04:08.890 INFO:tasks.workunit.client.1.vm06.stdout:8/896: mknod db/d74/d78/d98/c124 0
2026-03-09T00:04:08.902 INFO:tasks.workunit.client.1.vm06.stdout:2/989: read d7/da/db/de/f32 [1558535,7588] 0
2026-03-09T00:04:08.903 INFO:tasks.workunit.client.1.vm06.stdout:2/990: write d7/da/d63/d81/dfe/fe3 [176804,50321] 0
2026-03-09T00:04:08.910 INFO:tasks.workunit.client.0.vm03.stdout:1/734: rmdir d4/d3a/d43/daf 39
2026-03-09T00:04:08.911 INFO:tasks.workunit.client.0.vm03.stdout:3/488: mknod d2/db/d40/d58/c92 0
2026-03-09T00:04:08.916 INFO:tasks.workunit.client.0.vm03.stdout:6/613: mknod d13/ccb 0
2026-03-09T00:04:08.916 INFO:tasks.workunit.client.0.vm03.stdout:5/640: getdents d1c/d20/d55/d4f/d58/d73/d9e/da5 0
2026-03-09T00:04:08.916 INFO:tasks.workunit.client.1.vm06.stdout:2/991: write d7/d1a/d25/fae [555099,53016] 0
2026-03-09T00:04:08.918 INFO:tasks.workunit.client.0.vm03.stdout:1/735: mkdir d4/d3a/d3d/d46/df7 0
2026-03-09T00:04:08.918 INFO:tasks.workunit.client.0.vm03.stdout:1/736: chown d4/d3a/d3d/d98/dee/d9e 34108695 1
2026-03-09T00:04:08.919 INFO:tasks.workunit.client.0.vm03.stdout:6/614: unlink d13/d35/d72/l80 0
2026-03-09T00:04:08.919 INFO:tasks.workunit.client.0.vm03.stdout:5/641: symlink d1c/d20/d56/ld6 0
2026-03-09T00:04:08.919 INFO:tasks.workunit.client.0.vm03.stdout:5/642: fsync d1c/d20/d55/f34 0
2026-03-09T00:04:08.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:08 vm06.local ceph-mon[58395]: pgmap v7: 65 pgs: 65 active+clean; 3.1 GiB data, 10 GiB used, 109 GiB / 120 GiB avail; 86 MiB/s rd, 104 MiB/s wr, 179 op/s
2026-03-09T00:04:08.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:08 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:08.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:08 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:08.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:08 vm06.local ceph-mon[58395]: Reconfiguring prometheus.vm03 (dependencies changed)...
2026-03-09T00:04:08.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:08 vm06.local ceph-mon[58395]: Reconfiguring daemon prometheus.vm03 on vm03
2026-03-09T00:04:08.922 INFO:tasks.workunit.client.0.vm03.stdout:1/737: unlink d4/d15/lcc 0
2026-03-09T00:04:08.922 INFO:tasks.workunit.client.0.vm03.stdout:1/738: chown d4/d3a/d3d/d98/dee/cc5 2 1
2026-03-09T00:04:08.923 INFO:tasks.workunit.client.1.vm06.stdout:5/970: dread d5/f43 [0,4194304] 0
2026-03-09T00:04:08.923 INFO:tasks.workunit.client.1.vm06.stdout:5/971: creat d5/d1c/d21/d28/d5e/d66/d78/dd5/f146 x:0 0 0
2026-03-09T00:04:08.923 INFO:tasks.workunit.client.0.vm03.stdout:6/615: mkdir d13/d35/d72/dcc 0
2026-03-09T00:04:08.923 INFO:tasks.workunit.client.0.vm03.stdout:2/660: write d8/d26/d5e/d5f/f60 [4096458,108043] 0
2026-03-09T00:04:08.923 INFO:tasks.workunit.client.0.vm03.stdout:2/661: truncate d8/d26/f5a 4977166 0
2026-03-09T00:04:08.928 INFO:tasks.workunit.client.0.vm03.stdout:9/666: dwrite d15/d1c/d21/d64/fac [0,4194304] 0
2026-03-09T00:04:08.928 INFO:tasks.workunit.client.0.vm03.stdout:9/667: creat d15/d1c/d21/fdc x:0 0 0
2026-03-09T00:04:08.930 INFO:tasks.workunit.client.0.vm03.stdout:5/643: creat d1c/d20/d55/db0/dc7/fd7 x:0 0 0
2026-03-09T00:04:08.930 INFO:tasks.workunit.client.0.vm03.stdout:5/644: stat d1c/d51/d6a/d75 0
2026-03-09T00:04:08.930 INFO:tasks.workunit.client.0.vm03.stdout:5/645: truncate d1c/d51/d6a/fc2 425820 0
2026-03-09T00:04:08.931 INFO:tasks.workunit.client.1.vm06.stdout:5/972: symlink d5/d44/d84/l147 0
2026-03-09T00:04:08.932 INFO:tasks.workunit.client.0.vm03.stdout:1/739: rename d4/f12 to d4/d15/d77/ff8 0
2026-03-09T00:04:08.935 INFO:tasks.workunit.client.0.vm03.stdout:6/616: link d13/d35/l4b d13/d35/d74/d89/lcd 0
2026-03-09T00:04:08.936 INFO:tasks.workunit.client.0.vm03.stdout:5/646: dread d1c/d20/d55/d4f/d58/d5d/faa [0,4194304] 0
2026-03-09T00:04:08.936 INFO:tasks.workunit.client.0.vm03.stdout:5/647: fsync d1c/d20/d55/d4f/d58/db5/f7f 0
2026-03-09T00:04:08.936 INFO:tasks.workunit.client.0.vm03.stdout:5/648: write d1c/d20/d55/d4f/d58/db5/f3c [594698,35799] 0
2026-03-09T00:04:08.937 INFO:tasks.workunit.client.1.vm06.stdout:5/973: creat d5/d1c/d21/d28/d5e/d66/dab/f148 x:0 0 0
2026-03-09T00:04:08.939 INFO:tasks.workunit.client.0.vm03.stdout:2/662: mkdir d8/d1b/d6c/dd7 0
2026-03-09T00:04:08.939 INFO:tasks.workunit.client.1.vm06.stdout:5/974: creat d5/d1c/d21/d28/d12f/f149 x:0 0 0
2026-03-09T00:04:08.940 INFO:tasks.workunit.client.1.vm06.stdout:5/975: mkdir d5/d1c/d21/d28/d14a 0
2026-03-09T00:04:08.940 INFO:tasks.workunit.client.1.vm06.stdout:5/976: read d5/fae [733696,4581] 0
2026-03-09T00:04:08.940 INFO:tasks.workunit.client.1.vm06.stdout:5/977: dread - d5/d44/d84/f118 zero size
2026-03-09T00:04:08.941 INFO:tasks.workunit.client.0.vm03.stdout:5/649: creat d1c/d20/d55/d4f/d58/d73/fd8 x:0 0 0
2026-03-09T00:04:08.943 INFO:tasks.workunit.client.0.vm03.stdout:2/663: rmdir d8/d1b/d2a/d6b/dc6 39
2026-03-09T00:04:08.945 INFO:tasks.workunit.client.1.vm06.stdout:5/978: rename d5/d1c/d68/dec/d115/d11e/d92/c5f to d5/d44/c14b 0
2026-03-09T00:04:08.945 INFO:tasks.workunit.client.1.vm06.stdout:5/979: read - d5/d44/d84/dc5/de8/f116 zero size
2026-03-09T00:04:08.945 INFO:tasks.workunit.client.0.vm03.stdout:6/617: dread d13/d1e/d44/d4a/f58 [0,4194304] 0
2026-03-09T00:04:08.948 INFO:tasks.workunit.client.0.vm03.stdout:5/650: link d1c/d20/d55/d4f/d58/db5/f7f d1c/d20/d56/fd9 0
2026-03-09T00:04:08.951 INFO:tasks.workunit.client.0.vm03.stdout:2/664: symlink d8/d26/d5e/d6f/d97/ld8 0
2026-03-09T00:04:08.951 INFO:tasks.workunit.client.0.vm03.stdout:2/665: fdatasync d8/d1b/d2a/d6b/d50/f54 0
2026-03-09T00:04:08.952 INFO:tasks.workunit.client.0.vm03.stdout:2/666: fdatasync d8/d1b/d2a/f33 0
2026-03-09T00:04:08.952 INFO:tasks.workunit.client.0.vm03.stdout:2/667: write d8/d74/fc7 [2533286,61151] 0
2026-03-09T00:04:08.952 INFO:tasks.workunit.client.1.vm06.stdout:5/980: dread d5/d1c/d21/d28/d5e/d66/d78/da6/fef [0,4194304] 0
2026-03-09T00:04:08.953 INFO:tasks.workunit.client.1.vm06.stdout:5/981: dread - d5/d44/d84/dc5/de8/f116 zero size
2026-03-09T00:04:08.957 INFO:tasks.workunit.client.0.vm03.stdout:1/740: dread d4/d3a/d43/f47 [0,4194304] 0
2026-03-09T00:04:08.957 INFO:tasks.workunit.client.0.vm03.stdout:1/741: truncate d4/d15/d5c/f5f 414820 0
2026-03-09T00:04:08.959 INFO:tasks.workunit.client.0.vm03.stdout:5/651: creat d1c/d67/fda x:0 0 0
2026-03-09T00:04:08.959 INFO:tasks.workunit.client.0.vm03.stdout:5/652: chown d1c/d20/d55/fbc 39495658 1
2026-03-09T00:04:08.960 INFO:tasks.workunit.client.0.vm03.stdout:5/653: creat d1c/d20/d55/fdb x:0 0 0
2026-03-09T00:04:08.960 INFO:tasks.workunit.client.1.vm06.stdout:0/920: dwrite d3/f29 [0,4194304] 0
2026-03-09T00:04:08.960 INFO:tasks.workunit.client.0.vm03.stdout:8/655: dwrite d7/df/d1a/d40/d58/fbc [0,4194304] 0
2026-03-09T00:04:08.960 INFO:tasks.workunit.client.0.vm03.stdout:8/656: fdatasync d7/df/f31 0
2026-03-09T00:04:08.961 INFO:tasks.workunit.client.1.vm06.stdout:3/876: dwrite d11/d28/d2e/d2f/d5b/d5f/f60 [0,4194304] 0
2026-03-09T00:04:08.963 INFO:tasks.workunit.client.0.vm03.stdout:5/654: truncate d1c/d51/f68 3998886 0
2026-03-09T00:04:08.974 INFO:tasks.workunit.client.1.vm06.stdout:0/921: link d3/d18/de9/l10e d3/d18/d2c/d2d/d74/dc7/d110/d12d/l139 0
2026-03-09T00:04:08.974 INFO:tasks.workunit.client.1.vm06.stdout:5/982: dread d5/d1c/d21/d28/d5e/d66/d78/fc1 [0,4194304] 0
2026-03-09T00:04:08.974 INFO:tasks.workunit.client.1.vm06.stdout:5/983: chown d5/d1c/d23/d34/d47/cac 145510 1
2026-03-09T00:04:08.974 INFO:tasks.workunit.client.1.vm06.stdout:5/984: chown d5/f3d 807786022 1
2026-03-09T00:04:08.975 INFO:tasks.workunit.client.0.vm03.stdout:1/742: rmdir d4/d6 39
2026-03-09T00:04:08.975 INFO:tasks.workunit.client.0.vm03.stdout:8/657: mknod d7/df/d1a/d40/db3/dba/d38/d91/cc7 0
2026-03-09T00:04:08.978 INFO:tasks.workunit.client.1.vm06.stdout:5/985: unlink d5/d1c/d23/f42 0
2026-03-09T00:04:08.978 INFO:tasks.workunit.client.1.vm06.stdout:0/922: getdents d3/d18/d2c/d2d/d74/daf/de3 0
2026-03-09T00:04:08.978 INFO:tasks.workunit.client.1.vm06.stdout:0/923: chown d3/d18/d2c/d2d/d74/d90 3566833 1
2026-03-09T00:04:08.979 INFO:tasks.workunit.client.1.vm06.stdout:5/986: write d5/d1c/d23/d34/d47/f61 [731916,79228] 0
2026-03-09T00:04:08.979 INFO:tasks.workunit.client.1.vm06.stdout:5/987: creat d5/d1c/d21/d28/d5e/d66/d78/dd5/f14c x:0 0 0
2026-03-09T00:04:08.979 INFO:tasks.workunit.client.0.vm03.stdout:1/743: rename d4/d3a/d3d/c97 to d4/d15/d77/dce/df6/cf9 0
2026-03-09T00:04:08.980 INFO:tasks.workunit.client.1.vm06.stdout:0/924: rename d3/d18/d1f/d44/cfc to d3/d18/d2c/d2d/c13a 0
2026-03-09T00:04:08.981 INFO:tasks.workunit.client.0.vm03.stdout:1/744: creat d4/d3a/d3d/d46/df7/ffa x:0 0 0
2026-03-09T00:04:08.981 INFO:tasks.workunit.client.0.vm03.stdout:1/745: chown d4/d3a 11608 1
2026-03-09T00:04:08.982 INFO:tasks.workunit.client.0.vm03.stdout:1/746: mkdir d4/d15/d1a/dfb 0
2026-03-09T00:04:08.982 INFO:tasks.workunit.client.0.vm03.stdout:1/747: dread - d4/d3a/d32/d87/fd5 zero size
2026-03-09T00:04:08.983 INFO:tasks.workunit.client.1.vm06.stdout:5/988: creat d5/d1c/d21/d28/d5e/d66/dab/d11b/f14d x:0 0 0
2026-03-09T00:04:08.983 INFO:tasks.workunit.client.0.vm03.stdout:1/748: creat d4/d3a/d3d/d98/dee/d93/ffc x:0 0 0
2026-03-09T00:04:08.983 INFO:tasks.workunit.client.0.vm03.stdout:1/749: chown d4/d15/f17 137 1
2026-03-09T00:04:08.983 INFO:tasks.workunit.client.0.vm03.stdout:1/750: chown d4 160 1
2026-03-09T00:04:08.984 INFO:tasks.workunit.client.1.vm06.stdout:0/925: mkdir d3/d18/d2c/d13b 0
2026-03-09T00:04:08.984 INFO:tasks.workunit.client.0.vm03.stdout:1/751: creat d4/d15/ffd x:0 0 0
2026-03-09T00:04:08.984 INFO:tasks.workunit.client.0.vm03.stdout:1/752: fsync d4/d15/d77/f7c 0
2026-03-09T00:04:08.984 INFO:tasks.workunit.client.0.vm03.stdout:1/753: write d4/d15/d77/dce/df6/fec [159286,128237] 0
2026-03-09T00:04:08.988 INFO:tasks.workunit.client.1.vm06.stdout:4/885: dwrite d17/d24/f2c [4194304,4194304] 0
2026-03-09T00:04:08.988 INFO:tasks.workunit.client.1.vm06.stdout:4/886: dread - d17/d21/f11b zero size
2026-03-09T00:04:08.988 INFO:tasks.workunit.client.1.vm06.stdout:4/887: chown d17/d24/d3b/d97/db7/d121/l12a 312116 1
2026-03-09T00:04:08.989 INFO:tasks.workunit.client.0.vm03.stdout:1/754: dread d4/d15/d5c/fb1 [0,4194304] 0
2026-03-09T00:04:08.989 INFO:tasks.workunit.client.0.vm03.stdout:1/755: chown d4/d15/d77/d8c/cb4 28905 1
2026-03-09T00:04:08.993 INFO:tasks.workunit.client.1.vm06.stdout:4/888: mkdir d17/d24/d3b/dbf/ddf/df5/d133 0
2026-03-09T00:04:08.993 INFO:tasks.workunit.client.1.vm06.stdout:0/926: getdents d3/d10f 0
2026-03-09T00:04:08.993 INFO:tasks.workunit.client.1.vm06.stdout:0/927: fdatasync d3/d18/d2c/f4d 0
2026-03-09T00:04:08.993 INFO:tasks.workunit.client.1.vm06.stdout:0/928: write d3/d18/f112 [554606,116823] 0
2026-03-09T00:04:08.993 INFO:tasks.workunit.client.1.vm06.stdout:0/929: read - d3/d18/d1f/d39/fe0 zero size
2026-03-09T00:04:08.993 INFO:tasks.workunit.client.1.vm06.stdout:0/930: stat d3/d18/d2c/d2d/d74/da8 0
2026-03-09T00:04:08.994 INFO:tasks.workunit.client.1.vm06.stdout:4/889: write d17/d5b/d8f/fd3 [312188,86121] 0
2026-03-09T00:04:08.995 INFO:tasks.workunit.client.1.vm06.stdout:0/931: mknod d3/d18/d2c/d2d/d74/dc7/d110/d45/c13c 0
2026-03-09T00:04:08.996 INFO:tasks.workunit.client.1.vm06.stdout:0/932: link d3/d10f/f121 d3/d18/d2c/d2d/d74/d90/f13d 0
2026-03-09T00:04:08.996 INFO:tasks.workunit.client.1.vm06.stdout:0/933: write d3/d18/d2c/d2d/d74/daf/f11b [836259,31389] 0
2026-03-09T00:04:08.998 INFO:tasks.workunit.client.0.vm03.stdout:1/756: dread d4/d3a/d3d/f58 [0,4194304] 0
2026-03-09T00:04:09.007 INFO:tasks.workunit.client.0.vm03.stdout:1/757: creat d4/d3a/d61/d78/dd8/ffe x:0 0 0
2026-03-09T00:04:09.007 INFO:tasks.workunit.client.0.vm03.stdout:1/758: creat d4/d15/d5c/fff x:0 0 0
2026-03-09T00:04:09.007 INFO:tasks.workunit.client.0.vm03.stdout:1/759: mknod d4/d6/d52/db5/c100 0
2026-03-09T00:04:09.007 INFO:tasks.workunit.client.0.vm03.stdout:1/760: mkdir d4/d15/dae/d101 0
2026-03-09T00:04:09.007 INFO:tasks.workunit.client.0.vm03.stdout:1/761: chown d4/d3a/d3d/d98/dee/cbc 3428 1
2026-03-09T00:04:09.008 INFO:tasks.workunit.client.0.vm03.stdout:1/762: symlink d4/l102 0
2026-03-09T00:04:09.008 INFO:tasks.workunit.client.0.vm03.stdout:1/763: chown d4/d3a/f41 387411761 1
2026-03-09T00:04:09.008 INFO:tasks.workunit.client.0.vm03.stdout:1/764: readlink d4/lb2 0
2026-03-09T00:04:09.009 INFO:tasks.workunit.client.0.vm03.stdout:1/765: mkdir d4/d15/d5c/d103 0
2026-03-09T00:04:09.009 INFO:tasks.workunit.client.1.vm06.stdout:3/877: dread d11/d28/d2e/d7e/d83/fe8 [0,4194304] 0
2026-03-09T00:04:09.009 INFO:tasks.workunit.client.1.vm06.stdout:3/878: write d11/d28/d2e/db2/dc2/f108 [311336,107833] 0
2026-03-09T00:04:09.018 INFO:tasks.workunit.client.0.vm03.stdout:9/668: dread f10 [0,4194304] 0
2026-03-09T00:04:09.018 INFO:tasks.workunit.client.0.vm03.stdout:9/669: link d15/d1c/d28/d6e/l8e d15/db6/ldd 0
2026-03-09T00:04:09.087 INFO:tasks.workunit.client.1.vm06.stdout:8/897: dwrite db/dd/f86 [0,4194304] 0
2026-03-09T00:04:09.089 INFO:tasks.workunit.client.1.vm06.stdout:8/898: rename db/d1e/d46/d94 to db/dd/d24/da7/d125 0
2026-03-09T00:04:09.118 INFO:tasks.workunit.client.0.vm03.stdout:0/667: dwrite d2/da/dd/f7b [0,4194304] 0
2026-03-09T00:04:09.118 INFO:tasks.workunit.client.0.vm03.stdout:0/668: write d2/da/d4e/faa [4946064,87486] 0
2026-03-09T00:04:09.138 INFO:tasks.workunit.client.1.vm06.stdout:6/854: dwrite d4/d16/f33 [0,4194304] 0
2026-03-09T00:04:09.140 INFO:tasks.workunit.client.0.vm03.stdout:4/752: dwrite d7/f28 [0,4194304] 0
2026-03-09T00:04:09.140 INFO:tasks.workunit.client.0.vm03.stdout:4/753: dread - d7/d20/d6a/dea/fd7 zero size
2026-03-09T00:04:09.149 INFO:tasks.workunit.client.0.vm03.stdout:2/668: dwrite d8/d1b/f71 [0,4194304] 0
2026-03-09T00:04:09.150 INFO:tasks.workunit.client.0.vm03.stdout:2/669: stat d8/d1b/d2a/d6b/l5b 0
2026-03-09T00:04:09.168 INFO:tasks.workunit.client.0.vm03.stdout:5/655: read d1c/d20/d56/d74/f9a [2423859,13088] 0
2026-03-09T00:04:09.171 INFO:tasks.workunit.client.1.vm06.stdout:6/855: link d4/d16/d53/ddf/da6/dbb/f9b d4/d16/d46/d90/f105 0
2026-03-09T00:04:09.181 INFO:tasks.workunit.client.0.vm03.stdout:6/618: dwrite d13/d35/d69/f84 [0,4194304] 0
2026-03-09T00:04:09.181 INFO:tasks.workunit.client.0.vm03.stdout:6/619: dread - d13/d35/d74/fc5 zero size
2026-03-09T00:04:09.190 INFO:tasks.workunit.client.0.vm03.stdout:3/489: dwrite d2/db/d40/d44/f4d [0,4194304] 0
2026-03-09T00:04:09.192 INFO:tasks.workunit.client.1.vm06.stdout:6/856: write d4/fc [182273,95476] 0
2026-03-09T00:04:09.192 INFO:tasks.workunit.client.0.vm03.stdout:5/656: rmdir d1c/d20/d55/d4f/d58 39
2026-03-09T00:04:09.195 INFO:tasks.workunit.client.1.vm06.stdout:3/879: dwrite d11/f1e [0,4194304] 0
2026-03-09T00:04:09.195 INFO:tasks.workunit.client.1.vm06.stdout:3/880: read - d11/d28/f122 zero size
2026-03-09T00:04:09.199 INFO:tasks.workunit.client.1.vm06.stdout:6/857: creat d4/d8d/f106 x:0 0 0
2026-03-09T00:04:09.202 INFO:tasks.workunit.client.0.vm03.stdout:1/766: dwrite d4/d3a/f2c [0,4194304] 0
2026-03-09T00:04:09.202 INFO:tasks.workunit.client.1.vm06.stdout:3/881: creat d11/d28/d2e/db2/d100/f131 x:0 0 0
2026-03-09T00:04:09.202 INFO:tasks.workunit.client.1.vm06.stdout:3/882: fdatasync d11/d28/d2e/db2/f116 0
2026-03-09T00:04:09.203 INFO:tasks.workunit.client.0.vm03.stdout:3/490: truncate d2/db/d2d/f8b 2878287 0
2026-03-09T00:04:09.204 INFO:tasks.workunit.client.1.vm06.stdout:6/858: creat d4/d16/d53/d67/f107 x:0 0 0
2026-03-09T00:04:09.208 INFO:tasks.workunit.client.0.vm03.stdout:1/767: mkdir d4/d3a/d8f/d104 0
2026-03-09T00:04:09.210 INFO:tasks.workunit.client.1.vm06.stdout:3/883: chown d11/d3f/d8d/l95 6108 1
2026-03-09T00:04:09.211 INFO:tasks.workunit.client.1.vm06.stdout:4/890: dwrite d17/d24/d3b/dbf/dea/f122 [0,4194304] 0
2026-03-09T00:04:09.212 INFO:tasks.workunit.client.1.vm06.stdout:6/859: mkdir d4/d16/d53/ddf/d7e/dac/dcd/d108 0
2026-03-09T00:04:09.220 INFO:tasks.workunit.client.1.vm06.stdout:6/860: rmdir d4/d16/d53/ddf/dc8 39
2026-03-09T00:04:09.273 INFO:tasks.workunit.client.0.vm03.stdout:8/658: dwrite d7/df/d1a/d40/db3/dba/d38/d91/fa5 [0,4194304] 0
2026-03-09T00:04:09.273 INFO:tasks.workunit.client.0.vm03.stdout:8/659: chown d7/df/d1a/d40/db3/dba/f24 30708 1
2026-03-09T00:04:09.273 INFO:tasks.workunit.client.1.vm06.stdout:8/899: dwrite db/d74/d87/d100/d10a/fcf [0,4194304] 0
2026-03-09T00:04:09.273 INFO:tasks.workunit.client.1.vm06.stdout:0/934: dwrite d3/f51 [0,4194304] 0
2026-03-09T00:04:09.273 INFO:tasks.workunit.client.1.vm06.stdout:0/935: read - d3/d18/d1f/d39/d3b/fbb zero size
2026-03-09T00:04:09.273 INFO:tasks.workunit.client.1.vm06.stdout:0/936: dread - d3/d10f/fcb zero size
2026-03-09T00:04:09.273 INFO:tasks.workunit.client.1.vm06.stdout:0/937: write d3/d18/d1f/fe2 [970852,90152] 0
2026-03-09T00:04:09.273 INFO:tasks.workunit.client.1.vm06.stdout:0/938: truncate d3/d18/d1f/d39/d3b/df9/df2/d73/fb8 128209 0
2026-03-09T00:04:09.273 INFO:tasks.workunit.client.1.vm06.stdout:0/939: fsync d3/d18/d1f/d39/fb1 0
2026-03-09T00:04:09.273 INFO:tasks.workunit.client.1.vm06.stdout:0/940: fdatasync d3/d18/d2c/d2d/d74/dc7/f132 0
2026-03-09T00:04:09.275 INFO:tasks.workunit.client.0.vm03.stdout:6/620: dwrite d13/d35/f9e [0,4194304] 0
2026-03-09T00:04:09.280 INFO:tasks.workunit.client.0.vm03.stdout:6/621: write d13/d35/d4c/f99 [3417070,114215] 0
2026-03-09T00:04:09.283 INFO:tasks.workunit.client.0.vm03.stdout:6/622: dread d13/d1e/d44/d4a/f58 [0,4194304] 0
2026-03-09T00:04:09.283 INFO:tasks.workunit.client.0.vm03.stdout:6/623: write d13/d35/d74/fc5 [530455,128397] 0
2026-03-09T00:04:09.285 INFO:tasks.workunit.client.0.vm03.stdout:8/660: truncate d7/f67 2748214 0
2026-03-09T00:04:09.285 INFO:tasks.workunit.client.0.vm03.stdout:8/661: fdatasync d7/df/d1a/d40/db3/dba/d38/d4c/f97 0
2026-03-09T00:04:09.286 INFO:tasks.workunit.client.0.vm03.stdout:6/624: dread d13/d1e/d44/d4a/d52/f6d [0,4194304] 0
2026-03-09T00:04:09.302 INFO:tasks.workunit.client.0.vm03.stdout:8/662: mkdir d7/df/d1a/d40/dc8 0
2026-03-09T00:04:09.303 INFO:tasks.workunit.client.1.vm06.stdout:0/941: link d3/d18/d1f/d44/ce7 d3/d18/d2c/d2d/d74/d90/c13e 0
2026-03-09T00:04:09.303 INFO:tasks.workunit.client.1.vm06.stdout:0/942: write d3/d18/d1f/fe2 [663145,89619] 0
2026-03-09T00:04:09.309 INFO:tasks.workunit.client.0.vm03.stdout:5/657: getdents d1c/d20/d55/d4f/d58/d73 0
2026-03-09T00:04:09.313 INFO:tasks.workunit.client.0.vm03.stdout:5/658: rmdir d1c/d20 39
2026-03-09T00:04:09.318 INFO:tasks.workunit.client.0.vm03.stdout:5/659: chown d1c/d20/d55/d4f/d58/d73/cbf 29847 1
2026-03-09T00:04:09.318 INFO:tasks.workunit.client.1.vm06.stdout:0/943: mknod d3/d18/d1f/d39/d49/c13f 0
2026-03-09T00:04:09.318 INFO:tasks.workunit.client.0.vm03.stdout:5/660: mkdir d1c/d20/d97/ddc 0
2026-03-09T00:04:09.318 INFO:tasks.workunit.client.0.vm03.stdout:5/661: rmdir d1c/d20/d55/d4f/d58/d5d 39
2026-03-09T00:04:09.320 INFO:tasks.workunit.client.1.vm06.stdout:0/944: mknod d3/d18/d2c/d2d/d74/dc7/c140 0
2026-03-09T00:04:09.320 INFO:tasks.workunit.client.1.vm06.stdout:0/945: chown d3/d18/d1f/d39/d49/f4b 133996 1
2026-03-09T00:04:09.325 INFO:tasks.workunit.client.1.vm06.stdout:0/946: link d3/c9e d3/d18/d2c/d2d/d8c/c141 0
2026-03-09T00:04:09.325 INFO:tasks.workunit.client.1.vm06.stdout:0/947: chown d3/d18/d2c/d2d/d31/f7b 12 1
2026-03-09T00:04:09.326 INFO:tasks.workunit.client.0.vm03.stdout:4/754: dwrite d7/d20/d6a/dea/d38/fd1 [0,4194304] 0
2026-03-09T00:04:09.332 INFO:tasks.workunit.client.0.vm03.stdout:4/755: fsync d7/f1f 0
2026-03-09T00:04:09.334 INFO:tasks.workunit.client.1.vm06.stdout:0/948: link d3/c12 d3/d18/de9/c142 0
2026-03-09T00:04:09.334 INFO:tasks.workunit.client.1.vm06.stdout:0/949: write d3/d18/d1f/d39/d49/d60/fd3 [729995,86214] 0
2026-03-09T00:04:09.334 INFO:tasks.workunit.client.1.vm06.stdout:0/950: fdatasync d3/d18/d1f/d39/d3b/df9/fca 0
2026-03-09T00:04:09.355 INFO:tasks.workunit.client.1.vm06.stdout:8/900: dwrite db/f114 [0,4194304] 0
2026-03-09T00:04:09.355 INFO:tasks.workunit.client.1.vm06.stdout:8/901: write db/dd/d48/f4e [53847,118302] 0
2026-03-09T00:04:09.360 INFO:tasks.workunit.client.1.vm06.stdout:8/902: mkdir db/dd/d24/dac/d126 0
2026-03-09T00:04:09.360 INFO:tasks.workunit.client.1.vm06.stdout:5/989: getdents d5/d1c/d21/d28/d5e/d66/d78/dd5 0
2026-03-09T00:04:09.364 INFO:tasks.workunit.client.1.vm06.stdout:8/903: dread db/f31 [0,4194304] 0
2026-03-09T00:04:09.364 INFO:tasks.workunit.client.1.vm06.stdout:8/904: creat db/dd/d84/df1/f127 x:0 0 0
2026-03-09T00:04:09.365 INFO:tasks.workunit.client.1.vm06.stdout:8/905: read db/dd/d24/d63/fe9 [847862,12230] 0
2026-03-09T00:04:09.369 INFO:tasks.workunit.client.1.vm06.stdout:8/906: symlink db/d53/d70/d38/l128 0
2026-03-09T00:04:09.369 INFO:tasks.workunit.client.1.vm06.stdout:8/907: readlink l8 0
2026-03-09T00:04:09.369 INFO:tasks.workunit.client.1.vm06.stdout:8/908: dread db/d74/d78/d98/db6/dc7/d101/f88 [4194304,4194304] 0
2026-03-09T00:04:09.371 INFO:tasks.workunit.client.0.vm03.stdout:5/662: dwrite d1c/f4c [0,4194304] 0
2026-03-09T00:04:09.371 INFO:tasks.workunit.client.0.vm03.stdout:5/663: readlink d1c/d20/d55/d4f/d58/d73/d76/d91/lc5 0
2026-03-09T00:04:09.376 INFO:tasks.workunit.client.0.vm03.stdout:5/664: mknod d1c/d51/d6a/d75/cdd 0
2026-03-09T00:04:09.377 INFO:tasks.workunit.client.0.vm03.stdout:6/625: dwrite f2 [0,4194304] 0
2026-03-09T00:04:09.377 INFO:tasks.workunit.client.0.vm03.stdout:6/626: readlink d13/d35/d4c/d62/l90 0
2026-03-09T00:04:09.384 INFO:tasks.workunit.client.0.vm03.stdout:5/665: rmdir d1c/d20/d97/ddc 0
2026-03-09T00:04:09.387 INFO:tasks.workunit.client.0.vm03.stdout:5/666: creat d1c/d20/d55/d66/d70/fde x:0 0 0
2026-03-09T00:04:09.388 INFO:tasks.workunit.client.0.vm03.stdout:5/667: readlink d1c/l27 0
2026-03-09T00:04:09.388 INFO:tasks.workunit.client.0.vm03.stdout:5/668: write d1c/d51/d6a/fc2 [826329,52532] 0
2026-03-09T00:04:09.388 INFO:tasks.workunit.client.0.vm03.stdout:5/669: symlink d1c/d20/d55/d66/d6b/d8f/ldf 0
2026-03-09T00:04:09.388 INFO:tasks.workunit.client.0.vm03.stdout:5/670: truncate d1c/d20/fa3 890028 0
2026-03-09T00:04:09.388 INFO:tasks.workunit.client.0.vm03.stdout:5/671: write d1c/f96 [989105,43995] 0
2026-03-09T00:04:09.388 INFO:tasks.workunit.client.0.vm03.stdout:5/672: truncate d1c/d20/d55/d4f/d58/d73/d9e/da5/fcf 898157 0
2026-03-09T00:04:09.388 INFO:tasks.workunit.client.0.vm03.stdout:5/673: dread - d1c/d20/d55/d4f/d58/fa6 zero size
2026-03-09T00:04:09.388 INFO:tasks.workunit.client.0.vm03.stdout:5/674: write d1c/d20/d55/db0/dc7/fd7 [527210,102563] 0
2026-03-09T00:04:09.389 INFO:tasks.workunit.client.0.vm03.stdout:5/675: chown d1c/d20/d55/d4f/c6d 1504 1
2026-03-09T00:04:09.416 INFO:tasks.workunit.client.0.vm03.stdout:0/669: rename d2/da/d1a/f3a to d2/da/d36/ff6 0
2026-03-09T00:04:09.417 INFO:tasks.workunit.client.0.vm03.stdout:0/670: mkdir d2/da/d36/ddf/df7 0
2026-03-09T00:04:09.423 INFO:tasks.workunit.client.0.vm03.stdout:0/671: creat d2/da/dd/d49/d6c/d4b/d55/d6f/ff8 x:0 0 0
2026-03-09T00:04:09.432 INFO:tasks.workunit.client.0.vm03.stdout:0/672: dread d2/da/dd/f38 [0,4194304] 0
2026-03-09T00:04:09.432 INFO:tasks.workunit.client.0.vm03.stdout:0/673: dread - d2/fcd zero size
2026-03-09T00:04:09.436 INFO:tasks.workunit.client.0.vm03.stdout:0/674: creat d2/da/dd/d49/d6c/da6/dcf/ff9 x:0 0 0
2026-03-09T00:04:09.436 INFO:tasks.workunit.client.0.vm03.stdout:0/675: readlink d2/da/d76/d8a/d8f/l97 0
2026-03-09T00:04:09.437 INFO:tasks.workunit.client.0.vm03.stdout:6/627: dwrite d13/d35/db5/fc6 [0,4194304] 0
2026-03-09T00:04:09.440 INFO:tasks.workunit.client.1.vm06.stdout:3/884: mkdir d11/d28/d2e/d7e/d83/d132 0
2026-03-09T00:04:09.441 INFO:tasks.workunit.client.1.vm06.stdout:8/909: dwrite db/d74/d87/d100/f95 [0,4194304] 0
2026-03-09T00:04:09.443 INFO:tasks.workunit.client.1.vm06.stdout:8/910: dread db/d74/d78/d98/db6/ff0 [0,4194304] 0
2026-03-09T00:04:09.449 INFO:tasks.workunit.client.1.vm06.stdout:8/911: stat db/dd/d84/f8d 0
2026-03-09T00:04:09.449 INFO:tasks.workunit.client.1.vm06.stdout:8/912: dread - db/d74/d87/fca zero size
2026-03-09T00:04:09.449 INFO:tasks.workunit.client.1.vm06.stdout:8/913: write db/d74/d78/d98/d9c/fd7 [177514,31447] 0
2026-03-09T00:04:09.450 INFO:tasks.workunit.client.0.vm03.stdout:6/628: dread d13/d1e/d44/d59/d77/f94 [0,4194304] 0
2026-03-09T00:04:09.450 INFO:tasks.workunit.client.0.vm03.stdout:6/629: symlink d13/d8f/lce 0
2026-03-09T00:04:09.450 INFO:tasks.workunit.client.0.vm03.stdout:6/630: fdatasync f10 0
2026-03-09T00:04:09.450 INFO:tasks.workunit.client.1.vm06.stdout:3/885: symlink d11/d117/l133 0
2026-03-09T00:04:09.450 INFO:tasks.workunit.client.1.vm06.stdout:3/886: dread d11/d28/d4d/d9b/f9d [0,4194304] 0
2026-03-09T00:04:09.451 INFO:tasks.workunit.client.0.vm03.stdout:2/670: rmdir d8/d26/d5e/d5f/d95 39
2026-03-09T00:04:09.452 INFO:tasks.workunit.client.0.vm03.stdout:6/631: link d13/d35/d74/c8b d13/d35/d71/ccf 0
2026-03-09T00:04:09.453 INFO:tasks.workunit.client.1.vm06.stdout:6/861: symlink d4/d16/d53/ddf/d4b/l109 0
2026-03-09T00:04:09.453 INFO:tasks.workunit.client.1.vm06.stdout:6/862: read d4/fc [4115724,75349] 0
2026-03-09T00:04:09.454 INFO:tasks.workunit.client.1.vm06.stdout:0/951: dwrite d3/d18/d2c/d2d/d74/daf/f11b [0,4194304] 0
2026-03-09T00:04:09.455 INFO:tasks.workunit.client.0.vm03.stdout:2/671: creat d8/d1b/d2a/d2e/fd9 x:0 0 0
2026-03-09T00:04:09.457 INFO:tasks.workunit.client.0.vm03.stdout:6/632: write d13/d1e/f48 [2340964,87910] 0
2026-03-09T00:04:09.457 INFO:tasks.workunit.client.0.vm03.stdout:0/676: dread d2/da/dd/d49/d6c/d4b/f67 [0,4194304] 0
2026-03-09T00:04:09.457 INFO:tasks.workunit.client.0.vm03.stdout:0/677: chown d2/da 2 1
2026-03-09T00:04:09.458 INFO:tasks.workunit.client.1.vm06.stdout:3/887: creat d11/d28/d2e/d2f/d5b/ddb/df1/f134 x:0 0 0
2026-03-09T00:04:09.459 INFO:tasks.workunit.client.1.vm06.stdout:6/863: write d4/d27/d3e/f41 [457236,48661] 0
2026-03-09T00:04:09.468 INFO:tasks.workunit.client.0.vm03.stdout:6/633: mknod d13/d35/d74/d89/db3/cd0 0
2026-03-09T00:04:09.484 INFO:tasks.workunit.client.1.vm06.stdout:3/888: creat d11/d3f/d8d/f135 x:0 0 0
2026-03-09T00:04:09.484 INFO:tasks.workunit.client.1.vm06.stdout:6/864: rmdir d4/d16 39
2026-03-09T00:04:09.484 INFO:tasks.workunit.client.1.vm06.stdout:6/865: symlink d4/d16/d53/ddf/d52/d7d/l10a 0
2026-03-09T00:04:09.484 INFO:tasks.workunit.client.1.vm06.stdout:6/866: getdents d4/d16/d53/ddf/d7e/dac/dd3 0
2026-03-09T00:04:09.484 INFO:tasks.workunit.client.0.vm03.stdout:0/678: getdents d2/da/d76/d8a/d8f/db8 0
2026-03-09T00:04:09.484 INFO:tasks.workunit.client.0.vm03.stdout:0/679: fsync d2/da/dd/d49/d6c/f52 0
2026-03-09T00:04:09.484 INFO:tasks.workunit.client.0.vm03.stdout:0/680: truncate d2/da/fca 130727 0
2026-03-09T00:04:09.484 INFO:tasks.workunit.client.0.vm03.stdout:0/681: dread d2/da/dd/d49/d6c/da6/dda/db5/dba/fbc [0,4194304] 0
2026-03-09T00:04:09.484 INFO:tasks.workunit.client.0.vm03.stdout:6/634: creat d13/d1e/d44/fd1 x:0 0 0
2026-03-09T00:04:09.514 INFO:tasks.workunit.client.0.vm03.stdout:6/635: dwrite d13/d35/d4c/f4f [0,4194304] 0
2026-03-09T00:04:09.515 INFO:tasks.workunit.client.0.vm03.stdout:6/636: dread fb [4194304,4194304] 0
2026-03-09T00:04:09.515 INFO:tasks.workunit.client.0.vm03.stdout:6/637: chown fb 0 1
2026-03-09T00:04:09.525 INFO:tasks.workunit.client.0.vm03.stdout:6/638: rename f12 to d13/d35/d71/d97/fd2 0
2026-03-09T00:04:09.525 INFO:tasks.workunit.client.0.vm03.stdout:6/639: chown d13/d1e/d44/lbc 32167 1
2026-03-09T00:04:09.525 INFO:tasks.workunit.client.0.vm03.stdout:6/640: fsync d13/d35/fb6 0
2026-03-09T00:04:09.525 INFO:tasks.workunit.client.0.vm03.stdout:6/641: stat d13/d1e/d44/d4a/d52 0
2026-03-09T00:04:09.528 INFO:tasks.workunit.client.0.vm03.stdout:6/642: creat d13/fd3 x:0 0 0
2026-03-09T00:04:09.529 INFO:tasks.workunit.client.0.vm03.stdout:6/643: creat d13/d35/fd4 x:0 0 0
2026-03-09T00:04:09.530 INFO:tasks.workunit.client.0.vm03.stdout:6/644: dread d13/d1e/d44/d59/d77/f98 [0,4194304] 0
2026-03-09T00:04:09.530 INFO:tasks.workunit.client.0.vm03.stdout:6/645: fsync d13/d35/db5/fc6 0
2026-03-09T00:04:09.530 INFO:tasks.workunit.client.0.vm03.stdout:6/646: dread - d13/d35/fd4 zero size
2026-03-09T00:04:09.531 INFO:tasks.workunit.client.1.vm06.stdout:9/783: sync
2026-03-09T00:04:09.532 INFO:tasks.workunit.client.0.vm03.stdout:6/647: creat d13/d1e/d44/d4a/d52/fd5 x:0 0 0
2026-03-09T00:04:09.534 INFO:tasks.workunit.client.0.vm03.stdout:6/648: creat d13/d35/d4c/fd6 x:0 0 0
2026-03-09T00:04:09.534 INFO:tasks.workunit.client.0.vm03.stdout:6/649: chown d13/d35/d72/dca 51085144 1
2026-03-09T00:04:09.534 INFO:tasks.workunit.client.0.vm03.stdout:6/650: fsync d13/d1e/d44/d4a/d52/f6d 0
2026-03-09T00:04:09.538 INFO:tasks.workunit.client.0.vm03.stdout:6/651: mkdir d13/d8f/dd7 0
2026-03-09T00:04:09.557 INFO:tasks.workunit.client.0.vm03.stdout:2/672: mkdir d8/d1b/d24/da5/dda 0
2026-03-09T00:04:09.562 INFO:tasks.workunit.client.0.vm03.stdout:2/673: mknod d8/d1b/d2a/d2e/cdb 0
2026-03-09T00:04:09.571 INFO:tasks.workunit.client.0.vm03.stdout:6/652: dwrite d13/d35/d4c/d62/fa0 [0,4194304] 0
2026-03-09T00:04:09.571 INFO:tasks.workunit.client.0.vm03.stdout:6/653: write d13/d35/d4c/f99 [2153495,84724] 0
2026-03-09T00:04:09.575 INFO:tasks.workunit.client.0.vm03.stdout:6/654: symlink d13/d1e/d44/d59/d77/ld8 0
2026-03-09T00:04:09.578 INFO:tasks.workunit.client.0.vm03.stdout:6/655: creat d13/dc4/fd9 x:0 0 0
2026-03-09T00:04:09.578 INFO:tasks.workunit.client.0.vm03.stdout:6/656: dread - d13/d35/d4c/fd6 zero size
2026-03-09T00:04:09.578 INFO:tasks.workunit.client.0.vm03.stdout:6/657: write d13/d1e/f48 [1581010,15223] 0
2026-03-09T00:04:09.578 INFO:tasks.workunit.client.0.vm03.stdout:6/658: creat d13/d35/fda x:0 0 0
2026-03-09T00:04:09.593 INFO:tasks.workunit.client.0.vm03.stdout:6/659: unlink d13/d1e/d44/d59/d77/ld8 0
2026-03-09T00:04:09.594 INFO:tasks.workunit.client.0.vm03.stdout:2/674: dwrite d8/d26/d5e/d6f/d97/f1c [4194304,4194304] 0
2026-03-09T00:04:09.600 INFO:tasks.workunit.client.0.vm03.stdout:2/675: symlink d8/d74/ldc 0
2026-03-09T00:04:09.642 INFO:tasks.workunit.client.0.vm03.stdout:6/660: dwrite d13/d35/f68 [0,4194304] 0
2026-03-09T00:04:09.645 INFO:tasks.workunit.client.0.vm03.stdout:2/676: dwrite f6 [0,4194304] 0
2026-03-09T00:04:09.645 INFO:tasks.workunit.client.0.vm03.stdout:6/661: creat d13/d35/d4c/d62/fdb x:0 0 0
2026-03-09T00:04:09.647 INFO:tasks.workunit.client.0.vm03.stdout:2/677: symlink d8/d1b/d2a/d2e/d9a/ldd 0
2026-03-09T00:04:09.650 INFO:tasks.workunit.client.0.vm03.stdout:6/662: write d13/d35/f68 [1217169,7016] 0
2026-03-09T00:04:09.650 INFO:tasks.workunit.client.0.vm03.stdout:2/678: creat d8/d26/d5e/dc5/fde x:0 0 0
2026-03-09T00:04:09.650 INFO:tasks.workunit.client.0.vm03.stdout:2/679: fsync d8/d26/d5e/d6f/d97/f1a 0
2026-03-09T00:04:09.650 INFO:tasks.workunit.client.0.vm03.stdout:2/680: creat d8/d1b/d24/da5/da8/fdf x:0 0 0
2026-03-09T00:04:09.650 INFO:tasks.workunit.client.0.vm03.stdout:2/681: dread - d8/d1b/d2a/d6b/fd5 zero size
2026-03-09T00:04:09.655 INFO:tasks.workunit.client.0.vm03.stdout:6/663: write d13/d35/d72/f85 [3620241,10866] 0
2026-03-09T00:04:09.657 INFO:tasks.workunit.client.0.vm03.stdout:6/664: fdatasync d13/d35/d72/f85 0
2026-03-09T00:04:09.660 INFO:tasks.workunit.client.0.vm03.stdout:6/665: mknod d13/d35/d4c/cdc 0
2026-03-09T00:04:09.687 INFO:tasks.workunit.client.0.vm03.stdout:6/666: dwrite d13/f31 [0,4194304] 0
2026-03-09T00:04:09.687 INFO:tasks.workunit.client.0.vm03.stdout:6/667: fdatasync d13/d35/d74/fc5 0
2026-03-09T00:04:09.690 INFO:tasks.workunit.client.0.vm03.stdout:6/668: symlink d13/d35/d4c/ldd 0
2026-03-09T00:04:09.693 INFO:tasks.workunit.client.0.vm03.stdout:6/669: creat d13/d35/d72/dca/fde x:0 0 0
2026-03-09T00:04:09.693 INFO:tasks.workunit.client.0.vm03.stdout:6/670: readlink d13/d35/l7d 0
2026-03-09T00:04:09.711 INFO:tasks.workunit.client.0.vm03.stdout:1/768: symlink d4/d15/dae/l105 0
2026-03-09T00:04:09.711 INFO:tasks.workunit.client.0.vm03.stdout:1/769: write d4/d3a/d43/f49 [732492,76418] 0
2026-03-09T00:04:09.711 INFO:tasks.workunit.client.0.vm03.stdout:1/770: write d4/d15/d1a/f1d [5416360,113983] 0
2026-03-09T00:04:09.737 INFO:tasks.workunit.client.1.vm06.stdout:9/784: symlink d1/d4/d6e/d14/lff 0
2026-03-09T00:04:09.747 INFO:tasks.workunit.client.0.vm03.stdout:1/771: dwrite d4/d3a/d32/f4b [0,4194304] 0
2026-03-09T00:04:09.747 INFO:tasks.workunit.client.0.vm03.stdout:1/772: stat d4/d15/d1a/f55 0
2026-03-09T00:04:09.749 INFO:tasks.workunit.client.1.vm06.stdout:9/785: symlink d1/d4/d2f/l100 0
2026-03-09T00:04:09.750 INFO:tasks.workunit.client.0.vm03.stdout:1/773: symlink d4/d3a/d3d/l106 0
2026-03-09T00:04:09.750 INFO:tasks.workunit.client.0.vm03.stdout:1/774: creat d4/d3a/d32/d87/f107 x:0 0 0
2026-03-09T00:04:09.754 INFO:tasks.workunit.client.1.vm06.stdout:9/786: link d1/d3/d4f/d91/d94/la1 d1/d3/d50/l101 0
2026-03-09T00:04:09.755 INFO:tasks.workunit.client.0.vm03.stdout:1/775: creat d4/d15/dae/d101/f108 x:0 0 0
2026-03-09T00:04:09.761 INFO:tasks.workunit.client.1.vm06.stdout:9/787: rename d1/d3/d2b/d58/f5f to d1/da7/dfc/f102 0
2026-03-09T00:04:09.798 INFO:tasks.workunit.client.1.vm06.stdout:8/914: write db/d74/d87/d100/d8f/fcc [1519491,45679] 0
2026-03-09T00:04:09.805 INFO:tasks.workunit.client.0.vm03.stdout:0/682: creat d2/da/dd/d49/d6c/d4b/d55/ffa x:0 0 0
2026-03-09T00:04:09.806 INFO:tasks.workunit.client.0.vm03.stdout:0/683: creat d2/da/d76/d8a/d8f/ffb x:0 0 0
2026-03-09T00:04:09.806 INFO:tasks.workunit.client.0.vm03.stdout:0/684: chown d2/da/dd/d49/d6c/d4b/d55/d6f/dad/fcc 9194 1
2026-03-09T00:04:09.806 INFO:tasks.workunit.client.0.vm03.stdout:0/685: chown d2/d71 461 1
2026-03-09T00:04:09.806 INFO:tasks.workunit.client.0.vm03.stdout:0/686: symlink d2/da/d1a/lfc 0
2026-03-09T00:04:09.807 INFO:tasks.workunit.client.0.vm03.stdout:0/687: mknod d2/da/dd/d49/d6c/da6/dda/db5/dba/cfd 0
2026-03-09T00:04:09.823 INFO:tasks.workunit.client.0.vm03.stdout:4/756: creat d7/d20/d6a/dea/ff0 x:0 0 0
2026-03-09T00:04:09.826 INFO:tasks.workunit.client.0.vm03.stdout:4/757: write d7/d27/f31 [280508,81682] 0
2026-03-09T00:04:09.838 INFO:tasks.workunit.client.0.vm03.stdout:4/758: mkdir d7/d20/d6a/d77/d25/de2/df1 0
2026-03-09T00:04:09.838 INFO:tasks.workunit.client.0.vm03.stdout:4/759: rename d7/d6f/da5/db0 to d7/d20/d6a/dea/d38/da9/ddc/df2 0
2026-03-09T00:04:09.839 INFO:tasks.workunit.client.0.vm03.stdout:2/682: truncate d8/d26/d5e/d6f/d97/f1c 865273 0
2026-03-09T00:04:09.839 INFO:tasks.workunit.client.0.vm03.stdout:2/683: write d8/d1b/d24/f86 [764445,79722] 0
2026-03-09T00:04:09.839 INFO:tasks.workunit.client.0.vm03.stdout:2/684: chown d8/d1b/d2a/d6b/d50/c80 7 1
2026-03-09T00:04:09.839 INFO:tasks.workunit.client.0.vm03.stdout:2/685: read d8/d26/d5e/d6f/d97/f6e [319720,52689] 0
2026-03-09T00:04:09.839 INFO:tasks.workunit.client.0.vm03.stdout:2/686: write d8/d1b/d2a/d6b/dc6/fd2 [5116249,98780] 0
2026-03-09T00:04:09.839 INFO:tasks.workunit.client.0.vm03.stdout:2/687: dread - d8/d1b/d8f/fd6 zero size
2026-03-09T00:04:09.839 INFO:tasks.workunit.client.0.vm03.stdout:2/688: write d8/d1b/d2a/d6b/d50/f54 [3840381,2528] 0
2026-03-09T00:04:09.841 INFO:tasks.workunit.client.0.vm03.stdout:3/491: unlink d2/db/d40/l75 0
2026-03-09T00:04:09.842 INFO:tasks.workunit.client.0.vm03.stdout:2/689: mkdir d8/d1b/d24/da5/dda/de0 0
2026-03-09T00:04:09.843 INFO:tasks.workunit.client.0.vm03.stdout:2/690: creat d8/d1b/d6c/fe1 x:0 0 0
2026-03-09T00:04:09.843 INFO:tasks.workunit.client.0.vm03.stdout:2/691: stat d8/d26/d5e/d5f/c65 0
2026-03-09T00:04:09.843 INFO:tasks.workunit.client.0.vm03.stdout:2/692: creat d8/d1b/d2a/d6b/fe2 x:0 0 0
2026-03-09T00:04:09.845 INFO:tasks.workunit.client.1.vm06.stdout:3/889: rmdir d11/d28 39
2026-03-09T00:04:09.859 INFO:tasks.workunit.client.0.vm03.stdout:8/663: creat d7/df/d1a/d40/db3/dba/d38/fc9 x:0 0 0
2026-03-09T00:04:09.864 INFO:tasks.workunit.client.0.vm03.stdout:8/664: stat d7/df/d1a/d2b/d62/c86 0
2026-03-09T00:04:09.864 INFO:tasks.workunit.client.0.vm03.stdout:8/665: write d7/df/d1a/f1c [2286904,102725] 0
2026-03-09T00:04:09.869 INFO:tasks.workunit.client.1.vm06.stdout:4/891: mkdir d17/d134 0
2026-03-09T00:04:09.870 INFO:tasks.workunit.client.0.vm03.stdout:8/666: unlink d7/df/f37 0
2026-03-09T00:04:09.870 INFO:tasks.workunit.client.1.vm06.stdout:4/892: symlink d17/d24/d3b/d5e/d7a/l135 0
2026-03-09T00:04:09.870 INFO:tasks.workunit.client.1.vm06.stdout:4/893: dread - d17/d21/f11b zero size
2026-03-09T00:04:09.871 INFO:tasks.workunit.client.0.vm03.stdout:8/667: rename d7/df/d1a/d40/db3/dba/d38/lc5 to d7/df/d1a/d40/db3/dba/d38/d91/dc2/lca 0
2026-03-09T00:04:09.871 INFO:tasks.workunit.client.0.vm03.stdout:8/668: chown d7/df/d1a/d2b/l65 5687 1
2026-03-09T00:04:09.877 INFO:tasks.workunit.client.1.vm06.stdout:4/894: link d17/d5b/f83 d17/d21/d4c/d50/f136 0
2026-03-09T00:04:09.885 INFO:tasks.workunit.client.1.vm06.stdout:4/895: dread d17/d21/d4c/d66/dd9/f7e [0,4194304] 0
2026-03-09T00:04:09.888 INFO:tasks.workunit.client.1.vm06.stdout:0/952: mknod d3/c143 0
2026-03-09T00:04:09.888 INFO:tasks.workunit.client.1.vm06.stdout:0/953: fdatasync d3/d18/d1f/d39/d69/fb4 0
2026-03-09T00:04:09.890 INFO:tasks.workunit.client.0.vm03.stdout:6/671: write d13/f6f [4263811,71106] 0
2026-03-09T00:04:09.890 INFO:tasks.workunit.client.0.vm03.stdout:6/672: getdents d13/d35/d69 0
2026-03-09T00:04:09.892 INFO:tasks.workunit.client.1.vm06.stdout:4/896: truncate d17/d5b/d8f/fa8 382078 0
2026-03-09T00:04:09.905 INFO:tasks.workunit.client.1.vm06.stdout:4/897: unlink d17/d24/d3b/dbf/ddf/dfc/f119 0
2026-03-09T00:04:09.910 INFO:tasks.workunit.client.1.vm06.stdout:4/898: link d17/l59 d17/d21/d4c/d66/d68/l137 0
2026-03-09T00:04:09.910 INFO:tasks.workunit.client.1.vm06.stdout:4/899: read d17/d21/fb8 [148561,102132] 0 2026-03-09T00:04:09.911 INFO:tasks.workunit.client.1.vm06.stdout:0/954: dread d3/d18/d2c/d2d/d31/f4f [0,4194304] 0 2026-03-09T00:04:09.911 INFO:tasks.workunit.client.1.vm06.stdout:0/955: read - d3/d18/d2c/d2d/d74/d90/fac zero size 2026-03-09T00:04:09.919 INFO:tasks.workunit.client.1.vm06.stdout:9/788: dwrite d1/d4/d6e/d14/d25/d85/fb8 [0,4194304] 0 2026-03-09T00:04:09.925 INFO:tasks.workunit.client.0.vm03.stdout:0/688: dwrite d2/da/dd/d49/d6c/f41 [0,4194304] 0 2026-03-09T00:04:09.925 INFO:tasks.workunit.client.1.vm06.stdout:8/915: dwrite db/d74/d78/d98/d9c/fd7 [0,4194304] 0 2026-03-09T00:04:09.925 INFO:tasks.workunit.client.1.vm06.stdout:8/916: chown db/d74/d78/d98/db6/dc7/d101/db7 802 1 2026-03-09T00:04:09.928 INFO:tasks.workunit.client.1.vm06.stdout:9/789: write d1/d3/d4f/d52/f5e [424988,108804] 0 2026-03-09T00:04:09.935 INFO:tasks.workunit.client.1.vm06.stdout:8/917: mknod db/d74/d87/d100/d10a/c129 0 2026-03-09T00:04:09.937 INFO:tasks.workunit.client.1.vm06.stdout:9/790: creat d1/d3/d4f/d91/f103 x:0 0 0 2026-03-09T00:04:09.938 INFO:tasks.workunit.client.1.vm06.stdout:8/918: mknod db/d53/d70/d38/d47/c12a 0 2026-03-09T00:04:09.938 INFO:tasks.workunit.client.1.vm06.stdout:8/919: read db/dd/f97 [3007117,13808] 0 2026-03-09T00:04:09.943 INFO:tasks.workunit.client.0.vm03.stdout:8/669: dwrite d7/df/d1a/d40/f4d [0,4194304] 0 2026-03-09T00:04:09.944 INFO:tasks.workunit.client.1.vm06.stdout:9/791: link d1/da7/dfc/f102 d1/d3/d4f/d52/de3/de5/f104 0 2026-03-09T00:04:09.945 INFO:tasks.workunit.client.1.vm06.stdout:8/920: creat db/d53/d70/d38/f12b x:0 0 0 2026-03-09T00:04:09.947 INFO:tasks.workunit.client.0.vm03.stdout:8/670: creat d7/df/d1a/d40/db3/dba/d3f/d95/fcb x:0 0 0 2026-03-09T00:04:09.947 INFO:tasks.workunit.client.0.vm03.stdout:8/671: readlink d7/df/d1a/d2b/l65 0 2026-03-09T00:04:09.947 INFO:tasks.workunit.client.1.vm06.stdout:9/792: mknod d1/d4/d6e/d14/c105 0 2026-03-09T00:04:09.948 INFO:tasks.workunit.client.0.vm03.stdout:8/672: rename d7/df/d1a/d2b/l35 to d7/df/d1a/d40/lcc 0 2026-03-09T00:04:09.953 INFO:tasks.workunit.client.0.vm03.stdout:8/673: dread d7/df/d1a/d40/db3/dba/d38/d60/f6e [0,4194304] 0 2026-03-09T00:04:09.957 INFO:tasks.workunit.client.1.vm06.stdout:9/793: creat d1/d3/f106 x:0 0 0 2026-03-09T00:04:09.976 INFO:tasks.workunit.client.0.vm03.stdout:8/674: mkdir d7/df/d1a/d40/db3/dba/d38/d60/dcd 0 2026-03-09T00:04:09.977 INFO:tasks.workunit.client.1.vm06.stdout:8/921: getdents db/d53/d70/d38/d4d/d79/dd5 0 2026-03-09T00:04:09.984 INFO:tasks.workunit.client.1.vm06.stdout:8/922: dread db/d74/d78/fe2 [0,4194304] 0 2026-03-09T00:04:09.984 INFO:tasks.workunit.client.1.vm06.stdout:8/923: rmdir db/d53 39 2026-03-09T00:04:09.985 INFO:tasks.workunit.client.1.vm06.stdout:8/924: creat db/dd/d24/dac/d126/f12c x:0 0 0 2026-03-09T00:04:09.993 INFO:tasks.workunit.client.0.vm03.stdout:3/492: dwrite d2/db/f64 [0,4194304] 0 2026-03-09T00:04:09.994 INFO:tasks.workunit.client.0.vm03.stdout:6/673: dwrite d13/d35/d71/fb0 [0,4194304] 0 2026-03-09T00:04:09.994 INFO:tasks.workunit.client.0.vm03.stdout:6/674: chown d13/d35/d71/fb0 3 1 2026-03-09T00:04:09.994 INFO:tasks.workunit.client.0.vm03.stdout:6/675: fsync d13/f1a 0 2026-03-09T00:04:09.994 INFO:tasks.workunit.client.0.vm03.stdout:4/760: dwrite d7/d20/d6a/dea/d54/f96 [4194304,4194304] 0 2026-03-09T00:04:09.997 INFO:tasks.workunit.client.0.vm03.stdout:3/493: dread d2/db/f28 [0,4194304] 0 2026-03-09T00:04:10.001 
INFO:tasks.workunit.client.0.vm03.stdout:6/676: mknod d13/d35/d72/dca/cdf 0 2026-03-09T00:04:10.003 INFO:tasks.workunit.client.0.vm03.stdout:6/677: dread d13/d35/d72/fb7 [0,4194304] 0 2026-03-09T00:04:10.011 INFO:tasks.workunit.client.0.vm03.stdout:3/494: creat d2/db/d6a/d70/f93 x:0 0 0 2026-03-09T00:04:10.012 INFO:tasks.workunit.client.0.vm03.stdout:3/495: mknod d2/db/d6a/c94 0 2026-03-09T00:04:10.012 INFO:tasks.workunit.client.0.vm03.stdout:3/496: chown d2/db/d6a/c77 0 1 2026-03-09T00:04:10.012 INFO:tasks.workunit.client.0.vm03.stdout:3/497: dread - d2/db/f67 zero size 2026-03-09T00:04:10.012 INFO:tasks.workunit.client.0.vm03.stdout:3/498: fsync d2/db/f25 0 2026-03-09T00:04:10.049 INFO:tasks.workunit.client.1.vm06.stdout:0/956: dwrite d3/d18/f14 [0,4194304] 0 2026-03-09T00:04:10.050 INFO:tasks.workunit.client.1.vm06.stdout:3/890: dwrite d11/d28/d2e/db2/d100/f103 [0,4194304] 0 2026-03-09T00:04:10.050 INFO:tasks.workunit.client.1.vm06.stdout:3/891: chown d11/d28/d4d/d9b/l110 268 1 2026-03-09T00:04:10.050 INFO:tasks.workunit.client.1.vm06.stdout:3/892: stat d11/d28/d2e/d7e/d104 0 2026-03-09T00:04:10.052 INFO:tasks.workunit.client.1.vm06.stdout:3/893: creat d11/d28/d2e/d2f/d5b/d5f/f136 x:0 0 0 2026-03-09T00:04:10.052 INFO:tasks.workunit.client.1.vm06.stdout:3/894: getdents d11/d28/d57 0 2026-03-09T00:04:10.052 INFO:tasks.workunit.client.1.vm06.stdout:3/895: truncate d11/d28/d2e/dff/f123 616961 0 2026-03-09T00:04:10.062 INFO:tasks.workunit.client.1.vm06.stdout:3/896: dread d11/d28/f42 [0,4194304] 0 2026-03-09T00:04:10.062 INFO:tasks.workunit.client.1.vm06.stdout:3/897: chown d11/f1d 1054917 1 2026-03-09T00:04:10.062 INFO:tasks.workunit.client.1.vm06.stdout:3/898: chown d11/d28/d2e/d2f/d36/d8f/lcb 24119 1 2026-03-09T00:04:10.063 INFO:tasks.workunit.client.1.vm06.stdout:3/899: symlink d11/d28/d2e/d7e/l137 0 2026-03-09T00:04:10.063 INFO:tasks.workunit.client.1.vm06.stdout:3/900: write d11/d28/f3a [795826,65199] 0 2026-03-09T00:04:10.075 INFO:tasks.workunit.client.0.vm03.stdout:2/693: rmdir d8/d1b/d6c 39 2026-03-09T00:04:10.077 INFO:tasks.workunit.client.0.vm03.stdout:9/670: sync 2026-03-09T00:04:10.077 INFO:tasks.workunit.client.0.vm03.stdout:7/605: sync 2026-03-09T00:04:10.077 INFO:tasks.workunit.client.0.vm03.stdout:9/671: truncate d15/d1c/fb2 613462 0 2026-03-09T00:04:10.077 INFO:tasks.workunit.client.0.vm03.stdout:5/676: sync 2026-03-09T00:04:10.078 INFO:tasks.workunit.client.0.vm03.stdout:2/694: link d8/d26/d5e/d5f/lc1 d8/d1b/d2a/d6b/le3 0 2026-03-09T00:04:10.082 INFO:tasks.workunit.client.0.vm03.stdout:2/695: chown d8/la9 1000 0 2026-03-09T00:04:10.083 INFO:tasks.workunit.client.0.vm03.stdout:9/672: mknod d15/d1c/d21/d64/cde 0 2026-03-09T00:04:10.083 INFO:tasks.workunit.client.1.vm06.stdout:1/798: sync 2026-03-09T00:04:10.083 INFO:tasks.workunit.client.1.vm06.stdout:2/992: sync 2026-03-09T00:04:10.083 INFO:tasks.workunit.client.1.vm06.stdout:1/799: fsync d6/d21/d2d/d3b/d42/fb4 0 2026-03-09T00:04:10.083 INFO:tasks.workunit.client.1.vm06.stdout:2/993: creat d7/d1b/d71/d79/db4/f131 x:0 0 0 2026-03-09T00:04:10.083 INFO:tasks.workunit.client.1.vm06.stdout:7/890: sync 2026-03-09T00:04:10.083 INFO:tasks.workunit.client.1.vm06.stdout:5/990: sync 2026-03-09T00:04:10.083 INFO:tasks.workunit.client.1.vm06.stdout:5/991: readlink d5/d1c/d68/dec/led 0 2026-03-09T00:04:10.085 INFO:tasks.workunit.client.1.vm06.stdout:1/800: dread d6/d4c/d79/f59 [0,4194304] 0 2026-03-09T00:04:10.085 INFO:tasks.workunit.client.1.vm06.stdout:1/801: chown d6/d21/d2d/d3b/d87/ce2 1001619 1 2026-03-09T00:04:10.085 
INFO:tasks.workunit.client.1.vm06.stdout:9/794: dwrite d1/d3/d4f/d91/f103 [0,4194304] 0 2026-03-09T00:04:10.086 INFO:tasks.workunit.client.0.vm03.stdout:6/678: dwrite d13/d1e/f21 [0,4194304] 0 2026-03-09T00:04:10.087 INFO:tasks.workunit.client.1.vm06.stdout:1/802: dread d6/d4c/d51/fba [0,4194304] 0 2026-03-09T00:04:10.089 INFO:tasks.workunit.client.1.vm06.stdout:0/957: read d3/f51 [1744598,108124] 0 2026-03-09T00:04:10.089 INFO:tasks.workunit.client.1.vm06.stdout:6/867: sync 2026-03-09T00:04:10.089 INFO:tasks.workunit.client.1.vm06.stdout:6/868: readlink d4/d16/d53/l64 0 2026-03-09T00:04:10.092 INFO:tasks.workunit.client.1.vm06.stdout:4/900: rename d17/d21/f2f to d17/d24/d3b/d97/db7/f138 0 2026-03-09T00:04:10.098 INFO:tasks.workunit.client.1.vm06.stdout:2/994: stat d7/d1a/d25/d66/d87/fc3 0 2026-03-09T00:04:10.108 INFO:tasks.workunit.client.1.vm06.stdout:7/891: mknod d0/d55/d99/d102/c108 0 2026-03-09T00:04:10.111 INFO:tasks.workunit.client.1.vm06.stdout:5/992: symlink d5/d1c/d21/d28/l14e 0 2026-03-09T00:04:10.111 INFO:tasks.workunit.client.0.vm03.stdout:6/679: rename d13/d1e/d44/d59/f66 to d13/d1e/d44/d59/fe0 0 2026-03-09T00:04:10.111 INFO:tasks.workunit.client.0.vm03.stdout:6/680: rename d13/d1e/d44/d4a/l88 to d13/d8f/dd7/le1 0 2026-03-09T00:04:10.111 INFO:tasks.workunit.client.0.vm03.stdout:6/681: creat d13/d1e/d44/d4a/d52/fe2 x:0 0 0 2026-03-09T00:04:10.111 INFO:tasks.workunit.client.1.vm06.stdout:0/958: creat d3/d10f/f144 x:0 0 0 2026-03-09T00:04:10.111 INFO:tasks.workunit.client.1.vm06.stdout:0/959: truncate d3/f129 460672 0 2026-03-09T00:04:10.111 INFO:tasks.workunit.client.1.vm06.stdout:0/960: readlink d3/d18/d2c/d2d/l99 0 2026-03-09T00:04:10.112 INFO:tasks.workunit.client.1.vm06.stdout:6/869: mknod d4/d27/d3e/d45/c10b 0 2026-03-09T00:04:10.114 INFO:tasks.workunit.client.0.vm03.stdout:0/689: dwrite d2/da/dd/d49/d6c/d4b/f4c [0,4194304] 0 2026-03-09T00:04:10.117 INFO:tasks.workunit.client.1.vm06.stdout:8/925: rename db/l105 to db/d74/d87/d100/l12d 0 2026-03-09T00:04:10.119 INFO:tasks.workunit.client.0.vm03.stdout:3/499: dwrite d2/db/d2d/f45 [0,4194304] 0 2026-03-09T00:04:10.120 INFO:tasks.workunit.client.0.vm03.stdout:8/675: getdents d7/df/d1a/d40/db3/dba/d3f/d95 0 2026-03-09T00:04:10.120 INFO:tasks.workunit.client.1.vm06.stdout:4/901: creat d17/d21/d4c/d66/de3/f139 x:0 0 0 2026-03-09T00:04:10.125 INFO:tasks.workunit.client.0.vm03.stdout:0/690: creat d2/da/dd/d49/d6c/d4b/ffe x:0 0 0 2026-03-09T00:04:10.134 INFO:tasks.workunit.client.0.vm03.stdout:0/691: fdatasync d2/da/dd/d49/d6c/d4b/f88 0 2026-03-09T00:04:10.134 INFO:tasks.workunit.client.0.vm03.stdout:0/692: dread - d2/da/dd/d49/d6c/da6/fc2 zero size 2026-03-09T00:04:10.134 INFO:tasks.workunit.client.0.vm03.stdout:3/500: rmdir d2/db/d40/d44/d87 0 2026-03-09T00:04:10.134 INFO:tasks.workunit.client.0.vm03.stdout:8/676: creat d7/df/d1a/d2b/d62/fce x:0 0 0 2026-03-09T00:04:10.135 INFO:tasks.workunit.client.1.vm06.stdout:7/892: getdents d0/d55/d85 0 2026-03-09T00:04:10.135 INFO:tasks.workunit.client.1.vm06.stdout:5/993: creat d5/d1c/d21/d28/d5e/f14f x:0 0 0 2026-03-09T00:04:10.135 INFO:tasks.workunit.client.1.vm06.stdout:5/994: chown d5/d1c/d68/da2/le0 137 1 2026-03-09T00:04:10.135 INFO:tasks.workunit.client.1.vm06.stdout:0/961: mknod d3/d18/d2c/d2d/d31/c145 0 2026-03-09T00:04:10.135 INFO:tasks.workunit.client.1.vm06.stdout:1/803: rename d6/d21/d2d/d3b/d42/f9f to d6/db0/f10e 0 2026-03-09T00:04:10.135 INFO:tasks.workunit.client.1.vm06.stdout:1/804: stat d6/db0/c10b 0 2026-03-09T00:04:10.137 INFO:tasks.workunit.client.0.vm03.stdout:0/693: 
dread d2/da/d1a/f1c [0,4194304] 0 2026-03-09T00:04:10.140 INFO:tasks.workunit.client.1.vm06.stdout:8/926: creat db/dd/d24/db0/f12e x:0 0 0 2026-03-09T00:04:10.140 INFO:tasks.workunit.client.1.vm06.stdout:8/927: dread - db/d53/d70/d38/d47/fb2 zero size 2026-03-09T00:04:10.143 INFO:tasks.workunit.client.1.vm06.stdout:9/795: dread d1/d4/d6e/d14/d25/d85/fb8 [0,4194304] 0 2026-03-09T00:04:10.147 INFO:tasks.workunit.client.1.vm06.stdout:9/796: chown d1/d4/d6e/d14/d25/d85/d49/ld7 128 1 2026-03-09T00:04:10.147 INFO:tasks.workunit.client.1.vm06.stdout:9/797: truncate d1/d73/dcf/ff2 239845 0 2026-03-09T00:04:10.148 INFO:tasks.workunit.client.0.vm03.stdout:3/501: creat d2/db/d3b/f95 x:0 0 0 2026-03-09T00:04:10.149 INFO:tasks.workunit.client.1.vm06.stdout:8/928: dread db/dd/d24/d63/fe9 [0,4194304] 0 2026-03-09T00:04:10.149 INFO:tasks.workunit.client.0.vm03.stdout:8/677: rename d7/df/d1a/d40/c84 to d7/df/d1a/d40/dc8/ccf 0 2026-03-09T00:04:10.151 INFO:tasks.workunit.client.0.vm03.stdout:0/694: mkdir d2/da/dd/d49/d6c/da6/dda/db5/dba/dff 0 2026-03-09T00:04:10.151 INFO:tasks.workunit.client.0.vm03.stdout:0/695: chown d2/da/dd/d49/d6c/da6/dda/db5 473800889 1 2026-03-09T00:04:10.152 INFO:tasks.workunit.client.1.vm06.stdout:2/995: mkdir d7/d132 0 2026-03-09T00:04:10.153 INFO:tasks.workunit.client.1.vm06.stdout:7/893: getdents d0/d55/d99 0 2026-03-09T00:04:10.153 INFO:tasks.workunit.client.1.vm06.stdout:5/995: symlink d5/db1/dcc/l150 0 2026-03-09T00:04:10.154 INFO:tasks.workunit.client.0.vm03.stdout:8/678: mknod d7/df/d1a/d40/db3/dba/cd0 0 2026-03-09T00:04:10.155 INFO:tasks.workunit.client.1.vm06.stdout:0/962: unlink d3/f1a 0 2026-03-09T00:04:10.164 INFO:tasks.workunit.client.0.vm03.stdout:8/679: getdents d7/df/d1a/d40/db3/dba/dad 0 2026-03-09T00:04:10.164 INFO:tasks.workunit.client.1.vm06.stdout:9/798: creat d1/d4/d6e/d14/d25/d85/f107 x:0 0 0 2026-03-09T00:04:10.164 INFO:tasks.workunit.client.1.vm06.stdout:9/799: chown d1/d4/d6e/d9/f4c 170179282 1 2026-03-09T00:04:10.164 INFO:tasks.workunit.client.1.vm06.stdout:7/894: mkdir d0/df/d1a/d27/d70/d9b/de2/d109 0 2026-03-09T00:04:10.164 INFO:tasks.workunit.client.1.vm06.stdout:9/800: truncate d1/f45 966830 0 2026-03-09T00:04:10.164 INFO:tasks.workunit.client.1.vm06.stdout:9/801: creat d1/d4/d6e/d14/d25/d85/d49/f108 x:0 0 0 2026-03-09T00:04:10.167 INFO:tasks.workunit.client.1.vm06.stdout:2/996: dread d7/f8 [4194304,4194304] 0 2026-03-09T00:04:10.169 INFO:tasks.workunit.client.1.vm06.stdout:7/895: write d0/df/d1a/d27/d4c/d40/d5b/faf [3903052,101984] 0 2026-03-09T00:04:10.169 INFO:tasks.workunit.client.1.vm06.stdout:7/896: write d0/df/d17/f7e [8418635,102144] 0 2026-03-09T00:04:10.170 INFO:tasks.workunit.client.1.vm06.stdout:2/997: dread d7/da/fbf [0,4194304] 0 2026-03-09T00:04:10.172 INFO:tasks.workunit.client.1.vm06.stdout:7/897: rename d0/df/d7b/ff3 to d0/df/d17/dba/f10a 0 2026-03-09T00:04:10.188 INFO:tasks.workunit.client.1.vm06.stdout:2/998: write d7/d1a/d3c/f4d [2026548,21056] 0 2026-03-09T00:04:10.188 INFO:tasks.workunit.client.1.vm06.stdout:3/901: dwrite d11/d28/d2e/d2f/d36/fb7 [0,4194304] 0 2026-03-09T00:04:10.189 INFO:tasks.workunit.client.1.vm06.stdout:5/996: dread d5/d1c/d21/d28/f63 [4194304,4194304] 0 2026-03-09T00:04:10.191 INFO:tasks.workunit.client.1.vm06.stdout:2/999: rename d7/d1a/d89/d105 to d7/d1b/da5/d133 0 2026-03-09T00:04:10.192 INFO:tasks.workunit.client.1.vm06.stdout:3/902: rmdir d11/d28/d2e/d2f 39 2026-03-09T00:04:10.193 INFO:tasks.workunit.client.1.vm06.stdout:5/997: creat d5/d1c/d21/f151 x:0 0 0 2026-03-09T00:04:10.193 
INFO:tasks.workunit.client.1.vm06.stdout:3/903: rmdir d11/d28/d2e/d7e 39 2026-03-09T00:04:10.193 INFO:tasks.workunit.client.1.vm06.stdout:3/904: read - d11/d28/d4d/d89/d90/f10b zero size 2026-03-09T00:04:10.193 INFO:tasks.workunit.client.1.vm06.stdout:3/905: write d11/d28/d2e/d7e/d83/d87/f10c [913422,32286] 0 2026-03-09T00:04:10.197 INFO:tasks.workunit.client.0.vm03.stdout:1/776: sync 2026-03-09T00:04:10.203 INFO:tasks.workunit.client.1.vm06.stdout:5/998: dread d5/d1c/d21/d28/d5e/d66/d78/da6/fef [0,4194304] 0 2026-03-09T00:04:10.203 INFO:tasks.workunit.client.1.vm06.stdout:5/999: dread - d5/d1c/d68/f8c zero size 2026-03-09T00:04:10.209 INFO:tasks.workunit.client.0.vm03.stdout:2/696: dwrite d8/d26/d5e/d6f/d97/f1a [0,4194304] 0 2026-03-09T00:04:10.227 INFO:tasks.workunit.client.0.vm03.stdout:5/677: dwrite d1c/d20/d55/d66/d70/f80 [0,4194304] 0 2026-03-09T00:04:10.228 INFO:tasks.workunit.client.0.vm03.stdout:5/678: symlink d1c/d20/dc0/le0 0 2026-03-09T00:04:10.228 INFO:tasks.workunit.client.0.vm03.stdout:6/682: dread d13/d1e/d44/d59/d77/f98 [0,4194304] 0 2026-03-09T00:04:10.230 INFO:tasks.workunit.client.0.vm03.stdout:5/679: mkdir d1c/d20/d55/d4f/d58/d73/d76/d91/de1 0 2026-03-09T00:04:10.230 INFO:tasks.workunit.client.0.vm03.stdout:5/680: dread - d1c/d20/d55/d66/d70/f8c zero size 2026-03-09T00:04:10.232 INFO:tasks.workunit.client.1.vm06.stdout:6/870: dread d4/d27/f70 [0,4194304] 0 2026-03-09T00:04:10.235 INFO:tasks.workunit.client.0.vm03.stdout:5/681: mknod d1c/d51/d6a/ce2 0 2026-03-09T00:04:10.245 INFO:tasks.workunit.client.1.vm06.stdout:6/871: mknod d4/d16/d53/ddf/c10c 0 2026-03-09T00:04:10.246 INFO:tasks.workunit.client.0.vm03.stdout:5/682: mkdir d1c/d20/d55/d66/d6b/de3 0 2026-03-09T00:04:10.246 INFO:tasks.workunit.client.0.vm03.stdout:5/683: read fe [1404186,28163] 0 2026-03-09T00:04:10.246 INFO:tasks.workunit.client.0.vm03.stdout:5/684: write d1c/d20/d55/db0/dc7/fd7 [1501366,56725] 0 2026-03-09T00:04:10.246 INFO:tasks.workunit.client.0.vm03.stdout:5/685: chown d1c/d20/fa3 0 1 2026-03-09T00:04:10.246 INFO:tasks.workunit.client.0.vm03.stdout:5/686: write d1c/d20/d55/fdb [625309,52535] 0 2026-03-09T00:04:10.246 INFO:tasks.workunit.client.0.vm03.stdout:5/687: read - d1c/d20/d55/fbc zero size 2026-03-09T00:04:10.246 INFO:tasks.workunit.client.0.vm03.stdout:5/688: stat f12 0 2026-03-09T00:04:10.246 INFO:tasks.workunit.client.0.vm03.stdout:5/689: creat d1c/d20/d55/d4f/d58/d73/d9e/fe4 x:0 0 0 2026-03-09T00:04:10.246 INFO:tasks.workunit.client.0.vm03.stdout:5/690: symlink d1c/d20/d55/d4f/d58/d73/d76/d91/le5 0 2026-03-09T00:04:10.246 INFO:tasks.workunit.client.0.vm03.stdout:5/691: write d1c/d20/d55/fdb [1344856,16655] 0 2026-03-09T00:04:10.246 INFO:tasks.workunit.client.0.vm03.stdout:5/692: mknod d1c/d51/d6a/ce6 0 2026-03-09T00:04:10.246 INFO:tasks.workunit.client.0.vm03.stdout:5/693: rename d1c/d20/d55/d4f/d58/fa0 to d1c/d20/d55/d4f/d58/d73/d9e/da5/fe7 0 2026-03-09T00:04:10.246 INFO:tasks.workunit.client.0.vm03.stdout:5/694: mknod d1c/d20/d55/d43/ce8 0 2026-03-09T00:04:10.246 INFO:tasks.workunit.client.0.vm03.stdout:5/695: write d1c/fc4 [1011412,5178] 0 2026-03-09T00:04:10.246 INFO:tasks.workunit.client.0.vm03.stdout:5/696: mknod d1c/d20/d55/d66/d6b/d8f/ce9 0 2026-03-09T00:04:10.249 INFO:tasks.workunit.client.1.vm06.stdout:6/872: dread d4/d16/d46/fc4 [0,4194304] 0 2026-03-09T00:04:10.249 INFO:tasks.workunit.client.1.vm06.stdout:6/873: write d4/d27/f103 [349521,41163] 0 2026-03-09T00:04:10.249 INFO:tasks.workunit.client.0.vm03.stdout:5/697: dread d1c/d20/d55/f9b [0,4194304] 0 
2026-03-09T00:04:10.250 INFO:tasks.workunit.client.1.vm06.stdout:6/874: link d4/d27/f61 d4/d8d/f10d 0 2026-03-09T00:04:10.268 INFO:tasks.workunit.client.0.vm03.stdout:5/698: truncate d1c/d20/d55/d4f/d58/d5d/f8d 1446234 0 2026-03-09T00:04:10.268 INFO:tasks.workunit.client.0.vm03.stdout:5/699: mknod d1c/d20/d55/d4f/d58/d73/d76/d91/cea 0 2026-03-09T00:04:10.268 INFO:tasks.workunit.client.0.vm03.stdout:5/700: mknod d1c/d20/d55/d66/d6b/de3/ceb 0 2026-03-09T00:04:10.304 INFO:tasks.workunit.client.0.vm03.stdout:9/673: dwrite f11 [0,4194304] 0 2026-03-09T00:04:10.306 INFO:tasks.workunit.client.0.vm03.stdout:9/674: truncate d15/d1c/d28/f29 330752 0 2026-03-09T00:04:10.318 INFO:tasks.workunit.client.1.vm06.stdout:4/902: dwrite d17/d24/fce [0,4194304] 0 2026-03-09T00:04:10.320 INFO:tasks.workunit.client.1.vm06.stdout:4/903: truncate d17/d24/d49/d5f/db2/fb9 897379 0 2026-03-09T00:04:10.320 INFO:tasks.workunit.client.0.vm03.stdout:3/502: dwrite d2/db/d40/d51/f5c [0,4194304] 0 2026-03-09T00:04:10.321 INFO:tasks.workunit.client.0.vm03.stdout:1/777: dwrite d4/f42 [0,4194304] 0 2026-03-09T00:04:10.321 INFO:tasks.workunit.client.0.vm03.stdout:3/503: mkdir d2/db/d3b/d5d/d6d/d72/d96 0 2026-03-09T00:04:10.324 INFO:tasks.workunit.client.0.vm03.stdout:8/680: dwrite d7/df/d1a/d40/db3/dba/d3f/d95/fb4 [0,4194304] 0 2026-03-09T00:04:10.328 INFO:tasks.workunit.client.1.vm06.stdout:4/904: dread d17/d5b/fff [0,4194304] 0 2026-03-09T00:04:10.328 INFO:tasks.workunit.client.1.vm06.stdout:4/905: fdatasync d17/d24/f5c 0 2026-03-09T00:04:10.328 INFO:tasks.workunit.client.1.vm06.stdout:4/906: write d17/d21/d4c/faf [696019,44569] 0 2026-03-09T00:04:10.328 INFO:tasks.workunit.client.1.vm06.stdout:4/907: fdatasync d17/d21/d4c/d66/fcf 0 2026-03-09T00:04:10.336 INFO:tasks.workunit.client.1.vm06.stdout:4/908: dread d17/d24/d3b/d5e/f6f [0,4194304] 0 2026-03-09T00:04:10.336 INFO:tasks.workunit.client.1.vm06.stdout:4/909: write d17/d5b/ff9 [623013,74561] 0 2026-03-09T00:04:10.336 INFO:tasks.workunit.client.1.vm06.stdout:3/906: dwrite d11/d28/d2e/db2/f101 [0,4194304] 0 2026-03-09T00:04:10.345 INFO:tasks.workunit.client.1.vm06.stdout:4/910: getdents d17/d24/d49/de4 0 2026-03-09T00:04:10.348 INFO:tasks.workunit.client.1.vm06.stdout:3/907: mknod d11/d28/d2e/d2f/dc1/c138 0 2026-03-09T00:04:10.351 INFO:tasks.workunit.client.1.vm06.stdout:4/911: link d17/d5b/d8f/ld0 d17/d21/d4c/d50/l13a 0 2026-03-09T00:04:10.351 INFO:tasks.workunit.client.1.vm06.stdout:4/912: write d17/d21/d4c/dc2/fcd [914760,100135] 0 2026-03-09T00:04:10.356 INFO:tasks.workunit.client.0.vm03.stdout:8/681: write d7/df/d1a/d40/db3/f88 [3191292,82549] 0 2026-03-09T00:04:10.357 INFO:tasks.workunit.client.1.vm06.stdout:4/913: write d17/d24/f39 [2074006,104260] 0 2026-03-09T00:04:10.357 INFO:tasks.workunit.client.1.vm06.stdout:4/914: write d17/d21/d4c/d66/de3/f139 [948107,41686] 0 2026-03-09T00:04:10.357 INFO:tasks.workunit.client.1.vm06.stdout:4/915: creat d17/d21/d4c/d50/f13b x:0 0 0 2026-03-09T00:04:10.363 INFO:tasks.workunit.client.0.vm03.stdout:2/697: dwrite d8/d1b/d24/da5/fb5 [0,4194304] 0 2026-03-09T00:04:10.371 INFO:tasks.workunit.client.0.vm03.stdout:2/698: rename d8/d1b/d2a/d6b/d50/c70 to d8/d1b/d24/da5/da8/ce4 0 2026-03-09T00:04:10.414 INFO:tasks.workunit.client.1.vm06.stdout:9/802: dwrite d1/d4/d6e/d14/d25/d85/fb8 [0,4194304] 0 2026-03-09T00:04:10.419 INFO:tasks.workunit.client.1.vm06.stdout:7/898: dwrite d0/df/d1a/d27/d4c/f32 [4194304,4194304] 0 2026-03-09T00:04:10.419 INFO:tasks.workunit.client.1.vm06.stdout:7/899: dread - d0/df/d1a/d27/d4c/d40/d51/d90/dae/de0/fe7 
zero size 2026-03-09T00:04:10.419 INFO:tasks.workunit.client.0.vm03.stdout:5/701: dread d1c/d20/d55/d4f/d58/d73/d76/fd5 [0,4194304] 0 2026-03-09T00:04:10.419 INFO:tasks.workunit.client.0.vm03.stdout:3/504: dwrite d2/db/d3b/d5d/d6d/d72/f7a [0,4194304] 0 2026-03-09T00:04:10.419 INFO:tasks.workunit.client.0.vm03.stdout:3/505: fdatasync d2/db/d2d/d55/f6f 0 2026-03-09T00:04:10.422 INFO:tasks.workunit.client.1.vm06.stdout:9/803: mkdir d1/d73/dcf/d109 0 2026-03-09T00:04:10.422 INFO:tasks.workunit.client.1.vm06.stdout:9/804: fsync d1/d3/d4f/d91/de8/ff4 0 2026-03-09T00:04:10.434 INFO:tasks.workunit.client.0.vm03.stdout:5/702: link d1c/d20/d55/d4f/l85 d1c/d20/d55/dac/lec 0 2026-03-09T00:04:10.441 INFO:tasks.workunit.client.0.vm03.stdout:5/703: stat d1c/d20/d55/d4f/d58/d73/d9e/da5 0 2026-03-09T00:04:10.441 INFO:tasks.workunit.client.0.vm03.stdout:5/704: fdatasync d1c/d20/d55/d66/d6b/d8f/f98 0 2026-03-09T00:04:10.441 INFO:tasks.workunit.client.0.vm03.stdout:5/705: chown d1c/d20/d56/ld6 2936 1 2026-03-09T00:04:10.442 INFO:tasks.workunit.client.1.vm06.stdout:7/900: creat d0/df/d1a/d22/de3/f10b x:0 0 0 2026-03-09T00:04:10.442 INFO:tasks.workunit.client.1.vm06.stdout:7/901: link d0/df/d1a/d3f/de8/lf1 d0/df/d1a/d27/d70/d9b/de2/d109/l10c 0 2026-03-09T00:04:10.442 INFO:tasks.workunit.client.0.vm03.stdout:5/706: mkdir d1c/d20/d97/ded 0 2026-03-09T00:04:10.442 INFO:tasks.workunit.client.0.vm03.stdout:5/707: truncate d1c/f29 4654338 0 2026-03-09T00:04:10.442 INFO:tasks.workunit.client.0.vm03.stdout:5/708: stat d1c/d20/d55/d4f/d58/d73/d9e/da5 0 2026-03-09T00:04:10.450 INFO:tasks.workunit.client.0.vm03.stdout:1/778: dwrite f2 [4194304,4194304] 0 2026-03-09T00:04:10.453 INFO:tasks.workunit.client.1.vm06.stdout:7/902: write d0/df/d1a/d3a/d4e/d5e/f73 [46629,56384] 0 2026-03-09T00:04:10.453 INFO:tasks.workunit.client.1.vm06.stdout:7/903: symlink d0/df/d1a/d27/d4c/d40/l10d 0 2026-03-09T00:04:10.453 INFO:tasks.workunit.client.1.vm06.stdout:7/904: stat d0/df/d1a/d22/de3/f10b 0 2026-03-09T00:04:10.456 INFO:tasks.workunit.client.1.vm06.stdout:7/905: rmdir d0/d55/d99/db2 39 2026-03-09T00:04:10.456 INFO:tasks.workunit.client.1.vm06.stdout:7/906: truncate d0/df/d1a/d3a/d4e/d5e/f6f 4502101 0 2026-03-09T00:04:10.459 INFO:tasks.workunit.client.0.vm03.stdout:1/779: write d4/d15/d5c/f74 [198437,65607] 0 2026-03-09T00:04:10.462 INFO:tasks.workunit.client.0.vm03.stdout:3/506: dread d2/db/d2d/f54 [0,4194304] 0 2026-03-09T00:04:10.462 INFO:tasks.workunit.client.0.vm03.stdout:3/507: dread - d2/db/d3b/f62 zero size 2026-03-09T00:04:10.462 INFO:tasks.workunit.client.0.vm03.stdout:3/508: chown d2/db/d6a/l71 29 1 2026-03-09T00:04:10.466 INFO:tasks.workunit.client.0.vm03.stdout:1/780: mknod d4/d3a/d3d/d98/dee/deb/c109 0 2026-03-09T00:04:10.467 INFO:tasks.workunit.client.0.vm03.stdout:3/509: mknod d2/db/d40/d58/c97 0 2026-03-09T00:04:10.467 INFO:tasks.workunit.client.0.vm03.stdout:3/510: fsync d2/db/d2d/f8b 0 2026-03-09T00:04:10.471 INFO:tasks.workunit.client.0.vm03.stdout:1/781: creat d4/d3a/d32/f10a x:0 0 0 2026-03-09T00:04:10.471 INFO:tasks.workunit.client.0.vm03.stdout:1/782: readlink d4/d3a/d3d/d98/dee/lcf 0 2026-03-09T00:04:10.477 INFO:tasks.workunit.client.1.vm06.stdout:6/875: write d4/d27/f84 [264883,85682] 0 2026-03-09T00:04:10.491 INFO:tasks.workunit.client.1.vm06.stdout:6/876: rename d4/d16/d46/df8 to d4/d16/d53/df2/d10e 0 2026-03-09T00:04:10.491 INFO:tasks.workunit.client.1.vm06.stdout:6/877: mknod d4/d16/d53/ddf/d7e/dac/dd3/c10f 0 2026-03-09T00:04:10.491 INFO:tasks.workunit.client.0.vm03.stdout:7/606: sync 
2026-03-09T00:04:10.491 INFO:tasks.workunit.client.0.vm03.stdout:4/761: sync 2026-03-09T00:04:10.491 INFO:tasks.workunit.client.0.vm03.stdout:7/607: chown d2/d1f/d3a/d24/da4/d46/d81/d96/d8e/db2 11515122 1 2026-03-09T00:04:10.491 INFO:tasks.workunit.client.0.vm03.stdout:7/608: getdents d2/d1f/d3a/d24/da4/d46/d81/d96/d8e 0 2026-03-09T00:04:10.491 INFO:tasks.workunit.client.0.vm03.stdout:7/609: unlink d2/d1f/d3a/d24/da4/d91/d67/f8a 0 2026-03-09T00:04:10.491 INFO:tasks.workunit.client.0.vm03.stdout:7/610: write d2/d1f/d3a/d24/da4/d46/d81/d96/d37/fb0 [559184,128647] 0 2026-03-09T00:04:10.491 INFO:tasks.workunit.client.0.vm03.stdout:7/611: rmdir d2/d1f/d3a/d24/da4/d91/d67 39 2026-03-09T00:04:10.491 INFO:tasks.workunit.client.0.vm03.stdout:7/612: write d2/d1f/d3a/d24/da4/d91/d67/f64 [744613,16005] 0 2026-03-09T00:04:10.494 INFO:tasks.workunit.client.0.vm03.stdout:6/683: dwrite d13/d1e/f34 [4194304,4194304] 0 2026-03-09T00:04:10.499 INFO:tasks.workunit.client.0.vm03.stdout:6/684: getdents d13/d35/d74 0 2026-03-09T00:04:10.506 INFO:tasks.workunit.client.0.vm03.stdout:6/685: dread d13/d35/d71/d97/da5/fad [0,4194304] 0 2026-03-09T00:04:10.506 INFO:tasks.workunit.client.0.vm03.stdout:6/686: creat d13/d35/db5/fe3 x:0 0 0 2026-03-09T00:04:10.507 INFO:tasks.workunit.client.0.vm03.stdout:6/687: getdents d13/d35/d74/d89/d9d 0 2026-03-09T00:04:10.528 INFO:tasks.workunit.client.1.vm06.stdout:9/805: dwrite d1/d4/d6e/d9/f82 [0,4194304] 0 2026-03-09T00:04:10.532 INFO:tasks.workunit.client.1.vm06.stdout:6/878: dread d4/f68 [0,4194304] 0 2026-03-09T00:04:10.532 INFO:tasks.workunit.client.1.vm06.stdout:6/879: fdatasync d4/d16/d46/d90/fd0 0 2026-03-09T00:04:10.533 INFO:tasks.workunit.client.1.vm06.stdout:6/880: getdents d4/d16/d53/ddf 0 2026-03-09T00:04:10.533 INFO:tasks.workunit.client.1.vm06.stdout:6/881: mkdir d4/d16/d53/ddf/d52/d110 0 2026-03-09T00:04:10.537 INFO:tasks.workunit.client.1.vm06.stdout:9/806: dread d1/d4/d6e/d14/d25/f6f [0,4194304] 0 2026-03-09T00:04:10.538 INFO:tasks.workunit.client.1.vm06.stdout:9/807: rmdir d1/da7/dfc 39 2026-03-09T00:04:10.565 INFO:tasks.workunit.client.0.vm03.stdout:0/696: sync 2026-03-09T00:04:10.565 INFO:tasks.workunit.client.0.vm03.stdout:0/697: creat d2/da/dd/d49/d6c/d4b/f100 x:0 0 0 2026-03-09T00:04:10.565 INFO:tasks.workunit.client.0.vm03.stdout:0/698: readlink d2/da/dd/d6e/l93 0 2026-03-09T00:04:10.567 INFO:tasks.workunit.client.0.vm03.stdout:8/682: dwrite d7/df/d1a/d40/db3/dba/dad/fc6 [0,4194304] 0 2026-03-09T00:04:10.586 INFO:tasks.workunit.client.1.vm06.stdout:7/907: dwrite d0/df/d1a/d27/d4c/d40/d51/d90/dae/de0/ff4 [0,4194304] 0 2026-03-09T00:04:10.586 INFO:tasks.workunit.client.1.vm06.stdout:7/908: write d0/f14 [6086200,80746] 0 2026-03-09T00:04:10.589 INFO:tasks.workunit.client.0.vm03.stdout:8/683: dread d7/df/d1a/d40/db3/dba/d3f/f7d [0,4194304] 0 2026-03-09T00:04:10.590 INFO:tasks.workunit.client.0.vm03.stdout:4/762: dwrite d7/d20/d6a/dea/d38/f8f [0,4194304] 0 2026-03-09T00:04:10.590 INFO:tasks.workunit.client.0.vm03.stdout:4/763: chown d7/d27/c8d 9175 1 2026-03-09T00:04:10.595 INFO:tasks.workunit.client.1.vm06.stdout:3/908: dwrite d11/d28/d2e/d2f/d36/f59 [0,4194304] 0 2026-03-09T00:04:10.598 INFO:tasks.workunit.client.0.vm03.stdout:4/764: write d7/d20/d6a/d77/f83 [1688743,22974] 0 2026-03-09T00:04:10.600 INFO:tasks.workunit.client.0.vm03.stdout:2/699: rename d8/d1b/d24/da5/da8 to d8/d26/d5e/d5f/d95/de5 0 2026-03-09T00:04:10.603 INFO:tasks.workunit.client.0.vm03.stdout:4/765: unlink d7/d20/d6a/dea/fa0 0 2026-03-09T00:04:10.607 
INFO:tasks.workunit.client.1.vm06.stdout:3/909: truncate d11/f12 2059938 0 2026-03-09T00:04:10.607 INFO:tasks.workunit.client.0.vm03.stdout:2/700: mkdir d8/d1b/d8f/de6 0 2026-03-09T00:04:10.607 INFO:tasks.workunit.client.0.vm03.stdout:4/766: creat d7/d20/d6a/dea/d54/ff3 x:0 0 0 2026-03-09T00:04:10.607 INFO:tasks.workunit.client.0.vm03.stdout:2/701: getdents d8/d74 0 2026-03-09T00:04:10.608 INFO:tasks.workunit.client.0.vm03.stdout:6/688: dwrite d13/d35/d69/f84 [4194304,4194304] 0 2026-03-09T00:04:10.610 INFO:tasks.workunit.client.0.vm03.stdout:2/702: dread f2 [4194304,4194304] 0 2026-03-09T00:04:10.610 INFO:tasks.workunit.client.0.vm03.stdout:2/703: read d8/d26/d5e/d5f/d95/faf [698644,51464] 0 2026-03-09T00:04:10.611 INFO:tasks.workunit.client.1.vm06.stdout:3/910: unlink d11/d28/d2e/d2f/d5b/d5f/l80 0 2026-03-09T00:04:10.613 INFO:tasks.workunit.client.0.vm03.stdout:2/704: getdents d8/d1b/d24/da5/dda 0 2026-03-09T00:04:10.613 INFO:tasks.workunit.client.0.vm03.stdout:2/705: readlink d8/d1b/d2a/d6b/d50/d8a/lbd 0 2026-03-09T00:04:10.616 INFO:tasks.workunit.client.0.vm03.stdout:2/706: mknod d8/d26/ce7 0 2026-03-09T00:04:10.618 INFO:tasks.workunit.client.0.vm03.stdout:3/511: dwrite d2/db/d40/d88/f89 [0,4194304] 0 2026-03-09T00:04:10.623 INFO:tasks.workunit.client.0.vm03.stdout:3/512: symlink d2/db/d40/d44/l98 0 2026-03-09T00:04:10.634 INFO:tasks.workunit.client.0.vm03.stdout:4/767: dread d7/d20/d6a/dea/d4e/f9d [0,4194304] 0 2026-03-09T00:04:10.634 INFO:tasks.workunit.client.0.vm03.stdout:4/768: chown d7/d6f/da5/cad 92834140 1 2026-03-09T00:04:10.634 INFO:tasks.workunit.client.1.vm06.stdout:3/911: read d11/f1d [1137491,128184] 0 2026-03-09T00:04:10.641 INFO:tasks.workunit.client.0.vm03.stdout:5/709: dread d1c/d20/d55/d4f/d58/d73/d9e/fd1 [0,4194304] 0 2026-03-09T00:04:10.644 INFO:tasks.workunit.client.0.vm03.stdout:5/710: stat d1c/d20/l4b 0 2026-03-09T00:04:10.644 INFO:tasks.workunit.client.0.vm03.stdout:5/711: chown d1c/d20/l50 1170358119 1 2026-03-09T00:04:10.644 INFO:tasks.workunit.client.1.vm06.stdout:3/912: chown d11/l23 492463994 1 2026-03-09T00:04:10.644 INFO:tasks.workunit.client.0.vm03.stdout:4/769: mknod d7/de6/cf4 0 2026-03-09T00:04:10.645 INFO:tasks.workunit.client.0.vm03.stdout:4/770: symlink d7/d20/d6a/dea/d38/da9/lf5 0 2026-03-09T00:04:10.647 INFO:tasks.workunit.client.0.vm03.stdout:4/771: rmdir d7/d20/d6a/dea/d38/da9 39 2026-03-09T00:04:10.647 INFO:tasks.workunit.client.0.vm03.stdout:4/772: fdatasync d7/d20/d6a/dde/fec 0 2026-03-09T00:04:10.648 INFO:tasks.workunit.client.1.vm06.stdout:3/913: dread d11/d28/d4d/d89/d90/fba [0,4194304] 0 2026-03-09T00:04:10.648 INFO:tasks.workunit.client.1.vm06.stdout:3/914: truncate d11/d28/d57/f7b 2358385 0 2026-03-09T00:04:10.658 INFO:tasks.workunit.client.1.vm06.stdout:9/808: dwrite d1/d4/d6e/ffa [0,4194304] 0 2026-03-09T00:04:10.658 INFO:tasks.workunit.client.1.vm06.stdout:9/809: chown d1/d4/d6e/d14/d25/c80 746578 1 2026-03-09T00:04:10.658 INFO:tasks.workunit.client.1.vm06.stdout:9/810: write d1/da7/fea [840531,118060] 0 2026-03-09T00:04:10.659 INFO:tasks.workunit.client.0.vm03.stdout:5/712: truncate d1c/d51/f68 3855436 0 2026-03-09T00:04:10.665 INFO:tasks.workunit.client.1.vm06.stdout:9/811: unlink d1/da7/dfc/f102 0 2026-03-09T00:04:10.678 INFO:tasks.workunit.client.1.vm06.stdout:9/812: symlink d1/d3/d4f/d91/ddb/l10a 0 2026-03-09T00:04:10.679 INFO:tasks.workunit.client.0.vm03.stdout:5/713: getdents d1c/d67 0 2026-03-09T00:04:10.679 INFO:tasks.workunit.client.1.vm06.stdout:9/813: truncate d1/d4/f24 2336191 0 2026-03-09T00:04:10.682 
INFO:tasks.workunit.client.0.vm03.stdout:5/714: dread d1c/d20/f39 [0,4194304] 0 2026-03-09T00:04:10.708 INFO:tasks.workunit.client.0.vm03.stdout:1/783: dwrite d4/d15/dae/d101/f108 [0,4194304] 0 2026-03-09T00:04:10.710 INFO:tasks.workunit.client.0.vm03.stdout:1/784: unlink d4/d15/d77/fe4 0 2026-03-09T00:04:10.711 INFO:tasks.workunit.client.0.vm03.stdout:1/785: creat d4/d3a/d61/da6/f10b x:0 0 0 2026-03-09T00:04:10.712 INFO:tasks.workunit.client.0.vm03.stdout:1/786: creat d4/d3a/d3d/d98/dee/deb/f10c x:0 0 0 2026-03-09T00:04:10.746 INFO:tasks.workunit.client.0.vm03.stdout:2/707: dwrite d8/d26/d5e/d5f/f48 [0,4194304] 0 2026-03-09T00:04:10.747 INFO:tasks.workunit.client.0.vm03.stdout:2/708: getdents d8/d1b/d2a/d6b 0 2026-03-09T00:04:10.747 INFO:tasks.workunit.client.0.vm03.stdout:5/715: dwrite d1c/d20/d55/d4f/d58/d5d/fd0 [0,4194304] 0 2026-03-09T00:04:10.751 INFO:tasks.workunit.client.1.vm06.stdout:6/882: dwrite d4/d27/f103 [0,4194304] 0 2026-03-09T00:04:10.751 INFO:tasks.workunit.client.0.vm03.stdout:7/613: rename d2/d1f/d3a/d24/da4/d91 to d2/d4/db7 0 2026-03-09T00:04:10.757 INFO:tasks.workunit.client.0.vm03.stdout:5/716: link d1c/d20/d55/d66/d6b/d8f/lb9 d1c/d20/d55/d43/lee 0 2026-03-09T00:04:10.759 INFO:tasks.workunit.client.0.vm03.stdout:7/614: link d2/d4/d8c/la9 d2/d1f/d3a/d24/da4/d46/lb8 0 2026-03-09T00:04:10.759 INFO:tasks.workunit.client.0.vm03.stdout:7/615: chown d2/l89 481856 1 2026-03-09T00:04:10.761 INFO:tasks.workunit.client.0.vm03.stdout:0/699: rename d2/da/d76/d8a/fa3 to d2/da/dd/d49/d6c/da6/dda/f101 0 2026-03-09T00:04:10.763 INFO:tasks.workunit.client.0.vm03.stdout:7/616: write d2/d1f/d3a/d24/da4/d46/d54/f77 [1379086,42285] 0 2026-03-09T00:04:10.763 INFO:tasks.workunit.client.0.vm03.stdout:7/617: dread - d2/d4/db7/daa/fb3 zero size 2026-03-09T00:04:10.764 INFO:tasks.workunit.client.0.vm03.stdout:5/717: link d1c/d20/d55/ca8 d1c/d20/d55/d4f/d58/d73/d9e/cef 0 2026-03-09T00:04:10.764 INFO:tasks.workunit.client.0.vm03.stdout:5/718: readlink d1c/d20/d55/dac/lec 0 2026-03-09T00:04:10.764 INFO:tasks.workunit.client.0.vm03.stdout:4/773: dwrite d7/d20/d6a/fe4 [0,4194304] 0 2026-03-09T00:04:10.764 INFO:tasks.workunit.client.0.vm03.stdout:4/774: write d7/d20/fce [151680,63948] 0 2026-03-09T00:04:10.769 INFO:tasks.workunit.client.1.vm06.stdout:3/915: dwrite d11/d28/d2e/d2f/d5b/ddb/df1/f134 [0,4194304] 0 2026-03-09T00:04:10.769 INFO:tasks.workunit.client.0.vm03.stdout:0/700: symlink d2/da/d36/ddf/df7/l102 0 2026-03-09T00:04:10.777 INFO:tasks.workunit.client.0.vm03.stdout:9/675: sync 2026-03-09T00:04:10.785 INFO:tasks.workunit.client.1.vm06.stdout:1/805: sync 2026-03-09T00:04:10.786 INFO:tasks.workunit.client.1.vm06.stdout:3/916: unlink d11/d28/d2e/d2f/d5b/d5f/db1/c113 0 2026-03-09T00:04:10.786 INFO:tasks.workunit.client.1.vm06.stdout:3/917: fdatasync d11/d28/d2e/d7e/d83/f9a 0 2026-03-09T00:04:10.791 INFO:tasks.workunit.client.0.vm03.stdout:5/719: truncate d1c/d20/d97/fb3 1138759 0 2026-03-09T00:04:10.807 INFO:tasks.workunit.client.0.vm03.stdout:4/775: chown d7/d20/d6a/dea/c2d 4237877 1 2026-03-09T00:04:10.807 INFO:tasks.workunit.client.0.vm03.stdout:4/776: chown d7/d20/d6a/dea/d4e/f9d 45 1 2026-03-09T00:04:10.807 INFO:tasks.workunit.client.0.vm03.stdout:0/701: creat d2/da/dd/d49/d6c/da6/dda/db5/dba/f103 x:0 0 0 2026-03-09T00:04:10.807 INFO:tasks.workunit.client.0.vm03.stdout:0/702: fdatasync d2/da/d1a/f56 0 2026-03-09T00:04:10.807 INFO:tasks.workunit.client.0.vm03.stdout:7/618: rmdir d2/d4/d1e/d78 39 2026-03-09T00:04:10.807 INFO:tasks.workunit.client.0.vm03.stdout:3/513: rename d2/db/d6a/d70 to 
d2/db/d40/d44/d68/d99 0 2026-03-09T00:04:10.807 INFO:tasks.workunit.client.0.vm03.stdout:0/703: symlink d2/da/dd/l104 0 2026-03-09T00:04:10.807 INFO:tasks.workunit.client.0.vm03.stdout:7/619: mkdir d2/d1f/d3a/d24/da4/d46/d81/d96/d88/db9 0 2026-03-09T00:04:10.807 INFO:tasks.workunit.client.0.vm03.stdout:4/777: rmdir d7/d20/d6a/d77/db7 39 2026-03-09T00:04:10.807 INFO:tasks.workunit.client.0.vm03.stdout:1/787: rename d4/d15/d5c/d6c/f71 to d4/d3a/d3d/f10d 0 2026-03-09T00:04:10.807 INFO:tasks.workunit.client.0.vm03.stdout:0/704: creat d2/da/d36/ddf/df7/f105 x:0 0 0 2026-03-09T00:04:10.807 INFO:tasks.workunit.client.0.vm03.stdout:1/788: creat d4/d3a/f10e x:0 0 0 2026-03-09T00:04:10.807 INFO:tasks.workunit.client.0.vm03.stdout:7/620: rename d2/d4/d1e/c7b to d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/d9c/cba 0 2026-03-09T00:04:10.807 INFO:tasks.workunit.client.0.vm03.stdout:3/514: link d2/db/l33 d2/db/d3b/l9a 0 2026-03-09T00:04:10.807 INFO:tasks.workunit.client.0.vm03.stdout:0/705: mknod d2/da/dd/d49/d6c/da6/dda/c106 0 2026-03-09T00:04:10.807 INFO:tasks.workunit.client.0.vm03.stdout:0/706: write d2/da/d1a/f91 [599331,82167] 0 2026-03-09T00:04:10.809 INFO:tasks.workunit.client.0.vm03.stdout:4/778: mkdir d7/d20/d6a/dea/df6 0 2026-03-09T00:04:10.810 INFO:tasks.workunit.client.0.vm03.stdout:4/779: write d7/f15 [2078703,25955] 0 2026-03-09T00:04:10.810 INFO:tasks.workunit.client.0.vm03.stdout:1/789: truncate d4/d6/f6e 2513232 0 2026-03-09T00:04:10.810 INFO:tasks.workunit.client.0.vm03.stdout:3/515: mknod d2/db/d3b/d5f/d65/c9b 0 2026-03-09T00:04:10.812 INFO:tasks.workunit.client.0.vm03.stdout:4/780: creat d7/d6f/dcf/ff7 x:0 0 0 2026-03-09T00:04:10.813 INFO:tasks.workunit.client.0.vm03.stdout:1/790: creat d4/d3a/d32/da3/f10f x:0 0 0 2026-03-09T00:04:10.820 INFO:tasks.workunit.client.0.vm03.stdout:4/781: mknod d7/de6/cf8 0 2026-03-09T00:04:10.821 INFO:tasks.workunit.client.0.vm03.stdout:0/707: dread d2/da/dd/d49/fcb [0,4194304] 0 2026-03-09T00:04:10.836 INFO:tasks.workunit.client.0.vm03.stdout:4/782: dread d7/d20/f21 [0,4194304] 0 2026-03-09T00:04:10.836 INFO:tasks.workunit.client.0.vm03.stdout:4/783: creat d7/d6f/dcf/ff9 x:0 0 0 2026-03-09T00:04:10.837 INFO:tasks.workunit.client.1.vm06.stdout:9/814: dwrite d1/da7/fb9 [4194304,4194304] 0 2026-03-09T00:04:10.839 INFO:tasks.workunit.client.0.vm03.stdout:4/784: dread d7/d20/d6a/d77/f83 [0,4194304] 0 2026-03-09T00:04:10.839 INFO:tasks.workunit.client.0.vm03.stdout:4/785: creat d7/d20/d6a/dde/ffa x:0 0 0 2026-03-09T00:04:10.839 INFO:tasks.workunit.client.0.vm03.stdout:4/786: read - d7/d20/d6a/dde/ffa zero size 2026-03-09T00:04:10.839 INFO:tasks.workunit.client.0.vm03.stdout:4/787: write d7/d20/d6a/dde/ffa [631470,15094] 0 2026-03-09T00:04:10.841 INFO:tasks.workunit.client.1.vm06.stdout:9/815: unlink d1/d4/d6e/d14/d25/d85/f107 0 2026-03-09T00:04:10.842 INFO:tasks.workunit.client.1.vm06.stdout:9/816: creat d1/d3/d4f/d91/f10b x:0 0 0 2026-03-09T00:04:10.843 INFO:tasks.workunit.client.0.vm03.stdout:4/788: unlink f4 0 2026-03-09T00:04:10.853 INFO:tasks.workunit.client.0.vm03.stdout:4/789: dread d7/d20/f3d [0,4194304] 0 2026-03-09T00:04:10.898 INFO:tasks.workunit.client.1.vm06.stdout:6/883: dwrite d4/d16/d53/ddf/d4b/fad [0,4194304] 0 2026-03-09T00:04:10.902 INFO:tasks.workunit.client.1.vm06.stdout:6/884: read d4/d16/d46/fc4 [1787857,28017] 0 2026-03-09T00:04:10.926 INFO:tasks.workunit.client.0.vm03.stdout:2/709: dwrite d8/f3e [0,4194304] 0 2026-03-09T00:04:10.927 INFO:tasks.workunit.client.0.vm03.stdout:8/684: dwrite d7/f11 [0,4194304] 0 2026-03-09T00:04:10.938 
INFO:tasks.workunit.client.0.vm03.stdout:1/791: dread d4/d15/d77/f7a [0,4194304] 0 2026-03-09T00:04:10.952 INFO:tasks.workunit.client.1.vm06.stdout:1/806: dwrite d6/d21/d2d/fc5 [0,4194304] 0 2026-03-09T00:04:10.952 INFO:tasks.workunit.client.1.vm06.stdout:1/807: truncate d6/d4c/fc3 4659913 0 2026-03-09T00:04:10.952 INFO:tasks.workunit.client.1.vm06.stdout:1/808: write d6/fa [4485427,27375] 0 2026-03-09T00:04:10.954 INFO:tasks.workunit.client.1.vm06.stdout:1/809: truncate d6/d4c/d71/fea 2429618 0 2026-03-09T00:04:10.954 INFO:tasks.workunit.client.1.vm06.stdout:1/810: fdatasync d6/d63/f99 0 2026-03-09T00:04:10.954 INFO:tasks.workunit.client.1.vm06.stdout:1/811: chown d6/d4c/d79/l68 13830132 1 2026-03-09T00:04:10.954 INFO:tasks.workunit.client.1.vm06.stdout:1/812: readlink d6/d21/d2d/d3b/d87/laf 0 2026-03-09T00:04:10.955 INFO:tasks.workunit.client.1.vm06.stdout:1/813: mkdir d6/d8f/d10f 0 2026-03-09T00:04:10.958 INFO:tasks.workunit.client.1.vm06.stdout:8/929: sync 2026-03-09T00:04:10.959 INFO:tasks.workunit.client.1.vm06.stdout:8/930: link db/dd/d24/c32 db/d74/d78/d98/db6/dc7/d101/db7/c12f 0 2026-03-09T00:04:10.959 INFO:tasks.workunit.client.1.vm06.stdout:8/931: fsync db/d53/f76 0 2026-03-09T00:04:10.960 INFO:tasks.workunit.client.1.vm06.stdout:1/814: write d6/d21/d2d/d3b/d42/f9a [2225894,95238] 0 2026-03-09T00:04:10.961 INFO:tasks.workunit.client.1.vm06.stdout:1/815: getdents d6/d4c 0 2026-03-09T00:04:10.961 INFO:tasks.workunit.client.1.vm06.stdout:1/816: chown d6/d21/d2d/d3b/lb7 1 1 2026-03-09T00:04:10.964 INFO:tasks.workunit.client.1.vm06.stdout:8/932: dread db/d53/d70/f54 [0,4194304] 0 2026-03-09T00:04:10.964 INFO:tasks.workunit.client.1.vm06.stdout:8/933: creat db/d74/d87/f130 x:0 0 0 2026-03-09T00:04:10.964 INFO:tasks.workunit.client.1.vm06.stdout:1/817: rename d6/f81 to d6/d21/dfc/de8/f110 0 2026-03-09T00:04:10.964 INFO:tasks.workunit.client.1.vm06.stdout:8/934: mkdir db/dd/d24/dac/d131 0 2026-03-09T00:04:10.964 INFO:tasks.workunit.client.1.vm06.stdout:8/935: chown db/l12 5 1 2026-03-09T00:04:10.970 INFO:tasks.workunit.client.1.vm06.stdout:8/936: write db/d74/d78/d98/db6/fff [2935797,49564] 0 2026-03-09T00:04:10.970 INFO:tasks.workunit.client.1.vm06.stdout:9/817: dwrite d1/d4/d6e/d14/d25/d85/f72 [0,4194304] 0 2026-03-09T00:04:10.970 INFO:tasks.workunit.client.1.vm06.stdout:1/818: getdents d6/d63 0 2026-03-09T00:04:10.970 INFO:tasks.workunit.client.1.vm06.stdout:8/937: truncate db/d74/d78/fd2 3523235 0 2026-03-09T00:04:10.973 INFO:tasks.workunit.client.0.vm03.stdout:9/676: dwrite d15/d1c/d21/d75/fa6 [0,4194304] 0 2026-03-09T00:04:10.973 INFO:tasks.workunit.client.1.vm06.stdout:9/818: rename d1/d3/d4f/d52/fa5 to d1/d3/d2b/d58/f10c 0 2026-03-09T00:04:10.973 INFO:tasks.workunit.client.0.vm03.stdout:4/790: fsync d7/d20/d6a/d77/d25/f7f 0 2026-03-09T00:04:10.975 INFO:tasks.workunit.client.1.vm06.stdout:1/819: mknod d6/d21/d2d/d3b/d42/c111 0 2026-03-09T00:04:10.977 INFO:tasks.workunit.client.1.vm06.stdout:1/820: dread d6/d21/d2d/d37/f8b [0,4194304] 0 2026-03-09T00:04:10.983 INFO:tasks.workunit.client.1.vm06.stdout:1/821: creat d6/d4c/d79/d10c/f112 x:0 0 0 2026-03-09T00:04:10.991 INFO:tasks.workunit.client.1.vm06.stdout:1/822: stat d6/l2b 0 2026-03-09T00:04:10.992 INFO:tasks.workunit.client.0.vm03.stdout:8/685: read d7/df/d1a/d40/db3/f74 [3251175,74120] 0 2026-03-09T00:04:10.992 INFO:tasks.workunit.client.0.vm03.stdout:4/791: mkdir d7/d20/d6a/dea/d38/dfb 0 2026-03-09T00:04:10.992 INFO:tasks.workunit.client.0.vm03.stdout:4/792: write d7/d20/d6a/d77/d25/fb8 [5234897,63672] 0 
2026-03-09T00:04:10.992 INFO:tasks.workunit.client.0.vm03.stdout:4/793: fsync d7/d20/d6a/dea/d54/f96 0 2026-03-09T00:04:10.992 INFO:tasks.workunit.client.0.vm03.stdout:4/794: write d7/d6f/da5/fe3 [448185,45224] 0 2026-03-09T00:04:10.992 INFO:tasks.workunit.client.0.vm03.stdout:8/686: unlink d7/f67 0 2026-03-09T00:04:10.992 INFO:tasks.workunit.client.0.vm03.stdout:4/795: creat d7/d20/d6a/dea/d4e/dd0/ffc x:0 0 0 2026-03-09T00:04:10.992 INFO:tasks.workunit.client.0.vm03.stdout:8/687: symlink d7/df/d1a/d40/db3/dba/d38/d4c/d98/ld1 0 2026-03-09T00:04:10.992 INFO:tasks.workunit.client.0.vm03.stdout:4/796: link d7/d20/f70 d7/d20/d6a/dea/d54/ffd 0 2026-03-09T00:04:10.997 INFO:tasks.workunit.client.1.vm06.stdout:1/823: dread d6/d4c/d79/f5c [4194304,4194304] 0 2026-03-09T00:04:10.998 INFO:tasks.workunit.client.1.vm06.stdout:1/824: stat d6/d21/d2d/d3b/d42/fb4 0 2026-03-09T00:04:11.007 INFO:tasks.workunit.client.1.vm06.stdout:6/885: dwrite d4/d16/d46/d90/fd0 [0,4194304] 0 2026-03-09T00:04:11.014 INFO:tasks.workunit.client.0.vm03.stdout:9/677: dread d15/d1c/d28/f5b [0,4194304] 0 2026-03-09T00:04:11.014 INFO:tasks.workunit.client.0.vm03.stdout:9/678: fsync d15/d1c/d21/fcd 0 2026-03-09T00:04:11.014 INFO:tasks.workunit.client.1.vm06.stdout:3/918: dwrite d11/d28/d4d/d9b/fe2 [0,4194304] 0 2026-03-09T00:04:11.014 INFO:tasks.workunit.client.1.vm06.stdout:6/886: mknod d4/d16/d53/df2/c111 0 2026-03-09T00:04:11.021 INFO:tasks.workunit.client.1.vm06.stdout:3/919: rename d11/d28/d2e/l10a to d11/d28/d4d/l139 0 2026-03-09T00:04:11.022 INFO:tasks.workunit.client.1.vm06.stdout:3/920: unlink d11/d28/d2e/d2f/d5b/d94/fa1 0 2026-03-09T00:04:11.028 INFO:tasks.workunit.client.1.vm06.stdout:6/887: write d4/d16/d53/f82 [5518960,48459] 0 2026-03-09T00:04:11.077 INFO:tasks.workunit.client.0.vm03.stdout:7/621: dwrite d2/d4/db7/daa/fb3 [0,4194304] 0 2026-03-09T00:04:11.094 INFO:tasks.workunit.client.1.vm06.stdout:8/938: dwrite db/d74/d87/d100/d10a/fcf [0,4194304] 0 2026-03-09T00:04:11.094 INFO:tasks.workunit.client.1.vm06.stdout:8/939: getdents db/d53/d70/d38/d4d/d79/dd5 0 2026-03-09T00:04:11.095 INFO:tasks.workunit.client.1.vm06.stdout:3/921: dread d11/d28/d2e/d2f/f79 [0,4194304] 0 2026-03-09T00:04:11.095 INFO:tasks.workunit.client.1.vm06.stdout:3/922: stat d11/d28/d2e/d2f/dc1/c138 0 2026-03-09T00:04:11.095 INFO:tasks.workunit.client.1.vm06.stdout:3/923: dread - d11/d28/d2e/d2f/d5b/db5/f130 zero size 2026-03-09T00:04:11.099 INFO:tasks.workunit.client.0.vm03.stdout:8/688: dwrite d7/df/d1a/d2b/f77 [0,4194304] 0 2026-03-09T00:04:11.102 INFO:tasks.workunit.client.0.vm03.stdout:2/710: dwrite d8/d1b/d2a/d6b/d50/f91 [0,4194304] 0 2026-03-09T00:04:11.104 INFO:tasks.workunit.client.0.vm03.stdout:8/689: truncate d7/df/d1a/d40/db3/f88 3784888 0 2026-03-09T00:04:11.104 INFO:tasks.workunit.client.0.vm03.stdout:9/679: dwrite d15/f44 [0,4194304] 0 2026-03-09T00:04:11.108 INFO:tasks.workunit.client.0.vm03.stdout:2/711: creat d8/d1b/d24/da5/dc9/fe8 x:0 0 0 2026-03-09T00:04:11.109 INFO:tasks.workunit.client.0.vm03.stdout:8/690: dread d7/df/d1a/d40/f69 [0,4194304] 0 2026-03-09T00:04:11.111 INFO:tasks.workunit.client.1.vm06.stdout:3/924: write d11/d28/d2e/d7e/fd3 [1072363,70681] 0 2026-03-09T00:04:11.118 INFO:tasks.workunit.client.0.vm03.stdout:9/680: rename d15/d1c/d36/d4d/dc4/f9d to d15/d7f/fdf 0 2026-03-09T00:04:11.119 INFO:tasks.workunit.client.0.vm03.stdout:2/712: truncate d8/d74/fc7 1703358 0 2026-03-09T00:04:11.119 INFO:tasks.workunit.client.0.vm03.stdout:8/691: mkdir d7/df/d1a/d40/d9d/da3/dd2 0 2026-03-09T00:04:11.119 
INFO:tasks.workunit.client.0.vm03.stdout:8/692: getdents d7/df/d1a/d40/db3/dba/dad 0
2026-03-09T00:04:11.119 INFO:tasks.workunit.client.0.vm03.stdout:2/713: creat d8/d74/fe9 x:0 0 0
2026-03-09T00:04:11.121 INFO:tasks.workunit.client.0.vm03.stdout:8/693: truncate d7/df/d1a/d40/f69 467493 0
2026-03-09T00:04:11.122 INFO:tasks.workunit.client.1.vm06.stdout:3/925: read d11/d28/f5e [3418351,6788] 0
2026-03-09T00:04:11.125 INFO:tasks.workunit.client.1.vm06.stdout:1/825: dwrite d6/d8f/f103 [0,4194304] 0
2026-03-09T00:04:11.125 INFO:tasks.workunit.client.1.vm06.stdout:1/826: write d6/d21/d2d/f6c [1222128,121422] 0
2026-03-09T00:04:11.125 INFO:tasks.workunit.client.1.vm06.stdout:1/827: readlink d6/d21/l60 0
2026-03-09T00:04:11.125 INFO:tasks.workunit.client.0.vm03.stdout:2/714: creat d8/d1b/d24/da5/fea x:0 0 0
2026-03-09T00:04:11.125 INFO:tasks.workunit.client.0.vm03.stdout:2/715: dread - d8/d1b/d2a/d56/f8c zero size
2026-03-09T00:04:11.130 INFO:tasks.workunit.client.1.vm06.stdout:3/926: mkdir d11/d28/d4d/d89/d90/dd2/d13a 0
2026-03-09T00:04:11.130 INFO:tasks.workunit.client.0.vm03.stdout:9/681: rename d15/d1c/d9c to d15/d1c/d21/d75/de0 0
2026-03-09T00:04:11.131 INFO:tasks.workunit.client.1.vm06.stdout:4/916: sync
2026-03-09T00:04:11.131 INFO:tasks.workunit.client.1.vm06.stdout:4/917: chown d17/d24/d3b/d5e/d7a/f11a 1 1
2026-03-09T00:04:11.131 INFO:tasks.workunit.client.1.vm06.stdout:0/963: sync
2026-03-09T00:04:11.131 INFO:tasks.workunit.client.1.vm06.stdout:0/964: chown d3/d18/d2c/d2d/d74/dc7/d110/d12d 37385527 1
2026-03-09T00:04:11.131 INFO:tasks.workunit.client.1.vm06.stdout:0/965: chown d3/d18/cc4 11259277 1
2026-03-09T00:04:11.131 INFO:tasks.workunit.client.1.vm06.stdout:7/909: sync
2026-03-09T00:04:11.131 INFO:tasks.workunit.client.1.vm06.stdout:7/910: read - d0/df/d1a/d27/d4c/d40/d51/d86/ff6 zero size
2026-03-09T00:04:11.131 INFO:tasks.workunit.client.1.vm06.stdout:7/911: read d0/f7 [3730541,104920] 0
2026-03-09T00:04:11.137 INFO:tasks.workunit.client.0.vm03.stdout:7/622: truncate d2/d1f/d3a/d24/da4/d46/d54/f77 3853102 0
2026-03-09T00:04:11.137 INFO:tasks.workunit.client.0.vm03.stdout:7/623: dread - d2/d4/db7/d67/fa0 zero size
2026-03-09T00:04:11.138 INFO:tasks.workunit.client.0.vm03.stdout:9/682: rename d15/db6 to d15/d1c/d28/de1 0
2026-03-09T00:04:11.142 INFO:tasks.workunit.client.0.vm03.stdout:7/624: rename d2/fc to d2/d1f/d3a/d24/da4/d46/d81/d96/fbb 0
2026-03-09T00:04:11.152 INFO:tasks.workunit.client.0.vm03.stdout:6/689: sync
2026-03-09T00:04:11.153 INFO:tasks.workunit.client.0.vm03.stdout:1/792: rmdir d4/d15 39
2026-03-09T00:04:11.153 INFO:tasks.workunit.client.0.vm03.stdout:1/793: fdatasync d4/d3a/d32/da3/f10f 0
2026-03-09T00:04:11.155 INFO:tasks.workunit.client.0.vm03.stdout:9/683: mkdir d15/d77/de2 0
2026-03-09T00:04:11.158 INFO:tasks.workunit.client.1.vm06.stdout:1/828: mkdir d6/d21/d2d/d113 0
2026-03-09T00:04:11.163 INFO:tasks.workunit.client.1.vm06.stdout:4/918: truncate d17/d21/d32/f85 122319 0
2026-03-09T00:04:11.170 INFO:tasks.workunit.client.1.vm06.stdout:0/966: mkdir d3/d18/d1f/d39/d69/d116/d146 0
2026-03-09T00:04:11.170 INFO:tasks.workunit.client.1.vm06.stdout:0/967: readlink d3/d18/d2c/d2d/d31/l65 0
2026-03-09T00:04:11.172 INFO:tasks.workunit.client.0.vm03.stdout:1/794: dread d4/d15/d5c/f74 [0,4194304] 0
2026-03-09T00:04:11.172 INFO:tasks.workunit.client.1.vm06.stdout:7/912: creat d0/d55/d99/f10e x:0 0 0
2026-03-09T00:04:11.180 INFO:tasks.workunit.client.1.vm06.stdout:3/927: symlink d11/l13b 0
2026-03-09T00:04:11.180 INFO:tasks.workunit.client.1.vm06.stdout:4/919: getdents d17/d21/d32 0
2026-03-09T00:04:11.180 INFO:tasks.workunit.client.1.vm06.stdout:4/920: stat d17/d21/d4c/d50/d12f 0
2026-03-09T00:04:11.184 INFO:tasks.workunit.client.1.vm06.stdout:3/928: mknod d11/d28/d4d/d89/d90/d112/c13c 0
2026-03-09T00:04:11.185 INFO:tasks.workunit.client.1.vm06.stdout:4/921: truncate d17/d21/d4c/d50/f136 1309868 0
2026-03-09T00:04:11.186 INFO:tasks.workunit.client.1.vm06.stdout:4/922: unlink d17/d21/d4c/d50/f60 0
2026-03-09T00:04:11.186 INFO:tasks.workunit.client.1.vm06.stdout:4/923: chown d17/d24/d3b/dbf/ddf/dfc 43 1
2026-03-09T00:04:11.186 INFO:tasks.workunit.client.1.vm06.stdout:4/924: write d17/d21/d4c/d50/f8c [2464725,27500] 0
2026-03-09T00:04:11.186 INFO:tasks.workunit.client.1.vm06.stdout:4/925: write d17/d21/d4c/dc2/f11f [1297979,66979] 0
2026-03-09T00:04:11.188 INFO:tasks.workunit.client.1.vm06.stdout:4/926: truncate d17/d21/d32/d92/fa4 2259335 0
2026-03-09T00:04:11.189 INFO:tasks.workunit.client.1.vm06.stdout:4/927: creat d17/d24/d3b/d97/db7/d12c/f13c x:0 0 0
2026-03-09T00:04:11.190 INFO:tasks.workunit.client.1.vm06.stdout:4/928: fsync d17/d21/fb8 0
2026-03-09T00:04:11.190 INFO:tasks.workunit.client.1.vm06.stdout:4/929: truncate d17/d21/d4c/d66/f9a 966184 0
2026-03-09T00:04:11.198 INFO:tasks.workunit.client.0.vm03.stdout:4/797: rmdir d7/d20/d6a/dea 39
2026-03-09T00:04:11.200 INFO:tasks.workunit.client.1.vm06.stdout:8/940: dwrite db/dd/f7a [0,4194304] 0
2026-03-09T00:04:11.200 INFO:tasks.workunit.client.1.vm06.stdout:8/941: chown db/dd/d84/fe4 1246392 1
2026-03-09T00:04:11.200 INFO:tasks.workunit.client.1.vm06.stdout:8/942: chown db/d74/d87 1072428 1
2026-03-09T00:04:11.201 INFO:tasks.workunit.client.0.vm03.stdout:8/694: dwrite d7/df/d1a/d40/fb5 [0,4194304] 0
2026-03-09T00:04:11.203 INFO:tasks.workunit.client.1.vm06.stdout:4/930: write d17/d21/d4c/d66/fa2 [246105,113207] 0
2026-03-09T00:04:11.204 INFO:tasks.workunit.client.1.vm06.stdout:4/931: readlink d17/d24/d49/de4/db0/l115 0
2026-03-09T00:04:11.214 INFO:tasks.workunit.client.0.vm03.stdout:4/798: dread d7/d6f/f9b [0,4194304] 0
2026-03-09T00:04:11.214 INFO:tasks.workunit.client.0.vm03.stdout:4/799: symlink d7/d20/d6a/dea/d38/da9/lfe 0
2026-03-09T00:04:11.220 INFO:tasks.workunit.client.1.vm06.stdout:8/943: write db/d74/d78/d98/db6/dc7/d101/ffc [4075866,36914] 0
2026-03-09T00:04:11.230 INFO:tasks.workunit.client.0.vm03.stdout:4/800: write d7/d20/d6a/d77/db7/f9a [587987,3732] 0
2026-03-09T00:04:11.230 INFO:tasks.workunit.client.0.vm03.stdout:4/801: dread - d7/d20/d6a/f76 zero size
2026-03-09T00:04:11.232 INFO:tasks.workunit.client.0.vm03.stdout:4/802: mknod d7/d6f/dcf/de8/cff 0
2026-03-09T00:04:11.232 INFO:tasks.workunit.client.0.vm03.stdout:4/803: getdents d7/d20 0
2026-03-09T00:04:11.248 INFO:tasks.workunit.client.0.vm03.stdout:5/720: dwrite d1c/d20/d55/d4f/d58/d5d/f8d [0,4194304] 0
2026-03-09T00:04:11.308 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:11 vm06.local ceph-mon[58395]: pgmap v8: 65 pgs: 65 active+clean; 3.1 GiB data, 10 GiB used, 109 GiB / 120 GiB avail; 66 MiB/s rd, 80 MiB/s wr, 138 op/s
2026-03-09T00:04:11.308 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:11 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:11.308 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:11 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:11.308 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:11 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T00:04:11.308 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:11 vm06.local ceph-mon[58395]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T00:04:11.308 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:11 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:04:11.308 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:11 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.rzcvhn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T00:04:11.308 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:11 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T00:04:11.308 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:11 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:04:11.323 INFO:tasks.workunit.client.0.vm03.stdout:1/795: dread d4/d3a/d61/d78/f94 [0,4194304] 0
2026-03-09T00:04:11.323 INFO:tasks.workunit.client.0.vm03.stdout:1/796: mknod d4/d15/de5/c110 0
2026-03-09T00:04:11.324 INFO:tasks.workunit.client.0.vm03.stdout:1/797: mkdir d4/d3a/d32/d87/d111 0
2026-03-09T00:04:11.325 INFO:tasks.workunit.client.0.vm03.stdout:1/798: creat d4/d15/d5c/d103/f112 x:0 0 0
2026-03-09T00:04:11.325 INFO:tasks.workunit.client.0.vm03.stdout:1/799: write d4/d15/d5c/d6c/fc0 [2252143,65126] 0
2026-03-09T00:04:11.326 INFO:tasks.workunit.client.0.vm03.stdout:1/800: creat d4/d3a/d32/da1/f113 x:0 0 0
2026-03-09T00:04:11.328 INFO:tasks.workunit.client.0.vm03.stdout:4/804: write d7/fa7 [425549,16239] 0
2026-03-09T00:04:11.328 INFO:tasks.workunit.client.0.vm03.stdout:4/805: fsync d7/d27/f52 0
2026-03-09T00:04:11.328 INFO:tasks.workunit.client.0.vm03.stdout:4/806: fdatasync d7/d20/d6a/dea/d54/d58/f6b 0
2026-03-09T00:04:11.328 INFO:tasks.workunit.client.0.vm03.stdout:4/807: write d7/f62 [1876541,108432] 0
2026-03-09T00:04:11.328 INFO:tasks.workunit.client.0.vm03.stdout:4/808: readlink d7/d20/d6a/dea/d54/l61 0
2026-03-09T00:04:11.328 INFO:tasks.workunit.client.0.vm03.stdout:4/809: chown d7/d20/d35/d66/c94 73061210 1
2026-03-09T00:04:11.328 INFO:tasks.workunit.client.0.vm03.stdout:1/801: link d4/d3a/d32/l80 d4/d3a/d3d/l114 0
2026-03-09T00:04:11.329 INFO:tasks.workunit.client.0.vm03.stdout:4/810: chown d7/d20/d6a/dea/l4d 64744 1
2026-03-09T00:04:11.333 INFO:tasks.workunit.client.0.vm03.stdout:4/811: dread d7/f15 [0,4194304] 0
2026-03-09T00:04:11.333 INFO:tasks.workunit.client.0.vm03.stdout:4/812: creat d7/d20/d6a/d77/db7/f100 x:0 0 0
2026-03-09T00:04:11.336 INFO:tasks.workunit.client.0.vm03.stdout:4/813: mkdir d7/d20/d6a/d77/d25/de2/df1/d101 0
2026-03-09T00:04:11.336 INFO:tasks.workunit.client.0.vm03.stdout:4/814: truncate d7/d20/d6a/dde/fec 202170 0
2026-03-09T00:04:11.336 INFO:tasks.workunit.client.0.vm03.stdout:4/815: fsync d7/d20/f70 0
2026-03-09T00:04:11.337 INFO:tasks.workunit.client.0.vm03.stdout:4/816: link d7/d20/d6a/d77/d25/fa1 d7/d20/d6a/d77/d25/f102 0
2026-03-09T00:04:11.340 INFO:tasks.workunit.client.1.vm06.stdout:0/968: dwrite d3/d18/d1f/d39/d49/d60/fe8 [0,4194304] 0
2026-03-09T00:04:11.341 INFO:tasks.workunit.client.0.vm03.stdout:0/708: sync
2026-03-09T00:04:11.341 INFO:tasks.workunit.client.0.vm03.stdout:3/516: sync
2026-03-09T00:04:11.346 INFO:tasks.workunit.client.0.vm03.stdout:3/517: getdents d2/db/d40/d44 0
2026-03-09T00:04:11.346 INFO:tasks.workunit.client.0.vm03.stdout:3/518: chown d2/db/d40/d58/f7f 1143 1
2026-03-09T00:04:11.346 INFO:tasks.workunit.client.0.vm03.stdout:3/519: chown d2/l39 1 1
2026-03-09T00:04:11.346 INFO:tasks.workunit.client.0.vm03.stdout:3/520: fdatasync d2/db/d40/d51/f5a 0
2026-03-09T00:04:11.346 INFO:tasks.workunit.client.0.vm03.stdout:3/521: write d2/db/f14 [4833343,93616] 0
2026-03-09T00:04:11.346 INFO:tasks.workunit.client.0.vm03.stdout:0/709: dread d2/da/fca [0,4194304] 0
2026-03-09T00:04:11.346 INFO:tasks.workunit.client.0.vm03.stdout:0/710: chown d2/da/dd/d49/d6c/f3b 31592516 1
2026-03-09T00:04:11.346 INFO:tasks.workunit.client.0.vm03.stdout:0/711: fdatasync d2/da/dd/f7b 0
2026-03-09T00:04:11.347 INFO:tasks.workunit.client.0.vm03.stdout:4/817: unlink d7/d20/d6a/d77/fc8 0
2026-03-09T00:04:11.347 INFO:tasks.workunit.client.0.vm03.stdout:4/818: dread - d7/d6f/dcf/ff7 zero size
2026-03-09T00:04:11.349 INFO:tasks.workunit.client.0.vm03.stdout:3/522: mknod d2/db/d3b/d5f/d65/c9c 0
2026-03-09T00:04:11.349 INFO:tasks.workunit.client.0.vm03.stdout:3/523: read - d2/db/d40/d51/f57 zero size
2026-03-09T00:04:11.349 INFO:tasks.workunit.client.0.vm03.stdout:3/524: fdatasync d2/db/f26 0
2026-03-09T00:04:11.350 INFO:tasks.workunit.client.1.vm06.stdout:0/969: rename d3/d18/d2c/d2d/d74/dc7/d110 to d3/d18/d2c/d2d/d74/da8/d109/d147 0
2026-03-09T00:04:11.351 INFO:tasks.workunit.client.1.vm06.stdout:9/819: dwrite d1/d4/d6e/d14/f1a [8388608,4194304] 0
2026-03-09T00:04:11.354 INFO:tasks.workunit.client.0.vm03.stdout:4/819: mknod d7/d20/d6a/dea/d38/c103 0
2026-03-09T00:04:11.360 INFO:tasks.workunit.client.1.vm06.stdout:9/820: rename d1/da7/fea to d1/d3/d4f/d91/d94/ddf/f10d 0
2026-03-09T00:04:11.360 INFO:tasks.workunit.client.1.vm06.stdout:9/821: fdatasync d1/d4/d6e/f2c 0
2026-03-09T00:04:11.363 INFO:tasks.workunit.client.0.vm03.stdout:3/525: mknod d2/db/d40/d88/c9d 0
2026-03-09T00:04:11.365 INFO:tasks.workunit.client.1.vm06.stdout:0/970: write d3/d18/d2c/d2d/d8c/fb5 [1954961,18774] 0
2026-03-09T00:04:11.365 INFO:tasks.workunit.client.1.vm06.stdout:0/971: readlink d3/d10f/lf6 0
2026-03-09T00:04:11.366 INFO:tasks.workunit.client.0.vm03.stdout:0/712: unlink d2/da/d36/da4/l26 0
2026-03-09T00:04:11.379 INFO:tasks.workunit.client.1.vm06.stdout:9/822: creat d1/d3/d4f/d52/de3/de5/f10e x:0 0 0
2026-03-09T00:04:11.379 INFO:tasks.workunit.client.1.vm06.stdout:9/823: fdatasync d1/d3/d4f/d52/de3/de5/f10e 0
2026-03-09T00:04:11.380 INFO:tasks.workunit.client.0.vm03.stdout:6/690: dwrite d13/d35/d4c/d62/fa0 [0,4194304] 0
2026-03-09T00:04:11.380 INFO:tasks.workunit.client.0.vm03.stdout:6/691: readlink d13/l8e 0
2026-03-09T00:04:11.380 INFO:tasks.workunit.client.0.vm03.stdout:0/713: mkdir d2/da/d36/ddf/df7/d107 0
2026-03-09T00:04:11.380 INFO:tasks.workunit.client.0.vm03.stdout:3/526: mkdir d2/db/d40/d51/d9e 0
2026-03-09T00:04:11.380 INFO:tasks.workunit.client.0.vm03.stdout:0/714: chown d2/da/d1a/fc4 507656 1
2026-03-09T00:04:11.380 INFO:tasks.workunit.client.0.vm03.stdout:2/716: dwrite d8/d1b/d2a/f33 [4194304,4194304] 0
2026-03-09T00:04:11.380 INFO:tasks.workunit.client.0.vm03.stdout:2/717: chown d8/d1b/d2a/d56 3870575 1
2026-03-09T00:04:11.380 INFO:tasks.workunit.client.0.vm03.stdout:2/718: getdents d8/d1b/d6c/dd7 0
2026-03-09T00:04:11.380 INFO:tasks.workunit.client.0.vm03.stdout:9/684: dwrite d15/d1c/d36/f86 [0,4194304] 0
2026-03-09T00:04:11.380 INFO:tasks.workunit.client.0.vm03.stdout:3/527: creat d2/db/d40/d44/f9f x:0 0 0
2026-03-09T00:04:11.381 INFO:tasks.workunit.client.1.vm06.stdout:8/944: dwrite db/dd/de3/f122 [0,4194304] 0
2026-03-09T00:04:11.381 INFO:tasks.workunit.client.0.vm03.stdout:2/719: unlink d8/d1b/f22 0
2026-03-09T00:04:11.383 INFO:tasks.workunit.client.1.vm06.stdout:0/972: dread d3/d18/d1f/d44/f58 [0,4194304] 0
2026-03-09T00:04:11.383 INFO:tasks.workunit.client.0.vm03.stdout:5/721: dwrite d1c/d20/f4e [0,4194304] 0
2026-03-09T00:04:11.384 INFO:tasks.workunit.client.0.vm03.stdout:0/715: read d2/da/dd/d49/d6c/d4b/d55/d6f/fd4 [865348,22610] 0
2026-03-09T00:04:11.384 INFO:tasks.workunit.client.0.vm03.stdout:0/716: chown d2/da/dd/d49/d6c/da6/dda/lac 167825414 1
2026-03-09T00:04:11.386 INFO:tasks.workunit.client.0.vm03.stdout:9/685: creat d15/d1c/d21/d54/dab/fe3 x:0 0 0
2026-03-09T00:04:11.390 INFO:tasks.workunit.client.0.vm03.stdout:2/720: mknod d8/d1b/d24/da5/dda/de0/ceb 0
2026-03-09T00:04:11.395 INFO:tasks.workunit.client.0.vm03.stdout:2/721: write d8/d1b/d2a/d6b/d50/fc8 [275693,23337] 0
2026-03-09T00:04:11.396 INFO:tasks.workunit.client.1.vm06.stdout:8/945: creat db/d74/f132 x:0 0 0
2026-03-09T00:04:11.396 INFO:tasks.workunit.client.1.vm06.stdout:8/946: write db/dd/d24/da7/f11e [285230,125841] 0
2026-03-09T00:04:11.396 INFO:tasks.workunit.client.1.vm06.stdout:8/947: write db/dd/f67 [556373,55106] 0
2026-03-09T00:04:11.401 INFO:tasks.workunit.client.1.vm06.stdout:0/973: creat d3/d18/de9/f148 x:0 0 0
2026-03-09T00:04:11.401 INFO:tasks.workunit.client.1.vm06.stdout:0/974: fsync d3/d18/d1f/fe2 0
2026-03-09T00:04:11.401 INFO:tasks.workunit.client.1.vm06.stdout:0/975: creat d3/d18/d1f/d39/d3b/df9/df2/d73/f149 x:0 0 0
2026-03-09T00:04:11.401 INFO:tasks.workunit.client.1.vm06.stdout:0/976: dread - d3/d18/d1f/d39/f6e zero size
2026-03-09T00:04:11.401 INFO:tasks.workunit.client.1.vm06.stdout:0/977: write d3/d18/d1f/fe2 [487420,35940] 0
2026-03-09T00:04:11.412 INFO:tasks.workunit.client.1.vm06.stdout:8/948: rmdir db/d74/d78/d98/db6 39
2026-03-09T00:04:11.424 INFO:tasks.workunit.client.0.vm03.stdout:5/722: mkdir d1c/d51/d6a/d75/df0 0
2026-03-09T00:04:11.424 INFO:tasks.workunit.client.0.vm03.stdout:5/723: chown c1b 54584 1
2026-03-09T00:04:11.424 INFO:tasks.workunit.client.1.vm06.stdout:0/978: link d3/d10f/ff1 d3/d18/d2c/f14a 0
2026-03-09T00:04:11.424 INFO:tasks.workunit.client.1.vm06.stdout:0/979: creat d3/d18/d1f/d39/f14b x:0 0 0
2026-03-09T00:04:11.424 INFO:tasks.workunit.client.1.vm06.stdout:0/980: mknod d3/d18/de9/c14c 0
2026-03-09T00:04:11.424 INFO:tasks.workunit.client.1.vm06.stdout:0/981: creat d3/d18/d1f/d119/f14d x:0 0 0
2026-03-09T00:04:11.425 INFO:tasks.workunit.client.1.vm06.stdout:3/929: dwrite d11/d3f/d8d/f135 [0,4194304] 0
2026-03-09T00:04:11.431 INFO:tasks.workunit.client.0.vm03.stdout:5/724: dread d1c/fc4 [0,4194304] 0
2026-03-09T00:04:11.435 INFO:tasks.workunit.client.0.vm03.stdout:5/725: mkdir d1c/d20/d55/d66/dc6/df1 0
2026-03-09T00:04:11.444 INFO:tasks.workunit.client.0.vm03.stdout:5/726: dread d1c/f1f [0,4194304] 0
2026-03-09T00:04:11.445 INFO:tasks.workunit.client.0.vm03.stdout:5/727: mkdir d1c/d51/df2 0
2026-03-09T00:04:11.456 INFO:tasks.workunit.client.0.vm03.stdout:4/820: dwrite d7/d20/d6a/dea/d4e/dd0/ffc [0,4194304] 0
2026-03-09T00:04:11.511 INFO:tasks.workunit.client.0.vm03.stdout:6/692: dwrite d13/f1a [0,4194304] 0
2026-03-09T00:04:11.511 INFO:tasks.workunit.client.0.vm03.stdout:2/722: dwrite d8/d26/d5e/f64 [0,4194304] 0
2026-03-09T00:04:11.513 INFO:tasks.workunit.client.0.vm03.stdout:5/728: dread d1c/d20/d97/fb3 [0,4194304] 0
2026-03-09T00:04:11.514 INFO:tasks.workunit.client.0.vm03.stdout:6/693: mkdir d13/d1e/d44/d4a/d52/dbf/de4 0
2026-03-09T00:04:11.514 INFO:tasks.workunit.client.0.vm03.stdout:2/723: dread d8/d1b/d2a/d56/fa4 [0,4194304] 0
2026-03-09T00:04:11.515 INFO:tasks.workunit.client.0.vm03.stdout:6/694: truncate d13/d35/d71/f87 3567529 0
2026-03-09T00:04:11.515 INFO:tasks.workunit.client.0.vm03.stdout:6/695: creat d13/d1e/d44/d4a/d52/fe5 x:0 0 0
2026-03-09T00:04:11.516 INFO:tasks.workunit.client.0.vm03.stdout:6/696: unlink d13/d35/d72/f85 0
2026-03-09T00:04:11.516 INFO:tasks.workunit.client.0.vm03.stdout:6/697: fsync d13/f1a 0
2026-03-09T00:04:11.522 INFO:tasks.workunit.client.0.vm03.stdout:2/724: dread d8/d1b/d2a/d6b/dc6/fd2 [0,4194304] 0
2026-03-09T00:04:11.531 INFO:tasks.workunit.client.0.vm03.stdout:2/725: mknod d8/d1b/d2a/d6b/d50/d8a/cec 0
2026-03-09T00:04:11.531 INFO:tasks.workunit.client.0.vm03.stdout:2/726: write d8/d74/fe9 [337560,2100] 0
2026-03-09T00:04:11.531 INFO:tasks.workunit.client.0.vm03.stdout:2/727: mkdir d8/d26/d5e/d5f/ded 0
2026-03-09T00:04:11.538 INFO:tasks.workunit.client.0.vm03.stdout:3/528: dwrite d2/db/d3b/d5f/d65/f90 [0,4194304] 0
2026-03-09T00:04:11.557 INFO:tasks.workunit.client.0.vm03.stdout:1/802: dread d4/fa0 [0,4194304] 0
2026-03-09T00:04:11.557 INFO:tasks.workunit.client.0.vm03.stdout:1/803: chown d4/d15/de5/c110 6 1
2026-03-09T00:04:11.558 INFO:tasks.workunit.client.1.vm06.stdout:6/888: sync
2026-03-09T00:04:11.559 INFO:tasks.workunit.client.0.vm03.stdout:1/804: truncate d4/d3a/d3d/f58 2422087 0
2026-03-09T00:04:11.561 INFO:tasks.workunit.client.1.vm06.stdout:6/889: mkdir d4/d16/d53/df2/d10e/d112 0
2026-03-09T00:04:11.561 INFO:tasks.workunit.client.1.vm06.stdout:6/890: chown d4/d16/c17 169814 1
2026-03-09T00:04:11.570 INFO:tasks.workunit.client.0.vm03.stdout:4/821: dwrite d7/d20/d6a/d77/d25/fb8 [4194304,4194304] 0
2026-03-09T00:04:11.577 INFO:tasks.workunit.client.1.vm06.stdout:6/891: dread d4/d16/d53/ddf/d52/f6c [0,4194304] 0
2026-03-09T00:04:11.577 INFO:tasks.workunit.client.1.vm06.stdout:6/892: fsync d4/d16/d53/ddf/d52/f9e 0
2026-03-09T00:04:11.577 INFO:tasks.workunit.client.1.vm06.stdout:6/893: chown d4/d16/d53/ddf/da6/dbb/ffb 3 1
2026-03-09T00:04:11.577 INFO:tasks.workunit.client.1.vm06.stdout:6/894: creat d4/d16/d53/d67/f113 x:0 0 0
2026-03-09T00:04:11.577 INFO:tasks.workunit.client.1.vm06.stdout:3/930: dwrite d11/d28/d2e/d2f/f79 [0,4194304] 0
2026-03-09T00:04:11.577 INFO:tasks.workunit.client.1.vm06.stdout:3/931: dread - d11/d28/d4d/d89/f115 zero size
2026-03-09T00:04:11.578 INFO:tasks.workunit.client.1.vm06.stdout:6/895: creat d4/d27/d3e/d78/f114 x:0 0 0
2026-03-09T00:04:11.584 INFO:tasks.workunit.client.1.vm06.stdout:3/932: creat d11/d28/d2e/d7e/d83/d132/f13d x:0 0 0
2026-03-09T00:04:11.584 INFO:tasks.workunit.client.1.vm06.stdout:3/933: read d11/d28/d2e/d2f/d5b/fea [519561,51707] 0
2026-03-09T00:04:11.584 INFO:tasks.workunit.client.1.vm06.stdout:6/896: link d4/d16/d53/ddf/d7e/dac/dd3/d101/cee d4/d16/d53/ddf/d7e/dac/dd3/c115 0
2026-03-09T00:04:11.584 INFO:tasks.workunit.client.1.vm06.stdout:6/897: fsync d4/d16/d53/ddf/d4b/f83 0
2026-03-09T00:04:11.584 INFO:tasks.workunit.client.1.vm06.stdout:6/898: truncate d4/d27/fb6 639510 0
2026-03-09T00:04:11.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:11 vm03.local ceph-mon[52346]: pgmap v8: 65 pgs: 65 active+clean; 3.1 GiB data, 10 GiB used, 109 GiB / 120 GiB avail; 66 MiB/s rd, 80 MiB/s wr, 138 op/s
2026-03-09T00:04:11.596 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:11 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:11.596 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:11 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:11.596 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:11 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T00:04:11.596 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:11 vm03.local ceph-mon[52346]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T00:04:11.596 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:11 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:04:11.596 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:11 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.rzcvhn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T00:04:11.596 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:11 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T00:04:11.596 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:11 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:04:11.608 INFO:tasks.workunit.client.1.vm06.stdout:8/949: dwrite db/dd/f7a [0,4194304] 0
2026-03-09T00:04:11.608 INFO:tasks.workunit.client.1.vm06.stdout:8/950: write db/dd/f67 [623330,51775] 0
2026-03-09T00:04:11.609 INFO:tasks.workunit.client.0.vm03.stdout:0/717: dwrite d2/f59 [0,4194304] 0
2026-03-09T00:04:11.609 INFO:tasks.workunit.client.1.vm06.stdout:8/951: rmdir db/d74/d78 39
2026-03-09T00:04:11.609 INFO:tasks.workunit.client.1.vm06.stdout:8/952: chown db/dd/d24/c29 53465205 1
2026-03-09T00:04:11.614 INFO:tasks.workunit.client.0.vm03.stdout:1/805: dread d4/d6/d52/f9a [0,4194304] 0
2026-03-09T00:04:11.614 INFO:tasks.workunit.client.0.vm03.stdout:1/806: fdatasync d4/d3a/d32/da3/fcd 0
2026-03-09T00:04:11.615 INFO:tasks.workunit.client.1.vm06.stdout:8/953: write db/d74/d78/fd2 [98828,1462] 0
2026-03-09T00:04:11.618 INFO:tasks.workunit.client.0.vm03.stdout:0/718: write d2/da/dd/f7b [1415524,77036] 0
2026-03-09T00:04:11.618 INFO:tasks.workunit.client.0.vm03.stdout:0/719: truncate d2/da/d36/da4/f3f 4498794 0
2026-03-09T00:04:11.619 INFO:tasks.workunit.client.0.vm03.stdout:0/720: getdents d2/da/dd 0
2026-03-09T00:04:11.619 INFO:tasks.workunit.client.0.vm03.stdout:0/721: chown d2/da/d4e/faa 0 1
2026-03-09T00:04:11.619 INFO:tasks.workunit.client.0.vm03.stdout:0/722: stat d2/da/d4e/c92 0
2026-03-09T00:04:11.619 INFO:tasks.workunit.client.0.vm03.stdout:0/723: chown d2/da/dd/d49/d6c/d4b/d55/d6f/dad 60 1
2026-03-09T00:04:11.637 INFO:tasks.workunit.client.1.vm06.stdout:6/899: write d4/d27/d3e/f55 [324859,18068] 0
2026-03-09T00:04:11.637 INFO:tasks.workunit.client.1.vm06.stdout:6/900: mknod d4/d16/d53/ddf/c116 0
2026-03-09T00:04:11.638 INFO:tasks.workunit.client.1.vm06.stdout:6/901: truncate d4/d16/d46/fc4 29813 0
2026-03-09T00:04:11.639 INFO:tasks.workunit.client.1.vm06.stdout:6/902: creat d4/d16/d53/ddf/d4b/ddb/f117 x:0 0 0
2026-03-09T00:04:11.639 INFO:tasks.workunit.client.1.vm06.stdout:6/903: readlink d4/d16/d53/ddf/da6/dbb/lbd 0
2026-03-09T00:04:11.640 INFO:tasks.workunit.client.1.vm06.stdout:6/904: read d4/d16/d53/ddf/d4b/f50 [336762,48403] 0
2026-03-09T00:04:11.642 INFO:tasks.workunit.client.1.vm06.stdout:6/905: write d4/d27/d3e/d78/f92 [655781,28871] 0
2026-03-09T00:04:11.651 INFO:tasks.workunit.client.1.vm06.stdout:6/906: chown d4/d16/d53/cdc 1333 1
2026-03-09T00:04:11.652 INFO:tasks.workunit.client.1.vm06.stdout:6/907: rename d4/f22 to d4/d16/d53/ddf/d4b/ddb/f118 0
2026-03-09T00:04:11.652 INFO:tasks.workunit.client.1.vm06.stdout:6/908: link d4/d16/d53/d67/ld1 d4/d16/d46/l119 0
2026-03-09T00:04:11.661 INFO:tasks.workunit.client.0.vm03.stdout:0/724: write f0 [3265900,83349] 0
2026-03-09T00:04:11.693 INFO:tasks.workunit.client.1.vm06.stdout:3/934: dwrite d11/d28/f3a [0,4194304] 0
2026-03-09T00:04:11.694 INFO:tasks.workunit.client.1.vm06.stdout:3/935: rename d11/d28/f4f to d11/d28/d4d/d89/d90/dd2/d13a/f13e 0
2026-03-09T00:04:11.694 INFO:tasks.workunit.client.1.vm06.stdout:3/936: dread - d11/d28/d2e/dff/f126 zero size
2026-03-09T00:04:11.719 INFO:tasks.workunit.client.1.vm06.stdout:6/909: dwrite d4/d16/d53/ddf/d52/d7d/faf [0,4194304] 0
2026-03-09T00:04:11.720 INFO:tasks.workunit.client.1.vm06.stdout:8/954: dwrite db/d74/d87/d100/f95 [0,4194304] 0
2026-03-09T00:04:11.721 INFO:tasks.workunit.client.0.vm03.stdout:2/728: dwrite d8/d1b/d2a/f33 [0,4194304] 0
2026-03-09T00:04:11.721 INFO:tasks.workunit.client.0.vm03.stdout:2/729: chown d8/d1b/d2a/d6b/d50/fc8 35083 1
2026-03-09T00:04:11.721 INFO:tasks.workunit.client.0.vm03.stdout:2/730: chown d8/d1b/d8f/la0 245599003 1
2026-03-09T00:04:11.721 INFO:tasks.workunit.client.0.vm03.stdout:2/731: chown d8/d26/d5e/d6f/c8e 4393816 1
2026-03-09T00:04:11.721 INFO:tasks.workunit.client.0.vm03.stdout:2/732: fdatasync d8/fb 0
2026-03-09T00:04:11.721 INFO:tasks.workunit.client.1.vm06.stdout:6/910: creat d4/d16/d53/ddf/d52/f11a x:0 0 0
2026-03-09T00:04:11.722 INFO:tasks.workunit.client.0.vm03.stdout:2/733: read d8/d26/d5e/f7c [764687,52617] 0
2026-03-09T00:04:11.722 INFO:tasks.workunit.client.0.vm03.stdout:2/734: fdatasync d8/d1b/d24/f86 0
2026-03-09T00:04:11.723 INFO:tasks.workunit.client.1.vm06.stdout:6/911: truncate d4/d16/d53/ddf/d7e/dac/dd3/d101/f5c 5005290 0
2026-03-09T00:04:11.723 INFO:tasks.workunit.client.1.vm06.stdout:6/912: write d4/d8d/f106 [179035,38841] 0
2026-03-09T00:04:11.723 INFO:tasks.workunit.client.1.vm06.stdout:6/913: fsync d4/d16/d53/ddf/d4b/fba 0
2026-03-09T00:04:11.729 INFO:tasks.workunit.client.1.vm06.stdout:6/914: unlink d4/d16/d53/ddf/da6/ld9 0
2026-03-09T00:04:11.732 INFO:tasks.workunit.client.1.vm06.stdout:6/915: write d4/fb [2811224,70309] 0
2026-03-09T00:04:11.732 INFO:tasks.workunit.client.0.vm03.stdout:9/686: dwrite fb [0,4194304] 0
2026-03-09T00:04:11.736 INFO:tasks.workunit.client.0.vm03.stdout:2/735: truncate d8/d1b/d24/f38 235036 0
2026-03-09T00:04:11.810 INFO:tasks.workunit.client.0.vm03.stdout:2/736: dwrite d8/d26/d5e/d6f/d97/f1a [0,4194304] 0
2026-03-09T00:04:11.811 INFO:tasks.workunit.client.0.vm03.stdout:2/737: symlink d8/d26/d5e/d6f/lee 0
2026-03-09T00:04:11.811 INFO:tasks.workunit.client.0.vm03.stdout:2/738: mkdir d8/d1b/d24/da5/dda/def 0
2026-03-09T00:04:11.812 INFO:tasks.workunit.client.0.vm03.stdout:2/739: rename d8/d1b/d2a/d2e/fad to d8/d1b/d2a/d2e/d9a/ff0 0
2026-03-09T00:04:11.812 INFO:tasks.workunit.client.0.vm03.stdout:2/740: write d8/d1b/d24/da5/fea [261861,17152] 0
2026-03-09T00:04:11.813 INFO:tasks.workunit.client.0.vm03.stdout:2/741: creat d8/d1b/d24/da5/dda/de0/ff1 x:0 0 0
2026-03-09T00:04:11.813 INFO:tasks.workunit.client.0.vm03.stdout:2/742: mknod d8/d1b/d2a/d6b/d50/d8a/cf2 0
2026-03-09T00:04:11.813 INFO:tasks.workunit.client.0.vm03.stdout:2/743: fdatasync d8/d1b/d2a/d6b/d50/f91 0
2026-03-09T00:04:11.814 INFO:tasks.workunit.client.0.vm03.stdout:2/744: symlink d8/d26/d5e/d6f/lf3 0
2026-03-09T00:04:11.814 INFO:tasks.workunit.client.0.vm03.stdout:2/745: readlink d8/d1b/d2a/d2e/d9a/ldd 0
2026-03-09T00:04:11.814 INFO:tasks.workunit.client.0.vm03.stdout:2/746: chown d8/d1b/d24/da5/fea 39 1
2026-03-09T00:04:11.814 INFO:tasks.workunit.client.0.vm03.stdout:2/747: stat d8/l14 0
2026-03-09T00:04:11.814 INFO:tasks.workunit.client.0.vm03.stdout:2/748: creat d8/d1b/d2a/d6b/dc6/ff4 x:0 0 0
2026-03-09T00:04:11.814 INFO:tasks.workunit.client.0.vm03.stdout:2/749: dread - d8/d1b/d2a/d56/fb6 zero size
2026-03-09T00:04:11.815 INFO:tasks.workunit.client.0.vm03.stdout:2/750: rmdir d8/d1b/d2a/d56 39
2026-03-09T00:04:11.818 INFO:tasks.workunit.client.0.vm03.stdout:9/687: dwrite d15/d1c/d28/fa7 [0,4194304] 0
2026-03-09T00:04:11.819 INFO:tasks.workunit.client.0.vm03.stdout:3/529: dwrite d2/db/d3b/d5d/f8d [0,4194304] 0
2026-03-09T00:04:11.820 INFO:tasks.workunit.client.0.vm03.stdout:1/807: dwrite d4/d3a/d32/da3/f10f [0,4194304] 0
2026-03-09T00:04:11.826 INFO:tasks.workunit.client.0.vm03.stdout:5/729: dwrite d1c/d20/d56/d74/f84 [0,4194304] 0
2026-03-09T00:04:11.826 INFO:tasks.workunit.client.0.vm03.stdout:3/530: mknod d2/db/d3b/d5f/d65/ca0 0
2026-03-09T00:04:11.826 INFO:tasks.workunit.client.0.vm03.stdout:3/531: stat d2/db/d40 0
2026-03-09T00:04:11.826 INFO:tasks.workunit.client.0.vm03.stdout:3/532: creat d2/db/d6a/fa1 x:0 0 0
2026-03-09T00:04:11.832 INFO:tasks.workunit.client.0.vm03.stdout:5/730: write d1c/d20/d55/d4f/d58/db5/f45 [1666139,15336] 0
2026-03-09T00:04:11.832 INFO:tasks.workunit.client.0.vm03.stdout:1/808: creat d4/d15/de5/f115 x:0 0 0
2026-03-09T00:04:11.832 INFO:tasks.workunit.client.0.vm03.stdout:1/809: dread - d4/d15/d5c/d6c/fb7 zero size
2026-03-09T00:04:11.837 INFO:tasks.workunit.client.0.vm03.stdout:5/731: mkdir d1c/d20/d56/db4/df3 0
2026-03-09T00:04:11.838 INFO:tasks.workunit.client.0.vm03.stdout:1/810: mkdir d4/d3a/d32/d87/d116 0
2026-03-09T00:04:11.838 INFO:tasks.workunit.client.0.vm03.stdout:5/732: fsync ff 0
2026-03-09T00:04:11.841 INFO:tasks.workunit.client.1.vm06.stdout:8/955: dwrite db/dd/d24/dac/d126/f12c [0,4194304] 0
2026-03-09T00:04:11.876 INFO:tasks.workunit.client.1.vm06.stdout:9/824: getdents d1/d3/d4f/d91/d94/ddf 0
2026-03-09T00:04:11.876 INFO:tasks.workunit.client.1.vm06.stdout:9/825: read - d1/fa3 zero size
2026-03-09T00:04:11.877 INFO:tasks.workunit.client.1.vm06.stdout:9/826: mkdir d1/d3/d2b/d58/d10f 0
2026-03-09T00:04:11.879 INFO:tasks.workunit.client.1.vm06.stdout:9/827: mknod d1/d3/d4f/d91/de8/c110 0
2026-03-09T00:04:11.886 INFO:tasks.workunit.client.1.vm06.stdout:9/828: readlink d1/d4/d6e/d14/d25/d85/l30 0
2026-03-09T00:04:11.886 INFO:tasks.workunit.client.1.vm06.stdout:9/829: readlink d1/d4/d6e/d14/d25/d85/d49/l88 0
2026-03-09T00:04:11.886 INFO:tasks.workunit.client.1.vm06.stdout:9/830: dread - d1/d3/d4f/fbd zero size
2026-03-09T00:04:11.886 INFO:tasks.workunit.client.0.vm03.stdout:7/625: sync
2026-03-09T00:04:11.887 INFO:tasks.workunit.client.0.vm03.stdout:7/626: readlink d2/d1f/d3a/d24/l76 0
2026-03-09T00:04:11.887 INFO:tasks.workunit.client.0.vm03.stdout:7/627: chown d2/d1f/c10 59383 1
2026-03-09T00:04:11.887 INFO:tasks.workunit.client.0.vm03.stdout:7/628: fdatasync d2/d4/d1e/fae 0
2026-03-09T00:04:11.887 INFO:tasks.workunit.client.0.vm03.stdout:8/695: sync
2026-03-09T00:04:11.887 INFO:tasks.workunit.client.1.vm06.stdout:9/831: getdents d1/d4/d6e 0
2026-03-09T00:04:11.887 INFO:tasks.workunit.client.1.vm06.stdout:9/832: chown d1/d3/d2b/d58/c96 7688 1
2026-03-09T00:04:11.887 INFO:tasks.workunit.client.1.vm06.stdout:9/833: symlink d1/d3/d50/l111 0
2026-03-09T00:04:11.887 INFO:tasks.workunit.client.1.vm06.stdout:9/834: rename d1/d73/l7e to d1/d3/d4f/d52/l112 0
2026-03-09T00:04:11.887 INFO:tasks.workunit.client.1.vm06.stdout:9/835: write d1/d3/d4f/d52/fb3 [385980,27969] 0
2026-03-09T00:04:11.887 INFO:tasks.workunit.client.1.vm06.stdout:9/836: dread - d1/d3/f106 zero size
2026-03-09T00:04:11.887 INFO:tasks.workunit.client.1.vm06.stdout:0/982: rmdir d3/d18/d1f/d39/d3b/df9/df2 39
2026-03-09T00:04:11.887 INFO:tasks.workunit.client.1.vm06.stdout:9/837: unlink d1/d4/d6e/d9/l98 0
2026-03-09T00:04:11.887 INFO:tasks.workunit.client.1.vm06.stdout:9/838: fdatasync d1/d4/d6e/d14/fb2 0
2026-03-09T00:04:11.887 INFO:tasks.workunit.client.1.vm06.stdout:0/983: rename d3/f29 to d3/d18/d1f/d119/f14e 0
2026-03-09T00:04:11.887 INFO:tasks.workunit.client.1.vm06.stdout:9/839: mkdir d1/d3/d4f/d91/dae/de9/d113 0
2026-03-09T00:04:11.887 INFO:tasks.workunit.client.1.vm06.stdout:0/984: mkdir d3/d18/d2c/d2d/d74/dc7/d14f 0
2026-03-09T00:04:11.888 INFO:tasks.workunit.client.0.vm03.stdout:4/822: dwrite d7/d20/d6a/d77/db7/f9a [0,4194304] 0
2026-03-09T00:04:11.888 INFO:tasks.workunit.client.0.vm03.stdout:4/823: chown d7/d6f/da5 57860983 1
2026-03-09T00:04:11.888 INFO:tasks.workunit.client.0.vm03.stdout:4/824: chown d7/d20/d6a/dea/d4e/f9e 773068 1
2026-03-09T00:04:11.888 INFO:tasks.workunit.client.0.vm03.stdout:4/825: readlink d7/d20/l5b 0
2026-03-09T00:04:11.888 INFO:tasks.workunit.client.0.vm03.stdout:4/826: creat d7/d20/d6a/dde/f104 x:0 0 0
2026-03-09T00:04:11.889 INFO:tasks.workunit.client.0.vm03.stdout:5/733: write d1c/d20/f65 [815179,126353] 0
2026-03-09T00:04:11.890 INFO:tasks.workunit.client.1.vm06.stdout:9/840: rename d1/d3/d2b/c34 to d1/d3/d4f/d91/d94/c114 0
2026-03-09T00:04:11.891 INFO:tasks.workunit.client.1.vm06.stdout:9/841: getdents d1/d4/d6e 0
2026-03-09T00:04:11.891 INFO:tasks.workunit.client.1.vm06.stdout:9/842: stat d1/d3/d4f/d91/de8/c110 0
2026-03-09T00:04:11.891 INFO:tasks.workunit.client.1.vm06.stdout:9/843: truncate d1/d3/d50/fba 183160 0
2026-03-09T00:04:11.896 INFO:tasks.workunit.client.0.vm03.stdout:7/629: dread d2/f73 [0,4194304] 0
2026-03-09T00:04:11.896 INFO:tasks.workunit.client.0.vm03.stdout:7/630: chown d2/d4/db7/d67/f95 1 1
2026-03-09T00:04:11.896 INFO:tasks.workunit.client.0.vm03.stdout:7/631: stat d2/d1f/c18 0
2026-03-09T00:04:11.896 INFO:tasks.workunit.client.0.vm03.stdout:7/632: stat d2/d1f/d3a/d24/da4/d46/d81/d96/d8e 0
2026-03-09T00:04:11.896 INFO:tasks.workunit.client.0.vm03.stdout:7/633: getdents d2/d1f 0
2026-03-09T00:04:11.901 INFO:tasks.workunit.client.0.vm03.stdout:5/734: write d1c/d20/d55/d4f/d58/db5/f6f [785420,21367] 0
2026-03-09T00:04:11.909 INFO:tasks.workunit.client.0.vm03.stdout:5/735: chown d1c/d20/d55/d4f/d58/d5d/faa 28353 1
2026-03-09T00:04:11.910 INFO:tasks.workunit.client.0.vm03.stdout:5/736: truncate d1c/d20/d55/d66/d70/f80 1686884 0
2026-03-09T00:04:11.910 INFO:tasks.workunit.client.0.vm03.stdout:5/737: truncate d1c/d20/d55/d66/d70/f71 878881 0
2026-03-09T00:04:11.910 INFO:tasks.workunit.client.1.vm06.stdout:9/844: mknod d1/d3/c115 0
2026-03-09T00:04:11.910 INFO:tasks.workunit.client.1.vm06.stdout:0/985: rmdir d3/d18/d1f/d39/d49/d60 39
2026-03-09T00:04:11.910 INFO:tasks.workunit.client.1.vm06.stdout:9/845: rename d1/d3/d4f/l8c to d1/d3/d4f/d52/de3/l116 0
2026-03-09T00:04:11.910 INFO:tasks.workunit.client.1.vm06.stdout:9/846: truncate d1/f16 3793243 0
2026-03-09T00:04:11.910 INFO:tasks.workunit.client.1.vm06.stdout:9/847: mknod d1/d4/d6e/d9/c117 0
2026-03-09T00:04:11.910 INFO:tasks.workunit.client.1.vm06.stdout:9/848: link d1/d4/d6e/d14/d25/d85/f28 d1/d3/d4f/d91/dae/f118 0
2026-03-09T00:04:11.910 INFO:tasks.workunit.client.1.vm06.stdout:9/849: chown d1/d3/d4f/d91/d94/la1 3 1
2026-03-09T00:04:11.942 INFO:tasks.workunit.client.1.vm06.stdout:6/916: dwrite d4/d16/d53/ddf/fb8 [0,4194304] 0
2026-03-09T00:04:11.943 INFO:tasks.workunit.client.1.vm06.stdout:6/917: link d4/d16/d53/ddf/fb8 d4/d27/d3e/f11b 0
2026-03-09T00:04:11.943 INFO:tasks.workunit.client.1.vm06.stdout:6/918: mkdir d4/d16/d53/df2/d11c 0
2026-03-09T00:04:11.990 INFO:tasks.workunit.client.0.vm03.stdout:2/751: dwrite d8/d1b/d24/f86 [0,4194304] 0
2026-03-09T00:04:11.991 INFO:tasks.workunit.client.1.vm06.stdout:9/850: write d1/da7/fb9 [5162519,2496] 0
2026-03-09T00:04:11.991 INFO:tasks.workunit.client.0.vm03.stdout:2/752: unlink d8/d1b/laa 0
2026-03-09T00:04:11.991 INFO:tasks.workunit.client.0.vm03.stdout:2/753: chown d8/f5d 204763625 1
2026-03-09T00:04:12.002 INFO:tasks.workunit.client.1.vm06.stdout:9/851: write d1/d4/d6e/d14/d25/d85/d49/f69 [3459518,41771] 0
2026-03-09T00:04:12.002 INFO:tasks.workunit.client.0.vm03.stdout:3/533: dwrite d2/db/d3b/f3e [0,4194304] 0
2026-03-09T00:04:12.003 INFO:tasks.workunit.client.0.vm03.stdout:3/534: rmdir d2/db/d40/d51/d9e 0
2026-03-09T00:04:12.004 INFO:tasks.workunit.client.0.vm03.stdout:3/535: mkdir d2/db/d40/d51/da2 0
2026-03-09T00:04:12.004 INFO:tasks.workunit.client.0.vm03.stdout:3/536: fdatasync d2/db/d2d/f52 0
2026-03-09T00:04:12.020 INFO:tasks.workunit.client.1.vm06.stdout:8/956: dwrite db/f17 [0,4194304] 0
2026-03-09T00:04:12.022 INFO:tasks.workunit.client.1.vm06.stdout:8/957: readlink db/d74/d87/d100/l12d 0
2026-03-09T00:04:12.024 INFO:tasks.workunit.client.0.vm03.stdout:2/754: read d8/d26/f85 [95985,81999] 0
2026-03-09T00:04:12.025 INFO:tasks.workunit.client.0.vm03.stdout:1/811: dwrite d4/d3a/d32/da1/f113 [0,4194304] 0
2026-03-09T00:04:12.025 INFO:tasks.workunit.client.0.vm03.stdout:1/812: dread - d4/f39 zero size
2026-03-09T00:04:12.025 INFO:tasks.workunit.client.0.vm03.stdout:2/755: rmdir d8/d26/d5e/d5f/d95 39
2026-03-09T00:04:12.028 INFO:tasks.workunit.client.0.vm03.stdout:4/827: dwrite d7/d27/fc5 [0,4194304] 0
2026-03-09T00:04:12.028 INFO:tasks.workunit.client.1.vm06.stdout:8/958: dread db/d1e/f34 [0,4194304] 0
2026-03-09T00:04:12.028 INFO:tasks.workunit.client.1.vm06.stdout:8/959: creat db/dd/d24/dac/f133 x:0 0 0
2026-03-09T00:04:12.028 INFO:tasks.workunit.client.1.vm06.stdout:8/960: chown db/dd/d24/cef 6890814 1
2026-03-09T00:04:12.028 INFO:tasks.workunit.client.0.vm03.stdout:2/756: mkdir d8/d1b/d2a/d2e/df5 0
2026-03-09T00:04:12.033 INFO:tasks.workunit.client.0.vm03.stdout:1/813: read f0 [1798876,120305] 0
2026-03-09T00:04:12.033 INFO:tasks.workunit.client.0.vm03.stdout:1/814: chown d4/d3a/d32/d87/l96 357094 1
2026-03-09T00:04:12.041 INFO:tasks.workunit.client.0.vm03.stdout:4/828: mkdir d7/d6f/dcf/d105 0
2026-03-09T00:04:12.045 INFO:tasks.workunit.client.0.vm03.stdout:4/829: stat d7/d20/c59 0
2026-03-09T00:04:12.045 INFO:tasks.workunit.client.1.vm06.stdout:4/932: sync
2026-03-09T00:04:12.045 INFO:tasks.workunit.client.1.vm06.stdout:7/913: sync
2026-03-09T00:04:12.045 INFO:tasks.workunit.client.1.vm06.stdout:1/829: sync
2026-03-09T00:04:12.045 INFO:tasks.workunit.client.1.vm06.stdout:4/933: read d17/d21/d4c/dc2/f11f [761512,90212] 0
2026-03-09T00:04:12.045 INFO:tasks.workunit.client.0.vm03.stdout:2/757: unlink d8/d26/f5a 0
2026-03-09T00:04:12.045 INFO:tasks.workunit.client.0.vm03.stdout:3/537: write d2/f9 [3251676,104586] 0
2026-03-09T00:04:12.045 INFO:tasks.workunit.client.0.vm03.stdout:4/830: rename d7/d20/d6a/dea/d38/da9/ddc/l8b to d7/l106 0
2026-03-09T00:04:12.045 INFO:tasks.workunit.client.0.vm03.stdout:4/831: dread - d7/d20/d6a/dde/f104 zero size
2026-03-09T00:04:12.045 INFO:tasks.workunit.client.0.vm03.stdout:4/832: write d7/d20/d6a/dea/d38/fca [324013,67642] 0
2026-03-09T00:04:12.045 INFO:tasks.workunit.client.0.vm03.stdout:4/833: chown d7/d27/f89 256945 1
2026-03-09T00:04:12.045 INFO:tasks.workunit.client.0.vm03.stdout:4/834: chown d7/d20/d6a/dea/d54/c5a 526942 1
2026-03-09T00:04:12.045 INFO:tasks.workunit.client.1.vm06.stdout:1/830: creat d6/d21/d2d/d3b/d42/f114 x:0 0 0
2026-03-09T00:04:12.046 INFO:tasks.workunit.client.0.vm03.stdout:2/758: link d8/d1b/d2a/d6b/le3 d8/d1b/d6c/dd7/lf6 0
2026-03-09T00:04:12.048 INFO:tasks.workunit.client.1.vm06.stdout:7/914: write d0/df/d1a/d3a/f83 [3523374,9173] 0
2026-03-09T00:04:12.048 INFO:tasks.workunit.client.1.vm06.stdout:7/915: read d0/df/d1a/d22/f28 [122995,33007] 0
2026-03-09T00:04:12.048 INFO:tasks.workunit.client.1.vm06.stdout:7/916: write d0/df/d1a/d22/f28 [423351,35359] 0
2026-03-09T00:04:12.048 INFO:tasks.workunit.client.1.vm06.stdout:7/917: chown d0/df/d1a/d27/d4c/c58 15189383 1
2026-03-09T00:04:12.049 INFO:tasks.workunit.client.1.vm06.stdout:7/918: dread d0/df/d1a/d27/d4c/d40/d51/d90/dae/fc9 [0,4194304] 0
2026-03-09T00:04:12.049 INFO:tasks.workunit.client.1.vm06.stdout:7/919: creat d0/df/d7b/dd2/f10f x:0 0 0
2026-03-09T00:04:12.050 INFO:tasks.workunit.client.0.vm03.stdout:4/835: link d7/d20/d6a/dea/d54/cc0 d7/d20/d6a/dea/d4e/c107 0
2026-03-09T00:04:12.050 INFO:tasks.workunit.client.0.vm03.stdout:2/759: mknod d8/d26/d5e/d5f/d95/cf7 0
2026-03-09T00:04:12.050 INFO:tasks.workunit.client.1.vm06.stdout:4/934: unlink d17/d24/f3a 0
2026-03-09T00:04:12.050 INFO:tasks.workunit.client.1.vm06.stdout:4/935: truncate d17/d21/d4c/f87 1343500 0
2026-03-09T00:04:12.050 INFO:tasks.workunit.client.1.vm06.stdout:4/936: creat d17/d24/d3b/dbf/dea/f13d x:0 0 0
2026-03-09T00:04:12.050 INFO:tasks.workunit.client.1.vm06.stdout:1/831: symlink d6/d4c/d79/l115 0
2026-03-09T00:04:12.053 INFO:tasks.workunit.client.0.vm03.stdout:3/538: link d2/db/d3b/c50 d2/ca3 0
2026-03-09T00:04:12.054 INFO:tasks.workunit.client.0.vm03.stdout:8/696: dwrite d7/df/d1a/d40/d58/fb6 [0,4194304] 0
2026-03-09T00:04:12.056 INFO:tasks.workunit.client.0.vm03.stdout:4/836: getdents d7/de6 0
2026-03-09T00:04:12.056 INFO:tasks.workunit.client.0.vm03.stdout:4/837: stat d7/f5d 0
2026-03-09T00:04:12.056 INFO:tasks.workunit.client.0.vm03.stdout:4/838: fdatasync d7/d20/d6a/d77/f82 0
2026-03-09T00:04:12.057 INFO:tasks.workunit.client.1.vm06.stdout:7/920: creat d0/df/d1a/d27/d70/f110 x:0 0 0
2026-03-09T00:04:12.058 INFO:tasks.workunit.client.0.vm03.stdout:2/760: symlink d8/d26/lf8 0
2026-03-09T00:04:12.069 INFO:tasks.workunit.client.1.vm06.stdout:4/937: creat d17/d24/d3b/d75/d125/f13e x:0 0 0
2026-03-09T00:04:12.071 INFO:tasks.workunit.client.1.vm06.stdout:7/921: symlink d0/d39/l111 0
2026-03-09T00:04:12.071 INFO:tasks.workunit.client.1.vm06.stdout:7/922: stat d0/df/d1a/d27/d4c/l46 0
2026-03-09T00:04:12.071 INFO:tasks.workunit.client.1.vm06.stdout:4/938: creat d17/d24/d3b/dbf/ddf/df5/d133/f13f x:0 0 0
2026-03-09T00:04:12.071 INFO:tasks.workunit.client.1.vm06.stdout:4/939: dread d17/d21/d4c/d50/ff4 [0,4194304] 0
2026-03-09T00:04:12.071 INFO:tasks.workunit.client.0.vm03.stdout:3/539: link d2/f5 d2/db/d56/fa4 0
2026-03-09T00:04:12.071 INFO:tasks.workunit.client.0.vm03.stdout:3/540: dread - d2/db/d3b/f6c zero size
2026-03-09T00:04:12.072 INFO:tasks.workunit.client.0.vm03.stdout:8/697: creat d7/df/d1a/d40/db3/dba/dc3/fd3 x:0 0 0
2026-03-09T00:04:12.072 INFO:tasks.workunit.client.0.vm03.stdout:3/541: getdents d2/db/d3b 0
2026-03-09T00:04:12.072 INFO:tasks.workunit.client.0.vm03.stdout:3/542: chown d2/db/d3b/c79 4986 1
2026-03-09T00:04:12.072 INFO:tasks.workunit.client.0.vm03.stdout:3/543: chown d2/f5 4 1
2026-03-09T00:04:12.072 INFO:tasks.workunit.client.0.vm03.stdout:3/544: chown d2/db/f13 3 1
2026-03-09T00:04:12.072 INFO:tasks.workunit.client.0.vm03.stdout:3/545: unlink d2/db/d40/d44/l98 0
2026-03-09T00:04:12.072 INFO:tasks.workunit.client.0.vm03.stdout:3/546: dread - d2/db/d3b/d5d/d6d/d72/f86 zero size
2026-03-09T00:04:12.072 INFO:tasks.workunit.client.0.vm03.stdout:4/839: dread d7/d20/d6a/dea/d54/fc6 [0,4194304] 0
2026-03-09T00:04:12.072 INFO:tasks.workunit.client.0.vm03.stdout:4/840: chown d7/l1a 4992975 1
2026-03-09T00:04:12.072 INFO:tasks.workunit.client.1.vm06.stdout:0/986: dwrite d3/f19 [0,4194304] 0
2026-03-09T00:04:12.075 INFO:tasks.workunit.client.0.vm03.stdout:3/547: unlink d2/db/d3b/f62 0
2026-03-09T00:04:12.076 INFO:tasks.workunit.client.0.vm03.stdout:3/548: write d2/db/d40/f78 [182417,129550] 0
2026-03-09T00:04:12.076 INFO:tasks.workunit.client.1.vm06.stdout:4/940: creat d17/d24/d49/d5f/f140 x:0 0 0
2026-03-09T00:04:12.077 INFO:tasks.workunit.client.0.vm03.stdout:4/841: mkdir d7/d6f/d108 0
2026-03-09T00:04:12.078 INFO:tasks.workunit.client.1.vm06.stdout:0/987: getdents d3/d18/d2c/d2d/d74 0
2026-03-09T00:04:12.084 INFO:tasks.workunit.client.0.vm03.stdout:4/842: rmdir d7/d6f/dcf/d105 0
2026-03-09T00:04:12.091 INFO:tasks.workunit.client.0.vm03.stdout:4/843: creat d7/d20/d6a/dde/f109 x:0 0 0
2026-03-09T00:04:12.091 INFO:tasks.workunit.client.0.vm03.stdout:4/844: chown d7/d6f/da5/cad 421 1
2026-03-09T00:04:12.092 INFO:tasks.workunit.client.0.vm03.stdout:4/845: unlink d7/d20/d6a/dea/d54/d58/c80 0
2026-03-09T00:04:12.092 INFO:tasks.workunit.client.1.vm06.stdout:4/941: read d17/d24/d49/de4/db0/fe0 [569866,13736] 0
2026-03-09T00:04:12.092 INFO:tasks.workunit.client.1.vm06.stdout:4/942: mknod d17/d24/d49/d5f/c141 0
2026-03-09T00:04:12.092 INFO:tasks.workunit.client.1.vm06.stdout:4/943: read - d17/d24/d3b/d97/db7/f106 zero size
2026-03-09T00:04:12.092 INFO:tasks.workunit.client.1.vm06.stdout:4/944: truncate d17/d24/d3b/d5e/d7a/fde 827853 0
2026-03-09T00:04:12.096 INFO:tasks.workunit.client.1.vm06.stdout:4/945: rename d17/d21/d4c/dc2/ce8 to d17/d24/d49/c142 0
2026-03-09T00:04:12.096 INFO:tasks.workunit.client.1.vm06.stdout:4/946: chown d17/d21/caa 31314 1
2026-03-09T00:04:12.096 INFO:tasks.workunit.client.1.vm06.stdout:4/947: write d17/f20 [2398386,35254] 0
2026-03-09T00:04:12.097 INFO:tasks.workunit.client.0.vm03.stdout:7/634: dwrite d2/d4/f13 [0,4194304] 0
2026-03-09T00:04:12.097 INFO:tasks.workunit.client.1.vm06.stdout:4/948: rename d17/d5b/d8f to d17/d134/d143 0
2026-03-09T00:04:12.097 INFO:tasks.workunit.client.1.vm06.stdout:4/949: fdatasync f14 0
2026-03-09T00:04:12.098 INFO:tasks.workunit.client.0.vm03.stdout:7/635: rename d2/d1f/d3a/c2d to d2/d1f/d3a/d24/da4/d46/d81/d96/d88/db9/cbc 0
2026-03-09T00:04:12.103 INFO:tasks.workunit.client.0.vm03.stdout:7/636: chown d2/d4 14080 1
2026-03-09T00:04:12.103 INFO:tasks.workunit.client.0.vm03.stdout:7/637: rename d2/d1f/d35/c6f to d2/d1f/d35/cbd 0
2026-03-09T00:04:12.104 INFO:tasks.workunit.client.0.vm03.stdout:7/638: creat d2/d4/db7/d67/d6b/fbe x:0 0 0
2026-03-09T00:04:12.104 INFO:tasks.workunit.client.0.vm03.stdout:7/639: symlink d2/d4/d1e/d5e/lbf 0
2026-03-09T00:04:12.104 INFO:tasks.workunit.client.1.vm06.stdout:4/950: mkdir d17/d21/d4c/d66/d68/dbe/d144 0
2026-03-09T00:04:12.104 INFO:tasks.workunit.client.1.vm06.stdout:4/951: rmdir d17/d134 39
2026-03-09T00:04:12.104 INFO:tasks.workunit.client.1.vm06.stdout:4/952: write d17/d24/d3b/d5e/d7a/fde [893073,127650] 0
2026-03-09T00:04:12.104 INFO:tasks.workunit.client.1.vm06.stdout:4/953: fdatasync d17/d21/d4c/d66/d68/dbe/f101 0
2026-03-09T00:04:12.104 INFO:tasks.workunit.client.1.vm06.stdout:4/954: truncate d17/d24/d49/d5f/f140 804705 0
2026-03-09T00:04:12.104 INFO:tasks.workunit.client.1.vm06.stdout:4/955: write d17/d24/d3b/dbf/ddf/df5/d133/f13f [614968,52875] 0
2026-03-09T00:04:12.104 INFO:tasks.workunit.client.1.vm06.stdout:4/956: mkdir d17/d134/d145 0
2026-03-09T00:04:12.104 INFO:tasks.workunit.client.1.vm06.stdout:4/957: truncate d17/d24/d3b/d75/f9e 899415 0
2026-03-09T00:04:12.104 INFO:tasks.workunit.client.1.vm06.stdout:4/958: truncate d17/d24/d3b/d97/db7/df1/f10d 226953 0
2026-03-09T00:04:12.116 INFO:tasks.workunit.client.0.vm03.stdout:8/698: dread d7/f48 [0,4194304] 0
2026-03-09T00:04:12.119 INFO:tasks.workunit.client.0.vm03.stdout:0/725: write d2/da/dd/d49/d6c/d4b/d55/f78 [754918,60089] 0
2026-03-09T00:04:12.120 INFO:tasks.workunit.client.0.vm03.stdout:0/726: read d2/da/d1a/f91 [626287,14039] 0
2026-03-09T00:04:12.122 INFO:tasks.workunit.client.0.vm03.stdout:8/699: mkdir d7/df/d1a/d40/db3/dd4 0
2026-03-09T00:04:12.123 INFO:tasks.workunit.client.0.vm03.stdout:0/727: creat d2/da/d76/d8a/f108 x:0 0 0
2026-03-09T00:04:12.126 INFO:tasks.workunit.client.0.vm03.stdout:8/700: dread d7/df/d1a/f2e [0,4194304] 0
2026-03-09T00:04:12.126 INFO:tasks.workunit.client.0.vm03.stdout:8/701: write d7/df/d1a/d40/db3/dba/d38/d60/f71 [3330832,114999] 0
2026-03-09T00:04:12.129 INFO:tasks.workunit.client.0.vm03.stdout:8/702: rename d7/df/d1a/d40/db3/dba/dad/fc6 to d7/df/d1a/d2b/d62/fd5 0
2026-03-09T00:04:12.136 INFO:tasks.workunit.client.0.vm03.stdout:5/738: dwrite d1c/d20/d55/f5a [0,4194304] 0
2026-03-09T00:04:12.138 INFO:tasks.workunit.client.0.vm03.stdout:8/703: write d7/df/d1a/fc4 [1130334,113862] 0
2026-03-09T00:04:12.144 INFO:tasks.workunit.client.0.vm03.stdout:5/739: rename d1c/d20/f65 to d1c/d20/d55/d66/d6b/de3/ff4 0
2026-03-09T00:04:12.162 INFO:tasks.workunit.client.0.vm03.stdout:8/704: rmdir d7/df/d1a/d40/db3/dba/d3f 39
2026-03-09T00:04:12.162 INFO:tasks.workunit.client.0.vm03.stdout:5/740: unlink d1c/f29 0
2026-03-09T00:04:12.162 INFO:tasks.workunit.client.0.vm03.stdout:8/705: truncate d7/df/f53 1974725 0
2026-03-09T00:04:12.162 INFO:tasks.workunit.client.0.vm03.stdout:5/741: symlink d1c/d20/d55/d4f/d58/d73/d76/d91/lf5 0
2026-03-09T00:04:12.162 INFO:tasks.workunit.client.0.vm03.stdout:5/742: creat d1c/d20/d55/ff6 x:0 0 0
2026-03-09T00:04:12.171 INFO:tasks.workunit.client.1.vm06.stdout:8/961: dwrite db/d53/d70/d38/fa8 [0,4194304] 0
2026-03-09T00:04:12.177 INFO:tasks.workunit.client.1.vm06.stdout:8/962: write db/f114 [1638438,16756] 0
2026-03-09T00:04:12.177 INFO:tasks.workunit.client.1.vm06.stdout:8/963: read - db/d53/d70/d38/f12b zero size
2026-03-09T00:04:12.177 INFO:tasks.workunit.client.1.vm06.stdout:8/964: creat db/dd/d24/d63/f134 x:0 0 0
2026-03-09T00:04:12.178 INFO:tasks.workunit.client.1.vm06.stdout:8/965: mkdir db/d53/d70/d38/d135 0
2026-03-09T00:04:12.200 INFO:tasks.workunit.client.1.vm06.stdout:8/966: chown db/d74/d78/d98/db6/ff0 348033028 1
2026-03-09T00:04:12.200 INFO:tasks.workunit.client.1.vm06.stdout:8/967: mknod db/d74/d78/d98/db6/dc7/d10f/c136 0
2026-03-09T00:04:12.200 INFO:tasks.workunit.client.1.vm06.stdout:8/968: unlink db/d74/d87/d100/d10a/c129 0
2026-03-09T00:04:12.208 INFO:tasks.workunit.client.1.vm06.stdout:4/959: dread d17/d24/d3b/d5e/d7a/fde [0,4194304] 0
2026-03-09T00:04:12.208 INFO:tasks.workunit.client.1.vm06.stdout:4/960: truncate d17/d24/d49/d5f/f76 1397032 0
2026-03-09T00:04:12.209 INFO:tasks.workunit.client.1.vm06.stdout:4/961: creat d17/d21/d4c/d50/d12f/f146 x:0 0 0
2026-03-09T00:04:12.209 INFO:tasks.workunit.client.1.vm06.stdout:4/962: write d17/d24/d49/d5f/db2/ff2 [883172,39166] 0
2026-03-09T00:04:12.209 INFO:tasks.workunit.client.1.vm06.stdout:4/963: creat d17/d24/d49/d5f/db2/f147 x:0 0 0
2026-03-09T00:04:12.211 INFO:tasks.workunit.client.1.vm06.stdout:4/964: mkdir d17/d21/d4c/dc2/d148 0
2026-03-09T00:04:12.227 INFO:tasks.workunit.client.1.vm06.stdout:4/965: creat d17/d24/d3b/dbf/dea/f149 x:0 0 0
2026-03-09T00:04:12.227 INFO:tasks.workunit.client.1.vm06.stdout:4/966: getdents d17/d24/d3b/d5e/d127 0
2026-03-09T00:04:12.227 INFO:tasks.workunit.client.1.vm06.stdout:4/967: write d17/d21/d4c/d66/ff8 [1736429,113842] 0
2026-03-09T00:04:12.228 INFO:tasks.workunit.client.0.vm03.stdout:3/549: dwrite d2/db/f1a [0,4194304] 0
2026-03-09T00:04:12.229 INFO:tasks.workunit.client.0.vm03.stdout:3/550: chown d2/db/d40/f78 3 1
2026-03-09T00:04:12.229 INFO:tasks.workunit.client.0.vm03.stdout:3/551: stat d2/db/d40/d58/f7f 0
2026-03-09T00:04:12.230 INFO:tasks.workunit.client.0.vm03.stdout:3/552: rename d2/db/d3b/d5d/d6d to d2/db/d3b/d5f/da5 0
2026-03-09T00:04:12.231 INFO:tasks.workunit.client.0.vm03.stdout:3/553: unlink d2/db/d40/d44/f9f 0
2026-03-09T00:04:12.231 INFO:tasks.workunit.client.0.vm03.stdout:3/554: truncate d2/db/d2d/d55/f6f 150224 0
2026-03-09T00:04:12.249 INFO:tasks.workunit.client.1.vm06.stdout:1/832: dwrite d6/d21/d2d/d37/d6d/dd7/ff6 [0,4194304] 0
2026-03-09T00:04:12.249 INFO:tasks.workunit.client.0.vm03.stdout:2/761: dwrite d8/f9b [0,4194304] 0
2026-03-09T00:04:12.250 INFO:tasks.workunit.client.0.vm03.stdout:1/815: dwrite d4/d15/d77/dce/df6/fec [0,4194304] 0
2026-03-09T00:04:12.250 INFO:tasks.workunit.client.0.vm03.stdout:2/762: dread d8/d26/d5e/f7c [0,4194304] 0
2026-03-09T00:04:12.250 INFO:tasks.workunit.client.0.vm03.stdout:2/763: dread - d8/d1b/d2a/d6b/dc6/ff4 zero size
2026-03-09T00:04:12.259 INFO:tasks.workunit.client.0.vm03.stdout:1/816: mkdir d4/d3a/d8f/d104/d117 0
2026-03-09T00:04:12.260 INFO:tasks.workunit.client.0.vm03.stdout:1/817: truncate d4/d3a/d8f/fc4 278124 0
2026-03-09T00:04:12.261 INFO:tasks.workunit.client.0.vm03.stdout:2/764: rename d8/d1b/d2a/d6b/d50/d8a/fa3 to d8/d1b/d2a/d2e/df5/ff9 0
2026-03-09T00:04:12.261 INFO:tasks.workunit.client.0.vm03.stdout:2/765: stat d8/d1b/d24/da5/dda/def 0
2026-03-09T00:04:12.261 INFO:tasks.workunit.client.0.vm03.stdout:2/766: read d8/d26/d5e/d6f/d97/f1c [9775,6639] 0
2026-03-09T00:04:12.264 INFO:tasks.workunit.client.0.vm03.stdout:8/706: dwrite d7/df/d1a/d40/db3/dba/d38/d4c/d98/fa7 [0,4194304] 0
2026-03-09T00:04:12.265 INFO:tasks.workunit.client.0.vm03.stdout:7/640: dwrite d2/d1f/f62 [0,4194304] 0
2026-03-09T00:04:12.265 INFO:tasks.workunit.client.0.vm03.stdout:7/641: chown d2/d1f/d3a/d24/da4/f47 228292 1
2026-03-09T00:04:12.269 INFO:tasks.workunit.client.0.vm03.stdout:7/642: dread d2/d4/f2e [0,4194304] 0
2026-03-09T00:04:12.269 INFO:tasks.workunit.client.0.vm03.stdout:7/643: truncate d2/d1f/d3a/d24/da4/f47 5071989 0
2026-03-09T00:04:12.275 INFO:tasks.workunit.client.1.vm06.stdout:0/988: dwrite d3/d18/d1f/d39/d3b/df9/df2/d73/f149 [0,4194304] 0
2026-03-09T00:04:12.278 INFO:tasks.workunit.client.0.vm03.stdout:4/846: dwrite d7/d20/d6a/dea/d38/da9/ddc/f65 [0,4194304] 0
2026-03-09T00:04:12.278 INFO:tasks.workunit.client.0.vm03.stdout:4/847: getdents d7/de6 0
2026-03-09T00:04:12.280 INFO:tasks.workunit.client.0.vm03.stdout:1/818: mknod d4/d15/dae/d101/c118 0
2026-03-09T00:04:12.283 INFO:tasks.workunit.client.0.vm03.stdout:4/848: dread d7/d20/d6a/dea/f2a [0,4194304] 0
2026-03-09T00:04:12.296 INFO:tasks.workunit.client.0.vm03.stdout:2/767: mkdir d8/d1b/d8f/dfa 0
2026-03-09T00:04:12.306 INFO:tasks.workunit.client.1.vm06.stdout:0/989: link d3/d18/d2c/d2d/d74/da8/d109/l111 d3/d18/d2c/d2d/d74/dc7/d14f/l150 0
2026-03-09T00:04:12.308 INFO:tasks.workunit.client.0.vm03.stdout:8/707: rename l5 to d7/df/d1a/d40/db3/dba/d3f/ld6 0
2026-03-09T00:04:12.309 INFO:tasks.workunit.client.0.vm03.stdout:8/708: write d7/df/d1a/d2b/d62/fd5 [4924893,39819] 0
2026-03-09T00:04:12.309 INFO:tasks.workunit.client.0.vm03.stdout:4/849: mknod d7/d20/d6a/dea/d38/dfb/c10a 0
2026-03-09T00:04:12.314 INFO:tasks.workunit.client.0.vm03.stdout:8/709: stat d7/c8 0
2026-03-09T00:04:12.314 INFO:tasks.workunit.client.1.vm06.stdout:0/990: mkdir d3/d18/d1f/d39/d49/d60/d151 0
2026-03-09T00:04:12.315 INFO:tasks.workunit.client.1.vm06.stdout:0/991: creat d3/d18/d2c/d2d/d74/d90/f152 x:0 0 0
2026-03-09T00:04:12.336 INFO:tasks.workunit.client.0.vm03.stdout:0/728: dwrite d2/da/dd/d49/d6c/d4b/d55/ffa [0,4194304] 0
2026-03-09T00:04:12.337 INFO:tasks.workunit.client.0.vm03.stdout:0/729: unlink d2/da/d1a/fb7 0
2026-03-09T00:04:12.338 INFO:tasks.workunit.client.0.vm03.stdout:0/730: mkdir d2/da/dd/d49/d6c/d4b/d55/d6f/dad/de8/d109 0
2026-03-09T00:04:12.338 INFO:tasks.workunit.client.0.vm03.stdout:0/731: fsync d2/f7f 0
2026-03-09T00:04:12.338 INFO:tasks.workunit.client.0.vm03.stdout:0/732: chown d2/da/d1a/fd5 41055 1
2026-03-09T00:04:12.338 INFO:tasks.workunit.client.0.vm03.stdout:0/733: truncate d2/da/d76/fd9 1044137 0
2026-03-09T00:04:12.338 INFO:tasks.workunit.client.0.vm03.stdout:0/734: chown d2/da/dd/d49/d6c/da6/dda/db5/fb6 124423105 1
2026-03-09T00:04:12.339 INFO:tasks.workunit.client.0.vm03.stdout:0/735: unlink d2/da/d76/d8a/d8f/db8/lc7 0
2026-03-09T00:04:12.339 INFO:tasks.workunit.client.0.vm03.stdout:0/736: dread - d2/da/dd/d49/d6c/da6/dda/db5/ff1 zero size
2026-03-09T00:04:12.339 INFO:tasks.workunit.client.0.vm03.stdout:0/737: chown d2/f59 1393052 1
2026-03-09T00:04:12.340 INFO:tasks.workunit.client.0.vm03.stdout:0/738: creat d2/da/d36/ddf/f10a x:0 0 0
2026-03-09T00:04:12.342 INFO:tasks.workunit.client.0.vm03.stdout:0/739: link d2/da/dd/d49/d6c/d4b/d55/d6f/ff8 d2/da/dd/d6e/f10b 0
2026-03-09T00:04:12.354 INFO:tasks.workunit.client.0.vm03.stdout:8/710: dread d7/df/d1a/d40/db3/dba/d3f/f8f [0,4194304] 0
2026-03-09T00:04:12.354 INFO:tasks.workunit.client.0.vm03.stdout:8/711: truncate d7/df/d1a/d40/db3/dba/f24 854933 0
2026-03-09T00:04:12.354 INFO:tasks.workunit.client.0.vm03.stdout:8/712: creat d7/df/d1a/d40/db3/dba/d38/d91/fd7 x:0 0 0
2026-03-09T00:04:12.354 INFO:tasks.workunit.client.0.vm03.stdout:8/713: write d7/df/d1a/d40/db3/dba/d38/d4c/fb2 [695044,82367] 0
2026-03-09T00:04:12.354 INFO:tasks.workunit.client.0.vm03.stdout:8/714: stat d7/df/d1a/d40/db3/dba/d3f/f7d 0
2026-03-09T00:04:12.354 INFO:tasks.workunit.client.0.vm03.stdout:8/715: chown d7/df/d1a/d40/db3/dba/d3f/f59 657421211 1
2026-03-09T00:04:12.354 INFO:tasks.workunit.client.0.vm03.stdout:8/716: fsync d7/df/d1a/d40/db3/dba/d38/f3e 0
2026-03-09T00:04:12.368 INFO:tasks.workunit.client.0.vm03.stdout:6/698: sync
2026-03-09T00:04:12.374 INFO:tasks.workunit.client.0.vm03.stdout:6/699: symlink d13/d35/d72/le6 0
2026-03-09T00:04:12.374 INFO:tasks.workunit.client.0.vm03.stdout:6/700: mknod d13/d1e/d44/d4a/ce7 0
2026-03-09T00:04:12.376 INFO:tasks.workunit.client.0.vm03.stdout:6/701: rename d13/d1e/d44/d4a/d52/l7b to d13/d35/le8 0
2026-03-09T00:04:12.376 INFO:tasks.workunit.client.0.vm03.stdout:6/702: write d13/d35/f68 [1220108,101432] 0
2026-03-09T00:04:12.387 INFO:tasks.workunit.client.0.vm03.stdout:9/688: sync
2026-03-09T00:04:12.388 INFO:tasks.workunit.client.0.vm03.stdout:9/689: creat d15/d1c/d36/fe4 x:0 0 0
2026-03-09T00:04:12.388 INFO:tasks.workunit.client.0.vm03.stdout:9/690: mknod d15/d1c/d28/de1/ce5 0
2026-03-09T00:04:12.389 INFO:tasks.workunit.client.0.vm03.stdout:9/691: truncate d15/d1c/d36/d4d/fad 1019358 0
2026-03-09T00:04:12.389 INFO:tasks.workunit.client.0.vm03.stdout:9/692: creat d15/d1c/d21/d54/d87/d93/dcf/fe6 x:0 0 0
2026-03-09T00:04:12.389 INFO:tasks.workunit.client.0.vm03.stdout:9/693: stat d15 0
2026-03-09T00:04:12.389 INFO:tasks.workunit.client.0.vm03.stdout:9/694: read - d15/d1c/d21/d54/d87/fd6 zero size
2026-03-09T00:04:12.408 INFO:tasks.workunit.client.0.vm03.stdout:7/644: dwrite d2/d1f/d3a/d24/da4/d46/d54/f77 [0,4194304] 0
2026-03-09T00:04:12.410 INFO:tasks.workunit.client.0.vm03.stdout:7/645: mknod d2/d1f/cc0 0
2026-03-09T00:04:12.410 INFO:tasks.workunit.client.0.vm03.stdout:7/646: write d2/d1f/d3a/d24/da4/d46/d81/d96/f3f [3543326,11214] 0
2026-03-09T00:04:12.411 INFO:tasks.workunit.client.0.vm03.stdout:7/647: mknod d2/d1f/d3a/d24/da4/d46/d81/d96/d80/cc1 0
2026-03-09T00:04:12.411 INFO:tasks.workunit.client.0.vm03.stdout:7/648: read d2/d4/fb [7916474,74947] 0
2026-03-09T00:04:12.411 INFO:tasks.workunit.client.0.vm03.stdout:7/649: symlink d2/d1f/d3a/d24/da4/d46/d81/lc2 0
2026-03-09T00:04:12.412 INFO:tasks.workunit.client.0.vm03.stdout:5/743: getdents d1c/d20/d55/d66/d6b/de3 0
2026-03-09T00:04:12.413 INFO:tasks.workunit.client.0.vm03.stdout:7/650: rename d2/d1f/c52 to d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/cc3 0
2026-03-09T00:04:12.413 INFO:tasks.workunit.client.0.vm03.stdout:7/651: write d2/d4/d1e/f97 [412253,61699] 0
2026-03-09T00:04:12.413 INFO:tasks.workunit.client.0.vm03.stdout:7/652: chown d2/d1f/d3a/d24/da4/d46/d81/d96/d37/fb0 7 1
2026-03-09T00:04:12.417 INFO:tasks.workunit.client.0.vm03.stdout:5/744: rmdir d1c/d20/d55/d43/da7 0
2026-03-09T00:04:12.431 INFO:tasks.workunit.client.0.vm03.stdout:1/819: dwrite d4/d5e/f82 [0,4194304] 0
2026-03-09T00:04:12.432 INFO:tasks.workunit.client.0.vm03.stdout:1/820: creat d4/d3a/d32/d87/d111/f119 x:0 0 0
2026-03-09T00:04:12.432 INFO:tasks.workunit.client.0.vm03.stdout:1/821: truncate d4/d3a/f2c 1709391 0
2026-03-09T00:04:12.441 INFO:tasks.workunit.client.1.vm06.stdout:9/852: dwrite d1/d4/d6e/d9/f3d [0,4194304] 0
2026-03-09T00:04:12.443 INFO:tasks.workunit.client.1.vm06.stdout:9/853: rename d1/d4/d6e/d14/d25/c6a to d1/d3/d4f/d91/d94/ddf/dfe/c119 0
2026-03-09T00:04:12.444 INFO:tasks.workunit.client.1.vm06.stdout:9/854: mkdir d1/d3/d4f/d52/de3/d11a 0
2026-03-09T00:04:12.445 INFO:tasks.workunit.client.1.vm06.stdout:9/855: rmdir d1/d3/d4f/d52/de3/de5 39
2026-03-09T00:04:12.446 INFO:tasks.workunit.client.1.vm06.stdout:9/856: link d1/d4/d6e/d14/d25/d85/f90 d1/d3/d4f/d52/f11b 0
2026-03-09T00:04:12.453 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:12 vm06.local ceph-mon[58395]: Upgrade: Updating mgr.vm06.rzcvhn
2026-03-09T00:04:12.453 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:12 vm06.local ceph-mon[58395]: Deploying daemon mgr.vm06.rzcvhn on vm06
2026-03-09T00:04:12.462 INFO:tasks.workunit.client.1.vm06.stdout:9/857: write d1/da7/fc0 [1113817,105589] 0
2026-03-09T00:04:12.463 INFO:tasks.workunit.client.1.vm06.stdout:9/858: symlink d1/l11c 0
2026-03-09T00:04:12.467 INFO:tasks.workunit.client.1.vm06.stdout:9/859: dread d1/d3/d4f/d91/d94/fd3 [0,4194304] 0
2026-03-09T00:04:12.479 INFO:tasks.workunit.client.1.vm06.stdout:9/860: creat d1/d3/f11d x:0 0 0
2026-03-09T00:04:12.482 INFO:tasks.workunit.client.0.vm03.stdout:2/768: dwrite d8/d26/d5e/d6f/d97/f1d [0,4194304] 0
2026-03-09T00:04:12.483 INFO:tasks.workunit.client.0.vm03.stdout:2/769: symlink d8/d1b/d8f/lfb 0
2026-03-09T00:04:12.485 INFO:tasks.workunit.client.0.vm03.stdout:2/770: mkdir d8/d26/dfc 0
2026-03-09T00:04:12.485 INFO:tasks.workunit.client.0.vm03.stdout:2/771: chown d8/f15 810 1
2026-03-09T00:04:12.511 INFO:tasks.workunit.client.1.vm06.stdout:8/969: fsync db/dd/d24/d63/f134 0
2026-03-09T00:04:12.513 INFO:tasks.workunit.client.1.vm06.stdout:4/968: write d17/d24/d49/d5f/fad [1097230,36931] 0
2026-03-09T00:04:12.513 INFO:tasks.workunit.client.0.vm03.stdout:3/555: truncate d2/db/f1a 2518453 0
2026-03-09T00:04:12.515 INFO:tasks.workunit.client.1.vm06.stdout:8/970: rename db/dd/d24/f33 to db/dd/d48/f137 0
2026-03-09T00:04:12.517 INFO:tasks.workunit.client.0.vm03.stdout:7/653: dwrite d2/d1f/d3a/d24/da4/d46/f5b [0,4194304] 0
2026-03-09T00:04:12.518 INFO:tasks.workunit.client.1.vm06.stdout:4/969: link d17/d24/l44 d17/d24/d49/de4/db0/ddd/l14a 0
2026-03-09T00:04:12.518 INFO:tasks.workunit.client.0.vm03.stdout:6/703: dwrite d13/f1a [4194304,4194304] 0
2026-03-09T00:04:12.519 INFO:tasks.workunit.client.0.vm03.stdout:0/740: dwrite d2/da/d76/d8a/d8f/db8/fe6 [0,4194304] 0
2026-03-09T00:04:12.520 INFO:tasks.workunit.client.0.vm03.stdout:3/556: symlink d2/db/d2d/la6 0
2026-03-09T00:04:12.520 INFO:tasks.workunit.client.0.vm03.stdout:2/772: dread d8/d1b/d24/f2f [4194304,4194304] 0
2026-03-09T00:04:12.521 INFO:tasks.workunit.client.0.vm03.stdout:7/654: creat d2/d4/db7/daa/fc4 x:0 0 0
2026-03-09T00:04:12.521 INFO:tasks.workunit.client.0.vm03.stdout:7/655: write d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/d9c/f4a [5154779,14326] 0
2026-03-09T00:04:12.527 INFO:tasks.workunit.client.0.vm03.stdout:7/656: dread d2/d4/db7/d67/f70 [0,4194304] 0
2026-03-09T00:04:12.527 INFO:tasks.workunit.client.0.vm03.stdout:7/657: dread - d2/d4/d1e/d78/fa5 zero size
2026-03-09T00:04:12.527 INFO:tasks.workunit.client.0.vm03.stdout:7/658: readlink d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/d9c/l69 0
2026-03-09T00:04:12.527 INFO:tasks.workunit.client.0.vm03.stdout:8/717: dwrite d7/df/d1a/d2b/f5c [0,4194304] 0
2026-03-09T00:04:12.527 INFO:tasks.workunit.client.1.vm06.stdout:4/970: mkdir d17/d14b 0
2026-03-09T00:04:12.527 INFO:tasks.workunit.client.1.vm06.stdout:4/971: read d17/d5b/dac/f112 [358034,10428] 0
2026-03-09T00:04:12.530 INFO:tasks.workunit.client.0.vm03.stdout:6/704: link d13/d1e/d44/d59/d77/f96 d13/d1e/d44/d59/fe9 0
2026-03-09T00:04:12.531 INFO:tasks.workunit.client.1.vm06.stdout:4/972: rmdir d17/d134 39
2026-03-09T00:04:12.531 INFO:tasks.workunit.client.1.vm06.stdout:4/973: creat d17/d21/d4c/d66/de3/d10b/f14c x:0 0 0
2026-03-09T00:04:12.531 INFO:tasks.workunit.client.1.vm06.stdout:4/974: creat d17/d5b/dac/f14d x:0 0 0
2026-03-09T00:04:12.537 INFO:tasks.workunit.client.0.vm03.stdout:0/741: symlink d2/da/dd/d49/d6c/d4b/d55/d6f/dad/l10c 0
2026-03-09T00:04:12.547 INFO:tasks.workunit.client.1.vm06.stdout:4/975: mknod d17/d24/d3b/d97/db7/d12c/c14e 0
2026-03-09T00:04:12.547 INFO:tasks.workunit.client.1.vm06.stdout:4/976: chown d17/d24/d49/c23 1002 1
2026-03-09T00:04:12.547 INFO:tasks.workunit.client.1.vm06.stdout:4/977: link d17/d21/f11b d17/d24/d3b/d97/db7/df1/f14f 0
2026-03-09T00:04:12.547 INFO:tasks.workunit.client.1.vm06.stdout:4/978: fdatasync d17/d21/d32/fd6 0
2026-03-09T00:04:12.547 INFO:tasks.workunit.client.1.vm06.stdout:4/979: creat d17/d24/d3b/dbf/ddf/df5/f150 x:0 0 0
2026-03-09T00:04:12.547 INFO:tasks.workunit.client.0.vm03.stdout:3/557: symlink d2/db/d3b/d5f/da5/d72/d96/la7 0
2026-03-09T00:04:12.547 INFO:tasks.workunit.client.0.vm03.stdout:2/773: symlink d8/d1b/d2a/d2e/df5/lfd 0
2026-03-09T00:04:12.547 INFO:tasks.workunit.client.0.vm03.stdout:7/659: link d2/d1f/d3a/d24/l76 d2/d1f/d3a/d24/da4/d46/d81/d96/d88/db9/lc5 0
2026-03-09T00:04:12.547 INFO:tasks.workunit.client.0.vm03.stdout:8/718: mknod d7/df/d1a/d40/d9d/da3/dd2/cd8 0
2026-03-09T00:04:12.547 INFO:tasks.workunit.client.0.vm03.stdout:8/719: fdatasync d7/df/d1a/d40/db3/dba/d3f/d95/fb4 0
2026-03-09T00:04:12.547 INFO:tasks.workunit.client.1.vm06.stdout:3/937: sync
2026-03-09T00:04:12.547 INFO:tasks.workunit.client.1.vm06.stdout:7/923: sync
2026-03-09T00:04:12.548 INFO:tasks.workunit.client.1.vm06.stdout:7/924: dread - d0/df/d1a/d27/d70/d9b/f9f zero size
2026-03-09T00:04:12.548 INFO:tasks.workunit.client.1.vm06.stdout:6/919: sync
2026-03-09T00:04:12.548 INFO:tasks.workunit.client.1.vm06.stdout:6/920: stat d4/c15 0
2026-03-09T00:04:12.548 INFO:tasks.workunit.client.1.vm06.stdout:6/921: chown d4/d27/d3e/d45/dea 190 1
2026-03-09T00:04:12.550 INFO:tasks.workunit.client.1.vm06.stdout:4/980: mknod d17/d21/d4c/dc2/d148/c151 0
2026-03-09T00:04:12.553 INFO:tasks.workunit.client.1.vm06.stdout:3/938: mkdir d11/d28/d2e/d2f/d36/d8f/d12e/d13f 0
2026-03-09T00:04:12.553 INFO:tasks.workunit.client.1.vm06.stdout:3/939: getdents d11/d28/d2e/d2f/dfe 0
2026-03-09T00:04:12.553 INFO:tasks.workunit.client.1.vm06.stdout:3/940: dread - d11/d28/d4d/d89/d90/fa7 zero size
2026-03-09T00:04:12.557 INFO:tasks.workunit.client.1.vm06.stdout:3/941: dread f7 [0,4194304] 0
2026-03-09T00:04:12.558 INFO:tasks.workunit.client.0.vm03.stdout:1/822: dwrite d4/d15/f7f [0,4194304] 0
2026-03-09T00:04:12.566 INFO:tasks.workunit.client.1.vm06.stdout:4/981: getdents d17/d24/d3b/dbf/ddf/dfc 0
2026-03-09T00:04:12.566 INFO:tasks.workunit.client.1.vm06.stdout:3/942: creat d11/d117/f140 x:0 0 0
2026-03-09T00:04:12.566 INFO:tasks.workunit.client.1.vm06.stdout:4/982: stat d17/d24/d3b/dbf/ddf/df5/l10c 0
2026-03-09T00:04:12.566 INFO:tasks.workunit.client.1.vm06.stdout:3/943: rename d11/d28/d2e/d2f/d5b/d94/f102 to d11/d28/f141 0
2026-03-09T00:04:12.566 INFO:tasks.workunit.client.1.vm06.stdout:4/983: mkdir d17/d24/d49/d5f/db2/d152 0
2026-03-09T00:04:12.569 INFO:tasks.workunit.client.0.vm03.stdout:6/705: rename d13/d8f to d13/dc4/dea 0
2026-03-09T00:04:12.571 INFO:tasks.workunit.client.0.vm03.stdout:2/774: mkdir d8/d1b/d24/da5/dfe 0
2026-03-09T00:04:12.571 INFO:tasks.workunit.client.0.vm03.stdout:2/775: chown d8/d1b/d8f/ccd 96 1
2026-03-09T00:04:12.571 INFO:tasks.workunit.client.0.vm03.stdout:7/660: mkdir d2/d1f/dc6 0
2026-03-09T00:04:12.571 INFO:tasks.workunit.client.1.vm06.stdout:6/922: dread d4/d27/d3e/d78/f91 [0,4194304] 0
2026-03-09T00:04:12.572 INFO:tasks.workunit.client.0.vm03.stdout:8/720: getdents d7/df/d1a/d40/db3/dba/d38/d60 0
2026-03-09T00:04:12.575 INFO:tasks.workunit.client.0.vm03.stdout:1/823: write d4/d5e/f82 [5912708,113775] 0
2026-03-09T00:04:12.575 INFO:tasks.workunit.client.0.vm03.stdout:6/706: rename d13/d1e/f8c to d13/d35/d71/d97/da5/db1/feb 0
2026-03-09T00:04:12.575 INFO:tasks.workunit.client.0.vm03.stdout:6/707: chown d13/d1e/d44/d59/fe0 1787723 1
2026-03-09T00:04:12.576 INFO:tasks.workunit.client.0.vm03.stdout:3/558: getdents d2/db/d40/d58 0
2026-03-09T00:04:12.577 INFO:tasks.workunit.client.0.vm03.stdout:7/661: mknod d2/d1f/cc7 0
2026-03-09T00:04:12.577 INFO:tasks.workunit.client.0.vm03.stdout:8/721: stat d7/f9 0
2026-03-09T00:04:12.577 INFO:tasks.workunit.client.0.vm03.stdout:8/722: chown d7/df/d1a/d40/db3/dba/d38/d4c/d98/fa7 708 1
2026-03-09T00:04:12.577 INFO:tasks.workunit.client.0.vm03.stdout:8/723: dread - d7/df/d1a/d2b/f9f zero size
2026-03-09T00:04:12.577 INFO:tasks.workunit.client.0.vm03.stdout:8/724: readlink d7/df/d1a/d40/db3/l89 0
2026-03-09T00:04:12.578 INFO:tasks.workunit.client.0.vm03.stdout:3/559: stat d2/db/l33 0
2026-03-09T00:04:12.578 INFO:tasks.workunit.client.0.vm03.stdout:2/776: getdents d8/d26/d5e/db1 0
2026-03-09T00:04:12.578 INFO:tasks.workunit.client.0.vm03.stdout:2/777: readlink d8/d1b/d8f/la0 0
2026-03-09T00:04:12.584 INFO:tasks.workunit.client.0.vm03.stdout:0/742: dread d2/da/dd/d49/d6c/d4b/d55/f83 [0,4194304] 0
2026-03-09T00:04:12.586 INFO:tasks.workunit.client.0.vm03.stdout:7/662: creat d2/d1f/d3a/d24/da4/d46/d81/d96/d8e/fc8 x:0 0 0
2026-03-09T00:04:12.586 INFO:tasks.workunit.client.0.vm03.stdout:8/725: mkdir d7/df/d1a/d40/dd9 0
2026-03-09T00:04:12.587 INFO:tasks.workunit.client.0.vm03.stdout:8/726: creat d7/df/d1a/d2b/fda x:0 0 0
2026-03-09T00:04:12.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:12 vm03.local ceph-mon[52346]: Upgrade: Updating mgr.vm06.rzcvhn
2026-03-09T00:04:12.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:12 vm03.local ceph-mon[52346]: Deploying daemon mgr.vm06.rzcvhn on vm06
2026-03-09T00:04:12.588 INFO:tasks.workunit.client.0.vm03.stdout:8/727: write d7/df/d1a/d40/db3/dba/d3f/f47 [675034,123952] 0
2026-03-09T00:04:12.588 INFO:tasks.workunit.client.0.vm03.stdout:8/728: mknod d7/df/d1a/d2b/d62/cdb 0
2026-03-09T00:04:12.591 INFO:tasks.workunit.client.0.vm03.stdout:7/663: dread d2/d4/f2e [0,4194304] 0
2026-03-09T00:04:12.591 INFO:tasks.workunit.client.0.vm03.stdout:7/664: chown d2/d1f/d3a/d24/da4/d46/d81/d96/f44 1332 1
2026-03-09T00:04:12.594 INFO:tasks.workunit.client.0.vm03.stdout:5/745: dwrite d1c/d20/d55/d4f/d58/d73/d9e/fe4 [0,4194304] 0
2026-03-09T00:04:12.600 INFO:tasks.workunit.client.1.vm06.stdout:6/923: write d4/f5 [4987166,102097] 0
2026-03-09T00:04:12.600 INFO:tasks.workunit.client.0.vm03.stdout:8/729: symlink d7/df/d1a/d40/db3/dba/d38/ldc 0
2026-03-09T00:04:12.600 INFO:tasks.workunit.client.0.vm03.stdout:8/730: write d7/df/d1a/f1c [3960961,80925] 0
2026-03-09T00:04:12.601 INFO:tasks.workunit.client.1.vm06.stdout:6/924: creat d4/d27/f11d x:0 0 0
2026-03-09T00:04:12.601 INFO:tasks.workunit.client.1.vm06.stdout:6/925: chown d4/d16/d53/df2/d11c 51766561 1
2026-03-09T00:04:12.602 INFO:tasks.workunit.client.1.vm06.stdout:6/926: write d4/fc [1407433,98106] 0
2026-03-09T00:04:12.602 INFO:tasks.workunit.client.1.vm06.stdout:6/927: fsync d4/d27/f31 0
2026-03-09T00:04:12.603 INFO:tasks.workunit.client.0.vm03.stdout:8/731: unlink d7/df/l17 0
2026-03-09T00:04:12.604 INFO:tasks.workunit.client.0.vm03.stdout:7/665: dread d2/d4/fb [0,4194304] 0
2026-03-09T00:04:12.606 INFO:tasks.workunit.client.1.vm06.stdout:6/928: getdents d4/d16/d46/d90 0
2026-03-09T00:04:12.606 INFO:tasks.workunit.client.1.vm06.stdout:6/929: read d4/ff [3638595,114568] 0
2026-03-09T00:04:12.606 INFO:tasks.workunit.client.0.vm03.stdout:7/666: mknod d2/d1f/d3a/d24/da4/d46/d81/d96/d37/cc9 0
2026-03-09T00:04:12.608 INFO:tasks.workunit.client.0.vm03.stdout:7/667: rename d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/d9c/c51 to d2/d4/d1e/d5e/daf/cca 0
2026-03-09T00:04:12.609 INFO:tasks.workunit.client.0.vm03.stdout:7/668: rmdir d2/d1f/d3a/d24/da4/d46/d54/d8d/dad 39
2026-03-09T00:04:12.610 INFO:tasks.workunit.client.0.vm03.stdout:7/669: creat d2/d4/db7/fcb x:0 0 0
2026-03-09T00:04:12.610 INFO:tasks.workunit.client.0.vm03.stdout:7/670: truncate d2/d1f/d3a/d24/da4/d46/d81/d96/d37/d39/d6e/fa1 151950 0
2026-03-09T00:04:12.610 INFO:tasks.workunit.client.0.vm03.stdout:7/671: symlink d2/d1f/d3a/d24/da4/d46/d81/d96/d8e/db2/lcc 0
2026-03-09T00:04:12.611 INFO:tasks.workunit.client.0.vm03.stdout:7/672: link d2/d4/db7/daa/fb3 d2/fcd 0
2026-03-09T00:04:12.611 INFO:tasks.workunit.client.0.vm03.stdout:7/673: write d2/d4/db7/d67/f64 [2119871,83632] 0
2026-03-09T00:04:12.665 INFO:tasks.workunit.client.1.vm06.stdout:1/833: dwrite d6/d21/fc1 [0,4194304] 0
2026-03-09T00:04:12.665 INFO:tasks.workunit.client.1.vm06.stdout:1/834: readlink l5 0
2026-03-09T00:04:12.668 INFO:tasks.workunit.client.0.vm03.stdout:0/743: dread d2/da/dd/d49/fcb [0,4194304] 0
2026-03-09T00:04:12.668 INFO:tasks.workunit.client.0.vm03.stdout:0/744: readlink d2/da/d76/d8a/d8f/l97 0
2026-03-09T00:04:12.668 INFO:tasks.workunit.client.0.vm03.stdout:0/745: truncate d2/da/d76/fa1 1021156 0
2026-03-09T00:04:12.673 INFO:tasks.workunit.client.1.vm06.stdout:1/835: stat d6/d21/c49 0
2026-03-09T00:04:12.679 INFO:tasks.workunit.client.1.vm06.stdout:1/836: readlink d6/d21/d2d/d37/d6d/laa 0
2026-03-09T00:04:12.680 INFO:tasks.workunit.client.0.vm03.stdout:7/674: dread d2/d4/db7/d67/f64 [0,4194304] 0
2026-03-09T00:04:12.680 INFO:tasks.workunit.client.0.vm03.stdout:7/675: chown d2/d4/d1e/d85/la7 59364 1
2026-03-09T00:04:12.680 INFO:tasks.workunit.client.1.vm06.stdout:1/837: creat d6/d4c/d79/f116 x:0 0 0
2026-03-09T00:04:12.687 INFO:tasks.workunit.client.1.vm06.stdout:7/925: dwrite d0/df/d1a/d27/d4c/d40/d5b/f78 [0,4194304] 0
2026-03-09T00:04:12.695 INFO:tasks.workunit.client.0.vm03.stdout:7/676: truncate d2/d1f/d3a/f1a 4171582 0
2026-03-09T00:04:12.695 INFO:tasks.workunit.client.0.vm03.stdout:7/677: dread - d2/d4/d1e/d78/fa5 zero size
2026-03-09T00:04:12.695 INFO:tasks.workunit.client.0.vm03.stdout:7/678: fdatasync d2/d1f/f11 0
2026-03-09T00:04:12.695 INFO:tasks.workunit.client.0.vm03.stdout:7/679: fsync d2/d1f/f62 0
2026-03-09T00:04:12.696 INFO:tasks.workunit.client.1.vm06.stdout:7/926: mkdir d0/df/d17/dba/d112 0
2026-03-09T00:04:12.696 INFO:tasks.workunit.client.1.vm06.stdout:7/927: fdatasync d0/df/d1a/d27/d4c/d40/d51/d86/fc3 0
2026-03-09T00:04:12.696 INFO:tasks.workunit.client.1.vm06.stdout:7/928: creat d0/df/d1a/d3f/f113 x:0 0 0
2026-03-09T00:04:12.696 INFO:tasks.workunit.client.1.vm06.stdout:7/929: readlink d0/d55/d99/d102/l104 0
2026-03-09T00:04:12.696 INFO:tasks.workunit.client.1.vm06.stdout:7/930: rmdir d0/df/d1a/d27/d4c 39
2026-03-09T00:04:12.696
INFO:tasks.workunit.client.1.vm06.stdout:7/931: fdatasync d0/df/d7b/dd2/f10f 0 2026-03-09T00:04:12.696 INFO:tasks.workunit.client.1.vm06.stdout:7/932: stat d0/df/d1a/d27/d4c/d40/d5b 0 2026-03-09T00:04:12.696 INFO:tasks.workunit.client.1.vm06.stdout:7/933: mkdir d0/df/d1a/d22/d114 0 2026-03-09T00:04:12.696 INFO:tasks.workunit.client.1.vm06.stdout:7/934: mknod d0/df/d1a/d27/d4c/d40/d51/c115 0 2026-03-09T00:04:12.696 INFO:tasks.workunit.client.1.vm06.stdout:7/935: fdatasync d0/df/f8a 0 2026-03-09T00:04:12.696 INFO:tasks.workunit.client.1.vm06.stdout:7/936: symlink d0/df/d1a/d22/l116 0 2026-03-09T00:04:12.696 INFO:tasks.workunit.client.1.vm06.stdout:7/937: mkdir d0/df/d1a/d27/d117 0 2026-03-09T00:04:12.696 INFO:tasks.workunit.client.1.vm06.stdout:7/938: creat d0/df/d1a/d3f/d53/f118 x:0 0 0 2026-03-09T00:04:12.696 INFO:tasks.workunit.client.1.vm06.stdout:7/939: rmdir d0/df/d7b 39 2026-03-09T00:04:12.696 INFO:tasks.workunit.client.1.vm06.stdout:7/940: truncate d0/df/d7b/dd2/f101 283777 0 2026-03-09T00:04:12.696 INFO:tasks.workunit.client.1.vm06.stdout:7/941: fsync d0/df/d1a/d27/d4c/d40/d5b/fd7 0 2026-03-09T00:04:12.696 INFO:tasks.workunit.client.1.vm06.stdout:7/942: write d0/df/d1a/d3f/f113 [771110,122837] 0 2026-03-09T00:04:12.723 INFO:tasks.workunit.client.0.vm03.stdout:1/824: dwrite d4/d3a/f26 [0,4194304] 0 2026-03-09T00:04:12.724 INFO:tasks.workunit.client.0.vm03.stdout:1/825: mknod d4/d6/d52/c11a 0 2026-03-09T00:04:12.725 INFO:tasks.workunit.client.0.vm03.stdout:2/778: dwrite d8/d1b/d2a/fbb [0,4194304] 0 2026-03-09T00:04:12.727 INFO:tasks.workunit.client.0.vm03.stdout:1/826: read d4/d6/f6e [1767610,93644] 0 2026-03-09T00:04:12.747 INFO:tasks.workunit.client.0.vm03.stdout:8/732: dwrite d7/df/d1a/d40/db3/dba/d3f/f8f [0,4194304] 0 2026-03-09T00:04:12.747 INFO:tasks.workunit.client.0.vm03.stdout:8/733: fsync d7/f48 0 2026-03-09T00:04:12.761 INFO:tasks.workunit.client.0.vm03.stdout:8/734: link f6 d7/df/d1a/d40/db3/dba/d38/d60/dcd/fdd 0 2026-03-09T00:04:12.761 INFO:tasks.workunit.client.0.vm03.stdout:8/735: rename d7/f48 to d7/df/d1a/d40/db3/dba/d38/d4c/fde 0 2026-03-09T00:04:12.761 INFO:tasks.workunit.client.0.vm03.stdout:8/736: stat d7/df/l6f 0 2026-03-09T00:04:12.770 INFO:tasks.workunit.client.0.vm03.stdout:8/737: dread d7/df/d1a/f4f [0,4194304] 0 2026-03-09T00:04:12.770 INFO:tasks.workunit.client.0.vm03.stdout:5/746: dwrite d1c/d51/d6a/d75/fd2 [0,4194304] 0 2026-03-09T00:04:12.775 INFO:tasks.workunit.client.0.vm03.stdout:8/738: mkdir d7/df/d1a/d40/db3/dba/d38/d91/ddf 0 2026-03-09T00:04:12.775 INFO:tasks.workunit.client.0.vm03.stdout:0/746: dwrite d2/da/d1a/fc4 [0,4194304] 0 2026-03-09T00:04:12.776 INFO:tasks.workunit.client.0.vm03.stdout:0/747: dread d2/d5a/fbd [0,4194304] 0 2026-03-09T00:04:12.776 INFO:tasks.workunit.client.0.vm03.stdout:0/748: chown d2/da/dd/d49/d6c/da6/lc3 731 1 2026-03-09T00:04:12.776 INFO:tasks.workunit.client.0.vm03.stdout:0/749: readlink d2/da/d1a/l46 0 2026-03-09T00:04:12.780 INFO:tasks.workunit.client.0.vm03.stdout:8/739: dread d7/df/d1a/d40/f69 [0,4194304] 0 2026-03-09T00:04:12.780 INFO:tasks.workunit.client.0.vm03.stdout:8/740: write d7/df/d1a/d40/db3/dba/d38/d91/fbf [737703,48561] 0 2026-03-09T00:04:12.785 INFO:tasks.workunit.client.0.vm03.stdout:1/827: dwrite d4/d15/d5c/f9c [0,4194304] 0 2026-03-09T00:04:12.788 INFO:tasks.workunit.client.0.vm03.stdout:0/750: getdents d2/da 0 2026-03-09T00:04:12.793 INFO:tasks.workunit.client.0.vm03.stdout:1/828: mkdir d4/d3a/d3d/d46/d11b 0 2026-03-09T00:04:12.806 INFO:tasks.workunit.client.0.vm03.stdout:1/829: unlink d4/lc 0 
2026-03-09T00:04:12.806 INFO:tasks.workunit.client.0.vm03.stdout:0/751: write d2/da/dd/d49/d6c/d4b/f88 [3062170,51013] 0 2026-03-09T00:04:12.806 INFO:tasks.workunit.client.0.vm03.stdout:0/752: fsync d2/da/d36/ddf/df7/f105 0 2026-03-09T00:04:12.806 INFO:tasks.workunit.client.0.vm03.stdout:0/753: dread - d2/da/dd/d49/d6c/d4b/f100 zero size 2026-03-09T00:04:12.808 INFO:tasks.workunit.client.0.vm03.stdout:1/830: dread d4/d3a/d3d/d46/f4c [0,4194304] 0 2026-03-09T00:04:12.808 INFO:tasks.workunit.client.0.vm03.stdout:0/754: creat d2/da/dd/d49/d6c/d4b/daf/f10d x:0 0 0 2026-03-09T00:04:12.808 INFO:tasks.workunit.client.0.vm03.stdout:0/755: rename d2/da/dd/d49 to d2/da/dd/d49/d6c/d4b/d55/d6f/d10e 22 2026-03-09T00:04:12.808 INFO:tasks.workunit.client.0.vm03.stdout:0/756: truncate d2/da/dd/d6e/f10b 488966 0 2026-03-09T00:04:12.811 INFO:tasks.workunit.client.0.vm03.stdout:0/757: link d2/da/dd/d49/d6c/l64 d2/da/d76/d8a/d8f/db8/l10f 0 2026-03-09T00:04:12.823 INFO:tasks.workunit.client.0.vm03.stdout:1/831: write d4/d15/dae/d101/f108 [1952034,128390] 0 2026-03-09T00:04:12.824 INFO:tasks.workunit.client.0.vm03.stdout:5/747: dwrite fe [0,4194304] 0 2026-03-09T00:04:12.824 INFO:tasks.workunit.client.0.vm03.stdout:5/748: truncate d1c/d20/d55/d4f/d58/d5d/fb6 849206 0 2026-03-09T00:04:12.824 INFO:tasks.workunit.client.0.vm03.stdout:5/749: write d1c/d20/d55/d4f/fc1 [4163637,26768] 0 2026-03-09T00:04:12.824 INFO:tasks.workunit.client.0.vm03.stdout:5/750: write f12 [4658370,21492] 0 2026-03-09T00:04:12.824 INFO:tasks.workunit.client.0.vm03.stdout:5/751: write d1c/d20/d55/d66/d6b/de3/ff4 [316394,47244] 0 2026-03-09T00:04:12.828 INFO:tasks.workunit.client.0.vm03.stdout:5/752: rename d1c/d20/d55/d4f/d58/d73/d9e/da5 to d1c/d20/d55/d4f/d58/db5/df7 0 2026-03-09T00:04:12.833 INFO:tasks.workunit.client.0.vm03.stdout:5/753: rename d1c/d20/d56/d74/f84 to d1c/d20/d55/d4f/d58/db5/df7/ff8 0 2026-03-09T00:04:12.834 INFO:tasks.workunit.client.0.vm03.stdout:5/754: creat d1c/d20/d55/d66/d70/ff9 x:0 0 0 2026-03-09T00:04:12.839 INFO:tasks.workunit.client.0.vm03.stdout:8/741: dwrite d7/f34 [0,4194304] 0 2026-03-09T00:04:12.839 INFO:tasks.workunit.client.0.vm03.stdout:8/742: chown d7/df/l70 4602962 1 2026-03-09T00:04:12.842 INFO:tasks.workunit.client.0.vm03.stdout:8/743: mkdir d7/df/d1a/d40/de0 0 2026-03-09T00:04:12.850 INFO:tasks.workunit.client.0.vm03.stdout:8/744: read d7/df/d1a/f93 [123799,23483] 0 2026-03-09T00:04:12.879 INFO:tasks.workunit.client.1.vm06.stdout:3/944: getdents d11/d28/d2e/d2f/d36/d8f/d12e 0 2026-03-09T00:04:12.941 INFO:tasks.workunit.client.0.vm03.stdout:5/755: dwrite d1c/d20/d55/d4f/d58/db5/f6f [0,4194304] 0 2026-03-09T00:04:12.941 INFO:tasks.workunit.client.0.vm03.stdout:5/756: fdatasync d1c/d20/d55/d4f/d58/db5/f6f 0 2026-03-09T00:04:12.941 INFO:tasks.workunit.client.0.vm03.stdout:5/757: write d1c/d67/fda [512973,48970] 0 2026-03-09T00:04:12.941 INFO:tasks.workunit.client.0.vm03.stdout:5/758: chown d1c/d20/d55/d66/c89 6 1 2026-03-09T00:04:12.943 INFO:tasks.workunit.client.0.vm03.stdout:5/759: mknod d1c/d20/dc0/cfa 0 2026-03-09T00:04:12.947 INFO:tasks.workunit.client.0.vm03.stdout:5/760: write d1c/d20/f33 [1633920,20342] 0 2026-03-09T00:04:12.948 INFO:tasks.workunit.client.0.vm03.stdout:5/761: stat d1c/d20/f25 0 2026-03-09T00:04:12.950 INFO:tasks.workunit.client.0.vm03.stdout:5/762: creat d1c/d20/d97/ded/ffb x:0 0 0 2026-03-09T00:04:12.955 INFO:tasks.workunit.client.0.vm03.stdout:0/758: dwrite d2/da/dd/d49/d6c/d4b/f100 [0,4194304] 0 2026-03-09T00:04:12.955 INFO:tasks.workunit.client.0.vm03.stdout:0/759: fsync 
d2/da/d36/da4/f3f 0 2026-03-09T00:04:12.959 INFO:tasks.workunit.client.0.vm03.stdout:8/745: dwrite d7/df/d1a/d2b/d62/fce [0,4194304] 0 2026-03-09T00:04:12.960 INFO:tasks.workunit.client.0.vm03.stdout:1/832: dwrite d4/d15/d77/f7a [0,4194304] 0 2026-03-09T00:04:12.975 INFO:tasks.workunit.client.0.vm03.stdout:0/760: symlink d2/da/l110 0 2026-03-09T00:04:12.980 INFO:tasks.workunit.client.0.vm03.stdout:8/746: rename d7/df/d1a/d2b/f5c to d7/df/d1a/d40/d58/fe1 0 2026-03-09T00:04:12.989 INFO:tasks.workunit.client.0.vm03.stdout:0/761: rename d2/d5a to d2/d111 0 2026-03-09T00:04:12.989 INFO:tasks.workunit.client.0.vm03.stdout:0/762: dread d2/da/dd/d49/fcb [0,4194304] 0 2026-03-09T00:04:12.991 INFO:tasks.workunit.client.0.vm03.stdout:8/747: mkdir d7/df/d1a/d40/db3/dba/d38/d4c/d98/de2 0 2026-03-09T00:04:12.995 INFO:tasks.workunit.client.0.vm03.stdout:8/748: creat d7/df/d1a/d40/db3/dba/dc3/fe3 x:0 0 0 2026-03-09T00:04:12.995 INFO:tasks.workunit.client.0.vm03.stdout:0/763: link d2/da/dd/f7b d2/da/d76/d8a/d8f/db8/f112 0 2026-03-09T00:04:12.995 INFO:tasks.workunit.client.0.vm03.stdout:0/764: creat d2/da/dd/d49/d6c/d4b/d55/d6f/dad/de8/f113 x:0 0 0 2026-03-09T00:04:12.996 INFO:tasks.workunit.client.0.vm03.stdout:0/765: mknod d2/da/dd/c114 0 2026-03-09T00:04:12.997 INFO:tasks.workunit.client.0.vm03.stdout:0/766: creat d2/da/d36/f115 x:0 0 0 2026-03-09T00:04:13.000 INFO:tasks.workunit.client.0.vm03.stdout:8/749: dread d7/df/d1a/f33 [0,4194304] 0 2026-03-09T00:04:13.001 INFO:tasks.workunit.client.0.vm03.stdout:8/750: symlink d7/df/d1a/d40/db3/dba/dc3/le4 0 2026-03-09T00:04:13.001 INFO:tasks.workunit.client.0.vm03.stdout:8/751: truncate d7/df/d1a/d40/d58/f8c 224878 0 2026-03-09T00:04:13.002 INFO:tasks.workunit.client.0.vm03.stdout:8/752: mkdir d7/df/d1a/d40/d58/de5 0 2026-03-09T00:04:13.006 INFO:tasks.workunit.client.1.vm06.stdout:4/984: dwrite d17/d24/d3b/d75/f9e [0,4194304] 0 2026-03-09T00:04:13.006 INFO:tasks.workunit.client.1.vm06.stdout:4/985: fsync d17/d21/d4c/d66/fa2 0 2026-03-09T00:04:13.008 INFO:tasks.workunit.client.0.vm03.stdout:8/753: mknod d7/df/d1a/d40/db3/dba/d38/d91/ce6 0 2026-03-09T00:04:13.008 INFO:tasks.workunit.client.0.vm03.stdout:1/833: dwrite d4/d15/f17 [0,4194304] 0 2026-03-09T00:04:13.016 INFO:tasks.workunit.client.1.vm06.stdout:8/971: sync 2026-03-09T00:04:13.016 INFO:tasks.workunit.client.1.vm06.stdout:9/861: sync 2026-03-09T00:04:13.016 INFO:tasks.workunit.client.1.vm06.stdout:0/992: sync 2026-03-09T00:04:13.016 INFO:tasks.workunit.client.1.vm06.stdout:0/993: creat d3/d18/d2c/d2d/d31/f153 x:0 0 0 2026-03-09T00:04:13.017 INFO:tasks.workunit.client.1.vm06.stdout:8/972: mkdir db/d1e/d138 0 2026-03-09T00:04:13.028 INFO:tasks.workunit.client.1.vm06.stdout:9/862: creat d1/d3/d4f/d91/d94/f11e x:0 0 0 2026-03-09T00:04:13.028 INFO:tasks.workunit.client.1.vm06.stdout:9/863: chown d1/d4/d6e/d14/d25/d85/l30 131 1 2026-03-09T00:04:13.033 INFO:tasks.workunit.client.1.vm06.stdout:7/943: write d0/df/d1a/d3a/d4e/d5e/f6f [2833496,72350] 0 2026-03-09T00:04:13.034 INFO:tasks.workunit.client.1.vm06.stdout:7/944: creat d0/d55/d99/f119 x:0 0 0 2026-03-09T00:04:13.038 INFO:tasks.workunit.client.0.vm03.stdout:8/754: dread d7/df/f3d [0,4194304] 0 2026-03-09T00:04:13.050 INFO:tasks.workunit.client.1.vm06.stdout:9/864: rename d1/d4/d6e/d14 to d1/d4/d11f 0 2026-03-09T00:04:13.054 INFO:tasks.workunit.client.0.vm03.stdout:3/560: truncate d2/db/f1a 61039 0 2026-03-09T00:04:13.054 INFO:tasks.workunit.client.0.vm03.stdout:3/561: symlink d2/db/d3b/la8 0 2026-03-09T00:04:13.054 
INFO:tasks.workunit.client.0.vm03.stdout:3/562: dread - d2/db/d6a/fa1 zero size 2026-03-09T00:04:13.054 INFO:tasks.workunit.client.1.vm06.stdout:9/865: creat d1/d4/d11f/d25/d85/f120 x:0 0 0 2026-03-09T00:04:13.054 INFO:tasks.workunit.client.1.vm06.stdout:9/866: unlink d1/d4/d11f/d25/f70 0 2026-03-09T00:04:13.054 INFO:tasks.workunit.client.1.vm06.stdout:9/867: chown d1/d3/fa6 4516270 1 2026-03-09T00:04:13.054 INFO:tasks.workunit.client.1.vm06.stdout:9/868: dread - d1/d3/d4f/d91/d94/fcd zero size 2026-03-09T00:04:13.058 INFO:tasks.workunit.client.0.vm03.stdout:8/755: dread d7/df/d1a/d40/db3/dba/d3f/d95/fb4 [0,4194304] 0 2026-03-09T00:04:13.061 INFO:tasks.workunit.client.0.vm03.stdout:8/756: rmdir d7/df/d1a/d40/db3/dba/d38/d91/ddf 0 2026-03-09T00:04:13.065 INFO:tasks.workunit.client.1.vm06.stdout:9/869: write d1/d4/d6e/ffa [408992,72728] 0 2026-03-09T00:04:13.077 INFO:tasks.workunit.client.0.vm03.stdout:8/757: write d7/df/d1a/d40/db3/dba/d38/f3e [1578856,61765] 0 2026-03-09T00:04:13.077 INFO:tasks.workunit.client.0.vm03.stdout:8/758: creat d7/df/d1a/fe7 x:0 0 0 2026-03-09T00:04:13.080 INFO:tasks.workunit.client.1.vm06.stdout:9/870: dread d1/d4/d6e/d9/f82 [0,4194304] 0 2026-03-09T00:04:13.081 INFO:tasks.workunit.client.1.vm06.stdout:9/871: creat d1/d3/d4f/d91/de8/f121 x:0 0 0 2026-03-09T00:04:13.082 INFO:tasks.workunit.client.1.vm06.stdout:9/872: symlink d1/d3/d4f/d91/d94/ddf/l122 0 2026-03-09T00:04:13.087 INFO:tasks.workunit.client.1.vm06.stdout:9/873: mknod d1/d3/d4f/d91/de8/c123 0 2026-03-09T00:04:13.087 INFO:tasks.workunit.client.1.vm06.stdout:9/874: creat d1/d4/d11f/d25/f124 x:0 0 0 2026-03-09T00:04:13.087 INFO:tasks.workunit.client.1.vm06.stdout:9/875: creat d1/d3/d4f/d91/dae/f125 x:0 0 0 2026-03-09T00:04:13.087 INFO:tasks.workunit.client.1.vm06.stdout:9/876: chown d1/d3/f11 839781126 1 2026-03-09T00:04:13.088 INFO:tasks.workunit.client.1.vm06.stdout:1/838: sync 2026-03-09T00:04:13.088 INFO:tasks.workunit.client.1.vm06.stdout:1/839: chown d6/l3f 122645519 1 2026-03-09T00:04:13.088 INFO:tasks.workunit.client.1.vm06.stdout:6/930: sync 2026-03-09T00:04:13.088 INFO:tasks.workunit.client.1.vm06.stdout:6/931: read d4/d16/d53/ddf/d7e/dac/fe1 [237415,94172] 0 2026-03-09T00:04:13.088 INFO:tasks.workunit.client.1.vm06.stdout:1/840: creat d6/d8f/f117 x:0 0 0 2026-03-09T00:04:13.089 INFO:tasks.workunit.client.0.vm03.stdout:5/763: dwrite d1c/d20/d55/d43/f53 [0,4194304] 0 2026-03-09T00:04:13.093 INFO:tasks.workunit.client.0.vm03.stdout:5/764: symlink d1c/d20/d56/lfc 0 2026-03-09T00:04:13.093 INFO:tasks.workunit.client.0.vm03.stdout:5/765: write d1c/d20/d55/ff6 [1028283,75886] 0 2026-03-09T00:04:13.094 INFO:tasks.workunit.client.1.vm06.stdout:6/932: mknod d4/d16/c11e 0 2026-03-09T00:04:13.094 INFO:tasks.workunit.client.1.vm06.stdout:6/933: getdents d4/d16/d53/ddf/d52/d110 0 2026-03-09T00:04:13.096 INFO:tasks.workunit.client.1.vm06.stdout:1/841: rmdir d6/d4c/d79/d10c 39 2026-03-09T00:04:13.097 INFO:tasks.workunit.client.0.vm03.stdout:5/766: truncate d1c/d20/d55/d4f/d58/d73/d9e/fe4 2333748 0 2026-03-09T00:04:13.098 INFO:tasks.workunit.client.1.vm06.stdout:6/934: mknod d4/d16/d53/ddf/d7e/dac/dcd/c11f 0 2026-03-09T00:04:13.098 INFO:tasks.workunit.client.1.vm06.stdout:1/842: mkdir d6/d21/def/d118 0 2026-03-09T00:04:13.100 INFO:tasks.workunit.client.0.vm03.stdout:5/767: mknod d1c/d67/cfd 0 2026-03-09T00:04:13.101 INFO:tasks.workunit.client.1.vm06.stdout:1/843: creat d6/d8f/d10f/f119 x:0 0 0 2026-03-09T00:04:13.101 INFO:tasks.workunit.client.1.vm06.stdout:6/935: truncate d4/d16/d53/ddf/d4b/fba 3863793 0 
2026-03-09T00:04:13.101 INFO:tasks.workunit.client.1.vm06.stdout:1/844: unlink d6/d63/f82 0 2026-03-09T00:04:13.102 INFO:tasks.workunit.client.1.vm06.stdout:6/936: creat d4/d27/d3e/d45/f120 x:0 0 0 2026-03-09T00:04:13.102 INFO:tasks.workunit.client.1.vm06.stdout:6/937: fsync d4/d16/d53/ddf/da6/dbb/fed 0 2026-03-09T00:04:13.103 INFO:tasks.workunit.client.1.vm06.stdout:1/845: symlink d6/d21/d2d/d3b/d42/l11a 0 2026-03-09T00:04:13.108 INFO:tasks.workunit.client.0.vm03.stdout:5/768: write d1c/d20/f39 [2433711,90041] 0 2026-03-09T00:04:13.109 INFO:tasks.workunit.client.1.vm06.stdout:1/846: rename d6/d4c/d71/d83/f9b to d6/d21/f11b 0 2026-03-09T00:04:13.109 INFO:tasks.workunit.client.1.vm06.stdout:1/847: fdatasync d6/d21/f11b 0 2026-03-09T00:04:13.109 INFO:tasks.workunit.client.1.vm06.stdout:6/938: rmdir d4/d16/d53/ddf/d52 39 2026-03-09T00:04:13.109 INFO:tasks.workunit.client.1.vm06.stdout:6/939: fsync d4/d27/d3e/f55 0 2026-03-09T00:04:13.109 INFO:tasks.workunit.client.1.vm06.stdout:6/940: readlink d4/le 0 2026-03-09T00:04:13.111 INFO:tasks.workunit.client.0.vm03.stdout:5/769: symlink d1c/d67/lfe 0 2026-03-09T00:04:13.113 INFO:tasks.workunit.client.1.vm06.stdout:1/848: link d6/d8f/f103 d6/d21/def/f11c 0 2026-03-09T00:04:13.113 INFO:tasks.workunit.client.1.vm06.stdout:1/849: truncate d6/d4c/d79/d10c/f112 786158 0 2026-03-09T00:04:13.113 INFO:tasks.workunit.client.0.vm03.stdout:5/770: dread d1c/f96 [0,4194304] 0 2026-03-09T00:04:13.113 INFO:tasks.workunit.client.0.vm03.stdout:5/771: truncate d1c/d20/d55/f3d 4539062 0 2026-03-09T00:04:13.113 INFO:tasks.workunit.client.1.vm06.stdout:6/941: rename d4/d16/d53/df2/d11c to d4/d16/d121 0 2026-03-09T00:04:13.113 INFO:tasks.workunit.client.1.vm06.stdout:6/942: readlink d4/d16/d53/ddf/d7e/dac/dd3/d101/lfc 0 2026-03-09T00:04:13.126 INFO:tasks.workunit.client.1.vm06.stdout:1/850: mkdir d6/d4c/d79/d10c/d11d 0 2026-03-09T00:04:13.126 INFO:tasks.workunit.client.0.vm03.stdout:5/772: chown f18 1195076452 1 2026-03-09T00:04:13.132 INFO:tasks.workunit.client.0.vm03.stdout:1/834: dwrite d4/d15/d1a/f1b [0,4194304] 0 2026-03-09T00:04:13.132 INFO:tasks.workunit.client.0.vm03.stdout:1/835: stat d4/d3a/d32/dc2 0 2026-03-09T00:04:13.136 INFO:tasks.workunit.client.0.vm03.stdout:5/773: write d1c/d20/d55/f9b [1364276,27051] 0 2026-03-09T00:04:13.137 INFO:tasks.workunit.client.0.vm03.stdout:5/774: write d1c/d20/d55/d4f/d58/db5/f3c [479579,35336] 0 2026-03-09T00:04:13.139 INFO:tasks.workunit.client.0.vm03.stdout:5/775: write d1c/d20/d55/d4f/d58/d5d/fb6 [163248,25093] 0 2026-03-09T00:04:13.139 INFO:tasks.workunit.client.0.vm03.stdout:5/776: fsync d1c/f1f 0 2026-03-09T00:04:13.158 INFO:tasks.workunit.client.1.vm06.stdout:1/851: dread d6/d21/d2d/fe9 [0,4194304] 0 2026-03-09T00:04:13.166 INFO:tasks.workunit.client.1.vm06.stdout:1/852: creat d6/d21/dfc/de8/f11e x:0 0 0 2026-03-09T00:04:13.166 INFO:tasks.workunit.client.1.vm06.stdout:1/853: fdatasync d6/d4c/f8e 0 2026-03-09T00:04:13.166 INFO:tasks.workunit.client.1.vm06.stdout:1/854: stat d6/d21/d2d/d37/d6d/dd7/ff6 0 2026-03-09T00:04:13.166 INFO:tasks.workunit.client.1.vm06.stdout:1/855: creat d6/d21/def/d118/f11f x:0 0 0 2026-03-09T00:04:13.166 INFO:tasks.workunit.client.1.vm06.stdout:1/856: write d6/d21/d2d/fe9 [1469055,39609] 0 2026-03-09T00:04:13.166 INFO:tasks.workunit.client.1.vm06.stdout:1/857: symlink d6/d21/def/l120 0 2026-03-09T00:04:13.166 INFO:tasks.workunit.client.1.vm06.stdout:1/858: truncate d6/d21/d2d/d3b/d87/f8d 4033731 0 2026-03-09T00:04:13.166 INFO:tasks.workunit.client.1.vm06.stdout:1/859: mkdir d6/d8f/d10f/d121 0 
2026-03-09T00:04:13.166 INFO:tasks.workunit.client.1.vm06.stdout:1/860: creat d6/d21/d2d/d37/d6d/dd7/f122 x:0 0 0 2026-03-09T00:04:13.166 INFO:tasks.workunit.client.1.vm06.stdout:1/861: stat d6/f34 0 2026-03-09T00:04:13.166 INFO:tasks.workunit.client.1.vm06.stdout:1/862: readlink d6/d21/d2d/d3b/lb7 0 2026-03-09T00:04:13.166 INFO:tasks.workunit.client.1.vm06.stdout:1/863: link d6/d21/d2d/d3b/d87/d9d/dd8/fe6 d6/d21/d2d/d3b/dc9/f123 0 2026-03-09T00:04:13.167 INFO:tasks.workunit.client.1.vm06.stdout:1/864: mknod d6/d21/d2d/d3b/d42/df0/c124 0 2026-03-09T00:04:13.169 INFO:tasks.workunit.client.1.vm06.stdout:1/865: creat d6/d21/d2d/d37/d6d/f125 x:0 0 0 2026-03-09T00:04:13.170 INFO:tasks.workunit.client.1.vm06.stdout:1/866: mknod d6/dc4/c126 0 2026-03-09T00:04:13.170 INFO:tasks.workunit.client.1.vm06.stdout:1/867: chown d6/d8f/f103 0 1 2026-03-09T00:04:13.170 INFO:tasks.workunit.client.1.vm06.stdout:1/868: readlink d6/d4c/d79/l115 0 2026-03-09T00:04:13.171 INFO:tasks.workunit.client.1.vm06.stdout:1/869: write d6/d4c/fe1 [552333,118833] 0 2026-03-09T00:04:13.171 INFO:tasks.workunit.client.1.vm06.stdout:1/870: stat l5 0 2026-03-09T00:04:13.171 INFO:tasks.workunit.client.1.vm06.stdout:1/871: write d6/d21/d2d/d37/f8b [2718614,91793] 0 2026-03-09T00:04:13.173 INFO:tasks.workunit.client.1.vm06.stdout:1/872: unlink d6/d21/dfc/de8/f110 0 2026-03-09T00:04:13.214 INFO:tasks.workunit.client.1.vm06.stdout:9/877: dwrite d1/d4/d11f/d25/d85/d49/fec [0,4194304] 0 2026-03-09T00:04:13.214 INFO:tasks.workunit.client.1.vm06.stdout:9/878: fsync d1/d4/f39 0 2026-03-09T00:04:13.214 INFO:tasks.workunit.client.1.vm06.stdout:4/986: dwrite d17/d21/d4c/d50/f8c [0,4194304] 0 2026-03-09T00:04:13.215 INFO:tasks.workunit.client.1.vm06.stdout:9/879: mkdir d1/d3/d4f/d91/dae/de9/d126 0 2026-03-09T00:04:13.215 INFO:tasks.workunit.client.1.vm06.stdout:9/880: dread - d1/d3/d4f/d91/ff7 zero size 2026-03-09T00:04:13.226 INFO:tasks.workunit.client.0.vm03.stdout:2/779: dwrite d8/d1b/d2a/fbb [4194304,4194304] 0 2026-03-09T00:04:13.226 INFO:tasks.workunit.client.0.vm03.stdout:2/780: fdatasync d8/d26/d5e/d6f/d97/f75 0 2026-03-09T00:04:13.228 INFO:tasks.workunit.client.0.vm03.stdout:2/781: creat d8/d26/dfc/fff x:0 0 0 2026-03-09T00:04:13.231 INFO:tasks.workunit.client.0.vm03.stdout:2/782: getdents d8/d26/d5e/d5f/d95 0 2026-03-09T00:04:13.235 INFO:tasks.workunit.client.0.vm03.stdout:2/783: mknod d8/d26/d5e/d5f/ded/c100 0 2026-03-09T00:04:13.235 INFO:tasks.workunit.client.0.vm03.stdout:2/784: symlink d8/d1b/d2a/d6b/dc6/l101 0 2026-03-09T00:04:13.235 INFO:tasks.workunit.client.0.vm03.stdout:0/767: write d2/da/d36/ff6 [792125,69206] 0 2026-03-09T00:04:13.240 INFO:tasks.workunit.client.0.vm03.stdout:2/785: dread d8/d1b/d24/f2f [0,4194304] 0 2026-03-09T00:04:13.241 INFO:tasks.workunit.client.1.vm06.stdout:1/873: dread d6/d21/d2d/d37/d6d/dd7/ff6 [0,4194304] 0 2026-03-09T00:04:13.241 INFO:tasks.workunit.client.1.vm06.stdout:1/874: write d6/d4c/feb [950706,110624] 0 2026-03-09T00:04:13.241 INFO:tasks.workunit.client.1.vm06.stdout:1/875: write d6/d8f/f101 [403397,90652] 0 2026-03-09T00:04:13.242 INFO:tasks.workunit.client.0.vm03.stdout:0/768: mkdir d2/da/d76/d8a/d116 0 2026-03-09T00:04:13.242 INFO:tasks.workunit.client.0.vm03.stdout:0/769: fdatasync d2/da/dd/f11 0 2026-03-09T00:04:13.246 INFO:tasks.workunit.client.1.vm06.stdout:8/973: write db/d53/d5c/f6f [1256833,18569] 0 2026-03-09T00:04:13.246 INFO:tasks.workunit.client.0.vm03.stdout:2/786: mknod d8/d1b/d6c/c102 0 2026-03-09T00:04:13.247 INFO:tasks.workunit.client.0.vm03.stdout:1/836: dread 
d4/d15/dae/d101/f108 [0,4194304] 0 2026-03-09T00:04:13.247 INFO:tasks.workunit.client.0.vm03.stdout:0/770: rename d2/da/dd/d49/d6c/d4b/d55/d6f/lec to d2/da/dd/d49/l117 0 2026-03-09T00:04:13.247 INFO:tasks.workunit.client.0.vm03.stdout:0/771: write d2/da/dd/d49/d6c/d4b/d55/d6f/dad/de8/f113 [384197,14077] 0 2026-03-09T00:04:13.248 INFO:tasks.workunit.client.0.vm03.stdout:2/787: mkdir d8/d1b/d2a/d6b/d50/d103 0 2026-03-09T00:04:13.248 INFO:tasks.workunit.client.0.vm03.stdout:2/788: readlink d8/d1b/d2a/d6b/le3 0 2026-03-09T00:04:13.249 INFO:tasks.workunit.client.0.vm03.stdout:0/772: symlink d2/da/dd/d49/d6c/da6/l118 0 2026-03-09T00:04:13.253 INFO:tasks.workunit.client.0.vm03.stdout:2/789: rmdir d8/d1b/d8f/dfa 0 2026-03-09T00:04:13.253 INFO:tasks.workunit.client.0.vm03.stdout:2/790: mkdir d8/d26/d5e/dc5/d104 0 2026-03-09T00:04:13.253 INFO:tasks.workunit.client.1.vm06.stdout:3/945: dwrite d11/d28/d4d/d89/fbe [0,4194304] 0 2026-03-09T00:04:13.253 INFO:tasks.workunit.client.1.vm06.stdout:3/946: chown d11/d28/f141 27 1 2026-03-09T00:04:13.253 INFO:tasks.workunit.client.1.vm06.stdout:3/947: chown c0 735 1 2026-03-09T00:04:13.253 INFO:tasks.workunit.client.1.vm06.stdout:3/948: dread - d11/d28/d2e/d2f/d5b/db5/f130 zero size 2026-03-09T00:04:13.255 INFO:tasks.workunit.client.1.vm06.stdout:9/881: dread d1/d3/d50/fba [0,4194304] 0 2026-03-09T00:04:13.255 INFO:tasks.workunit.client.0.vm03.stdout:3/563: dwrite d2/db/d3b/d5f/d65/f90 [0,4194304] 0 2026-03-09T00:04:13.255 INFO:tasks.workunit.client.0.vm03.stdout:3/564: fdatasync d2/db/d2d/f45 0 2026-03-09T00:04:13.258 INFO:tasks.workunit.client.1.vm06.stdout:6/943: dwrite d4/ff [0,4194304] 0 2026-03-09T00:04:13.258 INFO:tasks.workunit.client.1.vm06.stdout:6/944: creat d4/d16/d53/ddf/d4b/ddb/f122 x:0 0 0 2026-03-09T00:04:13.258 INFO:tasks.workunit.client.1.vm06.stdout:6/945: fdatasync d4/d16/f33 0 2026-03-09T00:04:13.258 INFO:tasks.workunit.client.1.vm06.stdout:6/946: readlink d4/d16/d53/l5b 0 2026-03-09T00:04:13.258 INFO:tasks.workunit.client.1.vm06.stdout:6/947: chown d4/f26 225 1 2026-03-09T00:04:13.260 INFO:tasks.workunit.client.0.vm03.stdout:0/773: dread d2/da/d76/d8a/d8f/db8/f112 [0,4194304] 0 2026-03-09T00:04:13.262 INFO:tasks.workunit.client.1.vm06.stdout:3/949: mknod d11/d3f/d129/c142 0 2026-03-09T00:04:13.265 INFO:tasks.workunit.client.1.vm06.stdout:9/882: mkdir d1/d3/d4f/d91/dae/de9/d113/d127 0 2026-03-09T00:04:13.265 INFO:tasks.workunit.client.1.vm06.stdout:9/883: stat d1/d3/d4f/d52/de3 0 2026-03-09T00:04:13.265 INFO:tasks.workunit.client.1.vm06.stdout:9/884: chown d1/d3/f9b 0 1 2026-03-09T00:04:13.265 INFO:tasks.workunit.client.1.vm06.stdout:9/885: write d1/d3/d4f/d91/fc4 [269297,33359] 0 2026-03-09T00:04:13.265 INFO:tasks.workunit.client.0.vm03.stdout:3/565: mkdir d2/db/d2d/d55/da9 0 2026-03-09T00:04:13.265 INFO:tasks.workunit.client.0.vm03.stdout:0/774: read d2/da/d1a/f56 [1314307,123263] 0 2026-03-09T00:04:13.267 INFO:tasks.workunit.client.1.vm06.stdout:0/994: dwrite d3/d18/d1f/d44/f58 [4194304,4194304] 0 2026-03-09T00:04:13.267 INFO:tasks.workunit.client.1.vm06.stdout:0/995: creat d3/d10f/f154 x:0 0 0 2026-03-09T00:04:13.268 INFO:tasks.workunit.client.0.vm03.stdout:2/791: read d8/d1b/d2a/d2e/df5/ff9 [83969,71735] 0 2026-03-09T00:04:13.270 INFO:tasks.workunit.client.0.vm03.stdout:3/566: dread d2/db/d40/d51/f5a [0,4194304] 0 2026-03-09T00:04:13.272 INFO:tasks.workunit.client.0.vm03.stdout:0/775: creat d2/da/dd/d49/d6c/da6/f119 x:0 0 0 2026-03-09T00:04:13.273 INFO:tasks.workunit.client.1.vm06.stdout:3/950: symlink d11/d3f/d129/l143 0 
2026-03-09T00:04:13.273 INFO:tasks.workunit.client.0.vm03.stdout:2/792: mkdir d8/d1b/d24/da5/dfe/d105 0 2026-03-09T00:04:13.274 INFO:tasks.workunit.client.1.vm06.stdout:9/886: symlink d1/d3/d4f/l128 0 2026-03-09T00:04:13.274 INFO:tasks.workunit.client.1.vm06.stdout:9/887: stat d1/d3/c115 0 2026-03-09T00:04:13.279 INFO:tasks.workunit.client.1.vm06.stdout:3/951: dread d11/d28/d2e/d2f/f3e [0,4194304] 0 2026-03-09T00:04:13.283 INFO:tasks.workunit.client.0.vm03.stdout:0/776: dread d2/da/d1a/fc4 [0,4194304] 0 2026-03-09T00:04:13.283 INFO:tasks.workunit.client.0.vm03.stdout:0/777: fdatasync d2/da/dd/d49/fa9 0 2026-03-09T00:04:13.286 INFO:tasks.workunit.client.0.vm03.stdout:3/567: link d2/f8a d2/db/d40/d88/faa 0 2026-03-09T00:04:13.289 INFO:tasks.workunit.client.0.vm03.stdout:0/778: getdents d2/da 0 2026-03-09T00:04:13.290 INFO:tasks.workunit.client.1.vm06.stdout:6/948: rmdir d4/d27/d3e/d78 39 2026-03-09T00:04:13.290 INFO:tasks.workunit.client.1.vm06.stdout:6/949: chown d4/d8d 1771 1 2026-03-09T00:04:13.290 INFO:tasks.workunit.client.0.vm03.stdout:3/568: mknod d2/db/d40/d44/d68/d99/cab 0 2026-03-09T00:04:13.291 INFO:tasks.workunit.client.1.vm06.stdout:9/888: creat d1/d3/d4f/d52/f129 x:0 0 0 2026-03-09T00:04:13.292 INFO:tasks.workunit.client.1.vm06.stdout:3/952: rename d11/d28/d2e/d2f/d5b/d5f/c98 to d11/d28/d2e/d2f/d5b/d94/ddd/c144 0 2026-03-09T00:04:13.292 INFO:tasks.workunit.client.0.vm03.stdout:3/569: symlink d2/db/d40/d44/d68/lac 0 2026-03-09T00:04:13.295 INFO:tasks.workunit.client.0.vm03.stdout:0/779: symlink d2/da/d36/da4/l11a 0 2026-03-09T00:04:13.295 INFO:tasks.workunit.client.1.vm06.stdout:9/889: truncate d1/d3/ddc/fde 3846949 0 2026-03-09T00:04:13.300 INFO:tasks.workunit.client.0.vm03.stdout:3/570: readlink d2/db/l1e 0 2026-03-09T00:04:13.302 INFO:tasks.workunit.client.0.vm03.stdout:3/571: readlink d2/db/d3b/l91 0 2026-03-09T00:04:13.302 INFO:tasks.workunit.client.0.vm03.stdout:3/572: truncate d2/db/d6a/fa1 1043639 0 2026-03-09T00:04:13.302 INFO:tasks.workunit.client.0.vm03.stdout:3/573: readlink d2/db/d2d/l2e 0 2026-03-09T00:04:13.302 INFO:tasks.workunit.client.0.vm03.stdout:3/574: chown d2/db/d2d/c3c 3 1 2026-03-09T00:04:13.302 INFO:tasks.workunit.client.1.vm06.stdout:3/953: read d11/d28/d2e/db2/d100/f103 [3510246,39373] 0 2026-03-09T00:04:13.302 INFO:tasks.workunit.client.1.vm06.stdout:3/954: getdents d11/d28/d2e/d2f/d36/d8f/d12e 0 2026-03-09T00:04:13.302 INFO:tasks.workunit.client.1.vm06.stdout:3/955: chown d11/f12 442937909 1 2026-03-09T00:04:13.303 INFO:tasks.workunit.client.1.vm06.stdout:3/956: mkdir d11/d28/d2e/d2f/d5b/d94/ddd/d11b/d145 0 2026-03-09T00:04:13.306 INFO:tasks.workunit.client.0.vm03.stdout:3/575: dread d2/db/d40/d51/f5c [0,4194304] 0 2026-03-09T00:04:13.308 INFO:tasks.workunit.client.1.vm06.stdout:8/974: dwrite db/d74/d87/fca [0,4194304] 0 2026-03-09T00:04:13.308 INFO:tasks.workunit.client.0.vm03.stdout:0/780: mknod d2/da/d36/da4/c11b 0 2026-03-09T00:04:13.310 INFO:tasks.workunit.client.1.vm06.stdout:6/950: rename d4/d16/d46/lc0 to d4/d16/d121/l123 0 2026-03-09T00:04:13.314 INFO:tasks.workunit.client.1.vm06.stdout:9/890: dread d1/d73/f8f [0,4194304] 0 2026-03-09T00:04:13.314 INFO:tasks.workunit.client.1.vm06.stdout:9/891: chown d1/d3/c115 113 1 2026-03-09T00:04:13.314 INFO:tasks.workunit.client.1.vm06.stdout:9/892: chown d1/d4/d2f/cf8 558 1 2026-03-09T00:04:13.314 INFO:tasks.workunit.client.1.vm06.stdout:9/893: stat d1/d4/d2f/cb7 0 2026-03-09T00:04:13.315 INFO:tasks.workunit.client.0.vm03.stdout:0/781: unlink d2/da/dd/d49/c8c 0 2026-03-09T00:04:13.316 
INFO:tasks.workunit.client.1.vm06.stdout:8/975: creat db/dd/d24/dac/d131/f139 x:0 0 0 2026-03-09T00:04:13.317 INFO:tasks.workunit.client.0.vm03.stdout:3/576: link d2/db/c29 d2/db/d40/d88/cad 0 2026-03-09T00:04:13.317 INFO:tasks.workunit.client.0.vm03.stdout:3/577: fdatasync d2/db/f3a 0 2026-03-09T00:04:13.319 INFO:tasks.workunit.client.0.vm03.stdout:0/782: link d2/da/d36/da4/ce3 d2/da/dd/d49/d6c/da6/dda/db5/dba/dff/c11c 0 2026-03-09T00:04:13.319 INFO:tasks.workunit.client.0.vm03.stdout:0/783: fdatasync d2/da/dd/d49/d6c/da6/f119 0 2026-03-09T00:04:13.319 INFO:tasks.workunit.client.0.vm03.stdout:0/784: chown d2/da/d4e/f9c 70245675 1 2026-03-09T00:04:13.322 INFO:tasks.workunit.client.0.vm03.stdout:3/578: creat d2/db/d2d/fae x:0 0 0 2026-03-09T00:04:13.322 INFO:tasks.workunit.client.0.vm03.stdout:3/579: truncate d2/db/d40/d88/faa 903 0 2026-03-09T00:04:13.344 INFO:tasks.workunit.client.0.vm03.stdout:8/759: dwrite d7/df/d1a/d40/f69 [0,4194304] 0 2026-03-09T00:04:13.348 INFO:tasks.workunit.client.0.vm03.stdout:8/760: dread d7/df/d1a/d40/db3/dba/d38/d91/fbf [0,4194304] 0 2026-03-09T00:04:13.348 INFO:tasks.workunit.client.0.vm03.stdout:7/680: sync 2026-03-09T00:04:13.348 INFO:tasks.workunit.client.0.vm03.stdout:6/708: sync 2026-03-09T00:04:13.348 INFO:tasks.workunit.client.0.vm03.stdout:4/850: sync 2026-03-09T00:04:13.348 INFO:tasks.workunit.client.0.vm03.stdout:4/851: chown d7/d20/d6a/dea/d54/d58/f6b 952318 1 2026-03-09T00:04:13.348 INFO:tasks.workunit.client.0.vm03.stdout:9/695: sync 2026-03-09T00:04:13.348 INFO:tasks.workunit.client.0.vm03.stdout:9/696: fdatasync d15/d1c/d21/d54/f80 0 2026-03-09T00:04:13.354 INFO:tasks.workunit.client.0.vm03.stdout:8/761: rename d7/df/d1a/d2b/d62/faa to d7/df/d1a/d40/d58/fe8 0 2026-03-09T00:04:13.356 INFO:tasks.workunit.client.0.vm03.stdout:7/681: truncate d2/d1f/d3a/d24/da4/d46/d81/f8f 1577432 0 2026-03-09T00:04:13.356 INFO:tasks.workunit.client.0.vm03.stdout:7/682: fdatasync d2/d1f/d3a/f1a 0 2026-03-09T00:04:13.361 INFO:tasks.workunit.client.0.vm03.stdout:4/852: unlink d7/de6/cf4 0 2026-03-09T00:04:13.379 INFO:tasks.workunit.client.0.vm03.stdout:9/697: creat d15/d1c/d21/d75/de0/fe7 x:0 0 0 2026-03-09T00:04:13.379 INFO:tasks.workunit.client.0.vm03.stdout:8/762: symlink d7/df/d1a/d40/db3/dba/dad/le9 0 2026-03-09T00:04:13.382 INFO:tasks.workunit.client.0.vm03.stdout:7/683: dread d2/d1f/d35/f3e [0,4194304] 0 2026-03-09T00:04:13.383 INFO:tasks.workunit.client.0.vm03.stdout:7/684: chown d2/d1f/d3a/d24/da4/d46/d81/d96/d8e 12 1 2026-03-09T00:04:13.384 INFO:tasks.workunit.client.1.vm06.stdout:4/987: dwrite d17/d21/f38 [0,4194304] 0 2026-03-09T00:04:13.385 INFO:tasks.workunit.client.0.vm03.stdout:3/580: write d2/db/d2d/f45 [2387935,62689] 0 2026-03-09T00:04:13.388 INFO:tasks.workunit.client.1.vm06.stdout:8/976: dread db/d74/d78/f93 [0,4194304] 0 2026-03-09T00:04:13.388 INFO:tasks.workunit.client.1.vm06.stdout:8/977: dread - db/d53/d70/d38/d4d/f11f zero size 2026-03-09T00:04:13.391 INFO:tasks.workunit.client.0.vm03.stdout:9/698: dread d15/d1c/d36/d4d/f5d [0,4194304] 0 2026-03-09T00:04:13.398 INFO:tasks.workunit.client.1.vm06.stdout:4/988: symlink d17/d24/d49/de4/l153 0 2026-03-09T00:04:13.404 INFO:tasks.workunit.client.0.vm03.stdout:7/685: creat d2/d4/d8c/fce x:0 0 0 2026-03-09T00:04:13.409 INFO:tasks.workunit.client.0.vm03.stdout:7/686: fdatasync d2/d1f/d3a/f19 0 2026-03-09T00:04:13.412 INFO:tasks.workunit.client.0.vm03.stdout:7/687: dread d2/d4/d1e/f97 [0,4194304] 0 2026-03-09T00:04:13.412 INFO:tasks.workunit.client.0.vm03.stdout:7/688: fdatasync 
d2/d1f/d3a/d24/da4/d46/d81/d96/f44 0 2026-03-09T00:04:13.412 INFO:tasks.workunit.client.0.vm03.stdout:6/709: rename d13/d35/d4c to d13/d1e/d44/d59/dec 0 2026-03-09T00:04:13.412 INFO:tasks.workunit.client.1.vm06.stdout:0/996: dwrite d3/f1e [0,4194304] 0 2026-03-09T00:04:13.413 INFO:tasks.workunit.client.1.vm06.stdout:0/997: fsync d3/d18/d1f/d39/d3b/df9/df2/fec 0 2026-03-09T00:04:13.413 INFO:tasks.workunit.client.1.vm06.stdout:0/998: truncate d3/d10f/f154 805210 0 2026-03-09T00:04:13.413 INFO:tasks.workunit.client.1.vm06.stdout:0/999: dread - d3/d18/de9/f10c zero size 2026-03-09T00:04:13.413 INFO:tasks.workunit.client.1.vm06.stdout:8/978: symlink db/d74/d78/d98/db6/dc7/d101/db7/l13a 0 2026-03-09T00:04:13.413 INFO:tasks.workunit.client.1.vm06.stdout:8/979: write db/d1e/f34 [259110,13164] 0 2026-03-09T00:04:13.413 INFO:tasks.workunit.client.1.vm06.stdout:8/980: creat db/dd/d24/dac/d131/f13b x:0 0 0 2026-03-09T00:04:13.416 INFO:tasks.workunit.client.0.vm03.stdout:3/581: getdents d2/db/d40/d44 0 2026-03-09T00:04:13.420 INFO:tasks.workunit.client.0.vm03.stdout:3/582: write d2/db/d3b/f95 [353129,116857] 0 2026-03-09T00:04:13.424 INFO:tasks.workunit.client.0.vm03.stdout:2/793: dwrite d8/d1b/d6c/f7b [0,4194304] 0 2026-03-09T00:04:13.425 INFO:tasks.workunit.client.0.vm03.stdout:3/583: dread d2/f9 [0,4194304] 0 2026-03-09T00:04:13.428 INFO:tasks.workunit.client.0.vm03.stdout:7/689: rmdir d2/d1f/d3a 39 2026-03-09T00:04:13.437 INFO:tasks.workunit.client.0.vm03.stdout:7/690: chown d2/d1f/d3a/d24/da4/d46/d54/c55 37171433 1 2026-03-09T00:04:13.437 INFO:tasks.workunit.client.1.vm06.stdout:8/981: creat db/d74/d87/d100/f13c x:0 0 0 2026-03-09T00:04:13.437 INFO:tasks.workunit.client.0.vm03.stdout:7/691: write d2/d1f/d3a/f19 [3770574,20891] 0 2026-03-09T00:04:13.442 INFO:tasks.workunit.client.0.vm03.stdout:0/785: dwrite d2/f59 [4194304,4194304] 0 2026-03-09T00:04:13.444 INFO:tasks.workunit.client.0.vm03.stdout:2/794: rename d8/d1b/d2a/d6b/d50/f63 to d8/d74/f106 0 2026-03-09T00:04:13.444 INFO:tasks.workunit.client.0.vm03.stdout:2/795: creat d8/d26/dfc/f107 x:0 0 0 2026-03-09T00:04:13.444 INFO:tasks.workunit.client.0.vm03.stdout:2/796: write d8/d26/d5e/d6f/d97/f1c [67713,19193] 0 2026-03-09T00:04:13.445 INFO:tasks.workunit.client.0.vm03.stdout:3/584: mkdir d2/db/d3b/d3f/daf 0 2026-03-09T00:04:13.452 INFO:tasks.workunit.client.1.vm06.stdout:6/951: dwrite d4/d16/d53/ddf/fb8 [0,4194304] 0 2026-03-09T00:04:13.452 INFO:tasks.workunit.client.1.vm06.stdout:6/952: read d4/f6e [704187,111614] 0 2026-03-09T00:04:13.452 INFO:tasks.workunit.client.1.vm06.stdout:4/989: rmdir d17/d21/d4c/d50 39 2026-03-09T00:04:13.454 INFO:tasks.workunit.client.1.vm06.stdout:3/957: dwrite d11/d28/d2e/d2f/d5b/d5f/f60 [0,4194304] 0 2026-03-09T00:04:13.455 INFO:tasks.workunit.client.1.vm06.stdout:8/982: creat db/dd/f13d x:0 0 0 2026-03-09T00:04:13.455 INFO:tasks.workunit.client.1.vm06.stdout:8/983: fdatasync db/d1e/f50 0 2026-03-09T00:04:13.455 INFO:tasks.workunit.client.1.vm06.stdout:8/984: readlink db/d74/d78/d98/db6/dc7/d101/lf3 0 2026-03-09T00:04:13.455 INFO:tasks.workunit.client.1.vm06.stdout:8/985: read db/d53/d70/d38/f5b [3330704,27980] 0 2026-03-09T00:04:13.459 INFO:tasks.workunit.client.1.vm06.stdout:4/990: mknod d17/d134/d143/c154 0 2026-03-09T00:04:13.464 INFO:tasks.workunit.client.1.vm06.stdout:3/958: mknod d11/d28/d2e/c146 0 2026-03-09T00:04:13.475 INFO:tasks.workunit.client.1.vm06.stdout:3/959: write d11/d28/d2e/db2/d100/f103 [1656508,125638] 0 2026-03-09T00:04:13.476 INFO:tasks.workunit.client.0.vm03.stdout:0/786: rename 
d2/da/d36/da4/ce3 to d2/da/dd/d49/c11d 0 2026-03-09T00:04:13.476 INFO:tasks.workunit.client.1.vm06.stdout:8/986: symlink db/dd/d84/l13e 0 2026-03-09T00:04:13.476 INFO:tasks.workunit.client.1.vm06.stdout:4/991: link d17/d24/d49/f5a d17/f155 0 2026-03-09T00:04:13.477 INFO:tasks.workunit.client.0.vm03.stdout:3/585: creat d2/db/d3b/d5f/da5/d72/fb0 x:0 0 0 2026-03-09T00:04:13.478 INFO:tasks.workunit.client.0.vm03.stdout:3/586: chown d2/db/f28 97 1 2026-03-09T00:04:13.478 INFO:tasks.workunit.client.0.vm03.stdout:3/587: truncate d2/db/d3b/f95 1124657 0 2026-03-09T00:04:13.481 INFO:tasks.workunit.client.1.vm06.stdout:4/992: dread d17/d24/d49/f62 [0,4194304] 0 2026-03-09T00:04:13.481 INFO:tasks.workunit.client.0.vm03.stdout:7/692: dread d2/f3 [0,4194304] 0 2026-03-09T00:04:13.484 INFO:tasks.workunit.client.0.vm03.stdout:8/763: dwrite d7/df/d1a/d40/db3/dba/d3f/f90 [0,4194304] 0 2026-03-09T00:04:13.491 INFO:tasks.workunit.client.0.vm03.stdout:3/588: mkdir d2/db/d40/d51/da2/db1 0 2026-03-09T00:04:13.503 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:13 vm03.local ceph-mon[52346]: pgmap v9: 65 pgs: 65 active+clean; 3.5 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 115 MiB/s rd, 142 MiB/s wr, 222 op/s 2026-03-09T00:04:13.503 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:13 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:13.503 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:13 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:13.503 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:13 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:04:13.503 INFO:tasks.workunit.client.0.vm03.stdout:7/693: mknod d2/d4/ccf 0 2026-03-09T00:04:13.508 INFO:tasks.workunit.client.1.vm06.stdout:8/987: unlink db/d1e/f51 0 2026-03-09T00:04:13.516 INFO:tasks.workunit.client.0.vm03.stdout:6/710: dwrite d13/d1e/d44/d59/fe9 [0,4194304] 0 2026-03-09T00:04:13.520 INFO:tasks.workunit.client.1.vm06.stdout:8/988: mknod db/d74/d87/c13f 0 2026-03-09T00:04:13.522 INFO:tasks.workunit.client.1.vm06.stdout:8/989: dread - db/dd/d24/da7/d125/fd6 zero size 2026-03-09T00:04:13.525 INFO:tasks.workunit.client.0.vm03.stdout:4/853: dwrite d7/d20/d35/fb5 [0,4194304] 0 2026-03-09T00:04:13.525 INFO:tasks.workunit.client.0.vm03.stdout:4/854: chown d7/d20/d6a/dea/d54/d58/l95 942013 1 2026-03-09T00:04:13.528 INFO:tasks.workunit.client.0.vm03.stdout:6/711: dread d13/d1e/f9f [0,4194304] 0 2026-03-09T00:04:13.528 INFO:tasks.workunit.client.0.vm03.stdout:4/855: mknod d7/d6f/dcf/de8/dee/c10b 0 2026-03-09T00:04:13.529 INFO:tasks.workunit.client.0.vm03.stdout:6/712: mkdir d13/d35/d71/d97/ded 0 2026-03-09T00:04:13.529 INFO:tasks.workunit.client.0.vm03.stdout:6/713: chown d13/d1e/fc3 0 1 2026-03-09T00:04:13.531 INFO:tasks.workunit.client.0.vm03.stdout:4/856: rmdir d7/d6f/dcf/de8 39 2026-03-09T00:04:13.532 INFO:tasks.workunit.client.0.vm03.stdout:4/857: rename d7/d20/db3 to d7/d20/d6a/dea/d38/dfb/d10c 0 2026-03-09T00:04:13.533 INFO:tasks.workunit.client.0.vm03.stdout:4/858: read d7/fa7 [898583,5916] 0 2026-03-09T00:04:13.533 INFO:tasks.workunit.client.1.vm06.stdout:3/960: dread d11/d28/d2e/d2f/d5b/d5f/d91/f107 [4194304,4194304] 0 2026-03-09T00:04:13.534 INFO:tasks.workunit.client.0.vm03.stdout:4/859: creat d7/d20/d6a/d77/d25/de2/df1/f10d x:0 0 0 2026-03-09T00:04:13.534 
INFO:tasks.workunit.client.0.vm03.stdout:4/860: mknod d7/d20/d6a/dea/d38/da9/ddc/df2/c10e 0 2026-03-09T00:04:13.539 INFO:tasks.workunit.client.1.vm06.stdout:8/990: write db/d74/d78/d98/db6/fff [5899808,16303] 0 2026-03-09T00:04:13.539 INFO:tasks.workunit.client.0.vm03.stdout:4/861: truncate d7/d27/dc9/fd2 6441765 0 2026-03-09T00:04:13.553 INFO:tasks.workunit.client.1.vm06.stdout:8/991: getdents db/d53/d6d/d7b 0 2026-03-09T00:04:13.554 INFO:tasks.workunit.client.1.vm06.stdout:3/961: dread d11/d28/d2e/db2/f116 [0,4194304] 0 2026-03-09T00:04:13.558 INFO:tasks.workunit.client.1.vm06.stdout:3/962: rename d11/d3f/d8d to d11/d28/d4d/d89/d90/d147 0 2026-03-09T00:04:13.560 INFO:tasks.workunit.client.1.vm06.stdout:3/963: fsync d11/d28/d2e/d7e/d83/d87/f10c 0 2026-03-09T00:04:13.563 INFO:tasks.workunit.client.0.vm03.stdout:3/589: dread d2/f5 [0,4194304] 0 2026-03-09T00:04:13.563 INFO:tasks.workunit.client.0.vm03.stdout:2/797: dwrite d8/d1b/d2a/d56/f57 [0,4194304] 0 2026-03-09T00:04:13.563 INFO:tasks.workunit.client.1.vm06.stdout:3/964: write d11/d28/d2e/d2f/d36/f59 [2869964,88439] 0 2026-03-09T00:04:13.564 INFO:tasks.workunit.client.1.vm06.stdout:6/953: dwrite d4/d27/d3e/d78/f91 [0,4194304] 0 2026-03-09T00:04:13.564 INFO:tasks.workunit.client.1.vm06.stdout:6/954: write d4/d27/d3e/d78/f114 [767285,99574] 0 2026-03-09T00:04:13.564 INFO:tasks.workunit.client.1.vm06.stdout:6/955: truncate d4/d27/f11d 379067 0 2026-03-09T00:04:13.564 INFO:tasks.workunit.client.1.vm06.stdout:6/956: readlink d4/d27/l3a 0 2026-03-09T00:04:13.565 INFO:tasks.workunit.client.1.vm06.stdout:6/957: dread d4/d16/d53/d67/f8f [0,4194304] 0 2026-03-09T00:04:13.565 INFO:tasks.workunit.client.1.vm06.stdout:6/958: read d4/d27/d3e/f44 [1726197,74349] 0 2026-03-09T00:04:13.565 INFO:tasks.workunit.client.1.vm06.stdout:3/965: mkdir d11/d28/d4d/d89/d90/dd2/d148 0 2026-03-09T00:04:13.569 INFO:tasks.workunit.client.1.vm06.stdout:3/966: fdatasync d11/d3f/f54 0 2026-03-09T00:04:13.574 INFO:tasks.workunit.client.0.vm03.stdout:9/699: dwrite fc [0,4194304] 0 2026-03-09T00:04:13.574 INFO:tasks.workunit.client.0.vm03.stdout:9/700: stat d15/d1c/d28/d6e 0 2026-03-09T00:04:13.578 INFO:tasks.workunit.client.1.vm06.stdout:1/876: getdents d6/d21/d2d/d3b/d42/df0 0 2026-03-09T00:04:13.579 INFO:tasks.workunit.client.1.vm06.stdout:1/877: creat d6/d21/d2d/d37/d6d/f127 x:0 0 0 2026-03-09T00:04:13.581 INFO:tasks.workunit.client.1.vm06.stdout:6/959: dread d4/d27/d3e/f55 [0,4194304] 0 2026-03-09T00:04:13.581 INFO:tasks.workunit.client.0.vm03.stdout:9/701: symlink d15/d1c/d28/le8 0 2026-03-09T00:04:13.590 INFO:tasks.workunit.client.0.vm03.stdout:9/702: rename d15/d1c/d21/d75/de0/fe7 to d15/fe9 0 2026-03-09T00:04:13.592 INFO:tasks.workunit.client.0.vm03.stdout:3/590: dread d2/db/d3b/f95 [0,4194304] 0 2026-03-09T00:04:13.594 INFO:tasks.workunit.client.1.vm06.stdout:6/960: link d4/d27/d3e/d45/c10b d4/d27/d3e/d78/c124 0 2026-03-09T00:04:13.597 INFO:tasks.workunit.client.0.vm03.stdout:4/862: dwrite d7/d27/f52 [0,4194304] 0 2026-03-09T00:04:13.598 INFO:tasks.workunit.client.0.vm03.stdout:9/703: unlink d15/d7f/c85 0 2026-03-09T00:04:13.603 INFO:tasks.workunit.client.0.vm03.stdout:4/863: mknod d7/d27/c10f 0 2026-03-09T00:04:13.603 INFO:tasks.workunit.client.0.vm03.stdout:4/864: mkdir d7/d20/d6a/dea/d4e/d110 0 2026-03-09T00:04:13.620 INFO:tasks.workunit.client.0.vm03.stdout:3/591: dread d2/db/d3b/d3f/f46 [0,4194304] 0 2026-03-09T00:04:13.630 INFO:tasks.workunit.client.0.vm03.stdout:3/592: unlink d2/db/d2d/d55/f6f 0 2026-03-09T00:04:13.630 
INFO:tasks.workunit.client.0.vm03.stdout:3/593: fsync d2/db/d40/f78 0 2026-03-09T00:04:13.630 INFO:tasks.workunit.client.0.vm03.stdout:3/594: write d2/db/d3b/d3f/f7c [886065,13002] 0 2026-03-09T00:04:13.635 INFO:tasks.workunit.client.0.vm03.stdout:3/595: truncate f1 209677 0 2026-03-09T00:04:13.642 INFO:tasks.workunit.client.0.vm03.stdout:3/596: mknod d2/db/d40/d51/da2/db1/cb2 0 2026-03-09T00:04:13.647 INFO:tasks.workunit.client.0.vm03.stdout:3/597: creat d2/db/d3b/d5f/fb3 x:0 0 0 2026-03-09T00:04:13.648 INFO:tasks.workunit.client.0.vm03.stdout:3/598: write d2/db/d3b/d5f/da5/d72/f86 [431850,37143] 0 2026-03-09T00:04:13.648 INFO:tasks.workunit.client.0.vm03.stdout:3/599: rename d2/db/d3b/d5f/da5/f85 to d2/db/d56/fb4 0 2026-03-09T00:04:13.648 INFO:tasks.workunit.client.0.vm03.stdout:9/704: dread d15/d1c/d21/d75/fbc [0,4194304] 0 2026-03-09T00:04:13.654 INFO:tasks.workunit.client.0.vm03.stdout:9/705: truncate d15/d1c/d21/d75/fa4 3184284 0 2026-03-09T00:04:13.674 INFO:tasks.workunit.client.0.vm03.stdout:9/706: creat d15/d1c/d21/fea x:0 0 0 2026-03-09T00:04:13.675 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:13 vm06.local ceph-mon[58395]: pgmap v9: 65 pgs: 65 active+clean; 3.5 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 115 MiB/s rd, 142 MiB/s wr, 222 op/s 2026-03-09T00:04:13.675 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:13 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:13.675 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:13 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:13.675 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:13 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:04:13.684 INFO:tasks.workunit.client.1.vm06.stdout:9/894: write d1/f45 [516199,22153] 0 2026-03-09T00:04:13.685 INFO:tasks.workunit.client.1.vm06.stdout:9/895: mkdir d1/d4/df5/d12a 0 2026-03-09T00:04:13.698 INFO:tasks.workunit.client.0.vm03.stdout:6/714: dwrite d13/d35/fda [0,4194304] 0 2026-03-09T00:04:13.698 INFO:tasks.workunit.client.0.vm03.stdout:6/715: read - d13/d1e/d44/d59/fe0 zero size 2026-03-09T00:04:13.699 INFO:tasks.workunit.client.1.vm06.stdout:6/961: dwrite d4/d16/d53/d67/f8f [0,4194304] 0 2026-03-09T00:04:13.703 INFO:tasks.workunit.client.1.vm06.stdout:7/945: sync 2026-03-09T00:04:13.703 INFO:tasks.workunit.client.1.vm06.stdout:8/992: rmdir db/d74/d78/d98/db6/dc7/d101/db7 39 2026-03-09T00:04:13.703 INFO:tasks.workunit.client.1.vm06.stdout:7/946: chown d0/df/fb8 497254231 1 2026-03-09T00:04:13.703 INFO:tasks.workunit.client.1.vm06.stdout:7/947: dread - d0/df/d1a/d27/d70/fd6 zero size 2026-03-09T00:04:13.705 INFO:tasks.workunit.client.0.vm03.stdout:6/716: mkdir d13/d35/d69/dee 0 2026-03-09T00:04:13.707 INFO:tasks.workunit.client.0.vm03.stdout:6/717: unlink d13/d35/d72/cb9 0 2026-03-09T00:04:13.739 INFO:tasks.workunit.client.0.vm03.stdout:7/694: write d2/d1f/d3a/d24/da4/f47 [4672261,1574] 0 2026-03-09T00:04:13.739 INFO:tasks.workunit.client.0.vm03.stdout:7/695: chown d2/d1f/d3a/d24/da4/d46/d81/d96 139 1 2026-03-09T00:04:13.739 INFO:tasks.workunit.client.0.vm03.stdout:7/696: fsync d2/d1f/d35/f3e 0 2026-03-09T00:04:13.739 INFO:tasks.workunit.client.0.vm03.stdout:7/697: mkdir d2/d4/d1e/d5e/d7e/dd0 0 2026-03-09T00:04:13.739 INFO:tasks.workunit.client.1.vm06.stdout:8/993: dread db/d53/d70/d38/fa8 [0,4194304] 0 
2026-03-09T00:04:13.739 INFO:tasks.workunit.client.1.vm06.stdout:6/962: creat d4/d16/d53/df2/f125 x:0 0 0 2026-03-09T00:04:13.739 INFO:tasks.workunit.client.1.vm06.stdout:8/994: creat db/d53/d70/f140 x:0 0 0 2026-03-09T00:04:13.739 INFO:tasks.workunit.client.1.vm06.stdout:7/948: mkdir d0/d55/d99/db2/d11a 0 2026-03-09T00:04:13.739 INFO:tasks.workunit.client.1.vm06.stdout:7/949: dread d0/d39/f3e [0,4194304] 0 2026-03-09T00:04:13.739 INFO:tasks.workunit.client.1.vm06.stdout:7/950: dread d0/df/d1a/d27/d4c/d40/d51/d90/dae/fc9 [0,4194304] 0 2026-03-09T00:04:13.739 INFO:tasks.workunit.client.1.vm06.stdout:8/995: dread db/dd/de3/fe7 [0,4194304] 0 2026-03-09T00:04:13.739 INFO:tasks.workunit.client.1.vm06.stdout:8/996: creat db/dd/d24/da7/dab/f141 x:0 0 0 2026-03-09T00:04:13.739 INFO:tasks.workunit.client.1.vm06.stdout:8/997: read - db/d74/d87/f130 zero size 2026-03-09T00:04:13.740 INFO:tasks.workunit.client.1.vm06.stdout:8/998: chown db/d53/d70/d38/d4d/d79/dd5/cde 663074 1 2026-03-09T00:04:13.740 INFO:tasks.workunit.client.1.vm06.stdout:8/999: write db/f114 [2949618,97191] 0 2026-03-09T00:04:13.741 INFO:tasks.workunit.client.1.vm06.stdout:6/963: dread d4/d16/d46/fc4 [0,4194304] 0 2026-03-09T00:04:13.742 INFO:tasks.workunit.client.0.vm03.stdout:6/718: dread d13/d1e/f2d [0,4194304] 0 2026-03-09T00:04:13.742 INFO:tasks.workunit.client.0.vm03.stdout:6/719: chown d13/d35/d74/fc5 12812018 1 2026-03-09T00:04:13.745 INFO:tasks.workunit.client.1.vm06.stdout:6/964: dread d4/d16/d53/ddf/d52/fe3 [0,4194304] 0 2026-03-09T00:04:13.746 INFO:tasks.workunit.client.0.vm03.stdout:6/720: link d13/d1e/d44/l46 d13/d35/d71/d97/lef 0 2026-03-09T00:04:13.746 INFO:tasks.workunit.client.0.vm03.stdout:6/721: chown d13/d1e/d44/d4a/d52/fd5 109784784 1 2026-03-09T00:04:13.747 INFO:tasks.workunit.client.1.vm06.stdout:6/965: symlink d4/d16/d53/ddf/d52/d7d/l126 0 2026-03-09T00:04:13.747 INFO:tasks.workunit.client.1.vm06.stdout:6/966: chown d4/d27/d3e 185 1 2026-03-09T00:04:13.747 INFO:tasks.workunit.client.1.vm06.stdout:6/967: write d4/d16/d53/ddf/d7e/dac/dd3/d101/f79 [3894727,49365] 0 2026-03-09T00:04:13.747 INFO:tasks.workunit.client.0.vm03.stdout:6/722: rmdir d13/d35/d74/d89/d9d 39 2026-03-09T00:04:13.747 INFO:tasks.workunit.client.0.vm03.stdout:6/723: fdatasync d13/d1e/f3e 0 2026-03-09T00:04:13.747 INFO:tasks.workunit.client.0.vm03.stdout:6/724: mknod d13/d35/d71/d97/da5/db1/cf0 0 2026-03-09T00:04:13.757 INFO:tasks.workunit.client.0.vm03.stdout:5/777: sync 2026-03-09T00:04:13.767 INFO:tasks.workunit.client.1.vm06.stdout:3/967: dwrite d11/d28/d2e/d2f/d36/d8f/fca [0,4194304] 0 2026-03-09T00:04:13.769 INFO:tasks.workunit.client.1.vm06.stdout:3/968: unlink d11/d28/d2e/d2f/d5b/ddb/df1/f134 0 2026-03-09T00:04:13.771 INFO:tasks.workunit.client.1.vm06.stdout:3/969: mknod d11/c149 0 2026-03-09T00:04:13.783 INFO:tasks.workunit.client.1.vm06.stdout:3/970: dread d11/d28/d2e/d2f/d5b/db5/f127 [0,4194304] 0 2026-03-09T00:04:13.783 INFO:tasks.workunit.client.1.vm06.stdout:3/971: write d11/d28/d2e/d2f/f92 [4604861,17065] 0 2026-03-09T00:04:13.788 INFO:tasks.workunit.client.1.vm06.stdout:3/972: rename d11/d28/d4d/d89/d90/d147/l95 to d11/d28/d4d/d89/d90/dd2/l14a 0 2026-03-09T00:04:13.804 INFO:tasks.workunit.client.1.vm06.stdout:6/968: dread d4/f26 [0,4194304] 0 2026-03-09T00:04:13.804 INFO:tasks.workunit.client.1.vm06.stdout:6/969: dread - d4/d16/d53/df2/f125 zero size 2026-03-09T00:04:13.806 INFO:tasks.workunit.client.0.vm03.stdout:1/837: dwrite d4/d15/d5c/fb1 [0,4194304] 0 2026-03-09T00:04:13.817 INFO:tasks.workunit.client.1.vm06.stdout:7/951: 
write d0/df/d1a/d3a/f3c [1056554,125045] 0 2026-03-09T00:04:13.821 INFO:tasks.workunit.client.0.vm03.stdout:1/838: dread d4/f7d [0,4194304] 0 2026-03-09T00:04:13.822 INFO:tasks.workunit.client.0.vm03.stdout:1/839: mkdir d4/d15/d5c/d103/d11c 0 2026-03-09T00:04:13.842 INFO:tasks.workunit.client.0.vm03.stdout:4/865: dwrite d7/d20/d6a/d77/d25/f102 [0,4194304] 0 2026-03-09T00:04:13.842 INFO:tasks.workunit.client.0.vm03.stdout:4/866: chown d7/d20/d6a/d77/db7/f9f 231826379 1 2026-03-09T00:04:13.842 INFO:tasks.workunit.client.0.vm03.stdout:1/840: rmdir d4/d3a 39 2026-03-09T00:04:13.842 INFO:tasks.workunit.client.0.vm03.stdout:1/841: stat d4/d3a/d32/dc2 0 2026-03-09T00:04:13.842 INFO:tasks.workunit.client.0.vm03.stdout:1/842: dread - d4/d15/ffd zero size 2026-03-09T00:04:13.842 INFO:tasks.workunit.client.0.vm03.stdout:1/843: getdents d4/de2 0 2026-03-09T00:04:13.846 INFO:tasks.workunit.client.0.vm03.stdout:1/844: unlink d4/d15/d77/dce/df6/cd0 0 2026-03-09T00:04:13.846 INFO:tasks.workunit.client.0.vm03.stdout:1/845: truncate d4/d3a/d61/da6/fa7 590712 0 2026-03-09T00:04:13.850 INFO:tasks.workunit.client.0.vm03.stdout:4/867: link d7/d20/d6a/d77/l7a d7/d20/d6a/dea/d54/d58/l111 0 2026-03-09T00:04:13.862 INFO:tasks.workunit.client.0.vm03.stdout:1/846: mkdir d4/d3a/d3d/d98/d11d 0 2026-03-09T00:04:13.865 INFO:tasks.workunit.client.0.vm03.stdout:4/868: dread d7/d20/d6a/d77/d25/f7f [0,4194304] 0 2026-03-09T00:04:13.865 INFO:tasks.workunit.client.0.vm03.stdout:4/869: fsync d7/d20/d6a/dea/d38/da9/ddc/f7c 0 2026-03-09T00:04:13.865 INFO:tasks.workunit.client.0.vm03.stdout:1/847: dread d4/d3a/d3d/d46/f5d [0,4194304] 0 2026-03-09T00:04:13.865 INFO:tasks.workunit.client.0.vm03.stdout:1/848: fdatasync d4/f39 0 2026-03-09T00:04:13.865 INFO:tasks.workunit.client.0.vm03.stdout:1/849: fsync d4/d3a/d3d/f10d 0 2026-03-09T00:04:13.865 INFO:tasks.workunit.client.0.vm03.stdout:1/850: chown d4/d3a/d32/dc2/cdf 4 1 2026-03-09T00:04:13.870 INFO:tasks.workunit.client.0.vm03.stdout:1/851: truncate d4/d15/dae/d101/f108 2604691 0 2026-03-09T00:04:13.874 INFO:tasks.workunit.client.0.vm03.stdout:1/852: getdents d4/d15/d77/dce/dd9/df3 0 2026-03-09T00:04:13.874 INFO:tasks.workunit.client.0.vm03.stdout:1/853: mknod d4/d3a/d3d/d46/df7/c11e 0 2026-03-09T00:04:13.874 INFO:tasks.workunit.client.0.vm03.stdout:1/854: chown d4/d15/dae/d101 35429212 1 2026-03-09T00:04:13.881 INFO:tasks.workunit.client.1.vm06.stdout:3/973: dwrite d11/d28/d2e/d2f/f74 [0,4194304] 0 2026-03-09T00:04:13.885 INFO:tasks.workunit.client.0.vm03.stdout:3/600: dwrite d2/db/f67 [0,4194304] 0 2026-03-09T00:04:13.888 INFO:tasks.workunit.client.0.vm03.stdout:5/778: dwrite d1c/d20/d55/d4f/d58/d73/d76/fd5 [4194304,4194304] 0 2026-03-09T00:04:13.888 INFO:tasks.workunit.client.0.vm03.stdout:9/707: dwrite d15/d1c/d21/fea [0,4194304] 0 2026-03-09T00:04:13.890 INFO:tasks.workunit.client.1.vm06.stdout:4/993: dwrite d17/d21/f38 [4194304,4194304] 0 2026-03-09T00:04:13.890 INFO:tasks.workunit.client.1.vm06.stdout:4/994: chown d17/d24/d49 1306091 1 2026-03-09T00:04:13.890 INFO:tasks.workunit.client.1.vm06.stdout:4/995: fdatasync d17/f155 0 2026-03-09T00:04:13.892 INFO:tasks.workunit.client.0.vm03.stdout:5/779: dread d1c/d20/f39 [0,4194304] 0 2026-03-09T00:04:13.892 INFO:tasks.workunit.client.0.vm03.stdout:5/780: chown d1c/d20/d55/ff6 38 1 2026-03-09T00:04:13.892 INFO:tasks.workunit.client.0.vm03.stdout:5/781: fsync d1c/d20/d55/f9b 0 2026-03-09T00:04:13.892 INFO:tasks.workunit.client.0.vm03.stdout:5/782: fsync d1c/d20/d55/d4f/d58/d73/d76/d91/fa2 0 2026-03-09T00:04:13.893 
2026-03-09T00:04:13.893 INFO:tasks.workunit.client.0.vm03.stdout:3/601: mkdir d2/db/d40/d44/db5 0
2026-03-09T00:04:13.895 INFO:tasks.workunit.client.0.vm03.stdout:3/602: read d2/db/d40/f78 [54429,17134] 0
2026-03-09T00:04:13.896 INFO:tasks.workunit.client.0.vm03.stdout:6/725: getdents d13/d35/d69 0
2026-03-09T00:04:13.901 INFO:tasks.workunit.client.1.vm06.stdout:4/996: unlink d17/l59 0
2026-03-09T00:04:13.902 INFO:tasks.workunit.client.0.vm03.stdout:5/783: getdents d1c/d20/d55/d66/d6b/d8f/dca 0
2026-03-09T00:04:13.903 INFO:tasks.workunit.client.1.vm06.stdout:9/896: dwrite d1/d3/d4f/d52/f75 [4194304,4194304] 0
2026-03-09T00:04:13.906 INFO:tasks.workunit.client.1.vm06.stdout:4/997: read d17/d21/d4c/d50/ff4 [148405,7618] 0
2026-03-09T00:04:13.907 INFO:tasks.workunit.client.0.vm03.stdout:6/726: mkdir d13/d35/d74/df1 0
2026-03-09T00:04:13.909 INFO:tasks.workunit.client.0.vm03.stdout:9/708: rmdir d15/d1c/d21/d75/de0 39
2026-03-09T00:04:13.910 INFO:tasks.workunit.client.0.vm03.stdout:9/709: dread - d15/d1c/d21/d54/d87/fd6 zero size
2026-03-09T00:04:13.910 INFO:tasks.workunit.client.0.vm03.stdout:9/710: fdatasync d15/d1c/d28/d6e/da2/fc0 0
2026-03-09T00:04:13.914 INFO:tasks.workunit.client.0.vm03.stdout:5/784: truncate d1c/d20/d55/d4f/f69 2502880 0
2026-03-09T00:04:13.915 INFO:tasks.workunit.client.0.vm03.stdout:5/785: read d1c/d20/d55/fdb [235630,84278] 0
2026-03-09T00:04:13.919 INFO:tasks.workunit.client.0.vm03.stdout:6/727: symlink d13/dc4/dea/dd7/lf2 0
2026-03-09T00:04:13.919 INFO:tasks.workunit.client.1.vm06.stdout:6/970: dwrite d4/f5 [0,4194304] 0
2026-03-09T00:04:13.919 INFO:tasks.workunit.client.1.vm06.stdout:6/971: readlink d4/d27/d3e/d45/l7b 0
2026-03-09T00:04:13.930 INFO:tasks.workunit.client.0.vm03.stdout:9/711: mknod d15/d1c/d21/ceb 0
2026-03-09T00:04:13.930 INFO:tasks.workunit.client.0.vm03.stdout:9/712: stat d15/d1c/d21/d54/d87/l95 0
2026-03-09T00:04:13.931 INFO:tasks.workunit.client.1.vm06.stdout:4/998: unlink d17/d5b/ff9 0
2026-03-09T00:04:13.931 INFO:tasks.workunit.client.1.vm06.stdout:9/897: mkdir d1/d3/d4f/d91/d94/ddf/dfe/d12b 0
2026-03-09T00:04:13.931 INFO:tasks.workunit.client.1.vm06.stdout:9/898: stat d1/d3/f11 0
2026-03-09T00:04:13.931 INFO:tasks.workunit.client.1.vm06.stdout:9/899: write d1/d4/d2f/f7f [5064180,12988] 0
2026-03-09T00:04:13.935 INFO:tasks.workunit.client.1.vm06.stdout:3/974: dwrite d11/d28/d2e/d2f/f3e [0,4194304] 0
2026-03-09T00:04:13.937 INFO:tasks.workunit.client.0.vm03.stdout:9/713: mkdir d15/d1c/d36/d4d/dc4/dec 0
2026-03-09T00:04:13.941 INFO:tasks.workunit.client.0.vm03.stdout:5/786: rename d1c/d20/d55/d4f/d58/l7e to d1c/lff 0
2026-03-09T00:04:13.941 INFO:tasks.workunit.client.0.vm03.stdout:5/787: chown d1c/d20/d55/d66/c78 32121 1
2026-03-09T00:04:13.941 INFO:tasks.workunit.client.0.vm03.stdout:5/788: chown d1c/d20/d56/db4 1462134 1
2026-03-09T00:04:13.941 INFO:tasks.workunit.client.0.vm03.stdout:5/789: chown d1c/d20/d55/d4f/c6d 469262 1
2026-03-09T00:04:13.951 INFO:tasks.workunit.client.1.vm06.stdout:6/972: mkdir d4/d16/d46/d127 0
2026-03-09T00:04:13.958 INFO:tasks.workunit.client.0.vm03.stdout:9/714: mkdir d15/d1c/d28/de1/ded 0
2026-03-09T00:04:13.958 INFO:tasks.workunit.client.0.vm03.stdout:9/715: chown d15/d1c/d21/d54/d87/lcb 25538 1
2026-03-09T00:04:13.960 INFO:tasks.workunit.client.1.vm06.stdout:3/975: rmdir d11/d28/d4d/d89/d90/dd2/d148 0
2026-03-09T00:04:13.965 INFO:tasks.workunit.client.0.vm03.stdout:2/798: sync
2026-03-09T00:04:13.970 INFO:tasks.workunit.client.0.vm03.stdout:0/787: sync
2026-03-09T00:04:13.971 INFO:tasks.workunit.client.0.vm03.stdout:8/764: sync
2026-03-09T00:04:13.971 INFO:tasks.workunit.client.1.vm06.stdout:6/973: mkdir d4/d16/d121/d128 0
2026-03-09T00:04:13.971 INFO:tasks.workunit.client.1.vm06.stdout:3/976: rmdir d11/d28/d4d/d9b 39
2026-03-09T00:04:13.971 INFO:tasks.workunit.client.0.vm03.stdout:4/870: rmdir d7/d20/d6a/d77 39
2026-03-09T00:04:13.971 INFO:tasks.workunit.client.0.vm03.stdout:4/871: fdatasync d7/d20/d6a/dde/ffa 0
2026-03-09T00:04:13.971 INFO:tasks.workunit.client.0.vm03.stdout:2/799: getdents d8/d1b/d24/da5/dfe 0
2026-03-09T00:04:13.972 INFO:tasks.workunit.client.0.vm03.stdout:8/765: link d7/df/d1a/l1b d7/df/d1a/d40/db3/dba/lea 0
2026-03-09T00:04:13.975 INFO:tasks.workunit.client.0.vm03.stdout:2/800: mkdir d8/d1b/d24/da5/dfe/d105/d108 0
2026-03-09T00:04:13.975 INFO:tasks.workunit.client.1.vm06.stdout:4/999: rename d17/d21/d4c/dc2 to d17/d24/d49/de4/db0/d156 0
2026-03-09T00:04:13.975 INFO:tasks.workunit.client.1.vm06.stdout:9/900: rename d1/d3 to d1/d3/d4f/d12c 22
2026-03-09T00:04:13.979 INFO:tasks.workunit.client.0.vm03.stdout:0/788: rmdir d2/da/d76/d8a/d8f/db8 39
2026-03-09T00:04:13.979 INFO:tasks.workunit.client.1.vm06.stdout:6/974: symlink d4/d16/d53/ddf/d52/d110/l129 0
2026-03-09T00:04:13.980 INFO:tasks.workunit.client.1.vm06.stdout:7/952: dwrite d0/d55/d99/f10e [0,4194304] 0
2026-03-09T00:04:13.980 INFO:tasks.workunit.client.1.vm06.stdout:7/953: write d0/df/d1a/d3f/d53/fef [1030349,60783] 0
2026-03-09T00:04:13.981 INFO:tasks.workunit.client.0.vm03.stdout:0/789: mknod d2/da/c11e 0
2026-03-09T00:04:13.981 INFO:tasks.workunit.client.0.vm03.stdout:0/790: rmdir d2/da/dd/d49/d6c/d4b/d55/d6f/dad 39
2026-03-09T00:04:13.981 INFO:tasks.workunit.client.1.vm06.stdout:9/901: getdents d1/d3/d2b/d58 0
2026-03-09T00:04:13.981 INFO:tasks.workunit.client.1.vm06.stdout:9/902: dread - d1/d3/d4f/d91/d94/fcd zero size
2026-03-09T00:04:13.985 INFO:tasks.workunit.client.1.vm06.stdout:3/977: getdents d11/d28/d2e/d2f/d5b/ddb/df1 0
2026-03-09T00:04:13.989 INFO:tasks.workunit.client.0.vm03.stdout:0/791: symlink d2/da/d36/ddf/l11f 0
2026-03-09T00:04:13.999 INFO:tasks.workunit.client.1.vm06.stdout:6/975: dread d4/d16/d53/df2/ffa [0,4194304] 0
2026-03-09T00:04:14.004 INFO:tasks.workunit.client.1.vm06.stdout:7/954: creat d0/df/d1a/d27/f11b x:0 0 0
2026-03-09T00:04:14.004 INFO:tasks.workunit.client.1.vm06.stdout:7/955: creat d0/df/d1a/d3f/d53/f11c x:0 0 0
2026-03-09T00:04:14.004 INFO:tasks.workunit.client.1.vm06.stdout:3/978: mkdir d11/d28/d2e/d2f/d5b/d14b 0
2026-03-09T00:04:14.009 INFO:tasks.workunit.client.0.vm03.stdout:0/792: dread d2/da/d1a/f25 [0,4194304] 0
2026-03-09T00:04:14.009 INFO:tasks.workunit.client.0.vm03.stdout:0/793: read - d2/fd3 zero size
2026-03-09T00:04:14.009 INFO:tasks.workunit.client.0.vm03.stdout:0/794: write d2/da/dd/d49/d6c/d4b/d55/d6f/dad/de8/f113 [1153504,84973] 0
2026-03-09T00:04:14.009 INFO:tasks.workunit.client.0.vm03.stdout:0/795: chown d2/da/c11e 10 1
2026-03-09T00:04:14.019 INFO:tasks.workunit.client.1.vm06.stdout:6/976: symlink d4/l12a 0
2026-03-09T00:04:14.023 INFO:tasks.workunit.client.0.vm03.stdout:2/801: dread d8/fd [0,4194304] 0
2026-03-09T00:04:14.028 INFO:tasks.workunit.client.1.vm06.stdout:7/956: mkdir d0/df/d1a/d3f/de8/d11d 0
2026-03-09T00:04:14.031 INFO:tasks.workunit.client.1.vm06.stdout:7/957: rename d0/df/d1a/d3a/d4e/d5e/c96 to d0/d55/d99/db2/c11e 0
2026-03-09T00:04:14.038 INFO:tasks.workunit.client.1.vm06.stdout:7/958: creat d0/df/d1a/d27/d4c/d40/d51/d90/dae/de0/f11f x:0 0 0
2026-03-09T00:04:14.038 INFO:tasks.workunit.client.0.vm03.stdout:3/603: dwrite d2/db/f3a [0,4194304] 0
2026-03-09T00:04:14.038 INFO:tasks.workunit.client.1.vm06.stdout:6/977: write d4/d16/d53/ddf/d7e/dac/dd3/d101/f65 [1553176,42480] 0
2026-03-09T00:04:14.043 INFO:tasks.workunit.client.1.vm06.stdout:7/959: link d0/df/d1a/d3a/d4e/d5e/ddc/f81 d0/df/d1a/d27/d70/f120 0
2026-03-09T00:04:14.046 INFO:tasks.workunit.client.1.vm06.stdout:7/960: truncate d0/df/d1a/d27/d4c/d40/d51/d86/fc3 1631300 0
2026-03-09T00:04:14.057 INFO:tasks.workunit.client.0.vm03.stdout:3/604: creat d2/db/d56/fb6 x:0 0 0
2026-03-09T00:04:14.057 INFO:tasks.workunit.client.0.vm03.stdout:3/605: link d2/db/d2d/fae d2/db/d2d/fb7 0
2026-03-09T00:04:14.057 INFO:tasks.workunit.client.0.vm03.stdout:3/606: dread - d2/db/d56/fb6 zero size
2026-03-09T00:04:14.057 INFO:tasks.workunit.client.0.vm03.stdout:3/607: dread - d2/db/f80 zero size
2026-03-09T00:04:14.057 INFO:tasks.workunit.client.0.vm03.stdout:3/608: truncate d2/db/d56/fb4 454289 0
2026-03-09T00:04:14.061 INFO:tasks.workunit.client.0.vm03.stdout:2/802: dread d8/d74/fc7 [0,4194304] 0
2026-03-09T00:04:14.062 INFO:tasks.workunit.client.0.vm03.stdout:6/728: dwrite d13/d1e/f21 [0,4194304] 0
2026-03-09T00:04:14.062 INFO:tasks.workunit.client.0.vm03.stdout:6/729: stat d13/d1e/d44/d59/dec 0
2026-03-09T00:04:14.063 INFO:tasks.workunit.client.0.vm03.stdout:6/730: creat d13/d35/ff3 x:0 0 0
2026-03-09T00:04:14.066 INFO:tasks.workunit.client.0.vm03.stdout:2/803: dread d8/d26/d5e/d6f/d97/f1a [0,4194304] 0
2026-03-09T00:04:14.077 INFO:tasks.workunit.client.0.vm03.stdout:3/609: write d2/db/d2d/f8b [2816336,108733] 0
2026-03-09T00:04:14.077 INFO:tasks.workunit.client.0.vm03.stdout:6/731: write d13/f70 [345105,1871] 0
2026-03-09T00:04:14.077 INFO:tasks.workunit.client.0.vm03.stdout:6/732: creat d13/d1e/d44/d59/d77/ff4 x:0 0 0
2026-03-09T00:04:14.077 INFO:tasks.workunit.client.0.vm03.stdout:3/610: mkdir d2/db/d3b/d3f/db8 0
2026-03-09T00:04:14.077 INFO:tasks.workunit.client.0.vm03.stdout:6/733: mkdir d13/d1e/d44/d59/dec/d62/df5 0
2026-03-09T00:04:14.077 INFO:tasks.workunit.client.0.vm03.stdout:6/734: write d13/fd3 [429238,71439] 0
2026-03-09T00:04:14.077 INFO:tasks.workunit.client.0.vm03.stdout:6/735: creat d13/d35/d74/d89/db3/ff6 x:0 0 0
2026-03-09T00:04:14.077 INFO:tasks.workunit.client.0.vm03.stdout:6/736: chown d13/d1e/d44/c5a 725470 1
2026-03-09T00:04:14.077 INFO:tasks.workunit.client.0.vm03.stdout:6/737: chown d13/d1e/d44/d4a/ce7 135163 1
2026-03-09T00:04:14.077 INFO:tasks.workunit.client.0.vm03.stdout:6/738: stat d13/dc4/dea/dd7 0
2026-03-09T00:04:14.077 INFO:tasks.workunit.client.0.vm03.stdout:6/739: unlink d13/d1e/d44/d4a/d52/c5f 0
2026-03-09T00:04:14.079 INFO:tasks.workunit.client.0.vm03.stdout:6/740: creat d13/d35/d69/ff7 x:0 0 0
2026-03-09T00:04:14.080 INFO:tasks.workunit.client.0.vm03.stdout:6/741: chown d13/d1e/d44/d59/dec/caf 218691 1
2026-03-09T00:04:14.080 INFO:tasks.workunit.client.0.vm03.stdout:3/611: dread d2/db/d3b/d5f/da5/d72/f7a [0,4194304] 0
2026-03-09T00:04:14.080 INFO:tasks.workunit.client.0.vm03.stdout:3/612: write d2/db/d40/d51/f5c [4607716,3652] 0
2026-03-09T00:04:14.094 INFO:tasks.workunit.client.0.vm03.stdout:3/613: dread d2/db/f13 [0,4194304] 0
2026-03-09T00:04:14.094 INFO:tasks.workunit.client.0.vm03.stdout:3/614: creat d2/db/d2d/fb9 x:0 0 0
2026-03-09T00:04:14.096 INFO:tasks.workunit.client.0.vm03.stdout:2/804: dread d8/d1b/d6c/f7b [0,4194304] 0
2026-03-09T00:04:14.097 INFO:tasks.workunit.client.0.vm03.stdout:3/615: creat d2/db/d3b/d5d/fba x:0 0 0
2026-03-09T00:04:14.097 INFO:tasks.workunit.client.0.vm03.stdout:3/616: fsync d2/db/d40/d51/f57 0
2026-03-09T00:04:14.099 INFO:tasks.workunit.client.0.vm03.stdout:3/617: creat d2/db/d3b/d5f/da5/d72/d96/fbb x:0 0 0
2026-03-09T00:04:14.119 INFO:tasks.workunit.client.0.vm03.stdout:1/855: dwrite d4/d3a/d61/d78/f8e [0,4194304] 0
2026-03-09T00:04:14.125 INFO:tasks.workunit.client.0.vm03.stdout:2/805: dread d8/d1b/d2a/f4c [0,4194304] 0
2026-03-09T00:04:14.128 INFO:tasks.workunit.client.1.vm06.stdout:3/979: dwrite d11/d28/d2e/d2f/f74 [0,4194304] 0
2026-03-09T00:04:14.137 INFO:tasks.workunit.client.0.vm03.stdout:5/790: dwrite d1c/d20/d55/f9b [0,4194304] 0
2026-03-09T00:04:14.145 INFO:tasks.workunit.client.0.vm03.stdout:5/791: dread d1c/d20/d55/f5a [0,4194304] 0
2026-03-09T00:04:14.145 INFO:tasks.workunit.client.1.vm06.stdout:9/903: dwrite d1/d4/d6e/d9/f87 [0,4194304] 0
2026-03-09T00:04:14.148 INFO:tasks.workunit.client.1.vm06.stdout:6/978: dwrite d4/d27/d3e/f11b [0,4194304] 0
2026-03-09T00:04:14.148 INFO:tasks.workunit.client.0.vm03.stdout:8/766: dwrite d7/df/d1a/d2b/f9f [0,4194304] 0
2026-03-09T00:04:14.148 INFO:tasks.workunit.client.0.vm03.stdout:0/796: dwrite d2/da/d1a/fc4 [0,4194304] 0
2026-03-09T00:04:14.150 INFO:tasks.workunit.client.0.vm03.stdout:9/716: dwrite d15/d1c/d21/d54/dab/fe3 [0,4194304] 0
2026-03-09T00:04:14.156 INFO:tasks.workunit.client.1.vm06.stdout:6/979: dread d4/fc [0,4194304] 0
2026-03-09T00:04:14.164 INFO:tasks.workunit.client.0.vm03.stdout:1/856: creat d4/d15/d77/d8c/f11f x:0 0 0
2026-03-09T00:04:14.164 INFO:tasks.workunit.client.0.vm03.stdout:1/857: write d4/d3a/d3d/d98/dee/d93/fbe [610015,120380] 0
2026-03-09T00:04:14.178 INFO:tasks.workunit.client.1.vm06.stdout:3/980: truncate d11/d28/d2e/d2f/fec 714024 0
2026-03-09T00:04:14.185 INFO:tasks.workunit.client.0.vm03.stdout:2/806: rename d8/d26/d5e/d6f/d97/f75 to d8/d26/f109 0
2026-03-09T00:04:14.192 INFO:tasks.workunit.client.1.vm06.stdout:1/878: sync
2026-03-09T00:04:14.197 INFO:tasks.workunit.client.0.vm03.stdout:6/742: dwrite d13/d35/d74/fc5 [0,4194304] 0
2026-03-09T00:04:14.206 INFO:tasks.workunit.client.0.vm03.stdout:1/858: dwrite d4/d3a/d3d/d46/df7/ffa [0,4194304] 0
2026-03-09T00:04:14.224 INFO:tasks.workunit.client.0.vm03.stdout:8/767: dwrite d7/df/d1a/d40/f5e [0,4194304] 0
2026-03-09T00:04:14.237 INFO:tasks.workunit.client.0.vm03.stdout:8/768: dread d7/df/d1a/d40/db3/dba/d38/d91/fa5 [0,4194304] 0
2026-03-09T00:04:14.251 INFO:tasks.workunit.client.1.vm06.stdout:7/961: getdents d0/df/d1a/d27/d70 0
2026-03-09T00:04:14.255 INFO:tasks.workunit.client.0.vm03.stdout:0/797: link d2/fe d2/da/d4e/f120 0
2026-03-09T00:04:14.256 INFO:tasks.workunit.client.0.vm03.stdout:0/798: chown d2/da/dd/d49/d6c/da6/dda/db5 13013 1
2026-03-09T00:04:14.256 INFO:tasks.workunit.client.0.vm03.stdout:0/799: truncate d2/da/dd/d49/d6c/f41 4235376 0
2026-03-09T00:04:14.270 INFO:tasks.workunit.client.0.vm03.stdout:1/859: dwrite d4/d3a/d3d/d98/dee/d93/ffc [0,4194304] 0
2026-03-09T00:04:14.270 INFO:tasks.workunit.client.0.vm03.stdout:9/717: symlink d15/d1c/d21/d54/d87/d93/lee 0
2026-03-09T00:04:14.270 INFO:tasks.workunit.client.1.vm06.stdout:6/980: mkdir d4/d16/d53/ddf/d7e/dac/dd3/d12b 0
2026-03-09T00:04:14.270 INFO:tasks.workunit.client.1.vm06.stdout:6/981: read d4/f36 [7778663,50611] 0
2026-03-09T00:04:14.270 INFO:tasks.workunit.client.1.vm06.stdout:6/982: write d4/d16/d53/ddf/d52/f6c [4187856,2441] 0
2026-03-09T00:04:14.270 INFO:tasks.workunit.client.1.vm06.stdout:6/983: write d4/d16/d53/ddf/d4b/fa9 [1813780,114232] 0
2026-03-09T00:04:14.270 INFO:tasks.workunit.client.1.vm06.stdout:6/984: chown d4/d16 4432 1
2026-03-09T00:04:14.276 INFO:tasks.workunit.client.1.vm06.stdout:3/981: mkdir d11/d28/d4d/d14c 0
2026-03-09T00:04:14.280 INFO:tasks.workunit.client.0.vm03.stdout:2/807: symlink d8/d1b/d2a/d56/l10a 0
2026-03-09T00:04:14.280 INFO:tasks.workunit.client.0.vm03.stdout:2/808: chown d8/d1b/d2a/d6b/d50/d8a 7 1
2026-03-09T00:04:14.280 INFO:tasks.workunit.client.0.vm03.stdout:2/809: truncate d8/d1b/f30 5073120 0
2026-03-09T00:04:14.281 INFO:tasks.workunit.client.1.vm06.stdout:7/962: dread d0/df/d1a/d27/d4c/d40/d51/d86/fc3 [0,4194304] 0
2026-03-09T00:04:14.281 INFO:tasks.workunit.client.1.vm06.stdout:7/963: dread - d0/df/d1a/d27/d4c/f6d zero size
2026-03-09T00:04:14.287 INFO:tasks.workunit.client.1.vm06.stdout:1/879: mkdir d6/dc4/d128 0
2026-03-09T00:04:14.296 INFO:tasks.workunit.client.1.vm06.stdout:7/964: dread d0/df/d1a/d35/f61 [0,4194304] 0
2026-03-09T00:04:14.296 INFO:tasks.workunit.client.1.vm06.stdout:7/965: readlink d0/df/d1a/d27/d70/d9b/de2/d109/l10c 0
2026-03-09T00:04:14.312 INFO:tasks.workunit.client.1.vm06.stdout:6/985: creat d4/d16/d53/ddf/da6/f12c x:0 0 0
2026-03-09T00:04:14.312 INFO:tasks.workunit.client.1.vm06.stdout:6/986: stat d4/d16/d53/d67 0
2026-03-09T00:04:14.312 INFO:tasks.workunit.client.1.vm06.stdout:6/987: read - d4/d16/d53/d67/f107 zero size
2026-03-09T00:04:14.314 INFO:tasks.workunit.client.0.vm03.stdout:6/743: creat d13/d35/d74/d89/ff8 x:0 0 0
2026-03-09T00:04:14.317 INFO:tasks.workunit.client.1.vm06.stdout:1/880: dwrite d6/d4c/d79/fc2 [0,4194304] 0
2026-03-09T00:04:14.317 INFO:tasks.workunit.client.1.vm06.stdout:1/881: fdatasync d6/f28 0
2026-03-09T00:04:14.317 INFO:tasks.workunit.client.1.vm06.stdout:1/882: dread - d6/d8f/f117 zero size
2026-03-09T00:04:14.317 INFO:tasks.workunit.client.1.vm06.stdout:3/982: mknod d11/d28/d2e/d2f/d36/d8f/c14d 0
2026-03-09T00:04:14.322 INFO:tasks.workunit.client.1.vm06.stdout:7/966: link d0/df/d1a/d27/d4c/d40/l10d d0/df/d7b/l121 0
2026-03-09T00:04:14.327 INFO:tasks.workunit.client.1.vm06.stdout:6/988: mkdir d4/d8d/d12d 0
2026-03-09T00:04:14.335 INFO:tasks.workunit.client.0.vm03.stdout:8/769: creat d7/df/d1a/d40/db3/dba/d38/feb x:0 0 0
2026-03-09T00:04:14.349 INFO:tasks.workunit.client.1.vm06.stdout:3/983: readlink d11/d3f/l45 0
2026-03-09T00:04:14.351 INFO:tasks.workunit.client.0.vm03.stdout:0/800: mknod d2/da/dd/d49/d6c/da6/dda/c121 0
2026-03-09T00:04:14.355 INFO:tasks.workunit.client.1.vm06.stdout:6/989: creat d4/d16/d53/ddf/d7e/dac/dd3/f12e x:0 0 0
2026-03-09T00:04:14.357 INFO:tasks.workunit.client.1.vm06.stdout:3/984: mknod d11/d28/d2e/c14e 0
2026-03-09T00:04:14.357 INFO:tasks.workunit.client.0.vm03.stdout:1/860: symlink d4/d3a/d61/d78/dd8/l120 0
2026-03-09T00:04:14.371 INFO:tasks.workunit.client.1.vm06.stdout:6/990: dread d4/d27/d3e/d78/f92 [0,4194304] 0
2026-03-09T00:04:14.375 INFO:tasks.workunit.client.0.vm03.stdout:8/770: dwrite d7/df/d1a/d40/db3/dba/d38/f3e [0,4194304] 0
2026-03-09T00:04:14.384 INFO:tasks.workunit.client.0.vm03.stdout:9/718: creat d15/d1c/d28/de1/ded/fef x:0 0 0
2026-03-09T00:04:14.388 INFO:tasks.workunit.client.0.vm03.stdout:8/771: creat d7/df/d1a/d40/d58/fec x:0 0 0
2026-03-09T00:04:14.390 INFO:tasks.workunit.client.0.vm03.stdout:9/719: mknod d15/d1c/d36/d4d/dc4/dec/cf0 0
2026-03-09T00:04:14.390 INFO:tasks.workunit.client.0.vm03.stdout:9/720: chown d15/d1c/d36/d4d/cd8 88 1
2026-03-09T00:04:14.392 INFO:tasks.workunit.client.0.vm03.stdout:2/810: truncate d8/d1b/d24/f38 1048277 0
2026-03-09T00:04:14.397 INFO:tasks.workunit.client.0.vm03.stdout:2/811: write d8/d1b/d2a/d6b/d50/d8a/fc3 [848259,57823] 0
2026-03-09T00:04:14.397 INFO:tasks.workunit.client.0.vm03.stdout:9/721: truncate d15/d1c/d21/f61 1025703 0
2026-03-09T00:04:14.397 INFO:tasks.workunit.client.0.vm03.stdout:9/722: read - d15/d1c/d28/d6e/fd4 zero size
2026-03-09T00:04:14.410 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:14 vm03.local ceph-mon[52346]: pgmap v10: 65 pgs: 65 active+clean; 3.5 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 103 MiB/s rd, 127 MiB/s wr, 199 op/s
2026-03-09T00:04:14.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:14 vm06.local ceph-mon[58395]: pgmap v10: 65 pgs: 65 active+clean; 3.5 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 103 MiB/s rd, 127 MiB/s wr, 199 op/s
2026-03-09T00:04:14.443 INFO:tasks.workunit.client.1.vm06.stdout:6/991: dwrite d4/d16/d53/ddf/d4b/ddb/f117 [0,4194304] 0
2026-03-09T00:04:14.454 INFO:tasks.workunit.client.0.vm03.stdout:7/698: sync
2026-03-09T00:04:14.454 INFO:tasks.workunit.client.0.vm03.stdout:7/699: chown d2/d1f/d3a/d24/da4/d46/d81/d96/d8e/db2/lcc 6 1
2026-03-09T00:04:14.456 INFO:tasks.workunit.client.0.vm03.stdout:8/772: dwrite d7/df/d1a/d40/db3/dba/d3f/f47 [0,4194304] 0
2026-03-09T00:04:14.456 INFO:tasks.workunit.client.0.vm03.stdout:9/723: dwrite d15/d1c/d21/d54/f80 [0,4194304] 0
2026-03-09T00:04:14.456 INFO:tasks.workunit.client.0.vm03.stdout:9/724: dread - d15/d1c/d36/fe4 zero size
2026-03-09T00:04:14.464 INFO:tasks.workunit.client.1.vm06.stdout:9/904: link d1/d3/d50/ld4 d1/d73/l12d 0
2026-03-09T00:04:14.464 INFO:tasks.workunit.client.1.vm06.stdout:9/905: stat d1/d3/d50/fba 0
2026-03-09T00:04:14.465 INFO:tasks.workunit.client.1.vm06.stdout:9/906: mkdir d1/d3/d4f/d91/de8/d12e 0
2026-03-09T00:04:14.465 INFO:tasks.workunit.client.1.vm06.stdout:9/907: dread - d1/d3/d4f/fbd zero size
2026-03-09T00:04:14.465 INFO:tasks.workunit.client.1.vm06.stdout:9/908: chown d1/d4/d11f/c105 54 1
2026-03-09T00:04:14.471 INFO:tasks.workunit.client.1.vm06.stdout:9/909: dread d1/d3/d4f/d91/dae/f118 [0,4194304] 0
2026-03-09T00:04:14.513 INFO:tasks.workunit.client.1.vm06.stdout:9/910: dwrite d1/d4/f9c [0,4194304] 0
2026-03-09T00:04:14.515 INFO:tasks.workunit.client.1.vm06.stdout:9/911: mknod d1/d3/d4f/d91/dae/de9/d126/c12f 0
2026-03-09T00:04:14.519 INFO:tasks.workunit.client.1.vm06.stdout:9/912: write d1/d73/fc2 [3583830,114013] 0
2026-03-09T00:04:14.519 INFO:tasks.workunit.client.1.vm06.stdout:9/913: fdatasync d1/d3/f5c 0
2026-03-09T00:04:14.519 INFO:tasks.workunit.client.1.vm06.stdout:9/914: fdatasync d1/d4/d6e/d9/f8a 0
2026-03-09T00:04:14.554 INFO:tasks.workunit.client.1.vm06.stdout:9/915: dwrite d1/d3/d4f/d91/d94/f95 [0,4194304] 0
2026-03-09T00:04:14.557 INFO:tasks.workunit.client.1.vm06.stdout:9/916: mkdir d1/d3/d4f/d91/d94/ddf/d130 0
2026-03-09T00:04:14.561 INFO:tasks.workunit.client.1.vm06.stdout:9/917: dread d1/d4/d6e/d9/f82 [0,4194304] 0
2026-03-09T00:04:14.565 INFO:tasks.workunit.client.1.vm06.stdout:9/918: dread d1/d4/d11f/d25/d85/f28 [0,4194304] 0
2026-03-09T00:04:14.604 INFO:tasks.workunit.client.1.vm06.stdout:9/919: dwrite d1/d3/d4f/d91/d94/ff6 [0,4194304] 0
2026-03-09T00:04:14.604 INFO:tasks.workunit.client.1.vm06.stdout:9/920: creat d1/d3/d4f/d91/dae/de9/d126/f131 x:0 0 0
2026-03-09T00:04:14.606 INFO:tasks.workunit.client.0.vm03.stdout:4/872: sync
2026-03-09T00:04:14.607 INFO:tasks.workunit.client.0.vm03.stdout:4/873: creat d7/d20/d6a/dea/d54/d58/d85/f112 x:0 0 0
2026-03-09T00:04:14.607 INFO:tasks.workunit.client.0.vm03.stdout:4/874: truncate d7/d20/d6a/dde/f104 905820 0
2026-03-09T00:04:14.640 INFO:tasks.workunit.client.0.vm03.stdout:4/875: dwrite d7/d27/f89 [0,4194304] 0
2026-03-09T00:04:14.640 INFO:tasks.workunit.client.0.vm03.stdout:4/876: readlink d7/d20/d6a/dea/d38/dfb/d10c/lb9 0
2026-03-09T00:04:14.641 INFO:tasks.workunit.client.0.vm03.stdout:4/877: creat d7/d20/d6a/dea/d38/dfb/d10c/f113 x:0 0 0
2026-03-09T00:04:14.641 INFO:tasks.workunit.client.0.vm03.stdout:4/878: creat d7/d20/d6a/dea/d78/f114 x:0 0 0
2026-03-09T00:04:14.642 INFO:tasks.workunit.client.0.vm03.stdout:4/879: creat d7/d20/d6a/dea/d38/da9/ddc/f115 x:0 0 0
2026-03-09T00:04:14.642 INFO:tasks.workunit.client.0.vm03.stdout:4/880: readlink d7/d20/d6a/dea/l4d 0
2026-03-09T00:04:14.642 INFO:tasks.workunit.client.0.vm03.stdout:4/881: creat d7/d20/d6a/dea/d38/f116 x:0 0 0
2026-03-09T00:04:14.643 INFO:tasks.workunit.client.0.vm03.stdout:4/882: mkdir d7/de6/d117 0
2026-03-09T00:04:14.643 INFO:tasks.workunit.client.0.vm03.stdout:4/883: mknod d7/d20/d6a/dea/d38/c118 0
2026-03-09T00:04:14.693 INFO:tasks.workunit.client.0.vm03.stdout:5/792: sync
2026-03-09T00:04:14.693 INFO:tasks.workunit.client.0.vm03.stdout:3/618: sync
2026-03-09T00:04:14.693 INFO:tasks.workunit.client.0.vm03.stdout:3/619: fdatasync d2/db/f1a 0
2026-03-09T00:04:14.701 INFO:tasks.workunit.client.0.vm03.stdout:3/620: mknod d2/db/d3b/d5f/da5/cbc 0
2026-03-09T00:04:14.701 INFO:tasks.workunit.client.0.vm03.stdout:3/621: chown d2/db/d3b/d5f/da5/d72/d96/fbb 406457410 1
2026-03-09T00:04:14.705 INFO:tasks.workunit.client.0.vm03.stdout:3/622: mkdir d2/db/d3b/d5f/da5/d72/dbd 0
2026-03-09T00:04:14.706 INFO:tasks.workunit.client.0.vm03.stdout:3/623: mknod d2/db/d3b/d3f/daf/cbe 0
2026-03-09T00:04:14.715 INFO:tasks.workunit.client.0.vm03.stdout:5/793: dread d1c/d20/d55/d4f/d58/db5/f6f [4194304,4194304] 0
2026-03-09T00:04:14.720 INFO:tasks.workunit.client.1.vm06.stdout:1/883: write d6/d21/d2d/d37/fb5 [3029070,45293] 0
2026-03-09T00:04:14.878 INFO:tasks.workunit.client.0.vm03.stdout:0/801: creat d2/da/d76/d8a/f122 x:0 0 0
2026-03-09T00:04:14.912 INFO:tasks.workunit.client.0.vm03.stdout:0/802: dwrite d2/da/dd/d49/d6c/d4b/f100 [0,4194304] 0
2026-03-09T00:04:14.922 INFO:tasks.workunit.client.0.vm03.stdout:0/803: symlink d2/da/d76/d8a/d8f/l123 0
2026-03-09T00:04:14.922 INFO:tasks.workunit.client.0.vm03.stdout:0/804: symlink d2/da/d76/l124 0
2026-03-09T00:04:14.946 INFO:tasks.workunit.client.0.vm03.stdout:8/773: rmdir d7/df/d1a/d40/de0 0
2026-03-09T00:04:14.949 INFO:tasks.workunit.client.0.vm03.stdout:8/774: truncate d7/f25 2614064 0
2026-03-09T00:04:14.949 INFO:tasks.workunit.client.0.vm03.stdout:8/775: truncate d7/df/d1a/d40/db3/dba/d38/d60/dcd/fdd 2161894 0
2026-03-09T00:04:14.949 INFO:tasks.workunit.client.0.vm03.stdout:8/776: write d7/df/d1a/d2b/f77 [4525972,33483] 0
2026-03-09T00:04:14.949 INFO:tasks.workunit.client.0.vm03.stdout:8/777: truncate d7/df/d1a/d40/db3/dba/dc3/fd3 788922 0
2026-03-09T00:04:14.953 INFO:tasks.workunit.client.1.vm06.stdout:6/992: rename d4/d16/d53/ddf/d4b/fba to d4/f12f 0
2026-03-09T00:04:14.953 INFO:tasks.workunit.client.1.vm06.stdout:6/993: creat d4/d27/f130 x:0 0 0
2026-03-09T00:04:14.954 INFO:tasks.workunit.client.0.vm03.stdout:8/778: rmdir d7/df/d1a/d40/db3/dba/d38/d60/dcd 39
2026-03-09T00:04:14.954 INFO:tasks.workunit.client.0.vm03.stdout:8/779: readlink d7/df/d1a/d40/db3/dba/d38/d60/l7b 0
2026-03-09T00:04:14.955 INFO:tasks.workunit.client.1.vm06.stdout:6/994: symlink d4/d16/d53/ddf/d7e/dac/l131 0
2026-03-09T00:04:14.955 INFO:tasks.workunit.client.1.vm06.stdout:9/921: rename d1/d3/f11 to d1/da7/dfc/f132 0
2026-03-09T00:04:14.955 INFO:tasks.workunit.client.1.vm06.stdout:9/922: chown d1/d4/d2f/cf8 0 1
2026-03-09T00:04:14.955 INFO:tasks.workunit.client.1.vm06.stdout:9/923: chown d1/d4/fe 4 1
2026-03-09T00:04:14.955 INFO:tasks.workunit.client.1.vm06.stdout:9/924: fsync d1/d3/d4f/d91/f10b 0
2026-03-09T00:04:14.955 INFO:tasks.workunit.client.1.vm06.stdout:6/995: symlink d4/d16/d53/ddf/d7e/daa/l132 0
2026-03-09T00:04:14.955 INFO:tasks.workunit.client.1.vm06.stdout:6/996: write d4/d27/d3e/f41 [5580841,107465] 0
2026-03-09T00:04:14.955 INFO:tasks.workunit.client.1.vm06.stdout:6/997: creat d4/d27/f133 x:0 0 0
2026-03-09T00:04:14.956 INFO:tasks.workunit.client.1.vm06.stdout:9/925: creat d1/d4/d11f/d25/f133 x:0 0 0
2026-03-09T00:04:14.959 INFO:tasks.workunit.client.1.vm06.stdout:6/998: symlink d4/d16/d53/ddf/d52/l134 0
2026-03-09T00:04:14.960 INFO:tasks.workunit.client.1.vm06.stdout:6/999: symlink d4/d16/d53/ddf/d7e/dac/dd3/d101/l135 0
2026-03-09T00:04:14.960 INFO:tasks.workunit.client.1.vm06.stdout:1/884: rename d6/d4c/d51 to d6/d21/d2d/d3b/d42/d129 0
2026-03-09T00:04:14.961 INFO:tasks.workunit.client.1.vm06.stdout:1/885: unlink d6/d21/d2d/d3b/c46 0
2026-03-09T00:04:14.963 INFO:tasks.workunit.client.1.vm06.stdout:1/886: rename d6/d4c/c6b to d6/d21/c12a 0
2026-03-09T00:04:14.964 INFO:tasks.workunit.client.1.vm06.stdout:1/887: truncate d6/d21/d2d/d3b/d42/f9a 1079028 0
2026-03-09T00:04:14.965 INFO:tasks.workunit.client.1.vm06.stdout:1/888: fdatasync d6/d21/d2d/d3b/d42/fb4 0
2026-03-09T00:04:14.965 INFO:tasks.workunit.client.1.vm06.stdout:1/889: chown d6/d4c/f90 1401178 1
2026-03-09T00:04:14.965 INFO:tasks.workunit.client.1.vm06.stdout:1/890: chown d6/d21/d2d/d3b/d87/d9d/dd8/fe6 2980 1
2026-03-09T00:04:14.967 INFO:tasks.workunit.client.0.vm03.stdout:8/780: read d7/df/d1a/d40/db3/dba/d38/f3e [1386887,23426] 0
2026-03-09T00:04:14.967 INFO:tasks.workunit.client.0.vm03.stdout:8/781: fdatasync f3 0
2026-03-09T00:04:14.970 INFO:tasks.workunit.client.1.vm06.stdout:9/926: dread d1/d4/f39 [0,4194304] 0
2026-03-09T00:04:14.971 INFO:tasks.workunit.client.0.vm03.stdout:8/782: creat d7/df/d1a/d40/d9d/da3/dd2/fed x:0 0 0
2026-03-09T00:04:14.973 INFO:tasks.workunit.client.0.vm03.stdout:8/783: dread d7/df/d1a/d40/d58/f8c [0,4194304] 0
2026-03-09T00:04:14.981 INFO:tasks.workunit.client.0.vm03.stdout:8/784: creat d7/df/fee x:0 0 0
2026-03-09T00:04:14.982 INFO:tasks.workunit.client.0.vm03.stdout:8/785: symlink d7/df/d1a/d40/db3/dba/dad/lef 0
2026-03-09T00:04:14.982 INFO:tasks.workunit.client.0.vm03.stdout:8/786: mkdir d7/df/d1a/d40/d9d/da3/df0 0
2026-03-09T00:04:14.982 INFO:tasks.workunit.client.0.vm03.stdout:8/787: chown d7/df/d1a/d40/db3/dba/dc3/fe3 498213598 1
2026-03-09T00:04:14.985 INFO:tasks.workunit.client.0.vm03.stdout:8/788: mkdir d7/df/d1a/d40/db3/dba/d3f/df1 0
2026-03-09T00:04:15.018 INFO:tasks.workunit.client.1.vm06.stdout:1/891: dwrite d6/d4c/feb [0,4194304] 0
2026-03-09T00:04:15.019 INFO:tasks.workunit.client.1.vm06.stdout:1/892: mkdir d6/d21/d2d/d3b/d87/d12b 0
2026-03-09T00:04:15.022 INFO:tasks.workunit.client.1.vm06.stdout:1/893: creat d6/d21/d2d/d3b/d42/d129/f12c x:0 0 0
2026-03-09T00:04:15.033 INFO:tasks.workunit.client.0.vm03.stdout:8/789: dwrite d7/df/d1a/d40/db3/dba/d38/d4c/f97 [0,4194304] 0
2026-03-09T00:04:15.034 INFO:tasks.workunit.client.0.vm03.stdout:6/744: rename d13/f6f to d13/d35/d74/d89/db3/ff9 0
2026-03-09T00:04:15.034 INFO:tasks.workunit.client.0.vm03.stdout:6/745: fdatasync d13/f17 0
2026-03-09T00:04:15.039 INFO:tasks.workunit.client.0.vm03.stdout:2/812: rename d8/d74/ldc to d8/d1b/d24/da5/dfe/l10b 0
2026-03-09T00:04:15.039 INFO:tasks.workunit.client.0.vm03.stdout:2/813: truncate d8/d1b/d2a/d6b/d50/f54 2630518 0
2026-03-09T00:04:15.042 INFO:tasks.workunit.client.0.vm03.stdout:7/700: rename d2/d4/db7/d67/c68 to d2/d4/d1e/d78/cd1 0
2026-03-09T00:04:15.048 INFO:tasks.workunit.client.0.vm03.stdout:8/790: write d7/df/d1a/d40/d58/fe1 [3373445,92853] 0
2026-03-09T00:04:15.048 INFO:tasks.workunit.client.0.vm03.stdout:8/791: stat d7/df/f55 0
2026-03-09T00:04:15.058 INFO:tasks.workunit.client.1.vm06.stdout:7/967: unlink d0/l7c 0
2026-03-09T00:04:15.058 INFO:tasks.workunit.client.0.vm03.stdout:7/701: rename d2/d4/d1e/d85 to d2/d1f/d3a/d24/da4/d46/d54/d8d/dd2 0
2026-03-09T00:04:15.059 INFO:tasks.workunit.client.0.vm03.stdout:7/702: fdatasync d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/d9c/f49 0
2026-03-09T00:04:15.059 INFO:tasks.workunit.client.0.vm03.stdout:7/703: creat d2/d4/d8c/fd3 x:0 0 0
2026-03-09T00:04:15.059 INFO:tasks.workunit.client.0.vm03.stdout:7/704: chown d2/d1f/d3a/d24/da4/d46/d81/d96/d37/fb0 677 1
2026-03-09T00:04:15.060 INFO:tasks.workunit.client.1.vm06.stdout:7/968: symlink d0/df/d17/l122 0
2026-03-09T00:04:15.062 INFO:tasks.workunit.client.0.vm03.stdout:7/705: symlink d2/d1f/d3a/d24/da4/d46/d81/d96/d80/ld4 0
2026-03-09T00:04:15.063 INFO:tasks.workunit.client.1.vm06.stdout:7/969: dread d0/df/d1a/d27/f66 [0,4194304] 0
2026-03-09T00:04:15.063 INFO:tasks.workunit.client.0.vm03.stdout:9/725: rename d15/d1c/d21/f46 to d15/d1c/d21/d54/d87/d93/ff1 0
2026-03-09T00:04:15.065 INFO:tasks.workunit.client.0.vm03.stdout:7/706: link d2/d4/d1e/fa8 d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/fd5 0
2026-03-09T00:04:15.072 INFO:tasks.workunit.client.1.vm06.stdout:7/970: mkdir d0/d123 0
2026-03-09T00:04:15.072 INFO:tasks.workunit.client.1.vm06.stdout:7/971: chown d0/df/d1a 577610 1
2026-03-09T00:04:15.072 INFO:tasks.workunit.client.1.vm06.stdout:7/972: chown d0/df/d7b/dd2 1805192 1
2026-03-09T00:04:15.072 INFO:tasks.workunit.client.1.vm06.stdout:7/973: write d0/df/d1a/d3f/d53/fc6 [529691,87756] 0
2026-03-09T00:04:15.076 INFO:tasks.workunit.client.0.vm03.stdout:7/707: read d2/d1f/d3a/f1a [1995193,14781] 0
2026-03-09T00:04:15.076 INFO:tasks.workunit.client.0.vm03.stdout:7/708: write d2/d1f/d3a/f1a [1597899,104292] 0
2026-03-09T00:04:15.076 INFO:tasks.workunit.client.0.vm03.stdout:9/726: link f11 d15/d1c/d28/d6e/da2/ff2 0
2026-03-09T00:04:15.100 INFO:tasks.workunit.client.0.vm03.stdout:3/624: rename d2/db/d40/d44/d68/d99 to d2/dbf 0
2026-03-09T00:04:15.100 INFO:tasks.workunit.client.0.vm03.stdout:3/625: creat d2/db/d3b/d5d/fc0 x:0 0 0
2026-03-09T00:04:15.100 INFO:tasks.workunit.client.0.vm03.stdout:3/626: chown d2/db/d3b/d3f/db8 12200835 1
2026-03-09T00:04:15.100 INFO:tasks.workunit.client.0.vm03.stdout:3/627: truncate d2/db/d3b/f6c 512776 0
2026-03-09T00:04:15.100 INFO:tasks.workunit.client.0.vm03.stdout:9/727: getdents d15/d1c/d21/d54 0
2026-03-09T00:04:15.100 INFO:tasks.workunit.client.0.vm03.stdout:7/709: symlink d2/d1f/d3a/d24/da4/d46/d81/d96/d80/ld6 0
2026-03-09T00:04:15.104 INFO:tasks.workunit.client.0.vm03.stdout:3/628: dread d2/db/d3b/d3f/f7c [0,4194304] 0
2026-03-09T00:04:15.110 INFO:tasks.workunit.client.0.vm03.stdout:2/814: dwrite d8/d1b/d2a/d6b/d50/d8a/fc0 [0,4194304] 0
2026-03-09T00:04:15.113 INFO:tasks.workunit.client.1.vm06.stdout:7/974: dwrite d0/fe [0,4194304] 0
2026-03-09T00:04:15.116 INFO:tasks.workunit.client.0.vm03.stdout:1/861: creat d4/d3a/d3d/d98/f121 x:0 0 0
2026-03-09T00:04:15.125 INFO:tasks.workunit.client.0.vm03.stdout:6/746: rename d13/d1e/f30 to d13/d1e/d44/ffa 0
2026-03-09T00:04:15.129 INFO:tasks.workunit.client.0.vm03.stdout:9/728: mkdir d15/d1c/d21/df3 0
2026-03-09T00:04:15.129 INFO:tasks.workunit.client.0.vm03.stdout:9/729: readlink d15/d1c/d28/l42 0
2026-03-09T00:04:15.132 INFO:tasks.workunit.client.0.vm03.stdout:7/710: creat d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/d9c/fd7 x:0 0 0
2026-03-09T00:04:15.132 INFO:tasks.workunit.client.1.vm06.stdout:7/975: creat d0/df/d1a/d27/d70/d9b/de2/f124 x:0 0 0
2026-03-09T00:04:15.132 INFO:tasks.workunit.client.1.vm06.stdout:7/976: creat d0/df/d1a/d3f/d53/f125 x:0 0 0
2026-03-09T00:04:15.132 INFO:tasks.workunit.client.1.vm06.stdout:7/977: stat d0/df/d17/f1f 0
2026-03-09T00:04:15.139 INFO:tasks.workunit.client.0.vm03.stdout:5/794: dwrite d1c/d20/f25 [0,4194304] 0
2026-03-09T00:04:15.139 INFO:tasks.workunit.client.0.vm03.stdout:5/795: fsync f15 0
2026-03-09T00:04:15.141 INFO:tasks.workunit.client.0.vm03.stdout:3/629: symlink d2/db/d40/d51/lc1 0
2026-03-09T00:04:15.141 INFO:tasks.workunit.client.0.vm03.stdout:3/630: chown d2/db/d3b/d5f/da5/f6e 126 1
2026-03-09T00:04:15.151 INFO:tasks.workunit.client.0.vm03.stdout:6/747: mkdir d13/d1e/d44/d59/dec/d62/dfb 0
2026-03-09T00:04:15.151 INFO:tasks.workunit.client.0.vm03.stdout:6/748: fdatasync d13/f17 0
2026-03-09T00:04:15.153 INFO:tasks.workunit.client.0.vm03.stdout:9/730: unlink d15/d7f/c83 0
2026-03-09T00:04:15.153 INFO:tasks.workunit.client.0.vm03.stdout:9/731: chown d15/d1c/d21/d54/d87/d93/lc8 0 1
2026-03-09T00:04:15.153 INFO:tasks.workunit.client.0.vm03.stdout:9/732: fdatasync d15/d1c/d21/fdc 0
2026-03-09T00:04:15.154 INFO:tasks.workunit.client.1.vm06.stdout:1/894: truncate d6/d4c/feb 2268355 0
2026-03-09T00:04:15.158 INFO:tasks.workunit.client.1.vm06.stdout:7/978: symlink d0/df/d1a/d3a/l126 0
2026-03-09T00:04:15.158 INFO:tasks.workunit.client.1.vm06.stdout:7/979: read d0/df/d1a/d27/d70/d9b/f107 [37684,109653] 0
2026-03-09T00:04:15.162 INFO:tasks.workunit.client.0.vm03.stdout:5/796: creat d1c/d51/d6a/d75/df0/f100 x:0 0 0
2026-03-09T00:04:15.162 INFO:tasks.workunit.client.0.vm03.stdout:5/797: dread - d1c/d20/d55/d66/d70/fde zero size
2026-03-09T00:04:15.167 INFO:tasks.workunit.client.1.vm06.stdout:1/895: creat d6/d21/d2d/d37/dbc/f12d x:0 0 0
2026-03-09T00:04:15.169 INFO:tasks.workunit.client.0.vm03.stdout:5/798: dread d1c/d20/d55/f46 [4194304,4194304] 0
2026-03-09T00:04:15.169 INFO:tasks.workunit.client.0.vm03.stdout:5/799: fsync d1c/d20/d55/f7d 0
2026-03-09T00:04:15.171 INFO:tasks.workunit.client.1.vm06.stdout:1/896: creat d6/d8f/d10f/f12e x:0 0 0
2026-03-09T00:04:15.171 INFO:tasks.workunit.client.1.vm06.stdout:1/897: write d6/d21/fe7 [642299,20643] 0
2026-03-09T00:04:15.173 INFO:tasks.workunit.client.0.vm03.stdout:5/800: write d1c/d20/d55/d4f/d58/db5/df7/ff8 [2028460,38028] 0
2026-03-09T00:04:15.177 INFO:tasks.workunit.client.0.vm03.stdout:6/749: symlink d13/dc4/dea/lfc 0
2026-03-09T00:04:15.187 INFO:tasks.workunit.client.1.vm06.stdout:1/898: mknod d6/d21/d2d/d3b/d42/df0/d106/c12f 0
2026-03-09T00:04:15.190 INFO:tasks.workunit.client.1.vm06.stdout:1/899: symlink d6/d4c/d79/d10c/d11d/l130 0
2026-03-09T00:04:15.191 INFO:tasks.workunit.client.0.vm03.stdout:8/792: rename d7/df/d1a/d40/db3/dba to d7/df/d1a/d40/d9d/df2 0
2026-03-09T00:04:15.191 INFO:tasks.workunit.client.0.vm03.stdout:8/793: write d7/df/d1a/d40/d9d/df2/d3f/d95/fcb [912227,87023] 0
2026-03-09T00:04:15.191 INFO:tasks.workunit.client.0.vm03.stdout:6/750: mknod d13/d35/d72/cfd 0
2026-03-09T00:04:15.191 INFO:tasks.workunit.client.0.vm03.stdout:6/751: write f2 [4324999,124303] 0
2026-03-09T00:04:15.192 INFO:tasks.workunit.client.0.vm03.stdout:8/794: link d7/df/d1a/d2b/d62/fd5 d7/df/ff3 0
2026-03-09T00:04:15.195 INFO:tasks.workunit.client.0.vm03.stdout:2/815: rename d8/d1b/d2a/d56/f57 to d8/d26/d5e/d5f/d95/f10c 0
2026-03-09T00:04:15.195 INFO:tasks.workunit.client.0.vm03.stdout:2/816: readlink d8/d1b/d2a/d56/la2 0
2026-03-09T00:04:15.195 INFO:tasks.workunit.client.0.vm03.stdout:6/752: rmdir d13/dc4/dc9 0
2026-03-09T00:04:15.195 INFO:tasks.workunit.client.0.vm03.stdout:7/711: dwrite d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/fd5 [0,4194304] 0
2026-03-09T00:04:15.195 INFO:tasks.workunit.client.0.vm03.stdout:7/712: write d2/f73 [2304103,10135] 0
2026-03-09T00:04:15.195 INFO:tasks.workunit.client.0.vm03.stdout:7/713: chown d2/d1f/d3a/d24/da4/d46/d81/d96/d80 197 1
2026-03-09T00:04:15.196 INFO:tasks.workunit.client.0.vm03.stdout:8/795: rmdir d7/df/d1a/d40/d9d/df2/d3f 39
2026-03-09T00:04:15.196 INFO:tasks.workunit.client.0.vm03.stdout:8/796: creat d7/df/d1a/d2b/ff4 x:0 0 0
2026-03-09T00:04:15.196 INFO:tasks.workunit.client.0.vm03.stdout:2/817: mknod d8/d1b/d2a/c10d 0
2026-03-09T00:04:15.202 INFO:tasks.workunit.client.0.vm03.stdout:8/797: truncate d7/df/d1a/d40/f78 1937079 0
2026-03-09T00:04:15.202 INFO:tasks.workunit.client.0.vm03.stdout:8/798: mknod d7/df/d1a/d40/dd9/cf5 0
2026-03-09T00:04:15.202 INFO:tasks.workunit.client.0.vm03.stdout:8/799: write d7/df/d1a/f93 [177822,11880] 0
2026-03-09T00:04:15.202 INFO:tasks.workunit.client.0.vm03.stdout:8/800: write d7/df/d1a/d2b/f44 [5096712,118126] 0
2026-03-09T00:04:15.202 INFO:tasks.workunit.client.0.vm03.stdout:6/753: creat d13/d35/d69/dee/ffe x:0 0 0
2026-03-09T00:04:15.202 INFO:tasks.workunit.client.0.vm03.stdout:5/801: rename d1c/d20/d56/d74 to d1c/d20/d55/db0/dc7/d101 0
2026-03-09T00:04:15.218 INFO:tasks.workunit.client.0.vm03.stdout:7/714: dread d2/d1f/d3a/d24/da4/d46/d81/d96/d37/f56 [0,4194304] 0
2026-03-09T00:04:15.218 INFO:tasks.workunit.client.1.vm06.stdout:3/985: sync
2026-03-09T00:04:15.232 INFO:tasks.workunit.client.0.vm03.stdout:5/802: mknod d1c/d20/d55/dac/c102 0
2026-03-09T00:04:15.246 INFO:tasks.workunit.client.0.vm03.stdout:5/803: dread d1c/d20/d55/f9b [0,4194304] 0
2026-03-09T00:04:15.250 INFO:tasks.workunit.client.0.vm03.stdout:3/631: dwrite d2/db/d2d/f52 [0,4194304] 0
2026-03-09T00:04:15.256 INFO:tasks.workunit.client.0.vm03.stdout:4/884: sync
2026-03-09T00:04:15.256 INFO:tasks.workunit.client.0.vm03.stdout:0/805: sync
2026-03-09T00:04:15.256 INFO:tasks.workunit.client.0.vm03.stdout:0/806: fdatasync d2/da/dd/d49/d6c/f52 0
2026-03-09T00:04:15.257 INFO:tasks.workunit.client.1.vm06.stdout:3/986: dread d11/d28/d2e/db2/f116 [0,4194304] 0
2026-03-09T00:04:15.261 INFO:tasks.workunit.client.1.vm06.stdout:9/927: sync
2026-03-09T00:04:15.266 INFO:tasks.workunit.client.1.vm06.stdout:9/928: dread - d1/d3/d2b/fbb zero size
2026-03-09T00:04:15.268 INFO:tasks.workunit.client.0.vm03.stdout:0/807: read d2/da/dd/f38 [6904428,10708] 0
2026-03-09T00:04:15.268 INFO:tasks.workunit.client.0.vm03.stdout:0/808: fdatasync d2/f32 0
2026-03-09T00:04:15.268 INFO:tasks.workunit.client.0.vm03.stdout:5/804: rename f15 to d1c/d20/d55/d4f/d58/d73/f103 0
2026-03-09T00:04:15.271 INFO:tasks.workunit.client.1.vm06.stdout:9/929: symlink d1/d3/d4f/d91/d94/ddf/l134 0
2026-03-09T00:04:15.271 INFO:tasks.workunit.client.1.vm06.stdout:9/930: fdatasync d1/d4/d6e/d9/f4c 0
2026-03-09T00:04:15.279 INFO:tasks.workunit.client.0.vm03.stdout:2/818: dwrite d8/d1b/d2a/d2e/f9e [0,4194304] 0
2026-03-09T00:04:15.279 INFO:tasks.workunit.client.0.vm03.stdout:0/809: write d2/da/dd/d49/d6c/d4b/f88 [6918810,97034] 0
2026-03-09T00:04:15.281 INFO:tasks.workunit.client.0.vm03.stdout:8/801: dwrite d7/df/f55 [4194304,4194304] 0
2026-03-09T00:04:15.287 INFO:tasks.workunit.client.0.vm03.stdout:0/810: write d2/ff [4157434,21130] 0
2026-03-09T00:04:15.291 INFO:tasks.workunit.client.0.vm03.stdout:3/632: truncate d2/db/d3b/d5f/da5/f6e 380746 0
2026-03-09T00:04:15.291 INFO:tasks.workunit.client.0.vm03.stdout:3/633: chown d2/db/d2d/f45 0 1
2026-03-09T00:04:15.297 INFO:tasks.workunit.client.0.vm03.stdout:4/885: mknod d7/d20/d6a/dea/d38/c119 0
2026-03-09T00:04:15.299 INFO:tasks.workunit.client.0.vm03.stdout:4/886: getdents d7/d20/d6a/dea/df6 0
2026-03-09T00:04:15.299 INFO:tasks.workunit.client.0.vm03.stdout:4/887: readlink d7/d20/d6a/dea/d4e/l5c 0
2026-03-09T00:04:15.299 INFO:tasks.workunit.client.0.vm03.stdout:4/888: write d7/d20/d6a/dea/fe5 [420157,97011] 0
2026-03-09T00:04:15.299 INFO:tasks.workunit.client.0.vm03.stdout:5/805: truncate d1c/d51/d6a/d75/f77 707443 0
2026-03-09T00:04:15.299 INFO:tasks.workunit.client.0.vm03.stdout:5/806: readlink d1c/d20/l50 0
2026-03-09T00:04:15.301 INFO:tasks.workunit.client.0.vm03.stdout:2/819: symlink d8/d1b/d2a/d2e/l10e 0
2026-03-09T00:04:15.301 INFO:tasks.workunit.client.0.vm03.stdout:2/820: chown d8/d1b/d2a/d2e/cce 14661 1
2026-03-09T00:04:15.301 INFO:tasks.workunit.client.0.vm03.stdout:2/821: dread - d8/d26/d5e/db1/fd0 zero size
2026-03-09T00:04:15.302 INFO:tasks.workunit.client.0.vm03.stdout:8/802: creat d7/df/d1a/d40/d9d/da3/ff6 x:0 0 0
2026-03-09T00:04:15.305 INFO:tasks.workunit.client.0.vm03.stdout:0/811: symlink d2/da/d36/ddf/l125 0
2026-03-09T00:04:15.305 INFO:tasks.workunit.client.0.vm03.stdout:0/812: truncate d2/da/d76/d8a/f122 188083 0
2026-03-09T00:04:15.305 INFO:tasks.workunit.client.0.vm03.stdout:0/813: truncate d2/da/d1a/fd5 285953 0
2026-03-09T00:04:15.308 INFO:tasks.workunit.client.0.vm03.stdout:5/807: mknod d1c/d20/d55/d66/dc6/df1/c104 0
2026-03-09T00:04:15.308 INFO:tasks.workunit.client.0.vm03.stdout:5/808: truncate d1c/d20/d97/ded/ffb 820963 0
2026-03-09T00:04:15.308 INFO:tasks.workunit.client.0.vm03.stdout:5/809: fdatasync d1c/d20/d55/d66/d6b/de3/ff4 0
2026-03-09T00:04:15.324 INFO:tasks.workunit.client.1.vm06.stdout:7/980: sync
2026-03-09T00:04:15.326 INFO:tasks.workunit.client.1.vm06.stdout:7/981: mknod d0/df/d1a/d22/c127 0
2026-03-09T00:04:15.326 INFO:tasks.workunit.client.0.vm03.stdout:8/803: dread d7/df/d1a/d40/d9d/df2/f24 [0,4194304] 0
2026-03-09T00:04:15.329 INFO:tasks.workunit.client.1.vm06.stdout:9/931: dwrite d1/da7/fad [0,4194304] 0
2026-03-09T00:04:15.329 INFO:tasks.workunit.client.0.vm03.stdout:6/754: rmdir d13/d35/d69/dee 39
2026-03-09T00:04:15.329 INFO:tasks.workunit.client.0.vm03.stdout:6/755: read d13/f70 [1349400,101224] 0
2026-03-09T00:04:15.331 INFO:tasks.workunit.client.0.vm03.stdout:6/756: mkdir d13/d35/dff 0
2026-03-09T00:04:15.331 INFO:tasks.workunit.client.0.vm03.stdout:7/715: dwrite d2/d1f/d3a/f1a [0,4194304] 0
2026-03-09T00:04:15.335 INFO:tasks.workunit.client.0.vm03.stdout:8/804: rename d7/df/d1a/d2b/c82 to d7/cf7 0
2026-03-09T00:04:15.342 INFO:tasks.workunit.client.1.vm06.stdout:9/932: getdents d1/d73 0
2026-03-09T00:04:15.342 INFO:tasks.workunit.client.0.vm03.stdout:6/757: symlink d13/d35/d74/d89/l100 0
2026-03-09T00:04:15.346 INFO:tasks.workunit.client.0.vm03.stdout:8/805: dread d7/df/d1a/d40/d58/f57 [0,4194304] 0
2026-03-09T00:04:15.346 INFO:tasks.workunit.client.0.vm03.stdout:8/806: dread - d7/df/d1a/d40/d9d/df2/d38/f61 zero size
2026-03-09T00:04:15.348 INFO:tasks.workunit.client.1.vm06.stdout:9/933: write d1/d3/f23 [26329,57791] 0
2026-03-09T00:04:15.353 INFO:tasks.workunit.client.1.vm06.stdout:7/982: dread d0/df/d1a/d27/f43 [0,4194304] 0
2026-03-09T00:04:15.353 INFO:tasks.workunit.client.1.vm06.stdout:7/983: truncate d0/df/fb8 966502 0
2026-03-09T00:04:15.357 INFO:tasks.workunit.client.0.vm03.stdout:7/716: dread d2/d1f/d3a/f1a [0,4194304] 0
2026-03-09T00:04:15.362 INFO:tasks.workunit.client.1.vm06.stdout:9/934: rmdir d1/d3/d4f/d91/d94/ddf 39
2026-03-09T00:04:15.362 INFO:tasks.workunit.client.1.vm06.stdout:3/987: write d11/d28/d2e/dff/f123 [1433511,122293] 0
2026-03-09T00:04:15.367 INFO:tasks.workunit.client.0.vm03.stdout:7/717: dread d2/f73 [0,4194304] 0
2026-03-09T00:04:15.370 INFO:tasks.workunit.client.0.vm03.stdout:1/862: sync
2026-03-09T00:04:15.374 INFO:tasks.workunit.client.1.vm06.stdout:1/900: sync
2026-03-09T00:04:15.381 INFO:tasks.workunit.client.0.vm03.stdout:1/863: dread d4/d15/d77/f7c [4194304,4194304] 0
2026-03-09T00:04:15.384 INFO:tasks.workunit.client.0.vm03.stdout:1/864: truncate d4/f39 528042 0
2026-03-09T00:04:15.385 INFO:tasks.workunit.client.0.vm03.stdout:7/718: dread d2/d4/d1e/fae [0,4194304] 0
2026-03-09T00:04:15.385 INFO:tasks.workunit.client.0.vm03.stdout:7/719: truncate d2/d1f/d3a/d24/da4/d46/d81/d96/fbb 1823384 0
2026-03-09T00:04:15.387 INFO:tasks.workunit.client.0.vm03.stdout:5/810: dwrite d1c/d20/d55/d43/f53 [0,4194304] 0
2026-03-09T00:04:15.387 INFO:tasks.workunit.client.0.vm03.stdout:5/811: write ff [1859048,14977] 0
2026-03-09T00:04:15.399 INFO:tasks.workunit.client.0.vm03.stdout:0/814: dwrite d2/da/d1a/fd5 [0,4194304] 0
2026-03-09T00:04:15.402 INFO:tasks.workunit.client.0.vm03.stdout:0/815: stat d2/da/dd/d49/c72 0
2026-03-09T00:04:15.405 INFO:tasks.workunit.client.1.vm06.stdout:7/984: creat d0/df/d1a/d27/d4c/f128 x:0 0 0
2026-03-09T00:04:15.410 INFO:tasks.workunit.client.1.vm06.stdout:9/935: stat d1/f16 0
2026-03-09T00:04:15.412 INFO:tasks.workunit.client.1.vm06.stdout:7/985: mkdir d0/df/d1a/dec/d129 0
2026-03-09T00:04:15.412 INFO:tasks.workunit.client.1.vm06.stdout:7/986: fdatasync d0/df/d1a/d35/f106 0
2026-03-09T00:04:15.421 INFO:tasks.workunit.client.0.vm03.stdout:8/807: mkdir d7/df/d1a/d40/d9d/df2/d38/d91/df8 0
2026-03-09T00:04:15.423 INFO:tasks.workunit.client.0.vm03.stdout:6/758: creat d13/d1e/d44/d59/dec/d62/dfb/f101 x:0 0 0
2026-03-09T00:04:15.423 INFO:tasks.workunit.client.0.vm03.stdout:7/720: truncate d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/d9c/f74 2262621 0
2026-03-09T00:04:15.434 INFO:tasks.workunit.client.0.vm03.stdout:8/808: dread d7/f25 [0,4194304] 0
2026-03-09T00:04:15.434 INFO:tasks.workunit.client.0.vm03.stdout:8/809: read d7/df/d1a/f2e [90340,130679] 0
2026-03-09T00:04:15.434 INFO:tasks.workunit.client.0.vm03.stdout:8/810: fsync d7/df/d1a/d40/d58/fec 0
2026-03-09T00:04:15.436 INFO:tasks.workunit.client.0.vm03.stdout:8/811: creat d7/df/d1a/d40/d58/ff9 x:0 0 0
2026-03-09T00:04:15.436 INFO:tasks.workunit.client.0.vm03.stdout:8/812: fsync d7/df/ff3 0
2026-03-09T00:04:15.436 INFO:tasks.workunit.client.0.vm03.stdout:8/813: dread - d7/df/d1a/d40/d9d/da3/dd2/fed zero size
2026-03-09T00:04:15.436 INFO:tasks.workunit.client.0.vm03.stdout:8/814: dread - d7/df/d1a/d40/d58/fec zero size
2026-03-09T00:04:15.449 INFO:tasks.workunit.client.0.vm03.stdout:7/721: rename d2/f4d to d2/d4/db7/fd8 0
2026-03-09T00:04:15.449 INFO:tasks.workunit.client.0.vm03.stdout:7/722: chown d2/d4/f13 1020922 1
2026-03-09T00:04:15.486 INFO:tasks.workunit.client.0.vm03.stdout:4/889: dwrite d7/d20/d6a/dea/fe5 [0,4194304] 0
2026-03-09T00:04:15.495 INFO:tasks.workunit.client.0.vm03.stdout:4/890: truncate d7/d20/d35/f68 5963048 0
2026-03-09T00:04:15.504 INFO:tasks.workunit.client.0.vm03.stdout:4/891: dread - d7/d20/d6a/dea/ff0 zero size
2026-03-09T00:04:15.504 INFO:tasks.workunit.client.0.vm03.stdout:4/892: write d7/f71 [776141,35336] 0
2026-03-09T00:04:15.507 INFO:tasks.workunit.client.0.vm03.stdout:4/893: dread d7/f15 [0,4194304] 0
2026-03-09T00:04:15.507 INFO:tasks.workunit.client.0.vm03.stdout:4/894: chown d7/d20/d6a/dea/d78/ldd 4485 1
2026-03-09T00:04:15.508 INFO:tasks.workunit.client.0.vm03.stdout:4/895: creat d7/d20/d6a/dea/df6/f11a x:0 0 0
2026-03-09T00:04:15.508 INFO:tasks.workunit.client.0.vm03.stdout:4/896: chown d7/d20/d6a/d77/d25/de2/df1/f10d 1978 1
2026-03-09T00:04:15.508 INFO:tasks.workunit.client.0.vm03.stdout:4/897: chown d7/c24 607533 1
2026-03-09T00:04:15.513 INFO:tasks.workunit.client.0.vm03.stdout:4/898: write d7/d20/d6a/dea/d38/fd1 [329254,23424] 0
2026-03-09T00:04:15.518 INFO:tasks.workunit.client.0.vm03.stdout:4/899: creat d7/d20/d6a/d77/d25/de2/f11b x:0 0 0
2026-03-09T00:04:15.553 INFO:tasks.workunit.client.1.vm06.stdout:7/987: dwrite d0/df/d1a/d35/f61 [8388608,4194304] 0
2026-03-09T00:04:15.553 INFO:tasks.workunit.client.1.vm06.stdout:7/988: readlink d0/df/d1a/d27/d4c/l46 0
2026-03-09T00:04:15.553 INFO:tasks.workunit.client.1.vm06.stdout:7/989: chown d0/df/d1a/d3f/d53/f11c 925 1
2026-03-09T00:04:15.555 INFO:tasks.workunit.client.1.vm06.stdout:7/990: unlink d0/df/d1a/d22/l2e 0
2026-03-09T00:04:15.555 INFO:tasks.workunit.client.1.vm06.stdout:7/991: fdatasync d0/df/d1a/d27/d4c/d40/d51/d86/fbd 0
2026-03-09T00:04:15.555 INFO:tasks.workunit.client.1.vm06.stdout:7/992: readlink d0/df/d7b/l8d 0
2026-03-09T00:04:15.557 INFO:tasks.workunit.client.0.vm03.stdout:5/812: dwrite d1c/f1f [4194304,4194304] 0
2026-03-09T00:04:15.558 INFO:tasks.workunit.client.1.vm06.stdout:7/993: rename d0/df/d1a/d3a/l126 to d0/df/d1a/d3a/d4e/d5e/ddc/l12a 0
2026-03-09T00:04:15.559 INFO:tasks.workunit.client.0.vm03.stdout:5/813: dread d1c/d20/d55/d66/d6b/de3/ff4 [0,4194304] 0
2026-03-09T00:04:15.559 INFO:tasks.workunit.client.0.vm03.stdout:5/814: truncate d1c/d20/d55/db0/fcd 57770 0
2026-03-09T00:04:15.559 INFO:tasks.workunit.client.0.vm03.stdout:5/815: stat d1c/d20/d55/d66/dc6/df1/c104 0
2026-03-09T00:04:15.559 INFO:tasks.workunit.client.1.vm06.stdout:1/901: dwrite d6/d4c/feb [0,4194304] 0
2026-03-09T00:04:15.562 INFO:tasks.workunit.client.1.vm06.stdout:1/902: fsync d6/db0/fdf 0
2026-03-09T00:04:15.563 INFO:tasks.workunit.client.0.vm03.stdout:5/816: symlink d1c/d51/d6a/d75/l105 0
2026-03-09T00:04:15.564 INFO:tasks.workunit.client.1.vm06.stdout:7/994: dread d0/df/d1a/d3a/d4e/d5e/ddc/f81 [0,4194304] 0
2026-03-09T00:04:15.564 INFO:tasks.workunit.client.0.vm03.stdout:5/817: creat d1c/d20/d56/da1/f106 x:0 0 0
2026-03-09T00:04:15.565 INFO:tasks.workunit.client.1.vm06.stdout:7/995: truncate d0/df/d1a/d35/ff0 112513 0
2026-03-09T00:04:15.568 INFO:tasks.workunit.client.1.vm06.stdout:1/903: dread d6/d4c/d71/fbf [0,4194304] 0
2026-03-09T00:04:15.568 INFO:tasks.workunit.client.1.vm06.stdout:9/936: dwrite d1/d4/d11f/d25/d85/f28 [0,4194304] 0
2026-03-09T00:04:15.568 INFO:tasks.workunit.client.1.vm06.stdout:3/988: dwrite d11/d28/d4d/f9c [0,4194304] 0
2026-03-09T00:04:15.568 INFO:tasks.workunit.client.1.vm06.stdout:3/989: chown d11/d28/d2e/d2f/d36/d8f/cbc 419 1
2026-03-09T00:04:15.570 INFO:tasks.workunit.client.0.vm03.stdout:9/733: sync
2026-03-09T00:04:15.579 INFO:tasks.workunit.client.0.vm03.stdout:2/822: sync
2026-03-09T00:04:15.579 INFO:tasks.workunit.client.0.vm03.stdout:2/823: fsync d8/d1b/d2a/d6b/f87 0
2026-03-09T00:04:15.579 INFO:tasks.workunit.client.0.vm03.stdout:2/824: stat d8/d1b/d2a/d6b/dc6/ff4 0
2026-03-09T00:04:15.582 INFO:tasks.workunit.client.1.vm06.stdout:7/996: dread d0/df/d1a/d35/f94 [0,4194304] 0
2026-03-09T00:04:15.584 INFO:tasks.workunit.client.1.vm06.stdout:1/904: link d6/d4c/d79/l53 d6/d21/d2d/l131 0
2026-03-09T00:04:15.586 INFO:tasks.workunit.client.0.vm03.stdout:1/865: dwrite f0 [0,4194304] 0
2026-03-09T00:04:15.590 INFO:tasks.workunit.client.0.vm03.stdout:9/734: symlink d15/d1c/d21/d54/d87/lf4 0
2026-03-09T00:04:15.599 INFO:tasks.workunit.client.0.vm03.stdout:2/825: mkdir d8/d26/dfc/d10f 0
2026-03-09T00:04:15.602 INFO:tasks.workunit.client.0.vm03.stdout:1/866: symlink d4/d3a/de6/l122 0
2026-03-09T00:04:15.609 INFO:tasks.workunit.client.0.vm03.stdout:1/867: mknod d4/d15/de5/c123 0
2026-03-09T00:04:15.609 INFO:tasks.workunit.client.1.vm06.stdout:7/997: creat d0/d55/d99/f12b x:0 0 0
2026-03-09T00:04:15.609 INFO:tasks.workunit.client.1.vm06.stdout:7/998: chown d0/l1c 272393 1
2026-03-09T00:04:15.609 INFO:tasks.workunit.client.1.vm06.stdout:7/999: mknod d0/df/d1a/d35/c12c 0
2026-03-09T00:04:15.609 INFO:tasks.workunit.client.1.vm06.stdout:3/990: unlink d11/d28/d2e/d2f/d5b/d5f/c72 0
2026-03-09T00:04:15.609 INFO:tasks.workunit.client.1.vm06.stdout:9/937: rmdir d1/d3/d4f/d91/dae/de9/d126 39
2026-03-09T00:04:15.609 INFO:tasks.workunit.client.1.vm06.stdout:9/938: write d1/d3/d4f/d52/de3/ff0 [55058,67690] 0
2026-03-09T00:04:15.609 INFO:tasks.workunit.client.1.vm06.stdout:9/939: read d1/da7/fb9 [7032764,51191] 0
2026-03-09T00:04:15.609 INFO:tasks.workunit.client.1.vm06.stdout:3/991: chown d11/d28/d2e/d2f/d5b/d5f/d91/ce1 454 1
2026-03-09T00:04:15.609 INFO:tasks.workunit.client.1.vm06.stdout:3/992: truncate d11/d28/f12f 211006 0
2026-03-09T00:04:15.609 INFO:tasks.workunit.client.1.vm06.stdout:9/940: rename d1/fb5 to d1/d3/d2b/f135 0
2026-03-09T00:04:15.609 INFO:tasks.workunit.client.1.vm06.stdout:3/993: fdatasync d11/d3f/f4c 0
2026-03-09T00:04:15.609 INFO:tasks.workunit.client.1.vm06.stdout:9/941: creat d1/d3/d4f/f136 x:0 0 0
2026-03-09T00:04:15.609 INFO:tasks.workunit.client.1.vm06.stdout:3/994: getdents d11/d28/d2e/d2f/d5b/db5 0
2026-03-09T00:04:15.609 INFO:tasks.workunit.client.1.vm06.stdout:3/995: stat d11/d28/d2e/d2f 0
2026-03-09T00:04:15.633 INFO:tasks.workunit.client.0.vm03.stdout:0/816: dwrite d2/da/dd/d49/d6c/f52 [4194304,4194304] 0
2026-03-09T00:04:15.633 INFO:tasks.workunit.client.0.vm03.stdout:0/817: creat d2/da/dd/d49/d6c/d4b/daf/f126 x:0 0 0
2026-03-09T00:04:15.633 INFO:tasks.workunit.client.0.vm03.stdout:0/818: chown d2/da/dd/f11 508 1
2026-03-09T00:04:15.634 INFO:tasks.workunit.client.0.vm03.stdout:0/819: mknod d2/da/dd/d49/d6c/d4b/d55/c127 0
2026-03-09T00:04:15.664 INFO:tasks.workunit.client.0.vm03.stdout:3/634: dwrite d2/db/d56/fb4 [0,4194304] 0
2026-03-09T00:04:15.669 INFO:tasks.workunit.client.1.vm06.stdout:3/996: write d11/d28/d2e/dff/f123 [858255,127553] 0
2026-03-09T00:04:15.679 INFO:tasks.workunit.client.0.vm03.stdout:3/635: write d2/f5 [4073548,78531] 0
2026-03-09T00:04:15.679 INFO:tasks.workunit.client.0.vm03.stdout:3/636: mkdir d2/db/d3b/dc2 0
2026-03-09T00:04:15.679 INFO:tasks.workunit.client.0.vm03.stdout:3/637: creat d2/dbf/fc3 x:0 0 0
2026-03-09T00:04:15.679 INFO:tasks.workunit.client.0.vm03.stdout:3/638: dread - d2/db/d3b/d5d/fba zero size
2026-03-09T00:04:15.679 INFO:tasks.workunit.client.0.vm03.stdout:3/639: write d2/db/d3b/d3f/f69 [1205986,33367] 0
2026-03-09T00:04:15.679 INFO:tasks.workunit.client.0.vm03.stdout:3/640: rmdir d2/db/d40/d88 39
2026-03-09T00:04:15.683 INFO:tasks.workunit.client.0.vm03.stdout:1/868: write d4/d3a/d3d/d46/f5d [67086,103266] 0
2026-03-09T00:04:15.686 INFO:tasks.workunit.client.0.vm03.stdout:1/869: write d4/d15/dae/d101/f108 [256516,102069] 0
2026-03-09T00:04:15.686 INFO:tasks.workunit.client.0.vm03.stdout:1/870: mknod d4/d3a/d32/d87/d116/c124 0
2026-03-09T00:04:15.686 INFO:tasks.workunit.client.0.vm03.stdout:1/871: link d4/d3a/d32/d87/fd5 d4/d15/dae/f125 0
2026-03-09T00:04:15.686 INFO:tasks.workunit.client.0.vm03.stdout:1/872: chown d4/d3a/d32/fb9 108 1
2026-03-09T00:04:15.686 INFO:tasks.workunit.client.0.vm03.stdout:1/873: getdents d4/d3a/d8f/d104 0
2026-03-09T00:04:15.686 INFO:tasks.workunit.client.0.vm03.stdout:1/874: chown d4/d15/d77/dce/dd9 168 1
2026-03-09T00:04:15.686 INFO:tasks.workunit.client.0.vm03.stdout:1/875: getdents d4/d15/d1a/dfb 0
2026-03-09T00:04:15.686 INFO:tasks.workunit.client.0.vm03.stdout:1/876: readlink d4/d3a/d3d/d98/dee/lcf 0
2026-03-09T00:04:15.686 INFO:tasks.workunit.client.0.vm03.stdout:1/877: mknod d4/d3a/d3d/d98/d11d/c126 0
2026-03-09T00:04:15.686 INFO:tasks.workunit.client.0.vm03.stdout:1/878: dread - d4/d3a/d8f/ff1 zero size
2026-03-09T00:04:15.686 INFO:tasks.workunit.client.0.vm03.stdout:1/879: chown d4/d15/d77/d8c 3133242 1
2026-03-09T00:04:15.692 INFO:tasks.workunit.client.1.vm06.stdout:1/905: dwrite d6/d4c/d79/fc2 [0,4194304] 0
2026-03-09T00:04:15.693 INFO:tasks.workunit.client.1.vm06.stdout:1/906: symlink d6/d21/d2d/d3b/d42/l132 0
2026-03-09T00:04:15.693 INFO:tasks.workunit.client.1.vm06.stdout:1/907: readlink d6/d21/l60 0
2026-03-09T00:04:15.693 INFO:tasks.workunit.client.0.vm03.stdout:3/641: dread d2/db/f1a [0,4194304] 0
2026-03-09T00:04:15.695 INFO:tasks.workunit.client.0.vm03.stdout:8/815: dwrite d7/df/d1a/d40/d58/f8c [0,4194304] 0
2026-03-09T00:04:15.701 INFO:tasks.workunit.client.0.vm03.stdout:3/642: mknod d2/db/d3b/d5f/da5/d72/dbd/cc4 0
2026-03-09T00:04:15.701 INFO:tasks.workunit.client.0.vm03.stdout:3/643: truncate d2/db/d2d/f54 109576 0
2026-03-09T00:04:15.701 INFO:tasks.workunit.client.0.vm03.stdout:3/644: fdatasync d2/db/d3b/d5f/da5/d72/f86 0
2026-03-09T00:04:15.701 INFO:tasks.workunit.client.0.vm03.stdout:8/816: symlink d7/lfa 0
2026-03-09T00:04:15.701 INFO:tasks.workunit.client.0.vm03.stdout:8/817: fsync d7/df/d1a/d40/d9d/df2/f3a 0
2026-03-09T00:04:15.701 INFO:tasks.workunit.client.0.vm03.stdout:3/645: unlink d2/db/d56/fb6 0
2026-03-09T00:04:15.701 INFO:tasks.workunit.client.0.vm03.stdout:3/646: rmdir d2/db/d3b/d5f/da5 39
2026-03-09T00:04:15.702 INFO:tasks.workunit.client.0.vm03.stdout:3/647: creat d2/fc5 x:0 0 0
2026-03-09T00:04:15.702 INFO:tasks.workunit.client.0.vm03.stdout:3/648: truncate d2/db/d3b/d5d/fba 254075 0
2026-03-09T00:04:15.702 INFO:tasks.workunit.client.0.vm03.stdout:3/649: chown d2/db/d40/d44/l53 0 1
2026-03-09T00:04:15.708 INFO:tasks.workunit.client.0.vm03.stdout:1/880: dread d4/d3a/d3d/d98/dee/d93/ffc [0,4194304] 0
2026-03-09T00:04:15.709 INFO:tasks.workunit.client.0.vm03.stdout:1/881: write d4/d15/d1a/f1d [6278923,65006] 0
2026-03-09T00:04:15.709 INFO:tasks.workunit.client.0.vm03.stdout:1/882: stat d4/d15/dae/dcb 0
2026-03-09T00:04:15.714 INFO:tasks.workunit.client.0.vm03.stdout:1/883: link d4/d3a/d3d/d98/fdb d4/d6/f127 0
2026-03-09T00:04:15.719 INFO:tasks.workunit.client.0.vm03.stdout:1/884: dread d4/d3a/f26 [0,4194304] 0
2026-03-09T00:04:15.719 INFO:tasks.workunit.client.0.vm03.stdout:1/885: fdatasync d4/d15/d5c/f74 0
2026-03-09T00:04:15.721 INFO:tasks.workunit.client.0.vm03.stdout:1/886: write d4/d3a/f2c [942151,56230] 0
2026-03-09T00:04:15.723 INFO:tasks.workunit.client.0.vm03.stdout:1/887: creat d4/d3a/d32/d87/db3/f128 x:0 0 0
2026-03-09T00:04:15.744 INFO:tasks.workunit.client.0.vm03.stdout:0/820: dwrite d2/da/dd/d49/d6c/d4b/daf/f10d [0,4194304] 0
2026-03-09T00:04:15.745 INFO:tasks.workunit.client.0.vm03.stdout:0/821: rename d2/da/dd/d49/d6c/da6/f119 to d2/da/dd/d49/d6c/da6/f128 0
2026-03-09T00:04:15.745 INFO:tasks.workunit.client.0.vm03.stdout:0/822: mknod d2/da/dd/d49/c129 0
2026-03-09T00:04:15.746 INFO:tasks.workunit.client.0.vm03.stdout:0/823: truncate d2/da/d4e/f9c 4248560 0
2026-03-09T00:04:15.747 INFO:tasks.workunit.client.0.vm03.stdout:0/824: mkdir d2/da/d36/ddf/df7/d12a 0
2026-03-09T00:04:15.747 INFO:tasks.workunit.client.0.vm03.stdout:0/825: chown d2/da/d36/f115 5379301 1
2026-03-09T00:04:15.747 INFO:tasks.workunit.client.0.vm03.stdout:0/826: mknod d2/c12b 0
2026-03-09T00:04:15.752 INFO:tasks.workunit.client.1.vm06.stdout:9/942: write d1/d3/d4f/d91/de8/ff4 [1833371,5762] 0
2026-03-09T00:04:15.755 INFO:tasks.workunit.client.1.vm06.stdout:9/943: creat d1/d3/d4f/d91/d94/ddf/dfe/f137 x:0 0 0
2026-03-09T00:04:15.755 INFO:tasks.workunit.client.1.vm06.stdout:9/944: rmdir d1/d3/d4f/d52/de3/de5 39
2026-03-09T00:04:15.756 INFO:tasks.workunit.client.1.vm06.stdout:9/945: creat d1/f138 x:0 0 0
2026-03-09T00:04:15.758 INFO:tasks.workunit.client.0.vm03.stdout:5/818: dwrite d1c/d20/d55/f9b [4194304,4194304] 0
2026-03-09T00:04:15.762 INFO:tasks.workunit.client.1.vm06.stdout:9/946: link d1/d3/d4f/d91/dae/le0 d1/d4/df5/l139 0
2026-03-09T00:04:15.764 INFO:tasks.workunit.client.0.vm03.stdout:5/819: mkdir d1c/d107 0
2026-03-09T00:04:15.765 INFO:tasks.workunit.client.0.vm03.stdout:5/820: mkdir d1c/d20/d55/d66/dc6/d108 0
2026-03-09T00:04:15.765 INFO:tasks.workunit.client.0.vm03.stdout:5/821: read - d1c/d20/d55/d4f/d58/d73/d76/d91/fa2 zero size
2026-03-09T00:04:15.767 INFO:tasks.workunit.client.0.vm03.stdout:5/822: mkdir d1c/d20/d55/d66/d6b/de3/d109 0
2026-03-09T00:04:15.777 INFO:tasks.workunit.client.1.vm06.stdout:1/908: read d6/d21/def/f11c [2755707,34151] 0
2026-03-09T00:04:15.777 INFO:tasks.workunit.client.1.vm06.stdout:9/947: creat d1/d4/df5/f13a x:0 0 0
2026-03-09T00:04:15.777 INFO:tasks.workunit.client.1.vm06.stdout:1/909: mkdir d6/d4c/d79/d10c/d11d/d133 0
2026-03-09T00:04:15.777 INFO:tasks.workunit.client.1.vm06.stdout:1/910: rmdir d6/d21/d2d/d3b/dc9 39
2026-03-09T00:04:15.777 INFO:tasks.workunit.client.1.vm06.stdout:1/911: chown d6/d21/fa1 0 1
2026-03-09T00:04:15.777 INFO:tasks.workunit.client.1.vm06.stdout:1/912: rename d6/d21/d2d/d3b/d87/d9d/fa3 to d6/d21/d2d/d37/d6d/dd7/f134 0
2026-03-09T00:04:15.777 INFO:tasks.workunit.client.1.vm06.stdout:1/913: getdents d6/dc4 0
2026-03-09T00:04:15.777 INFO:tasks.workunit.client.1.vm06.stdout:1/914: stat d6/db0/fdf 0
2026-03-09T00:04:15.777 INFO:tasks.workunit.client.1.vm06.stdout:1/915: write d6/d21/d2d/d37/dbc/f12d [1029909,9965] 0
2026-03-09T00:04:15.777 INFO:tasks.workunit.client.1.vm06.stdout:1/916: unlink d6/d21/c3a 0
2026-03-09T00:04:15.777 INFO:tasks.workunit.client.1.vm06.stdout:1/917: chown d6/l38 35815 1
2026-03-09T00:04:15.777 INFO:tasks.workunit.client.1.vm06.stdout:1/918: creat d6/d21/def/d118/f135 x:0 0 0
2026-03-09T00:04:15.777 INFO:tasks.workunit.client.0.vm03.stdout:7/723: dwrite d2/d1f/d3a/d24/da4/d46/d81/d96/d37/d39/d6e/fa1 [0,4194304] 0
2026-03-09T00:04:15.777 INFO:tasks.workunit.client.0.vm03.stdout:9/735: dwrite d15/d1c/d21/d75/fa6 [0,4194304] 0
d15/d1c/d36/d4d/dc4/dec/lf5 0 2026-03-09T00:04:15.777 INFO:tasks.workunit.client.0.vm03.stdout:9/737: mkdir d15/d1c/d21/d54/dab/df6 0 2026-03-09T00:04:15.777 INFO:tasks.workunit.client.0.vm03.stdout:9/738: stat d15/d1c/d21/f61 0 2026-03-09T00:04:15.777 INFO:tasks.workunit.client.0.vm03.stdout:9/739: mknod d15/d1c/d36/d4d/dc4/dec/cf7 0 2026-03-09T00:04:15.777 INFO:tasks.workunit.client.0.vm03.stdout:9/740: creat d15/d1c/d21/d75/ff8 x:0 0 0 2026-03-09T00:04:15.777 INFO:tasks.workunit.client.0.vm03.stdout:9/741: truncate d15/d1c/d36/f9e 1124827 0 2026-03-09T00:04:15.777 INFO:tasks.workunit.client.0.vm03.stdout:9/742: creat d15/d1c/d21/d54/ff9 x:0 0 0 2026-03-09T00:04:15.777 INFO:tasks.workunit.client.1.vm06.stdout:1/919: read d6/d8f/f101 [176252,16016] 0 2026-03-09T00:04:15.778 INFO:tasks.workunit.client.1.vm06.stdout:9/948: dread d1/d3/d4f/d52/fb3 [0,4194304] 0 2026-03-09T00:04:15.779 INFO:tasks.workunit.client.1.vm06.stdout:1/920: mknod d6/d21/d2d/d3b/d42/df0/c136 0 2026-03-09T00:04:15.780 INFO:tasks.workunit.client.0.vm03.stdout:5/823: write d1c/d20/d55/d4f/d58/db5/f6f [4500706,75170] 0 2026-03-09T00:04:15.780 INFO:tasks.workunit.client.0.vm03.stdout:9/743: link d15/d1c/d21/ceb d15/d1c/d36/d4d/dc4/dec/cfa 0 2026-03-09T00:04:15.783 INFO:tasks.workunit.client.1.vm06.stdout:1/921: readlink d6/d63/lda 0 2026-03-09T00:04:15.784 INFO:tasks.workunit.client.0.vm03.stdout:5/824: mkdir d1c/d51/d6a/d75/df0/d10a 0 2026-03-09T00:04:15.785 INFO:tasks.workunit.client.0.vm03.stdout:9/744: rename d15/d1c/d21/df3 to d15/d77/dfb 0 2026-03-09T00:04:15.785 INFO:tasks.workunit.client.1.vm06.stdout:1/922: stat d6/f8c 0 2026-03-09T00:04:15.788 INFO:tasks.workunit.client.0.vm03.stdout:7/724: read d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/fd5 [4161739,124762] 0 2026-03-09T00:04:15.789 INFO:tasks.workunit.client.0.vm03.stdout:5/825: dread d1c/d20/f39 [0,4194304] 0 2026-03-09T00:04:15.795 INFO:tasks.workunit.client.1.vm06.stdout:1/923: creat d6/dc4/d128/f137 x:0 0 0 2026-03-09T00:04:15.799 INFO:tasks.workunit.client.0.vm03.stdout:2/826: getdents d8/d26/dfc 0 2026-03-09T00:04:15.800 INFO:tasks.workunit.client.0.vm03.stdout:6/759: dwrite d13/f5b [0,4194304] 0 2026-03-09T00:04:15.801 INFO:tasks.workunit.client.0.vm03.stdout:3/650: dwrite d2/db/d2d/f52 [0,4194304] 0 2026-03-09T00:04:15.801 INFO:tasks.workunit.client.0.vm03.stdout:3/651: chown d2/db/d3b/d5d/c5e 10 1 2026-03-09T00:04:15.801 INFO:tasks.workunit.client.0.vm03.stdout:8/818: dwrite d7/f34 [0,4194304] 0 2026-03-09T00:04:15.801 INFO:tasks.workunit.client.0.vm03.stdout:8/819: chown d7/c8 4 1 2026-03-09T00:04:15.801 INFO:tasks.workunit.client.0.vm03.stdout:8/820: dread - d7/df/d1a/d40/d9d/df2/d38/d4c/fc1 zero size 2026-03-09T00:04:15.801 INFO:tasks.workunit.client.0.vm03.stdout:8/821: write d7/df/d1a/d40/db3/f74 [3642028,88261] 0 2026-03-09T00:04:15.803 INFO:tasks.workunit.client.0.vm03.stdout:9/745: creat d15/d1c/d28/d6e/da2/ffc x:0 0 0 2026-03-09T00:04:15.815 INFO:tasks.workunit.client.1.vm06.stdout:1/924: mknod d6/d21/d2d/d3b/d42/c138 0 2026-03-09T00:04:15.815 INFO:tasks.workunit.client.1.vm06.stdout:1/925: creat d6/d21/d2d/d37/dbc/f139 x:0 0 0 2026-03-09T00:04:15.815 INFO:tasks.workunit.client.0.vm03.stdout:5/826: mkdir d1c/d20/d55/db0/dc7/d101/d10b 0 2026-03-09T00:04:15.815 INFO:tasks.workunit.client.0.vm03.stdout:2/827: mkdir d8/d1b/d2a/d56/d110 0 2026-03-09T00:04:15.815 INFO:tasks.workunit.client.0.vm03.stdout:2/828: readlink d8/l14 0 2026-03-09T00:04:15.815 INFO:tasks.workunit.client.0.vm03.stdout:2/829: chown d8/d1b/d2a/d6b/cb7 1443 1 2026-03-09T00:04:15.815 
INFO:tasks.workunit.client.0.vm03.stdout:4/900: sync 2026-03-09T00:04:15.815 INFO:tasks.workunit.client.0.vm03.stdout:3/652: mkdir d2/db/d6a/dc6 0 2026-03-09T00:04:15.815 INFO:tasks.workunit.client.0.vm03.stdout:3/653: write d2/db/d3b/d3f/f7c [887419,42037] 0 2026-03-09T00:04:15.815 INFO:tasks.workunit.client.0.vm03.stdout:8/822: mknod d7/df/d1a/d40/d9d/df2/d3f/df1/cfb 0 2026-03-09T00:04:15.815 INFO:tasks.workunit.client.0.vm03.stdout:9/746: mknod d15/d1c/d21/d54/dab/cfd 0 2026-03-09T00:04:15.820 INFO:tasks.workunit.client.0.vm03.stdout:8/823: dread d7/df/d1a/d40/d58/f8c [0,4194304] 0 2026-03-09T00:04:15.820 INFO:tasks.workunit.client.0.vm03.stdout:8/824: fsync d7/df/d1a/fc4 0 2026-03-09T00:04:15.822 INFO:tasks.workunit.client.1.vm06.stdout:1/926: truncate d6/d21/d2d/d37/f77 3703454 0 2026-03-09T00:04:15.823 INFO:tasks.workunit.client.0.vm03.stdout:5/827: mknod d1c/d20/d55/d66/d70/c10c 0 2026-03-09T00:04:15.825 INFO:tasks.workunit.client.0.vm03.stdout:4/901: mkdir d7/d6f/dcf/de8/d11c 0 2026-03-09T00:04:15.825 INFO:tasks.workunit.client.0.vm03.stdout:5/828: write d1c/d20/d55/ff6 [267353,49384] 0 2026-03-09T00:04:15.826 INFO:tasks.workunit.client.0.vm03.stdout:3/654: mkdir d2/db/d2d/dc7 0 2026-03-09T00:04:15.826 INFO:tasks.workunit.client.0.vm03.stdout:3/655: chown d2/db/d2d/f45 108327 1 2026-03-09T00:04:15.826 INFO:tasks.workunit.client.0.vm03.stdout:0/827: dread d2/da/dd/d49/d6c/d4b/f88 [0,4194304] 0 2026-03-09T00:04:15.826 INFO:tasks.workunit.client.0.vm03.stdout:0/828: chown d2/da/d1a/fc4 12 1 2026-03-09T00:04:15.826 INFO:tasks.workunit.client.0.vm03.stdout:0/829: dread - d2/da/dd/d49/d6c/da6/dcf/ff9 zero size 2026-03-09T00:04:15.826 INFO:tasks.workunit.client.1.vm06.stdout:1/927: mknod d6/d21/da6/c13a 0 2026-03-09T00:04:15.826 INFO:tasks.workunit.client.1.vm06.stdout:1/928: dread - d6/d21/d2d/d3b/d87/d9d/fdd zero size 2026-03-09T00:04:15.827 INFO:tasks.workunit.client.1.vm06.stdout:1/929: fsync d6/d21/d2d/d37/fd3 0 2026-03-09T00:04:15.827 INFO:tasks.workunit.client.0.vm03.stdout:9/747: symlink d15/d1c/d36/d4d/dc4/lfe 0 2026-03-09T00:04:15.827 INFO:tasks.workunit.client.0.vm03.stdout:9/748: fdatasync d15/d1c/d28/d6e/da2/fca 0 2026-03-09T00:04:15.830 INFO:tasks.workunit.client.0.vm03.stdout:8/825: mknod d7/df/d1a/d40/d9d/df2/dc3/cfc 0 2026-03-09T00:04:15.830 INFO:tasks.workunit.client.0.vm03.stdout:8/826: dread - d7/df/d1a/d40/d58/ff9 zero size 2026-03-09T00:04:15.830 INFO:tasks.workunit.client.0.vm03.stdout:5/829: creat d1c/d20/d56/f10d x:0 0 0 2026-03-09T00:04:15.831 INFO:tasks.workunit.client.0.vm03.stdout:3/656: truncate d2/db/d3b/d5d/f8d 3828884 0 2026-03-09T00:04:15.832 INFO:tasks.workunit.client.0.vm03.stdout:0/830: creat d2/da/d36/ddf/df7/d12a/f12c x:0 0 0 2026-03-09T00:04:15.834 INFO:tasks.workunit.client.0.vm03.stdout:3/657: write d2/db/d3b/d5d/f60 [497363,27714] 0 2026-03-09T00:04:15.840 INFO:tasks.workunit.client.1.vm06.stdout:1/930: creat d6/d21/da6/f13b x:0 0 0 2026-03-09T00:04:15.841 INFO:tasks.workunit.client.0.vm03.stdout:3/658: creat d2/db/d3b/d3f/db8/fc8 x:0 0 0 2026-03-09T00:04:15.841 INFO:tasks.workunit.client.0.vm03.stdout:3/659: creat d2/db/d3b/d3f/db8/fc9 x:0 0 0 2026-03-09T00:04:15.841 INFO:tasks.workunit.client.0.vm03.stdout:3/660: chown d2/db/f15 51 1 2026-03-09T00:04:15.842 INFO:tasks.workunit.client.0.vm03.stdout:5/830: dread d1c/d20/d55/d66/d6b/d8f/f98 [0,4194304] 0 2026-03-09T00:04:15.843 INFO:tasks.workunit.client.0.vm03.stdout:5/831: unlink d1c/d20/d55/d4f/d58/d73/d76/fd5 0 2026-03-09T00:04:15.844 INFO:tasks.workunit.client.0.vm03.stdout:5/832: symlink 
d1c/d51/d6a/d75/df0/l10e 0 2026-03-09T00:04:15.844 INFO:tasks.workunit.client.0.vm03.stdout:5/833: readlink d1c/d20/d56/lfc 0 2026-03-09T00:04:15.844 INFO:tasks.workunit.client.0.vm03.stdout:5/834: write d1c/d20/d55/d4f/d58/fa6 [932582,120899] 0 2026-03-09T00:04:15.844 INFO:tasks.workunit.client.0.vm03.stdout:5/835: stat d1c/d20/l4b 0 2026-03-09T00:04:15.844 INFO:tasks.workunit.client.0.vm03.stdout:5/836: fsync d1c/d20/d55/d66/d70/f8c 0 2026-03-09T00:04:15.852 INFO:tasks.workunit.client.0.vm03.stdout:0/831: dread d2/da/dd/f7b [0,4194304] 0 2026-03-09T00:04:15.852 INFO:tasks.workunit.client.0.vm03.stdout:0/832: dread - d2/da/dd/d49/d6c/da6/f128 zero size 2026-03-09T00:04:15.852 INFO:tasks.workunit.client.0.vm03.stdout:0/833: readlink d2/da/dd/d49/d6c/d4b/d55/l96 0 2026-03-09T00:04:15.852 INFO:tasks.workunit.client.1.vm06.stdout:1/931: dread d6/d21/d2d/fe9 [0,4194304] 0 2026-03-09T00:04:15.853 INFO:tasks.workunit.client.1.vm06.stdout:1/932: write d6/d21/d2d/d37/fd3 [895199,10684] 0 2026-03-09T00:04:15.853 INFO:tasks.workunit.client.1.vm06.stdout:1/933: read d6/d21/d2d/f6c [72207,69622] 0 2026-03-09T00:04:15.854 INFO:tasks.workunit.client.0.vm03.stdout:0/834: mknod d2/da/dd/d49/d6c/d4b/d55/c12d 0 2026-03-09T00:04:15.855 INFO:tasks.workunit.client.1.vm06.stdout:1/934: mknod d6/d63/c13c 0 2026-03-09T00:04:15.855 INFO:tasks.workunit.client.1.vm06.stdout:1/935: read - d6/d21/d2d/d37/d6d/f127 zero size 2026-03-09T00:04:15.856 INFO:tasks.workunit.client.1.vm06.stdout:1/936: mkdir d6/d8f/d10f/d13d 0 2026-03-09T00:04:15.866 INFO:tasks.workunit.client.0.vm03.stdout:0/835: dread d2/da/d76/d8a/f122 [0,4194304] 0 2026-03-09T00:04:15.872 INFO:tasks.workunit.client.0.vm03.stdout:4/902: write d7/d27/f89 [3379172,106990] 0 2026-03-09T00:04:15.874 INFO:tasks.workunit.client.0.vm03.stdout:4/903: dread d7/d20/d6a/dde/ffa [0,4194304] 0 2026-03-09T00:04:15.875 INFO:tasks.workunit.client.0.vm03.stdout:4/904: mkdir d7/d20/d6a/d77/d25/de2/df1/d11d 0 2026-03-09T00:04:15.876 INFO:tasks.workunit.client.0.vm03.stdout:4/905: rename d7/d27/f2c to d7/d6f/d108/f11e 0 2026-03-09T00:04:15.878 INFO:tasks.workunit.client.0.vm03.stdout:4/906: rename d7/d20/d6a/dea/d38/da9/ddc/df2 to d7/d20/d6a/dea/d54/d58/d11f 0 2026-03-09T00:04:15.878 INFO:tasks.workunit.client.0.vm03.stdout:4/907: dread - d7/d6f/dcf/ff9 zero size 2026-03-09T00:04:15.879 INFO:tasks.workunit.client.0.vm03.stdout:4/908: rename d7/d20/d35/c63 to d7/d20/d6a/d77/d25/de2/df1/c120 0 2026-03-09T00:04:15.879 INFO:tasks.workunit.client.0.vm03.stdout:4/909: chown d7/d27/f31 1468316 1 2026-03-09T00:04:15.879 INFO:tasks.workunit.client.0.vm03.stdout:4/910: readlink d7/d20/d35/d66/le7 0 2026-03-09T00:04:15.893 INFO:tasks.workunit.client.0.vm03.stdout:7/725: dwrite d2/d1f/d3a/d24/da4/d46/d54/f9b [4194304,4194304] 0 2026-03-09T00:04:15.893 INFO:tasks.workunit.client.0.vm03.stdout:7/726: fsync d2/d4/d1e/f97 0 2026-03-09T00:04:15.893 INFO:tasks.workunit.client.0.vm03.stdout:7/727: write d2/d1f/d35/f3e [4461625,10952] 0 2026-03-09T00:04:15.893 INFO:tasks.workunit.client.0.vm03.stdout:7/728: creat d2/d4/db7/daa/fd9 x:0 0 0 2026-03-09T00:04:15.893 INFO:tasks.workunit.client.0.vm03.stdout:7/729: chown d2/d1f/l20 1 1 2026-03-09T00:04:15.895 INFO:tasks.workunit.client.0.vm03.stdout:7/730: rename d2/d1f/d35/f3e to d2/d4/db7/d67/d6b/fda 0 2026-03-09T00:04:15.896 INFO:tasks.workunit.client.0.vm03.stdout:0/836: dread d2/da/dd/d49/d6c/d4b/fa0 [0,4194304] 0 2026-03-09T00:04:15.899 INFO:tasks.workunit.client.0.vm03.stdout:2/830: dread d8/f11 [0,4194304] 0 2026-03-09T00:04:15.899 
INFO:tasks.workunit.client.0.vm03.stdout:1/888: dwrite d4/d3a/f41 [0,4194304] 0 2026-03-09T00:04:15.909 INFO:tasks.workunit.client.0.vm03.stdout:2/831: creat d8/d1b/d2a/d6b/d50/f111 x:0 0 0 2026-03-09T00:04:15.910 INFO:tasks.workunit.client.0.vm03.stdout:2/832: mknod d8/d26/d5e/d5f/ded/c112 0 2026-03-09T00:04:15.911 INFO:tasks.workunit.client.0.vm03.stdout:2/833: creat d8/d26/d5e/d5f/ded/f113 x:0 0 0 2026-03-09T00:04:15.912 INFO:tasks.workunit.client.0.vm03.stdout:2/834: mknod d8/d1b/d2a/c114 0 2026-03-09T00:04:15.912 INFO:tasks.workunit.client.0.vm03.stdout:2/835: read - d8/d74/f106 zero size 2026-03-09T00:04:15.912 INFO:tasks.workunit.client.0.vm03.stdout:2/836: rename d8/f9 to d8/d1b/d6c/f115 0 2026-03-09T00:04:15.912 INFO:tasks.workunit.client.0.vm03.stdout:2/837: fdatasync d8/d1b/d2a/d6b/fe2 0 2026-03-09T00:04:15.912 INFO:tasks.workunit.client.0.vm03.stdout:2/838: write d8/d26/d5e/d6f/d97/fae [844765,105586] 0 2026-03-09T00:04:15.912 INFO:tasks.workunit.client.0.vm03.stdout:2/839: write d8/d26/dfc/f107 [539787,107283] 0 2026-03-09T00:04:15.914 INFO:tasks.workunit.client.0.vm03.stdout:2/840: rmdir d8/d1b/d24/da5/dfe 39 2026-03-09T00:04:15.923 INFO:tasks.workunit.client.0.vm03.stdout:2/841: chown d8/d26/d5e/d5f/d95/de5 925281 1 2026-03-09T00:04:15.924 INFO:tasks.workunit.client.0.vm03.stdout:3/661: dread d2/db/d2d/f45 [0,4194304] 0 2026-03-09T00:04:15.924 INFO:tasks.workunit.client.0.vm03.stdout:2/842: mkdir d8/d26/d5e/d5f/d116 0 2026-03-09T00:04:15.924 INFO:tasks.workunit.client.0.vm03.stdout:2/843: truncate d8/d1b/d2a/d6b/d50/fc8 897609 0 2026-03-09T00:04:15.924 INFO:tasks.workunit.client.0.vm03.stdout:0/837: read d2/da/dd/d49/d6c/d4b/f100 [2392412,76408] 0 2026-03-09T00:04:15.924 INFO:tasks.workunit.client.0.vm03.stdout:6/760: write d13/f1d [293400,33000] 0 2026-03-09T00:04:15.924 INFO:tasks.workunit.client.0.vm03.stdout:6/761: fsync d13/d1e/f9f 0 2026-03-09T00:04:15.924 INFO:tasks.workunit.client.0.vm03.stdout:3/662: rename d2/dbf/f93 to d2/db/d2d/dc7/fca 0 2026-03-09T00:04:15.924 INFO:tasks.workunit.client.0.vm03.stdout:3/663: truncate d2/db/f7e 3762 0 2026-03-09T00:04:15.924 INFO:tasks.workunit.client.0.vm03.stdout:2/844: getdents d8/d1b/d24/da5/dfe 0 2026-03-09T00:04:15.924 INFO:tasks.workunit.client.0.vm03.stdout:0/838: getdents d2/da/d76/d8a/d8f 0 2026-03-09T00:04:15.932 INFO:tasks.workunit.client.0.vm03.stdout:2/845: dread d8/d1b/d2a/d56/fa4 [0,4194304] 0 2026-03-09T00:04:15.933 INFO:tasks.workunit.client.0.vm03.stdout:2/846: rename d8/d74/fe9 to d8/d1b/d24/da5/dda/de0/f117 0 2026-03-09T00:04:15.933 INFO:tasks.workunit.client.0.vm03.stdout:2/847: write d8/d74/f106 [303089,96954] 0 2026-03-09T00:04:15.950 INFO:tasks.workunit.client.0.vm03.stdout:8/827: dwrite d7/df/d1a/d40/d9d/df2/d38/fc9 [0,4194304] 0 2026-03-09T00:04:15.950 INFO:tasks.workunit.client.0.vm03.stdout:8/828: write d7/df/d1a/d2b/fda [558558,4542] 0 2026-03-09T00:04:15.952 INFO:tasks.workunit.client.0.vm03.stdout:8/829: rmdir d7/df/d1a/d40/db3/dd4 0 2026-03-09T00:04:15.952 INFO:tasks.workunit.client.0.vm03.stdout:8/830: truncate d7/df/d1a/d40/d9d/df2/d38/d4c/fde 1134610 0 2026-03-09T00:04:15.952 INFO:tasks.workunit.client.0.vm03.stdout:8/831: chown d7/df/d1a/d40/d9d/df2/d38/fc9 23949 1 2026-03-09T00:04:15.952 INFO:tasks.workunit.client.0.vm03.stdout:8/832: truncate d7/df/d1a/d40/d9d/df2/dc3/fe3 1040077 0 2026-03-09T00:04:15.952 INFO:tasks.workunit.client.0.vm03.stdout:8/833: stat d7/df/d1a/d40/d9d/df2/dad/lef 0 2026-03-09T00:04:15.957 INFO:tasks.workunit.client.0.vm03.stdout:8/834: dread 
d7/df/d1a/d40/d9d/df2/d38/d91/fbf [0,4194304] 0 2026-03-09T00:04:15.957 INFO:tasks.workunit.client.0.vm03.stdout:8/835: fdatasync d7/df/d1a/d40/db3/f75 0 2026-03-09T00:04:15.957 INFO:tasks.workunit.client.0.vm03.stdout:8/836: write d7/df/d1a/d40/d9d/df2/dc3/fe3 [1395765,117353] 0 2026-03-09T00:04:15.957 INFO:tasks.workunit.client.1.vm06.stdout:1/937: dwrite d6/db0/fdf [0,4194304] 0 2026-03-09T00:04:15.961 INFO:tasks.workunit.client.1.vm06.stdout:1/938: dread d6/d21/d2d/fc5 [0,4194304] 0 2026-03-09T00:04:15.961 INFO:tasks.workunit.client.1.vm06.stdout:1/939: creat d6/d21/def/d118/f13e x:0 0 0 2026-03-09T00:04:15.963 INFO:tasks.workunit.client.1.vm06.stdout:1/940: mknod d6/d21/d2d/d3b/d87/d9d/c13f 0 2026-03-09T00:04:15.969 INFO:tasks.workunit.client.1.vm06.stdout:1/941: truncate d6/d21/f2e 2452676 0 2026-03-09T00:04:15.969 INFO:tasks.workunit.client.0.vm03.stdout:7/731: write d2/d4/fb [4143411,27295] 0 2026-03-09T00:04:15.969 INFO:tasks.workunit.client.0.vm03.stdout:7/732: symlink d2/d1f/d3a/d24/da4/d46/d81/d96/d88/db9/ldb 0 2026-03-09T00:04:15.969 INFO:tasks.workunit.client.0.vm03.stdout:7/733: read - d2/d4/db7/d67/d6b/fbe zero size 2026-03-09T00:04:15.969 INFO:tasks.workunit.client.0.vm03.stdout:7/734: chown d2 0 1 2026-03-09T00:04:15.974 INFO:tasks.workunit.client.0.vm03.stdout:2/848: dread d8/d26/f109 [0,4194304] 0 2026-03-09T00:04:15.975 INFO:tasks.workunit.client.0.vm03.stdout:2/849: dread d8/d1b/d2a/d6b/d50/fc8 [0,4194304] 0 2026-03-09T00:04:15.975 INFO:tasks.workunit.client.0.vm03.stdout:2/850: readlink d8/d1b/d2a/d6b/d50/d8a/lbd 0 2026-03-09T00:04:15.975 INFO:tasks.workunit.client.0.vm03.stdout:9/749: dwrite d15/f26 [0,4194304] 0 2026-03-09T00:04:15.975 INFO:tasks.workunit.client.0.vm03.stdout:1/889: dwrite d4/d3a/d43/fba [0,4194304] 0 2026-03-09T00:04:15.975 INFO:tasks.workunit.client.0.vm03.stdout:1/890: dread - d4/d3a/d32/f10a zero size 2026-03-09T00:04:15.983 INFO:tasks.workunit.client.0.vm03.stdout:9/750: write d15/d1c/d21/d54/d87/d93/fbf [1335077,71844] 0 2026-03-09T00:04:15.992 INFO:tasks.workunit.client.0.vm03.stdout:2/851: link d8/d26/d5e/d6f/d97/lbf d8/d1b/d2a/d6b/dc6/l118 0 2026-03-09T00:04:15.992 INFO:tasks.workunit.client.0.vm03.stdout:2/852: truncate d8/d1b/d2a/d6b/fcf 4460107 0 2026-03-09T00:04:15.992 INFO:tasks.workunit.client.0.vm03.stdout:2/853: getdents d8/d1b/d24/da5/dda/def 0 2026-03-09T00:04:15.992 INFO:tasks.workunit.client.0.vm03.stdout:2/854: creat d8/d26/d5e/db1/f119 x:0 0 0 2026-03-09T00:04:15.992 INFO:tasks.workunit.client.0.vm03.stdout:2/855: fdatasync d8/fb 0 2026-03-09T00:04:15.992 INFO:tasks.workunit.client.0.vm03.stdout:2/856: creat d8/d26/d5e/d6f/f11a x:0 0 0 2026-03-09T00:04:15.992 INFO:tasks.workunit.client.0.vm03.stdout:2/857: write d8/d1b/d2a/f2d [1537536,101639] 0 2026-03-09T00:04:15.995 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:15 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:15.995 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:15 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:15.995 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:15 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:04:15.995 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:15 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 
2026-03-09T00:04:15.995 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:15 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:15.995 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:15 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:15.995 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:15 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:15.997 INFO:tasks.workunit.client.0.vm03.stdout:2/858: rename d8/d1b/d2a/d2e/d9a/ccc to d8/d1b/d2a/d56/c11b 0 2026-03-09T00:04:16.064 INFO:tasks.workunit.client.0.vm03.stdout:9/751: dwrite d15/d1c/d36/d4d/f5d [0,4194304] 0 2026-03-09T00:04:16.067 INFO:tasks.workunit.client.0.vm03.stdout:9/752: creat d15/d1c/d36/d4d/dc4/fff x:0 0 0 2026-03-09T00:04:16.067 INFO:tasks.workunit.client.0.vm03.stdout:9/753: creat d15/d7f/f100 x:0 0 0 2026-03-09T00:04:16.069 INFO:tasks.workunit.client.0.vm03.stdout:9/754: creat d15/d1c/d21/f101 x:0 0 0 2026-03-09T00:04:16.076 INFO:tasks.workunit.client.0.vm03.stdout:3/664: dwrite d2/fc5 [0,4194304] 0 2026-03-09T00:04:16.076 INFO:tasks.workunit.client.0.vm03.stdout:3/665: creat d2/db/d3b/d3f/fcb x:0 0 0 2026-03-09T00:04:16.076 INFO:tasks.workunit.client.0.vm03.stdout:3/666: chown d2/db/d40/d88/cad 29 1 2026-03-09T00:04:16.086 INFO:tasks.workunit.client.1.vm06.stdout:3/997: sync 2026-03-09T00:04:16.095 INFO:tasks.workunit.client.1.vm06.stdout:3/998: truncate d11/d28/d2e/d7e/d83/fe8 1856618 0 2026-03-09T00:04:16.096 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:15 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:16.096 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:15 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:16.096 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:15 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:04:16.096 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:15 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:16.096 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:15 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:16.096 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:15 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:16.096 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:15 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:16.096 INFO:tasks.workunit.client.1.vm06.stdout:3/999: creat d11/d28/d2e/d7e/d83/d87/f14f x:0 0 0 2026-03-09T00:04:16.096 INFO:tasks.workunit.client.0.vm03.stdout:3/667: symlink d2/db/d6a/dc6/lcc 0 2026-03-09T00:04:16.117 INFO:tasks.workunit.client.0.vm03.stdout:1/891: dwrite d4/d3a/d32/f4f [0,4194304] 0 2026-03-09T00:04:16.118 INFO:tasks.workunit.client.0.vm03.stdout:1/892: fsync d4/d15/ffd 0 2026-03-09T00:04:16.118 INFO:tasks.workunit.client.0.vm03.stdout:7/735: dwrite d2/d1f/d3a/f1a [0,4194304] 0 2026-03-09T00:04:16.118 INFO:tasks.workunit.client.0.vm03.stdout:9/755: dwrite d15/d1c/d28/d6e/da2/ffc 
[0,4194304] 0 2026-03-09T00:04:16.118 INFO:tasks.workunit.client.0.vm03.stdout:9/756: chown d15/d1c/d36/l63 201727 1 2026-03-09T00:04:16.119 INFO:tasks.workunit.client.0.vm03.stdout:6/762: dwrite d13/d1e/d44/d4a/d52/f7a [0,4194304] 0 2026-03-09T00:04:16.120 INFO:tasks.workunit.client.0.vm03.stdout:1/893: rename d4/d3a/d3d/d46/df7/ffa to d4/d6/f129 0 2026-03-09T00:04:16.121 INFO:tasks.workunit.client.0.vm03.stdout:7/736: creat d2/d1f/d3a/d24/da4/d46/d81/d96/da2/fdc x:0 0 0 2026-03-09T00:04:16.121 INFO:tasks.workunit.client.0.vm03.stdout:9/757: creat d15/d1c/f102 x:0 0 0 2026-03-09T00:04:16.123 INFO:tasks.workunit.client.0.vm03.stdout:1/894: symlink d4/d15/dae/d101/l12a 0 2026-03-09T00:04:16.123 INFO:tasks.workunit.client.0.vm03.stdout:1/895: fsync d4/d15/d77/f7c 0 2026-03-09T00:04:16.124 INFO:tasks.workunit.client.0.vm03.stdout:7/737: rename d2/d4/d8c/la9 to d2/d4/d1e/d78/ldd 0 2026-03-09T00:04:16.124 INFO:tasks.workunit.client.0.vm03.stdout:7/738: fsync d2/d1f/d3a/d24/da4/d46/d54/f77 0 2026-03-09T00:04:16.124 INFO:tasks.workunit.client.0.vm03.stdout:7/739: rename d2/d1f/d3a/d24/da4/d46 to d2/d1f/d3a/d24/da4/d46/d81/d96/d88/dde 22 2026-03-09T00:04:16.124 INFO:tasks.workunit.client.0.vm03.stdout:7/740: write d2/d4/d1e/d78/fa5 [360591,30096] 0 2026-03-09T00:04:16.125 INFO:tasks.workunit.client.0.vm03.stdout:7/741: rename d2/d1f/d3a/f5d to d2/d1f/d3a/d24/da4/d46/d81/d96/d8e/fdf 0 2026-03-09T00:04:16.126 INFO:tasks.workunit.client.0.vm03.stdout:7/742: rmdir d2/d4/db7/daa 39 2026-03-09T00:04:16.131 INFO:tasks.workunit.client.0.vm03.stdout:7/743: dread d2/d1f/f11 [0,4194304] 0 2026-03-09T00:04:16.132 INFO:tasks.workunit.client.0.vm03.stdout:7/744: truncate d2/d1f/d3a/f19 3514513 0 2026-03-09T00:04:16.132 INFO:tasks.workunit.client.0.vm03.stdout:7/745: write d2/d4/db7/d67/f64 [2834622,63678] 0 2026-03-09T00:04:16.133 INFO:tasks.workunit.client.0.vm03.stdout:7/746: symlink d2/d4/d8c/le0 0 2026-03-09T00:04:16.145 INFO:tasks.workunit.client.0.vm03.stdout:6/763: dread d13/d1e/d44/d4a/f58 [0,4194304] 0 2026-03-09T00:04:16.150 INFO:tasks.workunit.client.1.vm06.stdout:9/949: sync 2026-03-09T00:04:16.153 INFO:tasks.workunit.client.0.vm03.stdout:6/764: dread d13/f31 [0,4194304] 0 2026-03-09T00:04:16.153 INFO:tasks.workunit.client.0.vm03.stdout:6/765: write d13/dc4/fd9 [769269,61038] 0 2026-03-09T00:04:16.155 INFO:tasks.workunit.client.1.vm06.stdout:9/950: dread d1/d4/fa2 [0,4194304] 0 2026-03-09T00:04:16.155 INFO:tasks.workunit.client.1.vm06.stdout:9/951: creat d1/d3/ddc/f13b x:0 0 0 2026-03-09T00:04:16.162 INFO:tasks.workunit.client.0.vm03.stdout:1/896: dwrite d4/d15/d5c/d6c/fb7 [0,4194304] 0 2026-03-09T00:04:16.162 INFO:tasks.workunit.client.0.vm03.stdout:1/897: dread - d4/d3a/d8f/ff1 zero size 2026-03-09T00:04:16.162 INFO:tasks.workunit.client.0.vm03.stdout:1/898: write d4/d15/f6d [953415,96390] 0 2026-03-09T00:04:16.163 INFO:tasks.workunit.client.0.vm03.stdout:1/899: getdents d4/d15/d1a 0 2026-03-09T00:04:16.165 INFO:tasks.workunit.client.0.vm03.stdout:1/900: truncate d4/d3a/f26 1875837 0 2026-03-09T00:04:16.224 INFO:tasks.workunit.client.0.vm03.stdout:8/837: dwrite d7/df/d1a/d40/d9d/df2/dc3/fe3 [0,4194304] 0 2026-03-09T00:04:16.224 INFO:tasks.workunit.client.0.vm03.stdout:8/838: readlink d7/df/d1a/d40/d9d/df2/l22 0 2026-03-09T00:04:16.225 INFO:tasks.workunit.client.0.vm03.stdout:8/839: unlink d7/df/d1a/lb1 0 2026-03-09T00:04:16.226 INFO:tasks.workunit.client.0.vm03.stdout:2/859: dwrite f2 [0,4194304] 0 2026-03-09T00:04:16.230 INFO:tasks.workunit.client.0.vm03.stdout:2/860: creat 
d8/d26/d5e/d5f/d116/f11c x:0 0 0 2026-03-09T00:04:16.230 INFO:tasks.workunit.client.0.vm03.stdout:2/861: mknod d8/d1b/d6c/dd7/c11d 0 2026-03-09T00:04:16.231 INFO:tasks.workunit.client.0.vm03.stdout:2/862: mknod d8/d26/d5e/d5f/d95/c11e 0 2026-03-09T00:04:16.231 INFO:tasks.workunit.client.0.vm03.stdout:8/840: write d7/df/d1a/d40/d9d/df2/d3f/f7d [262911,124992] 0 2026-03-09T00:04:16.231 INFO:tasks.workunit.client.0.vm03.stdout:2/863: creat d8/d26/dfc/d10f/f11f x:0 0 0 2026-03-09T00:04:16.232 INFO:tasks.workunit.client.0.vm03.stdout:8/841: mkdir d7/df/d1a/d40/d9d/df2/d3f/df1/dfd 0 2026-03-09T00:04:16.233 INFO:tasks.workunit.client.0.vm03.stdout:2/864: mkdir d8/d1b/d2a/d6b/d50/d103/d120 0 2026-03-09T00:04:16.233 INFO:tasks.workunit.client.0.vm03.stdout:8/842: mknod d7/df/d1a/d40/d58/cfe 0 2026-03-09T00:04:16.239 INFO:tasks.workunit.client.0.vm03.stdout:3/668: dwrite d2/fc5 [0,4194304] 0 2026-03-09T00:04:16.241 INFO:tasks.workunit.client.0.vm03.stdout:3/669: symlink d2/lcd 0 2026-03-09T00:04:16.242 INFO:tasks.workunit.client.0.vm03.stdout:3/670: mknod d2/db/d40/d44/d68/cce 0 2026-03-09T00:04:16.242 INFO:tasks.workunit.client.0.vm03.stdout:3/671: creat d2/db/d3b/d5f/da5/d72/dbd/fcf x:0 0 0 2026-03-09T00:04:16.242 INFO:tasks.workunit.client.0.vm03.stdout:3/672: fsync d2/db/d2d/dc7/fca 0 2026-03-09T00:04:16.242 INFO:tasks.workunit.client.0.vm03.stdout:3/673: write d2/db/d2d/dc7/fca [929322,79593] 0 2026-03-09T00:04:16.243 INFO:tasks.workunit.client.0.vm03.stdout:3/674: truncate d2/db/f28 10970338 0 2026-03-09T00:04:16.244 INFO:tasks.workunit.client.0.vm03.stdout:3/675: rename d2/db/d2d/fb7 to d2/db/d40/d58/fd0 0 2026-03-09T00:04:16.254 INFO:tasks.workunit.client.0.vm03.stdout:7/747: dwrite d2/d4/f13 [0,4194304] 0 2026-03-09T00:04:16.254 INFO:tasks.workunit.client.0.vm03.stdout:6/766: dwrite d13/d35/d71/f87 [0,4194304] 0 2026-03-09T00:04:16.257 INFO:tasks.workunit.client.0.vm03.stdout:7/748: dread d2/d1f/d3a/d24/da4/d46/d81/f8f [0,4194304] 0 2026-03-09T00:04:16.257 INFO:tasks.workunit.client.0.vm03.stdout:7/749: truncate d2/d1f/d3a/d24/da4/d46/d81/f8f 1875292 0 2026-03-09T00:04:16.261 INFO:tasks.workunit.client.0.vm03.stdout:6/767: rename d13/d35/d72/dca to d13/dc4/dea/d102 0 2026-03-09T00:04:16.262 INFO:tasks.workunit.client.0.vm03.stdout:7/750: creat d2/d4/d1e/fe1 x:0 0 0 2026-03-09T00:04:16.263 INFO:tasks.workunit.client.0.vm03.stdout:6/768: creat d13/d1e/d44/d59/dec/d62/f103 x:0 0 0 2026-03-09T00:04:16.265 INFO:tasks.workunit.client.0.vm03.stdout:6/769: dread d13/f70 [0,4194304] 0 2026-03-09T00:04:16.265 INFO:tasks.workunit.client.0.vm03.stdout:6/770: write d13/d35/d69/f84 [8780438,98677] 0 2026-03-09T00:04:16.265 INFO:tasks.workunit.client.0.vm03.stdout:7/751: rename d2/d1f/d3a/d24/da4/d46/d81/d96/d88/db9/lc5 to d2/d1f/d3a/d24/da4/d46/d81/d96/d88/db9/le2 0 2026-03-09T00:04:16.266 INFO:tasks.workunit.client.0.vm03.stdout:6/771: rename d13/d35/d71/f87 to d13/d35/dff/f104 0 2026-03-09T00:04:16.266 INFO:tasks.workunit.client.0.vm03.stdout:6/772: stat d13/dc4 0 2026-03-09T00:04:16.278 INFO:tasks.workunit.client.0.vm03.stdout:7/752: mknod d2/d1f/d3a/d24/da4/d46/d81/d96/d37/ce3 0 2026-03-09T00:04:16.283 INFO:tasks.workunit.client.1.vm06.stdout:1/942: rename d6/d21/f2e to d6/d21/d2d/d3b/d87/d9d/dd8/f140 0 2026-03-09T00:04:16.285 INFO:tasks.workunit.client.1.vm06.stdout:1/943: symlink d6/d21/d2d/d113/l141 0 2026-03-09T00:04:16.286 INFO:tasks.workunit.client.1.vm06.stdout:9/952: rename d1/d3/d2b/d58/c7b to d1/d4/d11f/d25/d85/c13c 0 2026-03-09T00:04:16.286 INFO:tasks.workunit.client.1.vm06.stdout:9/953: 
fdatasync d1/d3/d2b/f33 0 2026-03-09T00:04:16.286 INFO:tasks.workunit.client.1.vm06.stdout:9/954: dread - d1/d3/d4f/d91/dae/f125 zero size 2026-03-09T00:04:16.287 INFO:tasks.workunit.client.1.vm06.stdout:9/955: symlink d1/d3/d4f/d91/dae/l13d 0 2026-03-09T00:04:16.287 INFO:tasks.workunit.client.1.vm06.stdout:9/956: mknod d1/d73/dcf/c13e 0 2026-03-09T00:04:16.288 INFO:tasks.workunit.client.1.vm06.stdout:9/957: getdents d1/d4/d11f 0 2026-03-09T00:04:16.292 INFO:tasks.workunit.client.1.vm06.stdout:9/958: dread d1/d4/d2f/fa0 [0,4194304] 0 2026-03-09T00:04:16.293 INFO:tasks.workunit.client.1.vm06.stdout:9/959: mkdir d1/d3/d4f/d91/d94/ddf/d130/d13f 0 2026-03-09T00:04:16.295 INFO:tasks.workunit.client.1.vm06.stdout:1/944: write d6/d21/d2d/f5b [2704547,106053] 0 2026-03-09T00:04:16.295 INFO:tasks.workunit.client.1.vm06.stdout:9/960: creat d1/d3/d4f/d91/dae/de9/d126/f140 x:0 0 0 2026-03-09T00:04:16.297 INFO:tasks.workunit.client.1.vm06.stdout:9/961: read d1/d4/d11f/d25/f4a [4383353,39647] 0 2026-03-09T00:04:16.311 INFO:tasks.workunit.client.0.vm03.stdout:7/753: dread d2/d1f/d3a/d24/da4/f47 [0,4194304] 0 2026-03-09T00:04:16.312 INFO:tasks.workunit.client.0.vm03.stdout:2/865: rmdir d8/d26/d5e/db1 39 2026-03-09T00:04:16.312 INFO:tasks.workunit.client.0.vm03.stdout:2/866: chown d8/d26/dfc/fff 0 1 2026-03-09T00:04:16.313 INFO:tasks.workunit.client.0.vm03.stdout:7/754: unlink d2/d1f/d3a/d24/c38 0 2026-03-09T00:04:16.314 INFO:tasks.workunit.client.0.vm03.stdout:1/901: dwrite d4/d5e/f88 [0,4194304] 0 2026-03-09T00:04:16.318 INFO:tasks.workunit.client.0.vm03.stdout:1/902: chown d4/d3a/d32/dc2 10432967 1 2026-03-09T00:04:16.318 INFO:tasks.workunit.client.0.vm03.stdout:1/903: truncate d4/d3a/d32/d87/fd5 487382 0 2026-03-09T00:04:16.319 INFO:tasks.workunit.client.1.vm06.stdout:1/945: write d6/d21/fc1 [3189844,59576] 0 2026-03-09T00:04:16.319 INFO:tasks.workunit.client.0.vm03.stdout:7/755: write d2/d4/f2e [684807,59363] 0 2026-03-09T00:04:16.319 INFO:tasks.workunit.client.0.vm03.stdout:2/867: getdents d8/d26/d5e/d6f/d97 0 2026-03-09T00:04:16.319 INFO:tasks.workunit.client.0.vm03.stdout:3/676: dwrite d2/db/d56/fa4 [4194304,4194304] 0 2026-03-09T00:04:16.321 INFO:tasks.workunit.client.0.vm03.stdout:1/904: truncate d4/d3a/d61/d78/f94 1964760 0 2026-03-09T00:04:16.321 INFO:tasks.workunit.client.0.vm03.stdout:1/905: write d4/d15/d5c/f89 [454706,32496] 0 2026-03-09T00:04:16.321 INFO:tasks.workunit.client.0.vm03.stdout:1/906: truncate d4/d3a/d3d/d46/fd7 999080 0 2026-03-09T00:04:16.322 INFO:tasks.workunit.client.1.vm06.stdout:1/946: symlink d6/db0/l142 0 2026-03-09T00:04:16.322 INFO:tasks.workunit.client.1.vm06.stdout:1/947: truncate d6/d21/d2d/d37/d6d/f127 799679 0 2026-03-09T00:04:16.322 INFO:tasks.workunit.client.0.vm03.stdout:9/758: write d15/d1c/d28/f29 [1332136,77882] 0 2026-03-09T00:04:16.323 INFO:tasks.workunit.client.0.vm03.stdout:2/868: truncate d8/d26/d5e/d6f/d97/f68 3891930 0 2026-03-09T00:04:16.327 INFO:tasks.workunit.client.0.vm03.stdout:2/869: dread d8/d1b/f47 [0,4194304] 0 2026-03-09T00:04:16.328 INFO:tasks.workunit.client.0.vm03.stdout:3/677: getdents d2/db/d40/d51/da2/db1 0 2026-03-09T00:04:16.329 INFO:tasks.workunit.client.1.vm06.stdout:1/948: mkdir d6/dc4/d143 0 2026-03-09T00:04:16.337 INFO:tasks.workunit.client.1.vm06.stdout:1/949: mknod d6/d8f/d10f/c144 0 2026-03-09T00:04:16.337 INFO:tasks.workunit.client.1.vm06.stdout:1/950: chown d6/d8f/d10f/c144 25132211 1 2026-03-09T00:04:16.337 INFO:tasks.workunit.client.0.vm03.stdout:2/870: creat d8/d1b/f121 x:0 0 0 2026-03-09T00:04:16.339 
INFO:tasks.workunit.client.0.vm03.stdout:9/759: getdents d15/d77 0 2026-03-09T00:04:16.339 INFO:tasks.workunit.client.0.vm03.stdout:9/760: dread - d15/d1c/d28/d6e/da2/fc0 zero size 2026-03-09T00:04:16.340 INFO:tasks.workunit.client.0.vm03.stdout:9/761: write d15/d1c/d28/faa [3342968,12452] 0 2026-03-09T00:04:16.341 INFO:tasks.workunit.client.1.vm06.stdout:1/951: unlink d6/f25 0 2026-03-09T00:04:16.341 INFO:tasks.workunit.client.0.vm03.stdout:2/871: mkdir d8/d1b/d2a/d6b/dc6/d122 0 2026-03-09T00:04:16.341 INFO:tasks.workunit.client.0.vm03.stdout:2/872: fdatasync d8/d1b/d24/da5/dc9/fe8 0 2026-03-09T00:04:16.344 INFO:tasks.workunit.client.0.vm03.stdout:1/907: rename d4/d15/d77/dce/df6 to d4/d15/dae/d12b 0 2026-03-09T00:04:16.345 INFO:tasks.workunit.client.1.vm06.stdout:1/952: read d6/d21/dfc/f10d [514183,40632] 0 2026-03-09T00:04:16.345 INFO:tasks.workunit.client.1.vm06.stdout:1/953: dread - d6/d21/d2d/d37/dbc/f139 zero size 2026-03-09T00:04:16.345 INFO:tasks.workunit.client.0.vm03.stdout:4/911: sync 2026-03-09T00:04:16.345 INFO:tasks.workunit.client.0.vm03.stdout:5/837: sync 2026-03-09T00:04:16.345 INFO:tasks.workunit.client.0.vm03.stdout:0/839: sync 2026-03-09T00:04:16.345 INFO:tasks.workunit.client.0.vm03.stdout:0/840: read - d2/da/d76/d8a/d8f/ffb zero size 2026-03-09T00:04:16.348 INFO:tasks.workunit.client.0.vm03.stdout:9/762: rename d15/d1c/d21/d54/d87/d93/cd1 to d15/d1c/d28/de1/ded/c103 0 2026-03-09T00:04:16.348 INFO:tasks.workunit.client.0.vm03.stdout:9/763: read - d15/d1c/d21/d54/d87/fd6 zero size 2026-03-09T00:04:16.349 INFO:tasks.workunit.client.1.vm06.stdout:1/954: rename d6/d21/d2d/d37/fb5 to d6/d8f/f145 0 2026-03-09T00:04:16.349 INFO:tasks.workunit.client.1.vm06.stdout:1/955: creat d6/d21/dfc/de8/f146 x:0 0 0 2026-03-09T00:04:16.349 INFO:tasks.workunit.client.1.vm06.stdout:1/956: truncate d6/d4c/fe4 611308 0 2026-03-09T00:04:16.349 INFO:tasks.workunit.client.1.vm06.stdout:1/957: fdatasync d6/d21/d2d/d3b/d42/f80 0 2026-03-09T00:04:16.349 INFO:tasks.workunit.client.1.vm06.stdout:1/958: chown d6/d21/d2d/d3b/d42/d129/fba 13095 1 2026-03-09T00:04:16.349 INFO:tasks.workunit.client.0.vm03.stdout:1/908: getdents d4/d3a/d3d/d98/dee 0 2026-03-09T00:04:16.350 INFO:tasks.workunit.client.0.vm03.stdout:4/912: mkdir d7/d20/d6a/dea/df6/d121 0 2026-03-09T00:04:16.359 INFO:tasks.workunit.client.0.vm03.stdout:0/841: symlink d2/da/d36/l12e 0 2026-03-09T00:04:16.360 INFO:tasks.workunit.client.1.vm06.stdout:1/959: creat d6/d4c/f147 x:0 0 0 2026-03-09T00:04:16.360 INFO:tasks.workunit.client.0.vm03.stdout:9/764: creat d15/d1c/d36/f104 x:0 0 0 2026-03-09T00:04:16.360 INFO:tasks.workunit.client.0.vm03.stdout:9/765: fdatasync d15/d1c/d36/d4d/fd9 0 2026-03-09T00:04:16.360 INFO:tasks.workunit.client.0.vm03.stdout:9/766: dread - d15/d1c/d21/d54/ff9 zero size 2026-03-09T00:04:16.360 INFO:tasks.workunit.client.0.vm03.stdout:9/767: write d15/d1c/d36/d4d/fd9 [790720,130200] 0 2026-03-09T00:04:16.360 INFO:tasks.workunit.client.0.vm03.stdout:9/768: creat d15/d1c/d36/f105 x:0 0 0 2026-03-09T00:04:16.360 INFO:tasks.workunit.client.0.vm03.stdout:9/769: write d15/d1c/d28/d6e/f7c [2331044,10035] 0 2026-03-09T00:04:16.361 INFO:tasks.workunit.client.0.vm03.stdout:0/842: mkdir d2/da/d36/ddf/d12f 0 2026-03-09T00:04:16.361 INFO:tasks.workunit.client.0.vm03.stdout:0/843: chown d2/da/dd/d49/d6c/d4b/daf/fc6 497 1 2026-03-09T00:04:16.368 INFO:tasks.workunit.client.0.vm03.stdout:1/909: rmdir d4/d15/d5c/d103 39 2026-03-09T00:04:16.377 INFO:tasks.workunit.client.0.vm03.stdout:9/770: rmdir d15/d1c/d36/d4d 39 2026-03-09T00:04:16.377 
INFO:tasks.workunit.client.0.vm03.stdout:9/771: fdatasync d15/d1c/d28/d6e/da2/fca 0 2026-03-09T00:04:16.377 INFO:tasks.workunit.client.0.vm03.stdout:1/910: mkdir d4/d3a/d32/d87/db3/d12c 0 2026-03-09T00:04:16.377 INFO:tasks.workunit.client.0.vm03.stdout:9/772: link d15/d1c/d21/d64/l32 d15/d1c/d36/d4d/dc4/l106 0 2026-03-09T00:04:16.377 INFO:tasks.workunit.client.0.vm03.stdout:1/911: rename d4/d15/f8a to d4/de2/f12d 0 2026-03-09T00:04:16.377 INFO:tasks.workunit.client.0.vm03.stdout:1/912: truncate d4/d15/d5c/f74 5040375 0 2026-03-09T00:04:16.377 INFO:tasks.workunit.client.0.vm03.stdout:1/913: chown d4/d3a/d43/daf/fbf 29313480 1 2026-03-09T00:04:16.380 INFO:tasks.workunit.client.0.vm03.stdout:0/844: write d2/da/dd/f38 [4664563,72175] 0 2026-03-09T00:04:16.380 INFO:tasks.workunit.client.0.vm03.stdout:0/845: chown d2/da/dd/d49/d6c/da6/dda/db5/ff1 33222 1 2026-03-09T00:04:16.381 INFO:tasks.workunit.client.0.vm03.stdout:0/846: truncate d2/da/dd/d49/d6c/f57 1094966 0 2026-03-09T00:04:16.381 INFO:tasks.workunit.client.0.vm03.stdout:0/847: getdents d2/da/d76/d8a/d8f 0 2026-03-09T00:04:16.382 INFO:tasks.workunit.client.0.vm03.stdout:0/848: mkdir d2/da/d36/ddf/d12f/d130 0 2026-03-09T00:04:16.403 INFO:tasks.workunit.client.0.vm03.stdout:5/838: dread d1c/d20/d55/d4f/d58/db5/df7/ff8 [0,4194304] 0 2026-03-09T00:04:16.404 INFO:tasks.workunit.client.0.vm03.stdout:5/839: rename d1c/d20/d55/d66/dc6/df1/c104 to d1c/d51/df2/c10f 0 2026-03-09T00:04:16.404 INFO:tasks.workunit.client.0.vm03.stdout:5/840: readlink d1c/d67/lfe 0 2026-03-09T00:04:16.404 INFO:tasks.workunit.client.0.vm03.stdout:5/841: fsync d1c/d20/d55/f34 0 2026-03-09T00:04:16.404 INFO:tasks.workunit.client.0.vm03.stdout:5/842: dread - d1c/d20/d55/d66/d70/fde zero size 2026-03-09T00:04:16.405 INFO:tasks.workunit.client.0.vm03.stdout:5/843: creat d1c/d20/d97/f110 x:0 0 0 2026-03-09T00:04:16.426 INFO:tasks.workunit.client.1.vm06.stdout:9/962: dwrite d1/d4/d11f/d25/d85/f28 [0,4194304] 0 2026-03-09T00:04:16.427 INFO:tasks.workunit.client.1.vm06.stdout:9/963: mknod d1/d4/d6e/d9/c141 0 2026-03-09T00:04:16.427 INFO:tasks.workunit.client.1.vm06.stdout:9/964: stat d1/d73/c7d 0 2026-03-09T00:04:16.427 INFO:tasks.workunit.client.1.vm06.stdout:9/965: creat d1/d3/d4f/d91/ddb/f142 x:0 0 0 2026-03-09T00:04:16.495 INFO:tasks.workunit.client.0.vm03.stdout:9/773: dread d15/d1c/d28/f55 [0,4194304] 0 2026-03-09T00:04:16.496 INFO:tasks.workunit.client.0.vm03.stdout:9/774: truncate d15/d1c/d21/d64/fc2 278227 0 2026-03-09T00:04:16.496 INFO:tasks.workunit.client.0.vm03.stdout:9/775: write d15/d1c/d36/d4d/dc4/fff [738366,35537] 0 2026-03-09T00:04:16.519 INFO:tasks.workunit.client.0.vm03.stdout:1/914: dwrite d4/d3a/f26 [0,4194304] 0 2026-03-09T00:04:16.519 INFO:tasks.workunit.client.0.vm03.stdout:1/915: write d4/d3a/d43/f5a [796352,91967] 0 2026-03-09T00:04:16.519 INFO:tasks.workunit.client.0.vm03.stdout:7/756: dwrite d2/d1f/d3a/d24/da4/f47 [0,4194304] 0 2026-03-09T00:04:16.519 INFO:tasks.workunit.client.0.vm03.stdout:1/916: rename d4/d3a/d3d/d98/dee/d93 to d4/d3a/d3d/d98/dee/d9e/d12e 0 2026-03-09T00:04:16.519 INFO:tasks.workunit.client.0.vm03.stdout:1/917: chown d4/d3a/d3d/d98/dee/d9e/d12e 172772 1 2026-03-09T00:04:16.520 INFO:tasks.workunit.client.0.vm03.stdout:7/757: symlink d2/d4/db7/le4 0 2026-03-09T00:04:16.521 INFO:tasks.workunit.client.0.vm03.stdout:1/918: symlink d4/d3a/d61/d78/l12f 0 2026-03-09T00:04:16.521 INFO:tasks.workunit.client.0.vm03.stdout:7/758: mknod d2/d4/d1e/ce5 0 2026-03-09T00:04:16.522 INFO:tasks.workunit.client.0.vm03.stdout:1/919: creat 
d4/d6/d52/db5/f130 x:0 0 0 2026-03-09T00:04:16.523 INFO:tasks.workunit.client.0.vm03.stdout:1/920: symlink d4/d15/d77/dce/l131 0 2026-03-09T00:04:16.527 INFO:tasks.workunit.client.0.vm03.stdout:4/913: dwrite d7/d20/d6a/dea/d54/ff3 [0,4194304] 0 2026-03-09T00:04:16.527 INFO:tasks.workunit.client.0.vm03.stdout:4/914: read - d7/d20/d6a/dea/d38/f116 zero size 2026-03-09T00:04:16.541 INFO:tasks.workunit.client.1.vm06.stdout:1/960: dwrite d6/f1d [0,4194304] 0 2026-03-09T00:04:16.547 INFO:tasks.workunit.client.0.vm03.stdout:3/678: dwrite d2/db/d3b/d5d/f8d [0,4194304] 0 2026-03-09T00:04:16.548 INFO:tasks.workunit.client.0.vm03.stdout:3/679: symlink d2/ld1 0 2026-03-09T00:04:16.549 INFO:tasks.workunit.client.0.vm03.stdout:3/680: rename d2/db/d3b/la8 to d2/db/d40/d44/db5/ld2 0 2026-03-09T00:04:16.551 INFO:tasks.workunit.client.0.vm03.stdout:3/681: rename d2/db/f64 to d2/db/d40/d44/db5/fd3 0 2026-03-09T00:04:16.554 INFO:tasks.workunit.client.0.vm03.stdout:7/759: dread d2/d1f/f3b [4194304,4194304] 0 2026-03-09T00:04:16.555 INFO:tasks.workunit.client.0.vm03.stdout:0/849: dwrite d2/da/d4e/faa [4194304,4194304] 0 2026-03-09T00:04:16.555 INFO:tasks.workunit.client.0.vm03.stdout:0/850: write f0 [4189245,50201] 0 2026-03-09T00:04:16.555 INFO:tasks.workunit.client.0.vm03.stdout:0/851: write f0 [1121414,18931] 0 2026-03-09T00:04:16.555 INFO:tasks.workunit.client.0.vm03.stdout:7/760: symlink d2/d4/d1e/le6 0 2026-03-09T00:04:16.555 INFO:tasks.workunit.client.0.vm03.stdout:7/761: dread - d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/d9c/fd7 zero size 2026-03-09T00:04:16.555 INFO:tasks.workunit.client.0.vm03.stdout:7/762: creat d2/d4/d1e/d5e/d7e/fe7 x:0 0 0 2026-03-09T00:04:16.556 INFO:tasks.workunit.client.0.vm03.stdout:0/852: mknod d2/da/dd/d49/d6c/da6/dcf/c131 0 2026-03-09T00:04:16.556 INFO:tasks.workunit.client.0.vm03.stdout:0/853: readlink d2/da/l110 0 2026-03-09T00:04:16.558 INFO:tasks.workunit.client.0.vm03.stdout:7/763: symlink d2/d1f/d3a/d24/da4/d46/d81/le8 0 2026-03-09T00:04:16.573 INFO:tasks.workunit.client.0.vm03.stdout:7/764: symlink d2/d1f/d3a/d24/da4/d46/d81/le9 0 2026-03-09T00:04:16.587 INFO:tasks.workunit.client.0.vm03.stdout:2/873: dwrite f7 [0,4194304] 0 2026-03-09T00:04:16.589 INFO:tasks.workunit.client.0.vm03.stdout:2/874: truncate d8/d26/d5e/f64 2448071 0 2026-03-09T00:04:16.595 INFO:tasks.workunit.client.0.vm03.stdout:2/875: dread d8/d26/d5e/d6f/d97/f1c [0,4194304] 0 2026-03-09T00:04:16.596 INFO:tasks.workunit.client.0.vm03.stdout:3/682: dread d2/db/f13 [0,4194304] 0 2026-03-09T00:04:16.598 INFO:tasks.workunit.client.0.vm03.stdout:2/876: creat d8/d1b/d24/f123 x:0 0 0 2026-03-09T00:04:16.618 INFO:tasks.workunit.client.0.vm03.stdout:8/843: dwrite d7/f25 [0,4194304] 0 2026-03-09T00:04:16.618 INFO:tasks.workunit.client.0.vm03.stdout:8/844: write d7/df/d1a/f2e [2128736,25409] 0 2026-03-09T00:04:16.618 INFO:tasks.workunit.client.0.vm03.stdout:8/845: write d7/df/d1a/fc4 [695492,66935] 0 2026-03-09T00:04:16.618 INFO:tasks.workunit.client.1.vm06.stdout:9/966: dwrite d1/f78 [4194304,4194304] 0 2026-03-09T00:04:16.618 INFO:tasks.workunit.client.1.vm06.stdout:9/967: write d1/d4/d11f/d25/d85/fb8 [4483360,127554] 0 2026-03-09T00:04:16.626 INFO:tasks.workunit.client.0.vm03.stdout:8/846: creat d7/df/d1a/d40/d9d/da3/df0/fff x:0 0 0 2026-03-09T00:04:16.626 INFO:tasks.workunit.client.0.vm03.stdout:8/847: chown d7/df/d1a/d40/d9d/df2/d38/d91 62454377 1 2026-03-09T00:04:16.630 INFO:tasks.workunit.client.0.vm03.stdout:8/848: dread d7/df/d1a/d40/d9d/df2/d3f/f59 [0,4194304] 0 2026-03-09T00:04:16.630 
INFO:tasks.workunit.client.0.vm03.stdout:8/849: stat d7/df/d1a/d40/d9d/df2/d38/d91/ce6 0 2026-03-09T00:04:16.630 INFO:tasks.workunit.client.0.vm03.stdout:8/850: creat d7/df/d1a/d40/d9d/df2/d3f/df1/dfd/f100 x:0 0 0 2026-03-09T00:04:16.642 INFO:tasks.workunit.client.0.vm03.stdout:8/851: getdents d7/df/d1a/d40/d9d/df2/dc3 0 2026-03-09T00:04:16.642 INFO:tasks.workunit.client.0.vm03.stdout:8/852: mkdir d7/df/d1a/d40/dc8/d101 0 2026-03-09T00:04:16.642 INFO:tasks.workunit.client.0.vm03.stdout:8/853: symlink d7/df/d1a/d2b/l102 0 2026-03-09T00:04:16.642 INFO:tasks.workunit.client.1.vm06.stdout:9/968: dread d1/d4/d6e/f5d [4194304,4194304] 0 2026-03-09T00:04:16.642 INFO:tasks.workunit.client.1.vm06.stdout:9/969: mknod d1/d3/d4f/c143 0 2026-03-09T00:04:16.642 INFO:tasks.workunit.client.1.vm06.stdout:9/970: creat d1/d4/df5/f144 x:0 0 0 2026-03-09T00:04:16.642 INFO:tasks.workunit.client.1.vm06.stdout:9/971: chown d1/d4/d11f/d25/d85/c13c 4770 1 2026-03-09T00:04:16.642 INFO:tasks.workunit.client.1.vm06.stdout:9/972: write d1/d3/f11d [475409,104399] 0 2026-03-09T00:04:16.642 INFO:tasks.workunit.client.0.vm03.stdout:8/854: dread d7/df/d1a/d40/d58/f7a [0,4194304] 0 2026-03-09T00:04:16.648 INFO:tasks.workunit.client.0.vm03.stdout:9/776: dwrite d15/d1c/d21/d64/fac [0,4194304] 0 2026-03-09T00:04:16.652 INFO:tasks.workunit.client.0.vm03.stdout:9/777: unlink d15/d1c/d36/f72 0 2026-03-09T00:04:16.656 INFO:tasks.workunit.client.1.vm06.stdout:1/961: dwrite d6/d8f/d10f/f119 [0,4194304] 0 2026-03-09T00:04:16.656 INFO:tasks.workunit.client.1.vm06.stdout:1/962: creat d6/d21/d2d/f148 x:0 0 0 2026-03-09T00:04:16.661 INFO:tasks.workunit.client.0.vm03.stdout:9/778: write f11 [3788945,29161] 0 2026-03-09T00:04:16.671 INFO:tasks.workunit.client.1.vm06.stdout:1/963: dread d6/f19 [0,4194304] 0 2026-03-09T00:04:16.673 INFO:tasks.workunit.client.1.vm06.stdout:1/964: mkdir d6/d8f/d10f/d121/d149 0 2026-03-09T00:04:16.674 INFO:tasks.workunit.client.1.vm06.stdout:1/965: symlink d6/d21/d2d/d3b/dc9/l14a 0 2026-03-09T00:04:16.675 INFO:tasks.workunit.client.1.vm06.stdout:1/966: creat d6/dc4/f14b x:0 0 0 2026-03-09T00:04:16.677 INFO:tasks.workunit.client.1.vm06.stdout:1/967: rename d6/d21/d2d/d3b/d42/d129/f12c to d6/d4c/d79/d10c/d11d/f14c 0 2026-03-09T00:04:16.677 INFO:tasks.workunit.client.1.vm06.stdout:1/968: chown d6/d21/d2d/d37/d6d/dd7/ff6 9 1 2026-03-09T00:04:16.677 INFO:tasks.workunit.client.1.vm06.stdout:1/969: mkdir d6/d8f/d10f/d14d 0 2026-03-09T00:04:16.728 INFO:tasks.workunit.client.0.vm03.stdout:3/683: dwrite d2/f4e [0,4194304] 0 2026-03-09T00:04:16.729 INFO:tasks.workunit.client.0.vm03.stdout:3/684: stat d2/dbf 0 2026-03-09T00:04:16.729 INFO:tasks.workunit.client.0.vm03.stdout:3/685: getdents d2/db/d6a 0 2026-03-09T00:04:16.730 INFO:tasks.workunit.client.0.vm03.stdout:3/686: rename d2/db/d3b/d5f/da5/d72/dbd/fcf to d2/db/d56/fd4 0 2026-03-09T00:04:16.731 INFO:tasks.workunit.client.0.vm03.stdout:3/687: symlink d2/db/d3b/d5f/ld5 0 2026-03-09T00:04:16.731 INFO:tasks.workunit.client.0.vm03.stdout:3/688: readlink d2/db/d2d/l41 0 2026-03-09T00:04:16.753 INFO:tasks.workunit.client.0.vm03.stdout:8/855: dwrite d7/df/d1a/d40/d9d/df2/d38/d60/f6e [0,4194304] 0 2026-03-09T00:04:16.755 INFO:tasks.workunit.client.0.vm03.stdout:8/856: rename d7/df/d1a/d40/d9d/df2/d38/d91/dc2 to d7/df/d1a/d40/d9d/df2/d38/d91/d103 0 2026-03-09T00:04:16.773 INFO:tasks.workunit.client.0.vm03.stdout:4/915: dwrite d7/d20/d6a/f76 [0,4194304] 0 2026-03-09T00:04:16.774 INFO:tasks.workunit.client.0.vm03.stdout:2/877: dwrite d8/d1b/d2a/f2d [0,4194304] 0 
2026-03-09T00:04:16.774 INFO:tasks.workunit.client.0.vm03.stdout:3/689: dwrite d2/db/d56/fd4 [0,4194304] 0 2026-03-09T00:04:16.780 INFO:tasks.workunit.client.1.vm06.stdout:1/970: dwrite d6/d21/d2d/d37/dbc/f12d [0,4194304] 0 2026-03-09T00:04:16.780 INFO:tasks.workunit.client.1.vm06.stdout:1/971: fdatasync d6/d4c/d71/fbf 0 2026-03-09T00:04:16.781 INFO:tasks.workunit.client.0.vm03.stdout:2/878: dread d8/d1b/d24/fb2 [0,4194304] 0 2026-03-09T00:04:16.781 INFO:tasks.workunit.client.0.vm03.stdout:2/879: creat d8/d1b/d2a/d2e/f124 x:0 0 0 2026-03-09T00:04:16.781 INFO:tasks.workunit.client.0.vm03.stdout:2/880: truncate d8/d26/d5e/d5f/d95/f10c 4430029 0 2026-03-09T00:04:16.784 INFO:tasks.workunit.client.0.vm03.stdout:3/690: rename d2/db/f28 to d2/db/d56/fd6 0 2026-03-09T00:04:16.784 INFO:tasks.workunit.client.0.vm03.stdout:3/691: stat d2/db/d3b/d5d/f60 0 2026-03-09T00:04:16.785 INFO:tasks.workunit.client.0.vm03.stdout:4/916: mkdir d7/d20/d6a/d77/db7/d122 0 2026-03-09T00:04:16.789 INFO:tasks.workunit.client.1.vm06.stdout:1/972: link d6/d4c/c7a d6/d21/d2d/c14e 0 2026-03-09T00:04:16.792 INFO:tasks.workunit.client.0.vm03.stdout:9/779: dwrite d15/d1c/d28/de1/ded/fef [0,4194304] 0 2026-03-09T00:04:16.792 INFO:tasks.workunit.client.0.vm03.stdout:9/780: creat d15/d7f/f107 x:0 0 0 2026-03-09T00:04:16.795 INFO:tasks.workunit.client.1.vm06.stdout:9/973: dwrite d1/d3/d2b/d58/fe4 [0,4194304] 0 2026-03-09T00:04:16.795 INFO:tasks.workunit.client.0.vm03.stdout:4/917: write d7/d27/fc5 [2108036,82273] 0 2026-03-09T00:04:16.802 INFO:tasks.workunit.client.1.vm06.stdout:9/974: dread d1/d4/d11f/d25/d85/f28 [0,4194304] 0 2026-03-09T00:04:16.802 INFO:tasks.workunit.client.1.vm06.stdout:9/975: chown d1/d3/d2b/fbb 781 1 2026-03-09T00:04:16.803 INFO:tasks.workunit.client.0.vm03.stdout:9/781: rename d15/d1c/d28/de1/ld3 to d15/d1c/d36/l108 0 2026-03-09T00:04:16.805 INFO:tasks.workunit.client.1.vm06.stdout:9/976: mknod d1/d4/d6e/d9/c145 0 2026-03-09T00:04:16.806 INFO:tasks.workunit.client.0.vm03.stdout:3/692: rmdir d2/db/d2d 39 2026-03-09T00:04:16.810 INFO:tasks.workunit.client.0.vm03.stdout:9/782: dread d15/d1c/d21/d64/fc2 [0,4194304] 0 2026-03-09T00:04:16.830 INFO:tasks.workunit.client.0.vm03.stdout:6/773: sync 2026-03-09T00:04:16.831 INFO:tasks.workunit.client.0.vm03.stdout:6/774: link d13/d1e/d44/d59/d77/l78 d13/d35/d71/d97/ded/l105 0 2026-03-09T00:04:16.832 INFO:tasks.workunit.client.0.vm03.stdout:6/775: mknod d13/d35/d71/d97/ded/c106 0 2026-03-09T00:04:16.832 INFO:tasks.workunit.client.0.vm03.stdout:6/776: chown d13/dc4/dea/d102/fde 0 1 2026-03-09T00:04:16.832 INFO:tasks.workunit.client.0.vm03.stdout:6/777: dread - d13/d35/d71/d97/da5/db1/feb zero size 2026-03-09T00:04:16.833 INFO:tasks.workunit.client.0.vm03.stdout:6/778: mknod d13/d35/d71/d97/c107 0 2026-03-09T00:04:16.835 INFO:tasks.workunit.client.0.vm03.stdout:6/779: link d13/d35/d74/d89/ff8 d13/d35/d71/d97/da5/db1/f108 0 2026-03-09T00:04:16.837 INFO:tasks.workunit.client.0.vm03.stdout:6/780: mkdir d13/dc4/dea/d109 0 2026-03-09T00:04:16.838 INFO:tasks.workunit.client.0.vm03.stdout:6/781: mknod d13/d1e/d44/d59/d77/c10a 0 2026-03-09T00:04:16.838 INFO:tasks.workunit.client.0.vm03.stdout:6/782: truncate d13/d1e/f3e 4089161 0 2026-03-09T00:04:16.838 INFO:tasks.workunit.client.0.vm03.stdout:6/783: fdatasync fb 0 2026-03-09T00:04:16.838 INFO:tasks.workunit.client.0.vm03.stdout:6/784: chown d13/d35/d72/fb7 6124731 1 2026-03-09T00:04:16.839 INFO:tasks.workunit.client.0.vm03.stdout:6/785: mkdir d13/d1e/d44/d10b 0 2026-03-09T00:04:16.841 
INFO:tasks.workunit.client.0.vm03.stdout:6/786: creat d13/d1e/d44/d59/dec/d62/df5/f10c x:0 0 0 2026-03-09T00:04:16.845 INFO:tasks.workunit.client.0.vm03.stdout:8/857: write d7/f9b [2670143,79985] 0 2026-03-09T00:04:16.849 INFO:tasks.workunit.client.0.vm03.stdout:8/858: dread d7/df/d1a/d40/d9d/df2/d3f/d95/fb4 [0,4194304] 0 2026-03-09T00:04:16.850 INFO:tasks.workunit.client.0.vm03.stdout:8/859: write d7/df/d1a/d40/d9d/da3/df0/fff [471489,117079] 0 2026-03-09T00:04:16.850 INFO:tasks.workunit.client.0.vm03.stdout:8/860: truncate d7/df/d1a/d40/d9d/da3/dd2/fed 18975 0 2026-03-09T00:04:16.852 INFO:tasks.workunit.client.0.vm03.stdout:8/861: write f3 [2925412,95194] 0 2026-03-09T00:04:16.863 INFO:tasks.workunit.client.0.vm03.stdout:8/862: write d7/f34 [4147787,74769] 0 2026-03-09T00:04:16.863 INFO:tasks.workunit.client.0.vm03.stdout:9/783: dread d15/d1c/d36/f78 [0,4194304] 0 2026-03-09T00:04:16.864 INFO:tasks.workunit.client.0.vm03.stdout:9/784: creat d15/d77/de2/f109 x:0 0 0 2026-03-09T00:04:16.864 INFO:tasks.workunit.client.0.vm03.stdout:9/785: creat d15/d1c/d28/dda/f10a x:0 0 0 2026-03-09T00:04:16.864 INFO:tasks.workunit.client.0.vm03.stdout:9/786: chown d15/d1c/d21/d54/dab/ld7 6 1 2026-03-09T00:04:16.876 INFO:tasks.workunit.client.0.vm03.stdout:4/918: dwrite d7/d20/d6a/dea/d78/f114 [0,4194304] 0 2026-03-09T00:04:16.877 INFO:tasks.workunit.client.1.vm06.stdout:1/973: dwrite d6/d4c/d71/fea [0,4194304] 0 2026-03-09T00:04:16.883 INFO:tasks.workunit.client.1.vm06.stdout:1/974: symlink d6/d21/d2d/d37/dbc/l14f 0 2026-03-09T00:04:16.883 INFO:tasks.workunit.client.1.vm06.stdout:1/975: write d6/d21/f3d [3775456,42741] 0 2026-03-09T00:04:16.883 INFO:tasks.workunit.client.0.vm03.stdout:8/863: write d7/df/d1a/d40/db3/f75 [142423,46849] 0 2026-03-09T00:04:16.884 INFO:tasks.workunit.client.0.vm03.stdout:8/864: write d7/df/f3d [1992362,80384] 0 2026-03-09T00:04:16.893 INFO:tasks.workunit.client.1.vm06.stdout:1/976: stat d6/d21/d2d/lb6 0 2026-03-09T00:04:16.902 INFO:tasks.workunit.client.1.vm06.stdout:1/977: chown d6/d21/d2d/d3b/d42/c111 913 1 2026-03-09T00:04:16.902 INFO:tasks.workunit.client.1.vm06.stdout:1/978: fdatasync d6/d8f/d10f/f12e 0 2026-03-09T00:04:16.905 INFO:tasks.workunit.client.0.vm03.stdout:0/854: sync 2026-03-09T00:04:16.905 INFO:tasks.workunit.client.0.vm03.stdout:5/844: sync 2026-03-09T00:04:16.906 INFO:tasks.workunit.client.0.vm03.stdout:1/921: sync 2026-03-09T00:04:16.906 INFO:tasks.workunit.client.0.vm03.stdout:7/765: sync 2026-03-09T00:04:16.906 INFO:tasks.workunit.client.0.vm03.stdout:7/766: truncate d2/d4/db7/d67/f64 4531189 0 2026-03-09T00:04:16.909 INFO:tasks.workunit.client.0.vm03.stdout:5/845: dread d1c/d20/d55/d4f/d58/db5/f3c [0,4194304] 0 2026-03-09T00:04:16.911 INFO:tasks.workunit.client.0.vm03.stdout:0/855: rename d2/da/dd/d6e/l93 to d2/da/d1a/l132 0 2026-03-09T00:04:16.922 INFO:tasks.workunit.client.0.vm03.stdout:0/856: fdatasync d2/da/dd/f7b 0 2026-03-09T00:04:16.922 INFO:tasks.workunit.client.0.vm03.stdout:1/922: getdents d4/d15/d5c/d6c 0 2026-03-09T00:04:16.922 INFO:tasks.workunit.client.0.vm03.stdout:0/857: rename d2/da/d76/d8a/d8f/ffb to d2/da/d36/ddf/f133 0 2026-03-09T00:04:16.922 INFO:tasks.workunit.client.0.vm03.stdout:5/846: write d1c/d20/dc0/fd4 [1657337,68695] 0 2026-03-09T00:04:16.959 INFO:tasks.workunit.client.1.vm06.stdout:9/977: dwrite d1/d3/d4f/d52/f5e [0,4194304] 0 2026-03-09T00:04:16.960 INFO:tasks.workunit.client.1.vm06.stdout:9/978: unlink d1/d3/d4f/cc6 0 2026-03-09T00:04:16.960 INFO:tasks.workunit.client.0.vm03.stdout:0/858: dread d2/da/dd/d49/f6a 
[0,4194304] 0
2026-03-09T00:04:16.960 INFO:tasks.workunit.client.1.vm06.stdout:9/979: creat d1/d73/dcf/d109/f146 x:0 0 0
2026-03-09T00:04:16.960 INFO:tasks.workunit.client.1.vm06.stdout:9/980: read d1/d4/d2f/fa0 [3027350,110777] 0
2026-03-09T00:04:16.960 INFO:tasks.workunit.client.1.vm06.stdout:9/981: fsync d1/d3/d4f/d91/d94/fcd 0
2026-03-09T00:04:16.961 INFO:tasks.workunit.client.0.vm03.stdout:0/859: rename d2/da/dd/d49/d6c/d4b/d55/d6f/c79 to d2/d111/c134 0
2026-03-09T00:04:16.961 INFO:tasks.workunit.client.0.vm03.stdout:0/860: write d2/da/d4e/f120 [1249034,103591] 0
2026-03-09T00:04:16.963 INFO:tasks.workunit.client.0.vm03.stdout:5/847: write d1c/d20/d56/db4/fb7 [55627,32122] 0
2026-03-09T00:04:16.978 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:16 vm06.local ceph-mon[58395]: pgmap v11: 65 pgs: 65 active+clean; 3.5 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 103 MiB/s rd, 127 MiB/s wr, 199 op/s
2026-03-09T00:04:16.978 INFO:tasks.workunit.client.0.vm03.stdout:5/848: rmdir d1c/d20/d55/db0/dc7 39
2026-03-09T00:04:16.978 INFO:tasks.workunit.client.0.vm03.stdout:5/849: creat d1c/d20/d55/db0/dc7/d101/d10b/f111 x:0 0 0
2026-03-09T00:04:16.981 INFO:tasks.workunit.client.0.vm03.stdout:5/850: read d1c/d20/d55/d66/d70/f80 [630159,95503] 0
2026-03-09T00:04:16.988 INFO:tasks.workunit.client.0.vm03.stdout:6/787: dwrite d13/dc4/dea/d102/fde [0,4194304] 0
2026-03-09T00:04:16.991 INFO:tasks.workunit.client.0.vm03.stdout:6/788: write d13/d35/f68 [2287938,10901] 0
2026-03-09T00:04:16.991 INFO:tasks.workunit.client.0.vm03.stdout:6/789: write d13/d35/d69/f84 [8786814,117403] 0
2026-03-09T00:04:16.991 INFO:tasks.workunit.client.0.vm03.stdout:6/790: fdatasync d13/f1a 0
2026-03-09T00:04:16.991 INFO:tasks.workunit.client.0.vm03.stdout:6/791: symlink d13/d35/d74/d89/l10d 0
2026-03-09T00:04:16.991 INFO:tasks.workunit.client.0.vm03.stdout:6/792: link d13/d35/d69/dee/ffe d13/d1e/f10e 0
2026-03-09T00:04:16.996 INFO:tasks.workunit.client.0.vm03.stdout:5/851: dread ff [0,4194304] 0
2026-03-09T00:04:16.997 INFO:tasks.workunit.client.0.vm03.stdout:5/852: truncate ff 3621587 0
2026-03-09T00:04:16.998 INFO:tasks.workunit.client.0.vm03.stdout:5/853: unlink d1c/d20/d97/fb3 0
2026-03-09T00:04:16.998 INFO:tasks.workunit.client.0.vm03.stdout:5/854: read - d1c/d20/d55/d66/d70/fde zero size
2026-03-09T00:04:17.000 INFO:tasks.workunit.client.0.vm03.stdout:5/855: unlink d1c/d51/d6a/d75/faf 0
2026-03-09T00:04:17.004 INFO:tasks.workunit.client.0.vm03.stdout:5/856: getdents d1c/d51 0
2026-03-09T00:04:17.004 INFO:tasks.workunit.client.0.vm03.stdout:5/857: getdents d1c/d20/d55/d4f/d58/d73/d9e 0
2026-03-09T00:04:17.006 INFO:tasks.workunit.client.0.vm03.stdout:5/858: link fe d1c/f112 0
2026-03-09T00:04:17.006 INFO:tasks.workunit.client.0.vm03.stdout:5/859: creat d1c/d20/d55/f113 x:0 0 0
2026-03-09T00:04:17.009 INFO:tasks.workunit.client.0.vm03.stdout:5/860: mknod d1c/d20/d55/db0/c114 0
2026-03-09T00:04:17.044 INFO:tasks.workunit.client.0.vm03.stdout:4/919: dwrite d7/d20/d6a/d77/d25/de2/df1/f10d [0,4194304] 0
2026-03-09T00:04:17.044 INFO:tasks.workunit.client.0.vm03.stdout:9/787: dwrite d15/d1c/f102 [0,4194304] 0
2026-03-09T00:04:17.046 INFO:tasks.workunit.client.0.vm03.stdout:4/920: creat d7/d20/d6a/dea/d38/dfb/f123 x:0 0 0
2026-03-09T00:04:17.054 INFO:tasks.workunit.client.0.vm03.stdout:4/921: truncate d7/d20/d6a/d77/d25/de2/f11b 268507 0
2026-03-09T00:04:17.054 INFO:tasks.workunit.client.0.vm03.stdout:9/788: rename d15/l82 to d15/d7f/l10b 0
2026-03-09T00:04:17.054 INFO:tasks.workunit.client.0.vm03.stdout:9/789: getdents d15/d1c/d21/d75/de0 0
2026-03-09T00:04:17.054 INFO:tasks.workunit.client.0.vm03.stdout:9/790: dread d15/d7f/fdf [0,4194304] 0
2026-03-09T00:04:17.056 INFO:tasks.workunit.client.0.vm03.stdout:2/881: dwrite d8/d26/d5e/d6f/d97/f1a [0,4194304] 0
2026-03-09T00:04:17.058 INFO:tasks.workunit.client.0.vm03.stdout:2/882: mknod d8/d26/d5e/d5f/ded/c125 0
2026-03-09T00:04:17.058 INFO:tasks.workunit.client.0.vm03.stdout:2/883: chown d8/d1b/d8f/de6 86212544 1
2026-03-09T00:04:17.058 INFO:tasks.workunit.client.0.vm03.stdout:2/884: chown d8/d1b/d2a/d2e/df5/ff9 110193831 1
2026-03-09T00:04:17.058 INFO:tasks.workunit.client.0.vm03.stdout:2/885: creat d8/d26/d5e/d6f/f126 x:0 0 0
2026-03-09T00:04:17.058 INFO:tasks.workunit.client.0.vm03.stdout:2/886: stat d8/d1b/d2a/c114 0
2026-03-09T00:04:17.060 INFO:tasks.workunit.client.0.vm03.stdout:4/922: write d7/d20/d6a/d77/d25/fa1 [794167,123248] 0
2026-03-09T00:04:17.060 INFO:tasks.workunit.client.0.vm03.stdout:4/923: write d7/d20/d6a/dde/f109 [781928,121582] 0
2026-03-09T00:04:17.060 INFO:tasks.workunit.client.0.vm03.stdout:4/924: write d7/d20/d6a/dea/df6/f11a [825729,129502] 0
2026-03-09T00:04:17.060 INFO:tasks.workunit.client.0.vm03.stdout:2/887: truncate d8/d1b/d2a/fbb 7635299 0
2026-03-09T00:04:17.060 INFO:tasks.workunit.client.0.vm03.stdout:9/791: dread d15/f44 [0,4194304] 0
2026-03-09T00:04:17.061 INFO:tasks.workunit.client.0.vm03.stdout:9/792: chown d15/d1c/d36/d4d 426912380 1
2026-03-09T00:04:17.062 INFO:tasks.workunit.client.0.vm03.stdout:4/925: symlink d7/d20/d6a/dea/d54/d58/d11f/l124 0
2026-03-09T00:04:17.062 INFO:tasks.workunit.client.0.vm03.stdout:9/793: read d15/d1c/fd5 [375590,94772] 0
2026-03-09T00:04:17.064 INFO:tasks.workunit.client.0.vm03.stdout:2/888: rename d8/d26/d5e/d6f/d97 to d8/d1b/d24/da5/dda/def/d127 0
2026-03-09T00:04:17.064 INFO:tasks.workunit.client.0.vm03.stdout:2/889: write d8/d1b/d24/da5/dda/de0/ff1 [444260,42221] 0
2026-03-09T00:04:17.074 INFO:tasks.workunit.client.1.vm06.stdout:9/982: dwrite d1/d3/d4f/d52/de3/de5/fed [0,4194304] 0
2026-03-09T00:04:17.074 INFO:tasks.workunit.client.1.vm06.stdout:1/979: dwrite d6/d4c/f8e [0,4194304] 0
2026-03-09T00:04:17.076 INFO:tasks.workunit.client.1.vm06.stdout:9/983: mknod d1/d3/d4f/d52/de3/c147 0
2026-03-09T00:04:17.076 INFO:tasks.workunit.client.1.vm06.stdout:9/984: write d1/d4/fd6 [2585169,113076] 0
2026-03-09T00:04:17.076 INFO:tasks.workunit.client.1.vm06.stdout:1/980: rmdir d6/d21/d2d/d113 39
2026-03-09T00:04:17.077 INFO:tasks.workunit.client.1.vm06.stdout:1/981: truncate d6/d21/da6/f13b 675195 0
2026-03-09T00:04:17.079 INFO:tasks.workunit.client.1.vm06.stdout:1/982: symlink d6/d21/da6/l150 0
2026-03-09T00:04:17.085 INFO:tasks.workunit.client.1.vm06.stdout:1/983: write d6/f1d [1360713,114838] 0
2026-03-09T00:04:17.094 INFO:tasks.workunit.client.0.vm03.stdout:6/793: dread d13/f17 [0,4194304] 0
2026-03-09T00:04:17.095 INFO:tasks.workunit.client.0.vm03.stdout:6/794: creat d13/d35/d72/dcc/f10f x:0 0 0
2026-03-09T00:04:17.095 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:16 vm03.local ceph-mon[52346]: pgmap v11: 65 pgs: 65 active+clean; 3.5 GiB data, 12 GiB used, 108 GiB / 120 GiB avail; 103 MiB/s rd, 127 MiB/s wr, 199 op/s
2026-03-09T00:04:17.095 INFO:tasks.workunit.client.1.vm06.stdout:1/984: creat d6/d4c/d79/d10c/f151 x:0 0 0
2026-03-09T00:04:17.095 INFO:tasks.workunit.client.1.vm06.stdout:1/985: creat d6/d8f/d10f/d121/d149/f152 x:0 0 0
2026-03-09T00:04:17.095 INFO:tasks.workunit.client.1.vm06.stdout:1/986: link d6/d21/d2d/fc5 d6/d8f/d10f/d121/d149/f153 0
2026-03-09T00:04:17.095 INFO:tasks.workunit.client.0.vm03.stdout:6/795: link d13/d1e/d44/d59/dec/d62/l90 d13/d35/d71/d97/da5/db1/l110 0
2026-03-09T00:04:17.095 INFO:tasks.workunit.client.0.vm03.stdout:6/796: creat d13/d35/d72/dcc/f111 x:0 0 0
2026-03-09T00:04:17.095 INFO:tasks.workunit.client.0.vm03.stdout:6/797: creat d13/d35/d69/dee/f112 x:0 0 0
2026-03-09T00:04:17.095 INFO:tasks.workunit.client.0.vm03.stdout:6/798: mknod d13/d35/d69/c113 0
2026-03-09T00:04:17.095 INFO:tasks.workunit.client.0.vm03.stdout:6/799: chown d13/dc4/dea/d102/cdf 10 1
2026-03-09T00:04:17.095 INFO:tasks.workunit.client.0.vm03.stdout:6/800: write d13/d1e/d44/d4a/d52/f54 [759990,117024] 0
2026-03-09T00:04:17.095 INFO:tasks.workunit.client.0.vm03.stdout:6/801: dread d13/d1e/d44/d59/dec/d62/f79 [0,4194304] 0
2026-03-09T00:04:17.095 INFO:tasks.workunit.client.0.vm03.stdout:6/802: mkdir d13/d1e/d44/d59/d77/d114 0
2026-03-09T00:04:17.095 INFO:tasks.workunit.client.0.vm03.stdout:6/803: fsync d13/d35/dff/f104 0
2026-03-09T00:04:17.107 INFO:tasks.workunit.client.0.vm03.stdout:8/865: dwrite d7/df/d1a/d40/d9d/df2/d38/d4c/fc1 [0,4194304] 0
2026-03-09T00:04:17.107 INFO:tasks.workunit.client.0.vm03.stdout:8/866: write d7/df/f29 [5111772,16264] 0
2026-03-09T00:04:17.108 INFO:tasks.workunit.client.0.vm03.stdout:8/867: symlink d7/df/d1a/l104 0
2026-03-09T00:04:17.109 INFO:tasks.workunit.client.0.vm03.stdout:8/868: mknod d7/df/d1a/d40/d9d/df2/d3f/df1/c105 0
2026-03-09T00:04:17.109 INFO:tasks.workunit.client.0.vm03.stdout:8/869: write d7/df/d1a/d40/d9d/da3/ff6 [630922,95252] 0
2026-03-09T00:04:17.135 INFO:tasks.workunit.client.0.vm03.stdout:8/870: dread d7/df/d1a/d40/d9d/df2/d38/d91/fbf [0,4194304] 0
2026-03-09T00:04:17.136 INFO:tasks.workunit.client.0.vm03.stdout:0/861: dwrite d2/da/dd/d49/d6c/d4b/daf/f126 [0,4194304] 0
2026-03-09T00:04:17.145 INFO:tasks.workunit.client.0.vm03.stdout:0/862: link d2/da/dd/d49/d6c/d4b/f100 d2/da/d76/d8a/f135 0
2026-03-09T00:04:17.145 INFO:tasks.workunit.client.0.vm03.stdout:0/863: dread - d2/fd3 zero size
2026-03-09T00:04:17.145 INFO:tasks.workunit.client.0.vm03.stdout:0/864: rmdir d2/da/dd/d49/d6c/d4b/d55/d6f/dad 39
2026-03-09T00:04:17.146 INFO:tasks.workunit.client.0.vm03.stdout:0/865: mknod d2/da/d76/d8a/c136 0
2026-03-09T00:04:17.185 INFO:tasks.workunit.client.0.vm03.stdout:4/926: dwrite d7/d20/d6a/dea/d4e/f74 [0,4194304] 0
2026-03-09T00:04:17.186 INFO:tasks.workunit.client.0.vm03.stdout:4/927: creat d7/de6/f125 x:0 0 0
2026-03-09T00:04:17.186 INFO:tasks.workunit.client.0.vm03.stdout:4/928: readlink d7/d20/d6a/dea/l48 0
2026-03-09T00:04:17.186 INFO:tasks.workunit.client.0.vm03.stdout:4/929: truncate d7/d20/d6a/dea/d4e/f4f 974790 0
2026-03-09T00:04:17.187 INFO:tasks.workunit.client.0.vm03.stdout:4/930: link d7/d20/d6a/dea/d38/da9/lfe d7/d27/dc9/l126 0
2026-03-09T00:04:17.187 INFO:tasks.workunit.client.0.vm03.stdout:4/931: fdatasync d7/d20/d6a/dea/d38/f8f 0
2026-03-09T00:04:17.187 INFO:tasks.workunit.client.0.vm03.stdout:4/932: stat d7/d20/d6a/dea/d38/fca 0
2026-03-09T00:04:17.192 INFO:tasks.workunit.client.0.vm03.stdout:4/933: write d7/f28 [3311146,69593] 0
2026-03-09T00:04:17.192 INFO:tasks.workunit.client.0.vm03.stdout:4/934: readlink d7/d20/d6a/dea/d4e/l5c 0
2026-03-09T00:04:17.192 INFO:tasks.workunit.client.0.vm03.stdout:4/935: truncate d7/d20/d6a/d77/db7/f9f 4702820 0
2026-03-09T00:04:17.195 INFO:tasks.workunit.client.0.vm03.stdout:4/936: dread d7/d20/d6a/dea/d38/da9/ddc/f65 [0,4194304] 0
2026-03-09T00:04:17.241 INFO:tasks.workunit.client.0.vm03.stdout:4/937: dwrite d7/d6f/f9b [0,4194304] 0
2026-03-09T00:04:17.243 INFO:tasks.workunit.client.0.vm03.stdout:9/794: dwrite d15/d1c/d28/f29 [0,4194304] 0
2026-03-09T00:04:17.246 INFO:tasks.workunit.client.0.vm03.stdout:9/795: mkdir d15/d1c/d21/d54/d87/d93/dcf/d10c 0
2026-03-09T00:04:17.252 INFO:tasks.workunit.client.0.vm03.stdout:1/923: dwrite d4/d3a/d32/d87/d111/f119 [0,4194304] 0
2026-03-09T00:04:17.254 INFO:tasks.workunit.client.0.vm03.stdout:1/924: rename d4/d15/d77/la4 to d4/d15/d1a/l132 0
2026-03-09T00:04:17.258 INFO:tasks.workunit.client.0.vm03.stdout:1/925: dread d4/d6/f6e [0,4194304] 0
2026-03-09T00:04:17.270 INFO:tasks.workunit.client.0.vm03.stdout:1/926: truncate d4/d3a/d3d/d98/dee/d9e/fbb 690970 0
2026-03-09T00:04:17.270 INFO:tasks.workunit.client.0.vm03.stdout:1/927: read d4/fa0 [1751945,54759] 0
2026-03-09T00:04:17.270 INFO:tasks.workunit.client.0.vm03.stdout:1/928: creat d4/d3a/d8f/d104/d117/f133 x:0 0 0
2026-03-09T00:04:17.270 INFO:tasks.workunit.client.0.vm03.stdout:1/929: symlink d4/d15/d1a/dfb/l134 0
2026-03-09T00:04:17.282 INFO:tasks.workunit.client.0.vm03.stdout:3/693: dwrite d2/db/f7e [0,4194304] 0
2026-03-09T00:04:17.282 INFO:tasks.workunit.client.0.vm03.stdout:3/694: chown d2/db/f80 3 1
2026-03-09T00:04:17.284 INFO:tasks.workunit.client.0.vm03.stdout:7/767: dwrite d2/d1f/d3a/d24/da4/d46/d81/d96/d8e/fdf [0,4194304] 0
2026-03-09T00:04:17.284 INFO:tasks.workunit.client.0.vm03.stdout:3/695: dread d2/db/d3b/d5d/fba [0,4194304] 0
2026-03-09T00:04:17.284 INFO:tasks.workunit.client.0.vm03.stdout:3/696: creat d2/db/d3b/d3f/db8/fd7 x:0 0 0
2026-03-09T00:04:17.286 INFO:tasks.workunit.client.0.vm03.stdout:7/768: link d2/d1f/f11 d2/d4/db7/fea 0
2026-03-09T00:04:17.290 INFO:tasks.workunit.client.0.vm03.stdout:5/861: dwrite d1c/d20/d55/d4f/d58/d73/d9e/fae [0,4194304] 0
2026-03-09T00:04:17.293 INFO:tasks.workunit.client.0.vm03.stdout:7/769: link d2/d1f/d3a/d24/da4/d46/d81/d96/d88/db9/le2 d2/d4/leb 0
2026-03-09T00:04:17.301 INFO:tasks.workunit.client.0.vm03.stdout:5/862: rename d1c/d20/d55/d66/d70/l88 to d1c/d20/d55/d66/dc6/l115 0
2026-03-09T00:04:17.301 INFO:tasks.workunit.client.0.vm03.stdout:5/863: chown d1c/d20/d55/d43/ca9 52685207 1
2026-03-09T00:04:17.301 INFO:tasks.workunit.client.0.vm03.stdout:7/770: getdents d2/d4/d1e/d78 0
2026-03-09T00:04:17.301 INFO:tasks.workunit.client.0.vm03.stdout:5/864: mknod d1c/d51/d6a/d75/df0/c116 0
2026-03-09T00:04:17.301 INFO:tasks.workunit.client.0.vm03.stdout:7/771: creat d2/d4/db7/daa/fec x:0 0 0
2026-03-09T00:04:17.301 INFO:tasks.workunit.client.0.vm03.stdout:5/865: link d1c/d20/d56/lfc d1c/d20/d55/d66/db2/l117 0
2026-03-09T00:04:17.301 INFO:tasks.workunit.client.0.vm03.stdout:5/866: chown d1c/d20/d56/db4 8058661 1
2026-03-09T00:04:17.301 INFO:tasks.workunit.client.0.vm03.stdout:5/867: chown d1c/d20/c28 6 1
2026-03-09T00:04:17.301 INFO:tasks.workunit.client.0.vm03.stdout:7/772: rmdir d2/d4/d1e 39
2026-03-09T00:04:17.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.301+0000 7f6f270cc700 1 -- 192.168.123.103:0/567059015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f20072d90 msgr2=0x7f6f200731b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:17.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.301+0000 7f6f270cc700 1 --2- 192.168.123.103:0/567059015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f20072d90 0x7f6f200731b0 secure :-1 s=READY pgs=315 cs=0 l=1 rev1=1 crypto rx=0x7f6f10009b00 tx=0x7f6f10009e10 comp rx=0 tx=0).stop
2026-03-09T00:04:17.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.302+0000 7f6f270cc700 1 -- 192.168.123.103:0/567059015 shutdown_connections
2026-03-09T00:04:17.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.302+0000 7f6f270cc700 1 --2- 192.168.123.103:0/567059015 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f20075c80 0x7f6f20078110 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:17.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.302+0000 7f6f270cc700 1 --2- 192.168.123.103:0/567059015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f20072d90 0x7f6f200731b0 unknown :-1 s=CLOSED pgs=315 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:17.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.302+0000 7f6f270cc700 1 -- 192.168.123.103:0/567059015 >> 192.168.123.103:0/567059015 conn(0x7f6f2006dda0 msgr2=0x7f6f20070220 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:04:17.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.302+0000 7f6f270cc700 1 -- 192.168.123.103:0/567059015 shutdown_connections
2026-03-09T00:04:17.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.302+0000 7f6f270cc700 1 -- 192.168.123.103:0/567059015 wait complete.
2026-03-09T00:04:17.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.302+0000 7f6f270cc700 1 Processor -- start
2026-03-09T00:04:17.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.302+0000 7f6f270cc700 1 -- start start
2026-03-09T00:04:17.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.302+0000 7f6f270cc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f20075c80 0x7f6f201b0ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:17.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.302+0000 7f6f270cc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f20083aa0 0x7f6f201b3010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:17.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.302+0000 7f6f270cc700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f201b3550 con 0x7f6f20083aa0
2026-03-09T00:04:17.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.302+0000 7f6f270cc700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6f201b3690 con 0x7f6f20075c80
2026-03-09T00:04:17.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.303+0000 7f6f24e68700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f20075c80 0x7f6f201b0ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:17.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.303+0000 7f6f24e68700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f20075c80 0x7f6f201b0ac0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:49360/0 (socket says 192.168.123.103:49360)
2026-03-09T00:04:17.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.303+0000 7f6f24e68700 1 -- 192.168.123.103:0/2405309507 learned_addr learned my addr 192.168.123.103:0/2405309507 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:04:17.304 INFO:tasks.workunit.client.1.vm06.stdout:9/985: dwrite d1/d3/d4f/d52/f11b [0,4194304] 0
2026-03-09T00:04:17.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.304+0000 7f6f24e68700 1 -- 192.168.123.103:0/2405309507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f20083aa0 msgr2=0x7f6f201b3010 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:17.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.304+0000 7f6f24e68700 1 --2- 192.168.123.103:0/2405309507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f20083aa0 0x7f6f201b3010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:17.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.304+0000 7f6f24e68700 1 -- 192.168.123.103:0/2405309507 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6f100097e0 con 0x7f6f20075c80
2026-03-09T00:04:17.306 INFO:tasks.workunit.client.0.vm03.stdout:4/938: dwrite d7/d6f/fb2 [0,4194304] 0
2026-03-09T00:04:17.306 INFO:tasks.workunit.client.0.vm03.stdout:4/939: chown d7/d20/d6a/d77/d25/f102 1 1
2026-03-09T00:04:17.306 INFO:tasks.workunit.client.0.vm03.stdout:5/868: link d1c/d20/d55/db0/dc7/d101/f9a d1c/d20/d55/d4f/f118 0
2026-03-09T00:04:17.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.311+0000 7f6f24e68700 1 --2- 192.168.123.103:0/2405309507 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f20075c80 0x7f6f201b0ac0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f6f100097a0 tx=0x7f6f1001d7e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:04:17.310 INFO:tasks.workunit.client.0.vm03.stdout:7/773: truncate d2/d1f/d3a/d24/da4/d46/d81/d96/d37/d39/d6e/fa1 1232575 0
2026-03-09T00:04:17.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.311+0000 7f6f1dffb700 1 -- 192.168.123.103:0/2405309507 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6f10005190 con 0x7f6f20075c80
2026-03-09T00:04:17.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.311+0000 7f6f270cc700 1 -- 192.168.123.103:0/2405309507 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6f201b38c0 con 0x7f6f20075c80
2026-03-09T00:04:17.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.311+0000 7f6f270cc700 1 -- 192.168.123.103:0/2405309507 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6f201b3cf0 con 0x7f6f20075c80
2026-03-09T00:04:17.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.311+0000 7f6f1dffb700 1 -- 192.168.123.103:0/2405309507 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6f10026d30 con 0x7f6f20075c80
2026-03-09T00:04:17.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.311+0000 7f6f1dffb700 1 -- 192.168.123.103:0/2405309507 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6f10003cd0 con 0x7f6f20075c80
2026-03-09T00:04:17.311 INFO:tasks.workunit.client.1.vm06.stdout:1/987: dwrite d6/d21/d2d/d37/d6d/dd7/ff6 [0,4194304] 0
2026-03-09T00:04:17.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.312+0000 7f6f1dffb700 1 -- 192.168.123.103:0/2405309507 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f6f1000f460 con 0x7f6f20075c80
2026-03-09T00:04:17.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.313+0000 7f6f1dffb700 1 --2- 192.168.123.103:0/2405309507 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6f08077780 0x7f6f08079c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:17.312 INFO:tasks.workunit.client.1.vm06.stdout:9/986: getdents d1/d3/d2b 0
2026-03-09T00:04:17.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.313+0000 7f6f1dffb700 1 -- 192.168.123.103:0/2405309507 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f6f100a3fc0 con 0x7f6f20075c80
2026-03-09T00:04:17.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.315+0000 7f6f1ffff700 1 --2- 192.168.123.103:0/2405309507 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6f08077780 0x7f6f08079c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:17.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.318+0000 7f6f270cc700 1 -- 192.168.123.103:0/2405309507 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6f0c005320 con 0x7f6f20075c80
2026-03-09T00:04:17.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.322+0000 7f6f1ffff700 1 --2- 192.168.123.103:0/2405309507 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6f08077780 0x7f6f08079c40 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f6f180060b0 tx=0x7f6f18006040 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:04:17.323 INFO:tasks.workunit.client.0.vm03.stdout:4/940: truncate d7/fa7 1642290 0
2026-03-09T00:04:17.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.322+0000 7f6f1dffb700 1 -- 192.168.123.103:0/2405309507 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f6f1006cad0 con 0x7f6f20075c80
2026-03-09T00:04:17.327 INFO:tasks.workunit.client.0.vm03.stdout:4/941: dread d7/fe [0,4194304] 0
2026-03-09T00:04:17.328 INFO:tasks.workunit.client.0.vm03.stdout:5/869: mkdir d1c/d20/d55/d4f/d58/d5d/d119 0
2026-03-09T00:04:17.328 INFO:tasks.workunit.client.0.vm03.stdout:5/870: chown d1c/d20/f39 0 1
2026-03-09T00:04:17.328 INFO:tasks.workunit.client.0.vm03.stdout:5/871: write d1c/d20/d55/db0/dc7/d101/f9a [2027996,70985] 0
2026-03-09T00:04:17.330 INFO:tasks.workunit.client.0.vm03.stdout:7/774: creat d2/d1f/d3a/d24/da4/d46/d81/d96/d37/d39/d6e/fed x:0 0 0
2026-03-09T00:04:17.330 INFO:tasks.workunit.client.0.vm03.stdout:7/775: write d2/d4/db7/d67/f64 [2087779,32143] 0
2026-03-09T00:04:17.330 INFO:tasks.workunit.client.0.vm03.stdout:7/776: chown d2/d1f/d3a/d24/da4/d46/lb8 4 1
2026-03-09T00:04:17.332 INFO:tasks.workunit.client.0.vm03.stdout:6/804: dwrite d13/fa9 [0,4194304] 0
2026-03-09T00:04:17.332 INFO:tasks.workunit.client.0.vm03.stdout:6/805: chown d13/d1e/d44/d4a/c73 28086 1
2026-03-09T00:04:17.333 INFO:tasks.workunit.client.0.vm03.stdout:4/942: unlink d7/d20/d6a/dea/d38/da9/cd6 0
2026-03-09T00:04:17.341 INFO:tasks.workunit.client.0.vm03.stdout:5/872: truncate d1c/d20/d55/d4f/d58/d73/d9e/fe4 1893615 0
2026-03-09T00:04:17.341 INFO:tasks.workunit.client.0.vm03.stdout:1/930: dwrite d4/d3a/d32/d6a/f76 [0,4194304] 0
2026-03-09T00:04:17.341 INFO:tasks.workunit.client.0.vm03.stdout:5/873: write d1c/d20/d55/d4f/d58/d73/d9e/fae [4359036,109241] 0
2026-03-09T00:04:17.347 INFO:tasks.workunit.client.0.vm03.stdout:5/874: write d1c/d20/d55/f5a [2642333,63408] 0
2026-03-09T00:04:17.350 INFO:tasks.workunit.client.0.vm03.stdout:7/777: mkdir d2/d4/d1e/dee 0
2026-03-09T00:04:17.350 INFO:tasks.workunit.client.0.vm03.stdout:7/778: creat d2/d1f/d3a/d24/da4/d46/d81/d96/d37/d39/d6e/fef x:0 0 0
2026-03-09T00:04:17.355 INFO:tasks.workunit.client.0.vm03.stdout:7/779: dread d2/d1f/d3a/d24/da4/d46/d81/d96/d80/f93 [0,4194304] 0
2026-03-09T00:04:17.355 INFO:tasks.workunit.client.0.vm03.stdout:7/780: fsync d2/d4/f2e 0
2026-03-09T00:04:17.361 INFO:tasks.workunit.client.0.vm03.stdout:6/806: symlink d13/d35/d71/d97/l115 0
2026-03-09T00:04:17.366 INFO:tasks.workunit.client.0.vm03.stdout:1/931: mknod d4/d3a/d32/dc2/c135 0
2026-03-09T00:04:17.367 INFO:tasks.workunit.client.0.vm03.stdout:1/932: stat d4/d15/c2d 0
2026-03-09T00:04:17.370 INFO:tasks.workunit.client.0.vm03.stdout:1/933: dread d4/d15/f4e [0,4194304] 0
2026-03-09T00:04:17.373 INFO:tasks.workunit.client.0.vm03.stdout:5/875: mkdir d1c/d51/df2/d11a 0
2026-03-09T00:04:17.383 INFO:tasks.workunit.client.0.vm03.stdout:7/781: symlink d2/d1f/d3a/d24/da4/d46/d81/d96/lf0 0
2026-03-09T00:04:17.388 INFO:tasks.workunit.client.0.vm03.stdout:7/782: dread - d2/d4/db7/d67/d6b/fbe zero size
2026-03-09T00:04:17.388 INFO:tasks.workunit.client.0.vm03.stdout:7/783: chown d2/d1f/d3a/d24/da4/d46/d81/d96/d88/db9/le2 2456 1
2026-03-09T00:04:17.391 INFO:tasks.workunit.client.0.vm03.stdout:3/697: dwrite d2/db/d2d/f43 [0,4194304] 0
2026-03-09T00:04:17.397 INFO:tasks.workunit.client.0.vm03.stdout:1/934: mkdir d4/d3a/d3d/d98/dee/deb/d136 0
2026-03-09T00:04:17.407 INFO:tasks.workunit.client.0.vm03.stdout:1/935: getdents d4/d3a/d61/da6/dc3 0
2026-03-09T00:04:17.408 INFO:tasks.workunit.client.1.vm06.stdout:9/987: dwrite d1/d4/d11f/fe7 [0,4194304] 0
2026-03-09T00:04:17.430 INFO:tasks.workunit.client.1.vm06.stdout:9/988: dread d1/d4/f9c [0,4194304] 0
2026-03-09T00:04:17.430 INFO:tasks.workunit.client.1.vm06.stdout:9/989: fdatasync d1/d3/d4f/d52/f5e 0
2026-03-09T00:04:17.434 INFO:tasks.workunit.client.1.vm06.stdout:9/990: rmdir d1/d4/d11f/d25 39
2026-03-09T00:04:17.435 INFO:tasks.workunit.client.1.vm06.stdout:9/991: stat d1/d3/d4f/d91/de8/f121 0
2026-03-09T00:04:17.437 INFO:tasks.workunit.client.1.vm06.stdout:9/992: creat d1/d3/d4f/d91/d94/ddf/f148 x:0 0 0
2026-03-09T00:04:17.448 INFO:tasks.workunit.client.1.vm06.stdout:9/993: rename d1/d3/d4f/d52/fc1 to d1/d3/d4f/d91/d94/ddf/d130/d13f/f149 0
2026-03-09T00:04:17.448 INFO:tasks.workunit.client.1.vm06.stdout:9/994: write d1/d3/d4f/d91/d94/ddf/f10d [8808911,122381] 0
2026-03-09T00:04:17.457 INFO:tasks.workunit.client.1.vm06.stdout:9/995: dread d1/d3/d4f/d52/de3/de5/fed [0,4194304] 0
2026-03-09T00:04:17.459 INFO:tasks.workunit.client.1.vm06.stdout:9/996: symlink d1/d4/d2f/l14a 0
2026-03-09T00:04:17.459 INFO:tasks.workunit.client.1.vm06.stdout:9/997: chown d1/d3/d4f/d91/d94/ddf/f148 333507 1
2026-03-09T00:04:17.460 INFO:tasks.workunit.client.1.vm06.stdout:9/998: mkdir d1/d3/d50/d14b 0
2026-03-09T00:04:17.461 INFO:tasks.workunit.client.1.vm06.stdout:9/999: symlink d1/d4/df5/l14c 0
2026-03-09T00:04:17.468 INFO:tasks.workunit.client.0.vm03.stdout:4/943: dwrite d7/f62 [0,4194304] 0
2026-03-09T00:04:17.468 INFO:tasks.workunit.client.0.vm03.stdout:4/944: chown d7/d20/d6a/d77/d25/f7f 127 1
2026-03-09T00:04:17.468 INFO:tasks.workunit.client.0.vm03.stdout:4/945: readlink d7/d20/d6a/dea/d54/d58/l95 0
2026-03-09T00:04:17.468 INFO:tasks.workunit.client.0.vm03.stdout:4/946: chown d7/l14 211 1
2026-03-09T00:04:17.468 INFO:tasks.workunit.client.0.vm03.stdout:5/876: dwrite d1c/d20/d56/f10d [0,4194304] 0
2026-03-09T00:04:17.468 INFO:tasks.workunit.client.0.vm03.stdout:5/877: fsync d1c/d20/d55/f3d 0
2026-03-09T00:04:17.475 INFO:tasks.workunit.client.0.vm03.stdout:1/936: dwrite f2 [0,4194304] 0
2026-03-09T00:04:17.475 INFO:tasks.workunit.client.0.vm03.stdout:6/807: dwrite d13/d1e/f9f [0,4194304] 0
2026-03-09T00:04:17.488 INFO:tasks.workunit.client.0.vm03.stdout:4/947: creat d7/f127 x:0 0 0
2026-03-09T00:04:17.488 INFO:tasks.workunit.client.0.vm03.stdout:3/698: dwrite d2/db/f67 [0,4194304] 0
2026-03-09T00:04:17.493 INFO:tasks.workunit.client.0.vm03.stdout:7/784: dwrite d2/d4/db7/fcb [0,4194304] 0
2026-03-09T00:04:17.493 INFO:tasks.workunit.client.0.vm03.stdout:7/785: stat d2/d4/db7/d67/f64 0
2026-03-09T00:04:17.500 INFO:tasks.workunit.client.0.vm03.stdout:3/699: dread d2/db/f26 [0,4194304] 0
2026-03-09T00:04:17.508 INFO:tasks.workunit.client.0.vm03.stdout:5/878: mkdir d1c/d20/d97/d11b 0
2026-03-09T00:04:17.515 INFO:tasks.workunit.client.0.vm03.stdout:1/937: link d4/d3a/d32/l36 d4/d3a/l137 0
2026-03-09T00:04:17.522 INFO:tasks.workunit.client.0.vm03.stdout:4/948: creat d7/d20/d6a/d77/db7/f128 x:0 0 0
2026-03-09T00:04:17.533 INFO:tasks.workunit.client.0.vm03.stdout:5/879: rename d1c/d20/d55/d4f/d58/d73/d76/c94 to d1c/d20/d55/d4f/d58/d73/d9e/c11c 0
2026-03-09T00:04:17.537 INFO:tasks.workunit.client.0.vm03.stdout:1/938: mknod d4/d15/d77/dce/dd9/df3/c138 0
2026-03-09T00:04:17.537 INFO:tasks.workunit.client.0.vm03.stdout:1/939: write d4/d6/d52/db5/ff2 [742547,44907] 0
2026-03-09T00:04:17.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.546+0000 7f6f270cc700 1 -- 192.168.123.103:0/2405309507 --> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6f0c000bf0 con 0x7f6f08077780
2026-03-09T00:04:17.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.548+0000 7f6f1dffb700 1 -- 192.168.123.103:0/2405309507 <== mgr.24393 v2:192.168.123.103:6800/3123605642 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+360 (secure 0 0 0) 0x7f6f0c000bf0 con 0x7f6f08077780
2026-03-09T00:04:17.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.553+0000 7f6f270cc700 1 -- 192.168.123.103:0/2405309507 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6f08077780 msgr2=0x7f6f08079c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:17.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.553+0000 7f6f270cc700 1 --2- 192.168.123.103:0/2405309507 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6f08077780 0x7f6f08079c40 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f6f180060b0 tx=0x7f6f18006040 comp rx=0 tx=0).stop
2026-03-09T00:04:17.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.553+0000 7f6f270cc700 1 -- 192.168.123.103:0/2405309507 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f20075c80 msgr2=0x7f6f201b0ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:17.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.553+0000 7f6f270cc700 1 --2- 192.168.123.103:0/2405309507 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f20075c80 0x7f6f201b0ac0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f6f100097a0 tx=0x7f6f1001d7e0 comp rx=0 tx=0).stop
2026-03-09T00:04:17.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.553+0000 7f6f270cc700 1 -- 192.168.123.103:0/2405309507 shutdown_connections
2026-03-09T00:04:17.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.553+0000 7f6f270cc700 1 --2- 192.168.123.103:0/2405309507 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6f08077780 0x7f6f08079c40 secure :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f6f180060b0 tx=0x7f6f18006040 comp rx=0 tx=0).stop
2026-03-09T00:04:17.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.553+0000 7f6f270cc700 1 --2- 192.168.123.103:0/2405309507 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6f20075c80 0x7f6f201b0ac0 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:17.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.553+0000 7f6f270cc700 1 --2- 192.168.123.103:0/2405309507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6f20083aa0 0x7f6f201b3010 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:17.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.553+0000 7f6f270cc700 1 -- 192.168.123.103:0/2405309507 >> 192.168.123.103:0/2405309507 conn(0x7f6f2006dda0 msgr2=0x7f6f20077480 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:04:17.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.553+0000 7f6f270cc700 1 -- 192.168.123.103:0/2405309507 shutdown_connections
2026-03-09T00:04:17.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.553+0000 7f6f270cc700 1 -- 192.168.123.103:0/2405309507 wait complete.
2026-03-09T00:04:17.555 INFO:tasks.workunit.client.0.vm03.stdout:4/949: unlink d7/d20/d6a/d77/d25/cdf 0
2026-03-09T00:04:17.566 INFO:teuthology.orchestra.run.vm03.stdout:true
2026-03-09T00:04:17.571 INFO:tasks.workunit.client.0.vm03.stdout:5/880: mkdir d1c/d20/d56/db4/d11d 0
2026-03-09T00:04:17.571 INFO:tasks.workunit.client.0.vm03.stdout:5/881: chown d1c/d20/d55/d66/dc6/l115 42005534 1
2026-03-09T00:04:17.574 INFO:tasks.workunit.client.0.vm03.stdout:5/882: mkdir d1c/d20/d56/db4/df3/d11e 0
2026-03-09T00:04:17.576 INFO:tasks.workunit.client.0.vm03.stdout:4/950: rename d7/d20/d6a/dea/d54/d58 to d7/d6f/dcf/de8/dee/d129 0
2026-03-09T00:04:17.577 INFO:tasks.workunit.client.0.vm03.stdout:4/951: write d7/de6/f125 [873184,81476] 0
2026-03-09T00:04:17.577 INFO:tasks.workunit.client.0.vm03.stdout:4/952: dread d7/d20/d6a/dea/d54/ffd [0,4194304] 0
2026-03-09T00:04:17.577 INFO:tasks.workunit.client.0.vm03.stdout:4/953: write d7/d20/d6a/dde/fec [248632,48828] 0
2026-03-09T00:04:17.579 INFO:tasks.workunit.client.0.vm03.stdout:3/700: dwrite d2/db/d40/d58/fd0 [0,4194304] 0
2026-03-09T00:04:17.580 INFO:tasks.workunit.client.1.vm06.stdout:1/988: rmdir d6/d8f 39
2026-03-09T00:04:17.582 INFO:tasks.workunit.client.0.vm03.stdout:2/890: write d8/d26/d5e/f64 [2321392,35015] 0
2026-03-09T00:04:17.582 INFO:tasks.workunit.client.0.vm03.stdout:2/891: chown d8/f15 93940 1
2026-03-09T00:04:17.583 INFO:tasks.workunit.client.0.vm03.stdout:5/883: dread d1c/d20/fa3 [0,4194304] 0
2026-03-09T00:04:17.584 INFO:tasks.workunit.client.0.vm03.stdout:6/808: dwrite d13/d1e/d44/d59/d77/f94 [0,4194304] 0
2026-03-09T00:04:17.599 INFO:tasks.workunit.client.0.vm03.stdout:5/884: dread d1c/d20/d55/f46 [4194304,4194304] 0
2026-03-09T00:04:17.606 INFO:tasks.workunit.client.0.vm03.stdout:3/701: mkdir d2/db/d3b/d5f/da5/dd8 0
2026-03-09T00:04:17.606 INFO:tasks.workunit.client.0.vm03.stdout:3/702: dread - d2/db/d3b/d5f/da5/d72/fb0 zero size
2026-03-09T00:04:17.606 INFO:tasks.workunit.client.1.vm06.stdout:1/989: link d6/fa d6/d21/d2d/d113/f154 0
2026-03-09T00:04:17.607 INFO:tasks.workunit.client.1.vm06.stdout:1/990: write d6/d21/fe7 [388877,11777] 0
2026-03-09T00:04:17.615 INFO:tasks.workunit.client.0.vm03.stdout:1/940: dwrite d4/d15/d5c/d6c/fc0 [0,4194304] 0
2026-03-09T00:04:17.619 INFO:tasks.workunit.client.0.vm03.stdout:6/809: read d13/d1e/f3e [2639997,65541] 0
2026-03-09T00:04:17.619 INFO:tasks.workunit.client.0.vm03.stdout:6/810: write d13/d1e/d44/d59/dec/f4f [4501742,34591] 0
2026-03-09T00:04:17.619 INFO:tasks.workunit.client.0.vm03.stdout:6/811: write d13/d1e/d44/d59/dec/d62/f103 [870301,48195] 0
2026-03-09T00:04:17.627 INFO:tasks.workunit.client.0.vm03.stdout:5/885: truncate d1c/d20/d55/f61 1223111 0
2026-03-09T00:04:17.635 INFO:tasks.workunit.client.0.vm03.stdout:5/886: write d1c/d20/d55/f3d [4796288,96290] 0
2026-03-09T00:04:17.635 INFO:tasks.workunit.client.0.vm03.stdout:5/887: write d1c/d20/d55/f34 [569911,122798] 0
2026-03-09T00:04:17.635 INFO:tasks.workunit.client.0.vm03.stdout:7/786: dwrite d2/d4/db7/d67/d6b/fda [0,4194304] 0
2026-03-09T00:04:17.636 INFO:tasks.workunit.client.0.vm03.stdout:7/787: truncate d2/d4/db7/daa/fec 319553 0
2026-03-09T00:04:17.636 INFO:tasks.workunit.client.0.vm03.stdout:7/788: chown d2/ce 3597042 1
2026-03-09T00:04:17.636 INFO:tasks.workunit.client.0.vm03.stdout:7/789: fdatasync d2/d1f/d3a/d24/da4/d46/d81/d96/d37/f56 0
2026-03-09T00:04:17.636 INFO:tasks.workunit.client.0.vm03.stdout:7/790: read d2/d1f/d3a/f19 [1190510,114218] 0
2026-03-09T00:04:17.636 INFO:tasks.workunit.client.0.vm03.stdout:1/941: creat d4/d3a/d32/d6a/f139 x:0 0 0
2026-03-09T00:04:17.636 INFO:tasks.workunit.client.0.vm03.stdout:1/942: stat d4/d15/f7f 0
2026-03-09T00:04:17.636 INFO:tasks.workunit.client.1.vm06.stdout:1/991: dread - d6/fa7 zero size
2026-03-09T00:04:17.636 INFO:tasks.workunit.client.1.vm06.stdout:1/992: read d6/d21/f7b [16595,92] 0
2026-03-09T00:04:17.636 INFO:tasks.workunit.client.1.vm06.stdout:1/993: write d6/dc4/f14b [679149,24650] 0
2026-03-09T00:04:17.636 INFO:tasks.workunit.client.1.vm06.stdout:1/994: chown d6/d21/d2d/l32 2978486 1
2026-03-09T00:04:17.636 INFO:tasks.workunit.client.1.vm06.stdout:1/995: write d6/dc4/f105 [173952,68548] 0
2026-03-09T00:04:17.641 INFO:tasks.workunit.client.1.vm06.stdout:1/996: mkdir d6/d21/d2d/d37/d6d/d155 0
2026-03-09T00:04:17.646 INFO:tasks.workunit.client.0.vm03.stdout:6/812: mkdir d13/d35/d74/d89/d9d/d116 0
2026-03-09T00:04:17.650 INFO:tasks.workunit.client.1.vm06.stdout:1/997: creat d6/d8f/d10f/d121/d149/f156 x:0 0 0
2026-03-09T00:04:17.650 INFO:tasks.workunit.client.1.vm06.stdout:1/998: readlink d6/d21/d2d/d3b/d42/d129/lbb 0
2026-03-09T00:04:17.651 INFO:tasks.workunit.client.0.vm03.stdout:9/796: write d15/d1c/fb2 [255981,45524] 0
2026-03-09T00:04:17.651 INFO:tasks.workunit.client.0.vm03.stdout:9/797: chown d15/d1c/d21/fdc 273685482 1
2026-03-09T00:04:17.653 INFO:tasks.workunit.client.0.vm03.stdout:7/791: rename d2/d4/d1e/fe1 to d2/d1f/d3a/d24/da4/d46/d81/d96/d8e/ff1 0
2026-03-09T00:04:17.656 INFO:tasks.workunit.client.1.vm06.stdout:1/999: mkdir d6/d21/d2d/d37/dbc/d157 0
2026-03-09T00:04:17.661 INFO:tasks.workunit.client.0.vm03.stdout:3/703: getdents d2/db/d3b/d5f/da5/d72/d96 0
2026-03-09T00:04:17.665 INFO:tasks.workunit.client.0.vm03.stdout:4/954: dwrite d7/d20/d6a/d77/db7/f100 [0,4194304] 0
2026-03-09T00:04:17.669 INFO:tasks.workunit.client.0.vm03.stdout:1/943: creat d4/d3a/d32/da1/f13a x:0 0 0
2026-03-09T00:04:17.669 INFO:tasks.workunit.client.0.vm03.stdout:1/944: chown d4/c3b 7 1
2026-03-09T00:04:17.669 INFO:tasks.workunit.client.0.vm03.stdout:1/945: dread - d4/d6/d52/db5/fc6 zero size
2026-03-09T00:04:17.669 INFO:tasks.workunit.client.0.vm03.stdout:1/946: fsync d4/d3a/d3d/fa2 0
2026-03-09T00:04:17.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.671+0000 7fec0f62d700 1 -- 192.168.123.103:0/282416669 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fec08075a10 msgr2=0x7fec08077ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:17.681 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.671+0000 7fec0f62d700 1 --2- 192.168.123.103:0/282416669 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fec08075a10 0x7fec08077ea0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fec0400d3f0 tx=0x7fec0400d700 comp rx=0 tx=0).stop
2026-03-09T00:04:17.682 INFO:tasks.workunit.client.1.vm06.stderr:+ rm -rf -- ./tmp.42dJIYM0SW
2026-03-09T00:04:17.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.671+0000 7fec0f62d700 1 -- 192.168.123.103:0/282416669 shutdown_connections
2026-03-09T00:04:17.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.671+0000 7fec0f62d700 1 --2- 192.168.123.103:0/282416669 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fec08075a10 0x7fec08077ea0 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:17.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.671+0000 7fec0f62d700 1 --2- 192.168.123.103:0/282416669 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec08072b20 0x7fec08072f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:17.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.671+0000 7fec0f62d700 1 -- 192.168.123.103:0/282416669 >> 192.168.123.103:0/282416669 conn(0x7fec0806daa0 msgr2=0x7fec0806ff00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:04:17.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.671+0000 7fec0f62d700 1 -- 192.168.123.103:0/282416669 shutdown_connections
2026-03-09T00:04:17.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.671+0000 7fec0f62d700 1 -- 192.168.123.103:0/282416669 wait complete.
2026-03-09T00:04:17.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.671+0000 7fec0f62d700 1 Processor -- start
2026-03-09T00:04:17.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.671+0000 7fec0f62d700 1 -- start start
2026-03-09T00:04:17.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.671+0000 7fec0f62d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fec08072b20 0x7fec08083060 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:17.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.671+0000 7fec0f62d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec080835a0 0x7fec0812e3e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:17.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.671+0000 7fec0f62d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fec08083ab0 con 0x7fec080835a0
2026-03-09T00:04:17.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.671+0000 7fec0f62d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fec08083c20 con 0x7fec08072b20
2026-03-09T00:04:17.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.672+0000 7fec0e62b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fec08072b20 0x7fec08083060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:17.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.672+0000 7fec0e62b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fec08072b20 0x7fec08083060 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:49378/0 (socket says 192.168.123.103:49378)
2026-03-09T00:04:17.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.672+0000 7fec0e62b700 1 -- 192.168.123.103:0/2995667628 learned_addr learned my addr 192.168.123.103:0/2995667628 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:04:17.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.672+0000 7fec0e62b700 1 -- 192.168.123.103:0/2995667628 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec080835a0 msgr2=0x7fec0812e3e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:17.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.672+0000 7fec0e62b700 1 --2- 192.168.123.103:0/2995667628 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec080835a0 0x7fec0812e3e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:17.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.672+0000 7fec0e62b700 1 -- 192.168.123.103:0/2995667628 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fec04007ed0 con 0x7fec08072b20
2026-03-09T00:04:17.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.673+0000 7fec0e62b700 1 --2- 192.168.123.103:0/2995667628 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fec08072b20 0x7fec08083060 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7febfc00b770 tx=0x7febfc00bb30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:04:17.683 INFO:tasks.workunit.client.0.vm03.stdout:6/813: unlink d13/d1e/d44/d4a/f58 0
2026-03-09T00:04:17.683 INFO:tasks.workunit.client.0.vm03.stdout:9/798: mknod d15/d1c/d21/d64/c10d 0
2026-03-09T00:04:17.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.674+0000 7febfb7fe700 1 -- 192.168.123.103:0/2995667628 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7febfc00f820 con 0x7fec08072b20
2026-03-09T00:04:17.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.674+0000 7febfb7fe700 1 -- 192.168.123.103:0/2995667628 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7febfc00fe60 con 0x7fec08072b20
2026-03-09T00:04:17.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.674+0000 7febfb7fe700 1 -- 192.168.123.103:0/2995667628 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7febfc00d610 con 0x7fec08072b20
2026-03-09T00:04:17.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.677+0000 7fec0f62d700 1 -- 192.168.123.103:0/2995667628 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fec0812e980 con 0x7fec08072b20
2026-03-09T00:04:17.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.677+0000 7fec0f62d700 1 -- 192.168.123.103:0/2995667628 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fec0812eea0 con 0x7fec08072b20
2026-03-09T00:04:17.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.677+0000 7fec0f62d700 1 -- 192.168.123.103:0/2995667628 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fec0804ea90 con 0x7fec08072b20
2026-03-09T00:04:17.683 INFO:tasks.workunit.client.0.vm03.stdout:3/704: symlink d2/db/d3b/d5f/da5/dd8/ld9 0
2026-03-09T00:04:17.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.680+0000 7febfb7fe700 1 -- 192.168.123.103:0/2995667628 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7febfc01e030 con 0x7fec08072b20
2026-03-09T00:04:17.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.680+0000 7febfb7fe700 1 --2- 192.168.123.103:0/2995667628 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7febf40776c0 0x7febf4079b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:17.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.680+0000 7febfb7fe700 1 -- 192.168.123.103:0/2995667628 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7febfc099550 con 0x7fec08072b20
2026-03-09T00:04:17.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.681+0000 7febfb7fe700 1 -- 192.168.123.103:0/2995667628 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7febfc062060 con 0x7fec08072b20
2026-03-09T00:04:17.686 INFO:tasks.workunit.client.0.vm03.stdout:4/955: link d7/d20/d6a/dea/ff0 d7/d20/d6a/d77/d25/de2/f12a 0
2026-03-09T00:04:17.688 INFO:tasks.workunit.client.0.vm03.stdout:9/799: unlink d15/d1c/d36/l3f 0
2026-03-09T00:04:17.691 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.682+0000 7fec0de2a700 1 --2- 192.168.123.103:0/2995667628 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7febf40776c0 0x7febf4079b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:17.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.699+0000 7fec0de2a700 1 --2- 192.168.123.103:0/2995667628 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7febf40776c0 0x7febf4079b80 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fec0400db80 tx=0x7fec04006040 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:04:17.700 INFO:tasks.workunit.client.0.vm03.stdout:5/888: dwrite d1c/f96 [0,4194304] 0
2026-03-09T00:04:17.702 INFO:tasks.workunit.client.0.vm03.stdout:3/705: mknod d2/db/d2d/d55/cda 0
2026-03-09T00:04:17.707 INFO:tasks.workunit.client.0.vm03.stdout:9/800: read d15/f26 [3187819,109738] 0
2026-03-09T00:04:17.707 INFO:tasks.workunit.client.0.vm03.stdout:3/706: dread d2/db/d3b/d5f/da5/d72/f7a [0,4194304] 0
2026-03-09T00:04:17.709 INFO:tasks.workunit.client.0.vm03.stdout:4/956: truncate d7/d20/f3d 4040511 0
2026-03-09T00:04:17.732 INFO:tasks.workunit.client.0.vm03.stdout:2/892: dwrite f7 [0,4194304] 0
2026-03-09T00:04:17.740 INFO:tasks.workunit.client.0.vm03.stdout:0/866: dwrite d2/da/d1a/f84 [0,4194304] 0
2026-03-09T00:04:17.740 INFO:tasks.workunit.client.0.vm03.stdout:7/792: dwrite d2/fcd [0,4194304] 0
2026-03-09T00:04:17.740 INFO:tasks.workunit.client.0.vm03.stdout:7/793: chown d2/d1f/d3a/l65 3006476 1
2026-03-09T00:04:17.744 INFO:tasks.workunit.client.0.vm03.stdout:9/801: mkdir d15/d1c/d21/d54/dab/df6/d10e 0
2026-03-09T00:04:17.749 INFO:tasks.workunit.client.0.vm03.stdout:9/802: readlink d15/d1c/d21/d75/l97 0
2026-03-09T00:04:17.769 INFO:tasks.workunit.client.0.vm03.stdout:2/893: truncate d8/d1b/d2a/d6b/d50/d8a/fc3 668782 0
2026-03-09T00:04:17.769 INFO:tasks.workunit.client.0.vm03.stdout:2/894: chown d8/d26/d5e/d5f/ded/f113 83 1
2026-03-09T00:04:17.775 INFO:tasks.workunit.client.0.vm03.stdout:0/867: rename d2/da/d36/da4/f3f to d2/da/dd/d49/d6c/da6/dda/db5/dba/dff/f137 0
2026-03-09T00:04:17.775 INFO:tasks.workunit.client.0.vm03.stdout:0/868: dread - d2/da/dd/d49/d6c/da6/fc2 zero size
2026-03-09T00:04:17.775 INFO:tasks.workunit.client.0.vm03.stdout:0/869: creat d2/da/dd/d49/d6c/da6/dda/db5/dba/f138 x:0 0 0
2026-03-09T00:04:17.775 INFO:tasks.workunit.client.0.vm03.stdout:0/870: readlink d2/da/d36/da4/le2 0
2026-03-09T00:04:17.777 INFO:tasks.workunit.client.0.vm03.stdout:8/871: sync
2026-03-09T00:04:17.780 INFO:tasks.workunit.client.0.vm03.stdout:1/947: dwrite d4/d3a/d3d/d98/dee/d9e/d12e/fbe [0,4194304] 0
2026-03-09T00:04:17.791 INFO:tasks.workunit.client.0.vm03.stdout:6/814: dwrite d13/f92 [0,4194304] 0
2026-03-09T00:04:17.791 INFO:tasks.workunit.client.0.vm03.stdout:6/815: fsync d13/d1e/d44/d59/f6e 0
2026-03-09T00:04:17.795 INFO:tasks.workunit.client.0.vm03.stdout:4/957: fdatasync d7/d20/d6a/dde/fec 0
2026-03-09T00:04:17.857 INFO:tasks.workunit.client.0.vm03.stdout:0/871: mknod d2/da/d76/c139 0
2026-03-09T00:04:17.857 INFO:tasks.workunit.client.0.vm03.stdout:0/872: creat d2/da/dd/d49/d6c/d4b/f13a x:0 0 0
2026-03-09T00:04:17.857 INFO:tasks.workunit.client.0.vm03.stdout:0/873: readlink d2/da/d36/da4/l90 0
2026-03-09T00:04:17.857 INFO:tasks.workunit.client.0.vm03.stdout:0/874: write d2/da/dd/f7b [364596,105394] 0
2026-03-09T00:04:17.871 INFO:tasks.workunit.client.0.vm03.stdout:3/707: dwrite d2/db/d3b/d3f/db8/fc9 [0,4194304] 0
2026-03-09T00:04:17.871 INFO:tasks.workunit.client.0.vm03.stdout:0/875: write d2/da/d4e/faa [4623182,112928] 0
2026-03-09T00:04:17.872 INFO:tasks.workunit.client.0.vm03.stdout:2/895: dread d8/d1b/d24/da5/dda/def/d127/f68 [0,4194304] 0
2026-03-09T00:04:17.874 INFO:tasks.workunit.client.0.vm03.stdout:5/889: dwrite d1c/d20/d55/f34 [0,4194304] 0
2026-03-09T00:04:17.879 INFO:tasks.workunit.client.0.vm03.stdout:2/896: dread d8/d1b/d2a/d6b/d50/d8a/fc0 [0,4194304] 0
2026-03-09T00:04:17.879 INFO:tasks.workunit.client.0.vm03.stdout:2/897: readlink d8/le 0
2026-03-09T00:04:17.886 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.886+0000 7fec0f62d700 1 -- 192.168.123.103:0/2995667628 --> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fec080776d0 con 0x7febf40776c0
2026-03-09T00:04:17.886 INFO:tasks.workunit.client.0.vm03.stdout:8/872: creat d7/df/d1a/d40/d9d/df2/d38/d60/dcd/f106 x:0 0 0
2026-03-09T00:04:17.890 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.889+0000 7febfb7fe700 1 -- 192.168.123.103:0/2995667628 <== mgr.24393 v2:192.168.123.103:6800/3123605642 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+370 (secure 0 0 0) 0x7fec080776d0 con 0x7febf40776c0
2026-03-09T00:04:17.893 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.891+0000 7febf97fa700 1 -- 192.168.123.103:0/2995667628 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7febf40776c0 msgr2=0x7febf4079b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:17.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.891+0000 7febf97fa700 1 --2- 192.168.123.103:0/2995667628 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7febf40776c0 0x7febf4079b80 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fec0400db80 tx=0x7fec04006040 comp rx=0 tx=0).stop
2026-03-09T00:04:17.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.891+0000 7febf97fa700 1 -- 192.168.123.103:0/2995667628 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fec08072b20 msgr2=0x7fec08083060 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:17.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.891+0000 7febf97fa700 1 --2- 192.168.123.103:0/2995667628 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fec08072b20 0x7fec08083060 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7febfc00b770 tx=0x7febfc00bb30 comp rx=0 tx=0).stop
2026-03-09T00:04:17.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.893+0000 7febf97fa700 1 -- 192.168.123.103:0/2995667628 shutdown_connections
2026-03-09T00:04:17.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.893+0000 7febf97fa700 1 --2- 192.168.123.103:0/2995667628 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7febf40776c0 0x7febf4079b80 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:17.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.893+0000 7febf97fa700 1 --2- 192.168.123.103:0/2995667628 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fec08072b20 0x7fec08083060 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:17.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.893+0000 7febf97fa700 1 --2- 192.168.123.103:0/2995667628 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fec080835a0 0x7fec0812e3e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:17.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.893+0000 7febf97fa700 1 -- 192.168.123.103:0/2995667628 >> 192.168.123.103:0/2995667628 conn(0x7fec0806daa0 msgr2=0x7fec0806ff00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:04:17.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.893+0000 7febf97fa700 1 -- 192.168.123.103:0/2995667628 shutdown_connections
2026-03-09T00:04:17.894 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:17.893+0000 7febf97fa700 1 -- 192.168.123.103:0/2995667628 wait complete.
2026-03-09T00:04:17.932 INFO:tasks.workunit.client.0.vm03.stdout:1/948: creat d4/d15/dae/d101/f13b x:0 0 0
2026-03-09T00:04:17.932 INFO:tasks.workunit.client.0.vm03.stdout:1/949: fsync d4/d3a/d43/f47 0
2026-03-09T00:04:17.953 INFO:tasks.workunit.client.0.vm03.stdout:6/816: getdents d13/d1e/d44/d59/dec 0
2026-03-09T00:04:17.953 INFO:tasks.workunit.client.0.vm03.stdout:6/817: chown d13/d35/d71/d97/ded/l105 78 1
2026-03-09T00:04:17.953 INFO:tasks.workunit.client.0.vm03.stdout:6/818: fdatasync d13/f31 0
2026-03-09T00:04:17.979 INFO:tasks.workunit.client.0.vm03.stdout:2/898: dwrite d8/d1b/d24/da5/dc9/fe8 [0,4194304] 0
2026-03-09T00:04:17.979 INFO:tasks.workunit.client.0.vm03.stdout:2/899: dread - d8/d26/d5e/d5f/d116/f11c zero size
2026-03-09T00:04:17.979 INFO:tasks.workunit.client.0.vm03.stdout:5/890: dwrite d1c/d20/d55/db0/dc7/d101/f9a [0,4194304] 0
2026-03-09T00:04:17.991 INFO:tasks.workunit.client.0.vm03.stdout:6/819: dwrite d13/d1e/d44/d59/dec/d62/f103 [0,4194304] 0
2026-03-09T00:04:17.995 INFO:tasks.workunit.client.0.vm03.stdout:9/803: rmdir d15/d7f 39
2026-03-09T00:04:17.995 INFO:tasks.workunit.client.0.vm03.stdout:9/804: fdatasync d15/d1c/d21/d75/de0/fc9 0
2026-03-09T00:04:17.995 INFO:tasks.workunit.client.0.vm03.stdout:9/805: write d15/d1c/d28/dda/f10a [732782,58442] 0
2026-03-09T00:04:18.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.014+0000 7f582b59e700 1 -- 192.168.123.103:0/1968015581 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f582c075a10 msgr2=0x7f582c077ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:18.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.014+0000 7f582b59e700 1 --2- 192.168.123.103:0/1968015581 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f582c075a10 0x7f582c077ea0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f582400b3a0 tx=0x7f582400b6b0 comp rx=0 tx=0).stop
2026-03-09T00:04:18.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.014+0000 7f582b59e700 1 -- 192.168.123.103:0/1968015581 shutdown_connections
2026-03-09T00:04:18.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.014+0000 7f582b59e700 1 --2- 192.168.123.103:0/1968015581 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f582c075a10 0x7f582c077ea0 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:18.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.014+0000 7f582b59e700 1 --2- 192.168.123.103:0/1968015581 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f582c072b20 0x7f582c072f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:18.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.014+0000 7f582b59e700 1 -- 192.168.123.103:0/1968015581 >> 192.168.123.103:0/1968015581 conn(0x7f582c06daa0 msgr2=0x7f582c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:04:18.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.014+0000 7f582b59e700 1 -- 192.168.123.103:0/1968015581 shutdown_connections
2026-03-09T00:04:18.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.014+0000 7f582b59e700 1 -- 192.168.123.103:0/1968015581 wait complete.
2026-03-09T00:04:18.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.014+0000 7f582b59e700 1 Processor -- start
2026-03-09T00:04:18.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.015+0000 7f582b59e700 1 -- start start
2026-03-09T00:04:18.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.015+0000 7f582b59e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f582c072b20 0x7f582c083060 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:18.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.015+0000 7f582b59e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f582c0835a0 0x7f582c1b30f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:18.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.015+0000 7f582b59e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f582c083ab0 con 0x7f582c072b20
2026-03-09T00:04:18.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.015+0000 7f582b59e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f582c083c20 con 0x7f582c0835a0
2026-03-09T00:04:18.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.015+0000 7f5829d9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f582c0835a0 0x7f582c1b30f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:18.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.015+0000 7f5829d9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f582c0835a0 0x7f582c1b30f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:49390/0 (socket says 192.168.123.103:49390)
2026-03-09T00:04:18.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.015+0000 7f5829d9b700 1 -- 192.168.123.103:0/2262694666 learned_addr learned my addr 192.168.123.103:0/2262694666 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:04:18.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.018+0000 7f5829d9b700 1 -- 192.168.123.103:0/2262694666 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f582c072b20 msgr2=0x7f582c083060 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:18.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.018+0000 7f5829d9b700 1 --2- 192.168.123.103:0/2262694666 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f582c072b20 0x7f582c083060 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:18.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.018+0000 7f5829d9b700 1 -- 192.168.123.103:0/2262694666 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f581c009710 con 0x7f582c0835a0
2026-03-09T00:04:18.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.018+0000 7f5829d9b700 1 --2- 192.168.123.103:0/2262694666 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f582c0835a0 0x7f582c1b30f0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f5824007b60 tx=0x7f5824007bc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:04:18.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.019+0000 7f581b7fe700 1 -- 192.168.123.103:0/2262694666 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f582400e070 con 0x7f582c0835a0
2026-03-09T00:04:18.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.019+0000 7f582b59e700 1 -- 192.168.123.103:0/2262694666 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f582400b050 con 0x7f582c0835a0
2026-03-09T00:04:18.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.019+0000 7f582b59e700 1 -- 192.168.123.103:0/2262694666 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f582c1b39f0 con 0x7f582c0835a0
2026-03-09T00:04:18.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.019+0000 7f581b7fe700 1 -- 192.168.123.103:0/2262694666 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5824007e90 con 0x7f582c0835a0
2026-03-09T00:04:18.021 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.019+0000 7f581b7fe700 1 -- 192.168.123.103:0/2262694666 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f582401bb40 con 0x7f582c0835a0
2026-03-09T00:04:18.021 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.021+0000 7f581b7fe700 1 -- 192.168.123.103:0/2262694666 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f582401bca0 con 0x7f582c0835a0
2026-03-09T00:04:18.022 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.022+0000 7f581b7fe700 1 --2- 192.168.123.103:0/2262694666 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f5814077780 0x7f5814079c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:18.022 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.023+0000 7f581b7fe700 1 -- 192.168.123.103:0/2262694666 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f582409e560 con 0x7f582c0835a0
2026-03-09T00:04:18.022 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.023+0000 7f582b59e700 1 -- 192.168.123.103:0/2262694666 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f580c005320 con 0x7f582c0835a0
2026-03-09T00:04:18.024 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.025+0000 7f582a59c700 1 --2- 192.168.123.103:0/2262694666 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f5814077780 0x7f5814079c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:18.025 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.026+0000 7f581b7fe700 1 -- 192.168.123.103:0/2262694666 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f5824017070 con 0x7f582c0835a0
2026-03-09T00:04:18.025 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.026+0000 7f582a59c700 1 --2- 192.168.123.103:0/2262694666 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f5814077780 0x7f5814079c40 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f581c005d90 tx=0x7f581c005ce0 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:04:18.032 INFO:tasks.workunit.client.0.vm03.stdout:0/876: mknod d2/da/dd/d49/d6c/d4b/dc1/c13b 0
2026-03-09T00:04:18.037 INFO:tasks.workunit.client.0.vm03.stdout:0/877: dread d2/da/d1a/fc4 [0,4194304] 0
2026-03-09T00:04:18.039 INFO:tasks.workunit.client.0.vm03.stdout:3/708: dread d2/db/d3b/d5f/da5/f6e [0,4194304] 0
2026-03-09T00:04:18.048 INFO:tasks.workunit.client.0.vm03.stdout:8/873: mknod d7/df/d1a/d40/d9d/df2/d38/d4c/c107 0
2026-03-09T00:04:18.053 INFO:tasks.workunit.client.0.vm03.stdout:1/950: truncate d4/d15/d77/f7c 940520 0
2026-03-09T00:04:18.070 INFO:tasks.workunit.client.0.vm03.stdout:5/891: symlink d1c/d20/d55/d4f/d58/db5/l11f 0
2026-03-09T00:04:18.072 INFO:tasks.workunit.client.0.vm03.stdout:7/794: sync
2026-03-09T00:04:18.072 INFO:tasks.workunit.client.0.vm03.stdout:4/958: rmdir d7/d20/d6a/dea/d38/dfb 39
2026-03-09T00:04:18.072 INFO:tasks.workunit.client.0.vm03.stdout:4/959: fdatasync d7/d20/d6a/d77/f83 0
2026-03-09T00:04:18.075 INFO:tasks.workunit.client.0.vm03.stdout:9/806: link d15/d1c/d28/l2b d15/d1c/d21/d64/l10f 0
2026-03-09T00:04:18.079 INFO:tasks.workunit.client.0.vm03.stdout:5/892: dread d1c/d20/f4e [4194304,4194304] 0
2026-03-09T00:04:18.085 INFO:tasks.workunit.client.0.vm03.stdout:0/878: creat d2/da/d36/da4/f13c x:0 0 0
2026-03-09T00:04:18.088 INFO:tasks.workunit.client.0.vm03.stdout:3/709: creat d2/db/d3b/d5f/da5/d72/d96/fdb x:0 0 0
2026-03-09T00:04:18.088 INFO:tasks.workunit.client.0.vm03.stdout:3/710: write d2/db/d2d/f52 [4330891,30886] 0
2026-03-09T00:04:18.095 INFO:tasks.workunit.client.0.vm03.stdout:8/874: link d7/df/d1a/d2b/f72 d7/df/d1a/d40/d9d/df2/d38/d91/d103/f108 0
2026-03-09T00:04:18.116 INFO:tasks.workunit.client.0.vm03.stdout:8/875: read d7/df/d1a/d2b/f9f [2677212,55727] 0
2026-03-09T00:04:18.117 INFO:tasks.workunit.client.0.vm03.stdout:1/951: mknod d4/d3a/d61/da6/dc3/c13c 0
2026-03-09T00:04:18.117 INFO:tasks.workunit.client.0.vm03.stdout:1/952: stat d4/d15/dae 0
2026-03-09T00:04:18.122 INFO:tasks.workunit.client.0.vm03.stdout:1/953: dread d4/d3a/f2c [0,4194304] 0
2026-03-09T00:04:18.122 INFO:tasks.workunit.client.0.vm03.stdout:1/954: fdatasync d4/d3a/d3d/d98/dee/d9e/d12e/ffc 0
2026-03-09T00:04:18.122 INFO:tasks.workunit.client.0.vm03.stdout:1/955: chown d4/d15/f6d 3 1
2026-03-09T00:04:18.135 INFO:tasks.workunit.client.0.vm03.stdout:2/900: truncate d8/d1b/d2a/d6b/d50/d8a/fc3 489471 0
2026-03-09T00:04:18.135 INFO:tasks.workunit.client.0.vm03.stdout:1/956: dread d4/d3a/d61/d78/f8e [0,4194304] 0
2026-03-09T00:04:18.141 INFO:tasks.workunit.client.0.vm03.stdout:5/893: dwrite d1c/d20/d55/fdb [0,4194304] 0
2026-03-09T00:04:18.159 INFO:tasks.workunit.client.0.vm03.stdout:7/795: creat d2/d1f/d3a/d24/da4/d46/ff2 x:0 0 0
2026-03-09T00:04:18.159 INFO:tasks.workunit.client.0.vm03.stdout:7/796: readlink d2/d1f/d3a/d24/da4/d46/d81/d96/d37/d39/l75 0
2026-03-09T00:04:18.168 INFO:tasks.workunit.client.0.vm03.stdout:4/960: symlink d7/d6f/dcf/de8/dee/d129/d85/l12b 0
2026-03-09T00:04:18.168 INFO:tasks.workunit.client.0.vm03.stdout:4/961: chown d7/d27/dc9/l126 115653584 1
2026-03-09T00:04:18.168 INFO:tasks.workunit.client.0.vm03.stdout:4/962: creat d7/d20/d6a/d77/d25/de2/f12c x:0 0 0
2026-03-09T00:04:18.171 INFO:tasks.workunit.client.0.vm03.stdout:9/807: mknod d15/d1c/d28/d6e/c110 0
2026-03-09T00:04:18.172 INFO:tasks.workunit.client.0.vm03.stdout:0/879: truncate d2/da/dd/f75 1563174 0
2026-03-09T00:04:18.172 INFO:tasks.workunit.client.0.vm03.stdout:0/880: creat d2/da/d36/ddf/df7/d12a/f13d x:0 0 0
2026-03-09T00:04:18.172 INFO:tasks.workunit.client.0.vm03.stdout:0/881: dread - d2/da/dd/d49/d6c/d4b/ff0 zero size
2026-03-09T00:04:18.173 INFO:tasks.workunit.client.0.vm03.stdout:0/882: write d2/da/f44 [67717,10850] 0
2026-03-09T00:04:18.183 INFO:tasks.workunit.client.0.vm03.stdout:8/876: rename d7/f9 to d7/df/d1a/d40/d58/f109 0
2026-03-09T00:04:18.183 INFO:tasks.workunit.client.0.vm03.stdout:8/877: fdatasync d7/df/d1a/d40/d58/fb6 0
2026-03-09T00:04:18.201 INFO:tasks.workunit.client.0.vm03.stdout:7/797: dwrite d2/d4/db7/daa/fec [0,4194304] 0
2026-03-09T00:04:18.206 INFO:tasks.workunit.client.0.vm03.stdout:7/798: dread d2/d1f/d3a/d24/da4/d46/f5b [0,4194304] 0
2026-03-09T00:04:18.208 INFO:tasks.workunit.client.0.vm03.stdout:2/901: dwrite d8/d26/d5e/d5f/f48 [0,4194304] 0
2026-03-09T00:04:18.218 INFO:tasks.workunit.client.0.vm03.stdout:1/957: mkdir d4/d15/d77/d8c/d13d 0
2026-03-09T00:04:18.218 INFO:tasks.workunit.client.0.vm03.stdout:1/958: chown f2 2 1
2026-03-09T00:04:18.218 INFO:tasks.workunit.client.0.vm03.stdout:5/894: truncate d1c/d20/d55/f46 744318 0
2026-03-09T00:04:18.221 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:18.221 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:18.221 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:04:18.221 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:04:18.221 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:18.221
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:04:18.221 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:18.221 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: Upgrade: Setting container_image for all mgr 2026-03-09T00:04:18.221 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: pgmap v12: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 158 MiB/s rd, 190 MiB/s wr, 298 op/s 2026-03-09T00:04:18.221 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:18.221 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm03.yvcons"}]: dispatch 2026-03-09T00:04:18.224 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm03.yvcons"}]': finished 2026-03-09T00:04:18.224 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm06.rzcvhn"}]: dispatch 2026-03-09T00:04:18.224 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm06.rzcvhn"}]': finished 2026-03-09T00:04:18.224 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:18.224 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: Upgrade: Setting container_image for all rgw 2026-03-09T00:04:18.224 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:18.224 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:18.224 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: Upgrade: Setting container_image for all rbd-mirror 2026-03-09T00:04:18.224 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:18.224 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='client.24423 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:04:18.224 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local 
ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:18.224 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: Upgrade: Setting container_image for all cephfs-mirror 2026-03-09T00:04:18.224 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:18.224 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:18.225 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: Upgrade: Setting container_image for all iscsi 2026-03-09T00:04:18.225 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:18.225 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:18.225 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: Upgrade: Setting container_image for all nfs 2026-03-09T00:04:18.225 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:18.225 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:18.225 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: Upgrade: Setting container_image for all nvmeof 2026-03-09T00:04:18.225 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:18.225 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:18 vm06.local ceph-mon[58395]: from='client.24427 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:04:18.228 INFO:tasks.workunit.client.0.vm03.stdout:0/883: symlink d2/l13e 0 2026-03-09T00:04:18.234 INFO:tasks.workunit.client.0.vm03.stdout:3/711: getdents d2/db/d3b/d3f 0 2026-03-09T00:04:18.246 INFO:tasks.workunit.client.0.vm03.stdout:4/963: dwrite d7/d20/d6a/dea/f2a [4194304,4194304] 0 2026-03-09T00:04:18.252 INFO:tasks.workunit.client.0.vm03.stdout:8/878: dwrite d7/df/d1a/d40/d9d/df2/d3f/d95/fcb [0,4194304] 0 2026-03-09T00:04:18.259 INFO:tasks.workunit.client.0.vm03.stdout:9/808: rename d15/d1c/c90 to d15/d77/de2/c111 0 2026-03-09T00:04:18.259 INFO:tasks.workunit.client.0.vm03.stdout:5/895: write d1c/d20/d55/d4f/f118 [842535,39491] 0 2026-03-09T00:04:18.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.263+0000 7f582b59e700 1 -- 192.168.123.103:0/2262694666 --> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f580c000bf0 con 0x7f5814077780 2026-03-09T00:04:18.264 INFO:tasks.workunit.client.0.vm03.stdout:6/820: write d13/d35/d74/d89/d9d/fa7 [1299469,69872] 0 2026-03-09T00:04:18.272 
INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (3m) 2s ago 4m 24.9M - 0.25.0 c8568f914cd2 9b05d2f3502a
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (4m) 2s ago 4m 8368k - 18.2.1 5be31c24972a b93f8a220f71
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (4m) 3s ago 4m 8510k - 18.2.1 5be31c24972a d06aea65065e
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (4m) 2s ago 4m 7402k - 18.2.1 5be31c24972a 320f8ef2d2cb
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (4m) 3s ago 4m 7411k - 18.2.1 5be31c24972a d9eb9a54d81d
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (3m) 2s ago 4m 88.8M - 9.4.7 954c08fa6188 9db2e5805e97
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (2m) 2s ago 2m 16.3M - 18.2.1 5be31c24972a 404501ca3f76
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (2m) 2s ago 2m 235M - 18.2.1 5be31c24972a b71cb8823eff
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (2m) 3s ago 2m 18.4M - 18.2.1 5be31c24972a 868f24dd3b07
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 running (2m) 3s ago 2m 14.6M - 18.2.1 5be31c24972a 84dbd6c37a69
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:8443,9283,8765 running (28s) 2s ago 5m 612M - 19.2.3-678-ge911bdeb 654f31e6858e 5c5f89207f88
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (5s) 3s ago 4m 60.2M - 19.2.3-678-ge911bdeb 654f31e6858e e3d70135f3ac
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (5m) 2s ago 5m 52.6M 2048M 18.2.1 5be31c24972a f9863944dcfb
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (4m) 3s ago 4m 42.8M 2048M 18.2.1 5be31c24972a 1e39c7ad3e9f
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (4m) 2s ago 4m 14.8M - 1.5.0 0da6a335fe13 750af7597536
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (4m) 3s ago 4m 15.4M - 1.5.0 0da6a335fe13 a82b7dc84593
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (3m) 2s ago 3m 377M 4096M 18.2.1 5be31c24972a 7582c56d43e3
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (3m) 2s ago 3m 400M 4096M 18.2.1 5be31c24972a 7bc729875521
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (3m) 2s ago 3m 311M 4096M 18.2.1 5be31c24972a 00566abbcc16
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (3m) 3s ago 3m 418M 4096M 18.2.1 5be31c24972a 1ece32056ab6
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (3m) 3s ago 3m 390M 4096M 18.2.1 5be31c24972a ee6260a1124c
2026-03-09T00:04:18.272 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (2m) 3s ago 2m 373M 4096M 18.2.1 5be31c24972a f51e8cd94301
2026-03-09T00:04:18.272
INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (8s) 2s ago 4m 42.9M - 2.43.0 a07b618ecd1d dc28bcba2c0d 2026-03-09T00:04:18.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.271+0000 7f581b7fe700 1 -- 192.168.123.103:0/2262694666 <== mgr.24393 v2:192.168.123.103:6800/3123605642 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f580c000bf0 con 0x7f5814077780 2026-03-09T00:04:18.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.275+0000 7f58197fa700 1 -- 192.168.123.103:0/2262694666 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f5814077780 msgr2=0x7f5814079c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:04:18.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.275+0000 7f58197fa700 1 --2- 192.168.123.103:0/2262694666 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f5814077780 0x7f5814079c40 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f581c005d90 tx=0x7f581c005ce0 comp rx=0 tx=0).stop 2026-03-09T00:04:18.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.275+0000 7f58197fa700 1 -- 192.168.123.103:0/2262694666 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f582c0835a0 msgr2=0x7f582c1b30f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:04:18.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.275+0000 7f58197fa700 1 --2- 192.168.123.103:0/2262694666 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f582c0835a0 0x7f582c1b30f0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f5824007b60 tx=0x7f5824007bc0 comp rx=0 tx=0).stop 2026-03-09T00:04:18.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.277+0000 7f58197fa700 1 -- 192.168.123.103:0/2262694666 shutdown_connections 2026-03-09T00:04:18.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.277+0000 7f58197fa700 1 --2- 192.168.123.103:0/2262694666 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f5814077780 0x7f5814079c40 secure :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f581c005d90 tx=0x7f581c005ce0 comp rx=0 tx=0).stop 2026-03-09T00:04:18.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.277+0000 7f58197fa700 1 --2- 192.168.123.103:0/2262694666 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f582c072b20 0x7f582c083060 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:04:18.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.277+0000 7f58197fa700 1 --2- 192.168.123.103:0/2262694666 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f582c0835a0 0x7f582c1b30f0 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:04:18.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.277+0000 7f58197fa700 1 -- 192.168.123.103:0/2262694666 >> 192.168.123.103:0/2262694666 conn(0x7f582c06daa0 msgr2=0x7f582c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:04:18.278 INFO:tasks.workunit.client.0.vm03.stdout:1/959: creat d4/d15/d77/dce/f13e x:0 0 0 2026-03-09T00:04:18.278 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.278+0000 7f58197fa700 1 -- 192.168.123.103:0/2262694666 shutdown_connections 2026-03-09T00:04:18.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.279+0000 7f58197fa700 1 -- 
192.168.123.103:0/2262694666 wait complete. 2026-03-09T00:04:18.280 INFO:tasks.workunit.client.0.vm03.stdout:2/902: dwrite d8/f11 [0,4194304] 0 2026-03-09T00:04:18.308 INFO:tasks.workunit.client.0.vm03.stdout:3/712: mknod d2/db/d40/d58/cdc 0 2026-03-09T00:04:18.308 INFO:tasks.workunit.client.0.vm03.stdout:3/713: chown d2/db/d3b/d5d/fba 2 1 2026-03-09T00:04:18.308 INFO:tasks.workunit.client.0.vm03.stdout:3/714: chown d2/db/d40 270 1 2026-03-09T00:04:18.308 INFO:tasks.workunit.client.0.vm03.stdout:5/896: read d1c/d20/d55/f46 [181341,112700] 0 2026-03-09T00:04:18.310 INFO:tasks.workunit.client.0.vm03.stdout:4/964: mknod d7/d20/d6a/c12d 0 2026-03-09T00:04:18.320 INFO:tasks.workunit.client.0.vm03.stdout:8/879: link d7/df/d1a/d40/d58/fb6 d7/df/d1a/d40/d9d/da3/df0/f10a 0 2026-03-09T00:04:18.320 INFO:tasks.workunit.client.0.vm03.stdout:7/799: rename d2/d1f/d3a/f1a to d2/d1f/d3a/d24/da4/d46/d81/ff3 0 2026-03-09T00:04:18.320 INFO:tasks.workunit.client.0.vm03.stdout:7/800: chown d2/d1f/d3a/d24/da4/d46/d81/d96/d80 634047 1 2026-03-09T00:04:18.320 INFO:tasks.workunit.client.0.vm03.stdout:8/880: chown d7/df/d1a/d40/d9d/df2/d38/d60/ca4 117 1 2026-03-09T00:04:18.320 INFO:tasks.workunit.client.0.vm03.stdout:9/809: mkdir d15/d1c/d21/d112 0 2026-03-09T00:04:18.320 INFO:tasks.workunit.client.0.vm03.stdout:9/810: fdatasync d15/d1c/d21/d64/fc2 0 2026-03-09T00:04:18.341 INFO:tasks.workunit.client.0.vm03.stdout:6/821: link d13/d1e/f10e d13/d35/d74/d89/db3/f117 0 2026-03-09T00:04:18.341 INFO:tasks.workunit.client.0.vm03.stdout:1/960: truncate d4/d5e/fd3 2493437 0 2026-03-09T00:04:18.363 INFO:tasks.workunit.client.0.vm03.stdout:3/715: creat d2/db/d40/d44/db5/fdd x:0 0 0 2026-03-09T00:04:18.363 INFO:tasks.workunit.client.0.vm03.stdout:3/716: chown d2/db/d40/d58/cdc 2979 1 2026-03-09T00:04:18.363 INFO:tasks.workunit.client.0.vm03.stdout:4/965: getdents d7/d6f/dcf/de8/dee/d129/d11f 0 2026-03-09T00:04:18.374 INFO:tasks.workunit.client.0.vm03.stdout:4/966: dread d7/d20/d6a/d77/d25/fb8 [4194304,4194304] 0 2026-03-09T00:04:18.378 INFO:tasks.workunit.client.0.vm03.stdout:4/967: write d7/d20/fbf [1130278,38725] 0 2026-03-09T00:04:18.380 INFO:tasks.workunit.client.0.vm03.stdout:9/811: dwrite d15/d1c/d21/d54/f73 [0,4194304] 0 2026-03-09T00:04:18.385 INFO:tasks.workunit.client.0.vm03.stdout:8/881: mknod d7/df/d1a/d40/d9d/da3/dd2/c10b 0 2026-03-09T00:04:18.399 INFO:tasks.workunit.client.0.vm03.stdout:1/961: rename d4/d3a/d61/d78/l7e to d4/d15/dae/dcb/l13f 0 2026-03-09T00:04:18.400 INFO:tasks.workunit.client.0.vm03.stdout:3/717: creat d2/db/d40/d88/fde x:0 0 0 2026-03-09T00:04:18.400 INFO:tasks.workunit.client.0.vm03.stdout:3/718: creat d2/db/d3b/d5d/fdf x:0 0 0 2026-03-09T00:04:18.400 INFO:tasks.workunit.client.0.vm03.stdout:3/719: creat d2/db/d3b/d5f/da5/d72/dbd/fe0 x:0 0 0 2026-03-09T00:04:18.400 INFO:tasks.workunit.client.0.vm03.stdout:3/720: write d2/db/d40/d58/fd0 [4515574,5050] 0 2026-03-09T00:04:18.400 INFO:tasks.workunit.client.0.vm03.stdout:8/882: truncate d7/df/f3d 1844390 0 2026-03-09T00:04:18.406 INFO:tasks.workunit.client.0.vm03.stdout:9/812: read d15/d1c/d21/d64/f3d [1845146,89077] 0 2026-03-09T00:04:18.416 INFO:tasks.workunit.client.0.vm03.stdout:1/962: write d4/d3a/d3d/d46/f5d [719514,118557] 0 2026-03-09T00:04:18.416 INFO:tasks.workunit.client.0.vm03.stdout:1/963: stat d4/d3a/d3d/d98/dee/d9e/d12e/c9d 0 2026-03-09T00:04:18.416 INFO:tasks.workunit.client.0.vm03.stdout:1/964: fsync d4/d3a/d32/d87/fd5 0 2026-03-09T00:04:18.416 INFO:tasks.workunit.client.0.vm03.stdout:8/883: dread f3 [0,4194304] 0 
2026-03-09T00:04:18.416 INFO:tasks.workunit.client.0.vm03.stdout:8/884: fsync d7/df/d1a/d40/d9d/da3/df0/f10a 0 2026-03-09T00:04:18.416 INFO:tasks.workunit.client.0.vm03.stdout:4/968: rename d7/d20/d6a/l87 to d7/d20/d6a/d77/d25/de2/df1/d11d/l12e 0 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: Upgrade: Setting container_image for all mgr 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: pgmap v12: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 158 MiB/s rd, 190 MiB/s wr, 298 op/s 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm03.yvcons"}]: dispatch 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm03.yvcons"}]': finished 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm06.rzcvhn"}]: dispatch 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm06.rzcvhn"}]': finished 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 
192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: Upgrade: Setting container_image for all rgw 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: Upgrade: Setting container_image for all rbd-mirror 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='client.24423 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: Upgrade: Setting container_image for all cephfs-mirror 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: Upgrade: Setting container_image for all iscsi 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: Upgrade: Setting container_image for all nfs 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: Upgrade: Setting container_image for all nvmeof 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:18.417 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:18 
vm03.local ceph-mon[52346]: from='client.24427 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:04:18.425 INFO:tasks.workunit.client.0.vm03.stdout:9/813: link d15/d1c/d36/fa8 d15/d1c/d21/d75/f113 0 2026-03-09T00:04:18.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.428+0000 7fc3f0d6d700 1 -- 192.168.123.103:0/2976353683 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc3ec072b20 msgr2=0x7fc3ec072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:04:18.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.428+0000 7fc3f0d6d700 1 --2- 192.168.123.103:0/2976353683 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc3ec072b20 0x7fc3ec072f40 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7fc3dc007780 tx=0x7fc3dc00c050 comp rx=0 tx=0).stop 2026-03-09T00:04:18.428 INFO:tasks.workunit.client.0.vm03.stdout:4/969: getdents d7/d20/d6a/d77 0 2026-03-09T00:04:18.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.429+0000 7fc3f0d6d700 1 -- 192.168.123.103:0/2976353683 shutdown_connections 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.429+0000 7fc3f0d6d700 1 --2- 192.168.123.103:0/2976353683 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3ec075a10 0x7fc3ec077ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.429+0000 7fc3f0d6d700 1 --2- 192.168.123.103:0/2976353683 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc3ec072b20 0x7fc3ec072f40 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.429+0000 7fc3f0d6d700 1 -- 192.168.123.103:0/2976353683 >> 192.168.123.103:0/2976353683 conn(0x7fc3ec06daa0 msgr2=0x7fc3ec06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.429+0000 7fc3f0d6d700 1 -- 192.168.123.103:0/2976353683 shutdown_connections 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.429+0000 7fc3f0d6d700 1 -- 192.168.123.103:0/2976353683 wait complete. 
2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.429+0000 7fc3f0d6d700 1 Processor -- start 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.429+0000 7fc3f0d6d700 1 -- start start 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.429+0000 7fc3f0d6d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc3ec072b20 0x7fc3ec083080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.429+0000 7fc3f0d6d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3ec075a10 0x7fc3ec0835c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.430+0000 7fc3f0d6d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc3ec083c30 con 0x7fc3ec075a10 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.430+0000 7fc3f0d6d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc3ec12e380 con 0x7fc3ec072b20 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.430+0000 7fc3eb7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc3ec072b20 0x7fc3ec083080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.430+0000 7fc3eb7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc3ec072b20 0x7fc3ec083080 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:49408/0 (socket says 192.168.123.103:49408) 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.430+0000 7fc3eb7fe700 1 -- 192.168.123.103:0/2739134642 learned_addr learned my addr 192.168.123.103:0/2739134642 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.430+0000 7fc3eb7fe700 1 -- 192.168.123.103:0/2739134642 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3ec075a10 msgr2=0x7fc3ec0835c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.430+0000 7fc3eb7fe700 1 --2- 192.168.123.103:0/2739134642 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3ec075a10 0x7fc3ec0835c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.430+0000 7fc3eb7fe700 1 -- 192.168.123.103:0/2739134642 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc3dc007430 con 0x7fc3ec072b20 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.430+0000 7fc3eb7fe700 1 --2- 192.168.123.103:0/2739134642 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc3ec072b20 0x7fc3ec083080 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fc3dc016040 tx=0x7fc3dc00cdd0 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.434+0000 7fc3e8ff9700 1 -- 192.168.123.103:0/2739134642 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc3dc00f040 con 0x7fc3ec072b20 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.434+0000 7fc3e8ff9700 1 -- 192.168.123.103:0/2739134642 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc3dc008dd0 con 0x7fc3ec072b20 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.434+0000 7fc3e8ff9700 1 -- 192.168.123.103:0/2739134642 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc3dc01ec30 con 0x7fc3ec072b20 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.434+0000 7fc3f0d6d700 1 -- 192.168.123.103:0/2739134642 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc3ec12e600 con 0x7fc3ec072b20 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.434+0000 7fc3f0d6d700 1 -- 192.168.123.103:0/2739134642 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc3ec12eaf0 con 0x7fc3ec072b20 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.434+0000 7fc3d27fc700 1 -- 192.168.123.103:0/2739134642 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc3ec04ea90 con 0x7fc3ec072b20 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.436+0000 7fc3e8ff9700 1 -- 192.168.123.103:0/2739134642 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7fc3dc01e480 con 0x7fc3ec072b20 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.437+0000 7fc3e8ff9700 1 --2- 192.168.123.103:0/2739134642 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fc3d4077730 0x7fc3d4079bf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.437+0000 7fc3e8ff9700 1 -- 192.168.123.103:0/2739134642 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fc3dc09b4e0 con 0x7fc3ec072b20 2026-03-09T00:04:18.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.437+0000 7fc3eaffd700 1 --2- 192.168.123.103:0/2739134642 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fc3d4077730 0x7fc3d4079bf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:04:18.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.438+0000 7fc3e8ff9700 1 -- 192.168.123.103:0/2739134642 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fc3dc063ff0 con 0x7fc3ec072b20 2026-03-09T00:04:18.445 INFO:tasks.workunit.client.0.vm03.stdout:3/721: rename d2/db/d3b/d5f/d65/ca0 to d2/db/d3b/d5f/da5/d72/ce1 0 2026-03-09T00:04:18.445 INFO:tasks.workunit.client.0.vm03.stdout:2/903: sync 2026-03-09T00:04:18.445 
INFO:tasks.workunit.client.0.vm03.stdout:0/884: sync 2026-03-09T00:04:18.445 INFO:tasks.workunit.client.0.vm03.stdout:0/885: dread - d2/da/d36/da4/f13c zero size 2026-03-09T00:04:18.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.454+0000 7fc3eaffd700 1 --2- 192.168.123.103:0/2739134642 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fc3d4077730 0x7fc3d4079bf0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fc3ec129f50 tx=0x7fc3e0006d20 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:04:18.461 INFO:tasks.workunit.client.0.vm03.stdout:9/814: write d15/d1c/d36/f5c [80125,28353] 0 2026-03-09T00:04:18.461 INFO:tasks.workunit.client.0.vm03.stdout:1/965: rename d4/d15/d5c/d103/d11c to d4/d3a/d32/d87/d111/d140 0 2026-03-09T00:04:18.461 INFO:tasks.workunit.client.0.vm03.stdout:2/904: rmdir d8/d1b/d24/da5/dfe 39 2026-03-09T00:04:18.461 INFO:tasks.workunit.client.0.vm03.stdout:2/905: stat d8/d74/cc2 0 2026-03-09T00:04:18.463 INFO:tasks.workunit.client.0.vm03.stdout:5/897: dwrite d1c/d51/d6a/d75/f77 [0,4194304] 0 2026-03-09T00:04:18.463 INFO:tasks.workunit.client.0.vm03.stdout:5/898: chown d1c/d20/d55/db0/c114 7148 1 2026-03-09T00:04:18.463 INFO:tasks.workunit.client.0.vm03.stdout:9/815: write d15/d1c/d36/f78 [2643202,112769] 0 2026-03-09T00:04:18.464 INFO:tasks.workunit.client.0.vm03.stdout:0/886: mknod d2/da/d36/c13f 0 2026-03-09T00:04:18.470 INFO:tasks.workunit.client.0.vm03.stdout:3/722: dread d2/db/d3b/d5f/d65/f90 [0,4194304] 0 2026-03-09T00:04:18.471 INFO:tasks.workunit.client.0.vm03.stdout:7/801: write d2/d1f/d3a/d24/da4/d46/d81/d96/fbb [717803,37422] 0 2026-03-09T00:04:18.471 INFO:tasks.workunit.client.0.vm03.stdout:7/802: chown d2/d1f/d3a/d24/da4/d46/d81/d96/d37/d39/c98 4 1 2026-03-09T00:04:18.471 INFO:tasks.workunit.client.0.vm03.stdout:7/803: fsync d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/d9c/fd7 0 2026-03-09T00:04:18.471 INFO:tasks.workunit.client.0.vm03.stdout:7/804: chown d2/d1f/d3a/d24/da4/f7f 104 1 2026-03-09T00:04:18.476 INFO:tasks.workunit.client.0.vm03.stdout:6/822: dwrite d13/d1e/d44/d59/f6e [0,4194304] 0 2026-03-09T00:04:18.476 INFO:tasks.workunit.client.0.vm03.stdout:6/823: dread - d13/d1e/f10e zero size 2026-03-09T00:04:18.491 INFO:tasks.workunit.client.0.vm03.stdout:8/885: dwrite d7/df/d1a/d2b/f9f [0,4194304] 0 2026-03-09T00:04:18.491 INFO:tasks.workunit.client.0.vm03.stdout:1/966: rename d4/d6/d52/db5/ff2 to d4/d3a/d8f/d104/f141 0 2026-03-09T00:04:18.495 INFO:tasks.workunit.client.0.vm03.stdout:8/886: dread d7/df/d1a/f2e [0,4194304] 0 2026-03-09T00:04:18.500 INFO:tasks.workunit.client.0.vm03.stdout:2/906: mkdir d8/d74/d128 0 2026-03-09T00:04:18.503 INFO:tasks.workunit.client.0.vm03.stdout:1/967: write d4/d3a/d43/fba [1192989,19832] 0 2026-03-09T00:04:18.531 INFO:tasks.workunit.client.0.vm03.stdout:5/899: creat d1c/d107/f120 x:0 0 0 2026-03-09T00:04:18.538 INFO:tasks.workunit.client.0.vm03.stdout:9/816: symlink d15/d77/de2/l114 0 2026-03-09T00:04:18.550 INFO:tasks.workunit.client.0.vm03.stdout:0/887: truncate d2/da/f4f 2005730 0 2026-03-09T00:04:18.550 INFO:tasks.workunit.client.0.vm03.stdout:3/723: mknod d2/db/d3b/d5f/da5/d72/d96/ce2 0 2026-03-09T00:04:18.551 INFO:tasks.workunit.client.0.vm03.stdout:4/970: dwrite d7/d20/d6a/dea/d4e/dd0/ffc [0,4194304] 0 2026-03-09T00:04:18.553 INFO:tasks.workunit.client.0.vm03.stdout:7/805: creat d2/d1f/d3a/d24/da4/d46/d54/d8d/ff4 x:0 0 0 2026-03-09T00:04:18.553 INFO:tasks.workunit.client.0.vm03.stdout:7/806: fsync 
d2/d1f/d3a/d24/da4/d46/d81/d96/fbb 0 2026-03-09T00:04:18.571 INFO:tasks.workunit.client.0.vm03.stdout:6/824: unlink d13/d35/db5/fc6 0 2026-03-09T00:04:18.580 INFO:tasks.workunit.client.0.vm03.stdout:8/887: stat d7/df/d1a/d40/lcc 0 2026-03-09T00:04:18.582 INFO:tasks.workunit.client.0.vm03.stdout:2/907: mknod d8/d1b/d2a/d56/d110/c129 0 2026-03-09T00:04:18.582 INFO:tasks.workunit.client.0.vm03.stdout:2/908: read - d8/d1b/d2a/d2e/fd9 zero size 2026-03-09T00:04:18.582 INFO:tasks.workunit.client.0.vm03.stdout:2/909: fdatasync d8/d1b/d2a/d6b/dc6/fd2 0 2026-03-09T00:04:18.582 INFO:tasks.workunit.client.0.vm03.stdout:2/910: readlink d8/d1b/d8f/la0 0 2026-03-09T00:04:18.593 INFO:tasks.workunit.client.0.vm03.stdout:5/900: unlink d1c/d20/d55/d4f/d58/db5/f6f 0 2026-03-09T00:04:18.593 INFO:tasks.workunit.client.0.vm03.stdout:5/901: stat d1c/d20/d55/db0/lc9 0 2026-03-09T00:04:18.596 INFO:tasks.workunit.client.0.vm03.stdout:9/817: mkdir d15/d1c/d21/d54/d87/d93/d115 0 2026-03-09T00:04:18.596 INFO:tasks.workunit.client.0.vm03.stdout:9/818: creat d15/d1c/d21/d54/d87/d93/dcf/f116 x:0 0 0 2026-03-09T00:04:18.610 INFO:tasks.workunit.client.0.vm03.stdout:0/888: symlink d2/da/dd/d49/d6c/d4b/dc1/l140 0 2026-03-09T00:04:18.610 INFO:tasks.workunit.client.0.vm03.stdout:3/724: rename d2/db/d56/f7d to d2/db/d3b/d5f/da5/dd8/fe3 0 2026-03-09T00:04:18.613 INFO:tasks.workunit.client.0.vm03.stdout:7/807: mknod d2/d1f/d3a/d24/da4/d46/d54/d8d/dd2/cf5 0 2026-03-09T00:04:18.613 INFO:tasks.workunit.client.0.vm03.stdout:7/808: read - d2/d4/d1e/d5e/d7e/fe7 zero size 2026-03-09T00:04:18.614 INFO:tasks.workunit.client.0.vm03.stdout:7/809: chown d2/d1f/d3a/d24/da4/d46/d54 9138451 1 2026-03-09T00:04:18.614 INFO:tasks.workunit.client.0.vm03.stdout:4/971: mknod d7/d20/d6a/dea/d54/c12f 0 2026-03-09T00:04:18.614 INFO:tasks.workunit.client.0.vm03.stdout:4/972: chown d7/d20/d6a/d77/db7 68 1 2026-03-09T00:04:18.621 INFO:tasks.workunit.client.0.vm03.stdout:8/888: creat d7/df/d1a/d40/d9d/df2/d38/f10c x:0 0 0 2026-03-09T00:04:18.622 INFO:tasks.workunit.client.0.vm03.stdout:8/889: chown d7/df/d1a/d40/db3/l89 2610488 1 2026-03-09T00:04:18.622 INFO:tasks.workunit.client.0.vm03.stdout:2/911: mkdir d8/d1b/d24/da5/dfe/d105/d108/d12a 0 2026-03-09T00:04:18.626 INFO:tasks.workunit.client.0.vm03.stdout:5/902: rmdir d1c/d20/d55/d43 39 2026-03-09T00:04:18.639 INFO:tasks.workunit.client.0.vm03.stdout:4/973: write d7/d20/d6a/dea/f2a [8094165,3204] 0 2026-03-09T00:04:18.643 INFO:tasks.workunit.client.0.vm03.stdout:9/819: mknod d15/d1c/d21/d112/c117 0 2026-03-09T00:04:18.651 INFO:tasks.workunit.client.0.vm03.stdout:4/974: write d7/de6/f125 [281164,84825] 0 2026-03-09T00:04:18.651 INFO:tasks.workunit.client.0.vm03.stdout:4/975: write d7/d20/d6a/d77/db7/f128 [443099,24778] 0 2026-03-09T00:04:18.664 INFO:tasks.workunit.client.0.vm03.stdout:6/825: rename d13/d35/d74/fc5 to d13/d1e/d44/f118 0 2026-03-09T00:04:18.664 INFO:tasks.workunit.client.0.vm03.stdout:6/826: fsync d13/f31 0 2026-03-09T00:04:18.664 INFO:tasks.workunit.client.0.vm03.stdout:6/827: chown d13/d35/d71/d97/c9b 25643 1 2026-03-09T00:04:18.664 INFO:tasks.workunit.client.0.vm03.stdout:0/889: unlink d2/fbf 0 2026-03-09T00:04:18.665 INFO:tasks.workunit.client.0.vm03.stdout:3/725: dwrite d2/db/d40/d58/fd0 [0,4194304] 0 2026-03-09T00:04:18.665 INFO:tasks.workunit.client.0.vm03.stdout:3/726: fsync d2/db/d40/d88/f89 0 2026-03-09T00:04:18.676 INFO:tasks.workunit.client.0.vm03.stdout:8/890: creat d7/df/d1a/d40/db3/f10d x:0 0 0 2026-03-09T00:04:18.686 INFO:tasks.workunit.client.0.vm03.stdout:3/727: dread 
d2/db/d40/d88/f89 [0,4194304] 0
2026-03-09T00:04:18.686 INFO:tasks.workunit.client.0.vm03.stdout:3/728: fdatasync d2/f5 0
2026-03-09T00:04:18.690 INFO:tasks.workunit.client.0.vm03.stdout:5/903: dwrite d1c/d20/d55/f7d [0,4194304] 0
2026-03-09T00:04:18.691 INFO:tasks.workunit.client.0.vm03.stdout:9/820: dwrite d15/fe9 [0,4194304] 0
2026-03-09T00:04:18.694 INFO:tasks.workunit.client.0.vm03.stdout:4/976: creat d7/d20/d6a/dea/d38/dfb/f130 x:0 0 0
2026-03-09T00:04:18.709 INFO:tasks.workunit.client.0.vm03.stdout:6/828: creat d13/d35/f119 x:0 0 0
2026-03-09T00:04:18.718 INFO:tasks.workunit.client.0.vm03.stdout:6/829: read d13/f1a [6553077,128277] 0
2026-03-09T00:04:18.718 INFO:tasks.workunit.client.0.vm03.stdout:0/890: symlink d2/da/dd/d49/d6c/d4b/d55/d6f/dad/l141 0
2026-03-09T00:04:18.718 INFO:tasks.workunit.client.0.vm03.stdout:1/968: sync
2026-03-09T00:04:18.718 INFO:tasks.workunit.client.0.vm03.stdout:1/969: chown d4/d15/d77/dce/f13e 61757325 1
2026-03-09T00:04:18.718 INFO:tasks.workunit.client.0.vm03.stdout:1/970: fdatasync d4/d15/f7f 0
2026-03-09T00:04:18.719 INFO:tasks.workunit.client.0.vm03.stdout:1/971: dread f2 [0,4194304] 0
2026-03-09T00:04:18.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.727+0000 7fc3d27fc700 1 -- 192.168.123.103:0/2739134642 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fc3ec0619a0 con 0x7fc3ec072b20
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:    "mon": {
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:        "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:    },
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:    "mgr": {
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:    },
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:    "osd": {
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:        "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:    },
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:    "mds": {
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:        "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:    },
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:    "overall": {
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:        "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 12,
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:    }
2026-03-09T00:04:18.738 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:04:18.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.730+0000 7fc3e8ff9700 1 -- 192.168.123.103:0/2739134642 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7fc3dc063740 con 0x7fc3ec072b20
2026-03-09T00:04:18.739
INFO:tasks.workunit.client.0.vm03.stdout:7/810: rmdir d2/d1f/d3a/d24/da4/d46 39 2026-03-09T00:04:18.739 INFO:tasks.workunit.client.0.vm03.stdout:8/891: mknod d7/df/d1a/d40/d9d/df2/d3f/d95/c10e 0 2026-03-09T00:04:18.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.735+0000 7fc3d27fc700 1 -- 192.168.123.103:0/2739134642 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fc3d4077730 msgr2=0x7fc3d4079bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:04:18.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.735+0000 7fc3d27fc700 1 --2- 192.168.123.103:0/2739134642 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fc3d4077730 0x7fc3d4079bf0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fc3ec129f50 tx=0x7fc3e0006d20 comp rx=0 tx=0).stop 2026-03-09T00:04:18.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.735+0000 7fc3d27fc700 1 -- 192.168.123.103:0/2739134642 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc3ec072b20 msgr2=0x7fc3ec083080 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:04:18.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.735+0000 7fc3d27fc700 1 --2- 192.168.123.103:0/2739134642 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc3ec072b20 0x7fc3ec083080 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7fc3dc016040 tx=0x7fc3dc00cdd0 comp rx=0 tx=0).stop 2026-03-09T00:04:18.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.735+0000 7fc3d27fc700 1 -- 192.168.123.103:0/2739134642 shutdown_connections 2026-03-09T00:04:18.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.735+0000 7fc3d27fc700 1 --2- 192.168.123.103:0/2739134642 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fc3d4077730 0x7fc3d4079bf0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:04:18.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.735+0000 7fc3d27fc700 1 --2- 192.168.123.103:0/2739134642 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc3ec072b20 0x7fc3ec083080 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:04:18.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.735+0000 7fc3d27fc700 1 --2- 192.168.123.103:0/2739134642 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc3ec075a10 0x7fc3ec0835c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:04:18.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.735+0000 7fc3d27fc700 1 -- 192.168.123.103:0/2739134642 >> 192.168.123.103:0/2739134642 conn(0x7fc3ec06daa0 msgr2=0x7fc3ec06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:04:18.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.735+0000 7fc3d27fc700 1 -- 192.168.123.103:0/2739134642 shutdown_connections 2026-03-09T00:04:18.739 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.735+0000 7fc3d27fc700 1 -- 192.168.123.103:0/2739134642 wait complete. 
2026-03-09T00:04:18.739 INFO:tasks.workunit.client.0.vm03.stdout:7/811: dread d2/d1f/d3a/d24/da4/d46/f5b [0,4194304] 0
2026-03-09T00:04:18.739 INFO:tasks.workunit.client.0.vm03.stdout:7/812: write d2/d1f/d3a/d24/da4/d46/d81/d96/d37/d39/d6e/fef [80409,42461] 0
2026-03-09T00:04:18.750 INFO:tasks.workunit.client.0.vm03.stdout:3/729: link d2/db/d3b/d3f/db8/fc9 d2/db/d40/d51/fe4 0
2026-03-09T00:04:18.764 INFO:tasks.workunit.client.0.vm03.stdout:4/977: dwrite d7/d27/f89 [0,4194304] 0
2026-03-09T00:04:18.768 INFO:tasks.workunit.client.0.vm03.stdout:6/830: symlink d13/dc4/dea/d109/l11a 0
2026-03-09T00:04:18.778 INFO:tasks.workunit.client.0.vm03.stdout:0/891: creat d2/d111/f142 x:0 0 0
2026-03-09T00:04:18.778 INFO:tasks.workunit.client.0.vm03.stdout:3/730: link d2/db/d40/d44/l53 d2/db/d2d/le5 0
2026-03-09T00:04:18.781 INFO:tasks.workunit.client.0.vm03.stdout:6/831: dread d13/d35/fda [0,4194304] 0
2026-03-09T00:04:18.781 INFO:tasks.workunit.client.0.vm03.stdout:6/832: stat d13/d35/d74/d89/ff8 0
2026-03-09T00:04:18.789 INFO:tasks.workunit.client.0.vm03.stdout:1/972: dwrite d4/d15/dae/d12b/fad [4194304,4194304] 0
2026-03-09T00:04:18.789 INFO:tasks.workunit.client.0.vm03.stdout:1/973: write d4/d3a/d32/da1/f113 [4358965,57172] 0
2026-03-09T00:04:18.791 INFO:tasks.workunit.client.0.vm03.stdout:1/974: write d4/d15/d77/ff8 [903398,57650] 0
2026-03-09T00:04:18.796 INFO:tasks.workunit.client.0.vm03.stdout:5/904: dwrite d1c/d20/d55/d4f/d58/d73/d9e/fae [0,4194304] 0
2026-03-09T00:04:18.817 INFO:tasks.workunit.client.0.vm03.stdout:3/731: dwrite d2/db/d3b/d5f/fb3 [0,4194304] 0
2026-03-09T00:04:18.818 INFO:tasks.workunit.client.0.vm03.stdout:4/978: mknod d7/d20/d6a/d77/d25/c131 0
2026-03-09T00:04:18.818 INFO:tasks.workunit.client.0.vm03.stdout:4/979: write d7/d20/d6a/d77/d25/fa1 [5172725,101543] 0
2026-03-09T00:04:18.818 INFO:tasks.workunit.client.0.vm03.stdout:4/980: write d7/d6f/dcf/de8/dee/d129/f6b [2088361,74232] 0
2026-03-09T00:04:18.820 INFO:tasks.workunit.client.0.vm03.stdout:8/892: dwrite d7/df/fee [0,4194304] 0
2026-03-09T00:04:18.824 INFO:tasks.workunit.client.0.vm03.stdout:8/893: write d7/df/d1a/d40/f69 [309306,94029] 0
2026-03-09T00:04:18.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.833+0000 7f859b029700 1 -- 192.168.123.103:0/1162428084 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8594075a40 msgr2=0x7f8594077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:18.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.833+0000 7f859b029700 1 --2- 192.168.123.103:0/1162428084 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8594075a40 0x7f8594077ed0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f858c009230 tx=0x7f858c009260 comp rx=0 tx=0).stop
2026-03-09T00:04:18.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.834+0000 7f859b029700 1 -- 192.168.123.103:0/1162428084 shutdown_connections
2026-03-09T00:04:18.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.834+0000 7f859b029700 1 --2- 192.168.123.103:0/1162428084 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8594075a40 0x7f8594077ed0 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:18.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.834+0000 7f859b029700 1 --2- 192.168.123.103:0/1162428084 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8594072b50 0x7f8594072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:18.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.834+0000 7f859b029700 1 -- 192.168.123.103:0/1162428084 >> 192.168.123.103:0/1162428084 conn(0x7f859406dae0 msgr2=0x7f859406ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:04:18.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.834+0000 7f859b029700 1 -- 192.168.123.103:0/1162428084 shutdown_connections
2026-03-09T00:04:18.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.834+0000 7f859b029700 1 -- 192.168.123.103:0/1162428084 wait complete.
2026-03-09T00:04:18.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.835+0000 7f859b029700 1 Processor -- start
2026-03-09T00:04:18.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.835+0000 7f859b029700 1 -- start start
2026-03-09T00:04:18.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.836+0000 7f859b029700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8594072b50 0x7f85940830a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:18.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.836+0000 7f859b029700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85940835e0 0x7f859412e3f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:18.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.836+0000 7f859b029700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8594083af0 con 0x7f8594072b50
2026-03-09T00:04:18.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.836+0000 7f859b029700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8594083c60 con 0x7f85940835e0
2026-03-09T00:04:18.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.836+0000 7f8593fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85940835e0 0x7f859412e3f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:18.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.836+0000 7f8598dc5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8594072b50 0x7f85940830a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:18.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.836+0000 7f8598dc5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8594072b50 0x7f85940830a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:34260/0 (socket says 192.168.123.103:34260)
2026-03-09T00:04:18.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.836+0000 7f8598dc5700 1 -- 192.168.123.103:0/3447345447 learned_addr learned my addr 192.168.123.103:0/3447345447 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:04:18.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.836+0000 7f8593fff700 1 -- 192.168.123.103:0/3447345447 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8594072b50 msgr2=0x7f85940830a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
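The --2- stderr lines above trace a fresh msgr2 connection through its handshake: s=NONE at connect, then BANNER_CONNECTING, HELLO_CONNECTING (where the client learns its own address), and on to READY, while mark_down drives a connection to CLOSED. A minimal sketch for pulling those per-connection state transitions out of a log like this one; the regex assumes the "conn(0x... s=STATE" layout seen here, and state_transitions is a hypothetical helper:

# Hedged sketch: extract msgr2 connection-state transitions per conn pointer
# from teuthology stderr lines like the ones above.
import re
from collections import defaultdict

CONN_RE = re.compile(r"conn\((0x[0-9a-f]+).*?s=([A-Z_]+)")

def state_transitions(lines):
    seen = defaultdict(list)
    for line in lines:
        m = CONN_RE.search(line)
        if m:
            conn, state = m.groups()
            # record only changes, not repeats of the same state
            if not seen[conn] or seen[conn][-1] != state:
                seen[conn].append(state)
    return dict(seen)

# e.g. state_transitions(open("teuthology.log")) might yield something like
# {'0x7f8594072b50': ['NONE', 'BANNER_CONNECTING', 'HELLO_CONNECTING', ...], ...}

Grouping by the conn pointer (0x7f8594072b50 and friends) keeps the interleaved connections apart.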
2026-03-09T00:04:18.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.836+0000 7f8593fff700 1 --2- 192.168.123.103:0/3447345447 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8594072b50 0x7f85940830a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:18.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.836+0000 7f8593fff700 1 -- 192.168.123.103:0/3447345447 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f858c008ee0 con 0x7f85940835e0
2026-03-09T00:04:18.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.836+0000 7f8593fff700 1 --2- 192.168.123.103:0/3447345447 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85940835e0 0x7f859412e3f0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f858c000f80 tx=0x7f858c003fa0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:04:18.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.839+0000 7f8591ffb700 1 -- 192.168.123.103:0/3447345447 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f858c01d070 con 0x7f85940835e0
2026-03-09T00:04:18.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.840+0000 7f859b029700 1 -- 192.168.123.103:0/3447345447 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f859412e930 con 0x7f85940835e0
2026-03-09T00:04:18.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.840+0000 7f859b029700 1 -- 192.168.123.103:0/3447345447 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f859412ee20 con 0x7f85940835e0
2026-03-09T00:04:18.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.841+0000 7f8591ffb700 1 -- 192.168.123.103:0/3447345447 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f858c007cb0 con 0x7f85940835e0
2026-03-09T00:04:18.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.841+0000 7f8591ffb700 1 -- 192.168.123.103:0/3447345447 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f858c00e950 con 0x7f85940835e0
2026-03-09T00:04:18.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.842+0000 7f8591ffb700 1 -- 192.168.123.103:0/3447345447 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f858c00eab0 con 0x7f85940835e0
2026-03-09T00:04:18.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.843+0000 7f8591ffb700 1 --2- 192.168.123.103:0/3447345447 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f857c077790 0x7f857c079c50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:18.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.843+0000 7f8591ffb700 1 -- 192.168.123.103:0/3447345447 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f858c012070 con 0x7f85940835e0
2026-03-09T00:04:18.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.843+0000 7f8598dc5700 1 --2- 192.168.123.103:0/3447345447 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f857c077790 0x7f857c079c50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:18.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.843+0000 7f8598dc5700 1 --2- 192.168.123.103:0/3447345447 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f857c077790 0x7f857c079c50 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f859412a360 tx=0x7f858400d040 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:04:18.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.844+0000 7f859b029700 1 -- 192.168.123.103:0/3447345447 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8580005320 con 0x7f85940835e0
2026-03-09T00:04:18.850 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:18.847+0000 7f8591ffb700 1 -- 192.168.123.103:0/3447345447 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f858c064a20 con 0x7f85940835e0
2026-03-09T00:04:18.850 INFO:tasks.workunit.client.0.vm03.stdout:0/892: dwrite f0 [0,4194304] 0
2026-03-09T00:04:18.850 INFO:tasks.workunit.client.0.vm03.stdout:0/893: write d2/da/dd/d6e/f10b [245038,104196] 0
2026-03-09T00:04:18.852 INFO:tasks.workunit.client.0.vm03.stdout:9/821: rmdir d15/d1c/d36 39
2026-03-09T00:04:18.852 INFO:tasks.workunit.client.0.vm03.stdout:2/912: sync
2026-03-09T00:04:18.852 INFO:tasks.workunit.client.0.vm03.stdout:2/913: write d8/f59 [4456253,103894] 0
2026-03-09T00:04:18.858 INFO:tasks.workunit.client.0.vm03.stdout:7/813: rmdir d2/d4/d8c 39
2026-03-09T00:04:18.867 INFO:tasks.workunit.client.0.vm03.stdout:6/833: creat d13/d1e/d44/d10b/f11b x:0 0 0
2026-03-09T00:04:18.867 INFO:tasks.workunit.client.0.vm03.stdout:6/834: fdatasync d13/d1e/f34 0
2026-03-09T00:04:18.871 INFO:tasks.workunit.client.0.vm03.stdout:6/835: dread d13/d1e/d44/d59/dec/d62/f9a [0,4194304] 0
2026-03-09T00:04:18.871 INFO:tasks.workunit.client.0.vm03.stdout:6/836: fdatasync d13/d35/dff/f104 0
2026-03-09T00:04:18.871 INFO:tasks.workunit.client.0.vm03.stdout:6/837: write d13/d1e/f3e [1479034,22189] 0
2026-03-09T00:04:18.871 INFO:tasks.workunit.client.0.vm03.stdout:6/838: chown d13/d35/d71/d97/da5/ca8 812698 1
2026-03-09T00:04:18.871 INFO:tasks.workunit.client.0.vm03.stdout:6/839: creat d13/d35/d74/d89/f11c x:0 0 0
2026-03-09T00:04:18.872 INFO:tasks.workunit.client.0.vm03.stdout:6/840: dread d13/d1e/d44/d4a/d52/f54 [0,4194304] 0
2026-03-09T00:04:18.891 INFO:tasks.workunit.client.0.vm03.stdout:1/975: mknod d4/d3a/d3d/d98/dee/deb/c142 0
2026-03-09T00:04:18.900 INFO:tasks.workunit.client.0.vm03.stdout:5/905: creat d1c/d20/d55/d66/dc6/df1/f121 x:0 0 0
2026-03-09T00:04:18.903 INFO:tasks.workunit.client.0.vm03.stdout:5/906: dread d1c/d20/f25 [0,4194304] 0
2026-03-09T00:04:18.913 INFO:tasks.workunit.client.0.vm03.stdout:6/841: dwrite fb [0,4194304] 0
2026-03-09T00:04:18.927 INFO:tasks.workunit.client.0.vm03.stdout:4/981: creat d7/d20/d6a/dea/df6/d121/f132 x:0 0 0
2026-03-09T00:04:18.936 INFO:tasks.workunit.client.0.vm03.stdout:8/894: unlink d7/df/d1a/d40/d9d/df2/d38/d4c/c107 0
2026-03-09T00:04:18.936 INFO:tasks.workunit.client.0.vm03.stdout:0/894: symlink d2/da/d36/ddf/d12f/l143 0
2026-03-09T00:04:18.936 INFO:tasks.workunit.client.0.vm03.stdout:0/895: write d2/da/dd/d49/d6c/d4b/d55/d6f/fd4 [2591344,54128] 0
2026-03-09T00:04:18.944 INFO:tasks.workunit.client.0.vm03.stdout:2/914: link d8/d1b/d24/fb2 d8/d1b/d8f/f12b 0
2026-03-09T00:04:18.949 INFO:tasks.workunit.client.0.vm03.stdout:7/814: creat d2/d1f/d3a/d24/ff6 x:0 0 0
2026-03-09T00:04:18.963 INFO:tasks.workunit.client.0.vm03.stdout:1/976: truncate d4/d6/f127 426745 0
2026-03-09T00:04:18.963 INFO:tasks.workunit.client.0.vm03.stdout:1/977: fdatasync d4/d3a/d8f/ff1 0
2026-03-09T00:04:18.967 INFO:tasks.workunit.client.0.vm03.stdout:5/907: link d1c/d20/d55/d4f/d58/d73/d9e/c11c d1c/d20/d55/d66/d6b/de3/d109/c122 0
2026-03-09T00:04:18.967 INFO:tasks.workunit.client.0.vm03.stdout:5/908: dread - d1c/d20/d55/d66/d70/ff9 zero size
2026-03-09T00:04:18.967 INFO:tasks.workunit.client.0.vm03.stdout:5/909: read - d1c/d20/d55/d66/dc6/df1/f121 zero size
2026-03-09T00:04:18.968 INFO:tasks.workunit.client.0.vm03.stdout:6/842: mknod d13/d1e/d44/d59/dec/d62/df5/c11d 0
2026-03-09T00:04:18.976 INFO:tasks.workunit.client.0.vm03.stdout:8/895: dwrite d7/df/d1a/d40/f69 [0,4194304] 0
2026-03-09T00:04:18.977 INFO:tasks.workunit.client.0.vm03.stdout:3/732: rmdir d2/db/d56 39
2026-03-09T00:04:18.977 INFO:tasks.workunit.client.0.vm03.stdout:3/733: stat d2/db/d3b/d5f/da5/d72/d96/ce2 0
2026-03-09T00:04:18.977 INFO:tasks.workunit.client.0.vm03.stdout:3/734: write d2/db/d3b/d5f/da5/d72/f86 [1274114,5558] 0
2026-03-09T00:04:18.982 INFO:tasks.workunit.client.0.vm03.stdout:8/896: dread d7/df/d1a/d40/d9d/df2/d3f/d95/fb4 [0,4194304] 0
2026-03-09T00:04:19.000 INFO:tasks.workunit.client.0.vm03.stdout:4/982: symlink d7/d20/l133 0
2026-03-09T00:04:19.005 INFO:tasks.workunit.client.0.vm03.stdout:0/896: mknod d2/da/d36/ddf/df7/d107/c144 0
2026-03-09T00:04:19.010 INFO:tasks.workunit.client.0.vm03.stdout:9/822: getdents d15/d1c/d28/d6e/da2 0
2026-03-09T00:04:19.010 INFO:tasks.workunit.client.0.vm03.stdout:9/823: stat d15/d77/dfb 0
2026-03-09T00:04:19.025 INFO:tasks.workunit.client.0.vm03.stdout:2/915: creat d8/d1b/d6c/f12c x:0 0 0
2026-03-09T00:04:19.026 INFO:tasks.workunit.client.0.vm03.stdout:2/916: write d8/d1b/d2a/d6b/f8b [129052,11054] 0
2026-03-09T00:04:19.028 INFO:tasks.workunit.client.0.vm03.stdout:8/897: dwrite d7/df/d1a/d40/d9d/df2/dc3/fd3 [0,4194304] 0
2026-03-09T00:04:19.035 INFO:tasks.workunit.client.0.vm03.stdout:7/815: link d2/d4/ca3 d2/d1f/d3a/d24/da4/d46/d81/d96/d37/cf7 0
2026-03-09T00:04:19.036 INFO:tasks.workunit.client.0.vm03.stdout:1/978: link d4/d6/lca d4/d6/d52/l143 0
2026-03-09T00:04:19.036 INFO:tasks.workunit.client.0.vm03.stdout:1/979: fsync d4/d3a/d43/f5a 0
2026-03-09T00:04:19.044 INFO:tasks.workunit.client.0.vm03.stdout:7/816: read d2/d1f/d3a/d24/da4/d46/d54/f77 [2624587,107403] 0
2026-03-09T00:04:19.044 INFO:tasks.workunit.client.0.vm03.stdout:7/817: dread - d2/d1f/d3a/d24/da4/d46/d81/d96/d8e/ff1 zero size
2026-03-09T00:04:19.044 INFO:tasks.workunit.client.0.vm03.stdout:9/824: dwrite d15/d7f/f88 [0,4194304] 0
2026-03-09T00:04:19.046 INFO:tasks.workunit.client.0.vm03.stdout:5/910: symlink d1c/d51/df2/l123 0
2026-03-09T00:04:19.046 INFO:tasks.workunit.client.0.vm03.stdout:6/843: symlink d13/d35/db5/l11e 0
2026-03-09T00:04:19.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:19.055+0000 7f859b029700 1 -- 192.168.123.103:0/3447345447 --> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8580000bf0 con 0x7f857c077790
2026-03-09T00:04:19.057 INFO:tasks.workunit.client.0.vm03.stdout:4/983: link d7/l11 d7/d20/d6a/d77/d25/l134 0
2026-03-09T00:04:19.057 INFO:tasks.workunit.client.0.vm03.stdout:4/984: fsync d7/f28 0
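Taken together, the stderr lines above show what every short-lived ceph invocation does before its real work: connect to a mon, mon_subscribe to config and monmap (then mgrmap and osdmap), fetch get_command_descriptions, and only then send the actual command, here an mgr_command carrying "orch upgrade status" to the active mgr. A minimal sketch of the same mon round-trip through the python-rados binding (assumes a local ceph.conf and admin keyring; "versions" is the command this client is seen dispatching in the mon log further down):

# Hedged sketch: issue a mon command the way the CLI above does, via python-rados.
import json
import rados

cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")
cluster.connect()  # the msgr2 handshake and map subscriptions happen under the hood
try:
    # mon_command takes a JSON command string and an input buffer,
    # and returns (retcode, output bytes, status string)
    ret, outbuf, outs = cluster.mon_command(json.dumps({"prefix": "versions"}), b"")
    print(ret, outbuf.decode())
finally:
    cluster.shutdown()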
2026-03-09T00:04:19.057 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:19.058+0000 7f8591ffb700 1 -- 192.168.123.103:0/3447345447 <== mgr.24393 v2:192.168.123.103:6800/3123605642 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+370 (secure 0 0 0) 0x7f8580000bf0 con 0x7f857c077790
2026-03-09T00:04:19.058 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:04:19.058 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T00:04:19.058 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true,
2026-03-09T00:04:19.058 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading daemons of type(s) mgr",
2026-03-09T00:04:19.058 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [
2026-03-09T00:04:19.058 INFO:teuthology.orchestra.run.vm03.stdout: "mgr"
2026-03-09T00:04:19.058 INFO:teuthology.orchestra.run.vm03.stdout: ],
2026-03-09T00:04:19.058 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "2/2 daemons upgraded",
2026-03-09T00:04:19.058 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading node-exporter daemons",
2026-03-09T00:04:19.058 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false
2026-03-09T00:04:19.058 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:04:19.062 INFO:tasks.workunit.client.0.vm03.stdout:0/897: symlink d2/da/dd/d49/d6c/d4b/l145 0
2026-03-09T00:04:19.063 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:19.064+0000 7f857b7fe700 1 -- 192.168.123.103:0/3447345447 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f857c077790 msgr2=0x7f857c079c50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:19.064 INFO:tasks.workunit.client.0.vm03.stdout:2/917: mkdir d8/d26/d12d 0
2026-03-09T00:04:19.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:19.065+0000 7f857b7fe700 1 --2- 192.168.123.103:0/3447345447 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f857c077790 0x7f857c079c50 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f859412a360 tx=0x7f858400d040 comp rx=0 tx=0).stop
2026-03-09T00:04:19.064 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:19.065+0000 7f857b7fe700 1 -- 192.168.123.103:0/3447345447 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85940835e0 msgr2=0x7f859412e3f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:19.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:19.065+0000 7f857b7fe700 1 --2- 192.168.123.103:0/3447345447 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85940835e0 0x7f859412e3f0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f858c000f80 tx=0x7f858c003fa0 comp rx=0 tx=0).stop
2026-03-09T00:04:19.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:19.066+0000 7f857b7fe700 1 -- 192.168.123.103:0/3447345447 shutdown_connections
2026-03-09T00:04:19.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:19.066+0000 7f857b7fe700 1 --2- 192.168.123.103:0/3447345447 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f857c077790 0x7f857c079c50 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:19.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:19.066+0000 7f857b7fe700 1 --2- 192.168.123.103:0/3447345447 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8594072b50 0x7f85940830a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:19.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:19.066+0000 7f857b7fe700 1 --2- 192.168.123.103:0/3447345447 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85940835e0 0x7f859412e3f0 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:19.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:19.066+0000 7f857b7fe700 1 -- 192.168.123.103:0/3447345447 >> 192.168.123.103:0/3447345447 conn(0x7f859406dae0 msgr2=0x7f859406ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:04:19.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:19.066+0000 7f857b7fe700 1 -- 192.168.123.103:0/3447345447 shutdown_connections
2026-03-09T00:04:19.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:19.066+0000 7f857b7fe700 1 -- 192.168.123.103:0/3447345447 wait complete.
2026-03-09T00:04:19.070 INFO:tasks.workunit.client.0.vm03.stdout:8/898: rename d7/df/d1a/d40/f5e to d7/df/f10f 0
2026-03-09T00:04:19.078 INFO:tasks.workunit.client.0.vm03.stdout:1/980: creat d4/d15/d77/dce/dd9/f144 x:0 0 0
2026-03-09T00:04:19.078 INFO:tasks.workunit.client.0.vm03.stdout:7/818: mkdir d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/d9c/df8 0
2026-03-09T00:04:19.087 INFO:tasks.workunit.client.0.vm03.stdout:7/819: dread d2/d4/d1e/d78/fa5 [0,4194304] 0
2026-03-09T00:04:19.093 INFO:tasks.workunit.client.0.vm03.stdout:9/825: mknod d15/d1c/d21/d64/c118 0
2026-03-09T00:04:19.093 INFO:tasks.workunit.client.0.vm03.stdout:5/911: link d1c/d20/f25 d1c/d51/d6a/d75/df0/d10a/f124 0
2026-03-09T00:04:19.098 INFO:tasks.workunit.client.0.vm03.stdout:6/844: mknod d13/d35/d71/d97/da5/dc1/c11f 0
2026-03-09T00:04:19.141 INFO:tasks.workunit.client.0.vm03.stdout:3/735: dwrite d2/db/d40/d88/faa [0,4194304] 0
2026-03-09T00:04:19.141 INFO:tasks.workunit.client.0.vm03.stdout:3/736: fsync d2/db/d40/d44/f4d 0
2026-03-09T00:04:19.141 INFO:tasks.workunit.client.0.vm03.stdout:3/737: fdatasync d2/db/d40/d58/f7f 0
2026-03-09T00:04:19.141 INFO:tasks.workunit.client.0.vm03.stdout:3/738: write d2/db/d3b/d5f/da5/d72/fb0 [87417,119632] 0
2026-03-09T00:04:19.190 INFO:tasks.workunit.client.0.vm03.stdout:8/899: dread d7/df/d1a/d40/d9d/df2/d38/d60/dcd/fdd [0,4194304] 0
2026-03-09T00:04:19.191 INFO:tasks.workunit.client.0.vm03.stdout:7/820: truncate d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/fd5 162524 0
2026-03-09T00:04:19.194 INFO:tasks.workunit.client.0.vm03.stdout:9/826: mkdir d15/d1c/d28/d6e/da2/d119 0
2026-03-09T00:04:19.201 INFO:tasks.workunit.client.0.vm03.stdout:5/912: creat d1c/f125 x:0 0 0
2026-03-09T00:04:19.218 INFO:tasks.workunit.client.0.vm03.stdout:6/845: truncate d13/d35/d74/d89/db3/ff9 422297 0
2026-03-09T00:04:19.247 INFO:tasks.workunit.client.0.vm03.stdout:4/985: dwrite d7/d20/d6a/dea/d4e/f4f [0,4194304] 0
2026-03-09T00:04:19.247 INFO:tasks.workunit.client.0.vm03.stdout:4/986: write d7/d20/d6a/d77/d25/de2/f12c [1030538,66209] 0
2026-03-09T00:04:19.253 INFO:tasks.workunit.client.0.vm03.stdout:0/898: dwrite d2/da/dd/d49/d6c/f57 [0,4194304] 0
2026-03-09T00:04:19.315 INFO:tasks.workunit.client.0.vm03.stdout:2/918: mknod d8/d1b/d24/da5/dfe/c12e 0
2026-03-09T00:04:19.322 INFO:tasks.workunit.client.0.vm03.stdout:2/919: write d8/d1b/d24/da5/dda/def/d127/f1d [4171864,110453] 0
2026-03-09T00:04:19.322 INFO:tasks.workunit.client.0.vm03.stdout:2/920: fsync d8/f15 0
2026-03-09T00:04:19.327 INFO:tasks.workunit.client.0.vm03.stdout:2/921: write d8/f59 [973907,46316] 0
2026-03-09T00:04:19.327 INFO:tasks.workunit.client.0.vm03.stdout:2/922: read - d8/d1b/d2a/d2e/f124 zero size
2026-03-09T00:04:19.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:19 vm03.local ceph-mon[52346]: from='client.24431 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:04:19.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:19 vm03.local ceph-mon[52346]: Upgrade: Updating node-exporter.vm03 (1/2)
2026-03-09T00:04:19.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:19 vm03.local ceph-mon[52346]: Deploying daemon node-exporter.vm03 on vm03
2026-03-09T00:04:19.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:19 vm03.local ceph-mon[52346]: Standby manager daemon vm06.rzcvhn restarted
2026-03-09T00:04:19.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:19 vm03.local ceph-mon[52346]: Standby manager daemon vm06.rzcvhn started
2026-03-09T00:04:19.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:19 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.106:0/1912083941' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/crt"}]: dispatch
2026-03-09T00:04:19.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:19 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.106:0/1912083941' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-09T00:04:19.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:19 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.106:0/1912083941' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/key"}]: dispatch
2026-03-09T00:04:19.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:19 vm03.local ceph-mon[52346]: from='mgr.? 192.168.123.106:0/1912083941' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-09T00:04:19.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:19 vm03.local ceph-mon[52346]: from='client.? 192.168.123.103:0/2739134642' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:19.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:19 vm03.local ceph-mon[52346]: from='client.24439 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:04:19.352 INFO:tasks.workunit.client.0.vm03.stdout:2/923: dwrite d8/d26/d5e/d6f/f11a [0,4194304] 0
2026-03-09T00:04:19.356 INFO:tasks.workunit.client.0.vm03.stdout:2/924: dread d8/d1b/d24/da5/dc9/fe8 [0,4194304] 0
2026-03-09T00:04:19.377 INFO:tasks.workunit.client.0.vm03.stdout:3/739: truncate d2/db/f14 7110986 0
2026-03-09T00:04:19.397 INFO:tasks.workunit.client.0.vm03.stdout:7/821: getdents d2/d4/db7 0
2026-03-09T00:04:19.401 INFO:tasks.workunit.client.0.vm03.stdout:9/827: mkdir d15/d1c/d21/d64/d11a 0
2026-03-09T00:04:19.406 INFO:tasks.workunit.client.0.vm03.stdout:5/913: rmdir d1c/d20/d55/d4f/d58/d73 39
2026-03-09T00:04:19.409 INFO:tasks.workunit.client.0.vm03.stdout:9/828: write d15/d1c/d21/d54/f73 [65764,123100] 0
2026-03-09T00:04:19.416 INFO:tasks.workunit.client.0.vm03.stdout:9/829: dread d15/d1c/d21/d75/fa6 [0,4194304] 0
2026-03-09T00:04:19.417 INFO:tasks.workunit.client.0.vm03.stdout:6/846: symlink d13/d1e/d44/d4a/l120 0
2026-03-09T00:04:19.421 INFO:tasks.workunit.client.0.vm03.stdout:4/987: creat d7/d20/d6a/f135 x:0 0 0
2026-03-09T00:04:19.423 INFO:tasks.workunit.client.0.vm03.stdout:0/899: getdents d2/da/d36/ddf 0
2026-03-09T00:04:19.435 INFO:tasks.workunit.client.0.vm03.stdout:2/925: truncate d8/d1b/d2a/fbb 1106399 0
2026-03-09T00:04:19.438 INFO:tasks.workunit.client.0.vm03.stdout:9/830: truncate d15/d1c/d36/f3a 2700562 0
2026-03-09T00:04:19.439 INFO:tasks.workunit.client.0.vm03.stdout:0/900: mkdir d2/da/dd/d49/d6c/d4b/d55/d6f/d146 0
2026-03-09T00:04:19.439 INFO:tasks.workunit.client.0.vm03.stdout:0/901: chown d2/da/dd/d49/d6c/da6/dda/db5/fb6 50033 1
2026-03-09T00:04:19.439 INFO:tasks.workunit.client.0.vm03.stdout:0/902: chown d2/da/d1a/fd5 1332603 1
2026-03-09T00:04:19.441 INFO:tasks.workunit.client.0.vm03.stdout:3/740: mkdir d2/db/de6 0
2026-03-09T00:04:19.451 INFO:tasks.workunit.client.0.vm03.stdout:7/822: dwrite d2/f73 [0,4194304] 0
2026-03-09T00:04:19.453 INFO:tasks.workunit.client.0.vm03.stdout:3/741: read d2/db/d3b/f3e [1567861,80170] 0
2026-03-09T00:04:19.465 INFO:tasks.workunit.client.0.vm03.stdout:7/823: symlink d2/d4/db7/d67/lf9 0
2026-03-09T00:04:19.471 INFO:tasks.workunit.client.0.vm03.stdout:8/900: rename d7/df/d1a/d40/d9d/df2/d38/f85 to d7/df/d1a/d40/d9d/da3/dd2/f110 0
2026-03-09T00:04:19.472 INFO:tasks.workunit.client.0.vm03.stdout:6/847: rename d13/d1e/c20 to d13/d1e/d44/d4a/d52/c121 0
2026-03-09T00:04:19.472 INFO:tasks.workunit.client.0.vm03.stdout:4/988: rename d7/d6f/dcf/de8/dee to d7/d6f/dcf/de8/dee/d136 22
2026-03-09T00:04:19.472 INFO:tasks.workunit.client.0.vm03.stdout:4/989: stat d7/d20/d6a/dea/d38/dfb 0
2026-03-09T00:04:19.474 INFO:tasks.workunit.client.0.vm03.stdout:8/901: symlink d7/df/d1a/d40/d9d/df2/d38/d91/l111 0
2026-03-09T00:04:19.474 INFO:tasks.workunit.client.0.vm03.stdout:7/824: dread d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/fd5 [0,4194304] 0
2026-03-09T00:04:19.474 INFO:tasks.workunit.client.0.vm03.stdout:7/825: dread - d2/d1f/d3a/d24/da4/d46/d81/d96/d37/d39/d6e/fed zero size
2026-03-09T00:04:19.474 INFO:tasks.workunit.client.0.vm03.stdout:7/826: fsync d2/d1f/d3a/d24/da4/d46/d81/f8f 0
2026-03-09T00:04:19.479 INFO:tasks.workunit.client.0.vm03.stdout:1/981: sync
2026-03-09T00:04:19.487 INFO:tasks.workunit.client.0.vm03.stdout:3/742: rename d2/db/d3b/d5f/da5/f6e to d2/db/d40/d51/fe7 0
2026-03-09T00:04:19.505 INFO:tasks.workunit.client.0.vm03.stdout:7/827: link d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/d9c/f74 d2/d1f/d3a/d24/da4/d46/d81/d96/d88/ffa 0
2026-03-09T00:04:19.505 INFO:tasks.workunit.client.0.vm03.stdout:1/982: getdents d4/d3a/d3d/d98/d11d 0
2026-03-09T00:04:19.505 INFO:tasks.workunit.client.0.vm03.stdout:5/914: dwrite fe [0,4194304] 0
2026-03-09T00:04:19.523 INFO:tasks.workunit.client.0.vm03.stdout:4/990: link d7/c75 d7/d20/d6a/d77/d25/c137 0
2026-03-09T00:04:19.523 INFO:tasks.workunit.client.0.vm03.stdout:4/991: dread - d7/d20/d6a/dea/fd7 zero size
2026-03-09T00:04:19.523 INFO:tasks.workunit.client.0.vm03.stdout:4/992: chown d7/d20/d35 86 1
2026-03-09T00:04:19.524 INFO:tasks.workunit.client.0.vm03.stdout:9/831: dwrite d15/d1c/d28/f29 [0,4194304] 0
2026-03-09T00:04:19.525 INFO:tasks.workunit.client.0.vm03.stdout:9/832: write d15/d1c/d28/dda/f10a [204335,54885] 0
2026-03-09T00:04:19.530 INFO:tasks.workunit.client.0.vm03.stdout:9/833: creat d15/d1c/d21/d54/dab/df6/d10e/f11b x:0 0 0
2026-03-09T00:04:19.530 INFO:tasks.workunit.client.0.vm03.stdout:4/993: getdents d7/d20/d6a/d77/d25/de2 0
2026-03-09T00:04:19.532 INFO:tasks.workunit.client.0.vm03.stdout:9/834: mkdir d15/d1c/d28/dda/d11c 0
2026-03-09T00:04:19.541 INFO:tasks.workunit.client.0.vm03.stdout:4/994: dread d7/d27/dc9/fd2 [4194304,4194304] 0
2026-03-09T00:04:19.542 INFO:tasks.workunit.client.0.vm03.stdout:2/926: dwrite d8/d1b/d2a/d2e/fd9 [0,4194304] 0
2026-03-09T00:04:19.542 INFO:tasks.workunit.client.0.vm03.stdout:6/848: dwrite d13/d35/d71/d97/da5/db1/f108 [0,4194304] 0
2026-03-09T00:04:19.542 INFO:tasks.workunit.client.0.vm03.stdout:6/849: write f10 [438453,88321] 0
2026-03-09T00:04:19.544 INFO:tasks.workunit.client.0.vm03.stdout:4/995: creat d7/d27/f138 x:0 0 0
2026-03-09T00:04:19.545 INFO:tasks.workunit.client.0.vm03.stdout:2/927: unlink d8/f15 0
2026-03-09T00:04:19.547 INFO:tasks.workunit.client.0.vm03.stdout:4/996: mkdir d7/d20/d6a/dea/d54/d139 0
2026-03-09T00:04:19.547 INFO:tasks.workunit.client.0.vm03.stdout:4/997: chown d7/d20/d6a/d77/db7/fa3 1026319 1
2026-03-09T00:04:19.549 INFO:tasks.workunit.client.0.vm03.stdout:6/850: read d13/d1e/d44/d59/d77/f96 [2939587,31733] 0
2026-03-09T00:04:19.549 INFO:tasks.workunit.client.0.vm03.stdout:6/851: truncate d13/d1e/d44/d59/dec/d62/fdb 44713 0
2026-03-09T00:04:19.550 INFO:tasks.workunit.client.0.vm03.stdout:6/852: rename d13/dc4/fd9 to d13/d1e/d44/d59/dec/f122 0
2026-03-09T00:04:19.572 INFO:tasks.workunit.client.0.vm03.stdout:4/998: dread d7/d20/d6a/dea/d38/f8f [0,4194304] 0
2026-03-09T00:04:19.573 INFO:tasks.workunit.client.0.vm03.stdout:8/902: dwrite d7/df/d1a/d40/d9d/df2/d38/d91/fa5 [4194304,4194304] 0
2026-03-09T00:04:19.595 INFO:tasks.workunit.client.0.vm03.stdout:0/903: write d2/da/d76/fa1 [519403,33540] 0
2026-03-09T00:04:19.611 INFO:tasks.workunit.client.0.vm03.stdout:0/904: mkdir d2/da/dd/d49/d6c/da6/dda/db5/dba/d147 0
2026-03-09T00:04:19.611 INFO:tasks.workunit.client.0.vm03.stdout:0/905: readlink d2/da/dd/d49/d6c/da6/l118 0
2026-03-09T00:04:19.611 INFO:tasks.workunit.client.0.vm03.stdout:0/906: chown d2/da/d1a/l5e 461 1
2026-03-09T00:04:19.611 INFO:tasks.workunit.client.0.vm03.stdout:0/907: chown d2/l86 8302 1
2026-03-09T00:04:19.614 INFO:tasks.workunit.client.0.vm03.stdout:0/908: write d2/da/d1a/f25 [121172,101131] 0
2026-03-09T00:04:19.616 INFO:tasks.workunit.client.0.vm03.stdout:0/909: rmdir d2/da/d76/d8a/d8f/db8 39
2026-03-09T00:04:19.653 INFO:tasks.workunit.client.0.vm03.stdout:5/915: dwrite d1c/d20/d55/d4f/d58/d73/d76/d91/fa2 [0,4194304] 0
2026-03-09T00:04:19.670 INFO:tasks.workunit.client.0.vm03.stdout:6/853: dwrite d13/d35/fd4 [0,4194304] 0
2026-03-09T00:04:19.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:19 vm06.local ceph-mon[58395]: from='client.24431 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:04:19.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:19 vm06.local ceph-mon[58395]: Upgrade: Updating node-exporter.vm03 (1/2)
2026-03-09T00:04:19.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:19 vm06.local ceph-mon[58395]: Deploying daemon node-exporter.vm03 on vm03
2026-03-09T00:04:19.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:19 vm06.local ceph-mon[58395]: Standby manager daemon vm06.rzcvhn restarted
2026-03-09T00:04:19.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:19 vm06.local ceph-mon[58395]: Standby manager daemon vm06.rzcvhn started
2026-03-09T00:04:19.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:19 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.106:0/1912083941' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/crt"}]: dispatch
2026-03-09T00:04:19.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:19 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.106:0/1912083941' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-09T00:04:19.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:19 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.106:0/1912083941' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/key"}]: dispatch
2026-03-09T00:04:19.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:19 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.106:0/1912083941' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-09T00:04:19.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:19 vm06.local ceph-mon[58395]: from='client.? 192.168.123.103:0/2739134642' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:19.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:19 vm06.local ceph-mon[58395]: from='client.24439 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:04:19.671 INFO:tasks.workunit.client.0.vm03.stdout:0/910: dwrite d2/da/dd/d49/d6c/d4b/f13a [0,4194304] 0
2026-03-09T00:04:19.671 INFO:tasks.workunit.client.0.vm03.stdout:0/911: write d2/da/dd/d49/d6c/d4b/ffe [623329,54822] 0
2026-03-09T00:04:19.672 INFO:tasks.workunit.client.0.vm03.stdout:6/854: dread d13/d1e/d44/d4a/d52/f54 [0,4194304] 0
2026-03-09T00:04:19.673 INFO:tasks.workunit.client.0.vm03.stdout:6/855: dread d13/d1e/d44/d4a/d52/f54 [0,4194304] 0
2026-03-09T00:04:19.673 INFO:tasks.workunit.client.0.vm03.stdout:6/856: chown d13/d1e/d44/d4a/d52/f6d 29604 1
2026-03-09T00:04:19.673 INFO:tasks.workunit.client.0.vm03.stdout:6/857: chown d13/d35 0 1
2026-03-09T00:04:19.679 INFO:tasks.workunit.client.0.vm03.stdout:6/858: dread d13/d1e/d44/d59/f6c [0,4194304] 0
2026-03-09T00:04:19.679 INFO:tasks.workunit.client.0.vm03.stdout:6/859: creat d13/d1e/d44/d10b/f123 x:0 0 0
2026-03-09T00:04:19.680 INFO:tasks.workunit.client.0.vm03.stdout:7/828: dwrite d2/d1f/d3a/d24/ff6 [0,4194304] 0
2026-03-09T00:04:19.680 INFO:tasks.workunit.client.0.vm03.stdout:7/829: chown d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/d9c/f49 54747 1
2026-03-09T00:04:19.681 INFO:tasks.workunit.client.0.vm03.stdout:8/903: symlink d7/df/d1a/l112 0
2026-03-09T00:04:19.681 INFO:tasks.workunit.client.0.vm03.stdout:8/904: creat d7/df/d1a/d40/d58/f113 x:0 0 0
2026-03-09T00:04:19.686 INFO:tasks.workunit.client.0.vm03.stdout:5/916: rmdir d1c/d20/d55/dac 39
2026-03-09T00:04:19.686 INFO:tasks.workunit.client.0.vm03.stdout:5/917: truncate d1c/d20/d55/d66/d70/f8c 997914 0
2026-03-09T00:04:19.690 INFO:tasks.workunit.client.0.vm03.stdout:6/860: truncate d13/f3a 3480836 0
2026-03-09T00:04:19.692 INFO:tasks.workunit.client.0.vm03.stdout:6/861: write d13/d1e/d44/d59/f6c [555484,24505] 0
2026-03-09T00:04:19.692 INFO:tasks.workunit.client.0.vm03.stdout:6/862: write d13/f1d [2945560,8771] 0
2026-03-09T00:04:19.692 INFO:tasks.workunit.client.0.vm03.stdout:6/863: creat d13/d35/d71/d97/da5/db1/f124 x:0 0 0
2026-03-09T00:04:19.692 INFO:tasks.workunit.client.0.vm03.stdout:6/864: write d13/d1e/d44/d59/f6c [1636641,64335] 0
2026-03-09T00:04:19.694 INFO:tasks.workunit.client.0.vm03.stdout:7/830: link d2/d4/ccf d2/d1f/d3a/d24/da4/d46/d81/d96/d88/cfb 0
2026-03-09T00:04:19.694 INFO:tasks.workunit.client.0.vm03.stdout:7/831: chown d2/d1f/d3a/d24/da4/d46/d81/d96/d8e 4649335 1
2026-03-09T00:04:19.695 INFO:tasks.workunit.client.0.vm03.stdout:4/999: dwrite d7/d20/d6a/d77/f82 [4194304,4194304] 0
2026-03-09T00:04:19.695 INFO:tasks.workunit.client.0.vm03.stdout:7/832: dread d2/d1f/d3a/d24/da4/d46/d81/d96/fbb [0,4194304] 0
2026-03-09T00:04:19.702 INFO:tasks.workunit.client.0.vm03.stdout:0/912: dwrite d2/da/dd/d49/d6c/da6/dda/db5/fb6 [0,4194304] 0
2026-03-09T00:04:19.702 INFO:tasks.workunit.client.0.vm03.stdout:0/913: write d2/da/d4e/f120 [4166436,10166] 0
2026-03-09T00:04:19.706 INFO:tasks.workunit.client.0.vm03.stdout:8/905: mkdir d7/df/d1a/d40/d9d/df2/d38/d4c/d114 0
2026-03-09T00:04:19.710 INFO:tasks.workunit.client.0.vm03.stdout:5/918: link d1c/d20/d55/d66/dc6/l115 d1c/d20/d55/d4f/d58/d73/d76/d8e/l126 0
2026-03-09T00:04:19.710 INFO:tasks.workunit.client.0.vm03.stdout:5/919: chown d1c/d20/d55/d4f/d58/db5/df7 3404932 1
2026-03-09T00:04:19.710 INFO:tasks.workunit.client.0.vm03.stdout:5/920: write d1c/d20/d55/fbc [243252,33299] 0
2026-03-09T00:04:19.730 INFO:tasks.workunit.client.0.vm03.stdout:2/928: write d8/d1b/d2a/d6b/fcf [16769,38224] 0
2026-03-09T00:04:19.745 INFO:tasks.workunit.client.0.vm03.stdout:7/833: creat d2/d4/d1e/d5e/d7e/ffc x:0 0 0
2026-03-09T00:04:19.750 INFO:tasks.workunit.client.0.vm03.stdout:7/834: write d2/d1f/d3a/d24/da4/d46/d81/d96/d37/f56 [5892707,44182] 0
2026-03-09T00:04:19.750 INFO:tasks.workunit.client.0.vm03.stdout:7/835: fdatasync d2/d4/db7/d67/d6b/fbe 0
2026-03-09T00:04:19.765 INFO:tasks.workunit.client.0.vm03.stdout:5/921: symlink d1c/d20/d56/db4/df3/d11e/l127 0
2026-03-09T00:04:19.770 INFO:tasks.workunit.client.0.vm03.stdout:2/929: mkdir d8/d1b/d2a/d6b/d12f 0
2026-03-09T00:04:19.771 INFO:tasks.workunit.client.0.vm03.stdout:2/930: chown d8/d26/d5e/d6f 1821794 1
2026-03-09T00:04:19.778 INFO:tasks.workunit.client.0.vm03.stdout:0/914: dread d2/fb [0,4194304] 0
2026-03-09T00:04:19.778 INFO:tasks.workunit.client.0.vm03.stdout:0/915: write d2/fe [3693815,53799] 0
2026-03-09T00:04:19.778 INFO:tasks.workunit.client.0.vm03.stdout:6/865: dwrite d13/d35/d71/d97/da5/db1/f108 [4194304,4194304] 0
2026-03-09T00:04:19.781 INFO:tasks.workunit.client.0.vm03.stdout:0/916: mkdir d2/da/dd/d49/d148 0
2026-03-09T00:04:19.781 INFO:tasks.workunit.client.0.vm03.stdout:0/917: creat d2/da/dd/d49/d6c/da6/dcf/f149 x:0 0 0
2026-03-09T00:04:19.782 INFO:tasks.workunit.client.0.vm03.stdout:0/918: dread d2/da/dd/d49/d6c/da6/dda/db5/dba/fbc [0,4194304] 0
2026-03-09T00:04:19.782 INFO:tasks.workunit.client.0.vm03.stdout:0/919: readlink d2/da/d36/l7a 0
2026-03-09T00:04:19.782 INFO:tasks.workunit.client.0.vm03.stdout:0/920: chown d2/da/dd/d6e/lb9 39157873 1
2026-03-09T00:04:19.782 INFO:tasks.workunit.client.0.vm03.stdout:0/921: truncate d2/da/d36/ddf/df7/d12a/f12c 657857 0
2026-03-09T00:04:19.782 INFO:tasks.workunit.client.0.vm03.stdout:0/922: readlink d2/da/d76/d8a/d8f/l123 0
2026-03-09T00:04:19.782 INFO:tasks.workunit.client.0.vm03.stdout:0/923: chown d2/da/d1a/fd5 222718 1
2026-03-09T00:04:19.782 INFO:tasks.workunit.client.0.vm03.stdout:0/924: readlink d2/da/dd/d6e/lb9 0
2026-03-09T00:04:19.787 INFO:tasks.workunit.client.0.vm03.stdout:8/906: dwrite d7/df/d1a/d40/fb5 [0,4194304] 0
2026-03-09T00:04:19.787 INFO:tasks.workunit.client.0.vm03.stdout:8/907: truncate d7/df/d1a/d40/d9d/df2/dc3/fd3 5059285 0
2026-03-09T00:04:19.811 INFO:tasks.workunit.client.0.vm03.stdout:0/925: creat d2/da/dd/d49/f14a x:0 0 0
2026-03-09T00:04:19.811 INFO:tasks.workunit.client.0.vm03.stdout:0/926: chown d2/da/dd/f75 30491 1
2026-03-09T00:04:19.811 INFO:tasks.workunit.client.0.vm03.stdout:8/908: truncate d7/df/d1a/d40/d58/fb6 785446 0
2026-03-09T00:04:19.842 INFO:tasks.workunit.client.0.vm03.stdout:3/743: sync
2026-03-09T00:04:19.842 INFO:tasks.workunit.client.0.vm03.stdout:3/744: fsync d2/db/d56/fd6 0
2026-03-09T00:04:19.844 INFO:tasks.workunit.client.0.vm03.stdout:9/835: sync
2026-03-09T00:04:19.844 INFO:tasks.workunit.client.0.vm03.stdout:1/983: sync
2026-03-09T00:04:19.848 INFO:tasks.workunit.client.0.vm03.stdout:0/927: dwrite d2/da/d1a/f25 [4194304,4194304] 0
2026-03-09T00:04:19.848 INFO:tasks.workunit.client.0.vm03.stdout:0/928: stat f0 0
2026-03-09T00:04:19.856 INFO:tasks.workunit.client.0.vm03.stdout:3/745: mknod d2/db/d40/d51/da2/ce8 0
2026-03-09T00:04:19.862 INFO:tasks.workunit.client.0.vm03.stdout:0/929: symlink d2/da/d36/l14b 0
2026-03-09T00:04:19.862 INFO:tasks.workunit.client.0.vm03.stdout:0/930: chown d2/da/d36/ddf/l125 82 1
2026-03-09T00:04:19.868 INFO:tasks.workunit.client.0.vm03.stdout:0/931: mkdir d2/da/dd/d49/d6c/da6/dcf/d14c 0
2026-03-09T00:04:19.920 INFO:tasks.workunit.client.0.vm03.stdout:9/836: dwrite d15/d1c/d21/fea [0,4194304] 0
2026-03-09T00:04:19.922 INFO:tasks.workunit.client.0.vm03.stdout:9/837: mkdir d15/d1c/d36/d4d/d11d 0
2026-03-09T00:04:19.925 INFO:tasks.workunit.client.0.vm03.stdout:9/838: dread d15/f17 [0,4194304] 0
2026-03-09T00:04:19.925 INFO:tasks.workunit.client.0.vm03.stdout:9/839: fsync d15/d1c/d36/fb1 0
2026-03-09T00:04:19.932 INFO:tasks.workunit.client.0.vm03.stdout:0/932: dwrite d2/f22 [0,4194304] 0
2026-03-09T00:04:19.933 INFO:tasks.workunit.client.0.vm03.stdout:1/984: dwrite d4/d6/f6e [0,4194304] 0
2026-03-09T00:04:19.933 INFO:tasks.workunit.client.0.vm03.stdout:1/985: fsync d4/d3a/f2c 0
2026-03-09T00:04:19.933 INFO:tasks.workunit.client.0.vm03.stdout:0/933: symlink d2/da/dd/d49/d6c/da6/dda/db5/dba/d147/l14d 0
2026-03-09T00:04:19.939 INFO:tasks.workunit.client.0.vm03.stdout:1/986: dread d4/d3a/d43/f5a [0,4194304] 0
2026-03-09T00:04:19.939 INFO:tasks.workunit.client.0.vm03.stdout:1/987: fdatasync d4/d3a/d8f/d104/d117/f133 0
2026-03-09T00:04:19.940 INFO:tasks.workunit.client.0.vm03.stdout:1/988: dread d4/d3a/f2c [0,4194304] 0
2026-03-09T00:04:19.940 INFO:tasks.workunit.client.0.vm03.stdout:1/989: getdents d4/d3a/d3d/d98/dee/d9e/def 0
2026-03-09T00:04:19.941 INFO:tasks.workunit.client.0.vm03.stdout:1/990: unlink d4/d3a/d3d/d98/dee/d9e/d12e/c9d 0
2026-03-09T00:04:19.941 INFO:tasks.workunit.client.0.vm03.stdout:1/991: write d4/d3a/d32/d87/fd5 [1327545,74522] 0
2026-03-09T00:04:19.942 INFO:tasks.workunit.client.0.vm03.stdout:0/934: write d2/da/d1a/f25 [7682589,83871] 0
2026-03-09T00:04:19.950 INFO:tasks.workunit.client.0.vm03.stdout:1/992: rmdir d4/d3a/d3d/d46/d11b 0
2026-03-09T00:04:19.950 INFO:tasks.workunit.client.0.vm03.stdout:0/935: mknod d2/da/dd/d49/d6c/da6/dda/db5/dba/c14e 0
2026-03-09T00:04:19.950 INFO:tasks.workunit.client.0.vm03.stdout:1/993: link d4/d3a/d3d/d98/dee/deb/c109 d4/d3a/d32/da1/c145 0
2026-03-09T00:04:19.950 INFO:tasks.workunit.client.0.vm03.stdout:1/994: write d4/d3a/d8f/ff1 [318967,109883] 0
2026-03-09T00:04:19.950 INFO:tasks.workunit.client.0.vm03.stdout:1/995: stat d4/d15/d77/d8c/cb4 0
2026-03-09T00:04:19.950 INFO:tasks.workunit.client.0.vm03.stdout:0/936: rename d2/da/d76/d8a/f135 to d2/da/dd/d49/d6c/da6/f14f 0
2026-03-09T00:04:19.950 INFO:tasks.workunit.client.0.vm03.stdout:0/937: chown d2/da/dd/d49/d6c/da6/dda/c106 4617516 1
2026-03-09T00:04:19.951 INFO:tasks.workunit.client.0.vm03.stdout:8/909: dwrite d7/df/d1a/d40/d58/fb6 [0,4194304] 0
2026-03-09T00:04:19.951 INFO:tasks.workunit.client.0.vm03.stdout:8/910: write d7/df/d1a/d40/db3/f75 [744416,45611] 0
2026-03-09T00:04:19.952 INFO:tasks.workunit.client.0.vm03.stdout:1/996: creat d4/d15/d5c/d6c/f146 x:0 0 0
2026-03-09T00:04:19.957 INFO:tasks.workunit.client.0.vm03.stdout:0/938: dread d2/da/dd/d49/d6c/d4b/daf/f10d [0,4194304] 0
2026-03-09T00:04:19.961 INFO:tasks.workunit.client.0.vm03.stdout:9/840: dwrite d15/d1c/d36/d4d/dc4/fff [0,4194304] 0
2026-03-09T00:04:19.968 INFO:tasks.workunit.client.0.vm03.stdout:0/939: dread d2/da/d76/d8a/d8f/db8/f112 [0,4194304] 0
2026-03-09T00:04:19.968 INFO:tasks.workunit.client.0.vm03.stdout:0/940: chown d2/da/dd/l98 894960965 1
2026-03-09T00:04:19.977 INFO:tasks.workunit.client.0.vm03.stdout:5/922: sync
2026-03-09T00:04:19.977 INFO:tasks.workunit.client.0.vm03.stdout:2/931: sync
2026-03-09T00:04:19.977 INFO:tasks.workunit.client.0.vm03.stdout:6/866: sync
2026-03-09T00:04:19.977 INFO:tasks.workunit.client.0.vm03.stdout:7/836: sync
2026-03-09T00:04:19.979 INFO:tasks.workunit.client.0.vm03.stdout:3/746: sync
2026-03-09T00:04:19.979 INFO:tasks.workunit.client.0.vm03.stdout:3/747: chown d2/db/d3b/d5f/da5/d72/dbd/cc4 40 1
2026-03-09T00:04:19.983 INFO:tasks.workunit.client.0.vm03.stdout:3/748: dread d2/db/d3b/d5f/da5/d72/f7a [0,4194304] 0
2026-03-09T00:04:19.984 INFO:tasks.workunit.client.0.vm03.stdout:8/911: mknod d7/df/d1a/d40/d9d/df2/d3f/df1/c115 0
2026-03-09T00:04:19.984 INFO:tasks.workunit.client.0.vm03.stdout:8/912: fdatasync d7/df/d1a/d40/d9d/df2/d3f/f47 0
2026-03-09T00:04:19.991 INFO:tasks.workunit.client.0.vm03.stdout:9/841: creat d15/d77/f11e x:0 0 0
2026-03-09T00:04:19.992 INFO:tasks.workunit.client.0.vm03.stdout:1/997: dwrite d4/d3a/f4d [8388608,4194304] 0
2026-03-09T00:04:19.992 INFO:tasks.workunit.client.0.vm03.stdout:1/998: chown d4/d3a/d32/cda 4242 1
2026-03-09T00:04:19.996 INFO:tasks.workunit.client.0.vm03.stdout:9/842: dread d15/d1c/d21/d54/f80 [0,4194304] 0
2026-03-09T00:04:19.996 INFO:tasks.workunit.client.0.vm03.stdout:9/843: write d15/d1c/fb2 [480069,76946] 0
2026-03-09T00:04:19.996 INFO:tasks.workunit.client.0.vm03.stdout:1/999: dread d4/d15/d77/ff8 [0,4194304] 0
2026-03-09T00:04:19.997 INFO:tasks.workunit.client.0.vm03.stdout:0/941: symlink d2/da/dd/d49/l150 0
2026-03-09T00:04:19.997 INFO:tasks.workunit.client.0.vm03.stdout:0/942: fsync d2/fcd 0
2026-03-09T00:04:19.997 INFO:tasks.workunit.client.0.vm03.stdout:2/932: mknod d8/d1b/d6c/c130 0
2026-03-09T00:04:19.997 INFO:tasks.workunit.client.0.vm03.stdout:9/844: read d15/d1c/d36/f6d [294143,168] 0
2026-03-09T00:04:19.999 INFO:tasks.workunit.client.0.vm03.stdout:5/923: truncate d1c/d20/d55/d4f/d58/d73/d9e/fe4 567092 0
2026-03-09T00:04:20.005 INFO:tasks.workunit.client.0.vm03.stdout:6/867: mkdir d13/d1e/d44/d59/d77/d114/d125 0
2026-03-09T00:04:20.006 INFO:tasks.workunit.client.0.vm03.stdout:7/837: creat d2/d1f/d3a/d24/da4/d46/d81/d96/d8e/db2/ffd x:0 0 0
2026-03-09T00:04:20.006 INFO:tasks.workunit.client.0.vm03.stdout:7/838: getdents d2/d1f/dc6 0
2026-03-09T00:04:20.007 INFO:tasks.workunit.client.0.vm03.stdout:8/913: creat d7/df/d1a/d40/dc8/d101/f116 x:0 0 0
2026-03-09T00:04:20.007 INFO:tasks.workunit.client.0.vm03.stdout:8/914: chown d7/df/d1a/d40/d9d/df2/d3f/d95/fb4 221 1
2026-03-09T00:04:20.007 INFO:tasks.workunit.client.0.vm03.stdout:8/915: chown d7/df/d1a/d40/d9d/df2/d3f/f59 2633102 1
2026-03-09T00:04:20.011 INFO:tasks.workunit.client.0.vm03.stdout:0/943: creat d2/da/d1a/f151 x:0 0 0
2026-03-09T00:04:20.017 INFO:tasks.workunit.client.0.vm03.stdout:2/933: mkdir d8/d26/d5e/d5f/ded/d131 0
2026-03-09T00:04:20.018 INFO:tasks.workunit.client.0.vm03.stdout:9/845: symlink d15/d1c/d21/db5/l11f 0
2026-03-09T00:04:20.018 INFO:tasks.workunit.client.0.vm03.stdout:9/846: dread - d15/d1c/d21/d54/d87/d93/dcf/fe6 zero size
2026-03-09T00:04:20.018 INFO:tasks.workunit.client.0.vm03.stdout:9/847: dread - d15/d1c/d21/d54/d87/fd6 zero size
2026-03-09T00:04:20.018 INFO:tasks.workunit.client.0.vm03.stdout:9/848: chown d15/d1c/d28/l2b 165679 1
2026-03-09T00:04:20.018 INFO:tasks.workunit.client.0.vm03.stdout:9/849: fsync d15/d1c/d21/d54/fdb 0
2026-03-09T00:04:20.022 INFO:tasks.workunit.client.0.vm03.stdout:6/868: link d13/d35/d71/d97/da5/db1/feb d13/d1e/d44/d59/d77/f126 0
2026-03-09T00:04:20.023 INFO:tasks.workunit.client.0.vm03.stdout:7/839: symlink d2/d1f/d3a/lfe 0
2026-03-09T00:04:20.033 INFO:tasks.workunit.client.0.vm03.stdout:2/934: unlink d8/d1b/d24/f86 0
2026-03-09T00:04:20.033 INFO:tasks.workunit.client.0.vm03.stdout:2/935: dread - d8/d1b/d2a/d6b/dc6/ff4 zero size
2026-03-09T00:04:20.033 INFO:tasks.workunit.client.0.vm03.stdout:2/936: fdatasync d8/d1b/d2a/d2e/f124 0
2026-03-09T00:04:20.033 INFO:tasks.workunit.client.0.vm03.stdout:6/869: getdents d13/d35/d71/d97/da5 0
2026-03-09T00:04:20.038 INFO:tasks.workunit.client.0.vm03.stdout:8/916: getdents d7/df 0
2026-03-09T00:04:20.044 INFO:tasks.workunit.client.0.vm03.stdout:2/937: symlink d8/d1b/d24/da5/dfe/d105/d108/d12a/l132 0
2026-03-09T00:04:20.045 INFO:tasks.workunit.client.0.vm03.stdout:2/938: mknod d8/d26/d5e/dc5/c133 0
2026-03-09T00:04:20.045 INFO:tasks.workunit.client.0.vm03.stdout:8/917: getdents d7/df/d1a/d40/d9d/da3/dd2 0
2026-03-09T00:04:20.045 INFO:tasks.workunit.client.0.vm03.stdout:3/749: dwrite d2/db/d56/fb4 [0,4194304] 0
2026-03-09T00:04:20.047 INFO:tasks.workunit.client.0.vm03.stdout:8/918: write d7/df/d1a/d40/d9d/df2/d3f/d95/fb4 [211448,28683] 0
2026-03-09T00:04:20.048 INFO:tasks.workunit.client.0.vm03.stdout:3/750: dread d2/f4e [0,4194304] 0
2026-03-09T00:04:20.049 INFO:tasks.workunit.client.0.vm03.stdout:3/751: mkdir d2/db/d40/d51/da2/de9 0
2026-03-09T00:04:20.049 INFO:tasks.workunit.client.0.vm03.stdout:3/752: chown d2/db/d40/f4a 189 1
2026-03-09T00:04:20.050 INFO:tasks.workunit.client.0.vm03.stdout:3/753: write d2/db/d6a/fa1 [97578,19727] 0
2026-03-09T00:04:20.088 INFO:tasks.workunit.client.0.vm03.stdout:8/919: dread d7/df/d1a/d40/d9d/df2/d3f/f7d [0,4194304] 0
2026-03-09T00:04:20.093 INFO:tasks.workunit.client.0.vm03.stdout:8/920: dread d7/df/d1a/d2b/f44 [0,4194304] 0
2026-03-09T00:04:20.094 INFO:tasks.workunit.client.0.vm03.stdout:8/921: truncate d7/df/d1a/d40/d9d/da3/dd2/fed 19227 0
2026-03-09T00:04:20.094 INFO:tasks.workunit.client.0.vm03.stdout:8/922: symlink d7/df/d1a/d40/d9d/l117 0
2026-03-09T00:04:20.094 INFO:tasks.workunit.client.0.vm03.stdout:8/923: readlink d7/df/d1a/d40/d9d/df2/d38/d4c/la2 0
2026-03-09T00:04:20.094 INFO:tasks.workunit.client.0.vm03.stdout:8/924: mknod d7/df/d1a/c118 0
2026-03-09T00:04:20.095 INFO:tasks.workunit.client.0.vm03.stdout:8/925: unlink d7/df/d1a/d40/d9d/df2/d38/d60/ca4 0
2026-03-09T00:04:20.095 INFO:tasks.workunit.client.0.vm03.stdout:8/926: write d7/df/d1a/d2b/ff4 [997653,99166] 0
2026-03-09T00:04:20.096 INFO:tasks.workunit.client.0.vm03.stdout:8/927: mknod d7/df/c119 0
2026-03-09T00:04:20.109 INFO:tasks.workunit.client.0.vm03.stdout:9/850: dwrite d15/d1c/d21/d54/f80 [0,4194304] 0
2026-03-09T00:04:20.109 INFO:tasks.workunit.client.0.vm03.stdout:9/851: chown d15/d1c/d21/d54/d87/d93/dcf/fe6 22 1
2026-03-09T00:04:20.109 INFO:tasks.workunit.client.0.vm03.stdout:0/944: dwrite d2/da/dd/d49/fa9 [0,4194304] 0
2026-03-09T00:04:20.109 INFO:tasks.workunit.client.0.vm03.stdout:0/945: write d2/da/dd/d49/d6c/da6/f14f [4747038,57238] 0
2026-03-09T00:04:20.112 INFO:tasks.workunit.client.0.vm03.stdout:0/946: symlink d2/da/d76/d8a/d116/l152 0
2026-03-09T00:04:20.112 INFO:tasks.workunit.client.0.vm03.stdout:0/947: stat d2/da/d76/fa1 0
2026-03-09T00:04:20.112 INFO:tasks.workunit.client.0.vm03.stdout:0/948: stat d2/da/dd/d49/d6c/d4b/d55/d6f/dad 0
2026-03-09T00:04:20.112 INFO:tasks.workunit.client.0.vm03.stdout:0/949: read - d2/da/d1a/fb0 zero size
2026-03-09T00:04:20.114 INFO:tasks.workunit.client.0.vm03.stdout:5/924: dwrite fe [4194304,4194304] 0
2026-03-09T00:04:20.114 INFO:tasks.workunit.client.0.vm03.stdout:5/925: readlink d1c/d20/d55/d4f/d58/db5/lbd 0
2026-03-09T00:04:20.117 INFO:tasks.workunit.client.0.vm03.stdout:5/926: link d1c/d20/d55/d4f/d58/d5d/c93 d1c/d20/c128 0
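The JSON reply above is the mgr's answer to "orch upgrade status": the mgr-only phase of the staggered upgrade is finished ("services_complete": ["mgr"], "progress": "2/2 daemons upgraded") and cephadm has moved on to node-exporter, which is what the mon log's "Upgrade: Updating node-exporter.vm03 (1/2)" lines report. A minimal polling sketch in the spirit of the test's wait loop, parsing the same JSON fields seen here (command and field names as printed in this log; the 30 s interval and upgrade_done helper are illustrative choices):

# Hedged sketch: poll `ceph orch upgrade status` and parse the JSON it prints,
# using the field names visible in the reply logged above.
import json
import subprocess
import time

def upgrade_done() -> bool:
    out = subprocess.run(["ceph", "orch", "upgrade", "status"],
                         capture_output=True, text=True, check=True).stdout
    status = json.loads(out)
    print(status.get("progress"), "/", status.get("message"))
    return not status["in_progress"]

while not upgrade_done():
    time.sleep(30)  # arbitrary polling interval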
2026-03-09T00:04:20.120 INFO:tasks.workunit.client.0.vm03.stdout:5/927: rename d1c/c31 to d1c/d20/d55/d66/d6b/d8f/c129 0
2026-03-09T00:04:20.120 INFO:tasks.workunit.client.0.vm03.stdout:5/928: fsync d1c/d20/d97/f110 0
2026-03-09T00:04:20.122 INFO:tasks.workunit.client.0.vm03.stdout:5/929: creat d1c/d20/d55/dac/f12a x:0 0 0
2026-03-09T00:04:20.122 INFO:tasks.workunit.client.0.vm03.stdout:5/930: truncate d1c/d51/d6a/d75/fd2 3129482 0
2026-03-09T00:04:20.125 INFO:tasks.workunit.client.0.vm03.stdout:7/840: dwrite d2/d1f/d3a/d24/da4/d46/d54/d8d/dad/d9c/fd7 [0,4194304] 0
2026-03-09T00:04:20.127 INFO:tasks.workunit.client.0.vm03.stdout:7/841: getdents d2/d4/d1e/d5e/daf 0
2026-03-09T00:04:20.139 INFO:tasks.workunit.client.0.vm03.stdout:7/842: mkdir d2/d1f/d3a/d24/da4/d46/d81/d96/d37/d39/dff 0
2026-03-09T00:04:20.139 INFO:tasks.workunit.client.0.vm03.stdout:7/843: creat d2/d4/f100 x:0 0 0
2026-03-09T00:04:20.139 INFO:tasks.workunit.client.0.vm03.stdout:7/844: symlink d2/d1f/d3a/l101 0
2026-03-09T00:04:20.139 INFO:tasks.workunit.client.0.vm03.stdout:7/845: link d2/d1f/d35/l66 d2/d1f/dc6/l102 0
2026-03-09T00:04:20.139 INFO:tasks.workunit.client.0.vm03.stdout:7/846: fdatasync d2/d1f/d3a/d24/da4/d46/d54/f9b 0
2026-03-09T00:04:20.139 INFO:tasks.workunit.client.0.vm03.stdout:7/847: creat d2/d4/d1e/d78/f103 x:0 0 0
2026-03-09T00:04:20.154 INFO:tasks.workunit.client.0.vm03.stdout:2/939: dwrite d8/d26/d5e/d5f/d95/faf [0,4194304] 0
2026-03-09T00:04:20.155 INFO:tasks.workunit.client.0.vm03.stdout:2/940: link d8/d26/d5e/d5f/f60 d8/d1b/d24/da5/dfe/f134 0
2026-03-09T00:04:20.155 INFO:tasks.workunit.client.0.vm03.stdout:2/941: write d8/d26/dfc/fff [459745,46183] 0
2026-03-09T00:04:20.155 INFO:tasks.workunit.client.0.vm03.stdout:2/942: getdents d8/d74/d128 0
2026-03-09T00:04:20.155 INFO:tasks.workunit.client.0.vm03.stdout:2/943: rename d8/d1b/d24/da5/dda/def/d127/f1a to d8/d1b/d2a/d2e/df5/f135 0
2026-03-09T00:04:20.164 INFO:tasks.workunit.client.0.vm03.stdout:2/944: write d8/d26/d5e/d5f/f60 [2286426,941] 0
2026-03-09T00:04:20.169 INFO:tasks.workunit.client.0.vm03.stdout:2/945: link d8/f9b d8/d1b/d2a/d2e/d9a/f136 0
2026-03-09T00:04:20.189 INFO:tasks.workunit.client.0.vm03.stdout:2/946: dread d8/d1b/d2a/f33 [0,4194304] 0
2026-03-09T00:04:20.189 INFO:tasks.workunit.client.0.vm03.stdout:2/947: write d8/fb [1989419,47797] 0
2026-03-09T00:04:20.189 INFO:tasks.workunit.client.0.vm03.stdout:2/948: creat d8/d26/d5e/d5f/ded/f137 x:0 0 0
2026-03-09T00:04:20.190 INFO:tasks.workunit.client.0.vm03.stdout:2/949: dread d8/d1b/d2a/d6b/f8b [0,4194304] 0
2026-03-09T00:04:20.195 INFO:tasks.workunit.client.0.vm03.stdout:2/950: write d8/d1b/d24/da5/dda/def/d127/f34 [7569269,84466] 0
2026-03-09T00:04:20.203 INFO:tasks.workunit.client.0.vm03.stdout:2/951: read d8/d1b/d24/da5/dc9/fe8 [1059207,79487] 0
2026-03-09T00:04:20.203 INFO:tasks.workunit.client.0.vm03.stdout:2/952: getdents d8/d26 0
2026-03-09T00:04:20.203 INFO:tasks.workunit.client.0.vm03.stdout:2/953: creat d8/d1b/d2a/d6b/d50/d8a/f138 x:0 0 0
2026-03-09T00:04:20.219 INFO:tasks.workunit.client.0.vm03.stdout:3/754: dwrite d2/db/d2d/f8b [0,4194304] 0
2026-03-09T00:04:20.221 INFO:tasks.workunit.client.0.vm03.stdout:3/755: dread d2/db/f26 [0,4194304] 0
2026-03-09T00:04:20.221 INFO:tasks.workunit.client.0.vm03.stdout:3/756: chown d2/db/d40/d51/fe4 35265 1
2026-03-09T00:04:20.222 INFO:tasks.workunit.client.0.vm03.stdout:3/757: symlink d2/db/d40/d88/lea 0
2026-03-09T00:04:20.222 INFO:tasks.workunit.client.0.vm03.stdout:3/758: mknod d2/db/d3b/d3f/ceb 0
2026-03-09T00:04:20.225 INFO:tasks.workunit.client.0.vm03.stdout:3/759: creat d2/db/d3b/d5f/da5/fec x:0 0 0
2026-03-09T00:04:20.225 INFO:tasks.workunit.client.0.vm03.stdout:3/760: getdents d2/db/d6a/dc6 0
2026-03-09T00:04:20.228 INFO:tasks.workunit.client.0.vm03.stdout:9/852: dwrite d15/d1c/d21/f25 [0,4194304] 0
2026-03-09T00:04:20.230 INFO:tasks.workunit.client.0.vm03.stdout:0/950: dwrite d2/da/d36/ff6 [0,4194304] 0
2026-03-09T00:04:20.230 INFO:tasks.workunit.client.0.vm03.stdout:0/951: fdatasync d2/da/f4f 0
2026-03-09T00:04:20.231 INFO:tasks.workunit.client.0.vm03.stdout:9/853: rmdir d15/d1c/d28/dda/d11c 0
2026-03-09T00:04:20.233 INFO:tasks.workunit.client.0.vm03.stdout:0/952: truncate d2/da/dd/d49/d6c/d4b/f67 699262 0
2026-03-09T00:04:20.234 INFO:tasks.workunit.client.0.vm03.stdout:0/953: unlink d2/da/dd/d49/ff2 0
2026-03-09T00:04:20.234 INFO:tasks.workunit.client.0.vm03.stdout:0/954: chown d2/da/dd/d6e/le4 4579383 1
2026-03-09T00:04:20.234 INFO:tasks.workunit.client.0.vm03.stdout:0/955: dread - d2/da/d36/ddf/df7/f105 zero size
2026-03-09T00:04:20.235 INFO:tasks.workunit.client.0.vm03.stdout:0/956: mknod d2/da/d36/ddf/d12f/d130/c153 0
2026-03-09T00:04:20.259 INFO:tasks.workunit.client.0.vm03.stdout:5/931: write d1c/d20/d55/d4f/d58/d73/d9e/fe4 [389904,46907] 0
2026-03-09T00:04:20.260 INFO:tasks.workunit.client.0.vm03.stdout:7/848: dwrite d2/d1f/d3a/d24/da4/d46/d81/d96/f44 [4194304,4194304] 0
2026-03-09T00:04:20.261 INFO:tasks.workunit.client.0.vm03.stdout:7/849: fsync d2/d4/db7/d67/f64 0
2026-03-09T00:04:20.261 INFO:tasks.workunit.client.0.vm03.stdout:7/850: truncate d2/d1f/d3a/d24/da4/d46/d81/d96/d8e/ff1 289773 0
2026-03-09T00:04:20.262 INFO:tasks.workunit.client.0.vm03.stdout:5/932: mkdir d1c/d20/dc0/d12b 0
2026-03-09T00:04:20.262 INFO:tasks.workunit.client.0.vm03.stdout:5/933: chown d1c/d20/c28 17830504 1
2026-03-09T00:04:20.262 INFO:tasks.workunit.client.0.vm03.stdout:5/934: read - d1c/f125 zero size
2026-03-09T00:04:20.264 INFO:tasks.workunit.client.0.vm03.stdout:5/935: creat d1c/d20/d55/d66/dc6/d108/f12c x:0 0 0
2026-03-09T00:04:20.264 INFO:tasks.workunit.client.0.vm03.stdout:5/936: dread - d1c/d20/d55/d66/d70/f72 zero size
2026-03-09T00:04:20.264 INFO:tasks.workunit.client.0.vm03.stdout:5/937: write ff [4038143,19021] 0
2026-03-09T00:04:20.266 INFO:tasks.workunit.client.0.vm03.stdout:5/938: dread d1c/d20/d55/f61 [0,4194304] 0
2026-03-09T00:04:20.266 INFO:tasks.workunit.client.0.vm03.stdout:5/939: creat d1c/d20/d55/d4f/f12d x:0 0 0
2026-03-09T00:04:20.271 INFO:tasks.workunit.client.0.vm03.stdout:7/851: dread d2/d4/d1e/fae [0,4194304] 0
2026-03-09T00:04:20.275 INFO:tasks.workunit.client.0.vm03.stdout:7/852: dread d2/d4/db7/daa/fb3 [0,4194304] 0
2026-03-09T00:04:20.277 INFO:tasks.workunit.client.0.vm03.stdout:5/940: write d1c/d20/d55/f34 [667103,94711] 0
2026-03-09T00:04:20.279 INFO:tasks.workunit.client.0.vm03.stdout:5/941: dread d1c/d20/d55/d4f/d58/d73/d9e/fd1 [0,4194304] 0
2026-03-09T00:04:20.287 INFO:tasks.workunit.client.0.vm03.stdout:7/853: write d2/d1f/f62 [2574638,4095] 0
2026-03-09T00:04:20.287 INFO:tasks.workunit.client.0.vm03.stdout:7/854: write d2/d1f/d3a/d24/da4/d46/d81/d96/d8e/db2/ffd [210464,35153] 0
2026-03-09T00:04:20.289 INFO:tasks.workunit.client.0.vm03.stdout:9/854: dwrite d15/d1c/d28/dda/f10a [0,4194304] 0
2026-03-09T00:04:20.289 INFO:tasks.workunit.client.0.vm03.stdout:9/855: chown d15/d1c/d28/c49 165 1
2026-03-09T00:04:20.296 INFO:tasks.workunit.client.0.vm03.stdout:8/928: dwrite d7/df/d1a/d2b/ff4 [0,4194304] 0
2026-03-09T00:04:20.301 INFO:tasks.workunit.client.0.vm03.stdout:8/929: creat d7/df/d1a/d40/d9d/df2/dad/f11a x:0 0 0
2026-03-09T00:04:20.306 INFO:tasks.workunit.client.0.vm03.stdout:8/930: truncate d7/df/d1a/d40/d9d/df2/d3f/d95/fb4 173775 0
2026-03-09T00:04:20.324 INFO:tasks.workunit.client.0.vm03.stdout:8/931: chown d7/df/d1a/d40/d58/f109 0 1
2026-03-09T00:04:20.325 INFO:tasks.workunit.client.0.vm03.stdout:8/932: link d7/df/d1a/d40/d9d/da3/df0/f10a d7/df/d1a/d40/d9d/df2/d38/d4c/d98/f11b 0
2026-03-09T00:04:20.325 INFO:tasks.workunit.client.0.vm03.stdout:8/933: truncate d7/df/d1a/d40/d9d/df2/d3f/f7d 1683623 0
2026-03-09T00:04:20.325 INFO:tasks.workunit.client.0.vm03.stdout:8/934: mkdir d7/df/d1a/d40/d9d/df2/d11c 0
2026-03-09T00:04:20.325 INFO:tasks.workunit.client.0.vm03.stdout:8/935: mkdir d7/df/d1a/d40/d9d/df2/d38/d91/d103/d11d 0
2026-03-09T00:04:20.325 INFO:tasks.workunit.client.0.vm03.stdout:8/936: link d7/df/d1a/cbb d7/df/d1a/d40/db3/c11e 0
2026-03-09T00:04:20.325 INFO:tasks.workunit.client.0.vm03.stdout:8/937: symlink d7/df/d1a/d40/d9d/df2/d38/d91/l11f 0
2026-03-09T00:04:20.348 INFO:tasks.workunit.client.0.vm03.stdout:5/942: dwrite d1c/d20/d55/f5a [0,4194304] 0
2026-03-09T00:04:20.350 INFO:tasks.workunit.client.0.vm03.stdout:5/943: mkdir d1c/d107/d12e 0
2026-03-09T00:04:20.350 INFO:tasks.workunit.client.0.vm03.stdout:3/761: dwrite d2/db/d2d/f45 [0,4194304] 0
2026-03-09T00:04:20.357 INFO:tasks.workunit.client.0.vm03.stdout:7/855: dwrite d2/fcd [4194304,4194304] 0
2026-03-09T00:04:20.357 INFO:tasks.workunit.client.0.vm03.stdout:7/856: chown d2/ce 6875600 1
2026-03-09T00:04:20.359 INFO:tasks.workunit.client.0.vm03.stdout:0/957: rmdir d2 39
2026-03-09T00:04:20.359 INFO:tasks.workunit.client.0.vm03.stdout:0/958: chown d2/da/dd/d49/d6c/da6/fc2 802 1
2026-03-09T00:04:20.365 INFO:tasks.workunit.client.0.vm03.stdout:3/762: symlink d2/db/d40/d58/led 0
2026-03-09T00:04:20.371 INFO:tasks.workunit.client.0.vm03.stdout:3/763: creat d2/db/d3b/d5f/da5/d72/dbd/fee x:0 0 0
2026-03-09T00:04:20.371 INFO:tasks.workunit.client.0.vm03.stdout:3/764: write d2/db/d2d/f52 [4716190,42534] 0
2026-03-09T00:04:20.371 INFO:tasks.workunit.client.0.vm03.stdout:3/765: stat d2/f4e 0
2026-03-09T00:04:20.371 INFO:tasks.workunit.client.0.vm03.stdout:3/766: chown d2/db/d6a/dc6/lcc 2734131 1
2026-03-09T00:04:20.375 INFO:tasks.workunit.client.0.vm03.stdout:7/857: rename d2/d1f/d3a/d24/da4/d46 to d2/d4/d1e/dee/d104 0
2026-03-09T00:04:20.377 INFO:tasks.workunit.client.0.vm03.stdout:7/858: symlink d2/d4/d1e/dee/d104/d81/d96/d37/l105 0
2026-03-09T00:04:20.380 INFO:tasks.workunit.client.0.vm03.stdout:8/938: dwrite d7/df/d1a/d40/d58/fe1 [0,4194304] 0
2026-03-09T00:04:20.436 INFO:tasks.workunit.client.0.vm03.stdout:3/767: dwrite d2/db/d40/d58/f7f [0,4194304] 0
2026-03-09T00:04:20.436 INFO:tasks.workunit.client.0.vm03.stdout:3/768: readlink d2/db/d3b/d5f/ld5 0
2026-03-09T00:04:20.436 INFO:tasks.workunit.client.0.vm03.stdout:3/769: dread - d2/db/d6a/f83 zero size
2026-03-09T00:04:20.441 INFO:tasks.workunit.client.0.vm03.stdout:5/944: dwrite d1c/d20/d55/d66/dc6/df1/f121 [0,4194304] 0
2026-03-09T00:04:20.441 INFO:tasks.workunit.client.0.vm03.stdout:5/945: readlink d1c/d20/d55/d4f/d58/d73/d76/d8e/l126 0
2026-03-09T00:04:20.444 INFO:tasks.workunit.client.0.vm03.stdout:3/770: link d2/db/d3b/f3e d2/db/d3b/d5f/da5/d72/d96/fef 0
2026-03-09T00:04:20.445 INFO:tasks.workunit.client.0.vm03.stdout:6/870: sync
2026-03-09T00:04:20.451 INFO:tasks.workunit.client.0.vm03.stdout:3/771: mknod d2/db/d3b/d3f/daf/cf0 0
2026-03-09T00:04:20.457 INFO:tasks.workunit.client.0.vm03.stdout:3/772: dread d2/db/d3b/d5f/da5/d72/d96/fef [0,4194304] 0
2026-03-09T00:04:20.468 INFO:tasks.workunit.client.0.vm03.stdout:2/954: sync
2026-03-09T00:04:20.468 INFO:tasks.workunit.client.0.vm03.stdout:2/955: truncate d8/d26/d5e/dc5/fde 815555 0
2026-03-09T00:04:20.468 INFO:tasks.workunit.client.0.vm03.stdout:2/956: chown d8/d1b/d6c 183534 1
2026-03-09T00:04:20.489 INFO:tasks.workunit.client.0.vm03.stdout:2/957: dread d8/d1b/d2a/d56/fa4 [0,4194304] 0
2026-03-09T00:04:20.489 INFO:tasks.workunit.client.0.vm03.stdout:2/958: getdents d8/d26/d5e 0
2026-03-09T00:04:20.490 INFO:tasks.workunit.client.0.vm03.stdout:2/959: read d8/d1b/d2a/d6b/d50/d8a/fc3 [237313,39639] 0
2026-03-09T00:04:20.490 INFO:tasks.workunit.client.0.vm03.stdout:2/960: truncate d8/d1b/d2a/d6b/d50/d8a/fc3 1395537 0
2026-03-09T00:04:20.490 INFO:tasks.workunit.client.0.vm03.stdout:2/961: readlink d8/d1b/d24/la1 0
2026-03-09T00:04:20.491 INFO:tasks.workunit.client.0.vm03.stdout:2/962: mkdir d8/d1b/d2a/d6b/d139 0
2026-03-09T00:04:20.491 INFO:tasks.workunit.client.0.vm03.stdout:2/963: dread - d8/d1b/d2a/d6b/dc6/ff4 zero size
2026-03-09T00:04:20.491 INFO:tasks.workunit.client.0.vm03.stdout:2/964: creat d8/d1b/d8f/f13a x:0 0 0
2026-03-09T00:04:20.499 INFO:tasks.workunit.client.0.vm03.stdout:7/859: dwrite d2/d4/d1e/dee/d104/d81/d96/d37/f56 [0,4194304] 0
2026-03-09T00:04:20.499 INFO:tasks.workunit.client.0.vm03.stdout:9/856: dwrite d15/d1c/d36/f9e [0,4194304] 0
2026-03-09T00:04:20.504 INFO:tasks.workunit.client.0.vm03.stdout:7/860: creat d2/d4/d1e/dee/d104/d81/d96/d37/d39/f106 x:0 0 0
2026-03-09T00:04:20.509 INFO:tasks.workunit.client.0.vm03.stdout:3/773: dread d2/db/d3b/d3f/f46 [0,4194304] 0
2026-03-09T00:04:20.509 INFO:tasks.workunit.client.0.vm03.stdout:3/774: write d2/db/d3b/d5f/da5/d72/dbd/fe0 [954004,75355] 0
2026-03-09T00:04:20.510 INFO:tasks.workunit.client.0.vm03.stdout:7/861: creat d2/d1f/d35/f107 x:0 0 0
2026-03-09T00:04:20.510 INFO:tasks.workunit.client.0.vm03.stdout:7/862: creat d2/d4/db7/d67/d6b/f108 x:0 0 0
2026-03-09T00:04:20.510 INFO:tasks.workunit.client.0.vm03.stdout:3/775: truncate d2/db/d40/d44/f4d 2778663 0
2026-03-09T00:04:20.515 INFO:tasks.workunit.client.0.vm03.stdout:7/863: creat d2/d4/d1e/dee/d104/d54/f109 x:0 0 0
2026-03-09T00:04:20.515 INFO:tasks.workunit.client.0.vm03.stdout:7/864: truncate d2/d4/db7/fcb 105996 0
2026-03-09T00:04:20.516 INFO:tasks.workunit.client.0.vm03.stdout:7/865: link d2/d4/d1e/dee/d104/d81/d96/d37/f4c d2/d4/d1e/dee/d104/d81/d96/da2/f10a 0
2026-03-09T00:04:20.517 INFO:tasks.workunit.client.0.vm03.stdout:7/866: creat d2/d4/d1e/dee/d104/d54/d8d/dad/d9c/f10b x:0 0 0
2026-03-09T00:04:20.566 INFO:tasks.workunit.client.0.vm03.stdout:8/939: dwrite f6 [0,4194304] 0
2026-03-09T00:04:20.575 INFO:tasks.workunit.client.0.vm03.stdout:8/940: write d7/f9c [20479,28280] 0
2026-03-09T00:04:20.575 INFO:tasks.workunit.client.0.vm03.stdout:8/941: write d7/df/f31 [501786,64046] 0
2026-03-09T00:04:20.582 INFO:tasks.workunit.client.0.vm03.stdout:5/946: dwrite d1c/d20/d55/d4f/d58/db5/df7/ff8 [0,4194304] 0
2026-03-09T00:04:20.582 INFO:tasks.workunit.client.0.vm03.stdout:6/871: dwrite d13/d35/f9e [0,4194304] 0
2026-03-09T00:04:20.587 INFO:tasks.workunit.client.0.vm03.stdout:2/965: dwrite d8/d1b/d2a/d6b/dc6/ff4 [0,4194304] 0
2026-03-09T00:04:20.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:20 vm03.local ceph-mon[52346]: mgrmap e31: vm03.yvcons(active, since 20s), standbys: vm06.rzcvhn
2026-03-09T00:04:20.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:20 vm03.local ceph-mon[52346]: pgmap v13: 65 pgs: 65 active+clean; 3.8 GiB data, 13
GiB used, 107 GiB / 120 GiB avail; 110 MiB/s rd, 132 MiB/s wr, 197 op/s 2026-03-09T00:04:20.588 INFO:tasks.workunit.client.0.vm03.stdout:6/872: mkdir d13/d35/d74/d127 0 2026-03-09T00:04:20.588 INFO:tasks.workunit.client.0.vm03.stdout:6/873: dread - d13/d35/d71/d97/da5/db1/feb zero size 2026-03-09T00:04:20.588 INFO:tasks.workunit.client.0.vm03.stdout:6/874: write d13/d35/d72/dcc/f111 [637623,69134] 0 2026-03-09T00:04:20.588 INFO:tasks.workunit.client.0.vm03.stdout:5/947: mknod d1c/d20/d55/d43/c12f 0 2026-03-09T00:04:20.589 INFO:tasks.workunit.client.0.vm03.stdout:2/966: mknod d8/d26/d5e/dc5/d104/c13b 0 2026-03-09T00:04:20.589 INFO:tasks.workunit.client.0.vm03.stdout:6/875: rmdir d13/d35/d71/d97/ded 39 2026-03-09T00:04:20.589 INFO:tasks.workunit.client.0.vm03.stdout:6/876: chown fb 65985920 1 2026-03-09T00:04:20.596 INFO:tasks.workunit.client.0.vm03.stdout:2/967: truncate d8/d1b/f1f 2831831 0 2026-03-09T00:04:20.597 INFO:tasks.workunit.client.0.vm03.stdout:6/877: mkdir d13/d1e/d44/d4a/d128 0 2026-03-09T00:04:20.599 INFO:tasks.workunit.client.0.vm03.stdout:2/968: rename d8/d1b/d8f/f12b to d8/d26/f13c 0 2026-03-09T00:04:20.609 INFO:tasks.workunit.client.0.vm03.stdout:6/878: rename d13/f92 to d13/d35/d74/d89/db3/f129 0 2026-03-09T00:04:20.609 INFO:tasks.workunit.client.0.vm03.stdout:6/879: rename d13/d35/le8 to d13/d35/d71/l12a 0 2026-03-09T00:04:20.612 INFO:tasks.workunit.client.0.vm03.stdout:6/880: write d13/d1e/f9f [2650411,107005] 0 2026-03-09T00:04:20.612 INFO:tasks.workunit.client.0.vm03.stdout:0/959: dwrite d2/da/dd/d49/f69 [0,4194304] 0 2026-03-09T00:04:20.612 INFO:tasks.workunit.client.0.vm03.stdout:0/960: readlink d2/da/dd/d49/d6c/d4b/d55/l96 0 2026-03-09T00:04:20.613 INFO:tasks.workunit.client.0.vm03.stdout:3/776: dwrite d2/db/d40/f4a [0,4194304] 0 2026-03-09T00:04:20.614 INFO:tasks.workunit.client.0.vm03.stdout:7/867: dwrite d2/d1f/d35/f5a [0,4194304] 0 2026-03-09T00:04:20.614 INFO:tasks.workunit.client.0.vm03.stdout:7/868: chown d2/d1f/c10 7162 1 2026-03-09T00:04:20.616 INFO:tasks.workunit.client.0.vm03.stdout:2/969: dread d8/d1b/d2a/d6b/dc6/ff4 [0,4194304] 0 2026-03-09T00:04:20.621 INFO:tasks.workunit.client.0.vm03.stdout:0/961: link d2/da/dd/d49/d6c/da6/dda/fc5 d2/da/dd/f154 0 2026-03-09T00:04:20.623 INFO:tasks.workunit.client.0.vm03.stdout:3/777: mkdir d2/db/d56/df1 0 2026-03-09T00:04:20.623 INFO:tasks.workunit.client.0.vm03.stdout:3/778: chown d2/db/d2d/f8b 3709800 1 2026-03-09T00:04:20.623 INFO:tasks.workunit.client.0.vm03.stdout:3/779: readlink d2/db/d40/d58/led 0 2026-03-09T00:04:20.625 INFO:tasks.workunit.client.0.vm03.stdout:7/869: link d2/d4/d1e/dee/d104/d54/f9b d2/d1f/d3a/f10c 0 2026-03-09T00:04:20.625 INFO:tasks.workunit.client.0.vm03.stdout:7/870: readlink d2/d4/d8c/le0 0 2026-03-09T00:04:20.625 INFO:tasks.workunit.client.0.vm03.stdout:7/871: fdatasync d2/d4/d1e/dee/d104/ff2 0 2026-03-09T00:04:20.625 INFO:tasks.workunit.client.0.vm03.stdout:7/872: write d2/d4/f2e [321617,43230] 0 2026-03-09T00:04:20.626 INFO:tasks.workunit.client.0.vm03.stdout:3/780: read d2/db/f1a [39644,130605] 0 2026-03-09T00:04:20.629 INFO:tasks.workunit.client.0.vm03.stdout:8/942: dwrite d7/df/d1a/d40/d9d/df2/d38/d60/dcd/f106 [0,4194304] 0 2026-03-09T00:04:20.629 INFO:tasks.workunit.client.0.vm03.stdout:8/943: chown d7/df/c45 0 1 2026-03-09T00:04:20.631 INFO:tasks.workunit.client.0.vm03.stdout:3/781: write d2/fc5 [3172816,97613] 0 2026-03-09T00:04:20.632 INFO:tasks.workunit.client.0.vm03.stdout:2/970: link d8/d1b/d6c/f7b d8/d26/f13d 0 2026-03-09T00:04:20.632 
INFO:tasks.workunit.client.0.vm03.stdout:2/971: fdatasync d8/d26/d5e/f7c 0 2026-03-09T00:04:20.632 INFO:tasks.workunit.client.0.vm03.stdout:2/972: dread - d8/d1b/d2a/d6b/fd5 zero size 2026-03-09T00:04:20.633 INFO:tasks.workunit.client.0.vm03.stdout:0/962: link d2/da/dd/d49/d6c/d4b/d55/d6f/lf3 d2/da/d4e/l155 0 2026-03-09T00:04:20.635 INFO:tasks.workunit.client.0.vm03.stdout:7/873: rename d2/d4/d1e/dee/d104/d81/d96/d37/fb0 to d2/d4/d1e/dee/d104/d81/d96/d8e/db2/f10d 0 2026-03-09T00:04:20.643 INFO:tasks.workunit.client.0.vm03.stdout:3/782: creat d2/db/de6/ff2 x:0 0 0 2026-03-09T00:04:20.659 INFO:tasks.workunit.client.0.vm03.stdout:2/973: mknod d8/d26/d5e/d5f/d95/de5/c13e 0 2026-03-09T00:04:20.661 INFO:tasks.workunit.client.0.vm03.stdout:0/963: mkdir d2/da/d76/d8a/d156 0 2026-03-09T00:04:20.664 INFO:tasks.workunit.client.0.vm03.stdout:8/944: rename d7/df/d1a/d40/d9d/df2/d38/d91/d103/lca to d7/df/d1a/d40/d9d/df2/d3f/l120 0 2026-03-09T00:04:20.666 INFO:tasks.workunit.client.0.vm03.stdout:3/783: link d2/db/d3b/d3f/f7c d2/db/d6a/dc6/ff3 0 2026-03-09T00:04:20.670 INFO:tasks.workunit.client.0.vm03.stdout:7/874: rmdir d2/d1f/d35 39 2026-03-09T00:04:20.677 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:20 vm06.local ceph-mon[58395]: mgrmap e31: vm03.yvcons(active, since 20s), standbys: vm06.rzcvhn 2026-03-09T00:04:20.677 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:20 vm06.local ceph-mon[58395]: pgmap v13: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 110 MiB/s rd, 132 MiB/s wr, 197 op/s 2026-03-09T00:04:20.677 INFO:tasks.workunit.client.0.vm03.stdout:2/974: rename d8/d1b/d2a/d56/f96 to d8/d1b/d2a/d6b/d139/f13f 0 2026-03-09T00:04:20.677 INFO:tasks.workunit.client.0.vm03.stdout:2/975: chown d8/d1b/d8f/de6 1211 1 2026-03-09T00:04:20.677 INFO:tasks.workunit.client.0.vm03.stdout:2/976: truncate d8/d26/d5e/d5f/ded/f137 612997 0 2026-03-09T00:04:20.677 INFO:tasks.workunit.client.0.vm03.stdout:2/977: write d8/d1b/d2a/f4c [6316994,107368] 0 2026-03-09T00:04:20.677 INFO:tasks.workunit.client.0.vm03.stdout:3/784: mknod d2/db/de6/cf4 0 2026-03-09T00:04:20.685 INFO:tasks.workunit.client.0.vm03.stdout:7/875: write d2/d4/d1e/dee/d104/d81/ff3 [7419242,98952] 0 2026-03-09T00:04:20.685 INFO:tasks.workunit.client.0.vm03.stdout:3/785: creat d2/db/d40/d58/ff5 x:0 0 0 2026-03-09T00:04:20.685 INFO:tasks.workunit.client.0.vm03.stdout:3/786: chown d2/db/d3b/d3f/db8 2822042 1 2026-03-09T00:04:20.687 INFO:tasks.workunit.client.0.vm03.stdout:7/876: mknod d2/d4/d1e/dee/d104/d81/d96/d37/c10e 0 2026-03-09T00:04:20.687 INFO:tasks.workunit.client.0.vm03.stdout:7/877: write d2/d4/f2e [598210,61653] 0 2026-03-09T00:04:20.688 INFO:tasks.workunit.client.0.vm03.stdout:3/787: mknod d2/db/d3b/d5f/d65/cf6 0 2026-03-09T00:04:20.689 INFO:tasks.workunit.client.0.vm03.stdout:3/788: rename d2/db/d3b/d5f/da5/dd8/ld9 to d2/db/d6a/dc6/lf7 0 2026-03-09T00:04:20.695 INFO:tasks.workunit.client.0.vm03.stdout:3/789: dread d2/db/d3b/d5f/da5/dd8/fe3 [0,4194304] 0 2026-03-09T00:04:20.695 INFO:tasks.workunit.client.0.vm03.stdout:3/790: write d2/db/d2d/f36 [4752660,7458] 0 2026-03-09T00:04:20.695 INFO:tasks.workunit.client.0.vm03.stdout:9/857: dwrite d15/d1c/d21/d75/fa4 [0,4194304] 0 2026-03-09T00:04:20.699 INFO:tasks.workunit.client.0.vm03.stdout:9/858: write d15/d1c/f102 [1965237,52865] 0 2026-03-09T00:04:20.702 INFO:tasks.workunit.client.0.vm03.stdout:9/859: creat d15/d1c/d21/d64/f120 x:0 0 0 2026-03-09T00:04:20.702 INFO:tasks.workunit.client.0.vm03.stdout:9/860: write d15/d1c/f102 [5001251,613] 0 
2026-03-09T00:04:20.702 INFO:tasks.workunit.client.0.vm03.stdout:3/791: symlink d2/db/d40/d51/da2/lf8 0 2026-03-09T00:04:20.702 INFO:tasks.workunit.client.0.vm03.stdout:3/792: truncate d2/db/d3b/f95 1579203 0 2026-03-09T00:04:20.702 INFO:tasks.workunit.client.0.vm03.stdout:3/793: fdatasync d2/db/d40/d88/fde 0 2026-03-09T00:04:20.702 INFO:tasks.workunit.client.0.vm03.stdout:3/794: dread - d2/db/d40/d88/fde zero size 2026-03-09T00:04:20.713 INFO:tasks.workunit.client.0.vm03.stdout:9/861: getdents d15/d1c/d21/db5 0 2026-03-09T00:04:20.714 INFO:tasks.workunit.client.0.vm03.stdout:9/862: rename d15/d1c/d36/d4d/dc4/dec/lf5 to d15/d77/de2/l121 0 2026-03-09T00:04:20.715 INFO:tasks.workunit.client.0.vm03.stdout:9/863: rmdir d15/d1c/d28/de1 39 2026-03-09T00:04:20.739 INFO:tasks.workunit.client.0.vm03.stdout:2/978: dwrite d8/d26/d5e/d6f/f126 [0,4194304] 0 2026-03-09T00:04:20.739 INFO:tasks.workunit.client.0.vm03.stdout:2/979: chown d8/d26/d5e/d5f/ded/c125 14168177 1 2026-03-09T00:04:20.771 INFO:tasks.workunit.client.0.vm03.stdout:7/878: dwrite d2/d4/d1e/dee/d104/d81/d96/d37/d39/d6e/fed [0,4194304] 0 2026-03-09T00:04:20.778 INFO:tasks.workunit.client.0.vm03.stdout:7/879: rename d2/d4/d1e/dee/d104/d81/d96/d37/l105 to d2/d4/db7/d67/l10f 0 2026-03-09T00:04:20.815 INFO:tasks.workunit.client.0.vm03.stdout:2/980: dwrite d8/d26/d5e/d5f/ded/f137 [0,4194304] 0 2026-03-09T00:04:20.817 INFO:tasks.workunit.client.0.vm03.stdout:9/864: dwrite d15/d7f/fdf [0,4194304] 0 2026-03-09T00:04:20.817 INFO:tasks.workunit.client.0.vm03.stdout:9/865: fdatasync d15/d1c/d21/fdc 0 2026-03-09T00:04:20.820 INFO:tasks.workunit.client.0.vm03.stdout:6/881: sync 2026-03-09T00:04:20.820 INFO:tasks.workunit.client.0.vm03.stdout:5/948: sync 2026-03-09T00:04:20.820 INFO:tasks.workunit.client.0.vm03.stdout:5/949: chown d1c/d20/d56/c5b 0 1 2026-03-09T00:04:20.820 INFO:tasks.workunit.client.0.vm03.stdout:5/950: dread - d1c/d51/d6a/d75/df0/f100 zero size 2026-03-09T00:04:20.820 INFO:tasks.workunit.client.0.vm03.stdout:5/951: write d1c/d20/d56/da1/f106 [62026,113186] 0 2026-03-09T00:04:20.824 INFO:tasks.workunit.client.0.vm03.stdout:2/981: link d8/d1b/f8d d8/d1b/d24/da5/dfe/d105/f140 0 2026-03-09T00:04:20.827 INFO:tasks.workunit.client.0.vm03.stdout:6/882: mknod d13/d1e/d44/d59/dec/d62/df5/c12b 0 2026-03-09T00:04:20.827 INFO:tasks.workunit.client.0.vm03.stdout:6/883: fdatasync d13/d1e/d44/d59/dec/d62/f79 0 2026-03-09T00:04:20.828 INFO:tasks.workunit.client.0.vm03.stdout:5/952: mknod d1c/d51/df2/c130 0 2026-03-09T00:04:20.828 INFO:tasks.workunit.client.0.vm03.stdout:5/953: fdatasync ff 0 2026-03-09T00:04:20.828 INFO:tasks.workunit.client.0.vm03.stdout:5/954: stat d1c/d20/d55/f9b 0 2026-03-09T00:04:20.828 INFO:tasks.workunit.client.0.vm03.stdout:5/955: write d1c/d20/d55/d66/d70/fde [318068,106114] 0 2026-03-09T00:04:20.828 INFO:tasks.workunit.client.0.vm03.stdout:2/982: mkdir d8/d1b/d2a/d6b/d50/d8a/d141 0 2026-03-09T00:04:20.828 INFO:tasks.workunit.client.0.vm03.stdout:2/983: creat d8/d26/d5e/d5f/ded/f142 x:0 0 0 2026-03-09T00:04:20.828 INFO:tasks.workunit.client.0.vm03.stdout:2/984: dread - d8/d1b/f8d zero size 2026-03-09T00:04:20.828 INFO:tasks.workunit.client.0.vm03.stdout:2/985: creat d8/d1b/d2a/d6b/d50/f143 x:0 0 0 2026-03-09T00:04:20.828 INFO:tasks.workunit.client.0.vm03.stdout:5/956: mknod d1c/d20/d55/d66/dc6/c131 0 2026-03-09T00:04:20.833 INFO:tasks.workunit.client.0.vm03.stdout:2/986: creat d8/d1b/d24/da5/dfe/d105/d108/d12a/f144 x:0 0 0 2026-03-09T00:04:20.834 INFO:tasks.workunit.client.0.vm03.stdout:2/987: truncate d8/d26/d5e/dc5/fde 
1728340 0 2026-03-09T00:04:20.834 INFO:tasks.workunit.client.0.vm03.stdout:2/988: write d8/d1b/d6c/f90 [970801,97477] 0 2026-03-09T00:04:20.834 INFO:tasks.workunit.client.0.vm03.stdout:2/989: chown d8/d26/d5e/d5f/d95 2 1 2026-03-09T00:04:20.834 INFO:tasks.workunit.client.0.vm03.stdout:5/957: read d1c/d20/d55/d66/d70/f71 [747745,8641] 0 2026-03-09T00:04:20.835 INFO:tasks.workunit.client.0.vm03.stdout:0/964: dwrite d2/da/dd/d49/d6c/f9d [0,4194304] 0 2026-03-09T00:04:20.836 INFO:tasks.workunit.client.0.vm03.stdout:2/990: creat d8/d1b/d24/da5/dda/f145 x:0 0 0 2026-03-09T00:04:20.836 INFO:tasks.workunit.client.0.vm03.stdout:2/991: chown d8/d1b/d6c/dd7 0 1 2026-03-09T00:04:20.841 INFO:tasks.workunit.client.0.vm03.stdout:8/945: dwrite d7/df/f53 [0,4194304] 0 2026-03-09T00:04:20.852 INFO:tasks.workunit.client.0.vm03.stdout:5/958: creat d1c/d20/d55/d4f/d58/d73/f132 x:0 0 0 2026-03-09T00:04:20.863 INFO:tasks.workunit.client.0.vm03.stdout:8/946: mknod d7/df/d1a/d40/d9d/df2/d38/d60/c121 0 2026-03-09T00:04:20.866 INFO:tasks.workunit.client.0.vm03.stdout:8/947: dread d7/df/d1a/d40/d58/fe8 [4194304,4194304] 0 2026-03-09T00:04:20.869 INFO:tasks.workunit.client.0.vm03.stdout:5/959: mkdir d1c/d51/df2/d11a/d133 0 2026-03-09T00:04:20.869 INFO:tasks.workunit.client.0.vm03.stdout:8/948: truncate d7/df/d1a/d2b/d62/fd5 4150076 0 2026-03-09T00:04:20.870 INFO:tasks.workunit.client.0.vm03.stdout:5/960: dread d1c/d20/d55/d43/f4d [0,4194304] 0 2026-03-09T00:04:20.871 INFO:tasks.workunit.client.0.vm03.stdout:6/884: dwrite d13/d35/f119 [0,4194304] 0 2026-03-09T00:04:20.873 INFO:tasks.workunit.client.0.vm03.stdout:8/949: creat d7/df/d1a/d40/d9d/df2/d38/d4c/d98/f122 x:0 0 0 2026-03-09T00:04:20.874 INFO:tasks.workunit.client.0.vm03.stdout:5/961: mknod d1c/d20/d97/d11b/c134 0 2026-03-09T00:04:20.874 INFO:tasks.workunit.client.0.vm03.stdout:5/962: chown d1c/d51/df2 18 1 2026-03-09T00:04:20.875 INFO:tasks.workunit.client.0.vm03.stdout:6/885: creat d13/dc4/dea/dd7/f12c x:0 0 0 2026-03-09T00:04:20.876 INFO:tasks.workunit.client.0.vm03.stdout:8/950: symlink d7/df/d1a/d40/d9d/df2/d38/d4c/d98/l123 0 2026-03-09T00:04:20.876 INFO:tasks.workunit.client.0.vm03.stdout:5/963: mkdir d1c/d20/d55/d4f/d58/d5d/d119/d135 0 2026-03-09T00:04:20.877 INFO:tasks.workunit.client.0.vm03.stdout:8/951: rename d7/df/d1a/d40/d9d/df2/dad/lef to d7/df/d1a/d40/dc8/d101/l124 0 2026-03-09T00:04:20.877 INFO:tasks.workunit.client.0.vm03.stdout:5/964: creat d1c/d20/d55/d66/d6b/de3/f136 x:0 0 0 2026-03-09T00:04:20.877 INFO:tasks.workunit.client.0.vm03.stdout:5/965: stat d1c/d20/d56/l95 0 2026-03-09T00:04:20.880 INFO:tasks.workunit.client.0.vm03.stdout:6/886: dread d13/f31 [0,4194304] 0 2026-03-09T00:04:20.880 INFO:tasks.workunit.client.0.vm03.stdout:6/887: chown d13 1354 1 2026-03-09T00:04:20.881 INFO:tasks.workunit.client.0.vm03.stdout:8/952: rmdir d7/df/d1a/d40/d9d/df2/d11c 0 2026-03-09T00:04:20.882 INFO:tasks.workunit.client.0.vm03.stdout:5/966: mknod d1c/d20/d56/da1/c137 0 2026-03-09T00:04:20.883 INFO:tasks.workunit.client.0.vm03.stdout:8/953: mkdir d7/df/d1a/d40/d9d/df2/dc3/d125 0 2026-03-09T00:04:20.883 INFO:tasks.workunit.client.0.vm03.stdout:6/888: dread d13/d1e/d44/d59/dec/d62/fa0 [0,4194304] 0 2026-03-09T00:04:20.883 INFO:tasks.workunit.client.0.vm03.stdout:2/992: dwrite d8/d1b/d24/da5/dda/de0/ff1 [0,4194304] 0 2026-03-09T00:04:20.885 INFO:tasks.workunit.client.0.vm03.stdout:6/889: unlink d13/d1e/d44/d59/dec/la4 0 2026-03-09T00:04:20.885 INFO:tasks.workunit.client.0.vm03.stdout:6/890: chown d13/d35/c3d 192 1 2026-03-09T00:04:20.887 
INFO:tasks.workunit.client.0.vm03.stdout:2/993: mknod d8/d1b/d24/da5/dfe/d105/d108/d12a/c146 0 2026-03-09T00:04:20.893 INFO:tasks.workunit.client.0.vm03.stdout:8/954: dread d7/df/d1a/d40/d9d/df2/d3f/f59 [0,4194304] 0 2026-03-09T00:04:20.900 INFO:tasks.workunit.client.0.vm03.stdout:5/967: symlink d1c/l138 0 2026-03-09T00:04:20.900 INFO:tasks.workunit.client.0.vm03.stdout:5/968: chown d1c/d20/f25 11469 1 2026-03-09T00:04:20.913 INFO:tasks.workunit.client.0.vm03.stdout:9/866: sync 2026-03-09T00:04:20.913 INFO:tasks.workunit.client.0.vm03.stdout:9/867: fsync d15/d77/f11e 0 2026-03-09T00:04:20.913 INFO:tasks.workunit.client.0.vm03.stdout:7/880: sync 2026-03-09T00:04:20.913 INFO:tasks.workunit.client.0.vm03.stdout:7/881: stat d2/d4/d1e/dee/d104/d81/le8 0 2026-03-09T00:04:20.913 INFO:tasks.workunit.client.0.vm03.stdout:7/882: creat d2/d4/db7/daa/f110 x:0 0 0 2026-03-09T00:04:20.913 INFO:tasks.workunit.client.0.vm03.stdout:3/795: sync 2026-03-09T00:04:20.913 INFO:tasks.workunit.client.0.vm03.stdout:3/796: readlink d2/db/l33 0 2026-03-09T00:04:20.917 INFO:tasks.workunit.client.0.vm03.stdout:3/797: read d2/db/d3b/d3f/f46 [143205,41394] 0 2026-03-09T00:04:20.918 INFO:tasks.workunit.client.0.vm03.stdout:3/798: dread d2/db/d3b/d3f/f69 [0,4194304] 0 2026-03-09T00:04:20.918 INFO:tasks.workunit.client.0.vm03.stdout:3/799: dread - d2/db/d2d/fb9 zero size 2026-03-09T00:04:20.924 INFO:tasks.workunit.client.0.vm03.stdout:6/891: fsync d13/dc4/dea/dd7/f12c 0 2026-03-09T00:04:20.929 INFO:tasks.workunit.client.0.vm03.stdout:6/892: creat d13/d35/d69/f12d x:0 0 0 2026-03-09T00:04:20.934 INFO:tasks.workunit.client.0.vm03.stdout:7/883: mkdir d2/d4/d1e/dee/d104/d81/d111 0 2026-03-09T00:04:20.936 INFO:tasks.workunit.client.0.vm03.stdout:0/965: sync 2026-03-09T00:04:20.936 INFO:tasks.workunit.client.0.vm03.stdout:0/966: dread - d2/da/dd/d49/f14a zero size 2026-03-09T00:04:20.936 INFO:tasks.workunit.client.0.vm03.stdout:0/967: truncate d2/da/d36/ff6 5074021 0 2026-03-09T00:04:20.940 INFO:tasks.workunit.client.0.vm03.stdout:3/800: creat d2/db/ff9 x:0 0 0 2026-03-09T00:04:20.951 INFO:tasks.workunit.client.0.vm03.stdout:6/893: creat d13/dc4/dea/d109/f12e x:0 0 0 2026-03-09T00:04:20.951 INFO:tasks.workunit.client.0.vm03.stdout:3/801: mkdir d2/dbf/dfa 0 2026-03-09T00:04:20.951 INFO:tasks.workunit.client.0.vm03.stdout:3/802: fdatasync d2/db/d40/f78 0 2026-03-09T00:04:20.951 INFO:tasks.workunit.client.0.vm03.stdout:6/894: rename d13/d35/d71/d97/la1 to d13/d1e/d44/d10b/l12f 0 2026-03-09T00:04:20.951 INFO:tasks.workunit.client.0.vm03.stdout:3/803: symlink d2/db/d3b/dc2/lfb 0 2026-03-09T00:04:20.951 INFO:tasks.workunit.client.0.vm03.stdout:3/804: stat d2/dbf/cab 0 2026-03-09T00:04:20.951 INFO:tasks.workunit.client.0.vm03.stdout:3/805: readlink d2/db/l38 0 2026-03-09T00:04:20.951 INFO:tasks.workunit.client.0.vm03.stdout:6/895: mkdir d13/d35/d71/d97/da5/db1/d130 0 2026-03-09T00:04:20.951 INFO:tasks.workunit.client.0.vm03.stdout:6/896: creat d13/d35/d74/d89/d9d/d116/f131 x:0 0 0 2026-03-09T00:04:20.951 INFO:tasks.workunit.client.0.vm03.stdout:6/897: mknod d13/d1e/d44/d59/dec/c132 0 2026-03-09T00:04:20.951 INFO:tasks.workunit.client.0.vm03.stdout:6/898: creat d13/d1e/d44/d4a/d52/dbf/f133 x:0 0 0 2026-03-09T00:04:20.951 INFO:tasks.workunit.client.0.vm03.stdout:6/899: mknod d13/d1e/d44/d59/dec/d62/c134 0 2026-03-09T00:04:20.951 INFO:tasks.workunit.client.0.vm03.stdout:6/900: mknod d13/d35/d71/d97/da5/c135 0 2026-03-09T00:04:20.951 INFO:tasks.workunit.client.0.vm03.stdout:6/901: dread - d13/d1e/d44/d59/fe0 zero size 
2026-03-09T00:04:20.951 INFO:tasks.workunit.client.0.vm03.stdout:6/902: stat d13/d1e/c64 0 2026-03-09T00:04:20.951 INFO:tasks.workunit.client.0.vm03.stdout:6/903: symlink d13/d35/d69/l136 0 2026-03-09T00:04:20.952 INFO:tasks.workunit.client.0.vm03.stdout:6/904: readlink d13/dc4/dea/dd7/lf2 0 2026-03-09T00:04:20.952 INFO:tasks.workunit.client.0.vm03.stdout:6/905: fdatasync d13/d35/f119 0 2026-03-09T00:04:20.952 INFO:tasks.workunit.client.0.vm03.stdout:6/906: chown d13/d35/d71/d97/da5 110684 1 2026-03-09T00:04:20.958 INFO:tasks.workunit.client.0.vm03.stdout:0/968: dread d2/da/d36/ff6 [0,4194304] 0 2026-03-09T00:04:20.963 INFO:tasks.workunit.client.0.vm03.stdout:0/969: chown d2/da/d76/d8a/d8f/db8/l10f 275960 1 2026-03-09T00:04:20.991 INFO:tasks.workunit.client.0.vm03.stdout:2/994: dwrite d8/d26/d5e/dc5/fd1 [0,4194304] 0 2026-03-09T00:04:21.041 INFO:tasks.workunit.client.0.vm03.stdout:5/969: dwrite d1c/d20/d55/ff6 [0,4194304] 0 2026-03-09T00:04:21.042 INFO:tasks.workunit.client.0.vm03.stdout:8/955: dwrite d7/df/d1a/d2b/f72 [0,4194304] 0 2026-03-09T00:04:21.042 INFO:tasks.workunit.client.0.vm03.stdout:5/970: creat d1c/d20/d55/d4f/f139 x:0 0 0 2026-03-09T00:04:21.044 INFO:tasks.workunit.client.0.vm03.stdout:8/956: mkdir d7/df/d1a/d40/d9d/da3/df0/d126 0 2026-03-09T00:04:21.044 INFO:tasks.workunit.client.0.vm03.stdout:5/971: rmdir d1c/d20/d55/d4f/d58/db5 39 2026-03-09T00:04:21.045 INFO:tasks.workunit.client.0.vm03.stdout:5/972: mkdir d1c/d20/d55/d66/db2/d13a 0 2026-03-09T00:04:21.048 INFO:tasks.workunit.client.0.vm03.stdout:8/957: dread d7/df/d1a/d40/d58/f7f [0,4194304] 0 2026-03-09T00:04:21.057 INFO:tasks.workunit.client.0.vm03.stdout:5/973: link d1c/d20/d55/l40 d1c/d20/d55/d66/dc6/l13b 0 2026-03-09T00:04:21.057 INFO:tasks.workunit.client.0.vm03.stdout:8/958: creat d7/df/d1a/d40/d9d/f127 x:0 0 0 2026-03-09T00:04:21.057 INFO:tasks.workunit.client.0.vm03.stdout:5/974: creat d1c/d20/dc0/d12b/f13c x:0 0 0 2026-03-09T00:04:21.057 INFO:tasks.workunit.client.0.vm03.stdout:5/975: unlink d1c/d20/d55/d66/db2/l117 0 2026-03-09T00:04:21.069 INFO:tasks.workunit.client.0.vm03.stdout:7/884: dwrite d2/d4/db7/daa/fc4 [0,4194304] 0 2026-03-09T00:04:21.072 INFO:tasks.workunit.client.0.vm03.stdout:9/868: dwrite d15/d1c/fb8 [0,4194304] 0 2026-03-09T00:04:21.072 INFO:tasks.workunit.client.0.vm03.stdout:9/869: chown d15/d1c/d36/d4d/d11d 205641 1 2026-03-09T00:04:21.072 INFO:tasks.workunit.client.0.vm03.stdout:9/870: fsync d15/d1c/d28/d6e/fa9 0 2026-03-09T00:04:21.092 INFO:tasks.workunit.client.0.vm03.stdout:0/970: dwrite d2/da/dd/d49/d6c/d4b/fdb [0,4194304] 0 2026-03-09T00:04:21.092 INFO:tasks.workunit.client.0.vm03.stdout:0/971: truncate d2/da/dd/d49/d6c/da6/dda/fc5 788689 0 2026-03-09T00:04:21.128 INFO:tasks.workunit.client.0.vm03.stdout:3/806: dwrite d2/db/d3b/f3e [4194304,4194304] 0 2026-03-09T00:04:21.129 INFO:tasks.workunit.client.0.vm03.stdout:3/807: creat d2/db/d2d/d55/da9/ffc x:0 0 0 2026-03-09T00:04:21.130 INFO:tasks.workunit.client.0.vm03.stdout:3/808: rename d2/db/d3b/d5d/c5e to d2/dbf/dfa/cfd 0 2026-03-09T00:04:21.132 INFO:tasks.workunit.client.0.vm03.stdout:3/809: dread d2/fc5 [0,4194304] 0 2026-03-09T00:04:21.134 INFO:tasks.workunit.client.0.vm03.stdout:2/995: dwrite d8/d1b/d2a/d6b/dc6/ff4 [0,4194304] 0 2026-03-09T00:04:21.134 INFO:tasks.workunit.client.0.vm03.stdout:2/996: fsync d8/d1b/d2a/d2e/d9a/f136 0 2026-03-09T00:04:21.136 INFO:tasks.workunit.client.0.vm03.stdout:2/997: mkdir d8/d1b/d24/da5/dc9/d147 0 2026-03-09T00:04:21.136 INFO:tasks.workunit.client.0.vm03.stdout:2/998: rename 
d8/d1b/d2a/d6b/d50/c6a to d8/d26/d5e/d5f/ded/d131/c148 0 2026-03-09T00:04:21.137 INFO:tasks.workunit.client.0.vm03.stdout:2/999: fsync d8/f5d 0 2026-03-09T00:04:21.138 INFO:tasks.workunit.client.0.vm03.stdout:3/810: write d2/db/f7e [3018527,54076] 0 2026-03-09T00:04:21.138 INFO:tasks.workunit.client.0.vm03.stdout:3/811: symlink d2/db/d3b/d3f/db8/lfe 0 2026-03-09T00:04:21.138 INFO:tasks.workunit.client.0.vm03.stdout:3/812: write d2/fc5 [828337,23346] 0 2026-03-09T00:04:21.139 INFO:tasks.workunit.client.0.vm03.stdout:3/813: creat d2/fff x:0 0 0 2026-03-09T00:04:21.139 INFO:tasks.workunit.client.0.vm03.stdout:3/814: write d2/db/d6a/fa1 [787455,20862] 0 2026-03-09T00:04:21.139 INFO:tasks.workunit.client.0.vm03.stdout:3/815: write d2/db/d3b/d5d/fc0 [491797,31024] 0 2026-03-09T00:04:21.140 INFO:tasks.workunit.client.0.vm03.stdout:3/816: mkdir d2/dbf/dfa/d100 0 2026-03-09T00:04:21.141 INFO:tasks.workunit.client.0.vm03.stdout:3/817: rename d2/db/d40/d44 to d2/db/d40/d101 0 2026-03-09T00:04:21.141 INFO:tasks.workunit.client.0.vm03.stdout:3/818: stat d2/db/d3b/d5f/da5/d72/d96/fbb 0 2026-03-09T00:04:21.141 INFO:tasks.workunit.client.0.vm03.stdout:3/819: chown d2/db/d3b/d5d/fc0 88 1 2026-03-09T00:04:21.142 INFO:tasks.workunit.client.0.vm03.stdout:3/820: mkdir d2/dbf/d102 0 2026-03-09T00:04:21.142 INFO:tasks.workunit.client.0.vm03.stdout:3/821: truncate d2/db/d40/d51/fe7 203162 0 2026-03-09T00:04:21.151 INFO:tasks.workunit.client.0.vm03.stdout:8/959: dwrite d7/df/fee [0,4194304] 0 2026-03-09T00:04:21.157 INFO:tasks.workunit.client.0.vm03.stdout:6/907: dwrite d13/d1e/d44/d4a/d52/fe5 [0,4194304] 0 2026-03-09T00:04:21.189 INFO:tasks.workunit.client.0.vm03.stdout:9/871: dwrite d15/d1c/d21/fea [0,4194304] 0 2026-03-09T00:04:21.199 INFO:tasks.workunit.client.0.vm03.stdout:7/885: write d2/d4/d1e/dee/d104/d81/d96/d88/ffa [624517,58765] 0 2026-03-09T00:04:21.199 INFO:tasks.workunit.client.0.vm03.stdout:7/886: symlink d2/d4/d1e/dee/l112 0 2026-03-09T00:04:21.200 INFO:tasks.workunit.client.0.vm03.stdout:7/887: link d2/d4/d1e/dee/d104/d54/d8d/dad/d9c/f5f d2/d4/db7/f113 0 2026-03-09T00:04:21.200 INFO:tasks.workunit.client.0.vm03.stdout:7/888: mkdir d2/d4/d1e/dee/d104/d114 0 2026-03-09T00:04:21.210 INFO:tasks.workunit.client.0.vm03.stdout:8/960: dwrite d7/df/d1a/d40/f76 [0,4194304] 0 2026-03-09T00:04:21.216 INFO:tasks.workunit.client.0.vm03.stdout:8/961: dread d7/df/d1a/d2b/f77 [4194304,4194304] 0 2026-03-09T00:04:21.218 INFO:tasks.workunit.client.0.vm03.stdout:8/962: truncate d7/df/d1a/f93 172257 0 2026-03-09T00:04:21.218 INFO:tasks.workunit.client.0.vm03.stdout:8/963: stat d7/df/d1a/d40/d9d/df2/d38/d60 0 2026-03-09T00:04:21.220 INFO:tasks.workunit.client.0.vm03.stdout:8/964: getdents d7/df/d1a/d40/d9d 0 2026-03-09T00:04:21.222 INFO:tasks.workunit.client.0.vm03.stdout:8/965: dread d7/df/d1a/d40/d9d/da3/ff6 [0,4194304] 0 2026-03-09T00:04:21.222 INFO:tasks.workunit.client.0.vm03.stdout:8/966: mkdir d7/df/d1a/d40/d9d/df2/d3f/d95/d128 0 2026-03-09T00:04:21.222 INFO:tasks.workunit.client.0.vm03.stdout:8/967: fdatasync d7/df/d1a/d40/d9d/df2/d3f/f59 0 2026-03-09T00:04:21.233 INFO:tasks.workunit.client.0.vm03.stdout:0/972: dwrite d2/da/dd/d49/d6c/d4b/d55/d6f/dad/fcc [0,4194304] 0 2026-03-09T00:04:21.233 INFO:tasks.workunit.client.0.vm03.stdout:3/822: dwrite d2/f4e [0,4194304] 0 2026-03-09T00:04:21.233 INFO:tasks.workunit.client.0.vm03.stdout:3/823: chown d2/ca3 9573 1 2026-03-09T00:04:21.236 INFO:tasks.workunit.client.0.vm03.stdout:0/973: mkdir d2/d111/d157 0 2026-03-09T00:04:21.241 
INFO:tasks.workunit.client.0.vm03.stdout:0/974: dread d2/da/dd/d49/d6c/d4b/f88 [0,4194304] 0 2026-03-09T00:04:21.241 INFO:tasks.workunit.client.0.vm03.stdout:0/975: chown d2/d71/f7c 13 1 2026-03-09T00:04:21.270 INFO:tasks.workunit.client.0.vm03.stdout:9/872: dwrite d15/d1c/d21/d54/d87/d93/fba [0,4194304] 0 2026-03-09T00:04:21.270 INFO:tasks.workunit.client.0.vm03.stdout:9/873: chown d15/d1c/d21/f34 123012860 1 2026-03-09T00:04:21.270 INFO:tasks.workunit.client.0.vm03.stdout:9/874: write d15/d1c/d21/f101 [94506,101114] 0 2026-03-09T00:04:21.271 INFO:tasks.workunit.client.0.vm03.stdout:7/889: dwrite d2/d1f/f3b [4194304,4194304] 0 2026-03-09T00:04:21.271 INFO:tasks.workunit.client.0.vm03.stdout:0/976: dwrite d2/f7f [0,4194304] 0 2026-03-09T00:04:21.276 INFO:tasks.workunit.client.0.vm03.stdout:9/875: creat d15/d1c/f122 x:0 0 0 2026-03-09T00:04:21.278 INFO:tasks.workunit.client.0.vm03.stdout:5/976: sync 2026-03-09T00:04:21.279 INFO:tasks.workunit.client.0.vm03.stdout:5/977: write d1c/d20/d55/d66/d6b/d8f/f98 [2169997,76356] 0 2026-03-09T00:04:21.279 INFO:tasks.workunit.client.0.vm03.stdout:7/890: dread d2/d4/fb [0,4194304] 0 2026-03-09T00:04:21.279 INFO:tasks.workunit.client.0.vm03.stdout:7/891: readlink d2/d1f/dc6/l102 0 2026-03-09T00:04:21.279 INFO:tasks.workunit.client.0.vm03.stdout:3/824: write d2/db/d3b/f95 [1999591,9087] 0 2026-03-09T00:04:21.290 INFO:tasks.workunit.client.0.vm03.stdout:0/977: mkdir d2/da/dd/d49/d6c/da6/dcf/d14c/d158 0 2026-03-09T00:04:21.291 INFO:tasks.workunit.client.0.vm03.stdout:5/978: rename d1c/d20/d55/d43/l8b to d1c/d20/dc0/l13d 0 2026-03-09T00:04:21.291 INFO:tasks.workunit.client.0.vm03.stdout:5/979: read - d1c/d20/d55/d66/d6b/de3/f136 zero size 2026-03-09T00:04:21.292 INFO:tasks.workunit.client.0.vm03.stdout:3/825: creat d2/db/d40/f103 x:0 0 0 2026-03-09T00:04:21.299 INFO:tasks.workunit.client.0.vm03.stdout:0/978: creat d2/da/dd/f159 x:0 0 0 2026-03-09T00:04:21.308 INFO:tasks.workunit.client.0.vm03.stdout:5/980: truncate d1c/d20/d55/d4f/d58/db5/f45 3243084 0 2026-03-09T00:04:21.308 INFO:tasks.workunit.client.0.vm03.stdout:5/981: write d1c/d20/f39 [5254630,85741] 0 2026-03-09T00:04:21.308 INFO:tasks.workunit.client.0.vm03.stdout:0/979: link d2/da/dd/d49/d6c/da6/dcf/c131 d2/da/dd/d49/c15a 0 2026-03-09T00:04:21.312 INFO:tasks.workunit.client.0.vm03.stdout:5/982: rmdir d1c/d20/d55/d4f/d58/d5d/d119/d135 0 2026-03-09T00:04:21.314 INFO:tasks.workunit.client.0.vm03.stdout:0/980: link d2/da/l13 d2/da/d36/ddf/l15b 0 2026-03-09T00:04:21.319 INFO:tasks.workunit.client.0.vm03.stdout:5/983: rename d1c/d20/d55/d4f/d58/d73/d76/d91/lce to d1c/d20/d55/d66/dc6/df1/l13e 0 2026-03-09T00:04:21.321 INFO:tasks.workunit.client.0.vm03.stdout:5/984: chown d1c/d20/d55/d4f/d58/d73/d76/d91 940821221 1 2026-03-09T00:04:21.323 INFO:tasks.workunit.client.0.vm03.stdout:5/985: write d1c/d20/d55/d4f/d58/d73/d9e/fd1 [1398640,17845] 0 2026-03-09T00:04:21.324 INFO:tasks.workunit.client.0.vm03.stdout:5/986: stat d1c/d20/d55/d43/lee 0 2026-03-09T00:04:21.328 INFO:tasks.workunit.client.0.vm03.stdout:5/987: creat d1c/d20/d55/d4f/d58/db5/f13f x:0 0 0 2026-03-09T00:04:21.328 INFO:tasks.workunit.client.0.vm03.stdout:5/988: chown d1c/d20/d55/d66/d6b/d8f/c129 796728 1 2026-03-09T00:04:21.329 INFO:tasks.workunit.client.0.vm03.stdout:9/876: dwrite d15/d1c/d21/d64/f4e [0,4194304] 0 2026-03-09T00:04:21.329 INFO:tasks.workunit.client.0.vm03.stdout:9/877: chown d15/d1c/d21/d54/l5a 15156249 1 2026-03-09T00:04:21.329 INFO:tasks.workunit.client.0.vm03.stdout:9/878: write d15/d1c/d21/d54/d87/d93/fbf [2576536,42914] 0 
2026-03-09T00:04:21.329 INFO:tasks.workunit.client.0.vm03.stdout:9/879: creat d15/d1c/f123 x:0 0 0 2026-03-09T00:04:21.330 INFO:tasks.workunit.client.0.vm03.stdout:9/880: read d15/d1c/d36/f6d [522507,94719] 0 2026-03-09T00:04:21.330 INFO:tasks.workunit.client.0.vm03.stdout:9/881: truncate d15/d77/fbd 230772 0 2026-03-09T00:04:21.354 INFO:tasks.workunit.client.0.vm03.stdout:9/882: rename d15/d1c/d21/d54 to d15/d1c/d36/d4d/d124 0 2026-03-09T00:04:21.358 INFO:tasks.workunit.client.0.vm03.stdout:9/883: write d15/fe9 [4129795,3738] 0 2026-03-09T00:04:21.360 INFO:tasks.workunit.client.0.vm03.stdout:9/884: creat d15/d1c/d36/d4d/d124/dab/df6/f125 x:0 0 0 2026-03-09T00:04:21.361 INFO:tasks.workunit.client.0.vm03.stdout:9/885: dread d15/d1c/d21/d64/f50 [4194304,4194304] 0 2026-03-09T00:04:21.362 INFO:tasks.workunit.client.0.vm03.stdout:9/886: stat d15/d1c/d28/de1/ldd 0 2026-03-09T00:04:21.363 INFO:tasks.workunit.client.0.vm03.stdout:7/892: dwrite d2/f50 [0,4194304] 0 2026-03-09T00:04:21.363 INFO:tasks.workunit.client.0.vm03.stdout:3/826: dwrite d2/db/d6a/fa1 [0,4194304] 0 2026-03-09T00:04:21.364 INFO:tasks.workunit.client.0.vm03.stdout:5/989: dwrite fe [0,4194304] 0 2026-03-09T00:04:21.364 INFO:tasks.workunit.client.0.vm03.stdout:5/990: write d1c/d20/d56/fc8 [289021,47411] 0 2026-03-09T00:04:21.365 INFO:tasks.workunit.client.0.vm03.stdout:9/887: symlink d15/d1c/d21/d64/l126 0 2026-03-09T00:04:21.366 INFO:tasks.workunit.client.0.vm03.stdout:9/888: read d15/d1c/d21/f61 [67336,58672] 0 2026-03-09T00:04:21.367 INFO:tasks.workunit.client.0.vm03.stdout:6/908: sync 2026-03-09T00:04:21.367 INFO:tasks.workunit.client.0.vm03.stdout:8/968: sync 2026-03-09T00:04:21.367 INFO:tasks.workunit.client.0.vm03.stdout:8/969: chown d7/df/d1a/d2b 0 1 2026-03-09T00:04:21.367 INFO:tasks.workunit.client.0.vm03.stdout:6/909: fdatasync d13/d35/d69/f12d 0 2026-03-09T00:04:21.367 INFO:tasks.workunit.client.0.vm03.stdout:8/970: chown d7/df/d1a/d40/d58/f7f 58729391 1 2026-03-09T00:04:21.371 INFO:tasks.workunit.client.0.vm03.stdout:8/971: dread d7/df/d1a/d2b/f77 [0,4194304] 0 2026-03-09T00:04:21.371 INFO:tasks.workunit.client.0.vm03.stdout:8/972: fdatasync d7/df/d1a/fc4 0 2026-03-09T00:04:21.381 INFO:tasks.workunit.client.0.vm03.stdout:5/991: dread d1c/d20/d55/f3d [0,4194304] 0 2026-03-09T00:04:21.382 INFO:tasks.workunit.client.0.vm03.stdout:7/893: symlink d2/d4/d1e/dee/d104/d54/d8d/dad/d9c/df8/l115 0 2026-03-09T00:04:21.382 INFO:tasks.workunit.client.0.vm03.stdout:7/894: read - d2/d4/d1e/d5e/d7e/ffc zero size 2026-03-09T00:04:21.386 INFO:tasks.workunit.client.0.vm03.stdout:7/895: dread d2/fcd [4194304,4194304] 0 2026-03-09T00:04:21.387 INFO:tasks.workunit.client.0.vm03.stdout:3/827: truncate d2/db/d40/d51/f5a 1336710 0 2026-03-09T00:04:21.387 INFO:tasks.workunit.client.0.vm03.stdout:3/828: chown d2/db/d2d/l2e 1055597719 1 2026-03-09T00:04:21.397 INFO:tasks.workunit.client.0.vm03.stdout:9/889: rename d15/d1c/d36/d4d/d124/d87/d93/l96 to d15/d77/dfb/l127 0 2026-03-09T00:04:21.397 INFO:tasks.workunit.client.0.vm03.stdout:9/890: write d15/d1c/f123 [417418,122148] 0 2026-03-09T00:04:21.401 INFO:tasks.workunit.client.0.vm03.stdout:8/973: unlink d7/df/d1a/d40/d58/f57 0 2026-03-09T00:04:21.409 INFO:tasks.workunit.client.0.vm03.stdout:5/992: symlink d1c/d20/d55/l140 0 2026-03-09T00:04:21.409 INFO:tasks.workunit.client.0.vm03.stdout:8/974: rename d7/df/d1a/d40/db3/f75 to d7/df/d1a/d40/d9d/df2/dc3/f129 0 2026-03-09T00:04:21.409 INFO:tasks.workunit.client.0.vm03.stdout:5/993: rename d1c/d20/d55/d66/d6b/d8f/dca to d1c/d51/d141 0 
2026-03-09T00:04:21.409 INFO:tasks.workunit.client.0.vm03.stdout:5/994: creat d1c/d20/d55/dac/f142 x:0 0 0 2026-03-09T00:04:21.409 INFO:tasks.workunit.client.0.vm03.stdout:5/995: creat d1c/d20/d55/d66/f143 x:0 0 0 2026-03-09T00:04:21.410 INFO:tasks.workunit.client.0.vm03.stdout:8/975: write d7/df/d1a/d40/d9d/df2/d3f/f90 [1214162,10693] 0 2026-03-09T00:04:21.413 INFO:tasks.workunit.client.0.vm03.stdout:6/910: dwrite d13/dc4/dea/dd7/f12c [0,4194304] 0 2026-03-09T00:04:21.413 INFO:tasks.workunit.client.0.vm03.stdout:6/911: write d13/d1e/d44/d59/d77/f126 [250872,96256] 0 2026-03-09T00:04:21.413 INFO:tasks.workunit.client.0.vm03.stdout:6/912: fdatasync d13/d1e/d44/d59/dec/f122 0 2026-03-09T00:04:21.413 INFO:tasks.workunit.client.0.vm03.stdout:6/913: creat d13/dc4/dea/dd7/f137 x:0 0 0 2026-03-09T00:04:21.414 INFO:tasks.workunit.client.0.vm03.stdout:8/976: dread d7/df/d1a/d2b/f9f [0,4194304] 0 2026-03-09T00:04:21.414 INFO:tasks.workunit.client.0.vm03.stdout:8/977: truncate d7/df/d1a/d40/d9d/df2/d3f/df1/dfd/f100 371011 0 2026-03-09T00:04:21.415 INFO:tasks.workunit.client.0.vm03.stdout:5/996: mkdir d1c/d67/d144 0 2026-03-09T00:04:21.417 INFO:tasks.workunit.client.0.vm03.stdout:6/914: rename d13/d1e/d44/d4a/d52/fe5 to d13/dc4/dea/f138 0 2026-03-09T00:04:21.417 INFO:tasks.workunit.client.0.vm03.stdout:6/915: getdents d13/d35/d74/d127 0 2026-03-09T00:04:21.421 INFO:tasks.workunit.client.0.vm03.stdout:7/896: dwrite d2/d4/d1e/dee/d104/d54/d8d/ff4 [0,4194304] 0 2026-03-09T00:04:21.422 INFO:tasks.workunit.client.0.vm03.stdout:6/916: rename d13/d35/d71/d97/da5/db1/d130 to d13/dc4/dea/dd7/d139 0 2026-03-09T00:04:21.429 INFO:tasks.workunit.client.0.vm03.stdout:7/897: rename d2/d4/d1e/dee/d104/d81/d96/d37/d39/d6e/fef to d2/d4/d1e/dee/d104/d81/d96/d88/db9/f116 0 2026-03-09T00:04:21.438 INFO:tasks.workunit.client.0.vm03.stdout:6/917: rmdir d13/d35/d74/df1 0 2026-03-09T00:04:21.445 INFO:tasks.workunit.client.0.vm03.stdout:3/829: write f1 [837535,63933] 0 2026-03-09T00:04:21.456 INFO:tasks.workunit.client.0.vm03.stdout:3/830: getdents d2/db/d2d 0 2026-03-09T00:04:21.460 INFO:tasks.workunit.client.0.vm03.stdout:0/981: sync 2026-03-09T00:04:21.460 INFO:tasks.workunit.client.0.vm03.stdout:0/982: dread - d2/fd3 zero size 2026-03-09T00:04:21.460 INFO:tasks.workunit.client.0.vm03.stdout:0/983: chown d2/da/dd/d49/d6c/d4b/d55/c12d 54995 1 2026-03-09T00:04:21.461 INFO:tasks.workunit.client.0.vm03.stdout:0/984: symlink d2/da/dd/d49/d6c/da6/dda/db5/dba/dff/l15c 0 2026-03-09T00:04:21.461 INFO:tasks.workunit.client.0.vm03.stdout:0/985: write d2/da/dd/f7b [2901071,62452] 0 2026-03-09T00:04:21.462 INFO:tasks.workunit.client.0.vm03.stdout:0/986: dread d2/da/d76/fa1 [0,4194304] 0 2026-03-09T00:04:21.464 INFO:tasks.workunit.client.0.vm03.stdout:0/987: creat d2/da/dd/d6e/f15d x:0 0 0 2026-03-09T00:04:21.465 INFO:tasks.workunit.client.0.vm03.stdout:0/988: write d2/da/d4e/f120 [4060792,128653] 0 2026-03-09T00:04:21.469 INFO:tasks.workunit.client.0.vm03.stdout:0/989: getdents d2/da/dd/d49/d6c/d4b/d55 0 2026-03-09T00:04:21.471 INFO:tasks.workunit.client.0.vm03.stdout:0/990: rmdir d2/da/dd/d49/d6c/d4b/d55/d6f/dad/de8 39 2026-03-09T00:04:21.499 INFO:tasks.workunit.client.0.vm03.stdout:7/898: dwrite d2/d4/db7/d67/f64 [0,4194304] 0 2026-03-09T00:04:21.499 INFO:tasks.workunit.client.0.vm03.stdout:7/899: fdatasync d2/d1f/d35/f107 0 2026-03-09T00:04:21.529 INFO:tasks.workunit.client.0.vm03.stdout:6/918: dwrite d13/d35/d69/dee/ffe [0,4194304] 0 2026-03-09T00:04:21.529 INFO:tasks.workunit.client.0.vm03.stdout:6/919: write d13/d1e/f3e 
[3136061,106045] 0 2026-03-09T00:04:21.531 INFO:tasks.workunit.client.0.vm03.stdout:6/920: write d13/f31 [941206,71309] 0 2026-03-09T00:04:21.532 INFO:tasks.workunit.client.0.vm03.stdout:6/921: mkdir d13/d1e/d44/d59/d77/d114/d13a 0 2026-03-09T00:04:21.533 INFO:tasks.workunit.client.0.vm03.stdout:6/922: rename d13/d35/d71/d97/da5/dc1/c11f to d13/c13b 0 2026-03-09T00:04:21.533 INFO:tasks.workunit.client.0.vm03.stdout:6/923: stat d13/d35/d74/d89/d9d/d116 0 2026-03-09T00:04:21.534 INFO:tasks.workunit.client.0.vm03.stdout:6/924: truncate d13/d1e/d44/d59/dec/f4f 152311 0 2026-03-09T00:04:21.550 INFO:tasks.workunit.client.0.vm03.stdout:3/831: dwrite d2/db/d6a/f83 [0,4194304] 0 2026-03-09T00:04:21.550 INFO:tasks.workunit.client.0.vm03.stdout:3/832: write d2/db/d3b/d5f/da5/d72/d96/fdb [443519,118077] 0 2026-03-09T00:04:21.550 INFO:tasks.workunit.client.0.vm03.stdout:3/833: chown d2/dbf/dfa/cfd 822323372 1 2026-03-09T00:04:21.550 INFO:tasks.workunit.client.0.vm03.stdout:3/834: truncate d2/db/d3b/d3f/fcb 966541 0 2026-03-09T00:04:21.551 INFO:tasks.workunit.client.0.vm03.stdout:3/835: symlink d2/db/d2d/dc7/l104 0 2026-03-09T00:04:21.551 INFO:tasks.workunit.client.0.vm03.stdout:3/836: fdatasync d2/db/d40/d58/ff5 0 2026-03-09T00:04:21.551 INFO:tasks.workunit.client.0.vm03.stdout:3/837: mknod d2/db/d3b/d3f/db8/c105 0 2026-03-09T00:04:21.589 INFO:tasks.workunit.client.0.vm03.stdout:0/991: dwrite d2/da/dd/d49/d6c/d4b/daf/fc6 [0,4194304] 0 2026-03-09T00:04:21.589 INFO:tasks.workunit.client.0.vm03.stdout:8/978: dwrite d7/df/d1a/d40/d9d/df2/f66 [0,4194304] 0 2026-03-09T00:04:21.593 INFO:tasks.workunit.client.0.vm03.stdout:8/979: dread d7/df/d1a/f4f [0,4194304] 0 2026-03-09T00:04:21.593 INFO:tasks.workunit.client.0.vm03.stdout:8/980: readlink d7/df/d1a/d40/d9d/df2/lc0 0 2026-03-09T00:04:21.593 INFO:tasks.workunit.client.0.vm03.stdout:8/981: readlink d7/df/l52 0 2026-03-09T00:04:21.593 INFO:tasks.workunit.client.0.vm03.stdout:8/982: readlink d7/df/d1a/d40/d9d/df2/l27 0 2026-03-09T00:04:21.599 INFO:tasks.workunit.client.0.vm03.stdout:6/925: dwrite d13/d35/d71/d97/da5/fad [4194304,4194304] 0 2026-03-09T00:04:21.600 INFO:tasks.workunit.client.0.vm03.stdout:5/997: sync 2026-03-09T00:04:21.600 INFO:tasks.workunit.client.0.vm03.stdout:9/891: sync 2026-03-09T00:04:21.605 INFO:tasks.workunit.client.0.vm03.stdout:6/926: creat d13/d35/d71/d97/f13c x:0 0 0 2026-03-09T00:04:21.606 INFO:tasks.workunit.client.0.vm03.stdout:9/892: rename d15/d77/fbb to d15/f128 0 2026-03-09T00:04:21.606 INFO:tasks.workunit.client.0.vm03.stdout:9/893: stat d15/d1c/d36/d4d/d124/d87/d93/fba 0 2026-03-09T00:04:21.606 INFO:tasks.workunit.client.0.vm03.stdout:9/894: creat d15/d1c/d28/dda/f129 x:0 0 0 2026-03-09T00:04:21.608 INFO:tasks.workunit.client.0.vm03.stdout:9/895: mkdir d15/d1c/d36/d4d/d124/dab/df6/d10e/d12a 0 2026-03-09T00:04:21.613 INFO:tasks.workunit.client.0.vm03.stdout:7/900: dwrite d2/d4/d1e/dee/d104/d81/d96/d8e/fc8 [0,4194304] 0 2026-03-09T00:04:21.613 INFO:tasks.workunit.client.0.vm03.stdout:7/901: readlink d2/l89 0 2026-03-09T00:04:21.613 INFO:tasks.workunit.client.0.vm03.stdout:7/902: dread - d2/d4/db7/d67/d6b/fbe zero size 2026-03-09T00:04:21.614 INFO:tasks.workunit.client.0.vm03.stdout:9/896: mkdir d15/d1c/d36/d4d/d124/d12b 0 2026-03-09T00:04:21.614 INFO:tasks.workunit.client.0.vm03.stdout:9/897: fdatasync d15/d1c/d36/f5c 0 2026-03-09T00:04:21.618 INFO:tasks.workunit.client.0.vm03.stdout:7/903: mknod d2/d4/d1e/dee/d104/d81/d96/d8e/db2/c117 0 2026-03-09T00:04:21.629 INFO:tasks.workunit.client.0.vm03.stdout:5/998: write d1c/f4c 
[1441240,52120] 0 2026-03-09T00:04:21.629 INFO:tasks.workunit.client.0.vm03.stdout:5/999: mkdir d1c/d20/d55/d66/d70/d145 0 2026-03-09T00:04:21.633 INFO:tasks.workunit.client.0.vm03.stdout:7/904: dread d2/d4/db7/fea [0,4194304] 0 2026-03-09T00:04:21.646 INFO:tasks.workunit.client.0.vm03.stdout:7/905: link d2/d1f/d3a/l65 d2/d4/d1e/dee/d104/l118 0 2026-03-09T00:04:21.647 INFO:tasks.workunit.client.0.vm03.stdout:7/906: mknod d2/d1f/d3a/d24/da4/c119 0 2026-03-09T00:04:21.649 INFO:tasks.workunit.client.0.vm03.stdout:7/907: creat d2/d4/d1e/dee/d104/d54/d8d/dd2/f11a x:0 0 0 2026-03-09T00:04:21.683 INFO:tasks.workunit.client.0.vm03.stdout:8/983: dwrite d7/df/d1a/d40/d58/fe8 [4194304,4194304] 0 2026-03-09T00:04:21.684 INFO:tasks.workunit.client.0.vm03.stdout:8/984: getdents d7/df/d1a/d40/d9d/df2/dc3 0 2026-03-09T00:04:21.687 INFO:tasks.workunit.client.0.vm03.stdout:8/985: creat d7/df/d1a/d40/d9d/df2/d38/f12a x:0 0 0 2026-03-09T00:04:21.692 INFO:tasks.workunit.client.0.vm03.stdout:8/986: dread f6 [0,4194304] 0 2026-03-09T00:04:21.693 INFO:tasks.workunit.client.0.vm03.stdout:8/987: symlink d7/df/d1a/d40/d9d/df2/d38/d4c/d98/l12b 0 2026-03-09T00:04:21.693 INFO:tasks.workunit.client.0.vm03.stdout:8/988: symlink d7/df/d1a/d2b/d62/l12c 0 2026-03-09T00:04:21.693 INFO:tasks.workunit.client.0.vm03.stdout:8/989: chown d7/df/d1a/d40/d9d/df2/d38/d60 83 1 2026-03-09T00:04:21.693 INFO:tasks.workunit.client.0.vm03.stdout:8/990: write d7/f34 [1425769,113790] 0 2026-03-09T00:04:21.693 INFO:tasks.workunit.client.0.vm03.stdout:8/991: creat d7/df/d1a/d40/dc8/d101/f12d x:0 0 0 2026-03-09T00:04:21.695 INFO:tasks.workunit.client.0.vm03.stdout:8/992: creat d7/df/d1a/d40/dc8/f12e x:0 0 0 2026-03-09T00:04:21.706 INFO:tasks.workunit.client.0.vm03.stdout:8/993: mknod d7/df/d1a/d40/d9d/da3/df0/d126/c12f 0 2026-03-09T00:04:21.706 INFO:tasks.workunit.client.0.vm03.stdout:8/994: mknod d7/df/d1a/d40/d9d/df2/d38/d60/dcd/c130 0 2026-03-09T00:04:21.706 INFO:tasks.workunit.client.0.vm03.stdout:8/995: rename d7/df/d1a/d40/d9d/df2 to d7/df/d1a/d40/d9d/df2/d3f/d131 22 2026-03-09T00:04:21.706 INFO:tasks.workunit.client.0.vm03.stdout:8/996: write d7/df/d1a/d40/d9d/da3/dd2/fed [425572,45925] 0 2026-03-09T00:04:21.706 INFO:tasks.workunit.client.0.vm03.stdout:8/997: creat d7/df/d1a/d40/d9d/df2/d38/d91/d103/f132 x:0 0 0 2026-03-09T00:04:21.706 INFO:tasks.workunit.client.0.vm03.stdout:8/998: creat d7/df/d1a/d40/d58/f133 x:0 0 0 2026-03-09T00:04:21.706 INFO:tasks.workunit.client.0.vm03.stdout:8/999: write d7/df/d1a/fe7 [709955,113028] 0 2026-03-09T00:04:21.706 INFO:tasks.workunit.client.0.vm03.stdout:6/927: dwrite d13/d1e/f9f [0,4194304] 0 2026-03-09T00:04:21.707 INFO:tasks.workunit.client.0.vm03.stdout:9/898: dwrite d15/d1c/d36/d4d/d124/f73 [0,4194304] 0 2026-03-09T00:04:21.708 INFO:tasks.workunit.client.0.vm03.stdout:9/899: write d15/f23 [1524454,36066] 0 2026-03-09T00:04:21.708 INFO:tasks.workunit.client.0.vm03.stdout:0/992: dwrite d2/da/dd/f38 [0,4194304] 0 2026-03-09T00:04:21.709 INFO:tasks.workunit.client.0.vm03.stdout:7/908: dwrite d2/d4/d1e/dee/d104/d81/d96/da2/fdc [0,4194304] 0 2026-03-09T00:04:21.711 INFO:tasks.workunit.client.0.vm03.stdout:0/993: write d2/da/dd/d49/d6c/d4b/f88 [2801767,45208] 0 2026-03-09T00:04:21.711 INFO:tasks.workunit.client.0.vm03.stdout:0/994: write d2/da/d4e/faa [6910885,39632] 0 2026-03-09T00:04:21.711 INFO:tasks.workunit.client.0.vm03.stdout:0/995: write d2/da/dd/d49/d6c/da6/fc2 [353543,118852] 0 2026-03-09T00:04:21.713 INFO:tasks.workunit.client.0.vm03.stdout:6/928: creat d13/d35/d71/d97/da5/dc1/f13d x:0 0 0 
2026-03-09T00:04:21.717 INFO:tasks.workunit.client.0.vm03.stdout:0/996: write d2/da/dd/d49/d6c/d4b/daf/fc6 [1685052,107112] 0 2026-03-09T00:04:21.719 INFO:tasks.workunit.client.0.vm03.stdout:9/900: unlink d15/d1c/d21/d64/c9b 0 2026-03-09T00:04:21.744 INFO:tasks.workunit.client.0.vm03.stdout:7/909: rename d2/d4/d1e/dee/d104/d54/f109 to d2/d4/db7/d67/d6b/f11b 0 2026-03-09T00:04:21.744 INFO:tasks.workunit.client.0.vm03.stdout:7/910: write d2/d4/d1e/dee/d104/d81/d96/d88/ffa [2995279,62151] 0 2026-03-09T00:04:21.744 INFO:tasks.workunit.client.0.vm03.stdout:7/911: readlink d2/d4/d1e/dee/d104/d81/le9 0 2026-03-09T00:04:21.745 INFO:tasks.workunit.client.0.vm03.stdout:0/997: rmdir d2/da/dd/d49/d6c/da6/dda/db5/dba/dff 39 2026-03-09T00:04:21.745 INFO:tasks.workunit.client.0.vm03.stdout:0/998: truncate d2/da/d36/ddf/df7/d12a/f13d 639196 0 2026-03-09T00:04:21.749 INFO:tasks.workunit.client.0.vm03.stdout:9/901: creat d15/d1c/d36/d4d/d124/d12b/f12c x:0 0 0 2026-03-09T00:04:21.750 INFO:tasks.workunit.client.0.vm03.stdout:0/999: dread d2/da/dd/d49/d6c/d4b/d55/ffa [0,4194304] 0 2026-03-09T00:04:21.755 INFO:tasks.workunit.client.0.vm03.stdout:7/912: rmdir d2/d4/d1e/dee/d104/d81/d96/d37 39 2026-03-09T00:04:21.755 INFO:tasks.workunit.client.0.vm03.stdout:7/913: chown d2/d4/db7/daa/f110 2 1 2026-03-09T00:04:21.762 INFO:tasks.workunit.client.0.vm03.stdout:9/902: truncate d15/f1b 3937446 0 2026-03-09T00:04:21.764 INFO:tasks.workunit.client.0.vm03.stdout:7/914: link d2/d4/d1e/f97 d2/d4/db7/d67/f11c 0 2026-03-09T00:04:21.767 INFO:tasks.workunit.client.0.vm03.stdout:9/903: unlink d15/d1c/l6a 0 2026-03-09T00:04:21.767 INFO:tasks.workunit.client.0.vm03.stdout:9/904: read - d15/d1c/d21/fcd zero size 2026-03-09T00:04:21.767 INFO:tasks.workunit.client.0.vm03.stdout:9/905: fdatasync d15/d1c/d36/d4d/d124/d87/d93/f7e 0 2026-03-09T00:04:21.772 INFO:tasks.workunit.client.0.vm03.stdout:7/915: rename d2/d4/d1e/d5e/daf/cca to d2/d4/d1e/d5e/c11d 0 2026-03-09T00:04:21.772 INFO:tasks.workunit.client.0.vm03.stdout:7/916: write d2/d4/d1e/dee/d104/d81/d96/d8e/ff1 [1130254,112461] 0 2026-03-09T00:04:21.772 INFO:tasks.workunit.client.0.vm03.stdout:7/917: write d2/d4/d1e/dee/d104/d81/ff3 [4800038,13066] 0 2026-03-09T00:04:21.780 INFO:tasks.workunit.client.0.vm03.stdout:7/918: creat d2/d4/d1e/f11e x:0 0 0 2026-03-09T00:04:21.783 INFO:tasks.workunit.client.0.vm03.stdout:7/919: unlink d2/d4/fb 0 2026-03-09T00:04:21.793 INFO:tasks.workunit.client.0.vm03.stdout:3/838: sync 2026-03-09T00:04:21.796 INFO:tasks.workunit.client.0.vm03.stdout:9/906: dwrite d15/d1c/d21/fcd [0,4194304] 0 2026-03-09T00:04:21.797 INFO:tasks.workunit.client.0.vm03.stdout:3/839: rmdir d2/db/d3b/d5f/da5/d72 39 2026-03-09T00:04:21.807 INFO:tasks.workunit.client.0.vm03.stdout:9/907: creat d15/d77/dfb/f12d x:0 0 0 2026-03-09T00:04:21.807 INFO:tasks.workunit.client.0.vm03.stdout:9/908: truncate d15/d1c/d36/fb1 649036 0 2026-03-09T00:04:21.808 INFO:tasks.workunit.client.0.vm03.stdout:6/929: sync 2026-03-09T00:04:21.808 INFO:tasks.workunit.client.0.vm03.stdout:6/930: write d13/d1e/d44/d59/dec/d62/dfb/f101 [937489,57009] 0 2026-03-09T00:04:21.816 INFO:tasks.workunit.client.0.vm03.stdout:9/909: symlink d15/d1c/d21/d64/l12e 0 2026-03-09T00:04:21.832 INFO:tasks.workunit.client.0.vm03.stdout:3/840: dwrite d2/db/d6a/f83 [0,4194304] 0 2026-03-09T00:04:21.835 INFO:tasks.workunit.client.0.vm03.stdout:9/910: stat d15/f1b 0 2026-03-09T00:04:21.835 INFO:tasks.workunit.client.0.vm03.stdout:9/911: chown d15/d1c/d36/d4d/d124/d87/d93/ff1 11551542 1 2026-03-09T00:04:21.835 
INFO:tasks.workunit.client.0.vm03.stdout:9/912: read - d15/d1c/d36/f105 zero size 2026-03-09T00:04:21.837 INFO:tasks.workunit.client.0.vm03.stdout:6/931: sync 2026-03-09T00:04:21.840 INFO:tasks.workunit.client.0.vm03.stdout:3/841: truncate d2/db/d3b/f63 1671091 0 2026-03-09T00:04:21.845 INFO:tasks.workunit.client.0.vm03.stdout:9/913: creat d15/d7f/f12f x:0 0 0 2026-03-09T00:04:21.845 INFO:tasks.workunit.client.0.vm03.stdout:6/932: truncate d13/fa9 111727 0 2026-03-09T00:04:21.847 INFO:tasks.workunit.client.0.vm03.stdout:3/842: symlink d2/db/d3b/d5f/d65/l106 0 2026-03-09T00:04:21.849 INFO:tasks.workunit.client.0.vm03.stdout:9/914: rename d15/d1c/d28/lcc to d15/d1c/d21/d64/d11a/l130 0 2026-03-09T00:04:21.849 INFO:tasks.workunit.client.0.vm03.stdout:9/915: write d15/d1c/d21/d75/fbc [1809678,113074] 0 2026-03-09T00:04:21.849 INFO:tasks.workunit.client.0.vm03.stdout:9/916: creat d15/d1c/d28/de1/ded/f131 x:0 0 0 2026-03-09T00:04:21.851 INFO:tasks.workunit.client.0.vm03.stdout:7/920: sync 2026-03-09T00:04:21.856 INFO:tasks.workunit.client.0.vm03.stdout:6/933: mkdir d13/d35/d71/d97/da5/db1/d13e 0 2026-03-09T00:04:21.857 INFO:tasks.workunit.client.0.vm03.stdout:3/843: mknod d2/db/d2d/d55/da9/c107 0 2026-03-09T00:04:21.869 INFO:tasks.workunit.client.0.vm03.stdout:9/917: creat d15/d1c/d36/d4d/d124/dab/df6/d10e/d12a/f132 x:0 0 0 2026-03-09T00:04:21.869 INFO:tasks.workunit.client.0.vm03.stdout:6/934: dread d13/f5d [0,4194304] 0 2026-03-09T00:04:21.875 INFO:tasks.workunit.client.0.vm03.stdout:6/935: dread d13/d1e/d44/d4a/d52/f7a [0,4194304] 0 2026-03-09T00:04:21.880 INFO:tasks.workunit.client.0.vm03.stdout:7/921: symlink d2/d4/d1e/d5e/l11f 0 2026-03-09T00:04:21.884 INFO:tasks.workunit.client.0.vm03.stdout:6/936: rename d13/d1e/d44/d59/dec/d62/fa0 to d13/d1e/d44/d59/dec/d62/df5/f13f 0 2026-03-09T00:04:21.887 INFO:tasks.workunit.client.0.vm03.stdout:7/922: link d2/d4/f13 d2/d4/d1e/dee/d104/d54/d8d/dad/d9c/f120 0 2026-03-09T00:04:21.889 INFO:tasks.workunit.client.0.vm03.stdout:6/937: dread fb [0,4194304] 0 2026-03-09T00:04:21.894 INFO:tasks.workunit.client.0.vm03.stdout:7/923: symlink d2/d4/d1e/dee/d104/d81/l121 0 2026-03-09T00:04:21.896 INFO:tasks.workunit.client.0.vm03.stdout:7/924: write d2/d1f/f11 [887568,116536] 0 2026-03-09T00:04:21.900 INFO:tasks.workunit.client.0.vm03.stdout:6/938: symlink d13/d35/dff/l140 0 2026-03-09T00:04:21.901 INFO:tasks.workunit.client.0.vm03.stdout:3/844: dwrite d2/db/d6a/fa1 [4194304,4194304] 0 2026-03-09T00:04:21.902 INFO:tasks.workunit.client.0.vm03.stdout:9/918: dwrite d15/f23 [0,4194304] 0 2026-03-09T00:04:21.905 INFO:tasks.workunit.client.0.vm03.stdout:7/925: truncate d2/d4/d1e/d78/fa5 356201 0 2026-03-09T00:04:21.905 INFO:tasks.workunit.client.0.vm03.stdout:7/926: write d2/d4/db7/d67/d6b/fbe [1019109,113474] 0 2026-03-09T00:04:21.908 INFO:tasks.workunit.client.0.vm03.stdout:3/845: creat d2/db/d3b/d5f/da5/d72/f108 x:0 0 0 2026-03-09T00:04:21.908 INFO:tasks.workunit.client.0.vm03.stdout:3/846: chown d2/db/d40/d58/led 33170615 1 2026-03-09T00:04:21.912 INFO:tasks.workunit.client.0.vm03.stdout:9/919: rename d15/d1c/d21/l4b to d15/d1c/d36/d4d/d124/d87/d93/d115/l133 0 2026-03-09T00:04:21.912 INFO:tasks.workunit.client.0.vm03.stdout:6/939: write d13/d35/f9e [2287718,90663] 0 2026-03-09T00:04:21.912 INFO:tasks.workunit.client.0.vm03.stdout:6/940: creat d13/d1e/d44/d59/dec/d62/df5/f141 x:0 0 0 2026-03-09T00:04:21.916 INFO:tasks.workunit.client.0.vm03.stdout:7/927: unlink d2/d4/d1e/dee/d104/lb8 0 2026-03-09T00:04:21.916 INFO:tasks.workunit.client.0.vm03.stdout:7/928: readlink 
d2/d1f/d3a/l65 0 2026-03-09T00:04:21.917 INFO:tasks.workunit.client.0.vm03.stdout:9/920: read d15/d1c/d36/d4d/dc4/fff [3573870,6263] 0 2026-03-09T00:04:21.917 INFO:tasks.workunit.client.0.vm03.stdout:9/921: write d15/d77/dfb/f12d [171036,10998] 0 2026-03-09T00:04:21.930 INFO:tasks.workunit.client.0.vm03.stdout:3/847: creat d2/db/d40/d101/db5/f109 x:0 0 0 2026-03-09T00:04:21.936 INFO:tasks.workunit.client.0.vm03.stdout:7/929: mkdir d2/d4/d1e/dee/d104/d81/d96/d8e/d122 0 2026-03-09T00:04:21.940 INFO:tasks.workunit.client.0.vm03.stdout:7/930: dread d2/d1f/d3a/f10c [0,4194304] 0 2026-03-09T00:04:21.944 INFO:tasks.workunit.client.0.vm03.stdout:3/848: symlink d2/db/d40/d101/d68/l10a 0 2026-03-09T00:04:21.944 INFO:tasks.workunit.client.0.vm03.stdout:3/849: dread - d2/db/ff9 zero size 2026-03-09T00:04:21.944 INFO:tasks.workunit.client.0.vm03.stdout:7/931: mknod d2/d4/d1e/dee/d104/c123 0 2026-03-09T00:04:21.944 INFO:tasks.workunit.client.0.vm03.stdout:7/932: creat d2/d4/d1e/dee/d104/d54/d8d/dd2/f124 x:0 0 0 2026-03-09T00:04:21.950 INFO:tasks.workunit.client.0.vm03.stdout:3/850: symlink d2/db/d3b/d5d/l10b 0 2026-03-09T00:04:21.952 INFO:tasks.workunit.client.0.vm03.stdout:9/922: dwrite d15/d1c/d28/f2f [0,4194304] 0 2026-03-09T00:04:21.953 INFO:tasks.workunit.client.0.vm03.stdout:9/923: dread - d15/d1c/d28/de1/ded/f131 zero size 2026-03-09T00:04:21.959 INFO:tasks.workunit.client.0.vm03.stdout:3/851: chown d2/db/c18 353 1 2026-03-09T00:04:21.959 INFO:tasks.workunit.client.0.vm03.stdout:3/852: stat d2/db/d6a/c94 0 2026-03-09T00:04:21.961 INFO:tasks.workunit.client.0.vm03.stdout:9/924: mkdir d15/d1c/d36/d4d/d124/d87/d93/dcf/d10c/d134 0 2026-03-09T00:04:21.961 INFO:tasks.workunit.client.0.vm03.stdout:9/925: write d15/d1c/d36/d4d/d124/f73 [4421158,17174] 0 2026-03-09T00:04:21.970 INFO:tasks.workunit.client.0.vm03.stdout:3/853: mkdir d2/dbf/dfa/d100/d10c 0 2026-03-09T00:04:21.981 INFO:tasks.workunit.client.0.vm03.stdout:3/854: write d2/db/d3b/d3f/db8/fc8 [433306,75913] 0 2026-03-09T00:04:21.981 INFO:tasks.workunit.client.0.vm03.stdout:3/855: write d2/f4e [1896239,126010] 0 2026-03-09T00:04:21.981 INFO:tasks.workunit.client.0.vm03.stdout:3/856: write d2/db/d3b/d5f/da5/dd8/fe3 [2209317,130366] 0 2026-03-09T00:04:21.981 INFO:tasks.workunit.client.0.vm03.stdout:3/857: creat d2/db/d3b/d3f/db8/f10d x:0 0 0 2026-03-09T00:04:21.981 INFO:tasks.workunit.client.0.vm03.stdout:3/858: chown d2/db/d40/d88/fde 1 1 2026-03-09T00:04:21.981 INFO:tasks.workunit.client.0.vm03.stdout:9/926: truncate d15/f1b 2222494 0 2026-03-09T00:04:21.981 INFO:tasks.workunit.client.0.vm03.stdout:9/927: fdatasync d15/f17 0 2026-03-09T00:04:21.981 INFO:tasks.workunit.client.0.vm03.stdout:9/928: chown d15/d1c/d21/d64/cde 939821 1 2026-03-09T00:04:21.981 INFO:tasks.workunit.client.0.vm03.stdout:9/929: chown d15/d1c/d28 10985238 1 2026-03-09T00:04:21.981 INFO:tasks.workunit.client.0.vm03.stdout:3/859: getdents d2/db/d40/d101 0 2026-03-09T00:04:21.985 INFO:tasks.workunit.client.0.vm03.stdout:3/860: read d2/f1d [490924,21620] 0 2026-03-09T00:04:21.985 INFO:tasks.workunit.client.0.vm03.stdout:3/861: write d2/db/d3b/d3f/f46 [2140754,90181] 0 2026-03-09T00:04:21.988 INFO:tasks.workunit.client.0.vm03.stdout:3/862: truncate d2/db/d40/d58/fd0 179771 0 2026-03-09T00:04:21.988 INFO:tasks.workunit.client.0.vm03.stdout:3/863: fdatasync d2/f5 0 2026-03-09T00:04:21.988 INFO:tasks.workunit.client.0.vm03.stdout:3/864: chown d2/db/de6/ff2 47652 1 2026-03-09T00:04:21.989 INFO:tasks.workunit.client.0.vm03.stdout:3/865: creat d2/db/d3b/dc2/f10e x:0 0 0 
2026-03-09T00:04:21.998 INFO:tasks.workunit.client.0.vm03.stdout:7/933: dwrite d2/d4/db7/f72 [0,4194304] 0
2026-03-09T00:04:21.998 INFO:tasks.workunit.client.0.vm03.stdout:7/934: chown d2/d4/d1e/dee/d104/d54/c55 3598440 1
2026-03-09T00:04:22.002 INFO:tasks.workunit.client.0.vm03.stdout:7/935: read d2/d1f/d3a/f10c [5033638,69733] 0
2026-03-09T00:04:22.006 INFO:tasks.workunit.client.0.vm03.stdout:7/936: rename d2/d4/d1e/dee/d104/d54/d8d/dd2/la7 to d2/d4/d1e/d5e/d7e/dd0/l125 0
2026-03-09T00:04:22.041 INFO:tasks.workunit.client.0.vm03.stdout:9/930: dwrite d15/d1c/d36/d4d/d124/d87/fd6 [0,4194304] 0
2026-03-09T00:04:22.041 INFO:tasks.workunit.client.0.vm03.stdout:9/931: chown d15/d1c/d36/d4d/d124/d87/fd6 241240 1
2026-03-09T00:04:22.043 INFO:tasks.workunit.client.0.vm03.stdout:7/937: dwrite d2/d4/d1e/dee/d104/d81/d96/d8e/ff1 [0,4194304] 0
2026-03-09T00:04:22.064 INFO:tasks.workunit.client.0.vm03.stdout:7/938: symlink d2/d4/d1e/dee/d104/d81/d96/d37/d39/l126 0
2026-03-09T00:04:22.068 INFO:tasks.workunit.client.0.vm03.stdout:7/939: creat d2/d4/d1e/dee/d104/d81/d96/d8e/d122/f127 x:0 0 0
2026-03-09T00:04:22.072 INFO:tasks.workunit.client.0.vm03.stdout:7/940: dread d2/d1f/d3a/d24/da4/f47 [0,4194304] 0
2026-03-09T00:04:22.072 INFO:tasks.workunit.client.0.vm03.stdout:7/941: chown d2/d4/d1e/dee/d104/d81/d96/d37/d39/d6e/fa1 718 1
2026-03-09T00:04:22.073 INFO:tasks.workunit.client.0.vm03.stdout:9/932: dwrite d15/d1c/d36/f5c [0,4194304] 0
2026-03-09T00:04:22.073 INFO:tasks.workunit.client.0.vm03.stdout:9/933: chown d15/d1c/d36/d4d/d124/d87/d93/fba 282 1
2026-03-09T00:04:22.073 INFO:tasks.workunit.client.0.vm03.stdout:9/934: readlink d15/d1c/d21/d75/de0/lc5 0
2026-03-09T00:04:22.076 INFO:tasks.workunit.client.0.vm03.stdout:7/942: symlink d2/d4/d1e/dee/d104/d54/d8d/dad/d9c/l128 0
2026-03-09T00:04:22.076 INFO:tasks.workunit.client.0.vm03.stdout:7/943: stat d2/d4/d1e/dee/d104/d81/d96/d37/ce3 0
2026-03-09T00:04:22.076 INFO:tasks.workunit.client.0.vm03.stdout:7/944: readlink d2/d4/d1e/dee/d104/d54/d8d/dad/d9c/l9f 0
2026-03-09T00:04:22.076 INFO:tasks.workunit.client.0.vm03.stdout:7/945: dread - d2/d4/d1e/dee/d104/d81/d96/d37/d39/f106 zero size
2026-03-09T00:04:22.076 INFO:tasks.workunit.client.0.vm03.stdout:7/946: write d2/d4/f100 [47006,19312] 0
2026-03-09T00:04:22.081 INFO:tasks.workunit.client.0.vm03.stdout:7/947: dread d2/d1f/d3a/d24/ff6 [0,4194304] 0
2026-03-09T00:04:22.081 INFO:tasks.workunit.client.0.vm03.stdout:7/948: chown d2/d4/db7/d67/l10f 111094758 1
2026-03-09T00:04:22.086 INFO:tasks.workunit.client.0.vm03.stdout:9/935: truncate d15/d1c/d21/d75/fa6 353676 0
2026-03-09T00:04:22.086 INFO:tasks.workunit.client.0.vm03.stdout:9/936: fsync d15/d1c/d28/f5b 0
2026-03-09T00:04:22.088 INFO:tasks.workunit.client.0.vm03.stdout:7/949: rmdir d2/d4/db7/d67 39
2026-03-09T00:04:22.093 INFO:tasks.workunit.client.0.vm03.stdout:7/950: write d2/d4/db7/f72 [3020633,127080] 0
2026-03-09T00:04:22.093 INFO:tasks.workunit.client.0.vm03.stdout:7/951: read d2/d4/d1e/dee/d104/d81/d96/d88/db9/f116 [104835,128722] 0
2026-03-09T00:04:22.095 INFO:tasks.workunit.client.0.vm03.stdout:6/941: sync
2026-03-09T00:04:22.098 INFO:tasks.workunit.client.0.vm03.stdout:3/866: sync
2026-03-09T00:04:22.098 INFO:tasks.workunit.client.0.vm03.stdout:3/867: fdatasync d2/db/f10 0
2026-03-09T00:04:22.098 INFO:tasks.workunit.client.0.vm03.stdout:3/868: chown d2/db/d6a/l71 13758617 1
2026-03-09T00:04:22.100 INFO:tasks.workunit.client.0.vm03.stdout:7/952: symlink d2/d4/d1e/d78/l129 0
2026-03-09T00:04:22.103 INFO:tasks.workunit.client.0.vm03.stdout:3/869: write d2/db/d3b/d5f/fb3 [58442,7596] 0
2026-03-09T00:04:22.103 INFO:tasks.workunit.client.0.vm03.stdout:3/870: readlink d2/db/d40/l8e 0
2026-03-09T00:04:22.103 INFO:tasks.workunit.client.0.vm03.stdout:3/871: chown d2/dbf/dfa/d100/d10c 9092692 1
2026-03-09T00:04:22.115 INFO:tasks.workunit.client.0.vm03.stdout:6/942: creat d13/d1e/f142 x:0 0 0
2026-03-09T00:04:22.115 INFO:tasks.workunit.client.0.vm03.stdout:6/943: creat d13/d1e/d44/d59/dec/d62/dfb/f143 x:0 0 0
2026-03-09T00:04:22.117 INFO:tasks.workunit.client.0.vm03.stdout:9/937: truncate d15/d1c/d36/f5c 413142 0
2026-03-09T00:04:22.131 INFO:tasks.workunit.client.0.vm03.stdout:7/953: symlink d2/d4/db7/d67/d6b/l12a 0
2026-03-09T00:04:22.133 INFO:tasks.workunit.client.0.vm03.stdout:6/944: symlink d13/d35/d74/d89/l144 0
2026-03-09T00:04:22.133 INFO:tasks.workunit.client.0.vm03.stdout:6/945: chown d13/d35/dff/f104 8575 1
2026-03-09T00:04:22.133 INFO:tasks.workunit.client.0.vm03.stdout:7/954: mkdir d2/d4/d1e/dee/d104/d54/d12b 0
2026-03-09T00:04:22.140 INFO:tasks.workunit.client.0.vm03.stdout:7/955: creat d2/d1f/d35/f12c x:0 0 0
2026-03-09T00:04:22.141 INFO:tasks.workunit.client.0.vm03.stdout:7/956: creat d2/d4/d1e/dee/f12d x:0 0 0
2026-03-09T00:04:22.142 INFO:tasks.workunit.client.0.vm03.stdout:7/957: unlink d2/d1f/d35/l66 0
2026-03-09T00:04:22.142 INFO:tasks.workunit.client.0.vm03.stdout:7/958: write d2/d1f/d35/f107 [1005180,97718] 0
2026-03-09T00:04:22.142 INFO:tasks.workunit.client.0.vm03.stdout:7/959: chown d2/d4/d1e/dee/d104/d81/d111 1699 1
2026-03-09T00:04:22.143 INFO:tasks.workunit.client.0.vm03.stdout:7/960: mknod d2/d4/c12e 0
2026-03-09T00:04:22.156 INFO:tasks.workunit.client.0.vm03.stdout:7/961: creat d2/d4/d1e/dee/d104/d54/f12f x:0 0 0
2026-03-09T00:04:22.158 INFO:tasks.workunit.client.0.vm03.stdout:3/872: dwrite d2/db/d3b/d5f/da5/d72/d96/fbb [0,4194304] 0
2026-03-09T00:04:22.158 INFO:tasks.workunit.client.0.vm03.stdout:3/873: write d2/db/d3b/d3f/f46 [5170967,58489] 0
2026-03-09T00:04:22.160 INFO:tasks.workunit.client.0.vm03.stdout:7/962: dread d2/d4/d1e/dee/d104/d81/d96/d8e/ff1 [0,4194304] 0
2026-03-09T00:04:22.163 INFO:tasks.workunit.client.0.vm03.stdout:3/874: creat d2/db/d40/d101/d68/f10f x:0 0 0
2026-03-09T00:04:22.163 INFO:tasks.workunit.client.0.vm03.stdout:3/875: chown d2/db/d3b/d3f/db8/lfe 141210970 1
2026-03-09T00:04:22.163 INFO:tasks.workunit.client.0.vm03.stdout:3/876: truncate d2/f8c 485268 0
2026-03-09T00:04:22.163 INFO:tasks.workunit.client.0.vm03.stdout:3/877: fsync d2/db/d40/d88/f89 0
2026-03-09T00:04:22.163 INFO:tasks.workunit.client.0.vm03.stdout:7/963: mknod d2/d1f/c130 0
2026-03-09T00:04:22.163 INFO:tasks.workunit.client.0.vm03.stdout:7/964: fsync d2/d4/d1e/dee/d104/d81/f8f 0
2026-03-09T00:04:22.163 INFO:tasks.workunit.client.0.vm03.stdout:7/965: chown d2/d1f/d35/f12c 112434788 1
2026-03-09T00:04:22.170 INFO:tasks.workunit.client.0.vm03.stdout:9/938: dwrite d15/d1c/d21/f34 [0,4194304] 0
2026-03-09T00:04:22.171 INFO:tasks.workunit.client.0.vm03.stdout:7/966: creat d2/d4/db7/f131 x:0 0 0
2026-03-09T00:04:22.173 INFO:tasks.workunit.client.0.vm03.stdout:9/939: getdents d15/d1c/d36/d4d/d124/dab/df6/d10e 0
2026-03-09T00:04:22.174 INFO:tasks.workunit.client.0.vm03.stdout:7/967: rename d2/d4/d1e/dee/d104/l33 to d2/d4/d1e/l132 0
2026-03-09T00:04:22.201 INFO:tasks.workunit.client.0.vm03.stdout:6/946: dwrite d13/d35/db5/fe3 [0,4194304] 0
2026-03-09T00:04:22.203 INFO:tasks.workunit.client.0.vm03.stdout:6/947: dread d13/d1e/fc3 [0,4194304] 0
2026-03-09T00:04:22.236 INFO:tasks.workunit.client.0.vm03.stdout:3/878: dwrite d2/db/d3b/d5d/f60 [0,4194304] 0
2026-03-09T00:04:22.236 INFO:tasks.workunit.client.0.vm03.stdout:3/879: fsync d2/db/d40/d101/db5/fdd 0
2026-03-09T00:04:22.237 INFO:tasks.workunit.client.0.vm03.stdout:3/880: symlink d2/dbf/d102/l110 0
2026-03-09T00:04:22.237 INFO:tasks.workunit.client.0.vm03.stdout:3/881: truncate d2/db/d3b/d5f/da5/d72/d96/fdb 917057 0
2026-03-09T00:04:22.238 INFO:tasks.workunit.client.0.vm03.stdout:3/882: unlink d2/db/d3b/d5f/da5/fec 0
2026-03-09T00:04:22.242 INFO:tasks.workunit.client.0.vm03.stdout:3/883: dread d2/db/d6a/fa1 [4194304,4194304] 0
2026-03-09T00:04:22.242 INFO:tasks.workunit.client.0.vm03.stdout:3/884: mkdir d2/db/d3b/d3f/d111 0
2026-03-09T00:04:22.243 INFO:tasks.workunit.client.0.vm03.stdout:3/885: rename d2/db/d40/d51/f5c to d2/db/d2d/d55/f112 0
2026-03-09T00:04:22.246 INFO:tasks.workunit.client.0.vm03.stdout:3/886: creat d2/db/d56/df1/f113 x:0 0 0
2026-03-09T00:04:22.248 INFO:tasks.workunit.client.0.vm03.stdout:3/887: rename d2/l19 to d2/db/d3b/d5f/da5/d72/dbd/l114 0
2026-03-09T00:04:22.250 INFO:tasks.workunit.client.0.vm03.stdout:3/888: rename d2/db/d40/d101/d68/lac to d2/db/d40/d58/l115 0
2026-03-09T00:04:22.264 INFO:tasks.workunit.client.0.vm03.stdout:7/968: dwrite d2/d1f/f3b [0,4194304] 0
2026-03-09T00:04:22.265 INFO:tasks.workunit.client.0.vm03.stdout:9/940: dwrite d15/d1c/d36/f86 [0,4194304] 0
2026-03-09T00:04:22.265 INFO:tasks.workunit.client.0.vm03.stdout:9/941: readlink d15/d1c/d28/de1/lce 0
2026-03-09T00:04:22.274 INFO:tasks.workunit.client.0.vm03.stdout:7/969: rename d2/d4/d1e/dee/d104/d81/d96/d37/d39/dff to d2/d4/d1e/dee/d104/d54/d12b/d133 0
2026-03-09T00:04:22.279 INFO:tasks.workunit.client.0.vm03.stdout:3/889: dwrite d2/db/d3b/d5f/da5/d72/dbd/fe0 [0,4194304] 0
2026-03-09T00:04:22.280 INFO:tasks.workunit.client.0.vm03.stdout:6/948: dwrite d13/d35/d69/dee/f112 [0,4194304] 0
2026-03-09T00:04:22.283 INFO:tasks.workunit.client.0.vm03.stdout:9/942: creat d15/f135 x:0 0 0
2026-03-09T00:04:22.288 INFO:tasks.workunit.client.0.vm03.stdout:9/943: dread d15/d1c/f102 [0,4194304] 0
2026-03-09T00:04:22.294 INFO:tasks.workunit.client.0.vm03.stdout:3/890: mknod d2/db/d3b/dc2/c116 0
2026-03-09T00:04:22.294 INFO:tasks.workunit.client.0.vm03.stdout:3/891: write d2/db/d3b/d3f/db8/f10d [584930,124183] 0
2026-03-09T00:04:22.294 INFO:tasks.workunit.client.0.vm03.stdout:3/892: fsync d2/db/f24 0
2026-03-09T00:04:22.297 INFO:tasks.workunit.client.0.vm03.stdout:3/893: dread d2/db/d40/d88/f89 [0,4194304] 0
2026-03-09T00:04:22.299 INFO:tasks.workunit.client.0.vm03.stdout:9/944: symlink d15/d1c/d36/d4d/d124/d87/d93/dcf/d10c/d134/l136 0
2026-03-09T00:04:22.304 INFO:tasks.workunit.client.0.vm03.stdout:7/970: dwrite d2/d4/db7/daa/fec [0,4194304] 0
2026-03-09T00:04:22.309 INFO:tasks.workunit.client.0.vm03.stdout:6/949: getdents d13/d35/db5 0
2026-03-09T00:04:22.311 INFO:tasks.workunit.client.0.vm03.stdout:3/894: creat d2/dbf/d102/f117 x:0 0 0
2026-03-09T00:04:22.318 INFO:tasks.workunit.client.0.vm03.stdout:6/950: creat d13/d35/d74/d89/d9d/f145 x:0 0 0
2026-03-09T00:04:22.318 INFO:tasks.workunit.client.0.vm03.stdout:6/951: chown d13/d35/d71/d97/fd2 120710 1
2026-03-09T00:04:22.320 INFO:tasks.workunit.client.0.vm03.stdout:3/895: mkdir d2/db/d3b/d3f/d111/d118 0
2026-03-09T00:04:22.322 INFO:tasks.workunit.client.0.vm03.stdout:6/952: rename ce to d13/d1e/d44/d59/dec/c146 0
2026-03-09T00:04:22.322 INFO:tasks.workunit.client.0.vm03.stdout:6/953: write d13/d1e/d44/d59/f6e [4816959,118007] 0
2026-03-09T00:04:22.322 INFO:tasks.workunit.client.0.vm03.stdout:6/954: chown d13/dc4/dea/d102/fde 171451 1
2026-03-09T00:04:22.337 INFO:tasks.workunit.client.0.vm03.stdout:6/955: unlink d13/d1e/d44/f49 0
2026-03-09T00:04:22.342 INFO:tasks.workunit.client.0.vm03.stdout:6/956: mknod d13/d35/d71/d97/da5/c147 0
2026-03-09T00:04:22.342 INFO:tasks.workunit.client.0.vm03.stdout:6/957: chown d13/l2a 25219931 1
2026-03-09T00:04:22.343 INFO:tasks.workunit.client.0.vm03.stdout:3/896: rename d2/db/d3b/d3f/db8 to d2/db/d56/df1/d119 0
2026-03-09T00:04:22.346 INFO:tasks.workunit.client.0.vm03.stdout:7/971: dwrite d2/d4/db7/d67/f95 [0,4194304] 0
2026-03-09T00:04:22.349 INFO:tasks.workunit.client.0.vm03.stdout:9/945: dwrite d15/d1c/d21/d64/fc2 [0,4194304] 0
2026-03-09T00:04:22.349 INFO:tasks.workunit.client.0.vm03.stdout:9/946: chown d15/d1c/d36/d4d/fb9 96 1
2026-03-09T00:04:22.350 INFO:tasks.workunit.client.0.vm03.stdout:6/958: creat d13/d1e/d44/d59/dec/f148 x:0 0 0
2026-03-09T00:04:22.351 INFO:tasks.workunit.client.0.vm03.stdout:3/897: write d2/f8a [2232039,104906] 0
2026-03-09T00:04:22.354 INFO:tasks.workunit.client.0.vm03.stdout:6/959: dread d13/d35/f119 [0,4194304] 0
2026-03-09T00:04:22.358 INFO:tasks.workunit.client.0.vm03.stdout:7/972: symlink d2/d4/db7/daa/l134 0
2026-03-09T00:04:22.361 INFO:tasks.workunit.client.0.vm03.stdout:7/973: dread d2/d4/d1e/dee/d104/d54/f9b [0,4194304] 0
2026-03-09T00:04:22.365 INFO:tasks.workunit.client.0.vm03.stdout:9/947: mkdir d15/d1c/d36/d4d/d124/d87/d93/dcf/d137 0
2026-03-09T00:04:22.365 INFO:tasks.workunit.client.0.vm03.stdout:9/948: read - d15/f128 zero size
2026-03-09T00:04:22.365 INFO:tasks.workunit.client.0.vm03.stdout:9/949: write d15/d1c/d36/d4d/d124/d12b/f12c [572565,91196] 0
2026-03-09T00:04:22.377 INFO:tasks.workunit.client.0.vm03.stdout:3/898: symlink d2/db/d2d/d55/l11a 0
2026-03-09T00:04:22.377 INFO:tasks.workunit.client.0.vm03.stdout:3/899: creat d2/db/de6/f11b x:0 0 0
2026-03-09T00:04:22.377 INFO:tasks.workunit.client.0.vm03.stdout:6/960: link d13/d35/d74/d89/db3/ff6 d13/d1e/d44/d10b/f149 0
2026-03-09T00:04:22.377 INFO:tasks.workunit.client.0.vm03.stdout:7/974: rename d2/d4/d1e/dee/d104/d54/d8d/dad/d9c/l45 to d2/d4/db7/d67/l135 0
2026-03-09T00:04:22.389 INFO:tasks.workunit.client.0.vm03.stdout:3/900: dread d2/db/d2d/f36 [0,4194304] 0
2026-03-09T00:04:22.389 INFO:tasks.workunit.client.0.vm03.stdout:3/901: readlink d2/db/d40/d58/l115 0
2026-03-09T00:04:22.397 INFO:tasks.workunit.client.0.vm03.stdout:6/961: rename d13/d35/d74/d89/f11c to d13/d1e/d44/d59/f14a 0
2026-03-09T00:04:22.406 INFO:tasks.workunit.client.0.vm03.stdout:6/962: mknod d13/d35/d72/c14b 0
2026-03-09T00:04:22.430 INFO:tasks.workunit.client.0.vm03.stdout:6/963: dread d13/d1e/f34 [4194304,4194304] 0
2026-03-09T00:04:22.430 INFO:tasks.workunit.client.0.vm03.stdout:6/964: creat d13/d35/d72/f14c x:0 0 0
2026-03-09T00:04:22.430 INFO:tasks.workunit.client.0.vm03.stdout:6/965: dread - d13/d1e/f142 zero size
2026-03-09T00:04:22.431 INFO:tasks.workunit.client.0.vm03.stdout:6/966: mknod d13/dc4/c14d 0
2026-03-09T00:04:22.431 INFO:tasks.workunit.client.0.vm03.stdout:6/967: stat d13/d35/d71/d97/fd2 0
2026-03-09T00:04:22.432 INFO:tasks.workunit.client.0.vm03.stdout:6/968: mkdir d13/d1e/d44/d10b/d14e 0
2026-03-09T00:04:22.432 INFO:tasks.workunit.client.0.vm03.stdout:6/969: chown d13/d35/d74/d89/db3/f129 985 1
2026-03-09T00:04:22.432 INFO:tasks.workunit.client.0.vm03.stdout:6/970: chown d13/d1e/d44/l53 8330069 1
2026-03-09T00:04:22.432 INFO:tasks.workunit.client.0.vm03.stdout:6/971: creat d13/d1e/d44/d4a/f14f x:0 0 0
2026-03-09T00:04:22.435 INFO:tasks.workunit.client.0.vm03.stdout:7/975: dwrite d2/d4/d1e/dee/f12d [0,4194304] 0
2026-03-09T00:04:22.437 INFO:tasks.workunit.client.0.vm03.stdout:9/950: dwrite d15/d1c/d21/d64/f50 [0,4194304] 0
2026-03-09T00:04:22.437 INFO:tasks.workunit.client.0.vm03.stdout:9/951: dread - d15/f135 zero size
2026-03-09T00:04:22.437 INFO:tasks.workunit.client.0.vm03.stdout:9/952: write d15/d1c/d28/d6e/da2/ff2 [3942544,70496] 0
2026-03-09T00:04:22.440 INFO:tasks.workunit.client.0.vm03.stdout:7/976: mknod d2/d4/db7/c136 0
2026-03-09T00:04:22.441 INFO:tasks.workunit.client.0.vm03.stdout:9/953: creat d15/d1c/d36/d4d/d124/f138 x:0 0 0
2026-03-09T00:04:22.445 INFO:tasks.workunit.client.0.vm03.stdout:7/977: rename d2/d4/d1e/dee/d104/d81 to d2/d4/db7/d137 0
2026-03-09T00:04:22.447 INFO:tasks.workunit.client.0.vm03.stdout:7/978: rmdir d2/d4/db7/d137/d96/d80 39
2026-03-09T00:04:22.449 INFO:tasks.workunit.client.0.vm03.stdout:3/902: dwrite d2/db/d2d/f36 [0,4194304] 0
2026-03-09T00:04:22.449 INFO:tasks.workunit.client.0.vm03.stdout:3/903: creat d2/db/d40/f11c x:0 0 0
2026-03-09T00:04:22.451 INFO:tasks.workunit.client.0.vm03.stdout:7/979: link d2/d4/c1c d2/d1f/d3a/c138 0
2026-03-09T00:04:22.456 INFO:tasks.workunit.client.0.vm03.stdout:7/980: symlink d2/d4/db7/daa/l139 0
2026-03-09T00:04:22.457 INFO:tasks.workunit.client.0.vm03.stdout:7/981: mknod d2/d1f/c13a 0
2026-03-09T00:04:22.459 INFO:tasks.workunit.client.0.vm03.stdout:7/982: symlink d2/d4/d1e/dee/d104/d54/d12b/d133/l13b 0
2026-03-09T00:04:22.460 INFO:tasks.workunit.client.0.vm03.stdout:7/983: rename d2/d4/db7/d137/d96/d80/l9d to d2/d4/d1e/dee/d104/d54/d8d/dad/d9c/l13c 0
2026-03-09T00:04:22.462 INFO:tasks.workunit.client.0.vm03.stdout:6/972: dread d13/d1e/d44/d59/f6e [0,4194304] 0
2026-03-09T00:04:22.462 INFO:tasks.workunit.client.0.vm03.stdout:7/984: mknod d2/d4/d1e/d78/c13d 0
2026-03-09T00:04:22.462 INFO:tasks.workunit.client.0.vm03.stdout:9/954: fdatasync fb 0
2026-03-09T00:04:22.462 INFO:tasks.workunit.client.0.vm03.stdout:9/955: readlink d15/d1c/d36/d4d/d124/d87/d93/d115/l133 0
2026-03-09T00:04:22.466 INFO:tasks.workunit.client.0.vm03.stdout:6/973: rename d13/d1e/c64 to d13/d1e/d44/d59/c150 0
2026-03-09T00:04:22.468 INFO:tasks.workunit.client.0.vm03.stdout:7/985: rename d2/d4/d1e/d5e/lb5 to d2/d4/d1e/dee/l13e 0
2026-03-09T00:04:22.468 INFO:tasks.workunit.client.0.vm03.stdout:7/986: chown d2/d4/d1e/dee/d104/l4f 1128706 1
2026-03-09T00:04:22.469 INFO:tasks.workunit.client.0.vm03.stdout:6/974: mknod d13/d35/d71/d97/da5/db1/c151 0
2026-03-09T00:04:22.472 INFO:tasks.workunit.client.0.vm03.stdout:7/987: read d2/d4/db7/d137/d96/d8e/db2/ffd [108154,97920] 0
2026-03-09T00:04:22.472 INFO:tasks.workunit.client.0.vm03.stdout:7/988: mkdir d2/d4/d1e/dee/d104/d54/d12b/d133/d13f 0
2026-03-09T00:04:22.473 INFO:tasks.workunit.client.0.vm03.stdout:7/989: truncate d2/f50 6967735 0
2026-03-09T00:04:22.475 INFO:tasks.workunit.client.0.vm03.stdout:7/990: symlink d2/d4/d1e/d5e/daf/l140 0
2026-03-09T00:04:22.530 INFO:tasks.workunit.client.0.vm03.stdout:3/904: dwrite d2/db/d40/d58/fd0 [0,4194304] 0
2026-03-09T00:04:22.530 INFO:tasks.workunit.client.0.vm03.stdout:3/905: creat d2/db/f11d x:0 0 0
2026-03-09T00:04:22.531 INFO:tasks.workunit.client.0.vm03.stdout:3/906: rmdir d2/db/d3b 39
2026-03-09T00:04:22.531 INFO:tasks.workunit.client.0.vm03.stdout:3/907: rename d2/db/d2d/d55/da9/c107 to d2/dbf/dfa/c11e 0
2026-03-09T00:04:22.531 INFO:tasks.workunit.client.0.vm03.stdout:3/908: fsync d2/db/d3b/d5d/fba 0
2026-03-09T00:04:22.532 INFO:tasks.workunit.client.0.vm03.stdout:3/909: symlink d2/l11f 0
2026-03-09T00:04:22.532 INFO:tasks.workunit.client.0.vm03.stdout:3/910: write d2/db/d40/d51/fe7 [544219,81616] 0
2026-03-09T00:04:22.532 INFO:tasks.workunit.client.0.vm03.stdout:3/911: rmdir d2/db/d40/d51/da2/db1 39
2026-03-09T00:04:22.533 INFO:tasks.workunit.client.0.vm03.stdout:3/912: mkdir d2/db/d3b/d3f/daf/d120 0
2026-03-09T00:04:22.542 INFO:tasks.workunit.client.0.vm03.stdout:3/913: dread d2/db/d40/f4a [0,4194304] 0
2026-03-09T00:04:22.542 INFO:tasks.workunit.client.0.vm03.stdout:3/914: read d2/db/d3b/f95 [125105,130701] 0
2026-03-09T00:04:22.546 INFO:tasks.workunit.client.0.vm03.stdout:3/915: mknod d2/db/d3b/d3f/d111/c121 0
2026-03-09T00:04:22.547 INFO:tasks.workunit.client.0.vm03.stdout:3/916: rename d2/db/d3b/d5f/da5/d72/dbd/fee to d2/db/d2d/f122 0
2026-03-09T00:04:22.553 INFO:tasks.workunit.client.0.vm03.stdout:9/956: dwrite d15/d1c/d36/f78 [4194304,4194304] 0
2026-03-09T00:04:22.553 INFO:tasks.workunit.client.0.vm03.stdout:6/975: dwrite d13/d1e/d44/d4a/d52/f54 [0,4194304] 0
2026-03-09T00:04:22.553 INFO:tasks.workunit.client.0.vm03.stdout:9/957: creat d15/d77/de2/f139 x:0 0 0
2026-03-09T00:04:22.553 INFO:tasks.workunit.client.0.vm03.stdout:9/958: fdatasync d15/d1c/d36/d4d/d124/d87/d93/f7e 0
2026-03-09T00:04:22.554 INFO:tasks.workunit.client.0.vm03.stdout:7/991: dwrite d2/d4/db7/d67/f70 [0,4194304] 0
2026-03-09T00:04:22.556 INFO:tasks.workunit.client.0.vm03.stdout:3/917: write d2/db/d3b/d3f/f46 [5030506,79686] 0
2026-03-09T00:04:22.570 INFO:tasks.workunit.client.0.vm03.stdout:6/976: truncate d13/d35/d74/d89/db3/f129 2904216 0
2026-03-09T00:04:22.570 INFO:tasks.workunit.client.0.vm03.stdout:7/992: symlink d2/d4/db7/d137/d96/d88/l141 0
2026-03-09T00:04:22.576 INFO:tasks.workunit.client.0.vm03.stdout:3/918: creat d2/db/d40/d51/f123 x:0 0 0
2026-03-09T00:04:22.630 INFO:tasks.workunit.client.0.vm03.stdout:9/959: dwrite d15/d7f/f12f [0,4194304] 0
2026-03-09T00:04:22.667 INFO:tasks.workunit.client.0.vm03.stdout:6/977: dwrite d13/d1e/d44/d59/dec/f122 [0,4194304] 0
2026-03-09T00:04:22.667 INFO:tasks.workunit.client.0.vm03.stdout:6/978: dread - d13/d1e/d44/d59/d77/ff4 zero size
2026-03-09T00:04:22.667 INFO:tasks.workunit.client.0.vm03.stdout:7/993: dwrite d2/d4/d1e/d5e/d7e/fb4 [0,4194304] 0
2026-03-09T00:04:22.667 INFO:tasks.workunit.client.0.vm03.stdout:3/919: dwrite d2/db/f24 [4194304,4194304] 0
2026-03-09T00:04:22.670 INFO:tasks.workunit.client.0.vm03.stdout:9/960: dwrite d15/f17 [0,4194304] 0
2026-03-09T00:04:22.674 INFO:tasks.workunit.client.0.vm03.stdout:6/979: mknod d13/d1e/d44/d59/d77/c152 0
2026-03-09T00:04:22.677 INFO:tasks.workunit.client.0.vm03.stdout:7/994: truncate d2/f3 2764141 0
2026-03-09T00:04:22.679 INFO:tasks.workunit.client.0.vm03.stdout:6/980: link d13/d35/d74/d89/db3/cd0 d13/d35/d74/d89/d9d/c153 0
2026-03-09T00:04:22.679 INFO:tasks.workunit.client.0.vm03.stdout:6/981: getdents d13/d35/db5 0
2026-03-09T00:04:22.679 INFO:tasks.workunit.client.0.vm03.stdout:6/982: chown d13/d35/d74/d89/l100 27736918 1
2026-03-09T00:04:22.679 INFO:tasks.workunit.client.0.vm03.stdout:6/983: write d13/d1e/d44/d59/d77/ff4 [524203,97652] 0
2026-03-09T00:04:22.690 INFO:tasks.workunit.client.0.vm03.stdout:6/984: dread d13/f5d [0,4194304] 0
2026-03-09T00:04:22.690 INFO:tasks.workunit.client.0.vm03.stdout:6/985: mknod d13/d35/d71/d97/da5/dc1/c154 0
2026-03-09T00:04:22.690 INFO:tasks.workunit.client.0.vm03.stdout:6/986: fsync f2 0
2026-03-09T00:04:22.691 INFO:tasks.workunit.client.0.vm03.stdout:6/987: getdents d13/d35/d72/dcc 0
2026-03-09T00:04:22.691 INFO:tasks.workunit.client.0.vm03.stdout:6/988: fdatasync d13/dc4/dea/dd7/f137 0
2026-03-09T00:04:22.692 INFO:tasks.workunit.client.0.vm03.stdout:6/989: mkdir d13/d35/d69/dee/d155 0
2026-03-09T00:04:22.733 INFO:tasks.workunit.client.0.vm03.stdout:9/961: dwrite d15/d1c/d36/d4d/d124/d87/d93/ff1 [0,4194304] 0
2026-03-09T00:04:22.734 INFO:tasks.workunit.client.0.vm03.stdout:9/962: unlink d15/d1c/d36/d4d/d124/dab/lb7 0
2026-03-09T00:04:22.734 INFO:tasks.workunit.client.0.vm03.stdout:9/963: chown d15/d1c/d36/d4d/d124/d87/d93/fba 23574 1
2026-03-09T00:04:22.736 INFO:tasks.workunit.client.0.vm03.stdout:9/964: mknod d15/d1c/d21/d75/c13a 0
2026-03-09T00:04:22.738 INFO:tasks.workunit.client.0.vm03.stdout:9/965: rename d15/d7f/fdf to d15/d1c/d21/f13b 0
2026-03-09T00:04:22.740 INFO:tasks.workunit.client.0.vm03.stdout:9/966: creat d15/d77/dfb/f13c x:0 0 0
2026-03-09T00:04:22.740 INFO:tasks.workunit.client.0.vm03.stdout:9/967: write d15/f128 [834993,17044] 0
2026-03-09T00:04:22.741 INFO:tasks.workunit.client.0.vm03.stdout:3/920: dwrite d2/f8c [0,4194304] 0
2026-03-09T00:04:22.742 INFO:tasks.workunit.client.0.vm03.stdout:6/990: dwrite d13/d1e/d44/d59/f6c [0,4194304] 0
2026-03-09T00:04:22.742 INFO:tasks.workunit.client.0.vm03.stdout:9/968: creat d15/d1c/d36/d4d/d124/d87/d93/d115/f13d x:0 0 0
2026-03-09T00:04:22.742 INFO:tasks.workunit.client.0.vm03.stdout:9/969: chown d15/d1c/d21/f4c 1033712 1
2026-03-09T00:04:22.748 INFO:tasks.workunit.client.0.vm03.stdout:3/921: rename d2/db/d6a/dc6/lcc to d2/db/d40/d101/db5/l124 0
2026-03-09T00:04:22.748 INFO:tasks.workunit.client.0.vm03.stdout:3/922: write d2/db/d3b/d5d/fdf [884757,104295] 0
2026-03-09T00:04:22.748 INFO:tasks.workunit.client.0.vm03.stdout:3/923: write d2/db/d2d/fae [4776979,119378] 0
2026-03-09T00:04:22.750 INFO:tasks.workunit.client.0.vm03.stdout:6/991: mkdir d13/d1e/d44/d4a/d128/d156 0
2026-03-09T00:04:22.758 INFO:tasks.workunit.client.0.vm03.stdout:7/995: dread d2/d4/d1e/fae [0,4194304] 0
2026-03-09T00:04:22.759 INFO:tasks.workunit.client.0.vm03.stdout:7/996: dread - d2/d4/d1e/f11e zero size
2026-03-09T00:04:22.759 INFO:tasks.workunit.client.0.vm03.stdout:7/997: write d2/d4/d8c/fce [730421,54581] 0
2026-03-09T00:04:22.759 INFO:tasks.workunit.client.0.vm03.stdout:7/998: dread d2/d4/db7/d137/d96/d8e/db2/ffd [0,4194304] 0
2026-03-09T00:04:22.759 INFO:tasks.workunit.client.0.vm03.stdout:7/999: write d2/d4/db7/d137/d96/fbb [2209901,67709] 0
2026-03-09T00:04:22.760 INFO:tasks.workunit.client.0.vm03.stdout:9/970: rename d15/d1c/d21/d75/de0 to d15/d1c/d36/d4d/d124/d87/d93/dcf/d13e 0
2026-03-09T00:04:22.763 INFO:tasks.workunit.client.0.vm03.stdout:6/992: link d13/d35/d71/d97/da5/db1/c151 d13/d35/d74/d89/d9d/c157 0
2026-03-09T00:04:22.769 INFO:tasks.workunit.client.0.vm03.stdout:6/993: write d13/d35/d69/ff7 [29834,113709] 0
2026-03-09T00:04:22.769 INFO:tasks.workunit.client.0.vm03.stdout:9/971: mknod d15/d1c/d36/d4d/d124/d87/d93/c13f 0
2026-03-09T00:04:22.769 INFO:tasks.workunit.client.0.vm03.stdout:9/972: fdatasync d15/d7f/f88 0
2026-03-09T00:04:22.769 INFO:tasks.workunit.client.0.vm03.stdout:9/973: dread d15/d1c/d21/f4c [0,4194304] 0
2026-03-09T00:04:22.773 INFO:tasks.workunit.client.0.vm03.stdout:6/994: unlink d13/d1e/d44/d4a/d52/f54 0
2026-03-09T00:04:22.773 INFO:tasks.workunit.client.0.vm03.stdout:6/995: fsync d13/d1e/f3e 0
2026-03-09T00:04:22.779 INFO:tasks.workunit.client.0.vm03.stdout:6/996: rename d13/d1e/fc3 to d13/d35/d74/d89/d9d/f158 0
2026-03-09T00:04:22.782 INFO:tasks.workunit.client.0.vm03.stdout:6/997: symlink d13/d1e/d44/d59/d77/d114/d13a/l159 0
2026-03-09T00:04:22.787 INFO:tasks.workunit.client.0.vm03.stdout:6/998: write d13/d1e/f9f [1715636,19120] 0
2026-03-09T00:04:22.792 INFO:tasks.workunit.client.0.vm03.stdout:6/999: link d13/d1e/f142 d13/d1e/d44/d59/dec/d62/dfb/f15a 0
2026-03-09T00:04:22.800 INFO:tasks.workunit.client.0.vm03.stdout:3/924: dwrite d2/db/ff9 [0,4194304] 0
2026-03-09T00:04:22.804 INFO:tasks.workunit.client.0.vm03.stdout:9/974: dwrite d15/d1c/d36/d4d/d124/f65 [0,4194304] 0
2026-03-09T00:04:22.805 INFO:tasks.workunit.client.0.vm03.stdout:9/975: truncate d15/d1c/d21/d64/fc2 2114035 0
2026-03-09T00:04:22.805 INFO:tasks.workunit.client.0.vm03.stdout:9/976: chown d15/d77/dfb/f12d 65445 1
2026-03-09T00:04:22.805 INFO:tasks.workunit.client.0.vm03.stdout:9/977: fsync d15/d1c/d21/fcd 0
2026-03-09T00:04:22.805 INFO:tasks.workunit.client.0.vm03.stdout:9/978: write d15/d1c/d36/d4d/d124/f138 [870908,122170] 0
2026-03-09T00:04:22.805 INFO:tasks.workunit.client.0.vm03.stdout:9/979: write d15/d1c/d28/dda/f129 [1009845,18529] 0
2026-03-09T00:04:22.805 INFO:tasks.workunit.client.0.vm03.stdout:9/980: stat d15/d1c/d36/f78 0
2026-03-09T00:04:22.805 INFO:tasks.workunit.client.0.vm03.stdout:9/981: truncate d15/d1c/d28/d6e/da2/fc0 333034 0
2026-03-09T00:04:22.805 INFO:tasks.workunit.client.0.vm03.stdout:9/982: write d15/d1c/d36/d4d/d124/dab/df6/f125 [708777,59470] 0
2026-03-09T00:04:22.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:22 vm03.local ceph-mon[52346]: pgmap v14: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail; 158 MiB/s rd, 190 MiB/s wr, 288 op/s
2026-03-09T00:04:22.853 INFO:tasks.workunit.client.0.vm03.stdout:9/983: dwrite d15/d1c/d36/d4d/d124/f80 [0,4194304] 0
2026-03-09T00:04:22.853 INFO:tasks.workunit.client.0.vm03.stdout:9/984: readlink d15/d1c/d21/d64/l10f 0
2026-03-09T00:04:22.853 INFO:tasks.workunit.client.0.vm03.stdout:9/985: chown fb 119295925 1
2026-03-09T00:04:22.854 INFO:tasks.workunit.client.0.vm03.stdout:3/925: dwrite d2/db/d40/d88/f89 [0,4194304] 0
2026-03-09T00:04:22.855 INFO:tasks.workunit.client.0.vm03.stdout:9/986: rmdir d15/d1c/d36/d4d/d124/d87/d93/dcf/d10c/d134 39
2026-03-09T00:04:22.863 INFO:tasks.workunit.client.0.vm03.stdout:3/926: dread d2/db/d40/d51/fe7 [0,4194304] 0
2026-03-09T00:04:22.863 INFO:tasks.workunit.client.0.vm03.stdout:3/927: dread - d2/db/d40/f103 zero size
2026-03-09T00:04:22.864 INFO:tasks.workunit.client.0.vm03.stdout:9/987: rename d15/d1c/d28/le8 to d15/d1c/d21/l140 0
2026-03-09T00:04:22.868 INFO:tasks.workunit.client.0.vm03.stdout:3/928: link d2/ld1 d2/db/d2d/d55/l125 0
2026-03-09T00:04:22.868 INFO:tasks.workunit.client.0.vm03.stdout:3/929: chown d2/db/d3b/l91 18875608 1
2026-03-09T00:04:22.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:22 vm06.local ceph-mon[58395]: pgmap v14: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail; 158 MiB/s rd, 190 MiB/s wr, 288 op/s
2026-03-09T00:04:22.873 INFO:tasks.workunit.client.0.vm03.stdout:9/988: truncate d15/d1c/d36/d4d/d124/dab/fe3 159540 0
2026-03-09T00:04:22.873 INFO:tasks.workunit.client.0.vm03.stdout:9/989: creat d15/d1c/d28/de1/ded/f141 x:0 0 0
2026-03-09T00:04:22.879 INFO:tasks.workunit.client.0.vm03.stdout:3/930: unlink d2/db/d3b/l91 0
2026-03-09T00:04:22.885 INFO:tasks.workunit.client.0.vm03.stdout:9/990: rename d15/d1c/d36/d4d/dc4/lfe to d15/d1c/d36/d4d/d124/d87/l142 0
2026-03-09T00:04:22.885 INFO:tasks.workunit.client.0.vm03.stdout:3/931: truncate d2/fc5 3967324 0
2026-03-09T00:04:22.885 INFO:tasks.workunit.client.0.vm03.stdout:3/932: write d2/db/d3b/dc2/f10e [1009435,25807] 0
2026-03-09T00:04:22.886 INFO:tasks.workunit.client.0.vm03.stdout:9/991: mkdir d15/d1c/d36/d4d/d124/dab/df6/d10e/d12a/d143 0
2026-03-09T00:04:22.887 INFO:tasks.workunit.client.0.vm03.stdout:3/933: creat d2/f126 x:0 0 0
2026-03-09T00:04:22.892 INFO:tasks.workunit.client.0.vm03.stdout:9/992: rename d15/d77/fbd to d15/d1c/d36/d4d/d124/d87/d93/dcf/f144 0
2026-03-09T00:04:22.892 INFO:tasks.workunit.client.0.vm03.stdout:9/993: stat d15/d1c/d21/d64/c40 0
2026-03-09T00:04:22.892 INFO:tasks.workunit.client.0.vm03.stdout:9/994: write d15/d7f/f12f [4291880,129941] 0
2026-03-09T00:04:22.897 INFO:tasks.workunit.client.0.vm03.stdout:9/995: link d15/d1c/d28/de1/ded/f141 d15/d1c/d36/d4d/d11d/f145 0
2026-03-09T00:04:22.935 INFO:tasks.workunit.client.0.vm03.stdout:9/996: dwrite d15/f44 [0,4194304] 0
2026-03-09T00:04:22.935 INFO:tasks.workunit.client.0.vm03.stdout:3/934: dwrite d2/db/d3b/d5f/d65/f90 [0,4194304] 0
2026-03-09T00:04:22.935 INFO:tasks.workunit.client.0.vm03.stdout:3/935: chown d2/db/d56/df1/d119/fc9 3858 1
2026-03-09T00:04:22.935 INFO:tasks.workunit.client.0.vm03.stdout:3/936: creat d2/db/d40/d51/f127 x:0 0 0
2026-03-09T00:04:22.935 INFO:tasks.workunit.client.0.vm03.stdout:3/937: creat d2/dbf/d102/f128 x:0 0 0
2026-03-09T00:04:22.935 INFO:tasks.workunit.client.0.vm03.stdout:3/938: truncate d2/dbf/d102/f117 937234 0
2026-03-09T00:04:22.937 INFO:tasks.workunit.client.0.vm03.stdout:9/997: creat d15/d1c/d36/d4d/d124/dab/f146 x:0 0 0
2026-03-09T00:04:22.940 INFO:tasks.workunit.client.0.vm03.stdout:3/939: symlink d2/db/d3b/d5f/da5/d72/dbd/l129 0
2026-03-09T00:04:22.944 INFO:tasks.workunit.client.0.vm03.stdout:3/940: read d2/f16 [339505,126745] 0
2026-03-09T00:04:22.944 INFO:tasks.workunit.client.0.vm03.stdout:3/941: read d2/db/f7e [853852,15961] 0
2026-03-09T00:04:22.945 INFO:tasks.workunit.client.0.vm03.stdout:9/998: rename d15/d1c/d36/fb1 to d15/d1c/d36/d4d/d124/dab/f147 0
2026-03-09T00:04:22.947 INFO:tasks.workunit.client.0.vm03.stdout:9/999: mkdir d15/d1c/d36/d4d/dc4/d148 0
2026-03-09T00:04:22.950 INFO:tasks.workunit.client.0.vm03.stdout:3/942: rename d2/db/d3b/d3f/f46 to d2/db/f12a 0
2026-03-09T00:04:22.978 INFO:tasks.workunit.client.0.vm03.stdout:3/943: dwrite d2/f30 [8388608,4194304] 0
2026-03-09T00:04:22.982 INFO:tasks.workunit.client.0.vm03.stdout:3/944: creat d2/db/d3b/d3f/d111/f12b x:0 0 0
2026-03-09T00:04:22.983 INFO:tasks.workunit.client.0.vm03.stdout:3/945: mknod d2/dbf/dfa/c12c 0
2026-03-09T00:04:22.986 INFO:tasks.workunit.client.0.vm03.stdout:3/946: mknod d2/db/d6a/dc6/c12d 0
2026-03-09T00:04:23.016 INFO:tasks.workunit.client.0.vm03.stdout:3/947: dwrite d2/f16 [0,4194304] 0
2026-03-09T00:04:23.016 INFO:tasks.workunit.client.0.vm03.stdout:3/948: readlink d2/db/d3b/d5f/da5/d72/d96/la7 0
2026-03-09T00:04:23.019 INFO:tasks.workunit.client.0.vm03.stdout:3/949: unlink d2/db/d56/fb4 0
2026-03-09T00:04:23.019 INFO:tasks.workunit.client.0.vm03.stdout:3/950: symlink d2/db/d3b/dc2/l12e 0
2026-03-09T00:04:23.019 INFO:tasks.workunit.client.0.vm03.stdout:3/951: chown d2/db/d3b/d5f/ld5 61679070 1
2026-03-09T00:04:23.024 INFO:tasks.workunit.client.0.vm03.stdout:3/952: read d2/db/d3b/d5d/f8d [3675022,74964] 0
2026-03-09T00:04:23.024 INFO:tasks.workunit.client.0.vm03.stdout:3/953: write d2/db/d56/fd4 [4521034,86432] 0
2026-03-09T00:04:23.024 INFO:tasks.workunit.client.0.vm03.stdout:3/954: creat d2/db/d3b/d5f/da5/d72/d96/f12f x:0 0 0
2026-03-09T00:04:23.026 INFO:tasks.workunit.client.0.vm03.stdout:3/955: creat d2/db/d3b/d3f/daf/f130 x:0 0 0
2026-03-09T00:04:23.058 INFO:tasks.workunit.client.0.vm03.stdout:3/956: dwrite d2/f8a [0,4194304] 0
2026-03-09T00:04:23.058 INFO:tasks.workunit.client.0.vm03.stdout:3/957: write d2/db/d56/df1/d119/f10d [1144808,53999] 0
2026-03-09T00:04:23.058 INFO:tasks.workunit.client.0.vm03.stdout:3/958: dread - d2/db/d40/d58/ff5 zero size
2026-03-09T00:04:23.062 INFO:tasks.workunit.client.0.vm03.stdout:3/959: mkdir d2/db/d56/d131 0
2026-03-09T00:04:23.062 INFO:tasks.workunit.client.0.vm03.stdout:3/960: getdents d2/db/d40/d51/da2/de9 0
2026-03-09T00:04:23.064 INFO:tasks.workunit.client.0.vm03.stdout:3/961: rmdir d2/dbf/d102 39
2026-03-09T00:04:23.064 INFO:tasks.workunit.client.0.vm03.stdout:3/962: dread - d2/db/d40/d51/f127 zero size
2026-03-09T00:04:23.064 INFO:tasks.workunit.client.0.vm03.stdout:3/963: mknod d2/db/d3b/d5f/da5/dd8/c132 0
2026-03-09T00:04:23.064 INFO:tasks.workunit.client.0.vm03.stdout:3/964: fsync d2/db/d40/d101/db5/fd3 0
2026-03-09T00:04:23.064 INFO:tasks.workunit.client.0.vm03.stdout:3/965: fdatasync d2/db/d40/d58/ff5 0
2026-03-09T00:04:23.092 INFO:tasks.workunit.client.0.vm03.stdout:3/966: dwrite d2/db/d3b/d5f/da5/d72/f86 [0,4194304] 0
2026-03-09T00:04:23.092 INFO:tasks.workunit.client.0.vm03.stdout:3/967: dread - d2/db/d40/d51/f127 zero size
2026-03-09T00:04:23.097 INFO:tasks.workunit.client.0.vm03.stdout:3/968: write d2/db/d3b/d5f/d65/f90 [3415537,12060] 0
2026-03-09T00:04:23.097 INFO:tasks.workunit.client.0.vm03.stdout:3/969: chown d2/db/d40/d101/d68/f10f 328201 1
2026-03-09T00:04:23.098 INFO:tasks.workunit.client.0.vm03.stdout:3/970: creat d2/db/d2d/dc7/f133 x:0 0 0
2026-03-09T00:04:23.098 INFO:tasks.workunit.client.0.vm03.stdout:3/971: chown d2/db/de6 23 1
2026-03-09T00:04:23.125 INFO:tasks.workunit.client.0.vm03.stdout:3/972: dwrite f1 [0,4194304] 0
2026-03-09T00:04:23.125 INFO:tasks.workunit.client.0.vm03.stdout:3/973: readlink d2/db/d3b/d5f/da5/d72/d96/la7 0
2026-03-09T00:04:23.125 INFO:tasks.workunit.client.0.vm03.stdout:3/974: stat d2/db/d3b/d5d/fc0 0
2026-03-09T00:04:23.127 INFO:tasks.workunit.client.0.vm03.stdout:3/975: symlink d2/db/d40/d51/l134 0
2026-03-09T00:04:23.154 INFO:tasks.workunit.client.0.vm03.stdout:3/976: dwrite d2/db/d56/df1/d119/fc8 [0,4194304] 0
2026-03-09T00:04:23.157 INFO:tasks.workunit.client.0.vm03.stdout:3/977: link d2/db/d40/d88/c9d d2/db/d3b/d3f/daf/c135 0
2026-03-09T00:04:23.158 INFO:tasks.workunit.client.0.vm03.stdout:3/978: fsync d2/db/de6/f11b 0
2026-03-09T00:04:23.158 INFO:tasks.workunit.client.0.vm03.stdout:3/979: readlink d2/db/d2d/d55/l125 0
2026-03-09T00:04:23.158 INFO:tasks.workunit.client.0.vm03.stdout:3/980: creat d2/db/d56/df1/f136 x:0 0 0
2026-03-09T00:04:23.158 INFO:tasks.workunit.client.0.vm03.stdout:3/981: stat d2/db/l33 0
2026-03-09T00:04:23.158 INFO:tasks.workunit.client.0.vm03.stdout:3/982: creat d2/db/d40/d101/db5/f137 x:0 0 0
2026-03-09T00:04:23.158 INFO:tasks.workunit.client.0.vm03.stdout:3/983: readlink d2/db/d6a/l71 0
2026-03-09T00:04:23.158 INFO:tasks.workunit.client.0.vm03.stdout:3/984: write d2/db/d40/d101/d68/f10f [1014075,26740] 0
2026-03-09T00:04:23.158 INFO:tasks.workunit.client.0.vm03.stdout:3/985: dread - d2/db/d40/d88/fde zero size
2026-03-09T00:04:23.158 INFO:tasks.workunit.client.0.vm03.stdout:3/986: read d2/f9 [3921132,76630] 0
2026-03-09T00:04:23.158 INFO:tasks.workunit.client.0.vm03.stdout:3/987: creat d2/db/d3b/d5d/f138 x:0 0 0
2026-03-09T00:04:23.160 INFO:tasks.workunit.client.0.vm03.stdout:3/988: symlink d2/db/d40/d51/da2/de9/l139 0
2026-03-09T00:04:23.160 INFO:tasks.workunit.client.0.vm03.stdout:3/989: creat d2/db/d40/f13a x:0 0 0
2026-03-09T00:04:23.189 INFO:tasks.workunit.client.0.vm03.stdout:3/990: dwrite d2/db/d2d/f36 [0,4194304] 0
2026-03-09T00:04:23.189 INFO:tasks.workunit.client.0.vm03.stdout:3/991: chown d2/db/d6a/dc6 169 1
2026-03-09T00:04:23.189 INFO:tasks.workunit.client.0.vm03.stdout:3/992: fsync d2/dbf/fc3 0
2026-03-09T00:04:23.189 INFO:tasks.workunit.client.0.vm03.stdout:3/993: chown d2/f30 422046 1
2026-03-09T00:04:23.193 INFO:tasks.workunit.client.0.vm03.stdout:3/994: dread d2/db/d2d/f8b [0,4194304] 0
2026-03-09T00:04:23.194 INFO:tasks.workunit.client.0.vm03.stdout:3/995: getdents d2/db/d40/d51 0
2026-03-09T00:04:23.203 INFO:tasks.workunit.client.0.vm03.stdout:3/996: write d2/db/d2d/f54 [856419,124082] 0
2026-03-09T00:04:23.208 INFO:tasks.workunit.client.0.vm03.stdout:3/997: write d2/db/d40/d58/fd0 [3863479,20706] 0
2026-03-09T00:04:23.236 INFO:tasks.workunit.client.0.vm03.stdout:3/998: dwrite d2/f16 [0,4194304] 0
2026-03-09T00:04:23.236 INFO:tasks.workunit.client.0.vm03.stdout:3/999: mkdir d2/db/d3b/dc2/d13b 0
2026-03-09T00:04:23.239 INFO:tasks.workunit.client.0.vm03.stderr:+ rm -rf -- ./tmp.Tq6zp6bsRi
2026-03-09T00:04:23.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:23 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:23.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:23 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:23.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:23 vm03.local ceph-mon[52346]: Upgrade: Updating node-exporter.vm06 (2/2)
2026-03-09T00:04:23.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:23 vm03.local ceph-mon[52346]: Deploying daemon node-exporter.vm06 on vm06
2026-03-09T00:04:23.889 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:23 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:23.889 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:23 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:23.889 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:23 vm06.local ceph-mon[58395]: Upgrade: Updating node-exporter.vm06 (2/2)
2026-03-09T00:04:23.889 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:23 vm06.local ceph-mon[58395]: Deploying daemon node-exporter.vm06 on vm06
2026-03-09T00:04:24.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:24 vm06.local ceph-mon[58395]: pgmap v15: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail; 103 MiB/s rd, 122 MiB/s wr, 190 op/s
2026-03-09T00:04:25.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:24 vm03.local ceph-mon[52346]: pgmap v15: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail; 103 MiB/s rd, 122 MiB/s wr, 190 op/s
2026-03-09T00:04:26.650 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:26 vm06.local ceph-mon[58395]: pgmap v16: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail; 103 MiB/s rd, 122 MiB/s wr, 190 op/s
2026-03-09T00:04:26.651 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:26 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:26.651 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:26 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:04:26.973 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:26 vm03.local ceph-mon[52346]: pgmap v16: 65 pgs: 65 active+clean; 4.1 GiB data, 14 GiB used, 106 GiB / 120 GiB avail; 103 MiB/s rd, 122 MiB/s wr, 190 op/s
2026-03-09T00:04:26.973 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:26 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:26.973 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:26 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:26.973 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:26 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:04:28.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:28 vm03.local ceph-mon[52346]: pgmap v17: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 125 MiB/s rd, 156 MiB/s wr, 283 op/s
2026-03-09T00:04:28.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:28 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:28.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:28 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:28.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:28 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:28.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:28 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:28.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:28 vm06.local ceph-mon[58395]: pgmap v17: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 125 MiB/s rd, 156 MiB/s wr, 283 op/s
2026-03-09T00:04:28.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:28 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:28.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:28 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:28.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:28 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:28.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:28 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:30.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:29 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:30.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:29 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:30.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:29 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:30.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:29 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:30.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:29 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:30.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:29 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:30.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:29 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:30.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:29 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:30.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:29 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:30.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:29 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:30.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:29 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:30.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:29 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:30 vm06.local ceph-mon[58395]: pgmap v18: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 70 MiB/s rd, 92 MiB/s wr, 184 op/s
2026-03-09T00:04:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:30 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:04:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:30 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:30 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:30 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:04:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:30 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:04:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:30 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:30 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:04:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:30 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:30 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:30 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:30 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:30 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:30 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:30 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:30 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:31.181 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:30 vm03.local ceph-mon[52346]: pgmap v18: 65 pgs: 65 active+clean; 3.8 GiB data, 13 GiB used, 107 GiB / 120 GiB avail; 70 MiB/s rd, 92 MiB/s wr, 184 op/s
2026-03-09T00:04:31.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:30 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:04:31.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:30 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:31.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:30 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:31.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:30 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:04:31.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:30 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:04:31.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:30 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:31.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:30 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:04:31.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:30 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:31.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:30 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:31.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:30 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:31.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:30 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:31.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:30 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:31.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:30 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:31.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:30 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:31.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:30 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:32.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:32 vm03.local ceph-mon[52346]: pgmap v19: 65 pgs: 65 active+clean; 2.7 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 70 MiB/s rd, 93 MiB/s wr, 249 op/s
2026-03-09T00:04:32.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:32 vm03.local ceph-mon[52346]: Upgrade: Updating prometheus.vm03
2026-03-09T00:04:32.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:32 vm03.local ceph-mon[52346]: Deploying daemon prometheus.vm03 on vm03
2026-03-09T00:04:32.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:32 vm06.local ceph-mon[58395]: pgmap v19: 65 pgs: 65 active+clean; 2.7 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 70 MiB/s rd, 93 MiB/s wr, 249 op/s
2026-03-09T00:04:32.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:32 vm06.local ceph-mon[58395]: Upgrade: Updating prometheus.vm03
2026-03-09T00:04:32.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:32 vm06.local ceph-mon[58395]: Deploying daemon prometheus.vm03 on vm03
2026-03-09T00:04:34.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:34 vm03.local ceph-mon[52346]: pgmap v20: 65 pgs: 65 active+clean; 2.7 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 22 MiB/s rd, 34 MiB/s wr, 158 op/s
2026-03-09T00:04:34.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:34 vm06.local ceph-mon[58395]: pgmap v20: 65 pgs: 65 active+clean; 2.7 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 22 MiB/s rd, 34 MiB/s wr, 158 op/s
2026-03-09T00:04:37.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:36 vm03.local ceph-mon[52346]: pgmap v21: 65 pgs: 65 active+clean; 2.7 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 22 MiB/s rd, 34 MiB/s wr, 158 op/s
2026-03-09T00:04:37.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:36 vm06.local ceph-mon[58395]: pgmap v21: 65 pgs: 65 active+clean; 2.7 GiB data, 10 GiB used, 110 GiB / 120 GiB avail; 22 MiB/s rd, 34 MiB/s wr, 158 op/s
2026-03-09T00:04:39.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:39 vm06.local ceph-mon[58395]: pgmap v22: 65 pgs: 65 active+clean; 1.8 GiB data, 7.7 GiB used, 112 GiB / 120 GiB avail; 22 MiB/s rd, 35 MiB/s wr, 216 op/s
2026-03-09T00:04:39.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:39 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:39.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:39 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:39.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:39 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:04:39.438 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:39 vm03.local ceph-mon[52346]: pgmap v22: 65 pgs: 65 active+clean; 1.8 GiB data, 7.7 GiB used, 112 GiB / 120 GiB avail; 22 MiB/s rd, 35 MiB/s wr, 216 op/s
2026-03-09T00:04:39.438 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:39 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:39.438 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:39 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:39.438 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:39 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:04:41.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:41 vm03.local ceph-mon[52346]: pgmap v23: 65 pgs: 65 active+clean; 1.8 GiB data, 7.7 GiB used, 112 GiB / 120 GiB avail; 20 KiB/s rd, 1.1 MiB/s wr, 123 op/s
2026-03-09T00:04:41.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:41 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:41.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:41 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:41.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:41 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:41.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:41 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:41.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:41 vm06.local ceph-mon[58395]: pgmap v23: 65 pgs: 65 active+clean; 1.8 GiB data, 7.7 GiB used, 112 GiB / 120 GiB avail; 20 KiB/s rd, 1.1 MiB/s wr, 123 op/s
2026-03-09T00:04:41.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:41 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:41.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:41 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:41.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:41 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:41.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:41 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:43.024 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: pgmap v24: 65 pgs: 65 active+clean; 869 MiB data, 5.2 GiB used, 115 GiB / 120 GiB avail; 30 KiB/s rd, 1.6 MiB/s wr, 178 op/s
2026-03-09T00:04:43.024 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:43.024 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:43.024 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:04:43.025 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:04:43.025 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:43.025 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T00:04:43.025 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-09T00:04:43.025 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:04:43.025 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:43.025 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:43.025 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:43.025 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:43.025 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:43.025 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:43.025 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:43.025 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:43.025 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:42 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: pgmap v24: 65 pgs: 65 active+clean; 869 MiB data, 5.2 GiB used, 115 GiB / 120 GiB avail; 30 KiB/s rd, 1.6 MiB/s wr, 178 op/s 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:42 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:44.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:43 vm03.local ceph-mon[52346]: Upgrade: Updating alertmanager.vm03 2026-03-09T00:04:44.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:43 vm03.local ceph-mon[52346]: Deploying daemon alertmanager.vm03 on vm03 2026-03-09T00:04:44.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:43 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:44.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:43 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:44.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:43 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:04:44.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:43 vm06.local ceph-mon[58395]: Upgrade: Updating alertmanager.vm03 2026-03-09T00:04:44.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:43 vm06.local ceph-mon[58395]: Deploying daemon alertmanager.vm03 on vm03 2026-03-09T00:04:44.171 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:43 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:44.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:43 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:44.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:43 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:04:44.886 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-09T00:04:44.886 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-09T00:04:44.958 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:44 vm03.local ceph-mon[52346]: pgmap v25: 65 pgs: 65 active+clean; 869 MiB data, 5.2 GiB used, 115 GiB / 120 GiB avail; 19 KiB/s rd, 1.1 MiB/s wr, 113 op/s 2026-03-09T00:04:45.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:44 vm06.local ceph-mon[58395]: pgmap v25: 65 pgs: 65 active+clean; 869 MiB data, 5.2 GiB used, 115 GiB / 120 GiB avail; 19 KiB/s rd, 1.1 MiB/s wr, 113 op/s 2026-03-09T00:04:46.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:45 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:04:46.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:45 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:04:47.675 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:47 vm03.local ceph-mon[52346]: pgmap v26: 65 pgs: 65 active+clean; 869 MiB data, 5.2 GiB used, 115 GiB / 120 GiB avail; 19 KiB/s rd, 1.1 MiB/s wr, 113 op/s 2026-03-09T00:04:47.675 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:47 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:47.675 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:47 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:47.675 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:47 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:47.675 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:47 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:47.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:47 vm06.local ceph-mon[58395]: pgmap v26: 65 pgs: 65 active+clean; 869 MiB data, 5.2 GiB used, 115 GiB / 120 GiB avail; 19 KiB/s rd, 1.1 MiB/s wr, 113 op/s 2026-03-09T00:04:47.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:47 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:47.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:47 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:47.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:47 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:47.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 
09 00:04:47 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:47.933 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-09T00:04:47.933 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1/tmp 2026-03-09T00:04:48.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:48 vm03.local ceph-mon[52346]: pgmap v27: 65 pgs: 65 active+clean; 394 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 2.1 MiB/s wr, 146 op/s 2026-03-09T00:04:48.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:48 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:48.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:48 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:48.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:48 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:04:48.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:48 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:04:48.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:48 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:48.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:48 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T00:04:48.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:48 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:04:48.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:48 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:48.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:48 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:48.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:48 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:48.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:48 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:48.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:48 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:48.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:48 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:48.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:48 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' 
entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:49.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:48 vm06.local ceph-mon[58395]: pgmap v27: 65 pgs: 65 active+clean; 394 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 2.1 MiB/s wr, 146 op/s 2026-03-09T00:04:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:48 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:48 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:48 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:04:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:48 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:04:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:48 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:04:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:48 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-09T00:04:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:48 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:04:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:48 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:48 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:48 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:48 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:48 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:48 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:48 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:04:49.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.206+0000 7fbf41fc4700 1 -- 192.168.123.103:0/2089511191 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7fbf3c075a40 msgr2=0x7fbf3c077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:04:49.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.206+0000 7fbf41fc4700 1 --2- 192.168.123.103:0/2089511191 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf3c075a40 0x7fbf3c077ed0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7fbf3400d3f0 tx=0x7fbf3400d700 comp rx=0 tx=0).stop 2026-03-09T00:04:49.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.206+0000 7fbf41fc4700 1 -- 192.168.123.103:0/2089511191 shutdown_connections 2026-03-09T00:04:49.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.206+0000 7fbf41fc4700 1 --2- 192.168.123.103:0/2089511191 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf3c075a40 0x7fbf3c077ed0 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:04:49.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.206+0000 7fbf41fc4700 1 --2- 192.168.123.103:0/2089511191 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf3c072b50 0x7fbf3c072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:04:49.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.206+0000 7fbf41fc4700 1 -- 192.168.123.103:0/2089511191 >> 192.168.123.103:0/2089511191 conn(0x7fbf3c06dae0 msgr2=0x7fbf3c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:04:49.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.206+0000 7fbf41fc4700 1 -- 192.168.123.103:0/2089511191 shutdown_connections 2026-03-09T00:04:49.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.206+0000 7fbf41fc4700 1 -- 192.168.123.103:0/2089511191 wait complete. 
2026-03-09T00:04:49.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.206+0000 7fbf41fc4700 1 Processor -- start
2026-03-09T00:04:49.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.206+0000 7fbf41fc4700 1 -- start start
2026-03-09T00:04:49.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.207+0000 7fbf41fc4700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf3c072b50 0x7fbf3c0830a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:49.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.207+0000 7fbf41fc4700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf3c0835e0 0x7fbf3c12e3f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:49.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.207+0000 7fbf41fc4700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf3c083af0 con 0x7fbf3c072b50
2026-03-09T00:04:49.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.207+0000 7fbf41fc4700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf3c083c60 con 0x7fbf3c0835e0
2026-03-09T00:04:49.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.208+0000 7fbf3affd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf3c0835e0 0x7fbf3c12e3f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:49.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.208+0000 7fbf3affd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf3c0835e0 0x7fbf3c12e3f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:33258/0 (socket says 192.168.123.103:33258)
2026-03-09T00:04:49.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.208+0000 7fbf3affd700 1 -- 192.168.123.103:0/942458955 learned_addr learned my addr 192.168.123.103:0/942458955 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:04:49.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.208+0000 7fbf3affd700 1 -- 192.168.123.103:0/942458955 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf3c072b50 msgr2=0x7fbf3c0830a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:49.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.208+0000 7fbf3affd700 1 --2- 192.168.123.103:0/942458955 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf3c072b50 0x7fbf3c0830a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.208+0000 7fbf3affd700 1 -- 192.168.123.103:0/942458955 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbf34007ed0 con 0x7fbf3c0835e0
2026-03-09T00:04:49.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.208+0000 7fbf3affd700 1 --2- 192.168.123.103:0/942458955 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf3c0835e0 0x7fbf3c12e3f0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7fbf34003c60 tx=0x7fbf34003d40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:04:49.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.209+0000 7fbf38ff9700 1 -- 192.168.123.103:0/942458955 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf3401c070 con 0x7fbf3c0835e0
2026-03-09T00:04:49.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.209+0000 7fbf41fc4700 1 -- 192.168.123.103:0/942458955 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbf3c12e930 con 0x7fbf3c0835e0
2026-03-09T00:04:49.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.209+0000 7fbf41fc4700 1 -- 192.168.123.103:0/942458955 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbf3c12ee80 con 0x7fbf3c0835e0
2026-03-09T00:04:49.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.209+0000 7fbf38ff9700 1 -- 192.168.123.103:0/942458955 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbf34004330 con 0x7fbf3c0835e0
2026-03-09T00:04:49.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.209+0000 7fbf38ff9700 1 -- 192.168.123.103:0/942458955 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf34021ad0 con 0x7fbf3c0835e0
2026-03-09T00:04:49.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.210+0000 7fbf41fc4700 1 -- 192.168.123.103:0/942458955 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbf28005320 con 0x7fbf3c0835e0
2026-03-09T00:04:49.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.211+0000 7fbf38ff9700 1 -- 192.168.123.103:0/942458955 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 31) v1 ==== 100066+0+0 (secure 0 0 0) 0x7fbf3400f810 con 0x7fbf3c0835e0
2026-03-09T00:04:49.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.211+0000 7fbf38ff9700 1 --2- 192.168.123.103:0/942458955 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fbf240776c0 0x7fbf24079b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:49.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.212+0000 7fbf38ff9700 1 -- 192.168.123.103:0/942458955 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fbf34013070 con 0x7fbf3c0835e0
2026-03-09T00:04:49.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.213+0000 7fbf3b7fe700 1 --2- 192.168.123.103:0/942458955 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fbf240776c0 0x7fbf24079b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:49.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.213+0000 7fbf38ff9700 1 -- 192.168.123.103:0/942458955 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fbf340647b0 con 0x7fbf3c0835e0
2026-03-09T00:04:49.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.221+0000 7fbf3b7fe700 1 --2- 192.168.123.103:0/942458955 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fbf240776c0 0x7fbf24079b80 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fbf3c12a360 tx=0x7fbf2c006d90 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:04:49.359 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.359+0000 7fbf41fc4700 1 -- 192.168.123.103:0/942458955 --> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fbf28000bf0 con 0x7fbf240776c0
2026-03-09T00:04:49.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.375+0000 7fbf38ff9700 1 -- 192.168.123.103:0/942458955 <== mgr.24393 v2:192.168.123.103:6800/3123605642 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+364 (secure 0 0 0) 0x7fbf28000bf0 con 0x7fbf240776c0
2026-03-09T00:04:49.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.378+0000 7fbf227fc700 1 -- 192.168.123.103:0/942458955 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fbf240776c0 msgr2=0x7fbf24079b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:49.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.378+0000 7fbf227fc700 1 --2- 192.168.123.103:0/942458955 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fbf240776c0 0x7fbf24079b80 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fbf3c12a360 tx=0x7fbf2c006d90 comp rx=0 tx=0).stop
2026-03-09T00:04:49.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.378+0000 7fbf227fc700 1 -- 192.168.123.103:0/942458955 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf3c0835e0 msgr2=0x7fbf3c12e3f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:49.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.378+0000 7fbf227fc700 1 --2- 192.168.123.103:0/942458955 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf3c0835e0 0x7fbf3c12e3f0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7fbf34003c60 tx=0x7fbf34003d40 comp rx=0 tx=0).stop
2026-03-09T00:04:49.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.378+0000 7fbf227fc700 1 -- 192.168.123.103:0/942458955 shutdown_connections
2026-03-09T00:04:49.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.378+0000 7fbf227fc700 1 --2- 192.168.123.103:0/942458955 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fbf240776c0 0x7fbf24079b80 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.378+0000 7fbf227fc700 1 --2- 192.168.123.103:0/942458955 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf3c072b50 0x7fbf3c0830a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.378+0000 7fbf227fc700 1 --2- 192.168.123.103:0/942458955 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf3c0835e0 0x7fbf3c12e3f0 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.378+0000 7fbf227fc700 1 -- 192.168.123.103:0/942458955 >> 192.168.123.103:0/942458955 conn(0x7fbf3c06dae0 msgr2=0x7fbf3c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:04:49.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.378+0000 7fbf227fc700 1 -- 192.168.123.103:0/942458955 shutdown_connections
2026-03-09T00:04:49.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.378+0000 7fbf227fc700 1 -- 192.168.123.103:0/942458955 wait complete.
2026-03-09T00:04:49.395 INFO:teuthology.orchestra.run.vm03.stdout:true
2026-03-09T00:04:49.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.503+0000 7f9643fff700 1 -- 192.168.123.103:0/362216550 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9644102e70 msgr2=0x7f9644103290 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:49.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.503+0000 7f9643fff700 1 --2- 192.168.123.103:0/362216550 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9644102e70 0x7f9644103290 secure :-1 s=READY pgs=316 cs=0 l=1 rev1=1 crypto rx=0x7f9638009a60 tx=0x7f9638009d70 comp rx=0 tx=0).stop
2026-03-09T00:04:49.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.503+0000 7f9643fff700 1 -- 192.168.123.103:0/362216550 shutdown_connections
2026-03-09T00:04:49.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.503+0000 7f9643fff700 1 --2- 192.168.123.103:0/362216550 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9644104060 0x7f96441044e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.503+0000 7f9643fff700 1 --2- 192.168.123.103:0/362216550 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9644102e70 0x7f9644103290 unknown :-1 s=CLOSED pgs=316 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.503+0000 7f9643fff700 1 -- 192.168.123.103:0/362216550 >> 192.168.123.103:0/362216550 conn(0x7f96440fe440 msgr2=0x7f96441008a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:04:49.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.503+0000 7f9643fff700 1 -- 192.168.123.103:0/362216550 shutdown_connections
2026-03-09T00:04:49.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.503+0000 7f9643fff700 1 -- 192.168.123.103:0/362216550 wait complete.
2026-03-09T00:04:49.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.504+0000 7f9643fff700 1 Processor -- start
2026-03-09T00:04:49.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.504+0000 7f9643fff700 1 -- start start
2026-03-09T00:04:49.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.504+0000 7f9643fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9644104060 0x7f964419cdf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:49.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.504+0000 7f9643fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f964419d330 0x7f96441a23a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:49.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.504+0000 7f9643fff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f964419d840 con 0x7f964419d330
2026-03-09T00:04:49.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.504+0000 7f9643fff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f964419d9b0 con 0x7f9644104060
2026-03-09T00:04:49.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.504+0000 7f9642ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9644104060 0x7f964419cdf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:49.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.505+0000 7f9642ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9644104060 0x7f964419cdf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:33272/0 (socket says 192.168.123.103:33272)
2026-03-09T00:04:49.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.505+0000 7f9642ffd700 1 -- 192.168.123.103:0/882766407 learned_addr learned my addr 192.168.123.103:0/882766407 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:04:49.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.505+0000 7f9642ffd700 1 -- 192.168.123.103:0/882766407 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f964419d330 msgr2=0x7f96441a23a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:49.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.505+0000 7f9642ffd700 1 --2- 192.168.123.103:0/882766407 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f964419d330 0x7f96441a23a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.505+0000 7f9642ffd700 1 -- 192.168.123.103:0/882766407 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9638009710 con 0x7f9644104060
2026-03-09T00:04:49.506 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.505+0000 7f9642ffd700 1 --2- 192.168.123.103:0/882766407 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9644104060 0x7f964419cdf0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f9644103bd0 tx=0x7f963800f740 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:04:49.506 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.505+0000 7f964880f700 1 -- 192.168.123.103:0/882766407 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f963801d070 con 0x7f9644104060
2026-03-09T00:04:49.506 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.505+0000 7f9643fff700 1 -- 192.168.123.103:0/882766407 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f96441a28e0 con 0x7f9644104060
2026-03-09T00:04:49.506 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.505+0000 7f9643fff700 1 -- 192.168.123.103:0/882766407 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f96441a2dd0 con 0x7f9644104060
2026-03-09T00:04:49.506 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.505+0000 7f964880f700 1 -- 192.168.123.103:0/882766407 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f963800fca0 con 0x7f9644104060
2026-03-09T00:04:49.506 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.505+0000 7f964880f700 1 -- 192.168.123.103:0/882766407 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9638017600 con 0x7f9644104060
2026-03-09T00:04:49.506 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.507+0000 7f964880f700 1 -- 192.168.123.103:0/882766407 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 31) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f96380219f0 con 0x7f9644104060
2026-03-09T00:04:49.506 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.507+0000 7f964880f700 1 --2- 192.168.123.103:0/882766407 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f962c077680 0x7f962c079b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:49.506 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.507+0000 7f9643fff700 1 -- 192.168.123.103:0/882766407 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9630005320 con 0x7f9644104060
2026-03-09T00:04:49.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.507+0000 7f964880f700 1 -- 192.168.123.103:0/882766407 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f963809a8b0 con 0x7f9644104060
2026-03-09T00:04:49.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.508+0000 7f96427fc700 1 --2- 192.168.123.103:0/882766407 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f962c077680 0x7f962c079b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:49.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.508+0000 7f96427fc700 1 --2- 192.168.123.103:0/882766407 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f962c077680 0x7f962c079b40 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f9644102ba0 tx=0x7f96340098e0 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:04:49.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.510+0000 7f964880f700 1 -- 192.168.123.103:0/882766407 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f96380633f0 con 0x7f9644104060
2026-03-09T00:04:49.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.667+0000 7f9643fff700 1 -- 192.168.123.103:0/882766407 --> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9630000bf0 con 0x7f962c077680
2026-03-09T00:04:49.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.671+0000 7f964880f700 1 -- 192.168.123.103:0/882766407 <== mgr.24393 v2:192.168.123.103:6800/3123605642 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+364 (secure 0 0 0) 0x7f9630000bf0 con 0x7f962c077680
2026-03-09T00:04:49.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.676+0000 7f962a7fc700 1 -- 192.168.123.103:0/882766407 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f962c077680 msgr2=0x7f962c079b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:49.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.676+0000 7f962a7fc700 1 --2- 192.168.123.103:0/882766407 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f962c077680 0x7f962c079b40 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f9644102ba0 tx=0x7f96340098e0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.676+0000 7f962a7fc700 1 -- 192.168.123.103:0/882766407 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9644104060 msgr2=0x7f964419cdf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:49.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.676+0000 7f962a7fc700 1 --2- 192.168.123.103:0/882766407 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9644104060 0x7f964419cdf0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f9644103bd0 tx=0x7f963800f740 comp rx=0 tx=0).stop
2026-03-09T00:04:49.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.676+0000 7f962a7fc700 1 -- 192.168.123.103:0/882766407 shutdown_connections
2026-03-09T00:04:49.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.676+0000 7f962a7fc700 1 --2- 192.168.123.103:0/882766407 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f962c077680 0x7f962c079b40 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.676+0000 7f962a7fc700 1 --2- 192.168.123.103:0/882766407 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9644104060 0x7f964419cdf0 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.676+0000 7f962a7fc700 1 --2- 192.168.123.103:0/882766407 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f964419d330 0x7f96441a23a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.676+0000 7f962a7fc700 1 -- 192.168.123.103:0/882766407 >> 192.168.123.103:0/882766407 conn(0x7f96440fe440 msgr2=0x7f9644107320 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:04:49.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.677+0000 7f962a7fc700 1 -- 192.168.123.103:0/882766407 shutdown_connections
2026-03-09T00:04:49.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.677+0000 7f962a7fc700 1 -- 192.168.123.103:0/882766407 wait complete.
2026-03-09T00:04:49.758 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:49 vm03.local ceph-mon[52346]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-09T00:04:49.758 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:49 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:49.758 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:49 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:49.758 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:49 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:49.758 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:49 vm03.local ceph-mon[52346]: Upgrade: Updating grafana.vm03
2026-03-09T00:04:49.759 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:49 vm03.local ceph-mon[52346]: Deploying daemon grafana.vm03 on vm03
2026-03-09T00:04:49.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.758+0000 7f90cbbc2700 1 -- 192.168.123.103:0/3659645824 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90c40ffbb0 msgr2=0x7f90c40fffd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:49.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.758+0000 7f90cbbc2700 1 --2- 192.168.123.103:0/3659645824 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90c40ffbb0 0x7f90c40fffd0 secure :-1 s=READY pgs=317 cs=0 l=1 rev1=1 crypto rx=0x7f90c0009b50 tx=0x7f90c0009e60 comp rx=0 tx=0).stop
2026-03-09T00:04:49.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.760+0000 7f90cbbc2700 1 -- 192.168.123.103:0/3659645824 shutdown_connections
2026-03-09T00:04:49.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.760+0000 7f90cbbc2700 1 --2- 192.168.123.103:0/3659645824 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f90c4100db0 0x7f90c4101210 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.760+0000 7f90cbbc2700 1 --2- 192.168.123.103:0/3659645824 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90c40ffbb0 0x7f90c40fffd0 unknown :-1 s=CLOSED pgs=317 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.760+0000 7f90cbbc2700 1 -- 192.168.123.103:0/3659645824 >> 192.168.123.103:0/3659645824 conn(0x7f90c40fb110 msgr2=0x7f90c40fd590 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:04:49.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.760+0000 7f90cbbc2700 1 -- 192.168.123.103:0/3659645824 shutdown_connections
2026-03-09T00:04:49.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.760+0000 7f90cbbc2700 1 -- 192.168.123.103:0/3659645824 wait complete.
2026-03-09T00:04:49.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.761+0000 7f90cbbc2700 1 Processor -- start 2026-03-09T00:04:49.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.761+0000 7f90cbbc2700 1 -- start start 2026-03-09T00:04:49.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.761+0000 7f90cbbc2700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f90c40ffbb0 0x7f90c41965b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:04:49.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.761+0000 7f90cbbc2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90c4100db0 0x7f90c4196af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:04:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.761+0000 7f90c995e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f90c40ffbb0 0x7f90c41965b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:04:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.762+0000 7f90c915d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90c4100db0 0x7f90c4196af0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:04:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.762+0000 7f90c915d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90c4100db0 0x7f90c4196af0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:39704/0 (socket says 192.168.123.103:39704) 2026-03-09T00:04:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.762+0000 7f90c995e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f90c40ffbb0 0x7f90c41965b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:33292/0 (socket says 192.168.123.103:33292) 2026-03-09T00:04:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.762+0000 7f90c915d700 1 -- 192.168.123.103:0/1351085109 learned_addr learned my addr 192.168.123.103:0/1351085109 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:04:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.762+0000 7f90cbbc2700 1 -- 192.168.123.103:0/1351085109 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f90c4197110 con 0x7f90c4100db0 2026-03-09T00:04:49.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.762+0000 7f90cbbc2700 1 -- 192.168.123.103:0/1351085109 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f90c4197250 con 0x7f90c40ffbb0 2026-03-09T00:04:49.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.766+0000 7f90c995e700 1 -- 192.168.123.103:0/1351085109 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90c4100db0 msgr2=0x7f90c4196af0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:04:49.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.766+0000 7f90c995e700 1 --2- 192.168.123.103:0/1351085109 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90c4100db0 0x7f90c4196af0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:04:49.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.766+0000 7f90c995e700 1 -- 192.168.123.103:0/1351085109 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f90c00097e0 con 0x7f90c40ffbb0 2026-03-09T00:04:49.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.766+0000 7f90c995e700 1 --2- 192.168.123.103:0/1351085109 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f90c40ffbb0 0x7f90c41965b0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f90c0000c00 tx=0x7f90c0005740 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:04:49.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.767+0000 7f90baffd700 1 -- 192.168.123.103:0/1351085109 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f90c001d070 con 0x7f90c40ffbb0 2026-03-09T00:04:49.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.767+0000 7f90baffd700 1 -- 192.168.123.103:0/1351085109 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f90c000bc30 con 0x7f90c40ffbb0 2026-03-09T00:04:49.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.767+0000 7f90baffd700 1 -- 192.168.123.103:0/1351085109 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f90c000f630 con 0x7f90c40ffbb0 2026-03-09T00:04:49.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.767+0000 7f90cbbc2700 1 -- 192.168.123.103:0/1351085109 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f90c4105ff0 con 0x7f90c40ffbb0 2026-03-09T00:04:49.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.767+0000 7f90cbbc2700 1 -- 192.168.123.103:0/1351085109 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f90c41064e0 con 0x7f90c40ffbb0 2026-03-09T00:04:49.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.769+0000 7f90baffd700 1 -- 192.168.123.103:0/1351085109 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 31) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f90c000f7b0 con 0x7f90c40ffbb0 2026-03-09T00:04:49.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.771+0000 7f90baffd700 1 --2- 192.168.123.103:0/1351085109 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f90b007bb90 0x7f90b007e050 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:04:49.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.771+0000 7f90baffd700 1 -- 192.168.123.103:0/1351085109 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f90c009be80 con 0x7f90c40ffbb0 2026-03-09T00:04:49.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.774+0000 7f90c915d700 1 --2- 192.168.123.103:0/1351085109 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f90b007bb90 0x7f90b007e050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:04:49.774 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.774+0000 7f90c915d700 1 --2- 192.168.123.103:0/1351085109 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f90b007bb90 0x7f90b007e050 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f90b4005950 tx=0x7f90b400b500 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:04:49.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.774+0000 7f90cbbc2700 1 -- 192.168.123.103:0/1351085109 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f90a8005320 con 0x7f90c40ffbb0
2026-03-09T00:04:49.777 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.777+0000 7f90baffd700 1 -- 192.168.123.103:0/1351085109 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f90c0064a40 con 0x7f90c40ffbb0
2026-03-09T00:04:49.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.897+0000 7f90cbbc2700 1 -- 192.168.123.103:0/1351085109 --> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f90a8000bf0 con 0x7f90b007bb90
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (6s) 3s ago 5m 16.7M - 0.25.0 c8568f914cd2 6bc39b415ac6
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (5m) 3s ago 5m 8409k - 18.2.1 5be31c24972a b93f8a220f71
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (4m) 22s ago 4m 8522k - 18.2.1 5be31c24972a d06aea65065e
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (5m) 3s ago 5m 7402k - 18.2.1 5be31c24972a 320f8ef2d2cb
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (4m) 22s ago 4m 7411k - 18.2.1 5be31c24972a d9eb9a54d81d
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (4m) 3s ago 5m 89.9M - 9.4.7 954c08fa6188 9db2e5805e97
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (3m) 3s ago 3m 16.6M - 18.2.1 5be31c24972a 404501ca3f76
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (3m) 3s ago 3m 256M - 18.2.1 5be31c24972a b71cb8823eff
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (3m) 22s ago 3m 18.5M - 18.2.1 5be31c24972a 868f24dd3b07
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 running (3m) 22s ago 3m 14.7M - 18.2.1 5be31c24972a 84dbd6c37a69
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:8443,9283,8765 running (60s) 3s ago 5m 620M - 19.2.3-678-ge911bdeb 654f31e6858e 5c5f89207f88
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (37s) 22s ago 4m 487M - 19.2.3-678-ge911bdeb 654f31e6858e e3d70135f3ac
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (5m) 3s ago 5m 54.6M 2048M 18.2.1 5be31c24972a f9863944dcfb
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (4m) 22s ago 4m 45.6M 2048M 18.2.1 5be31c24972a 1e39c7ad3e9f
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (27s) 3s ago 5m 8875k - 1.7.0 72c9c2088986 0cdd6e671b4f
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (23s) 22s ago 4m 5511k - 1.7.0 72c9c2088986 848c5c72973d
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (4m) 3s ago 4m 342M 4096M 18.2.1 5be31c24972a 7582c56d43e3
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (4m) 3s ago 4m 367M 4096M 18.2.1 5be31c24972a 7bc729875521
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (3m) 3s ago 3m 304M 4096M 18.2.1 5be31c24972a 00566abbcc16
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (3m) 22s ago 3m 421M 4096M 18.2.1 5be31c24972a 1ece32056ab6
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (3m) 22s ago 3m 384M 4096M 18.2.1 5be31c24972a ee6260a1124c
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (3m) 22s ago 3m 342M 4096M 18.2.1 5be31c24972a f51e8cd94301
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (11s) 3s ago 4m 44.6M - 2.51.0 1d3b7f56885b 16d6071e49fb
2026-03-09T00:04:49.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.902+0000 7f90baffd700 1 -- 192.168.123.103:0/1351085109 <== mgr.24393 v2:192.168.123.103:6800/3123605642 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f90a8000bf0 con 0x7f90b007bb90
2026-03-09T00:04:49.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.906+0000 7f90b8ff9700 1 -- 192.168.123.103:0/1351085109 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f90b007bb90 msgr2=0x7f90b007e050 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:49.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.906+0000 7f90b8ff9700 1 --2- 192.168.123.103:0/1351085109 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f90b007bb90 0x7f90b007e050 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f90b4005950 tx=0x7f90b400b500 comp rx=0 tx=0).stop
2026-03-09T00:04:49.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.906+0000 7f90b8ff9700 1 -- 192.168.123.103:0/1351085109 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f90c40ffbb0 msgr2=0x7f90c41965b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:49.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.906+0000 7f90b8ff9700 1 --2- 192.168.123.103:0/1351085109 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f90c40ffbb0 0x7f90c41965b0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f90c0000c00 tx=0x7f90c0005740 comp rx=0 tx=0).stop
2026-03-09T00:04:49.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.906+0000 7f90b8ff9700 1 -- 192.168.123.103:0/1351085109 shutdown_connections
2026-03-09T00:04:49.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.906+0000 7f90b8ff9700 1 --2- 192.168.123.103:0/1351085109 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f90b007bb90 0x7f90b007e050 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.906+0000 7f90b8ff9700 1 --2- 192.168.123.103:0/1351085109 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f90c40ffbb0 0x7f90c41965b0 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.906+0000 7f90b8ff9700 1 --2- 192.168.123.103:0/1351085109 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f90c4100db0 0x7f90c4196af0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.906+0000 7f90b8ff9700 1 -- 192.168.123.103:0/1351085109 >> 192.168.123.103:0/1351085109 conn(0x7f90c40fb110 msgr2=0x7f90c4103fe0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:04:49.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.908+0000 7f90b8ff9700 1 -- 192.168.123.103:0/1351085109 shutdown_connections
2026-03-09T00:04:49.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.908+0000 7f90b8ff9700 1 -- 192.168.123.103:0/1351085109 wait complete.
2026-03-09T00:04:49.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.983+0000 7f76423a9700 1 -- 192.168.123.103:0/1288130564 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f763c075a40 msgr2=0x7f763c077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:49.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.983+0000 7f76423a9700 1 --2- 192.168.123.103:0/1288130564 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f763c075a40 0x7f763c077ed0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f7634009230 tx=0x7f7634009260 comp rx=0 tx=0).stop
2026-03-09T00:04:49.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.983+0000 7f76423a9700 1 -- 192.168.123.103:0/1288130564 shutdown_connections
2026-03-09T00:04:49.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.983+0000 7f76423a9700 1 --2- 192.168.123.103:0/1288130564 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f763c075a40 0x7f763c077ed0 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.983+0000 7f76423a9700 1 --2- 192.168.123.103:0/1288130564 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f763c072b50 0x7f763c072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.983 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.983+0000 7f76423a9700 1 -- 192.168.123.103:0/1288130564 >> 192.168.123.103:0/1288130564 conn(0x7f763c06dae0 msgr2=0x7f763c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:04:49.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.984+0000 7f76423a9700 1 -- 192.168.123.103:0/1288130564 shutdown_connections
2026-03-09T00:04:49.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.984+0000 7f76423a9700 1 -- 192.168.123.103:0/1288130564 wait complete.
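The `orch ps` listing above captures the staggered-upgrade checkpoint: both mgr daemons (mgr.vm03.yvcons, mgr.vm06.rzcvhn) already report 19.2.3-678-ge911bdeb on image 654f31e6858e, while the mon, osd, mds, crash and ceph-exporter daemons still run 18.2.1 on 5be31c24972a. A minimal shell sketch for checking that split, assuming the daemon_type and version fields that `ceph orch ps --format json` normally carries:

    # Summarize which version(s) each daemon type is running (sketch; field names assumed).
    ceph orch ps --format json | jq -r '
      group_by(.daemon_type)[]
      | "\(.[0].daemon_type): \([.[].version] | unique | join(", "))"'

At this stage only the mgr line should show the squid build; every other Ceph daemon type should still report 18.2.1.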
2026-03-09T00:04:49.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.984+0000 7f76423a9700 1 Processor -- start
2026-03-09T00:04:49.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.984+0000 7f76423a9700 1 -- start start
2026-03-09T00:04:49.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.984+0000 7f76423a9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f763c072b50 0x7f763c083120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:49.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.984+0000 7f76423a9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f763c083660 0x7f763c12e470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:49.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.984+0000 7f76423a9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f763c083b70 con 0x7f763c083660
2026-03-09T00:04:49.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.984+0000 7f76423a9700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f763c083ce0 con 0x7f763c072b50
2026-03-09T00:04:49.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.984+0000 7f763b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f763c083660 0x7f763c12e470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:49.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.984+0000 7f763bfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f763c072b50 0x7f763c083120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:49.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.984+0000 7f763bfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f763c072b50 0x7f763c083120 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:33308/0 (socket says 192.168.123.103:33308)
2026-03-09T00:04:49.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.984+0000 7f763bfff700 1 -- 192.168.123.103:0/3234990933 learned_addr learned my addr 192.168.123.103:0/3234990933 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:04:49.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.985+0000 7f763bfff700 1 -- 192.168.123.103:0/3234990933 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f763c083660 msgr2=0x7f763c12e470 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:49.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.985+0000 7f763bfff700 1 --2- 192.168.123.103:0/3234990933 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f763c083660 0x7f763c12e470 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:49.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.985+0000 7f763bfff700 1 -- 192.168.123.103:0/3234990933 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7634008ee0 con 0x7f763c072b50
2026-03-09T00:04:49.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.985+0000 7f763bfff700 1 --2- 192.168.123.103:0/3234990933 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f763c072b50 0x7f763c083120 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f762c00abb0 tx=0x7f762c00af70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:04:49.984 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.985+0000 7f76397fa700 1 -- 192.168.123.103:0/3234990933 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f762c010070 con 0x7f763c072b50
2026-03-09T00:04:49.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.985+0000 7f76423a9700 1 -- 192.168.123.103:0/3234990933 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f763c12ea10 con 0x7f763c072b50
2026-03-09T00:04:49.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.985+0000 7f76423a9700 1 -- 192.168.123.103:0/3234990933 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f763c12ef60 con 0x7f763c072b50
2026-03-09T00:04:49.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.986+0000 7f76397fa700 1 -- 192.168.123.103:0/3234990933 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f762c014410 con 0x7f763c072b50
2026-03-09T00:04:49.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.986+0000 7f76397fa700 1 -- 192.168.123.103:0/3234990933 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f762c0135c0 con 0x7f763c072b50
2026-03-09T00:04:49.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.987+0000 7f76423a9700 1 -- 192.168.123.103:0/3234990933 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f763c07c880 con 0x7f763c072b50
2026-03-09T00:04:49.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.988+0000 7f76397fa700 1 -- 192.168.123.103:0/3234990933 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 31) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f762c014a60 con 0x7f763c072b50
2026-03-09T00:04:49.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.988+0000 7f76397fa700 1 --2- 192.168.123.103:0/3234990933 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f76240776c0 0x7f7624079b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:49.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.988+0000 7f763b7fe700 1 --2- 192.168.123.103:0/3234990933 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f76240776c0 0x7f7624079b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:49.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.988+0000 7f76397fa700 1 -- 192.168.123.103:0/3234990933 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f762c098c40 con 0x7f763c072b50
2026-03-09T00:04:49.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.989+0000 7f763b7fe700 1 --2- 192.168.123.103:0/3234990933 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f76240776c0 0x7f7624079b80 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f763400d010 tx=0x7f763400c9d0 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:04:49.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:49.990+0000 7f76397fa700 1 -- 192.168.123.103:0/3234990933 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f762c061780 con 0x7f763c072b50
2026-03-09T00:04:50.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:49 vm06.local ceph-mon[58395]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
2026-03-09T00:04:50.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:49 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:50.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:49 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:50.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:49 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:50.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:49 vm06.local ceph-mon[58395]: Upgrade: Updating grafana.vm03
2026-03-09T00:04:50.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:49 vm06.local ceph-mon[58395]: Deploying daemon grafana.vm03 on vm03
2026-03-09T00:04:50.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.212+0000 7f76423a9700 1 -- 192.168.123.103:0/3234990933 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f763c04ea90 con 0x7f763c072b50
2026-03-09T00:04:50.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.215+0000 7f76397fa700 1 -- 192.168.123.103:0/3234990933 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7f762c060ed0 con 0x7f763c072b50
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 12,
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-09T00:04:50.215 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:04:50.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.222+0000 7f7622ffd700 1 -- 192.168.123.103:0/3234990933 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f76240776c0 msgr2=0x7f7624079b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:50.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.222+0000 7f7622ffd700 1 --2- 192.168.123.103:0/3234990933 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f76240776c0 0x7f7624079b80 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f763400d010 tx=0x7f763400c9d0 comp rx=0 tx=0).stop
2026-03-09T00:04:50.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.222+0000 7f7622ffd700 1 -- 192.168.123.103:0/3234990933 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f763c072b50 msgr2=0x7f763c083120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:50.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.222+0000 7f7622ffd700 1 --2- 192.168.123.103:0/3234990933 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f763c072b50 0x7f763c083120 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f762c00abb0 tx=0x7f762c00af70 comp rx=0 tx=0).stop
2026-03-09T00:04:50.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.222+0000 7f7622ffd700 1 -- 192.168.123.103:0/3234990933 shutdown_connections
2026-03-09T00:04:50.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.222+0000 7f7622ffd700 1 --2- 192.168.123.103:0/3234990933 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f76240776c0 0x7f7624079b80 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:50.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.222+0000 7f7622ffd700 1 --2- 192.168.123.103:0/3234990933 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f763c072b50 0x7f763c083120 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:50.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.222+0000 7f7622ffd700 1 --2- 192.168.123.103:0/3234990933 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f763c083660 0x7f763c12e470 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:50.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.222+0000 7f7622ffd700 1 -- 192.168.123.103:0/3234990933 >> 192.168.123.103:0/3234990933 conn(0x7f763c06dae0 msgr2=0x7f763c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:04:50.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.222+0000 7f7622ffd700 1 -- 192.168.123.103:0/3234990933 shutdown_connections
2026-03-09T00:04:50.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.222+0000 7f7622ffd700 1 -- 192.168.123.103:0/3234990933 wait complete.
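The `ceph versions` dump above confirms the same split from the cluster-map side: two mgrs on the squid build and 12 daemons (2 mon, 6 osd, 4 mds) still on reef, for an overall count of 14. A test that wants to block until the mgr phase is done could poll this JSON; a minimal sketch, assuming the "squid" substring in the version key as shown above:

    # Wait until both mgr daemons report a squid build (sketch).
    while ! ceph versions | jq -e '
        [.mgr | to_entries[] | select(.key | contains("squid")) | .value] | add == 2'
    do
        sleep 10
    done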
2026-03-09T00:04:50.308 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.309+0000 7f6e4e9db700 1 -- 192.168.123.103:0/1816754155 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6e4810b6a0 msgr2=0x7f6e4810bac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:50.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.309+0000 7f6e4e9db700 1 --2- 192.168.123.103:0/1816754155 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6e4810b6a0 0x7f6e4810bac0 secure :-1 s=READY pgs=318 cs=0 l=1 rev1=1 crypto rx=0x7f6e44009b00 tx=0x7f6e44009e10 comp rx=0 tx=0).stop
2026-03-09T00:04:50.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.309+0000 7f6e4e9db700 1 -- 192.168.123.103:0/1816754155 shutdown_connections
2026-03-09T00:04:50.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.309+0000 7f6e4e9db700 1 --2- 192.168.123.103:0/1816754155 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e4810c870 0x7f6e4810ccf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:50.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.309+0000 7f6e4e9db700 1 --2- 192.168.123.103:0/1816754155 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6e4810b6a0 0x7f6e4810bac0 unknown :-1 s=CLOSED pgs=318 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:50.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.309+0000 7f6e4e9db700 1 -- 192.168.123.103:0/1816754155 >> 192.168.123.103:0/1816754155 conn(0x7f6e4806b380 msgr2=0x7f6e4806b7d0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:04:50.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.310+0000 7f6e4e9db700 1 -- 192.168.123.103:0/1816754155 shutdown_connections
2026-03-09T00:04:50.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.310+0000 7f6e4e9db700 1 -- 192.168.123.103:0/1816754155 wait complete.
2026-03-09T00:04:50.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.311+0000 7f6e4e9db700 1 Processor -- start
2026-03-09T00:04:50.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.311+0000 7f6e4e9db700 1 -- start start
2026-03-09T00:04:50.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.311+0000 7f6e4e9db700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e4810b6a0 0x7f6e48117c60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:50.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.311+0000 7f6e4e9db700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6e4810c870 0x7f6e481181a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:50.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.311+0000 7f6e4e9db700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6e481187c0 con 0x7f6e4810c870
2026-03-09T00:04:50.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.311+0000 7f6e4e9db700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6e48118900 con 0x7f6e4810b6a0
2026-03-09T00:04:50.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.311+0000 7f6e4d9d9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e4810b6a0 0x7f6e48117c60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:50.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.311+0000 7f6e4d9d9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e4810b6a0 0x7f6e48117c60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:33328/0 (socket says 192.168.123.103:33328)
2026-03-09T00:04:50.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.311+0000 7f6e4d9d9700 1 -- 192.168.123.103:0/2371484242 learned_addr learned my addr 192.168.123.103:0/2371484242 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:04:50.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.311+0000 7f6e4d9d9700 1 -- 192.168.123.103:0/2371484242 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6e4810c870 msgr2=0x7f6e481181a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:50.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.311+0000 7f6e4d9d9700 1 --2- 192.168.123.103:0/2371484242 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6e4810c870 0x7f6e481181a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:50.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.311+0000 7f6e4d9d9700 1 -- 192.168.123.103:0/2371484242 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6e440097e0 con 0x7f6e4810b6a0
2026-03-09T00:04:50.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.312+0000 7f6e4d9d9700 1 --2- 192.168.123.103:0/2371484242 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e4810b6a0 0x7f6e48117c60 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f6e4400b5c0 tx=0x7f6e440051f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:04:50.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.312+0000 7f6e3effd700 1 -- 192.168.123.103:0/2371484242 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6e4401d070 con 0x7f6e4810b6a0
2026-03-09T00:04:50.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.312+0000 7f6e4e9db700 1 -- 192.168.123.103:0/2371484242 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6e48119330 con 0x7f6e4810b6a0
2026-03-09T00:04:50.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.313+0000 7f6e4e9db700 1 -- 192.168.123.103:0/2371484242 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6e481197c0 con 0x7f6e4810b6a0
2026-03-09T00:04:50.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.313+0000 7f6e3effd700 1 -- 192.168.123.103:0/2371484242 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6e4400bd80 con 0x7f6e4810b6a0
2026-03-09T00:04:50.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.313+0000 7f6e3effd700 1 -- 192.168.123.103:0/2371484242 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6e4400f980 con 0x7f6e4810b6a0
2026-03-09T00:04:50.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.314+0000 7f6e3effd700 1 -- 192.168.123.103:0/2371484242 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 31) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f6e4400fae0 con 0x7f6e4810b6a0
2026-03-09T00:04:50.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.314+0000 7f6e3effd700 1 --2- 192.168.123.103:0/2371484242 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6e34077790 0x7f6e34079c50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:04:50.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.314+0000 7f6e3effd700 1 -- 192.168.123.103:0/2371484242 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f6e4409b080 con 0x7f6e4810b6a0
2026-03-09T00:04:50.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.314+0000 7f6e4e9db700 1 -- 192.168.123.103:0/2371484242 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6e4804ea90 con 0x7f6e4810b6a0
2026-03-09T00:04:50.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.315+0000 7f6e4d1d8700 1 --2- 192.168.123.103:0/2371484242 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6e34077790 0x7f6e34079c50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:04:50.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.315+0000 7f6e4d1d8700 1 --2- 192.168.123.103:0/2371484242 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6e34077790 0x7f6e34079c50 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f6e38005950 tx=0x7f6e380058e0 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:04:50.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.317+0000 7f6e3effd700 1 -- 192.168.123.103:0/2371484242 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f6e44064140 con 0x7f6e4810b6a0
2026-03-09T00:04:50.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.447+0000 7f6e4e9db700 1 -- 192.168.123.103:0/2371484242 --> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6e4810a450 con 0x7f6e34077790
2026-03-09T00:04:50.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.452+0000 7f6e3effd700 1 -- 192.168.123.103:0/2371484242 <== mgr.24393 v2:192.168.123.103:6800/3123605642 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+364 (secure 0 0 0) 0x7f6e4810a450 con 0x7f6e34077790
2026-03-09T00:04:50.451 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:04:50.451 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T00:04:50.451 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true,
2026-03-09T00:04:50.451 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading daemons of type(s) mgr",
2026-03-09T00:04:50.451 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [
2026-03-09T00:04:50.451 INFO:teuthology.orchestra.run.vm03.stdout: "mgr"
2026-03-09T00:04:50.451 INFO:teuthology.orchestra.run.vm03.stdout: ],
2026-03-09T00:04:50.451 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "2/2 daemons upgraded",
2026-03-09T00:04:50.451 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading grafana daemons",
2026-03-09T00:04:50.451 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false
2026-03-09T00:04:50.451 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:04:50.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.455+0000 7f6e3cff9700 1 -- 192.168.123.103:0/2371484242 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6e34077790 msgr2=0x7f6e34079c50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:50.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.455+0000 7f6e3cff9700 1 --2- 192.168.123.103:0/2371484242 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6e34077790 0x7f6e34079c50 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f6e38005950 tx=0x7f6e380058e0 comp rx=0 tx=0).stop
2026-03-09T00:04:50.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.455+0000 7f6e3cff9700 1 -- 192.168.123.103:0/2371484242 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e4810b6a0 msgr2=0x7f6e48117c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:04:50.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.455+0000 7f6e3cff9700 1 --2- 192.168.123.103:0/2371484242 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e4810b6a0 0x7f6e48117c60 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f6e4400b5c0 tx=0x7f6e440051f0 comp rx=0 tx=0).stop
2026-03-09T00:04:50.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.455+0000 7f6e3cff9700 1 -- 192.168.123.103:0/2371484242 shutdown_connections
2026-03-09T00:04:50.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.455+0000 7f6e3cff9700 1 --2- 192.168.123.103:0/2371484242 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6e34077790 0x7f6e34079c50 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:50.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.455+0000 7f6e3cff9700 1 --2- 192.168.123.103:0/2371484242 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e4810b6a0 0x7f6e48117c60 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:50.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.455+0000 7f6e3cff9700 1 --2- 192.168.123.103:0/2371484242 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6e4810c870 0x7f6e481181a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:04:50.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.455+0000 7f6e3cff9700 1 -- 192.168.123.103:0/2371484242 >> 192.168.123.103:0/2371484242 conn(0x7f6e4806b380 msgr2=0x7f6e481087e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:04:50.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.456+0000 7f6e3cff9700 1 -- 192.168.123.103:0/2371484242 shutdown_connections
2026-03-09T00:04:50.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:04:50.456+0000 7f6e3cff9700 1 -- 192.168.123.103:0/2371484242 wait complete.
2026-03-09T00:04:51.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:50 vm03.local ceph-mon[52346]: from='client.24443 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:04:51.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:50 vm03.local ceph-mon[52346]: pgmap v28: 65 pgs: 65 active+clean; 394 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 15 KiB/s rd, 1.5 MiB/s wr, 88 op/s
2026-03-09T00:04:51.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:50 vm03.local ceph-mon[52346]: from='client.24447 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:04:51.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:50 vm03.local ceph-mon[52346]: from='client.24451 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:04:51.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:50 vm03.local ceph-mon[52346]: from='client.? 192.168.123.103:0/3234990933' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
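The `orch upgrade status` JSON above is the staggered upgrade's own progress report: in_progress is still true, but services_complete already lists mgr and progress reads "2/2 daemons upgraded", with cephadm moving on to the grafana daemon. A wait loop over those same fields, as a minimal sketch:

    # Poll until the current upgrade step finishes (sketch over the fields shown above).
    while ceph orch upgrade status | jq -e '.in_progress' >/dev/null; do
        ceph orch upgrade status | jq -r '"\(.progress) - \(.message)"'
        sleep 30
    done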
2026-03-09T00:04:51.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:50 vm06.local ceph-mon[58395]: from='client.24443 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:04:51.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:50 vm06.local ceph-mon[58395]: pgmap v28: 65 pgs: 65 active+clean; 394 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 15 KiB/s rd, 1.5 MiB/s wr, 88 op/s
2026-03-09T00:04:51.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:50 vm06.local ceph-mon[58395]: from='client.24447 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:04:51.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:50 vm06.local ceph-mon[58395]: from='client.24451 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:04:51.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:50 vm06.local ceph-mon[58395]: from='client.? 192.168.123.103:0/3234990933' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:04:52.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:51 vm03.local ceph-mon[52346]: from='client.24459 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:04:52.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:51 vm06.local ceph-mon[58395]: from='client.24459 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:04:53.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:52 vm03.local ceph-mon[52346]: pgmap v29: 65 pgs: 65 active+clean; 306 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 2.6 MiB/s wr, 146 op/s
2026-03-09T00:04:53.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:52 vm06.local ceph-mon[58395]: pgmap v29: 65 pgs: 65 active+clean; 306 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 2.6 MiB/s wr, 146 op/s
2026-03-09T00:04:55.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:54 vm06.local ceph-mon[58395]: pgmap v30: 65 pgs: 65 active+clean; 306 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 9.4 KiB/s rd, 2.1 MiB/s wr, 91 op/s
2026-03-09T00:04:55.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:54 vm03.local ceph-mon[52346]: pgmap v30: 65 pgs: 65 active+clean; 306 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 9.4 KiB/s rd, 2.1 MiB/s wr, 91 op/s
2026-03-09T00:04:57.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:56 vm06.local ceph-mon[58395]: pgmap v31: 65 pgs: 65 active+clean; 306 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 9.4 KiB/s rd, 2.1 MiB/s wr, 91 op/s
2026-03-09T00:04:57.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:56 vm03.local ceph-mon[52346]: pgmap v31: 65 pgs: 65 active+clean; 306 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 9.4 KiB/s rd, 2.1 MiB/s wr, 91 op/s
2026-03-09T00:04:59.436 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:59 vm03.local ceph-mon[52346]: pgmap v32: 65 pgs: 65 active+clean; 317 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 16 KiB/s rd, 3.1 MiB/s wr, 164 op/s
2026-03-09T00:04:59.436 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:59.436 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:59.436 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:04:59 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:04:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:59 vm06.local ceph-mon[58395]: pgmap v32: 65 pgs: 65 active+clean; 317 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 16 KiB/s rd, 3.1 MiB/s wr, 164 op/s
2026-03-09T00:04:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:04:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:04:59 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:05:01.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:00 vm03.local ceph-mon[52346]: pgmap v33: 65 pgs: 65 active+clean; 317 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 11 KiB/s rd, 2.2 MiB/s wr, 131 op/s
2026-03-09T00:05:01.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:00 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:05:01.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:00 vm06.local ceph-mon[58395]: pgmap v33: 65 pgs: 65 active+clean; 317 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 11 KiB/s rd, 2.2 MiB/s wr, 131 op/s
2026-03-09T00:05:01.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:00 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:05:02.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:02 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:02.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:02 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:02.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:02 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:02.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:02 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:02.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:02 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:02.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:02 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:02.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:02 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:02.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:02 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:03.315 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: pgmap v34: 65 pgs: 65 active+clean; 325 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 16 KiB/s rd, 2.9 MiB/s wr, 202 op/s
2026-03-09T00:05:03.315 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:03.315 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:03.315 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:03.315 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:05:03.315 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:03.315 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-09T00:05:03.315 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:05:03.316 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:03 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
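The run of `config rm container_image` commands and the `config-key del mgr/cephadm/upgrade_state` above appear to be cephadm clearing the per-daemon-type image pins it set during the mgr-only step and dropping its persisted staggered-upgrade state. One way to confirm the cleanup from a shell, as a sketch:

    # Both checks should find nothing after the cleanup above (sketch).
    ceph config-key exists mgr/cephadm/upgrade_state || echo "upgrade state cleared"
    ceph config dump | grep container_image || echo "no per-daemon image pins left"

(`ceph config-key exists` exits non-zero once the key is gone.)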
from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 
192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch
2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished
2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished
2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch
2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch
2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch
2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch
2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished
2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch
2026-03-09T00:05:03.322 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local
ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:05:03.323 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:03 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:04.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:04 vm03.local ceph-mon[52346]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-09T00:05:04.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:04 vm03.local ceph-mon[52346]: Upgrade: Finalizing container_image settings
2026-03-09T00:05:04.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:04 vm03.local ceph-mon[52346]: Upgrade: Complete!
2026-03-09T00:05:04.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:04 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:05:04.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:04 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:04.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:04 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:05:04.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:04 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:04.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:04 vm06.local ceph-mon[58395]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
2026-03-09T00:05:04.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:04 vm06.local ceph-mon[58395]: Upgrade: Finalizing container_image settings
2026-03-09T00:05:04.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:04 vm06.local ceph-mon[58395]: Upgrade: Complete!
2026-03-09T00:05:04.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:04 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:05:04.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:04 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:04.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:04 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:05:04.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:04 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:05.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:05 vm03.local ceph-mon[52346]: pgmap v35: 65 pgs: 65 active+clean; 325 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 12 KiB/s rd, 1.8 MiB/s wr, 144 op/s
2026-03-09T00:05:05.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:05 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:05.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:05 vm06.local ceph-mon[58395]: pgmap v35: 65 pgs: 65 active+clean; 325 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 12 KiB/s rd, 1.8 MiB/s wr, 144 op/s
2026-03-09T00:05:05.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:05 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:07.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:07 vm03.local ceph-mon[52346]: pgmap v36: 65 pgs: 65 active+clean; 325 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 12 KiB/s rd, 1.8 MiB/s wr, 144 op/s
2026-03-09T00:05:07.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:07 vm06.local ceph-mon[58395]: pgmap v36: 65 pgs: 65 active+clean; 325 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 12 KiB/s rd, 1.8 MiB/s wr, 144 op/s
2026-03-09T00:05:08.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:08 vm03.local ceph-mon[52346]: pgmap v37: 65 pgs: 65 active+clean; 329 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 25 KiB/s rd, 2.4 MiB/s wr, 287 op/s
2026-03-09T00:05:08.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:08 vm06.local ceph-mon[58395]: pgmap v37: 65 pgs: 65 active+clean; 329 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 25 KiB/s rd, 2.4 MiB/s wr, 287 op/s
2026-03-09T00:05:11.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:10 vm03.local ceph-mon[52346]: pgmap v38: 65 pgs: 65 active+clean; 329 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 19 KiB/s rd, 1.3 MiB/s wr, 213 op/s
2026-03-09T00:05:11.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:10 vm06.local ceph-mon[58395]: pgmap v38: 65 pgs: 65 active+clean; 329 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 19 KiB/s rd, 1.3 MiB/s wr, 213 op/s
2026-03-09T00:05:13.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:12 vm03.local ceph-mon[52346]: pgmap v39: 65 pgs: 65 active+clean; 335 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 2.2 MiB/s wr, 413 op/s
2026-03-09T00:05:13.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:12 vm06.local ceph-mon[58395]: pgmap v39: 65 pgs: 65 active+clean; 335 MiB data, 3.3 GiB
used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 2.2 MiB/s wr, 413 op/s
2026-03-09T00:05:15.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:14 vm03.local ceph-mon[52346]: pgmap v40: 65 pgs: 65 active+clean; 335 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1.4 MiB/s wr, 342 op/s
2026-03-09T00:05:15.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:14 vm06.local ceph-mon[58395]: pgmap v40: 65 pgs: 65 active+clean; 335 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1.4 MiB/s wr, 342 op/s
2026-03-09T00:05:16.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:15 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:05:16.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:15 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:05:17.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:16 vm06.local ceph-mon[58395]: pgmap v41: 65 pgs: 65 active+clean; 335 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1.4 MiB/s wr, 342 op/s
2026-03-09T00:05:17.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:16 vm03.local ceph-mon[52346]: pgmap v41: 65 pgs: 65 active+clean; 335 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1.4 MiB/s wr, 342 op/s
2026-03-09T00:05:19.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:19 vm06.local ceph-mon[58395]: pgmap v42: 65 pgs: 65 active+clean; 336 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 51 KiB/s rd, 2.3 MiB/s wr, 545 op/s
2026-03-09T00:05:19.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:19 vm03.local ceph-mon[52346]: pgmap v42: 65 pgs: 65 active+clean; 336 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 51 KiB/s rd, 2.3 MiB/s wr, 545 op/s
2026-03-09T00:05:20.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.580+0000 7f9c08993700 1 -- 192.168.123.103:0/580154909 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c00075a40 msgr2=0x7f9c00077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:20.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.580+0000 7f9c08993700 1 --2- 192.168.123.103:0/580154909 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c00075a40 0x7f9c00077ed0 secure :-1 s=READY pgs=319 cs=0 l=1 rev1=1 crypto rx=0x7f9bfc00d3f0 tx=0x7f9bfc00d700 comp rx=0 tx=0).stop
2026-03-09T00:05:20.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.581+0000 7f9c08993700 1 -- 192.168.123.103:0/580154909 shutdown_connections
2026-03-09T00:05:20.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.581+0000 7f9c08993700 1 --2- 192.168.123.103:0/580154909 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c00075a40 0x7f9c00077ed0 unknown :-1 s=CLOSED pgs=319 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:20.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.581+0000 7f9c08993700 1 --2- 192.168.123.103:0/580154909 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c00072b50 0x7f9c00072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:20.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.581+0000 7f9c08993700 1 -- 192.168.123.103:0/580154909 >>
192.168.123.103:0/580154909 conn(0x7f9c0006dae0 msgr2=0x7f9c0006ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:05:20.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.582+0000 7f9c08993700 1 -- 192.168.123.103:0/580154909 shutdown_connections
2026-03-09T00:05:20.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.582+0000 7f9c08993700 1 -- 192.168.123.103:0/580154909 wait complete.
2026-03-09T00:05:20.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.582+0000 7f9c08993700 1 Processor -- start
2026-03-09T00:05:20.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.582+0000 7f9c08993700 1 -- start start
2026-03-09T00:05:20.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.582+0000 7f9c08993700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c00072b50 0x7f9c00082f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:20.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.582+0000 7f9c08993700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c000834a0 0x7f9c00083920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:20.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.582+0000 7f9c08993700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9c0012e700 con 0x7f9c000834a0
2026-03-09T00:05:20.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.582+0000 7f9c08993700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9c0012e870 con 0x7f9c00072b50
2026-03-09T00:05:20.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.583+0000 7f9c0672f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c00072b50 0x7f9c00082f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:20.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.583+0000 7f9c0672f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c00072b50 0x7f9c00082f60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:48402/0 (socket says 192.168.123.103:48402)
2026-03-09T00:05:20.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.583+0000 7f9c0672f700 1 -- 192.168.123.103:0/3063953521 learned_addr learned my addr 192.168.123.103:0/3063953521 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:05:20.583 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.583+0000 7f9c05f2e700 1 --2- 192.168.123.103:0/3063953521 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c000834a0 0x7f9c00083920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:20.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.583+0000 7f9c0672f700 1 -- 192.168.123.103:0/3063953521 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c000834a0 msgr2=0x7f9c00083920 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:20.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.583+0000 7f9c0672f700 1 --2- 192.168.123.103:0/3063953521 >>
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c000834a0 0x7f9c00083920 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:20.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.583+0000 7f9c0672f700 1 -- 192.168.123.103:0/3063953521 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9bfc007ed0 con 0x7f9c00072b50
2026-03-09T00:05:20.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.583+0000 7f9c0672f700 1 --2- 192.168.123.103:0/3063953521 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c00072b50 0x7f9c00082f60 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f9bf400d8d0 tx=0x7f9bf400dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:05:20.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.585+0000 7f9bf37fe700 1 -- 192.168.123.103:0/3063953521 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9bf4009940 con 0x7f9c00072b50
2026-03-09T00:05:20.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.585+0000 7f9c08993700 1 -- 192.168.123.103:0/3063953521 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9c0012eb50 con 0x7f9c00072b50
2026-03-09T00:05:20.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.585+0000 7f9c08993700 1 -- 192.168.123.103:0/3063953521 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9c0012f0a0 con 0x7f9c00072b50
2026-03-09T00:05:20.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.586+0000 7f9bf37fe700 1 -- 192.168.123.103:0/3063953521 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9bf4010460 con 0x7f9c00072b50
2026-03-09T00:05:20.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.586+0000 7f9bf37fe700 1 -- 192.168.123.103:0/3063953521 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9bf400f5d0 con 0x7f9c00072b50
2026-03-09T00:05:20.586 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.587+0000 7f9bf37fe700 1 -- 192.168.123.103:0/3063953521 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 31) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f9bf400f790 con 0x7f9c00072b50
2026-03-09T00:05:20.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.588+0000 7f9bf37fe700 1 --2- 192.168.123.103:0/3063953521 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f9bec077790 0x7f9bec079c50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:20.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.588+0000 7f9bf37fe700 1 -- 192.168.123.103:0/3063953521 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f9bf409a650 con 0x7f9c00072b50
2026-03-09T00:05:20.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.588+0000 7f9c08993700 1 -- 192.168.123.103:0/3063953521 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9be4005320 con 0x7f9c00072b50
2026-03-09T00:05:20.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.592+0000 7f9bf37fe700 1 -- 192.168.123.103:0/3063953521 <== mon.1
v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f9bf4062a90 con 0x7f9c00072b50
2026-03-09T00:05:20.594 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.593+0000 7f9c05f2e700 1 --2- 192.168.123.103:0/3063953521 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f9bec077790 0x7f9bec079c50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:20.596 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.597+0000 7f9c05f2e700 1 --2- 192.168.123.103:0/3063953521 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f9bec077790 0x7f9bec079c50 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f9bfc00db80 tx=0x7f9bfc006040 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:05:20.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.769+0000 7f9c08993700 1 -- 192.168.123.103:0/3063953521 --> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9be4000bf0 con 0x7f9bec077790
2026-03-09T00:05:20.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.771+0000 7f9bf37fe700 1 -- 192.168.123.103:0/3063953521 <== mgr.24393 v2:192.168.123.103:6800/3123605642 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7f9be4000bf0 con 0x7f9bec077790
2026-03-09T00:05:20.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.774+0000 7f9bf17fa700 1 -- 192.168.123.103:0/3063953521 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f9bec077790 msgr2=0x7f9bec079c50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:20.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.774+0000 7f9bf17fa700 1 --2- 192.168.123.103:0/3063953521 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f9bec077790 0x7f9bec079c50 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f9bfc00db80 tx=0x7f9bfc006040 comp rx=0 tx=0).stop
2026-03-09T00:05:20.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.774+0000 7f9bf17fa700 1 -- 192.168.123.103:0/3063953521 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c00072b50 msgr2=0x7f9c00082f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:20.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.774+0000 7f9bf17fa700 1 --2- 192.168.123.103:0/3063953521 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c00072b50 0x7f9c00082f60 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f9bf400d8d0 tx=0x7f9bf400dc90 comp rx=0 tx=0).stop
2026-03-09T00:05:20.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.774+0000 7f9bf17fa700 1 -- 192.168.123.103:0/3063953521 shutdown_connections
2026-03-09T00:05:20.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.774+0000 7f9bf17fa700 1 --2- 192.168.123.103:0/3063953521 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f9bec077790 0x7f9bec079c50 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:20.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.774+0000 7f9bf17fa700 1 --2-
192.168.123.103:0/3063953521 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9c00072b50 0x7f9c00082f60 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:20.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.774+0000 7f9bf17fa700 1 --2- 192.168.123.103:0/3063953521 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9c000834a0 0x7f9c00083920 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:20.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.774+0000 7f9bf17fa700 1 -- 192.168.123.103:0/3063953521 >> 192.168.123.103:0/3063953521 conn(0x7f9c0006dae0 msgr2=0x7f9c0006ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:05:20.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.774+0000 7f9bf17fa700 1 -- 192.168.123.103:0/3063953521 shutdown_connections
2026-03-09T00:05:20.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:20.774+0000 7f9bf17fa700 1 -- 192.168.123.103:0/3063953521 wait complete.
2026-03-09T00:05:20.961 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.mgr | length == 1'"'"''
2026-03-09T00:05:21.189 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:05:21.231 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:21 vm03.local ceph-mon[52346]: pgmap v43: 65 pgs: 65 active+clean; 336 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 38 KiB/s rd, 1.8 MiB/s wr, 403 op/s
2026-03-09T00:05:21.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:21 vm06.local ceph-mon[58395]: pgmap v43: 65 pgs: 65 active+clean; 336 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 38 KiB/s rd, 1.8 MiB/s wr, 403 op/s
2026-03-09T00:05:21.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.529+0000 7f1b5a961700 1 -- 192.168.123.103:0/1600865010 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b54107d90 msgr2=0x7f1b541081f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:21.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.529+0000 7f1b5a961700 1 --2- 192.168.123.103:0/1600865010 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b54107d90 0x7f1b541081f0 secure :-1 s=READY pgs=320 cs=0 l=1 rev1=1 crypto rx=0x7f1b4c00d3f0 tx=0x7f1b4c00d700 comp rx=0 tx=0).stop
2026-03-09T00:05:21.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.529+0000 7f1b5a961700 1 -- 192.168.123.103:0/1600865010 shutdown_connections
2026-03-09T00:05:21.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.529+0000 7f1b5a961700 1 --2- 192.168.123.103:0/1600865010 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b54107d90 0x7f1b541081f0 unknown :-1 s=CLOSED pgs=320 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:21.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.529+0000 7f1b5a961700 1 --2- 192.168.123.103:0/1600865010 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1b5410d310 0x7f1b5410d6f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:21.529
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.529+0000 7f1b5a961700 1 -- 192.168.123.103:0/1600865010 >> 192.168.123.103:0/1600865010 conn(0x7f1b5406ce20 msgr2=0x7f1b5406d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:05:21.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.529+0000 7f1b5a961700 1 -- 192.168.123.103:0/1600865010 shutdown_connections
2026-03-09T00:05:21.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.529+0000 7f1b5a961700 1 -- 192.168.123.103:0/1600865010 wait complete.
2026-03-09T00:05:21.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.530+0000 7f1b5a961700 1 Processor -- start
2026-03-09T00:05:21.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.530+0000 7f1b5a961700 1 -- start start
2026-03-09T00:05:21.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.530+0000 7f1b5a961700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b5410d310 0x7f1b5407ce60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:21.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.530+0000 7f1b5a961700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1b5407d3a0 0x7f1b5407d820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:21.529 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.530+0000 7f1b5a961700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1b540819e0 con 0x7f1b5410d310
2026-03-09T00:05:21.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.530+0000 7f1b5a961700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1b54081b50 con 0x7f1b5407d3a0
2026-03-09T00:05:21.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.530+0000 7f1b5915e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1b5407d3a0 0x7f1b5407d820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:21.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.530+0000 7f1b5915e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1b5407d3a0 0x7f1b5407d820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:48422/0 (socket says 192.168.123.103:48422)
2026-03-09T00:05:21.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.530+0000 7f1b5915e700 1 -- 192.168.123.103:0/821558169 learned_addr learned my addr 192.168.123.103:0/821558169 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:05:21.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.531+0000 7f1b5915e700 1 -- 192.168.123.103:0/821558169 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b5410d310 msgr2=0x7f1b5407ce60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:21.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.531+0000 7f1b5915e700 1 --2- 192.168.123.103:0/821558169 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b5410d310 0x7f1b5407ce60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:21.530
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.531+0000 7f1b5915e700 1 -- 192.168.123.103:0/821558169 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1b4c007ed0 con 0x7f1b5407d3a0
2026-03-09T00:05:21.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.531+0000 7f1b5915e700 1 --2- 192.168.123.103:0/821558169 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1b5407d3a0 0x7f1b5407d820 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f1b4c003c60 tx=0x7f1b4c003d40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:05:21.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.531+0000 7f1b4affd700 1 -- 192.168.123.103:0/821558169 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1b4c01c070 con 0x7f1b5407d3a0
2026-03-09T00:05:21.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.531+0000 7f1b5a961700 1 -- 192.168.123.103:0/821558169 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1b54081dd0 con 0x7f1b5407d3a0
2026-03-09T00:05:21.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.531+0000 7f1b5a961700 1 -- 192.168.123.103:0/821558169 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1b540822c0 con 0x7f1b5407d3a0
2026-03-09T00:05:21.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.532+0000 7f1b4affd700 1 -- 192.168.123.103:0/821558169 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f1b4c00deb0 con 0x7f1b5407d3a0
2026-03-09T00:05:21.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.532+0000 7f1b4affd700 1 -- 192.168.123.103:0/821558169 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1b4c0177f0 con 0x7f1b5407d3a0
2026-03-09T00:05:21.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.532+0000 7f1b5a961700 1 -- 192.168.123.103:0/821558169 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1b38005320 con 0x7f1b5407d3a0
2026-03-09T00:05:21.532 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.533+0000 7f1b4affd700 1 -- 192.168.123.103:0/821558169 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 31) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f1b4c017950 con 0x7f1b5407d3a0
2026-03-09T00:05:21.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.533+0000 7f1b4affd700 1 --2- 192.168.123.103:0/821558169 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f1b400798d0 0x7f1b4007bd90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:21.533 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.534+0000 7f1b4affd700 1 -- 192.168.123.103:0/821558169 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f1b4c013070 con 0x7f1b5407d3a0
2026-03-09T00:05:21.535 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.536+0000 7f1b4affd700 1 -- 192.168.123.103:0/821558169 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f1b4c063c80 con 0x7f1b5407d3a0
2026-03-09T00:05:21.544
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.545+0000 7f1b5995f700 1 --2- 192.168.123.103:0/821558169 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f1b400798d0 0x7f1b4007bd90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:21.556 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.556+0000 7f1b5995f700 1 --2- 192.168.123.103:0/821558169 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f1b400798d0 0x7f1b4007bd90 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f1b50005950 tx=0x7f1b500058e0 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:05:21.758 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.758+0000 7f1b5a961700 1 -- 192.168.123.103:0/821558169 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f1b38005cc0 con 0x7f1b5407d3a0
2026-03-09T00:05:21.758 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.759+0000 7f1b4affd700 1 -- 192.168.123.103:0/821558169 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7f1b4c0633d0 con 0x7f1b5407d3a0
2026-03-09T00:05:21.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.762+0000 7f1b48ff9700 1 -- 192.168.123.103:0/821558169 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f1b400798d0 msgr2=0x7f1b4007bd90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:21.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.762+0000 7f1b48ff9700 1 --2- 192.168.123.103:0/821558169 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f1b400798d0 0x7f1b4007bd90 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f1b50005950 tx=0x7f1b500058e0 comp rx=0 tx=0).stop
2026-03-09T00:05:21.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.762+0000 7f1b48ff9700 1 -- 192.168.123.103:0/821558169 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1b5407d3a0 msgr2=0x7f1b5407d820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:21.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.762+0000 7f1b48ff9700 1 --2- 192.168.123.103:0/821558169 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1b5407d3a0 0x7f1b5407d820 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f1b4c003c60 tx=0x7f1b4c003d40 comp rx=0 tx=0).stop
2026-03-09T00:05:21.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.762+0000 7f1b48ff9700 1 -- 192.168.123.103:0/821558169 shutdown_connections
2026-03-09T00:05:21.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.762+0000 7f1b48ff9700 1 --2- 192.168.123.103:0/821558169 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f1b400798d0 0x7f1b4007bd90 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:21.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.762+0000 7f1b48ff9700 1 --2- 192.168.123.103:0/821558169 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b5410d310 0x7f1b5407ce60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:21.762
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.762+0000 7f1b48ff9700 1 --2- 192.168.123.103:0/821558169 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1b5407d3a0 0x7f1b5407d820 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:21.762 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.762+0000 7f1b48ff9700 1 -- 192.168.123.103:0/821558169 >> 192.168.123.103:0/821558169 conn(0x7f1b5406ce20 msgr2=0x7f1b54071100 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:05:21.762 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.763+0000 7f1b48ff9700 1 -- 192.168.123.103:0/821558169 shutdown_connections
2026-03-09T00:05:21.762 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:21.763+0000 7f1b48ff9700 1 -- 192.168.123.103:0/821558169 wait complete.
2026-03-09T00:05:21.771 INFO:teuthology.orchestra.run.vm03.stdout:true
2026-03-09T00:05:21.819 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.mgr | keys'"'"' | grep $sha1'
2026-03-09T00:05:21.995 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:05:22.380 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:22 vm03.local ceph-mon[52346]: from='client.24463 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:05:22.380 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:22 vm03.local ceph-mon[52346]: from='client.? 192.168.123.103:0/821558169' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:22.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:22 vm06.local ceph-mon[58395]: from='client.24463 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:05:22.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:22 vm06.local ceph-mon[58395]: from='client.?
192.168.123.103:0/821558169' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:22.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.465+0000 7f9fe2342700 1 -- 192.168.123.103:0/3381668210 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9fdc107ff0 msgr2=0x7f9fdc1083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:22.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.465+0000 7f9fe2342700 1 --2- 192.168.123.103:0/3381668210 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9fdc107ff0 0x7f9fdc1083d0 secure :-1 s=READY pgs=321 cs=0 l=1 rev1=1 crypto rx=0x7f9fcc005fd0 tx=0x7f9fcc00adf0 comp rx=0 tx=0).stop
2026-03-09T00:05:22.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.465+0000 7f9fe2342700 1 -- 192.168.123.103:0/3381668210 shutdown_connections
2026-03-09T00:05:22.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.465+0000 7f9fe2342700 1 --2- 192.168.123.103:0/3381668210 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9fdc1089a0 0x7f9fdc10be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:22.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.465+0000 7f9fe2342700 1 --2- 192.168.123.103:0/3381668210 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9fdc107ff0 0x7f9fdc1083d0 unknown :-1 s=CLOSED pgs=321 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:22.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.465+0000 7f9fe2342700 1 -- 192.168.123.103:0/3381668210 >> 192.168.123.103:0/3381668210 conn(0x7f9fdc06ce20 msgr2=0x7f9fdc06d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:05:22.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.466+0000 7f9fe2342700 1 -- 192.168.123.103:0/3381668210 shutdown_connections
2026-03-09T00:05:22.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.466+0000 7f9fe2342700 1 -- 192.168.123.103:0/3381668210 wait complete.
2026-03-09T00:05:22.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.466+0000 7f9fe2342700 1 Processor -- start
2026-03-09T00:05:22.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.466+0000 7f9fe2342700 1 -- start start
2026-03-09T00:05:22.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.466+0000 7f9fe2342700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9fdc1089a0 0x7f9fdc07cea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:22.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.466+0000 7f9fe2342700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9fdc07d3e0 0x7f9fdc07d860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:22.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.466+0000 7f9fe2342700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9fdc081a20 con 0x7f9fdc1089a0
2026-03-09T00:05:22.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.466+0000 7f9fe2342700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9fdc081b90 con 0x7f9fdc07d3e0
2026-03-09T00:05:22.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.466+0000 7f9fdb7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9fdc07d3e0 0x7f9fdc07d860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:22.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.466+0000 7f9fdbfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9fdc1089a0 0x7f9fdc07cea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:22.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.466+0000 7f9fdb7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9fdc07d3e0 0x7f9fdc07d860 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:48450/0 (socket says 192.168.123.103:48450)
2026-03-09T00:05:22.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.466+0000 7f9fdb7fe700 1 -- 192.168.123.103:0/3776528134 learned_addr learned my addr 192.168.123.103:0/3776528134 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:05:22.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.467+0000 7f9fdb7fe700 1 -- 192.168.123.103:0/3776528134 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9fdc1089a0 msgr2=0x7f9fdc07cea0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:22.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.467+0000 7f9fdb7fe700 1 --2- 192.168.123.103:0/3776528134 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9fdc1089a0 0x7f9fdc07cea0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:22.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.467+0000 7f9fdb7fe700 1 -- 192.168.123.103:0/3776528134 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9fcc0082d0 con 0x7f9fdc07d3e0
2026-03-09T00:05:22.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.467+0000 7f9fdb7fe700 1 --2- 192.168.123.103:0/3776528134 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9fdc07d3e0 0x7f9fdc07d860 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f9fd400e550 tx=0x7f9fd400e860 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:05:22.467 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.468+0000 7f9fd97fa700 1 -- 192.168.123.103:0/3776528134 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9fd4004d60 con 0x7f9fdc07d3e0
2026-03-09T00:05:22.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.468+0000 7f9fe2342700 1 -- 192.168.123.103:0/3776528134 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9fdc081e70 con 0x7f9fdc07d3e0
2026-03-09T00:05:22.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.468+0000 7f9fe2342700 1 -- 192.168.123.103:0/3776528134 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9fdc0823c0 con 0x7f9fdc07d3e0
2026-03-09T00:05:22.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.468+0000 7f9fd97fa700 1 -- 192.168.123.103:0/3776528134 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9fd4011930 con 0x7f9fdc07d3e0
2026-03-09T00:05:22.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.468+0000 7f9fd97fa700 1 -- 192.168.123.103:0/3776528134 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9fd4005600 con 0x7f9fdc07d3e0
2026-03-09T00:05:22.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.469+0000 7f9fe2342700 1 -- 192.168.123.103:0/3776528134 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9fc8005320 con 0x7f9fdc07d3e0
2026-03-09T00:05:22.469 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.470+0000 7f9fd97fa700 1 -- 192.168.123.103:0/3776528134 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 31) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f9fd4005840 con 0x7f9fdc07d3e0
2026-03-09T00:05:22.469 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.470+0000 7f9fd97fa700 1 --2- 192.168.123.103:0/3776528134 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f9fc40776d0 0x7f9fc4079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:22.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.472+0000 7f9fdbfff700 1 --2- 192.168.123.103:0/3776528134 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f9fc40776d0 0x7f9fc4079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:22.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.472+0000 7f9fd97fa700 1 -- 192.168.123.103:0/3776528134 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f9fd4099750 con 0x7f9fdc07d3e0
2026-03-09T00:05:22.471 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.472+0000 7f9fdbfff700 1 --2- 192.168.123.103:0/3776528134 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642]
conn(0x7f9fc40776d0 0x7f9fc4079b90 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f9fcc00b3a0 tx=0x7f9fcc005b20 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:05:22.478 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.475+0000 7f9fd97fa700 1 -- 192.168.123.103:0/3776528134 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f9fd40621e0 con 0x7f9fdc07d3e0
2026-03-09T00:05:22.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.732+0000 7f9fe2342700 1 -- 192.168.123.103:0/3776528134 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f9fc8005cc0 con 0x7f9fdc07d3e0
2026-03-09T00:05:22.732 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.733+0000 7f9fd97fa700 1 -- 192.168.123.103:0/3776528134 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7f9fd40152c0 con 0x7f9fdc07d3e0
2026-03-09T00:05:22.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.735+0000 7f9fc2ffd700 1 -- 192.168.123.103:0/3776528134 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f9fc40776d0 msgr2=0x7f9fc4079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:22.735 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.735+0000 7f9fc2ffd700 1 --2- 192.168.123.103:0/3776528134 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f9fc40776d0 0x7f9fc4079b90 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f9fcc00b3a0 tx=0x7f9fcc005b20 comp rx=0 tx=0).stop
2026-03-09T00:05:22.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.737+0000 7f9fc2ffd700 1 -- 192.168.123.103:0/3776528134 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9fdc07d3e0 msgr2=0x7f9fdc07d860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:22.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.737+0000 7f9fc2ffd700 1 --2- 192.168.123.103:0/3776528134 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9fdc07d3e0 0x7f9fdc07d860 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f9fd400e550 tx=0x7f9fd400e860 comp rx=0 tx=0).stop
2026-03-09T00:05:22.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.737+0000 7f9fc2ffd700 1 -- 192.168.123.103:0/3776528134 shutdown_connections
2026-03-09T00:05:22.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.737+0000 7f9fc2ffd700 1 --2- 192.168.123.103:0/3776528134 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f9fc40776d0 0x7f9fc4079b90 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:22.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.737+0000 7f9fc2ffd700 1 --2- 192.168.123.103:0/3776528134 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9fdc1089a0 0x7f9fdc07cea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:22.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.737+0000 7f9fc2ffd700 1 --2- 192.168.123.103:0/3776528134 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9fdc07d3e0 0x7f9fdc07d860 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0
tx=0).stop
2026-03-09T00:05:22.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.737+0000 7f9fc2ffd700 1 -- 192.168.123.103:0/3776528134 >> 192.168.123.103:0/3776528134 conn(0x7f9fdc06ce20 msgr2=0x7f9fdc071590 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:05:22.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.737+0000 7f9fc2ffd700 1 -- 192.168.123.103:0/3776528134 shutdown_connections
2026-03-09T00:05:22.736 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:22.737+0000 7f9fc2ffd700 1 -- 192.168.123.103:0/3776528134 wait complete.
2026-03-09T00:05:22.751 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)"
2026-03-09T00:05:22.809 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | length == 2'"'"''
2026-03-09T00:05:22.994 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:05:23.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:23 vm03.local ceph-mon[52346]: pgmap v44: 65 pgs: 65 active+clean; 333 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 53 KiB/s rd, 2.6 MiB/s wr, 918 op/s
2026-03-09T00:05:23.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:23 vm03.local ceph-mon[52346]: from='client.? 192.168.123.103:0/3776528134' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.347+0000 7fb455848700 1 -- 192.168.123.103:0/1246981627 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb45010d310 msgr2=0x7fb45010d6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.347+0000 7fb455848700 1 --2- 192.168.123.103:0/1246981627 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb45010d310 0x7fb45010d6f0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7fb44000bc70 tx=0x7fb44000bf80 comp rx=0 tx=0).stop
2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.347+0000 7fb455848700 1 -- 192.168.123.103:0/1246981627 shutdown_connections
2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.347+0000 7fb455848700 1 --2- 192.168.123.103:0/1246981627 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb450107d90 0x7fb4501081f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.347+0000 7fb455848700 1 --2- 192.168.123.103:0/1246981627 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb45010d310 0x7fb45010d6f0 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.347+0000 7fb455848700 1 -- 192.168.123.103:0/1246981627 >> 192.168.123.103:0/1246981627 conn(0x7fb45006ce20 msgr2=0x7fb45006d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.347+0000 7fb455848700 1 -- 192.168.123.103:0/1246981627
shutdown_connections 2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.347+0000 7fb455848700 1 -- 192.168.123.103:0/1246981627 wait complete. 2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.348+0000 7fb455848700 1 Processor -- start 2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.348+0000 7fb455848700 1 -- start start 2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.348+0000 7fb455848700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb450107d90 0x7fb45007cdb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.348+0000 7fb455848700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb45007d2f0 0x7fb45007d770 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.348+0000 7fb455848700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb450083e20 con 0x7fb450107d90 2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.348+0000 7fb455848700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb450081930 con 0x7fb45007d2f0 2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.348+0000 7fb44ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb45007d2f0 0x7fb45007d770 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.348+0000 7fb44ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb45007d2f0 0x7fb45007d770 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:48464/0 (socket says 192.168.123.103:48464) 2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.348+0000 7fb44ffff700 1 -- 192.168.123.103:0/3236235993 learned_addr learned my addr 192.168.123.103:0/3236235993 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:05:23.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.348+0000 7fb454846700 1 --2- 192.168.123.103:0/3236235993 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb450107d90 0x7fb45007cdb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:23.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.349+0000 7fb44ffff700 1 -- 192.168.123.103:0/3236235993 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb450107d90 msgr2=0x7fb45007cdb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:23.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.349+0000 7fb44ffff700 1 --2- 192.168.123.103:0/3236235993 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb450107d90 0x7fb45007cdb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:23.349 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.349+0000 7fb44ffff700 1 -- 192.168.123.103:0/3236235993 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb44000b920 con 0x7fb45007d2f0 2026-03-09T00:05:23.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.349+0000 7fb44ffff700 1 --2- 192.168.123.103:0/3236235993 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb45007d2f0 0x7fb45007d770 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fb44800e9d0 tx=0x7fb44800ece0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:23.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.349+0000 7fb44dffb700 1 -- 192.168.123.103:0/3236235993 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb44800c4f0 con 0x7fb45007d2f0 2026-03-09T00:05:23.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.349+0000 7fb455848700 1 -- 192.168.123.103:0/3236235993 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb450081c10 con 0x7fb45007d2f0 2026-03-09T00:05:23.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.349+0000 7fb455848700 1 -- 192.168.123.103:0/3236235993 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb450082160 con 0x7fb45007d2f0 2026-03-09T00:05:23.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.350+0000 7fb44dffb700 1 -- 192.168.123.103:0/3236235993 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fb448013070 con 0x7fb45007d2f0 2026-03-09T00:05:23.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.350+0000 7fb44dffb700 1 -- 192.168.123.103:0/3236235993 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb44800f940 con 0x7fb45007d2f0 2026-03-09T00:05:23.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.351+0000 7fb4377fe700 1 -- 192.168.123.103:0/3236235993 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb43c005320 con 0x7fb45007d2f0 2026-03-09T00:05:23.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.352+0000 7fb44dffb700 1 -- 192.168.123.103:0/3236235993 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 31) v1 ==== 100066+0+0 (secure 0 0 0) 0x7fb44800fba0 con 0x7fb45007d2f0 2026-03-09T00:05:23.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.353+0000 7fb44dffb700 1 --2- 192.168.123.103:0/3236235993 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fb438077790 0x7fb438079c50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:23.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.357+0000 7fb44dffb700 1 -- 192.168.123.103:0/3236235993 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fb448015070 con 0x7fb45007d2f0 2026-03-09T00:05:23.359 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.357+0000 7fb44dffb700 1 -- 192.168.123.103:0/3236235993 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fb4480c8c90 con 0x7fb45007d2f0 2026-03-09T00:05:23.359 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.357+0000 7fb454846700 1 --2- 192.168.123.103:0/3236235993 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fb438077790 0x7fb438079c50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:23.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.361+0000 7fb454846700 1 --2- 192.168.123.103:0/3236235993 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fb438077790 0x7fb438079c50 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7fb44000bc40 tx=0x7fb44000d330 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:23.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.570+0000 7fb4377fe700 1 -- 192.168.123.103:0/3236235993 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fb43c005cc0 con 0x7fb45007d2f0 2026-03-09T00:05:23.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.571+0000 7fb44dffb700 1 -- 192.168.123.103:0/3236235993 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7fb448062a30 con 0x7fb45007d2f0 2026-03-09T00:05:23.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.575+0000 7fb455848700 1 -- 192.168.123.103:0/3236235993 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fb438077790 msgr2=0x7fb438079c50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:23.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.575+0000 7fb455848700 1 --2- 192.168.123.103:0/3236235993 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fb438077790 0x7fb438079c50 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7fb44000bc40 tx=0x7fb44000d330 comp rx=0 tx=0).stop 2026-03-09T00:05:23.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.575+0000 7fb455848700 1 -- 192.168.123.103:0/3236235993 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb45007d2f0 msgr2=0x7fb45007d770 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:23.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.575+0000 7fb455848700 1 --2- 192.168.123.103:0/3236235993 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb45007d2f0 0x7fb45007d770 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fb44800e9d0 tx=0x7fb44800ece0 comp rx=0 tx=0).stop 2026-03-09T00:05:23.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.575+0000 7fb455848700 1 -- 192.168.123.103:0/3236235993 shutdown_connections 2026-03-09T00:05:23.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.575+0000 7fb455848700 1 --2- 192.168.123.103:0/3236235993 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fb438077790 0x7fb438079c50 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:23.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.575+0000 7fb455848700 1 --2- 192.168.123.103:0/3236235993 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb450107d90 0x7fb45007cdb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:23.574 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.575+0000 7fb455848700 1 --2- 192.168.123.103:0/3236235993 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb45007d2f0 0x7fb45007d770 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:23.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.575+0000 7fb455848700 1 -- 192.168.123.103:0/3236235993 >> 192.168.123.103:0/3236235993 conn(0x7fb45006ce20 msgr2=0x7fb450071e20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:23.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.575+0000 7fb455848700 1 -- 192.168.123.103:0/3236235993 shutdown_connections 2026-03-09T00:05:23.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:23.575+0000 7fb455848700 1 -- 192.168.123.103:0/3236235993 wait complete. 2026-03-09T00:05:23.583 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T00:05:23.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:23 vm06.local ceph-mon[58395]: pgmap v44: 65 pgs: 65 active+clean; 333 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 53 KiB/s rd, 2.6 MiB/s wr, 918 op/s 2026-03-09T00:05:23.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:23 vm06.local ceph-mon[58395]: from='client.? 192.168.123.103:0/3776528134' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:24.455 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade check quay.ceph.io/ceph-ci/ceph:$sha1 | jq -e '"'"'.up_to_date | length == 2'"'"'' 2026-03-09T00:05:24.643 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:05:24.707 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:24 vm03.local ceph-mon[52346]: pgmap v45: 65 pgs: 65 active+clean; 333 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.7 MiB/s wr, 717 op/s 2026-03-09T00:05:24.707 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:24 vm03.local ceph-mon[52346]: from='client.? 192.168.123.103:0/3236235993' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:24.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:24 vm06.local ceph-mon[58395]: pgmap v45: 65 pgs: 65 active+clean; 333 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.7 MiB/s wr, 717 op/s 2026-03-09T00:05:24.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:24 vm06.local ceph-mon[58395]: from='client.? 
192.168.123.103:0/3236235993' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:24.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.956+0000 7fc85dd49700 1 -- 192.168.123.103:0/3609819221 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc858100380 msgr2=0x7fc85810d040 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:24.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.956+0000 7fc85dd49700 1 --2- 192.168.123.103:0/3609819221 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc858100380 0x7fc85810d040 secure :-1 s=READY pgs=322 cs=0 l=1 rev1=1 crypto rx=0x7fc844009b00 tx=0x7fc844009e10 comp rx=0 tx=0).stop 2026-03-09T00:05:24.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.960+0000 7fc85dd49700 1 -- 192.168.123.103:0/3609819221 shutdown_connections 2026-03-09T00:05:24.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.960+0000 7fc85dd49700 1 --2- 192.168.123.103:0/3609819221 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc858100380 0x7fc85810d040 unknown :-1 s=CLOSED pgs=322 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:24.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.960+0000 7fc85dd49700 1 --2- 192.168.123.103:0/3609819221 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc8580ff9d0 0x7fc8580ffdb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:24.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.960+0000 7fc85dd49700 1 -- 192.168.123.103:0/3609819221 >> 192.168.123.103:0/3609819221 conn(0x7fc8580fb610 msgr2=0x7fc8580fda30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:24.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.960+0000 7fc85dd49700 1 -- 192.168.123.103:0/3609819221 shutdown_connections 2026-03-09T00:05:24.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.960+0000 7fc85dd49700 1 -- 192.168.123.103:0/3609819221 wait complete. 
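(The `true` on stdout a few records above is the first version gate passing: after the staggered mgr upgrade, `ceph versions` must report exactly two release strings cluster-wide — the v18.2.1 daemons plus the 19.2.3-678-ge911bdeb mgrs. The harness expresses these gates as jq predicates because `jq -e` exits nonzero when the expression evaluates to false or null, which is what lets teuthology treat the shell command as pass/fail. A minimal standalone sketch of the same check:

    # pass only if exactly two distinct versions are present across all daemons
    ceph versions | jq -e '.overall | length == 2'
)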
2026-03-09T00:05:24.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.961+0000 7fc85dd49700 1 Processor -- start 2026-03-09T00:05:24.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.961+0000 7fc85dd49700 1 -- start start 2026-03-09T00:05:24.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.961+0000 7fc85dd49700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc8580ff9d0 0x7fc85819e910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:24.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.961+0000 7fc85dd49700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc858100380 0x7fc85819ee50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:24.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.962+0000 7fc85dd49700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc85819f4e0 con 0x7fc858100380 2026-03-09T00:05:24.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.962+0000 7fc85dd49700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc858198990 con 0x7fc8580ff9d0 2026-03-09T00:05:24.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.962+0000 7fc85cd47700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc8580ff9d0 0x7fc85819e910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:24.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.962+0000 7fc85cd47700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc8580ff9d0 0x7fc85819e910 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:50528/0 (socket says 192.168.123.103:50528) 2026-03-09T00:05:24.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.962+0000 7fc85cd47700 1 -- 192.168.123.103:0/2706455606 learned_addr learned my addr 192.168.123.103:0/2706455606 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:05:24.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.962+0000 7fc85cd47700 1 -- 192.168.123.103:0/2706455606 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc858100380 msgr2=0x7fc85819ee50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:24.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.962+0000 7fc85cd47700 1 --2- 192.168.123.103:0/2706455606 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc858100380 0x7fc85819ee50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:24.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.962+0000 7fc85cd47700 1 -- 192.168.123.103:0/2706455606 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc8440097e0 con 0x7fc8580ff9d0 2026-03-09T00:05:24.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.963+0000 7fc85cd47700 1 --2- 192.168.123.103:0/2706455606 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc8580ff9d0 0x7fc85819e910 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7fc84c00eb10 tx=0x7fc84c00eed0 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:24.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.964+0000 7fc855ffb700 1 -- 192.168.123.103:0/2706455606 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc84c00cca0 con 0x7fc8580ff9d0 2026-03-09T00:05:24.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.964+0000 7fc85dd49700 1 -- 192.168.123.103:0/2706455606 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc858198c70 con 0x7fc8580ff9d0 2026-03-09T00:05:24.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.964+0000 7fc85dd49700 1 -- 192.168.123.103:0/2706455606 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc8581991c0 con 0x7fc8580ff9d0 2026-03-09T00:05:24.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.964+0000 7fc855ffb700 1 -- 192.168.123.103:0/2706455606 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fc84c00ce00 con 0x7fc8580ff9d0 2026-03-09T00:05:24.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.964+0000 7fc855ffb700 1 -- 192.168.123.103:0/2706455606 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc84c018910 con 0x7fc8580ff9d0 2026-03-09T00:05:24.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.965+0000 7fc85dd49700 1 -- 192.168.123.103:0/2706455606 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc83c005320 con 0x7fc8580ff9d0 2026-03-09T00:05:24.967 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.967+0000 7fc855ffb700 1 -- 192.168.123.103:0/2706455606 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 31) v1 ==== 100066+0+0 (secure 0 0 0) 0x7fc84c018a70 con 0x7fc8580ff9d0 2026-03-09T00:05:24.967 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.967+0000 7fc855ffb700 1 --2- 192.168.123.103:0/2706455606 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fc848077620 0x7fc848079ae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:24.967 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.967+0000 7fc855ffb700 1 -- 192.168.123.103:0/2706455606 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fc84c014070 con 0x7fc8580ff9d0 2026-03-09T00:05:24.967 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.967+0000 7fc857fff700 1 --2- 192.168.123.103:0/2706455606 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fc848077620 0x7fc848079ae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:24.967 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.968+0000 7fc857fff700 1 --2- 192.168.123.103:0/2706455606 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fc848077620 0x7fc848079ae0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fc84400b5c0 tx=0x7fc844005fd0 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:24.968 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:24.969+0000 7fc855ffb700 1 -- 192.168.123.103:0/2706455606 <== 
mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fc84c063130 con 0x7fc8580ff9d0 2026-03-09T00:05:25.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:25.112+0000 7fc85dd49700 1 -- 192.168.123.103:0/2706455606 --> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] -- mgr_command(tid 0: {"prefix": "orch upgrade check", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}) v1 -- 0x7fc83c000c90 con 0x7fc848077620 2026-03-09T00:05:26.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:25 vm03.local ceph-mon[52346]: from='client.24479 -' entity='client.admin' cmd=[{"prefix": "orch upgrade check", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:05:26.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:25 vm06.local ceph-mon[58395]: from='client.24479 -' entity='client.admin' cmd=[{"prefix": "orch upgrade check", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:05:26.971 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:26 vm03.local ceph-mon[52346]: pgmap v46: 65 pgs: 65 active+clean; 333 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.7 MiB/s wr, 717 op/s 2026-03-09T00:05:27.012 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.010+0000 7fc855ffb700 1 -- 192.168.123.103:0/2706455606 <== mgr.24393 v2:192.168.123.103:6800/3123605642 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+5308 (secure 0 0 0) 0x7fc83c000c90 con 0x7fc848077620 2026-03-09T00:05:27.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.014+0000 7fc8437fe700 1 -- 192.168.123.103:0/2706455606 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fc848077620 msgr2=0x7fc848079ae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:27.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.014+0000 7fc8437fe700 1 --2- 192.168.123.103:0/2706455606 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fc848077620 0x7fc848079ae0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fc84400b5c0 tx=0x7fc844005fd0 comp rx=0 tx=0).stop 2026-03-09T00:05:27.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.014+0000 7fc8437fe700 1 -- 192.168.123.103:0/2706455606 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc8580ff9d0 msgr2=0x7fc85819e910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:27.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.014+0000 7fc8437fe700 1 --2- 192.168.123.103:0/2706455606 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc8580ff9d0 0x7fc85819e910 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7fc84c00eb10 tx=0x7fc84c00eed0 comp rx=0 tx=0).stop 2026-03-09T00:05:27.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.014+0000 7fc8437fe700 1 -- 192.168.123.103:0/2706455606 shutdown_connections 2026-03-09T00:05:27.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.014+0000 7fc8437fe700 1 --2- 192.168.123.103:0/2706455606 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fc848077620 0x7fc848079ae0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
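(The mgr_command_reply above answers the `ceph orch upgrade check` issued earlier; the mark_down/stop records around it are the short-lived CLI messenger tearing itself down once the JSON reply is in hand. The full exchange is visible in the preceding records: connect to a mon, subscribe to monmap/mgrmap/osdmap, locate the active mgr (mgr.24393), send one mgr_command, read the reply, close everything. The pass condition is again a jq predicate over that reply — a sketch, where the expected count of 2 appears to correspond to the two mgr daemons already running the target CI image:

    # both upgraded mgrs should be listed as up to date for the target image
    ceph orch upgrade check quay.ceph.io/ceph-ci/ceph:$sha1 | jq -e '.up_to_date | length == 2'
)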
2026-03-09T00:05:27.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.014+0000 7fc8437fe700 1 --2- 192.168.123.103:0/2706455606 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc8580ff9d0 0x7fc85819e910 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:27.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.014+0000 7fc8437fe700 1 --2- 192.168.123.103:0/2706455606 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc858100380 0x7fc85819ee50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:27.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.014+0000 7fc8437fe700 1 -- 192.168.123.103:0/2706455606 >> 192.168.123.103:0/2706455606 conn(0x7fc8580fb610 msgr2=0x7fc8580fcba0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:27.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.014+0000 7fc8437fe700 1 -- 192.168.123.103:0/2706455606 shutdown_connections 2026-03-09T00:05:27.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.014+0000 7fc8437fe700 1 -- 192.168.123.103:0/2706455606 wait complete. 2026-03-09T00:05:27.024 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T00:05:27.071 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:26 vm06.local ceph-mon[58395]: pgmap v46: 65 pgs: 65 active+clean; 333 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.7 MiB/s wr, 717 op/s 2026-03-09T00:05:27.071 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-09T00:05:27.312 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:05:27.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.616+0000 7f07d229d700 1 -- 192.168.123.103:0/1468357003 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f07cc107ff0 msgr2=0x7f07cc1083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:27.616 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.616+0000 7f07d229d700 1 --2- 192.168.123.103:0/1468357003 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f07cc107ff0 0x7f07cc1083d0 secure :-1 s=READY pgs=323 cs=0 l=1 rev1=1 crypto rx=0x7f07bc00bc70 tx=0x7f07bc00bf80 comp rx=0 tx=0).stop 2026-03-09T00:05:27.616 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.616+0000 7f07d229d700 1 -- 192.168.123.103:0/1468357003 shutdown_connections 2026-03-09T00:05:27.616 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.616+0000 7f07d229d700 1 --2- 192.168.123.103:0/1468357003 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f07cc1089a0 0x7f07cc10be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:27.616 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.616+0000 7f07d229d700 1 --2- 192.168.123.103:0/1468357003 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f07cc107ff0 0x7f07cc1083d0 unknown :-1 s=CLOSED pgs=323 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:27.616 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.616+0000 7f07d229d700 1 -- 
192.168.123.103:0/1468357003 >> 192.168.123.103:0/1468357003 conn(0x7f07cc06ce20 msgr2=0x7f07cc06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:27.616 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.616+0000 7f07d229d700 1 -- 192.168.123.103:0/1468357003 shutdown_connections 2026-03-09T00:05:27.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.616+0000 7f07d229d700 1 -- 192.168.123.103:0/1468357003 wait complete. 2026-03-09T00:05:27.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.617+0000 7f07d229d700 1 Processor -- start 2026-03-09T00:05:27.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.617+0000 7f07d229d700 1 -- start start 2026-03-09T00:05:27.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.617+0000 7f07d229d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f07cc1089a0 0x7f07cc07cee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:27.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.617+0000 7f07d229d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f07cc07d420 0x7f07cc07d8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:27.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.617+0000 7f07d229d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f07cc081a60 con 0x7f07cc07d420 2026-03-09T00:05:27.617 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.617+0000 7f07d229d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f07cc081bd0 con 0x7f07cc1089a0 2026-03-09T00:05:27.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.617+0000 7f07cbfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f07cc1089a0 0x7f07cc07cee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:27.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.617+0000 7f07cbfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f07cc1089a0 0x7f07cc07cee0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:50536/0 (socket says 192.168.123.103:50536) 2026-03-09T00:05:27.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.617+0000 7f07cbfff700 1 -- 192.168.123.103:0/1655009515 learned_addr learned my addr 192.168.123.103:0/1655009515 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:05:27.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.618+0000 7f07cbfff700 1 -- 192.168.123.103:0/1655009515 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f07cc07d420 msgr2=0x7f07cc07d8a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:27.618 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.618+0000 7f07cbfff700 1 --2- 192.168.123.103:0/1655009515 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f07cc07d420 0x7f07cc07d8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:27.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.618+0000 7f07cbfff700 1 -- 192.168.123.103:0/1655009515 --> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f07bc00b920 con 0x7f07cc1089a0 2026-03-09T00:05:27.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.619+0000 7f07cbfff700 1 --2- 192.168.123.103:0/1655009515 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f07cc1089a0 0x7f07cc07cee0 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f07bc004510 tx=0x7f07bc0045f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:27.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.620+0000 7f07c97fa700 1 -- 192.168.123.103:0/1655009515 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f07bc010030 con 0x7f07cc1089a0 2026-03-09T00:05:27.619 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.620+0000 7f07d229d700 1 -- 192.168.123.103:0/1655009515 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f07cc081e50 con 0x7f07cc1089a0 2026-03-09T00:05:27.621 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.620+0000 7f07d229d700 1 -- 192.168.123.103:0/1655009515 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f07cc082340 con 0x7f07cc1089a0 2026-03-09T00:05:27.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.620+0000 7f07d229d700 1 -- 192.168.123.103:0/1655009515 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f07cc04f2e0 con 0x7f07cc1089a0 2026-03-09T00:05:27.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.623+0000 7f07c97fa700 1 -- 192.168.123.103:0/1655009515 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f07bc01d940 con 0x7f07cc1089a0 2026-03-09T00:05:27.624 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.625+0000 7f07c97fa700 1 -- 192.168.123.103:0/1655009515 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f07bc014af0 con 0x7f07cc1089a0 2026-03-09T00:05:27.624 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.625+0000 7f07c97fa700 1 -- 192.168.123.103:0/1655009515 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 31) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f07bc014d10 con 0x7f07cc1089a0 2026-03-09T00:05:27.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.626+0000 7f07c97fa700 1 --2- 192.168.123.103:0/1655009515 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f07b4077860 0x7f07b4079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:27.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.626+0000 7f07c97fa700 1 -- 192.168.123.103:0/1655009515 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f07bc09beb0 con 0x7f07cc1089a0 2026-03-09T00:05:27.627 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.628+0000 7f07cb7fe700 1 --2- 192.168.123.103:0/1655009515 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f07b4077860 0x7f07b4079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:27.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.629+0000 
7f07c97fa700 1 -- 192.168.123.103:0/1655009515 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f07bc0649c0 con 0x7f07cc1089a0 2026-03-09T00:05:27.628 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.629+0000 7f07cb7fe700 1 --2- 192.168.123.103:0/1655009515 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f07b4077860 0x7f07b4079d20 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f07c40076d0 tx=0x7f07c4007820 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:27.802 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.802+0000 7f07d229d700 1 -- 192.168.123.103:0/1655009515 --> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f07cc078ad0 con 0x7f07b4077860 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (44s) 26s ago 5m 16.7M - 0.25.0 c8568f914cd2 6bc39b415ac6 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (5m) 26s ago 5m 8434k - 18.2.1 5be31c24972a b93f8a220f71 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (5m) 60s ago 5m 8522k - 18.2.1 5be31c24972a d06aea65065e 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (5m) 26s ago 5m 7402k - 18.2.1 5be31c24972a 320f8ef2d2cb 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (5m) 60s ago 5m 7411k - 18.2.1 5be31c24972a d9eb9a54d81d 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (29s) 26s ago 5m 39.0M - 10.4.0 c8b91775d855 00a3394cdec9 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (3m) 26s ago 3m 16.7M - 18.2.1 5be31c24972a 404501ca3f76 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (3m) 26s ago 3m 253M - 18.2.1 5be31c24972a b71cb8823eff 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (3m) 60s ago 3m 18.5M - 18.2.1 5be31c24972a 868f24dd3b07 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 running (3m) 60s ago 3m 14.7M - 18.2.1 5be31c24972a 84dbd6c37a69 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:8443,9283,8765 running (97s) 26s ago 6m 627M - 19.2.3-678-ge911bdeb 654f31e6858e 5c5f89207f88 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (75s) 60s ago 5m 487M - 19.2.3-678-ge911bdeb 654f31e6858e e3d70135f3ac 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (6m) 26s ago 6m 55.2M 2048M 18.2.1 5be31c24972a f9863944dcfb 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (5m) 60s ago 5m 45.6M 2048M 18.2.1 5be31c24972a 1e39c7ad3e9f 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (65s) 26s ago 5m 9261k - 1.7.0 72c9c2088986 0cdd6e671b4f 2026-03-09T00:05:27.810 
INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (61s) 60s ago 5m 5511k - 1.7.0 72c9c2088986 848c5c72973d 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (4m) 26s ago 4m 343M 4096M 18.2.1 5be31c24972a 7582c56d43e3 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (4m) 26s ago 4m 360M 4096M 18.2.1 5be31c24972a 7bc729875521 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (4m) 26s ago 4m 314M 4096M 18.2.1 5be31c24972a 00566abbcc16 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (4m) 60s ago 4m 421M 4096M 18.2.1 5be31c24972a 1ece32056ab6 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (4m) 60s ago 4m 384M 4096M 18.2.1 5be31c24972a ee6260a1124c 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (4m) 60s ago 4m 342M 4096M 18.2.1 5be31c24972a f51e8cd94301 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (49s) 26s ago 5m 47.0M - 2.51.0 1d3b7f56885b 16d6071e49fb 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.808+0000 7f07c97fa700 1 -- 192.168.123.103:0/1655009515 <== mgr.24393 v2:192.168.123.103:6800/3123605642 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f07cc078ad0 con 0x7f07b4077860 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.811+0000 7f07b2ffd700 1 -- 192.168.123.103:0/1655009515 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f07b4077860 msgr2=0x7f07b4079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:27.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.811+0000 7f07b2ffd700 1 --2- 192.168.123.103:0/1655009515 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f07b4077860 0x7f07b4079d20 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f07c40076d0 tx=0x7f07c4007820 comp rx=0 tx=0).stop 2026-03-09T00:05:27.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.811+0000 7f07b2ffd700 1 -- 192.168.123.103:0/1655009515 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f07cc1089a0 msgr2=0x7f07cc07cee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:27.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.811+0000 7f07b2ffd700 1 --2- 192.168.123.103:0/1655009515 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f07cc1089a0 0x7f07cc07cee0 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f07bc004510 tx=0x7f07bc0045f0 comp rx=0 tx=0).stop 2026-03-09T00:05:27.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.811+0000 7f07b2ffd700 1 -- 192.168.123.103:0/1655009515 shutdown_connections 2026-03-09T00:05:27.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.811+0000 7f07b2ffd700 1 --2- 192.168.123.103:0/1655009515 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f07b4077860 0x7f07b4079d20 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:27.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.811+0000 7f07b2ffd700 1 --2- 192.168.123.103:0/1655009515 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f07cc1089a0 0x7f07cc07cee0 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:27.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.811+0000 7f07b2ffd700 1 --2- 192.168.123.103:0/1655009515 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f07cc07d420 0x7f07cc07d8a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:27.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.811+0000 7f07b2ffd700 1 -- 192.168.123.103:0/1655009515 >> 192.168.123.103:0/1655009515 conn(0x7f07cc06ce20 msgr2=0x7f07cc071590 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:27.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.811+0000 7f07b2ffd700 1 -- 192.168.123.103:0/1655009515 shutdown_connections 2026-03-09T00:05:27.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:27.812+0000 7f07b2ffd700 1 -- 192.168.123.103:0/1655009515 wait complete. 2026-03-09T00:05:27.903 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 2026-03-09T00:05:27.904 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T00:05:27.904 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mgr mgr/orchestrator/fail_fs true' 2026-03-09T00:05:28.134 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:05:28.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.426+0000 7fa1fbfff700 1 -- 192.168.123.103:0/1414887685 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1fc10f340 msgr2=0x7fa1fc10f720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:28.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.426+0000 7fa1fbfff700 1 --2- 192.168.123.103:0/1414887685 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1fc10f340 0x7fa1fc10f720 secure :-1 s=READY pgs=324 cs=0 l=1 rev1=1 crypto rx=0x7fa1ec009a60 tx=0x7fa1ec009d70 comp rx=0 tx=0).stop 2026-03-09T00:05:28.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.426+0000 7fa1fbfff700 1 -- 192.168.123.103:0/1414887685 shutdown_connections 2026-03-09T00:05:28.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.426+0000 7fa1fbfff700 1 --2- 192.168.123.103:0/1414887685 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1fc10d0f0 0x7fa1fc10d570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:28.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.426+0000 7fa1fbfff700 1 --2- 192.168.123.103:0/1414887685 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1fc10f340 0x7fa1fc10f720 unknown :-1 s=CLOSED pgs=324 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:28.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.426+0000 7fa1fbfff700 1 -- 192.168.123.103:0/1414887685 >> 192.168.123.103:0/1414887685 conn(0x7fa1fc06ce20 msgr2=0x7fa1fc06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:28.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.426+0000 7fa1fbfff700 1 -- 192.168.123.103:0/1414887685 shutdown_connections 2026-03-09T00:05:28.427 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.427+0000 7fa1fbfff700 1 -- 192.168.123.103:0/1414887685 wait complete. 2026-03-09T00:05:28.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.428+0000 7fa1fbfff700 1 Processor -- start 2026-03-09T00:05:28.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.428+0000 7fa1fbfff700 1 -- start start 2026-03-09T00:05:28.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.428+0000 7fa1fbfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1fc10d0f0 0x7fa1fc11bf30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:28.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.428+0000 7fa1fbfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1fc10f340 0x7fa1fc116f30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:28.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.428+0000 7fa1fbfff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa1fc117500 con 0x7fa1fc10f340 2026-03-09T00:05:28.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.428+0000 7fa1fbfff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa1fc117670 con 0x7fa1fc10d0f0 2026-03-09T00:05:28.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.429+0000 7fa1faffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1fc10d0f0 0x7fa1fc11bf30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:28.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.429+0000 7fa1faffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1fc10d0f0 0x7fa1fc11bf30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:50566/0 (socket says 192.168.123.103:50566) 2026-03-09T00:05:28.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.429+0000 7fa1faffd700 1 -- 192.168.123.103:0/2406072954 learned_addr learned my addr 192.168.123.103:0/2406072954 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:05:28.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.429+0000 7fa1faffd700 1 -- 192.168.123.103:0/2406072954 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1fc10f340 msgr2=0x7fa1fc116f30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:28.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.429+0000 7fa1faffd700 1 --2- 192.168.123.103:0/2406072954 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1fc10f340 0x7fa1fc116f30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:28.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.429+0000 7fa1faffd700 1 -- 192.168.123.103:0/2406072954 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa1ec009710 con 0x7fa1fc10d0f0 2026-03-09T00:05:28.428 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.429+0000 7fa1faffd700 1 --2- 192.168.123.103:0/2406072954 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7fa1fc10d0f0 0x7fa1fc11bf30 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7fa1ec00b5c0 tx=0x7fa1ec004180 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:28.429 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.430+0000 7fa1e3fff700 1 -- 192.168.123.103:0/2406072954 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa1ec01d070 con 0x7fa1fc10d0f0 2026-03-09T00:05:28.430 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.430+0000 7fa1fbfff700 1 -- 192.168.123.103:0/2406072954 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa1fc1178f0 con 0x7fa1fc10d0f0 2026-03-09T00:05:28.430 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.430+0000 7fa1fbfff700 1 -- 192.168.123.103:0/2406072954 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa1fc1b8400 con 0x7fa1fc10d0f0 2026-03-09T00:05:28.430 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.430+0000 7fa1e3fff700 1 -- 192.168.123.103:0/2406072954 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa1ec022950 con 0x7fa1fc10d0f0 2026-03-09T00:05:28.430 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.430+0000 7fa1fbfff700 1 -- 192.168.123.103:0/2406072954 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa1fc04f2e0 con 0x7fa1fc10d0f0 2026-03-09T00:05:28.430 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.431+0000 7fa1e3fff700 1 -- 192.168.123.103:0/2406072954 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa1ec00f970 con 0x7fa1fc10d0f0 2026-03-09T00:05:28.430 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.431+0000 7fa1e3fff700 1 -- 192.168.123.103:0/2406072954 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 31) v1 ==== 100066+0+0 (secure 0 0 0) 0x7fa1ec00fc10 con 0x7fa1fc10d0f0 2026-03-09T00:05:28.431 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.431+0000 7fa1e3fff700 1 --2- 192.168.123.103:0/2406072954 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fa1e4077790 0x7fa1e4079c50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:28.431 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.431+0000 7fa1e3fff700 1 -- 192.168.123.103:0/2406072954 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fa1ec09c0c0 con 0x7fa1fc10d0f0 2026-03-09T00:05:28.431 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.432+0000 7fa1fa7fc700 1 --2- 192.168.123.103:0/2406072954 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fa1e4077790 0x7fa1e4079c50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:28.431 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.432+0000 7fa1fa7fc700 1 --2- 192.168.123.103:0/2406072954 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fa1e4077790 0x7fa1e4079c50 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fa1f0005950 tx=0x7fa1f00058e0 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
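(The cephadm.shell task is now setting `mgr/orchestrator/fail_fs` to true, dispatched as a mon_command below. Per the option name and the cephadm upgrade documentation, with this flag on the orchestrator is expected to fail the CephFS file system and upgrade all MDS daemons at once, rather than following the default reduce-max_mds-and-failover sequence. It is an ordinary mgr module option, set like any other config key:

    # opt the upcoming upgrade into the fail-fs MDS upgrade path
    ceph config set mgr mgr/orchestrator/fail_fs true
)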
2026-03-09T00:05:28.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.433+0000 7fa1e3fff700 1 -- 192.168.123.103:0/2406072954 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fa1ec064c80 con 0x7fa1fc10d0f0 2026-03-09T00:05:28.594 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.590+0000 7fa1fbfff700 1 -- 192.168.123.103:0/2406072954 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command([{prefix=config set, name=mgr/orchestrator/fail_fs}] v 0) v1 -- 0x7fa1fc04ea90 con 0x7fa1fc10d0f0 2026-03-09T00:05:28.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.689+0000 7fa1e3fff700 1 -- 192.168.123.103:0/2406072954 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/orchestrator/fail_fs}]=0 v37) v1 ==== 125+0+0 (secure 0 0 0) 0x7fa1ec064430 con 0x7fa1fc10d0f0 2026-03-09T00:05:28.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.692+0000 7fa1fbfff700 1 -- 192.168.123.103:0/2406072954 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fa1e4077790 msgr2=0x7fa1e4079c50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:28.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.692+0000 7fa1fbfff700 1 --2- 192.168.123.103:0/2406072954 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fa1e4077790 0x7fa1e4079c50 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fa1f0005950 tx=0x7fa1f00058e0 comp rx=0 tx=0).stop 2026-03-09T00:05:28.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.692+0000 7fa1fbfff700 1 -- 192.168.123.103:0/2406072954 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1fc10d0f0 msgr2=0x7fa1fc11bf30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:28.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.692+0000 7fa1fbfff700 1 --2- 192.168.123.103:0/2406072954 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1fc10d0f0 0x7fa1fc11bf30 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7fa1ec00b5c0 tx=0x7fa1ec004180 comp rx=0 tx=0).stop 2026-03-09T00:05:28.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.693+0000 7fa1fbfff700 1 -- 192.168.123.103:0/2406072954 shutdown_connections 2026-03-09T00:05:28.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.693+0000 7fa1fbfff700 1 --2- 192.168.123.103:0/2406072954 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fa1e4077790 0x7fa1e4079c50 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:28.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.693+0000 7fa1fbfff700 1 --2- 192.168.123.103:0/2406072954 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1fc10d0f0 0x7fa1fc11bf30 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:28.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.693+0000 7fa1fbfff700 1 --2- 192.168.123.103:0/2406072954 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1fc10f340 0x7fa1fc116f30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:28.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.693+0000 7fa1fbfff700 1 -- 192.168.123.103:0/2406072954
>> 192.168.123.103:0/2406072954 conn(0x7fa1fc06ce20 msgr2=0x7fa1fc0702e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:28.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.694+0000 7fa1fbfff700 1 -- 192.168.123.103:0/2406072954 shutdown_connections 2026-03-09T00:05:28.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:28.694+0000 7fa1fbfff700 1 -- 192.168.123.103:0/2406072954 wait complete. 2026-03-09T00:05:28.817 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 2026-03-09T00:05:28.817 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T00:05:28.817 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force' 2026-03-09T00:05:29.054 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:05:29.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:28 vm03.local ceph-mon[52346]: pgmap v47: 65 pgs: 65 active+clean; 316 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 44 KiB/s rd, 2.1 MiB/s wr, 819 op/s 2026-03-09T00:05:29.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:28 vm03.local ceph-mon[52346]: from='client.24483 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:05:29.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:28 vm03.local ceph-mon[52346]: from='client.? ' entity='client.admin' 2026-03-09T00:05:29.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:28 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:05:29.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:28 vm06.local ceph-mon[58395]: pgmap v47: 65 pgs: 65 active+clean; 316 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 44 KiB/s rd, 2.1 MiB/s wr, 819 op/s 2026-03-09T00:05:29.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:28 vm06.local ceph-mon[58395]: from='client.24483 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:05:29.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:28 vm06.local ceph-mon[58395]: from='client.? 
' entity='client.admin' 2026-03-09T00:05:29.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:28 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:05:29.473 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.473+0000 7f0eef2d8700 1 -- 192.168.123.103:0/3613507990 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ee810a050 msgr2=0x7f0ee81127a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:29.473 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.473+0000 7f0eef2d8700 1 --2- 192.168.123.103:0/3613507990 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ee810a050 0x7f0ee81127a0 secure :-1 s=READY pgs=325 cs=0 l=1 rev1=1 crypto rx=0x7f0ed8009a60 tx=0x7f0ed8009d70 comp rx=0 tx=0).stop 2026-03-09T00:05:29.478 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.478+0000 7f0eef2d8700 1 -- 192.168.123.103:0/3613507990 shutdown_connections 2026-03-09T00:05:29.478 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.478+0000 7f0eef2d8700 1 --2- 192.168.123.103:0/3613507990 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ee810a050 0x7f0ee81127a0 unknown :-1 s=CLOSED pgs=325 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:29.478 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.478+0000 7f0eef2d8700 1 --2- 192.168.123.103:0/3613507990 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0ee8109730 0x7f0ee8109b10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:29.478 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.478+0000 7f0eef2d8700 1 -- 192.168.123.103:0/3613507990 >> 192.168.123.103:0/3613507990 conn(0x7f0ee806d1b0 msgr2=0x7f0ee806d5c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:29.479 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.480+0000 7f0eef2d8700 1 -- 192.168.123.103:0/3613507990 shutdown_connections 2026-03-09T00:05:29.479 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.480+0000 7f0eef2d8700 1 -- 192.168.123.103:0/3613507990 wait complete. 
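[Note] Each cephadm.shell command in this run goes through the wrapper visible in the DEBUG line above: teuthology runs `sudo /home/ubuntu/cephtest/cephadm --image <image> shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid <fsid> -e sha1=<sha1> -- bash -c '<command>'`, so every command executes in a fresh container pinned to the v18.2.1 image, with the `-e` variables exported into that shell. A minimal sketch of that invocation, using the paths shown in the log (illustrative only, not the real teuthology tasks.cephadm helper):

    import subprocess

    def cephadm_shell(image, fsid, command, env=None):
        # Mirrors the wrapper seen in the DEBUG lines above (a sketch,
        # not teuthology's actual implementation).
        argv = ["sudo", "/home/ubuntu/cephtest/cephadm",
                "--image", image, "shell",
                "-c", "/etc/ceph/ceph.conf",
                "-k", "/etc/ceph/ceph.client.admin.keyring",
                "--fsid", fsid]
        for key, val in (env or {}).items():
            argv += ["-e", f"{key}={val}"]
        argv += ["--", "bash", "-c", command]
        return subprocess.run(argv, check=True, capture_output=True, text=True)

    # e.g. the config set logged just above:
    # cephadm_shell("quay.io/ceph/ceph:v18.2.1",
    #               "ae8f0172-1b4a-11f1-916a-712b2ac006b7",
    #               "ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force",
    #               env={"sha1": "e911bdebe5c8faa3800735d1568fcdca65db60df"})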
2026-03-09T00:05:29.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.481+0000 7f0eef2d8700 1 Processor -- start 2026-03-09T00:05:29.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.481+0000 7f0eef2d8700 1 -- start start 2026-03-09T00:05:29.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.481+0000 7f0eef2d8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0ee8109730 0x7f0ee8113080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:29.480 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.481+0000 7f0eef2d8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ee810a050 0x7f0ee81135c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:29.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.481+0000 7f0eef2d8700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ee8114e30 con 0x7f0ee810a050 2026-03-09T00:05:29.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.481+0000 7f0eef2d8700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ee8114fa0 con 0x7f0ee8109730 2026-03-09T00:05:29.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.481+0000 7f0eec873700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ee810a050 0x7f0ee81135c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:29.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.481+0000 7f0eec873700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ee810a050 0x7f0ee81135c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:46512/0 (socket says 192.168.123.103:46512) 2026-03-09T00:05:29.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.481+0000 7f0eec873700 1 -- 192.168.123.103:0/1512251706 learned_addr learned my addr 192.168.123.103:0/1512251706 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:05:29.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.482+0000 7f0eec873700 1 -- 192.168.123.103:0/1512251706 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0ee8109730 msgr2=0x7f0ee8113080 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:29.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.482+0000 7f0eec873700 1 --2- 192.168.123.103:0/1512251706 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0ee8109730 0x7f0ee8113080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:29.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.482+0000 7f0eec873700 1 -- 192.168.123.103:0/1512251706 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0ed8009710 con 0x7f0ee810a050 2026-03-09T00:05:29.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.482+0000 7f0eec873700 1 --2- 192.168.123.103:0/1512251706 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ee810a050 0x7f0ee81135c0 secure :-1 s=READY pgs=326 cs=0 l=1 rev1=1 crypto rx=0x7f0ed800b5c0 tx=0x7f0ed800f740 comp rx=0 tx=0).ready 
entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:29.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.482+0000 7f0ede7fc700 1 -- 192.168.123.103:0/1512251706 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ed801d070 con 0x7f0ee810a050 2026-03-09T00:05:29.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.482+0000 7f0eef2d8700 1 -- 192.168.123.103:0/1512251706 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0ee8113bc0 con 0x7f0ee810a050 2026-03-09T00:05:29.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.482+0000 7f0eef2d8700 1 -- 192.168.123.103:0/1512251706 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0ee81afbf0 con 0x7f0ee810a050 2026-03-09T00:05:29.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.483+0000 7f0eef2d8700 1 -- 192.168.123.103:0/1512251706 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0ee810fea0 con 0x7f0ee810a050 2026-03-09T00:05:29.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.483+0000 7f0ede7fc700 1 -- 192.168.123.103:0/1512251706 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f0ed8005c80 con 0x7f0ee810a050 2026-03-09T00:05:29.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.483+0000 7f0ede7fc700 1 -- 192.168.123.103:0/1512251706 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0ed800e5f0 con 0x7f0ee810a050 2026-03-09T00:05:29.484 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.485+0000 7f0ede7fc700 1 -- 192.168.123.103:0/1512251706 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 31) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f0ed800fd20 con 0x7f0ee810a050 2026-03-09T00:05:29.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.485+0000 7f0ede7fc700 1 --2- 192.168.123.103:0/1512251706 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f0ed4077680 0x7f0ed4079b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:29.485 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.486+0000 7f0ede7fc700 1 -- 192.168.123.103:0/1512251706 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f0ed809ac80 con 0x7f0ee810a050 2026-03-09T00:05:29.486 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.487+0000 7f0eed074700 1 --2- 192.168.123.103:0/1512251706 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f0ed4077680 0x7f0ed4079b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:29.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.488+0000 7f0ede7fc700 1 -- 192.168.123.103:0/1512251706 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f0ed8063840 con 0x7f0ee810a050 2026-03-09T00:05:29.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.503+0000 7f0eed074700 1 --2- 192.168.123.103:0/1512251706 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f0ed4077680 0x7f0ed4079b40 
secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f0ee4005950 tx=0x7f0ee40058e0 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:29.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.656+0000 7f0eef2d8700 1 -- 192.168.123.103:0/1512251706 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}] v 0) v1 -- 0x7f0ee804f2e0 con 0x7f0ee810a050 2026-03-09T00:05:29.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.659+0000 7f0ede7fc700 1 -- 192.168.123.103:0/1512251706 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}]=0 v37)=0 v37) v1 ==== 155+0+0 (secure 0 0 0) 0x7f0ed8062f90 con 0x7f0ee810a050 2026-03-09T00:05:29.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.665+0000 7f0ed3fff700 1 -- 192.168.123.103:0/1512251706 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f0ed4077680 msgr2=0x7f0ed4079b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:29.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.665+0000 7f0ed3fff700 1 --2- 192.168.123.103:0/1512251706 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f0ed4077680 0x7f0ed4079b40 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f0ee4005950 tx=0x7f0ee40058e0 comp rx=0 tx=0).stop 2026-03-09T00:05:29.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.665+0000 7f0ed3fff700 1 -- 192.168.123.103:0/1512251706 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ee810a050 msgr2=0x7f0ee81135c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:29.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.665+0000 7f0ed3fff700 1 --2- 192.168.123.103:0/1512251706 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ee810a050 0x7f0ee81135c0 secure :-1 s=READY pgs=326 cs=0 l=1 rev1=1 crypto rx=0x7f0ed800b5c0 tx=0x7f0ed800f740 comp rx=0 tx=0).stop 2026-03-09T00:05:29.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.665+0000 7f0ed3fff700 1 -- 192.168.123.103:0/1512251706 shutdown_connections 2026-03-09T00:05:29.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.665+0000 7f0ed3fff700 1 --2- 192.168.123.103:0/1512251706 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f0ed4077680 0x7f0ed4079b40 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:29.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.665+0000 7f0ed3fff700 1 --2- 192.168.123.103:0/1512251706 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0ee8109730 0x7f0ee8113080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:29.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.665+0000 7f0ed3fff700 1 --2- 192.168.123.103:0/1512251706 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0ee810a050 0x7f0ee81135c0 unknown :-1 s=CLOSED pgs=326 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:29.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.665+0000 7f0ed3fff700 1 -- 192.168.123.103:0/1512251706 >> 192.168.123.103:0/1512251706 conn(0x7f0ee806d1b0 msgr2=0x7f0ee810cfe0 unknown :-1 s=STATE_NONE l=0).mark_down 
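[Note] Every one of these `ceph config set` invocations produces the same messenger churn: the CLI client starts a Processor, races connections to both mons and marks down the loser, subscribes to config/monmap/mgrmap/osdmap, fetches get_command_descriptions, sends a single mon_command, and tears everything down (`mark_down` ... `shutdown_connections` ... `wait complete.`). The same round trip can be reproduced with the python-rados binding; a minimal sketch, assuming admin credentials at the paths the log already uses:

    import json
    import rados

    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf",
                          conf={"keyring": "/etc/ceph/ceph.client.admin.keyring"})
    cluster.connect()           # Processor start, mon session, map subscriptions
    try:
        # One mon_command round trip, the equivalent of the logged
        # `ceph config set mon mon_warn_on_insecure_global_id_reclaim false`.
        ret, outbuf, outs = cluster.mon_command(
            json.dumps({"prefix": "config set", "who": "mon",
                        "name": "mon_warn_on_insecure_global_id_reclaim",
                        "value": "false"}), b"")
        assert ret == 0, outs
    finally:
        cluster.shutdown()      # the mark_down / shutdown_connections phase

Because the task runs one command per container, each config set pays for a full connect/teardown cycle, which is why the same bootstrap sequence repeats below for every setting.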
2026-03-09T00:05:29.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.666+0000 7f0ed3fff700 1 -- 192.168.123.103:0/1512251706 shutdown_connections 2026-03-09T00:05:29.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:29.666+0000 7f0ed3fff700 1 -- 192.168.123.103:0/1512251706 wait complete. 2026-03-09T00:05:29.749 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force' 2026-03-09T00:05:29.987 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:05:30.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:29 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:05:30.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:29 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:05:30.045 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:29 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:05:30.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:29 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:05:30.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:29 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:05:30.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:29 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:05:30.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.382+0000 7f567b59e700 1 -- 192.168.123.103:0/3813509098 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f567c109730 msgr2=0x7f567c109b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:30.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.382+0000 7f567b59e700 1 --2- 192.168.123.103:0/3813509098 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f567c109730 0x7f567c109b10 secure :-1 s=READY pgs=327 cs=0 l=1 rev1=1 crypto rx=0x7f566c009b00 tx=0x7f566c009e10 comp rx=0 tx=0).stop 2026-03-09T00:05:30.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.383+0000 7f567b59e700 1 -- 192.168.123.103:0/3813509098 shutdown_connections 2026-03-09T00:05:30.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.383+0000 7f567b59e700 1 --2- 192.168.123.103:0/3813509098 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f567c10a050 0x7f567c1127a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:30.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.383+0000 7f567b59e700 1 --2- 192.168.123.103:0/3813509098 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f567c109730 0x7f567c109b10 unknown 
:-1 s=CLOSED pgs=327 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:30.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.383+0000 7f567b59e700 1 -- 192.168.123.103:0/3813509098 >> 192.168.123.103:0/3813509098 conn(0x7f567c06ce20 msgr2=0x7f567c06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:30.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.384+0000 7f567b59e700 1 -- 192.168.123.103:0/3813509098 shutdown_connections 2026-03-09T00:05:30.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.384+0000 7f567b59e700 1 -- 192.168.123.103:0/3813509098 wait complete. 2026-03-09T00:05:30.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.384+0000 7f567b59e700 1 Processor -- start 2026-03-09T00:05:30.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.384+0000 7f567b59e700 1 -- start start 2026-03-09T00:05:30.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.384+0000 7f567b59e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f567c109730 0x7f567c1a1360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:30.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.384+0000 7f567b59e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f567c10a050 0x7f567c1a18a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:30.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.384+0000 7f567b59e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f567c1a1f80 con 0x7f567c10a050 2026-03-09T00:05:30.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.384+0000 7f567b59e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f567c1a5d10 con 0x7f567c109730 2026-03-09T00:05:30.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.385+0000 7f567a59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f567c109730 0x7f567c1a1360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:30.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.385+0000 7f567a59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f567c109730 0x7f567c1a1360 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:50598/0 (socket says 192.168.123.103:50598) 2026-03-09T00:05:30.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.385+0000 7f567a59c700 1 -- 192.168.123.103:0/856294694 learned_addr learned my addr 192.168.123.103:0/856294694 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:05:30.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.385+0000 7f5679d9b700 1 --2- 192.168.123.103:0/856294694 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f567c10a050 0x7f567c1a18a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:30.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.385+0000 7f567a59c700 1 -- 192.168.123.103:0/856294694 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f567c10a050 
msgr2=0x7f567c1a18a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:30.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.385+0000 7f567a59c700 1 --2- 192.168.123.103:0/856294694 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f567c10a050 0x7f567c1a18a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:30.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.385+0000 7f567a59c700 1 -- 192.168.123.103:0/856294694 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f566c0097e0 con 0x7f567c109730 2026-03-09T00:05:30.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.385+0000 7f567a59c700 1 --2- 192.168.123.103:0/856294694 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f567c109730 0x7f567c1a1360 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f566c005b40 tx=0x7f566c00bfd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:30.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.385+0000 7f566b7fe700 1 -- 192.168.123.103:0/856294694 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f566c01d070 con 0x7f567c109730 2026-03-09T00:05:30.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.385+0000 7f566b7fe700 1 -- 192.168.123.103:0/856294694 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f566c00f460 con 0x7f567c109730 2026-03-09T00:05:30.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.386+0000 7f566b7fe700 1 -- 192.168.123.103:0/856294694 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f566c021620 con 0x7f567c109730 2026-03-09T00:05:30.386 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.387+0000 7f567b59e700 1 -- 192.168.123.103:0/856294694 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f567c1a5f90 con 0x7f567c109730 2026-03-09T00:05:30.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.387+0000 7f567b59e700 1 -- 192.168.123.103:0/856294694 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f567c1a6400 con 0x7f567c109730 2026-03-09T00:05:30.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.388+0000 7f567b59e700 1 -- 192.168.123.103:0/856294694 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f567c1122b0 con 0x7f567c109730 2026-03-09T00:05:30.389 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.389+0000 7f566b7fe700 1 -- 192.168.123.103:0/856294694 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 31) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f566c003a40 con 0x7f567c109730 2026-03-09T00:05:30.389 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.389+0000 7f566b7fe700 1 --2- 192.168.123.103:0/856294694 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f5664077450 0x7f5664079910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:30.389 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.389+0000 7f566b7fe700 1 -- 192.168.123.103:0/856294694 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 
(secure 0 0 0) 0x7f566c09a6f0 con 0x7f567c109730 2026-03-09T00:05:30.389 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.390+0000 7f5679d9b700 1 --2- 192.168.123.103:0/856294694 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f5664077450 0x7f5664079910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:30.389 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.390+0000 7f5679d9b700 1 --2- 192.168.123.103:0/856294694 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f5664077450 0x7f5664079910 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f567c1a2980 tx=0x7f5670009450 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:30.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.391+0000 7f566b7fe700 1 -- 192.168.123.103:0/856294694 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f566c063200 con 0x7f567c109730 2026-03-09T00:05:30.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.574+0000 7f567b59e700 1 -- 192.168.123.103:0/856294694 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}] v 0) v1 -- 0x7f567c1a26c0 con 0x7f567c109730 2026-03-09T00:05:30.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.577+0000 7f566b7fe700 1 -- 192.168.123.103:0/856294694 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}]=0 v37)=0 v37) v1 ==== 163+0+0 (secure 0 0 0) 0x7f566c062950 con 0x7f567c109730 2026-03-09T00:05:30.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.582+0000 7f56697fa700 1 -- 192.168.123.103:0/856294694 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f5664077450 msgr2=0x7f5664079910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:30.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.582+0000 7f56697fa700 1 --2- 192.168.123.103:0/856294694 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f5664077450 0x7f5664079910 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f567c1a2980 tx=0x7f5670009450 comp rx=0 tx=0).stop 2026-03-09T00:05:30.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.582+0000 7f56697fa700 1 -- 192.168.123.103:0/856294694 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f567c109730 msgr2=0x7f567c1a1360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:30.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.582+0000 7f56697fa700 1 --2- 192.168.123.103:0/856294694 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f567c109730 0x7f567c1a1360 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f566c005b40 tx=0x7f566c00bfd0 comp rx=0 tx=0).stop 2026-03-09T00:05:30.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.582+0000 7f56697fa700 1 -- 192.168.123.103:0/856294694 shutdown_connections 2026-03-09T00:05:30.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.582+0000 7f56697fa700 1 --2- 192.168.123.103:0/856294694 >> 
[v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f5664077450 0x7f5664079910 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:30.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.582+0000 7f56697fa700 1 --2- 192.168.123.103:0/856294694 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f567c109730 0x7f567c1a1360 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:30.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.582+0000 7f56697fa700 1 --2- 192.168.123.103:0/856294694 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f567c10a050 0x7f567c1a18a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:30.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.582+0000 7f56697fa700 1 -- 192.168.123.103:0/856294694 >> 192.168.123.103:0/856294694 conn(0x7f567c06ce20 msgr2=0x7f567c1136b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:30.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.582+0000 7f56697fa700 1 -- 192.168.123.103:0/856294694 shutdown_connections 2026-03-09T00:05:30.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:30.582+0000 7f56697fa700 1 -- 192.168.123.103:0/856294694 wait complete. 2026-03-09T00:05:30.651 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set global log_to_journald false --force' 2026-03-09T00:05:30.892 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:05:31.130 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:30 vm03.local ceph-mon[52346]: pgmap v48: 65 pgs: 65 active+clean; 316 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.3 MiB/s wr, 617 op/s 2026-03-09T00:05:31.130 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:30 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:05:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:30 vm06.local ceph-mon[58395]: pgmap v48: 65 pgs: 65 active+clean; 316 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.3 MiB/s wr, 617 op/s 2026-03-09T00:05:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:30 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:05:31.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.210+0000 7fb4e7fff700 1 -- 192.168.123.103:0/1677536220 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb4e810f340 msgr2=0x7fb4e810f720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:31.210 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.210+0000 7fb4e7fff700 1 --2- 192.168.123.103:0/1677536220 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb4e810f340 0x7fb4e810f720 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7fb4d8009b00 tx=0x7fb4d8009e10 comp rx=0 tx=0).stop 2026-03-09T00:05:31.211 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.211+0000 7fb4e7fff700 1 -- 192.168.123.103:0/1677536220 shutdown_connections 2026-03-09T00:05:31.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.211+0000 7fb4e7fff700 1 --2- 192.168.123.103:0/1677536220 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4e810d0f0 0x7fb4e810d570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:31.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.211+0000 7fb4e7fff700 1 --2- 192.168.123.103:0/1677536220 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb4e810f340 0x7fb4e810f720 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:31.211 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.211+0000 7fb4e7fff700 1 -- 192.168.123.103:0/1677536220 >> 192.168.123.103:0/1677536220 conn(0x7fb4e806ce20 msgr2=0x7fb4e806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:31.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.212+0000 7fb4e7fff700 1 -- 192.168.123.103:0/1677536220 shutdown_connections 2026-03-09T00:05:31.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.212+0000 7fb4e7fff700 1 -- 192.168.123.103:0/1677536220 wait complete. 2026-03-09T00:05:31.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.212+0000 7fb4e7fff700 1 Processor -- start 2026-03-09T00:05:31.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.212+0000 7fb4e7fff700 1 -- start start 2026-03-09T00:05:31.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.213+0000 7fb4e7fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4e810d0f0 0x7fb4e811bfb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:31.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.213+0000 7fb4e7fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb4e810f340 0x7fb4e8116fb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:31.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.213+0000 7fb4e7fff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb4e8117600 con 0x7fb4e810d0f0 2026-03-09T00:05:31.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.213+0000 7fb4e7fff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb4e8117770 con 0x7fb4e810f340 2026-03-09T00:05:31.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.213+0000 7fb4dffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb4e810f340 0x7fb4e8116fb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:31.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.213+0000 7fb4e6ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4e810d0f0 0x7fb4e811bfb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:31.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.213+0000 7fb4e6ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4e810d0f0 0x7fb4e811bfb0 unknown 
:-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:46556/0 (socket says 192.168.123.103:46556) 2026-03-09T00:05:31.212 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.213+0000 7fb4e6ffd700 1 -- 192.168.123.103:0/1493101574 learned_addr learned my addr 192.168.123.103:0/1493101574 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:05:31.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.214+0000 7fb4e6ffd700 1 -- 192.168.123.103:0/1493101574 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb4e810f340 msgr2=0x7fb4e8116fb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:31.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.214+0000 7fb4e6ffd700 1 --2- 192.168.123.103:0/1493101574 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb4e810f340 0x7fb4e8116fb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:31.213 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.214+0000 7fb4e6ffd700 1 -- 192.168.123.103:0/1493101574 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb4d80097e0 con 0x7fb4e810d0f0 2026-03-09T00:05:31.216 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.217+0000 7fb4e6ffd700 1 --2- 192.168.123.103:0/1493101574 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4e810d0f0 0x7fb4e811bfb0 secure :-1 s=READY pgs=328 cs=0 l=1 rev1=1 crypto rx=0x7fb4d8006010 tx=0x7fb4d80048c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:31.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.218+0000 7fb4e4ff9700 1 -- 192.168.123.103:0/1493101574 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb4d801d070 con 0x7fb4e810d0f0 2026-03-09T00:05:31.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.218+0000 7fb4e4ff9700 1 -- 192.168.123.103:0/1493101574 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fb4d8022470 con 0x7fb4e810d0f0 2026-03-09T00:05:31.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.218+0000 7fb4e4ff9700 1 -- 192.168.123.103:0/1493101574 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb4d800f670 con 0x7fb4e810d0f0 2026-03-09T00:05:31.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.218+0000 7fb4e7fff700 1 -- 192.168.123.103:0/1493101574 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb4e81179f0 con 0x7fb4e810d0f0 2026-03-09T00:05:31.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.218+0000 7fb4e7fff700 1 -- 192.168.123.103:0/1493101574 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb4e81b8400 con 0x7fb4e810d0f0 2026-03-09T00:05:31.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.219+0000 7fb4e7fff700 1 -- 192.168.123.103:0/1493101574 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb4e81114f0 con 0x7fb4e810d0f0 2026-03-09T00:05:31.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.220+0000 7fb4e4ff9700 1 -- 192.168.123.103:0/1493101574 <== mon.0 
v2:192.168.123.103:3300/0 4 ==== mgrmap(e 31) v1 ==== 100066+0+0 (secure 0 0 0) 0x7fb4d80225e0 con 0x7fb4e810d0f0 2026-03-09T00:05:31.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.220+0000 7fb4e4ff9700 1 --2- 192.168.123.103:0/1493101574 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fb4d4077520 0x7fb4d40799e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:31.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.220+0000 7fb4e4ff9700 1 -- 192.168.123.103:0/1493101574 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fb4d809ad70 con 0x7fb4e810d0f0 2026-03-09T00:05:31.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.222+0000 7fb4dffff700 1 --2- 192.168.123.103:0/1493101574 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fb4d4077520 0x7fb4d40799e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:31.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.222+0000 7fb4dffff700 1 --2- 192.168.123.103:0/1493101574 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fb4d4077520 0x7fb4d40799e0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fb4e81187a0 tx=0x7fb4d000b410 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:31.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.223+0000 7fb4e4ff9700 1 -- 192.168.123.103:0/1493101574 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fb4d809f070 con 0x7fb4e810d0f0 2026-03-09T00:05:31.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.344+0000 7fb4e7fff700 1 -- 192.168.123.103:0/1493101574 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command([{prefix=config set, name=log_to_journald}] v 0) v1 -- 0x7fb4e804f2e0 con 0x7fb4e810d0f0 2026-03-09T00:05:31.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.345+0000 7fb4e4ff9700 1 -- 192.168.123.103:0/1493101574 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{prefix=config set, name=log_to_journald}]=0 v37)=0 v37) v1 ==== 135+0+0 (secure 0 0 0) 0x7fb4d80270c0 con 0x7fb4e810d0f0 2026-03-09T00:05:31.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.348+0000 7fb4e7fff700 1 -- 192.168.123.103:0/1493101574 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fb4d4077520 msgr2=0x7fb4d40799e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:31.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.348+0000 7fb4e7fff700 1 --2- 192.168.123.103:0/1493101574 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fb4d4077520 0x7fb4d40799e0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fb4e81187a0 tx=0x7fb4d000b410 comp rx=0 tx=0).stop 2026-03-09T00:05:31.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.349+0000 7fb4e7fff700 1 -- 192.168.123.103:0/1493101574 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4e810d0f0 msgr2=0x7fb4e811bfb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:31.349 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.349+0000 7fb4e7fff700 1 --2- 192.168.123.103:0/1493101574 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4e810d0f0 0x7fb4e811bfb0 secure :-1 s=READY pgs=328 cs=0 l=1 rev1=1 crypto rx=0x7fb4d8006010 tx=0x7fb4d80048c0 comp rx=0 tx=0).stop 2026-03-09T00:05:31.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.349+0000 7fb4e7fff700 1 -- 192.168.123.103:0/1493101574 shutdown_connections 2026-03-09T00:05:31.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.349+0000 7fb4e7fff700 1 --2- 192.168.123.103:0/1493101574 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fb4d4077520 0x7fb4d40799e0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:31.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.349+0000 7fb4e7fff700 1 --2- 192.168.123.103:0/1493101574 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb4e810d0f0 0x7fb4e811bfb0 unknown :-1 s=CLOSED pgs=328 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:31.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.349+0000 7fb4e7fff700 1 --2- 192.168.123.103:0/1493101574 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb4e810f340 0x7fb4e8116fb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:31.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.349+0000 7fb4e7fff700 1 -- 192.168.123.103:0/1493101574 >> 192.168.123.103:0/1493101574 conn(0x7fb4e806ce20 msgr2=0x7fb4e80704a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:31.349 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.349+0000 7fb4e7fff700 1 -- 192.168.123.103:0/1493101574 shutdown_connections 2026-03-09T00:05:31.350 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.349+0000 7fb4e7fff700 1 -- 192.168.123.103:0/1493101574 wait complete. 
2026-03-09T00:05:31.396 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1' 2026-03-09T00:05:31.624 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:05:31.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.985+0000 7f6a4ad4e700 1 -- 192.168.123.103:0/3630514233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a44107ff0 msgr2=0x7f6a441083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:31.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.985+0000 7f6a4ad4e700 1 --2- 192.168.123.103:0/3630514233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a44107ff0 0x7f6a441083d0 secure :-1 s=READY pgs=329 cs=0 l=1 rev1=1 crypto rx=0x7f6a34007780 tx=0x7f6a3400c050 comp rx=0 tx=0).stop 2026-03-09T00:05:31.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.986+0000 7f6a4ad4e700 1 -- 192.168.123.103:0/3630514233 shutdown_connections 2026-03-09T00:05:31.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.986+0000 7f6a4ad4e700 1 --2- 192.168.123.103:0/3630514233 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a441089a0 0x7f6a4410be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:31.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.986+0000 7f6a4ad4e700 1 --2- 192.168.123.103:0/3630514233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a44107ff0 0x7f6a441083d0 unknown :-1 s=CLOSED pgs=329 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:31.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.986+0000 7f6a4ad4e700 1 -- 192.168.123.103:0/3630514233 >> 192.168.123.103:0/3630514233 conn(0x7f6a4406ce20 msgr2=0x7f6a4406d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:31.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.986+0000 7f6a4ad4e700 1 -- 192.168.123.103:0/3630514233 shutdown_connections 2026-03-09T00:05:31.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.986+0000 7f6a4ad4e700 1 -- 192.168.123.103:0/3630514233 wait complete. 
2026-03-09T00:05:31.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.987+0000 7f6a4ad4e700 1 Processor -- start 2026-03-09T00:05:31.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.987+0000 7f6a4ad4e700 1 -- start start 2026-03-09T00:05:31.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.987+0000 7f6a4ad4e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a441089a0 0x7f6a44133260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:31.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.987+0000 7f6a4ad4e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a441337a0 0x7f6a44133c20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:31.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.987+0000 7f6a4ad4e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6a4407ef30 con 0x7f6a441089a0 2026-03-09T00:05:31.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.987+0000 7f6a4ad4e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6a4407f0a0 con 0x7f6a441337a0 2026-03-09T00:05:31.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.987+0000 7f6a48aea700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a441089a0 0x7f6a44133260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:31.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.987+0000 7f6a43fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a441337a0 0x7f6a44133c20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:31.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.987+0000 7f6a43fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a441337a0 0x7f6a44133c20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:50620/0 (socket says 192.168.123.103:50620) 2026-03-09T00:05:31.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.987+0000 7f6a43fff700 1 -- 192.168.123.103:0/2536488339 learned_addr learned my addr 192.168.123.103:0/2536488339 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:05:31.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.987+0000 7f6a48aea700 1 -- 192.168.123.103:0/2536488339 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a441337a0 msgr2=0x7f6a44133c20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:31.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.987+0000 7f6a48aea700 1 --2- 192.168.123.103:0/2536488339 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a441337a0 0x7f6a44133c20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:31.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.987+0000 7f6a48aea700 1 -- 192.168.123.103:0/2536488339 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6a34007430 con 0x7f6a441089a0 
2026-03-09T00:05:31.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.988+0000 7f6a48aea700 1 --2- 192.168.123.103:0/2536488339 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a441089a0 0x7f6a44133260 secure :-1 s=READY pgs=330 cs=0 l=1 rev1=1 crypto rx=0x7f6a3400afd0 tx=0x7f6a3400ca60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:31.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.988+0000 7f6a41ffb700 1 -- 192.168.123.103:0/2536488339 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6a3400f050 con 0x7f6a441089a0 2026-03-09T00:05:31.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.989+0000 7f6a4ad4e700 1 -- 192.168.123.103:0/2536488339 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6a4407f2d0 con 0x7f6a441089a0 2026-03-09T00:05:31.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.989+0000 7f6a4ad4e700 1 -- 192.168.123.103:0/2536488339 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6a4407f820 con 0x7f6a441089a0 2026-03-09T00:05:31.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.990+0000 7f6a41ffb700 1 -- 192.168.123.103:0/2536488339 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6a3400ced0 con 0x7f6a441089a0 2026-03-09T00:05:31.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.990+0000 7f6a41ffb700 1 -- 192.168.123.103:0/2536488339 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6a34008710 con 0x7f6a441089a0 2026-03-09T00:05:31.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.991+0000 7f6a4ad4e700 1 -- 192.168.123.103:0/2536488339 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6a30005320 con 0x7f6a441089a0 2026-03-09T00:05:31.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.991+0000 7f6a41ffb700 1 -- 192.168.123.103:0/2536488339 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 32) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f6a3401a040 con 0x7f6a441089a0 2026-03-09T00:05:31.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.993+0000 7f6a41ffb700 1 --2- 192.168.123.103:0/2536488339 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6a2c077660 0x7f6a2c079b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:31.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.993+0000 7f6a41ffb700 1 -- 192.168.123.103:0/2536488339 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f6a34099a80 con 0x7f6a441089a0 2026-03-09T00:05:31.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.993+0000 7f6a43fff700 1 --2- 192.168.123.103:0/2536488339 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6a2c077660 0x7f6a2c079b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:31.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.996+0000 7f6a41ffb700 1 -- 192.168.123.103:0/2536488339 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f6a340627b0 con 0x7f6a441089a0 2026-03-09T00:05:31.998 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:31.999+0000 7f6a43fff700 1 --2- 192.168.123.103:0/2536488339 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6a2c077660 0x7f6a2c079b20 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f6a3c009f20 tx=0x7f6a3c009580 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:32.176 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.176+0000 7f6a4ad4e700 1 -- 192.168.123.103:0/2536488339 --> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] -- mgr_command(tid 0: {"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}) v1 -- 0x7f6a30000c90 con 0x7f6a2c077660 2026-03-09T00:05:32.187 INFO:teuthology.orchestra.run.vm03.stdout:Initiating upgrade to quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-09T00:05:32.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.184+0000 7f6a41ffb700 1 -- 192.168.123.103:0/2536488339 <== mgr.24393 v2:192.168.123.103:6800/3123605642 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+89 (secure 0 0 0) 0x7f6a30000c90 con 0x7f6a2c077660 2026-03-09T00:05:32.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.188+0000 7f6a2b7fe700 1 -- 192.168.123.103:0/2536488339 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6a2c077660 msgr2=0x7f6a2c079b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:32.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.188+0000 7f6a2b7fe700 1 --2- 192.168.123.103:0/2536488339 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6a2c077660 0x7f6a2c079b20 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f6a3c009f20 tx=0x7f6a3c009580 comp rx=0 tx=0).stop 2026-03-09T00:05:32.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.188+0000 7f6a2b7fe700 1 -- 192.168.123.103:0/2536488339 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a441089a0 msgr2=0x7f6a44133260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:32.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.188+0000 7f6a2b7fe700 1 --2- 192.168.123.103:0/2536488339 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a441089a0 0x7f6a44133260 secure :-1 s=READY pgs=330 cs=0 l=1 rev1=1 crypto rx=0x7f6a3400afd0 tx=0x7f6a3400ca60 comp rx=0 tx=0).stop 2026-03-09T00:05:32.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.188+0000 7f6a2b7fe700 1 -- 192.168.123.103:0/2536488339 shutdown_connections 2026-03-09T00:05:32.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.188+0000 7f6a2b7fe700 1 --2- 192.168.123.103:0/2536488339 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f6a2c077660 0x7f6a2c079b20 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:32.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.188+0000 7f6a2b7fe700 1 --2- 192.168.123.103:0/2536488339 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6a441089a0 0x7f6a44133260 unknown :-1 s=CLOSED pgs=330 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T00:05:32.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.188+0000 7f6a2b7fe700 1 --2- 192.168.123.103:0/2536488339 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6a441337a0 0x7f6a44133c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:32.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.188+0000 7f6a2b7fe700 1 -- 192.168.123.103:0/2536488339 >> 192.168.123.103:0/2536488339 conn(0x7f6a4406ce20 msgr2=0x7f6a44070590 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:05:32.188 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.188+0000 7f6a2b7fe700 1 -- 192.168.123.103:0/2536488339 shutdown_connections
2026-03-09T00:05:32.188 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.189+0000 7f6a2b7fe700 1 -- 192.168.123.103:0/2536488339 wait complete.
2026-03-09T00:05:32.303 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell...
2026-03-09T00:05:32.303 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local
2026-03-09T00:05:32.303 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'while ceph orch upgrade status | jq '"'"'.in_progress'"'"' | grep true && ! ceph orch upgrade status | jq '"'"'.message'"'"' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done'
2026-03-09T00:05:32.499 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:05:32.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.955+0000 7f1b552eb700 1 -- 192.168.123.103:0/944173443 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b50102a00 msgr2=0x7f1b5010aef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:32.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.955+0000 7f1b552eb700 1 --2- 192.168.123.103:0/944173443 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b50102a00 0x7f1b5010aef0 secure :-1 s=READY pgs=331 cs=0 l=1 rev1=1 crypto rx=0x7f1b44009b50 tx=0x7f1b44009e60 comp rx=0 tx=0).stop
2026-03-09T00:05:32.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.956+0000 7f1b552eb700 1 -- 192.168.123.103:0/944173443 shutdown_connections
2026-03-09T00:05:32.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.956+0000 7f1b552eb700 1 --2- 192.168.123.103:0/944173443 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b50102a00 0x7f1b5010aef0 unknown :-1 s=CLOSED pgs=331 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:32.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.956+0000 7f1b552eb700 1 --2- 192.168.123.103:0/944173443 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1b501020e0 0x7f1b501024c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:32.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.956+0000 7f1b552eb700 1 -- 192.168.123.103:0/944173443 >> 192.168.123.103:0/944173443 conn(0x7f1b500fb830 msgr2=0x7f1b500fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:32.973
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.956+0000 7f1b552eb700 1 -- 192.168.123.103:0/944173443 shutdown_connections 2026-03-09T00:05:32.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.956+0000 7f1b552eb700 1 -- 192.168.123.103:0/944173443 wait complete. 2026-03-09T00:05:32.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.956+0000 7f1b552eb700 1 Processor -- start 2026-03-09T00:05:32.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.956+0000 7f1b552eb700 1 -- start start 2026-03-09T00:05:32.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.956+0000 7f1b552eb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b501020e0 0x7f1b50199840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:32.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.957+0000 7f1b4effd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b501020e0 0x7f1b50199840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:32.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.957+0000 7f1b4effd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b501020e0 0x7f1b50199840 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:46606/0 (socket says 192.168.123.103:46606) 2026-03-09T00:05:32.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.957+0000 7f1b552eb700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1b50102a00 0x7f1b50194840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:32.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.957+0000 7f1b552eb700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1b50194e10 con 0x7f1b501020e0 2026-03-09T00:05:32.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.957+0000 7f1b552eb700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1b50194f80 con 0x7f1b50102a00 2026-03-09T00:05:32.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.957+0000 7f1b4effd700 1 -- 192.168.123.103:0/3877412608 learned_addr learned my addr 192.168.123.103:0/3877412608 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:05:32.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.957+0000 7f1b4effd700 1 -- 192.168.123.103:0/3877412608 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1b50102a00 msgr2=0x7f1b50194840 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:32.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.957+0000 7f1b4effd700 1 --2- 192.168.123.103:0/3877412608 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1b50102a00 0x7f1b50194840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:32.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.957+0000 7f1b4effd700 1 -- 192.168.123.103:0/3877412608 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1b440097e0 con 0x7f1b501020e0 2026-03-09T00:05:32.974 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.957+0000 7f1b4effd700 1 --2- 192.168.123.103:0/3877412608 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b501020e0 0x7f1b50199840 secure :-1 s=READY pgs=332 cs=0 l=1 rev1=1 crypto rx=0x7f1b4000ed70 tx=0x7f1b4000c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:32.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.957+0000 7f1b37fff700 1 -- 192.168.123.103:0/3877412608 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1b4000cd70 con 0x7f1b501020e0 2026-03-09T00:05:32.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.957+0000 7f1b37fff700 1 -- 192.168.123.103:0/3877412608 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f1b40010910 con 0x7f1b501020e0 2026-03-09T00:05:32.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.957+0000 7f1b37fff700 1 -- 192.168.123.103:0/3877412608 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1b40018980 con 0x7f1b501020e0 2026-03-09T00:05:32.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.957+0000 7f1b552eb700 1 -- 192.168.123.103:0/3877412608 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1b50195260 con 0x7f1b501020e0 2026-03-09T00:05:32.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.957+0000 7f1b552eb700 1 -- 192.168.123.103:0/3877412608 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1b501afed0 con 0x7f1b501020e0 2026-03-09T00:05:32.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.958+0000 7f1b552eb700 1 -- 192.168.123.103:0/3877412608 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1b501085f0 con 0x7f1b501020e0 2026-03-09T00:05:32.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.963+0000 7f1b37fff700 1 -- 192.168.123.103:0/3877412608 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 32) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f1b40010a80 con 0x7f1b501020e0 2026-03-09T00:05:32.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.963+0000 7f1b37fff700 1 --2- 192.168.123.103:0/3877412608 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f1b38077550 0x7f1b38079a10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:32.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.963+0000 7f1b37fff700 1 -- 192.168.123.103:0/3877412608 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f1b40014070 con 0x7f1b501020e0 2026-03-09T00:05:32.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.963+0000 7f1b37fff700 1 -- 192.168.123.103:0/3877412608 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f1b4009a480 con 0x7f1b501020e0 2026-03-09T00:05:32.974 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.972+0000 7f1b4e7fc700 1 --2- 192.168.123.103:0/3877412608 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f1b38077550 0x7f1b38079a10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp 
rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:32.977 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:32.977+0000 7f1b4e7fc700 1 --2- 192.168.123.103:0/3877412608 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f1b38077550 0x7f1b38079a10 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f1b4400b5c0 tx=0x7f1b44005fb0 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:33.081 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:33 vm03.local ceph-mon[52346]: pgmap v49: 65 pgs: 65 active+clean; 304 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.8 MiB/s wr, 908 op/s 2026-03-09T00:05:33.081 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:33 vm03.local ceph-mon[52346]: mgrmap e32: vm03.yvcons(active, since 92s), standbys: vm06.rzcvhn 2026-03-09T00:05:33.081 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:33 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:05:33.081 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:33 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:05:33.081 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:33 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:05:33.081 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:33 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:05:33.081 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:33 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:05:33.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.097+0000 7f1b552eb700 1 -- 192.168.123.103:0/3877412608 --> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1b50195e80 con 0x7f1b38077550 2026-03-09T00:05:33.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.098+0000 7f1b37fff700 1 -- 192.168.123.103:0/3877412608 <== mgr.24393 v2:192.168.123.103:6800/3123605642 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f1b50195e80 con 0x7f1b38077550 2026-03-09T00:05:33.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.100+0000 7f1b35ffb700 1 -- 192.168.123.103:0/3877412608 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f1b38077550 msgr2=0x7f1b38079a10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:33.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.101+0000 7f1b35ffb700 1 --2- 192.168.123.103:0/3877412608 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f1b38077550 0x7f1b38079a10 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f1b4400b5c0 tx=0x7f1b44005fb0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.101+0000 7f1b35ffb700 1 -- 192.168.123.103:0/3877412608 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b501020e0 msgr2=0x7f1b50199840 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:33.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.101+0000 7f1b35ffb700 1 --2- 192.168.123.103:0/3877412608 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b501020e0 0x7f1b50199840 secure :-1 s=READY pgs=332 cs=0 l=1 rev1=1 crypto rx=0x7f1b4000ed70 tx=0x7f1b4000c5b0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.103+0000 7f1b35ffb700 1 -- 192.168.123.103:0/3877412608 shutdown_connections 2026-03-09T00:05:33.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.103+0000 7f1b35ffb700 1 --2- 192.168.123.103:0/3877412608 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f1b38077550 0x7f1b38079a10 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.103+0000 7f1b35ffb700 1 --2- 192.168.123.103:0/3877412608 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1b501020e0 0x7f1b50199840 unknown :-1 s=CLOSED pgs=332 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.103+0000 7f1b35ffb700 1 --2- 192.168.123.103:0/3877412608 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1b50102a00 0x7f1b50194840 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.103+0000 7f1b35ffb700 1 -- 192.168.123.103:0/3877412608 >> 192.168.123.103:0/3877412608 conn(0x7f1b500fb830 msgr2=0x7f1b50105730 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:33.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.103+0000 7f1b35ffb700 1 -- 192.168.123.103:0/3877412608 shutdown_connections 2026-03-09T00:05:33.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.103+0000 7f1b35ffb700 1 -- 192.168.123.103:0/3877412608 wait complete. 
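The heavily escaped bash -c one-liner in the cephadm shell invocation above is hard to read through the '"'"' quoting; unrolled, the loop it runs is simply (same commands, same exit conditions):

    # poll every 30s until the orchestrator reports the upgrade done or errored
    while ceph orch upgrade status | jq '.in_progress' | grep true \
          && ! ceph orch upgrade status | jq '.message' | grep Error; do
        ceph orch ps
        ceph versions
        ceph fs dump
        ceph orch upgrade status
        ceph health detail
        sleep 30
    done

The lone "true" on stdout just below is the jq '.in_progress' result that keeps this loop running.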
2026-03-09T00:05:33.112 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T00:05:33.188 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.187+0000 7f9dc6510700 1 -- 192.168.123.103:0/950885994 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9dc0103b80 msgr2=0x7f9dc0104000 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:33.188 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.187+0000 7f9dc6510700 1 --2- 192.168.123.103:0/950885994 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9dc0103b80 0x7f9dc0104000 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f9db4009b00 tx=0x7f9db4009e10 comp rx=0 tx=0).stop 2026-03-09T00:05:33.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.192+0000 7f9dc6510700 1 -- 192.168.123.103:0/950885994 shutdown_connections 2026-03-09T00:05:33.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.192+0000 7f9dc6510700 1 --2- 192.168.123.103:0/950885994 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9dc0103b80 0x7f9dc0104000 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.192+0000 7f9dc6510700 1 --2- 192.168.123.103:0/950885994 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9dc0105bb0 0x7f9dc0103640 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.192 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.192+0000 7f9dc6510700 1 -- 192.168.123.103:0/950885994 >> 192.168.123.103:0/950885994 conn(0x7f9dc00756c0 msgr2=0x7f9dc0075ad0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:33.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.197+0000 7f9dc6510700 1 -- 192.168.123.103:0/950885994 shutdown_connections 2026-03-09T00:05:33.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.197+0000 7f9dc6510700 1 -- 192.168.123.103:0/950885994 wait complete. 
2026-03-09T00:05:33.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.197+0000 7f9dc6510700 1 Processor -- start 2026-03-09T00:05:33.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.197+0000 7f9dc6510700 1 -- start start 2026-03-09T00:05:33.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.198+0000 7f9dc6510700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9dc0103b80 0x7f9dc0112320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:33.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.198+0000 7f9dc6510700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9dc0105bb0 0x7f9dc010d320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:33.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.198+0000 7f9dc6510700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9dc010d860 con 0x7f9dc0103b80 2026-03-09T00:05:33.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.198+0000 7f9dc6510700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9dc010d9d0 con 0x7f9dc0105bb0 2026-03-09T00:05:33.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.198+0000 7f9dbf7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9dc0105bb0 0x7f9dc010d320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:33.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.198+0000 7f9dbffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9dc0103b80 0x7f9dc0112320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:33.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.198+0000 7f9dbffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9dc0103b80 0x7f9dc0112320 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:46614/0 (socket says 192.168.123.103:46614) 2026-03-09T00:05:33.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.198+0000 7f9dbffff700 1 -- 192.168.123.103:0/3344428596 learned_addr learned my addr 192.168.123.103:0/3344428596 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:05:33.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.198+0000 7f9dbf7fe700 1 -- 192.168.123.103:0/3344428596 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9dc0103b80 msgr2=0x7f9dc0112320 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:33.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.198+0000 7f9dbf7fe700 1 --2- 192.168.123.103:0/3344428596 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9dc0103b80 0x7f9dc0112320 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.198+0000 7f9dbf7fe700 1 -- 192.168.123.103:0/3344428596 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9db40097e0 con 0x7f9dc0105bb0 
2026-03-09T00:05:33.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.199+0000 7f9dbf7fe700 1 --2- 192.168.123.103:0/3344428596 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9dc0105bb0 0x7f9dc010d320 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f9db4009b00 tx=0x7f9db40048c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:33.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.200+0000 7f9dbd7fa700 1 -- 192.168.123.103:0/3344428596 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9db401d070 con 0x7f9dc0105bb0 2026-03-09T00:05:33.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.200+0000 7f9dbd7fa700 1 -- 192.168.123.103:0/3344428596 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9db4004ba0 con 0x7f9dc0105bb0 2026-03-09T00:05:33.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.200+0000 7f9dbd7fa700 1 -- 192.168.123.103:0/3344428596 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9db400f670 con 0x7f9dc0105bb0 2026-03-09T00:05:33.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.201+0000 7f9dc6510700 1 -- 192.168.123.103:0/3344428596 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9dc010dc50 con 0x7f9dc0105bb0 2026-03-09T00:05:33.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.201+0000 7f9dc6510700 1 -- 192.168.123.103:0/3344428596 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9dc010e0c0 con 0x7f9dc0105bb0 2026-03-09T00:05:33.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.202+0000 7f9dbd7fa700 1 -- 192.168.123.103:0/3344428596 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 32) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f9db400bc50 con 0x7f9dc0105bb0 2026-03-09T00:05:33.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.203+0000 7f9dbd7fa700 1 --2- 192.168.123.103:0/3344428596 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f9da8077400 0x7f9da80798c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:33.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.203+0000 7f9dbffff700 1 --2- 192.168.123.103:0/3344428596 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f9da8077400 0x7f9da80798c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:33.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.203+0000 7f9dbd7fa700 1 -- 192.168.123.103:0/3344428596 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f9db409a9a0 con 0x7f9dc0105bb0 2026-03-09T00:05:33.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.202+0000 7f9dc6510700 1 -- 192.168.123.103:0/3344428596 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9dc004f2e0 con 0x7f9dc0105bb0 2026-03-09T00:05:33.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.205+0000 7f9dbffff700 1 --2- 192.168.123.103:0/3344428596 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] 
conn(0x7f9da8077400 0x7f9da80798c0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f9db0005950 tx=0x7f9db000b500 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:33.207 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.207+0000 7f9dbd7fa700 1 -- 192.168.123.103:0/3344428596 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f9db40636d0 con 0x7f9dc0105bb0 2026-03-09T00:05:33.332 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:33 vm06.local ceph-mon[58395]: pgmap v49: 65 pgs: 65 active+clean; 304 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.8 MiB/s wr, 908 op/s 2026-03-09T00:05:33.332 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:33 vm06.local ceph-mon[58395]: mgrmap e32: vm03.yvcons(active, since 92s), standbys: vm06.rzcvhn 2026-03-09T00:05:33.332 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:33 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:05:33.332 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:33 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:05:33.332 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:33 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:05:33.332 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:33 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:05:33.332 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:33 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' 2026-03-09T00:05:33.347 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.347+0000 7f9dc6510700 1 -- 192.168.123.103:0/3344428596 --> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9dc010e850 con 0x7f9da8077400 2026-03-09T00:05:33.348 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.349+0000 7f9dbd7fa700 1 -- 192.168.123.103:0/3344428596 <== mgr.24393 v2:192.168.123.103:6800/3123605642 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f9dc010e850 con 0x7f9da8077400 2026-03-09T00:05:33.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.352+0000 7f9da6ffd700 1 -- 192.168.123.103:0/3344428596 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f9da8077400 msgr2=0x7f9da80798c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:33.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.352+0000 7f9da6ffd700 1 --2- 192.168.123.103:0/3344428596 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f9da8077400 0x7f9da80798c0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f9db0005950 tx=0x7f9db000b500 comp rx=0 tx=0).stop 2026-03-09T00:05:33.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.352+0000 7f9da6ffd700 1 -- 192.168.123.103:0/3344428596 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9dc0105bb0 
msgr2=0x7f9dc010d320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:33.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.352+0000 7f9da6ffd700 1 --2- 192.168.123.103:0/3344428596 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9dc0105bb0 0x7f9dc010d320 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f9db4009b00 tx=0x7f9db40048c0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.352+0000 7f9da6ffd700 1 -- 192.168.123.103:0/3344428596 shutdown_connections 2026-03-09T00:05:33.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.352+0000 7f9da6ffd700 1 --2- 192.168.123.103:0/3344428596 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f9da8077400 0x7f9da80798c0 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.352+0000 7f9da6ffd700 1 --2- 192.168.123.103:0/3344428596 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9dc0103b80 0x7f9dc0112320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.352+0000 7f9da6ffd700 1 --2- 192.168.123.103:0/3344428596 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9dc0105bb0 0x7f9dc010d320 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.352+0000 7f9da6ffd700 1 -- 192.168.123.103:0/3344428596 >> 192.168.123.103:0/3344428596 conn(0x7f9dc00756c0 msgr2=0x7f9dc00fe790 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:33.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.353+0000 7f9da6ffd700 1 -- 192.168.123.103:0/3344428596 shutdown_connections 2026-03-09T00:05:33.352 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.353+0000 7f9da6ffd700 1 -- 192.168.123.103:0/3344428596 wait complete. 
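The journalctl lines from both mons (vm03 earlier, vm06 here) show the active mgr, mgr.vm03.yvcons, dispatching config dump, config generate-minimal-conf, and auth get client.admin; this is consistent with cephadm refreshing the minimal ceph.conf and admin keyring it distributes to hosts. The same data can be fetched by hand (a sketch; both are standard mon commands):

    # reproduce what the mgr fetches during its config refresh
    ceph config generate-minimal-conf
    ceph auth get client.admin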
2026-03-09T00:05:33.429 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.429+0000 7f10fdb2b700 1 -- 192.168.123.103:0/2801635058 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10f810d0f0 msgr2=0x7f10f810d570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:33.429 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.429+0000 7f10fdb2b700 1 --2- 192.168.123.103:0/2801635058 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10f810d0f0 0x7f10f810d570 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f10f000b210 tx=0x7f10f000b520 comp rx=0 tx=0).stop 2026-03-09T00:05:33.429 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.429+0000 7f10fdb2b700 1 -- 192.168.123.103:0/2801635058 shutdown_connections 2026-03-09T00:05:33.429 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.429+0000 7f10fdb2b700 1 --2- 192.168.123.103:0/2801635058 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10f810d0f0 0x7f10f810d570 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.429 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.429+0000 7f10fdb2b700 1 --2- 192.168.123.103:0/2801635058 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f10f810f340 0x7f10f810f720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.429 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.429+0000 7f10fdb2b700 1 -- 192.168.123.103:0/2801635058 >> 192.168.123.103:0/2801635058 conn(0x7f10f806ce20 msgr2=0x7f10f806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:33.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.437+0000 7f10fdb2b700 1 -- 192.168.123.103:0/2801635058 shutdown_connections 2026-03-09T00:05:33.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.437+0000 7f10fdb2b700 1 -- 192.168.123.103:0/2801635058 wait complete. 
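Each short-lived ceph invocation in this stream opens a fresh msgr2 connection, and the debug entries trace its state machine: s=NONE at connect, BANNER_CONNECTING, HELLO_CONNECTING (where learned_addr records the client's own address), AUTH_CONNECTING, s=READY once the secure session is up, then mark_down/.stop to s=CLOSED at teardown. To follow one connection through those states in a log like this, filter on its conn pointer (the pointer is one from the entries below; the file name is illustrative):

    # list the state transitions of a single connection by its conn() address
    grep -o 'conn(0x7f10f8116f60 [^)]*' teuthology.log | grep -o 's=[A-Z_]*'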
2026-03-09T00:05:33.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.437+0000 7f10fdb2b700 1 Processor -- start 2026-03-09T00:05:33.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.438+0000 7f10fdb2b700 1 -- start start 2026-03-09T00:05:33.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.438+0000 7f10fdb2b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10f810f340 0x7f10f811bf60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:33.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.438+0000 7f10fdb2b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f10f8116f60 0x7f10f81173e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:33.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.438+0000 7f10fdb2b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f10f81179b0 con 0x7f10f8116f60 2026-03-09T00:05:33.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.438+0000 7f10fdb2b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f10f8117b20 con 0x7f10f810f340 2026-03-09T00:05:33.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.438+0000 7f10fcb29700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10f810f340 0x7f10f811bf60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:33.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.438+0000 7f10fcb29700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10f810f340 0x7f10f811bf60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:50688/0 (socket says 192.168.123.103:50688) 2026-03-09T00:05:33.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.438+0000 7f10fcb29700 1 -- 192.168.123.103:0/2917081287 learned_addr learned my addr 192.168.123.103:0/2917081287 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:05:33.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.438+0000 7f10f7fff700 1 --2- 192.168.123.103:0/2917081287 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f10f8116f60 0x7f10f81173e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:33.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.439+0000 7f10f7fff700 1 -- 192.168.123.103:0/2917081287 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10f810f340 msgr2=0x7f10f811bf60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:33.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.439+0000 7f10f7fff700 1 --2- 192.168.123.103:0/2917081287 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10f810f340 0x7f10f811bf60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.439+0000 7f10f7fff700 1 -- 192.168.123.103:0/2917081287 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f10ec0097e0 con 0x7f10f8116f60 2026-03-09T00:05:33.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.439+0000 7f10f7fff700 1 --2- 192.168.123.103:0/2917081287 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f10f8116f60 0x7f10f81173e0 secure :-1 s=READY pgs=333 cs=0 l=1 rev1=1 crypto rx=0x7f10f0000f80 tx=0x7f10f0003c30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:05:33.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.439+0000 7f10f5ffb700 1 -- 192.168.123.103:0/2917081287 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f10f000e050 con 0x7f10f8116f60 2026-03-09T00:05:33.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.439+0000 7f10fdb2b700 1 -- 192.168.123.103:0/2917081287 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f10f0009e30 con 0x7f10f8116f60 2026-03-09T00:05:33.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.440+0000 7f10fdb2b700 1 -- 192.168.123.103:0/2917081287 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f10f81b85d0 con 0x7f10f8116f60 2026-03-09T00:05:33.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.440+0000 7f10fdb2b700 1 -- 192.168.123.103:0/2917081287 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f10f804f2e0 con 0x7f10f8116f60 2026-03-09T00:05:33.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.440+0000 7f10f5ffb700 1 -- 192.168.123.103:0/2917081287 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f10f00045b0 con 0x7f10f8116f60 2026-03-09T00:05:33.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.441+0000 7f10f5ffb700 1 -- 192.168.123.103:0/2917081287 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f10f0012430 con 0x7f10f8116f60 2026-03-09T00:05:33.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.442+0000 7f10f5ffb700 1 -- 192.168.123.103:0/2917081287 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 32) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f10f0019040 con 0x7f10f8116f60 2026-03-09T00:05:33.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.443+0000 7f10f5ffb700 1 --2- 192.168.123.103:0/2917081287 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f10e0077340 0x7f10e0079800 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:05:33.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.443+0000 7f10f5ffb700 1 -- 192.168.123.103:0/2917081287 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f10f009a670 con 0x7f10f8116f60 2026-03-09T00:05:33.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.443+0000 7f10fcb29700 1 --2- 192.168.123.103:0/2917081287 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f10e0077340 0x7f10e0079800 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:05:33.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.445+0000 7f10fcb29700 1 --2- 192.168.123.103:0/2917081287 >> 
[v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f10e0077340 0x7f10e0079800 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f10ec005d10 tx=0x7f10ec009500 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:05:33.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.445+0000 7f10f5ffb700 1 -- 192.168.123.103:0/2917081287 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f10f0063320 con 0x7f10f8116f60
2026-03-09T00:05:33.580 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.580+0000 7f10fdb2b700 1 -- 192.168.123.103:0/2917081287 --> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f10f810b560 con 0x7f10e0077340
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (49s) 32s ago 5m 16.7M - 0.25.0 c8568f914cd2 6bc39b415ac6
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (6m) 32s ago 6m 8434k - 18.2.1 5be31c24972a b93f8a220f71
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (5m) 66s ago 5m 8522k - 18.2.1 5be31c24972a d06aea65065e
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (6m) 32s ago 6m 7402k - 18.2.1 5be31c24972a 320f8ef2d2cb
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (5m) 66s ago 5m 7411k - 18.2.1 5be31c24972a d9eb9a54d81d
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (34s) 32s ago 5m 39.0M - 10.4.0 c8b91775d855 00a3394cdec9
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (3m) 32s ago 3m 16.7M - 18.2.1 5be31c24972a 404501ca3f76
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (3m) 32s ago 3m 253M - 18.2.1 5be31c24972a b71cb8823eff
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (3m) 66s ago 3m 18.5M - 18.2.1 5be31c24972a 868f24dd3b07
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 running (3m) 66s ago 3m 14.7M - 18.2.1 5be31c24972a 84dbd6c37a69
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:8443,9283,8765 running (103s) 32s ago 6m 627M - 19.2.3-678-ge911bdeb 654f31e6858e 5c5f89207f88
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (81s) 66s ago 5m 487M - 19.2.3-678-ge911bdeb 654f31e6858e e3d70135f3ac
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (6m) 32s ago 6m 55.2M 2048M 18.2.1 5be31c24972a f9863944dcfb
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (5m) 66s ago 5m 45.6M 2048M 18.2.1 5be31c24972a 1e39c7ad3e9f
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (71s) 32s ago 5m 9261k - 1.7.0 72c9c2088986 0cdd6e671b4f
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (67s) 66s ago 5m 5511k - 1.7.0 72c9c2088986 848c5c72973d
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (5m) 32s ago 5m 343M 4096M 18.2.1 5be31c24972a 7582c56d43e3
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (4m) 32s ago 4m 360M 4096M 18.2.1 5be31c24972a 7bc729875521
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (4m) 32s ago 4m 314M 4096M 18.2.1 5be31c24972a 00566abbcc16
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (4m) 66s ago 4m 421M 4096M 18.2.1 5be31c24972a 1ece32056ab6
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (4m) 66s ago 4m 384M 4096M 18.2.1 5be31c24972a ee6260a1124c
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (4m) 66s ago 4m 342M 4096M 18.2.1 5be31c24972a f51e8cd94301
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (55s) 32s ago 5m 47.0M - 2.51.0 1d3b7f56885b 16d6071e49fb
2026-03-09T00:05:33.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.587+0000 7f10f5ffb700 1 -- 192.168.123.103:0/2917081287 <== mgr.24393 v2:192.168.123.103:6800/3123605642 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f10f810b560 con 0x7f10e0077340
2026-03-09T00:05:33.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.591+0000 7f10df7fe700 1 -- 192.168.123.103:0/2917081287 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f10e0077340 msgr2=0x7f10e0079800 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:33.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.591+0000 7f10df7fe700 1 --2- 192.168.123.103:0/2917081287 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f10e0077340 0x7f10e0079800 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f10ec005d10 tx=0x7f10ec009500 comp rx=0 tx=0).stop
2026-03-09T00:05:33.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.591+0000 7f10df7fe700 1 -- 192.168.123.103:0/2917081287 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f10f8116f60 msgr2=0x7f10f81173e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:33.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.591+0000 7f10df7fe700 1 --2- 192.168.123.103:0/2917081287 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f10f8116f60 0x7f10f81173e0 secure :-1 s=READY pgs=333 cs=0 l=1 rev1=1 crypto rx=0x7f10f0000f80 tx=0x7f10f0003c30 comp rx=0 tx=0).stop
2026-03-09T00:05:33.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.592+0000 7f10df7fe700 1 -- 192.168.123.103:0/2917081287 shutdown_connections
2026-03-09T00:05:33.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.592+0000 7f10df7fe700 1 --2- 192.168.123.103:0/2917081287 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f10e0077340 0x7f10e0079800 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:33.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.592+0000 7f10df7fe700 1 --2- 192.168.123.103:0/2917081287 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f10f810f340 0x7f10f811bf60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.591
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.592+0000 7f10df7fe700 1 --2- 192.168.123.103:0/2917081287 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f10f8116f60 0x7f10f81173e0 unknown :-1 s=CLOSED pgs=333 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.591 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.592+0000 7f10df7fe700 1 -- 192.168.123.103:0/2917081287 >> 192.168.123.103:0/2917081287 conn(0x7f10f806ce20 msgr2=0x7f10f810ae50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:33.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.592+0000 7f10df7fe700 1 -- 192.168.123.103:0/2917081287 shutdown_connections 2026-03-09T00:05:33.592 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.592+0000 7f10df7fe700 1 -- 192.168.123.103:0/2917081287 wait complete. 2026-03-09T00:05:33.682 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.683+0000 7f4bbaca4700 1 -- 192.168.123.103:0/1036490832 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bb41089a0 msgr2=0x7f4bb410be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.683+0000 7f4bbaca4700 1 --2- 192.168.123.103:0/1036490832 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bb41089a0 0x7f4bb410be70 secure :-1 s=READY pgs=334 cs=0 l=1 rev1=1 crypto rx=0x7f4bac00d3f0 tx=0x7f4bac00d700 comp rx=0 tx=0).stop 2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.683+0000 7f4bbaca4700 1 -- 192.168.123.103:0/1036490832 shutdown_connections 2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.683+0000 7f4bbaca4700 1 --2- 192.168.123.103:0/1036490832 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bb41089a0 0x7f4bb410be70 unknown :-1 s=CLOSED pgs=334 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.683+0000 7f4bbaca4700 1 --2- 192.168.123.103:0/1036490832 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4bb4107ff0 0x7f4bb41083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.683+0000 7f4bbaca4700 1 -- 192.168.123.103:0/1036490832 >> 192.168.123.103:0/1036490832 conn(0x7f4bb406ce20 msgr2=0x7f4bb406d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.683+0000 7f4bbaca4700 1 -- 192.168.123.103:0/1036490832 shutdown_connections 2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.683+0000 7f4bbaca4700 1 -- 192.168.123.103:0/1036490832 wait complete. 
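The orch ps table above catches the staggered upgrade mid-flight: both mgr daemons already report 19.2.3-678-ge911bdeb while every mon, mds, osd, and monitoring daemon still runs 18.2.1. For scripting the same check, orch ps can emit JSON instead of the table (a sketch; the field names follow cephadm's DaemonDescription output and may vary by release):

    # print each daemon with its running version; only the mgrs should show the target build yet
    ceph orch ps --format json | jq -r '.[] | "\(.daemon_type).\(.daemon_id) \(.version)"'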
2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.683+0000 7f4bbaca4700 1 Processor -- start
2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.683+0000 7f4bbaca4700 1 -- start start
2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.684+0000 7f4bbaca4700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4bb4107ff0 0x7f4bb4138380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.684+0000 7f4bbaca4700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bb4133330 0x7f4bb41337b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.684+0000 7f4bbaca4700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4bb4133e40 con 0x7f4bb4133330
2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.684+0000 7f4bbaca4700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4bb4133f80 con 0x7f4bb4107ff0
2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.684+0000 7f4bb3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bb4133330 0x7f4bb41337b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.684+0000 7f4bb3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bb4133330 0x7f4bb41337b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:46658/0 (socket says 192.168.123.103:46658)
2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.684+0000 7f4bb3fff700 1 -- 192.168.123.103:0/2764952725 learned_addr learned my addr 192.168.123.103:0/2764952725 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.684+0000 7f4bb8a40700 1 --2- 192.168.123.103:0/2764952725 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4bb4107ff0 0x7f4bb4138380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.684+0000 7f4bb8a40700 1 -- 192.168.123.103:0/2764952725 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bb4133330 msgr2=0x7f4bb41337b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.684+0000 7f4bb8a40700 1 --2- 192.168.123.103:0/2764952725 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bb4133330 0x7f4bb41337b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:33.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.684+0000 7f4bb8a40700 1 -- 192.168.123.103:0/2764952725 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4bac007ed0 con 0x7f4bb4107ff0
2026-03-09T00:05:33.684 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.684+0000 7f4bb8a40700 1 --2- 192.168.123.103:0/2764952725 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4bb4107ff0 0x7f4bb4138380 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f4ba400b700 tx=0x7f4ba400ba10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:05:33.684 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.685+0000 7f4bb1ffb700 1 -- 192.168.123.103:0/2764952725 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4ba4011840 con 0x7f4bb4107ff0
2026-03-09T00:05:33.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.685+0000 7f4bbaca4700 1 -- 192.168.123.103:0/2764952725 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4bb407ee80 con 0x7f4bb4107ff0
2026-03-09T00:05:33.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.685+0000 7f4bbaca4700 1 -- 192.168.123.103:0/2764952725 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4bb407f3d0 con 0x7f4bb4107ff0
2026-03-09T00:05:33.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.686+0000 7f4bb1ffb700 1 -- 192.168.123.103:0/2764952725 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f4ba4011e80 con 0x7f4bb4107ff0
2026-03-09T00:05:33.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.686+0000 7f4bb1ffb700 1 -- 192.168.123.103:0/2764952725 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4ba400f550 con 0x7f4bb4107ff0
2026-03-09T00:05:33.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.686+0000 7f4bbaca4700 1 -- 192.168.123.103:0/2764952725 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4ba0005320 con 0x7f4bb4107ff0
2026-03-09T00:05:33.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.687+0000 7f4bb1ffb700 1 -- 192.168.123.103:0/2764952725 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 32) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f4ba40119a0 con 0x7f4bb4107ff0
2026-03-09T00:05:33.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.688+0000 7f4bb1ffb700 1 --2- 192.168.123.103:0/2764952725 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f4b9c077660 0x7f4b9c079b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:33.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.688+0000 7f4bb1ffb700 1 -- 192.168.123.103:0/2764952725 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f4ba4099560 con 0x7f4bb4107ff0
2026-03-09T00:05:33.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.688+0000 7f4bb3fff700 1 --2- 192.168.123.103:0/2764952725 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f4b9c077660 0x7f4b9c079b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:33.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.689+0000 7f4bb3fff700 1 --2- 192.168.123.103:0/2764952725 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f4b9c077660 0x7f4b9c079b20 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f4bac00e010 tx=0x7f4bac007e10 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:05:33.694 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.692+0000 7f4bb1ffb700 1 -- 192.168.123.103:0/2764952725 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f4ba4062290 con 0x7f4bb4107ff0
2026-03-09T00:05:33.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.913+0000 7f4bbaca4700 1 -- 192.168.123.103:0/2764952725 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f4ba0005cc0 con 0x7f4bb4107ff0
2026-03-09T00:05:33.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.913+0000 7f4bb1ffb700 1 -- 192.168.123.103:0/2764952725 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7f4ba40619e0 con 0x7f4bb4107ff0
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 12,
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-09T00:05:33.913 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:05:33.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.916+0000 7f4b9b7fe700 1 -- 192.168.123.103:0/2764952725 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f4b9c077660 msgr2=0x7f4b9c079b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:33.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.916+0000 7f4b9b7fe700 1 --2- 192.168.123.103:0/2764952725 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f4b9c077660 0x7f4b9c079b20 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f4bac00e010 tx=0x7f4bac007e10 comp rx=0 tx=0).stop
2026-03-09T00:05:33.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.918+0000 7f4b9b7fe700 1 -- 192.168.123.103:0/2764952725 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4bb4107ff0 msgr2=0x7f4bb4138380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:33.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.918+0000 7f4b9b7fe700 1 --2- 192.168.123.103:0/2764952725 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4bb4107ff0 0x7f4bb4138380 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f4ba400b700 tx=0x7f4ba400ba10 comp rx=0 tx=0).stop
2026-03-09T00:05:33.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.918+0000 7f4b9b7fe700 1 -- 192.168.123.103:0/2764952725 shutdown_connections
2026-03-09T00:05:33.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.918+0000 7f4b9b7fe700 1 --2- 192.168.123.103:0/2764952725 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f4b9c077660 0x7f4b9c079b20 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:33.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.918+0000 7f4b9b7fe700 1 --2- 192.168.123.103:0/2764952725 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4bb4107ff0 0x7f4bb4138380 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:33.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.918+0000 7f4b9b7fe700 1 --2- 192.168.123.103:0/2764952725 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4bb4133330 0x7f4bb41337b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:33.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.918+0000 7f4b9b7fe700 1 -- 192.168.123.103:0/2764952725 >> 192.168.123.103:0/2764952725 conn(0x7f4bb406ce20 msgr2=0x7f4bb40706a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:05:33.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.918+0000 7f4b9b7fe700 1 -- 192.168.123.103:0/2764952725 shutdown_connections
2026-03-09T00:05:33.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:33.918+0000 7f4b9b7fe700 1 -- 192.168.123.103:0/2764952725 wait complete.
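The "versions" output above is the checkpoint that matters at this stage of the staggered upgrade: both mgr daemons already report 19.2.3-678-ge911bdeb (squid), while all mon, osd and mds daemons still report 18.2.1 (reef). A minimal sketch of that check in Python, assuming only the JSON shape printed above and a `ceph` binary on PATH (an illustration, not the suite's own helper):

import json
import subprocess

def version_counts(daemon: str) -> dict:
    # `ceph versions` prints the JSON object shown in the log above:
    # daemon type -> {version string: daemon count}.
    out = subprocess.check_output(["ceph", "versions"])
    return json.loads(out).get(daemon, {})

def all_on(daemon: str, release: str) -> bool:
    counts = version_counts(daemon)
    return bool(counts) and all(release in ver for ver in counts)

# After the "mon-mgr first" step we expect mgrs on the target build only:
assert all_on("mgr", "squid")
assert all_on("mon", "reef") and all_on("osd", "reef") and all_on("mds", "reef")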
2026-03-09T00:05:34.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.015+0000 7fe9ed1a5700 1 -- 192.168.123.103:0/1757138925 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e81089a0 msgr2=0x7fe9e810be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:34.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.015+0000 7fe9ed1a5700 1 --2- 192.168.123.103:0/1757138925 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e81089a0 0x7fe9e810be70 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7fe9e0009230 tx=0x7fe9e0009260 comp rx=0 tx=0).stop
2026-03-09T00:05:34.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.015+0000 7fe9ed1a5700 1 -- 192.168.123.103:0/1757138925 shutdown_connections
2026-03-09T00:05:34.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.015+0000 7fe9ed1a5700 1 --2- 192.168.123.103:0/1757138925 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e81089a0 0x7fe9e810be70 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:34.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.015+0000 7fe9ed1a5700 1 --2- 192.168.123.103:0/1757138925 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9e8107ff0 0x7fe9e81083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:34.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.015+0000 7fe9ed1a5700 1 -- 192.168.123.103:0/1757138925 >> 192.168.123.103:0/1757138925 conn(0x7fe9e806ce20 msgr2=0x7fe9e806d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:05:34.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.015+0000 7fe9ed1a5700 1 -- 192.168.123.103:0/1757138925 shutdown_connections
2026-03-09T00:05:34.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.015+0000 7fe9ed1a5700 1 -- 192.168.123.103:0/1757138925 wait complete.
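Each of these short stderr bursts is one `ceph` CLI invocation: a fresh librados client starts its messenger ("Processor -- start"), learns its address, keeps a single mon session (dropping the other probe connection), runs one command, and tears everything down ("mark_down ... wait complete."). The same lifecycle, sketched with python3-rados; the conffile path is a placeholder, not taken from this run:

import json
import rados

# Start the messenger and authenticate to the mons -- the
# "Processor -- start" / mon_getmap / handle_hello portion of each burst.
cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")  # placeholder path
cluster.connect()
# One mon command per process, exactly like the CLI calls logged above.
ret, outbuf, outs = cluster.mon_command(json.dumps({"prefix": "versions"}), b"")
print(outbuf.decode())
# Drop every connection and wait -- the "shutdown_connections ...
# wait complete." tail of each burst.
cluster.shutdown()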
2026-03-09T00:05:34.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.016+0000 7fe9ed1a5700 1 Processor -- start
2026-03-09T00:05:34.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.016+0000 7fe9ed1a5700 1 -- start start
2026-03-09T00:05:34.016 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.016+0000 7fe9ed1a5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9e8107ff0 0x7fe9e807cf80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:34.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.016+0000 7fe9ed1a5700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e807d4c0 0x7fe9e807d940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:34.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.016+0000 7fe9ed1a5700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe9e8081a60 con 0x7fe9e8107ff0
2026-03-09T00:05:34.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.016+0000 7fe9ed1a5700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe9e8081bd0 con 0x7fe9e807d4c0
2026-03-09T00:05:34.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.017+0000 7fe9e659c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e807d4c0 0x7fe9e807d940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:34.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.017+0000 7fe9e659c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e807d4c0 0x7fe9e807d940 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:50728/0 (socket says 192.168.123.103:50728)
2026-03-09T00:05:34.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.017+0000 7fe9e659c700 1 -- 192.168.123.103:0/1692287381 learned_addr learned my addr 192.168.123.103:0/1692287381 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:05:34.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.017+0000 7fe9e6d9d700 1 --2- 192.168.123.103:0/1692287381 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9e8107ff0 0x7fe9e807cf80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:34.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.017+0000 7fe9e659c700 1 -- 192.168.123.103:0/1692287381 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9e8107ff0 msgr2=0x7fe9e807cf80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:34.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.017+0000 7fe9e659c700 1 --2- 192.168.123.103:0/1692287381 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9e8107ff0 0x7fe9e807cf80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:34.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.017+0000 7fe9e659c700 1 -- 192.168.123.103:0/1692287381 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe9e0008ee0 con 0x7fe9e807d4c0
2026-03-09T00:05:34.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.018+0000 7fe9e659c700 1 --2- 192.168.123.103:0/1692287381 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e807d4c0 0x7fe9e807d940 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7fe9e0011fd0 tx=0x7fe9e000bc70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:05:34.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.019+0000 7fe9cffff700 1 -- 192.168.123.103:0/1692287381 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe9e00078f0 con 0x7fe9e807d4c0
2026-03-09T00:05:34.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.019+0000 7fe9cffff700 1 -- 192.168.123.103:0/1692287381 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe9e0008020 con 0x7fe9e807d4c0
2026-03-09T00:05:34.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.019+0000 7fe9cffff700 1 -- 192.168.123.103:0/1692287381 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe9e0023a20 con 0x7fe9e807d4c0
2026-03-09T00:05:34.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.019+0000 7fe9ed1a5700 1 -- 192.168.123.103:0/1692287381 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe9e8081e50 con 0x7fe9e807d4c0
2026-03-09T00:05:34.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.019+0000 7fe9ed1a5700 1 -- 192.168.123.103:0/1692287381 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe9e80823a0 con 0x7fe9e807d4c0
2026-03-09T00:05:34.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.020+0000 7fe9ed1a5700 1 -- 192.168.123.103:0/1692287381 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe9e804f2e0 con 0x7fe9e807d4c0
2026-03-09T00:05:34.020 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.021+0000 7fe9cffff700 1 -- 192.168.123.103:0/1692287381 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 32) v1 ==== 99964+0+0 (secure 0 0 0) 0x7fe9e0008640 con 0x7fe9e807d4c0
2026-03-09T00:05:34.021 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.021+0000 7fe9cffff700 1 --2- 192.168.123.103:0/1692287381 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fe9d0077660 0x7fe9d0079b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:34.021 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.022+0000 7fe9cffff700 1 -- 192.168.123.103:0/1692287381 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fe9e009ff60 con 0x7fe9e807d4c0
2026-03-09T00:05:34.021 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.022+0000 7fe9e6d9d700 1 --2- 192.168.123.103:0/1692287381 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fe9d0077660 0x7fe9d0079b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:34.021 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.022+0000 7fe9e6d9d700 1 --2- 192.168.123.103:0/1692287381 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fe9d0077660 0x7fe9d0079b20 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fe9d800be60 tx=0x7fe9d800d040 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:05:34.024 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.025+0000 7fe9cffff700 1 -- 192.168.123.103:0/1692287381 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fe9e00cfc50 con 0x7fe9e807d4c0
2026-03-09T00:05:34.194 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:34 vm03.local ceph-mon[52346]: from='client.14694 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:05:34.194 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:34 vm03.local ceph-mon[52346]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T00:05:34.194 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:34 vm03.local ceph-mon[52346]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T00:05:34.194 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:34 vm03.local ceph-mon[52346]: from='client.14698 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:05:34.194 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:34 vm03.local ceph-mon[52346]: from='client.? 192.168.123.103:0/2764952725' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:34.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.259+0000 7fe9ed1a5700 1 -- 192.168.123.103:0/1692287381 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fe9e804ea90 con 0x7fe9e807d4c0
2026-03-09T00:05:34.266 INFO:teuthology.orchestra.run.vm03.stdout:e11
2026-03-09T00:05:34.266 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-09T00:05:34.266 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:05:34.266 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1
2026-03-09T00:05:34.266 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:05:34.266 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1)
2026-03-09T00:05:34.266 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs
2026-03-09T00:05:34.266 INFO:teuthology.orchestra.run.vm03.stdout:epoch 11
2026-03-09T00:05:34.266 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-09T00:05:34.266 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T00:01:42.952984+0000
2026-03-09T00:05:34.266 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T00:01:51.424075+0000
2026-03-09T00:05:34.266 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0
2026-03-09T00:05:34.266 INFO:teuthology.orchestra.run.vm03.stdout:root 0
2026-03-09T00:05:34.266 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {}
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 39
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:in 0
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14480}
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:failed
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:damaged
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:stopped
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3]
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.sejksk{0:14480} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons:
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.ralade{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.ixduim{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.vlrwtl{-1:24297} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6824/1090058295,v1:192.168.123.106:6825/1090058295] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.263+0000 7fe9cffff700 1 -- 192.168.123.103:0/1692287381 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1828 (secure 0 0 0) 0x7fe9e00685c0 con 0x7fe9e807d4c0
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.267+0000 7fe9cdffb700 1 -- 192.168.123.103:0/1692287381 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fe9d0077660 msgr2=0x7fe9d0079b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.267+0000 7fe9cdffb700 1 --2- 192.168.123.103:0/1692287381 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fe9d0077660 0x7fe9d0079b20 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fe9d800be60 tx=0x7fe9d800d040 comp rx=0 tx=0).stop
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.267+0000 7fe9cdffb700 1 -- 192.168.123.103:0/1692287381 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e807d4c0 msgr2=0x7fe9e807d940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:34.267 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.267+0000 7fe9cdffb700 1 --2- 192.168.123.103:0/1692287381 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e807d4c0 0x7fe9e807d940 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7fe9e0011fd0 tx=0x7fe9e000bc70 comp rx=0 tx=0).stop
2026-03-09T00:05:34.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.267+0000 7fe9cdffb700 1 -- 192.168.123.103:0/1692287381 shutdown_connections
2026-03-09T00:05:34.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.267+0000 7fe9cdffb700 1 --2- 192.168.123.103:0/1692287381 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fe9d0077660 0x7fe9d0079b20 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:34.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.267+0000 7fe9cdffb700 1 --2- 192.168.123.103:0/1692287381 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe9e8107ff0 0x7fe9e807cf80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:34.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.267+0000 7fe9cdffb700 1 --2- 192.168.123.103:0/1692287381 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe9e807d4c0 0x7fe9e807d940 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:34.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.267+0000 7fe9cdffb700 1 -- 192.168.123.103:0/1692287381 >> 192.168.123.103:0/1692287381 conn(0x7fe9e806ce20 msgr2=0x7fe9e80705f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:05:34.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.272+0000 7fe9cdffb700 1 -- 192.168.123.103:0/1692287381 shutdown_connections
2026-03-09T00:05:34.281 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.272+0000 7fe9cdffb700 1 -- 192.168.123.103:0/1692287381 wait complete.
2026-03-09T00:05:34.281 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 11
2026-03-09T00:05:34.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.380+0000 7f3e9b414700 1 -- 192.168.123.103:0/1952129620 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e941089a0 msgr2=0x7f3e9410be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:34.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.380+0000 7f3e9b414700 1 --2- 192.168.123.103:0/1952129620 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e941089a0 0x7f3e9410be70 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f3e8c00cd40 tx=0x7f3e8c00a320 comp rx=0 tx=0).stop
2026-03-09T00:05:34.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.380+0000 7f3e9b414700 1 -- 192.168.123.103:0/1952129620 shutdown_connections
2026-03-09T00:05:34.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.380+0000 7f3e9b414700 1 --2- 192.168.123.103:0/1952129620 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e941089a0 0x7f3e9410be70 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:34.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.380+0000 7f3e9b414700 1 --2- 192.168.123.103:0/1952129620 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3e94107ff0 0x7f3e941083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:34.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.380+0000 7f3e9b414700 1 -- 192.168.123.103:0/1952129620 >> 192.168.123.103:0/1952129620 conn(0x7f3e9406ce20 msgr2=0x7f3e9406d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:05:34.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.381+0000 7f3e9b414700 1 -- 192.168.123.103:0/1952129620 shutdown_connections
2026-03-09T00:05:34.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.381+0000 7f3e9b414700 1 -- 192.168.123.103:0/1952129620 wait complete.
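The plain-text "fs dump" above is what the task asserts against at this point: max_mds 1, a single active rank (up {0=14480}), inline_data enabled, and three standbys. A hedged sketch of the same checks against the JSON form of the dump; the field names (filesystems, mdsmap, standbys, inline_data) are assumed from the JSON output of `ceph fs dump` and should be verified before relying on this:

import json
import subprocess

dump = json.loads(subprocess.check_output(
    ["ceph", "fs", "dump", "--format", "json"]))
mdsmap = dump["filesystems"][0]["mdsmap"]   # field names assumed

assert mdsmap["max_mds"] == 1
assert len(mdsmap["in"]) == 1               # exactly rank 0 active
assert mdsmap["inline_data"]                # matches "inline_data enabled"
assert len(dump["standbys"]) >= 1           # standby daemons available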
2026-03-09T00:05:34.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.381+0000 7f3e9b414700 1 Processor -- start
2026-03-09T00:05:34.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.381+0000 7f3e9b414700 1 -- start start
2026-03-09T00:05:34.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.382+0000 7f3e9b414700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3e94107ff0 0x7f3e9407cf50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:34.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.382+0000 7f3e9b414700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e9407d490 0x7f3e9407d910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:34.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.382+0000 7f3e9b414700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3e94081ad0 con 0x7f3e94107ff0
2026-03-09T00:05:34.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.382+0000 7f3e9b414700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3e94081c40 con 0x7f3e9407d490
2026-03-09T00:05:34.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.382+0000 7f3e989af700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e9407d490 0x7f3e9407d910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:34.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.382+0000 7f3e989af700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e9407d490 0x7f3e9407d910 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:50740/0 (socket says 192.168.123.103:50740)
2026-03-09T00:05:34.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.382+0000 7f3e989af700 1 -- 192.168.123.103:0/3669064224 learned_addr learned my addr 192.168.123.103:0/3669064224 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:05:34.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.382+0000 7f3e991b0700 1 --2- 192.168.123.103:0/3669064224 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3e94107ff0 0x7f3e9407cf50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:34.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.382+0000 7f3e989af700 1 -- 192.168.123.103:0/3669064224 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3e94107ff0 msgr2=0x7f3e9407cf50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:34.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.382+0000 7f3e989af700 1 --2- 192.168.123.103:0/3669064224 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3e94107ff0 0x7f3e9407cf50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:34.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.382+0000 7f3e989af700 1 -- 192.168.123.103:0/3669064224 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3e8c00c9f0 con 0x7f3e9407d490
2026-03-09T00:05:34.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.383+0000 7f3e989af700 1 --2- 192.168.123.103:0/3669064224 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e9407d490 0x7f3e9407d910 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f3e8c007800 tx=0x7f3e8c0078e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:05:34.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.383+0000 7f3e8a7fc700 1 -- 192.168.123.103:0/3669064224 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3e8c0096a0 con 0x7f3e9407d490
2026-03-09T00:05:34.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.383+0000 7f3e9b414700 1 -- 192.168.123.103:0/3669064224 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3e94081ec0 con 0x7f3e9407d490
2026-03-09T00:05:34.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.383+0000 7f3e9b414700 1 -- 192.168.123.103:0/3669064224 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3e94082410 con 0x7f3e9407d490
2026-03-09T00:05:34.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.384+0000 7f3e8a7fc700 1 -- 192.168.123.103:0/3669064224 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3e8c00ce90 con 0x7f3e9407d490
2026-03-09T00:05:34.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.384+0000 7f3e8a7fc700 1 -- 192.168.123.103:0/3669064224 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3e8c004260 con 0x7f3e9407d490
2026-03-09T00:05:34.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.385+0000 7f3e8a7fc700 1 -- 192.168.123.103:0/3669064224 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 32) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f3e8c0043c0 con 0x7f3e9407d490
2026-03-09T00:05:34.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.386+0000 7f3e8a7fc700 1 --2- 192.168.123.103:0/3669064224 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f3e80077590 0x7f3e80079a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:34.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.386+0000 7f3e8a7fc700 1 -- 192.168.123.103:0/3669064224 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f3e8c09add0 con 0x7f3e9407d490
2026-03-09T00:05:34.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.386+0000 7f3e991b0700 1 --2- 192.168.123.103:0/3669064224 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f3e80077590 0x7f3e80079a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:34.386 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.387+0000 7f3e9b414700 1 -- 192.168.123.103:0/3669064224 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3e78005320 con 0x7f3e9407d490
2026-03-09T00:05:34.386 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.387+0000 7f3e991b0700 1 --2- 192.168.123.103:0/3669064224 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f3e80077590 0x7f3e80079a50 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f3e90005950 tx=0x7f3e900058e0 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:05:34.393 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.390+0000 7f3e8a7fc700 1 -- 192.168.123.103:0/3669064224 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f3e8c063a80 con 0x7f3e9407d490
2026-03-09T00:05:34.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:34 vm06.local ceph-mon[58395]: from='client.14694 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:05:34.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:34 vm06.local ceph-mon[58395]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T00:05:34.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:34 vm06.local ceph-mon[58395]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-09T00:05:34.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:34 vm06.local ceph-mon[58395]: from='client.14698 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:05:34.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:34 vm06.local ceph-mon[58395]: from='client.? 192.168.123.103:0/2764952725' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:34.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.683+0000 7f3e9b414700 1 -- 192.168.123.103:0/3669064224 --> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3e78000bf0 con 0x7f3e80077590
2026-03-09T00:05:34.684 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.684+0000 7f3e8a7fc700 1 -- 192.168.123.103:0/3669064224 <== mgr.24393 v2:192.168.123.103:6800/3123605642 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7f3e78000bf0 con 0x7f3e80077590
2026-03-09T00:05:34.684 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:05:34.684 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df",
2026-03-09T00:05:34.684 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true,
2026-03-09T00:05:34.684 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-09T00:05:34.684 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [],
2026-03-09T00:05:34.684 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "",
2026-03-09T00:05:34.684 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image",
2026-03-09T00:05:34.684 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false
2026-03-09T00:05:34.684 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:05:34.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.689+0000 7f3e7ffff700 1 -- 192.168.123.103:0/3669064224 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f3e80077590 msgr2=0x7f3e80079a50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:34.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.689+0000 7f3e7ffff700 1 --2- 192.168.123.103:0/3669064224 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f3e80077590 0x7f3e80079a50 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f3e90005950 tx=0x7f3e900058e0 comp rx=0 tx=0).stop
2026-03-09T00:05:34.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.689+0000 7f3e7ffff700 1 -- 192.168.123.103:0/3669064224 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e9407d490 msgr2=0x7f3e9407d910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:34.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.689+0000 7f3e7ffff700 1 --2- 192.168.123.103:0/3669064224 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e9407d490 0x7f3e9407d910 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f3e8c007800 tx=0x7f3e8c0078e0 comp rx=0 tx=0).stop
2026-03-09T00:05:34.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.689+0000 7f3e7ffff700 1 -- 192.168.123.103:0/3669064224 shutdown_connections
2026-03-09T00:05:34.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.689+0000 7f3e7ffff700 1 --2- 192.168.123.103:0/3669064224 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7f3e80077590 0x7f3e80079a50 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:34.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.689+0000 7f3e7ffff700 1 --2- 192.168.123.103:0/3669064224 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3e94107ff0 0x7f3e9407cf50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:34.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.689+0000 7f3e7ffff700 1 --2- 192.168.123.103:0/3669064224 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3e9407d490 0x7f3e9407d910 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:34.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.689+0000 7f3e7ffff700 1 -- 192.168.123.103:0/3669064224 >> 192.168.123.103:0/3669064224 conn(0x7f3e9406ce20 msgr2=0x7f3e940705f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:05:34.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.690+0000 7f3e7ffff700 1 -- 192.168.123.103:0/3669064224 shutdown_connections
2026-03-09T00:05:34.689 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.690+0000 7f3e7ffff700 1 -- 192.168.123.103:0/3669064224 wait complete.
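The JSON object above is what `ceph orch upgrade status` returns while cephadm works through the daemons; "in_progress" flips to false once the run finishes. A minimal polling loop over that output (the interval and timeout are arbitrary illustration values, not taken from the test):

import json
import subprocess
import time

def upgrade_status() -> dict:
    # `ceph orch upgrade status` prints the JSON shown in the log above.
    out = subprocess.check_output(["ceph", "orch", "upgrade", "status"])
    return json.loads(out)

status = upgrade_status()
deadline = time.time() + 1800          # 30 minutes, arbitrary
while status["in_progress"]:
    if time.time() > deadline:
        raise TimeoutError("upgrade still running: %s" % status["message"])
    print(status["message"] or status["progress"])
    time.sleep(30)
    status = upgrade_status()
print("services complete:", status["services_complete"])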
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.823+0000 7fe437fff700 1 -- 192.168.123.103:0/2164246283 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe43810d310 msgr2=0x7fe43810d6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.823+0000 7fe437fff700 1 --2- 192.168.123.103:0/2164246283 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe43810d310 0x7fe43810d6f0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7fe428007780 tx=0x7fe42800c050 comp rx=0 tx=0).stop
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.823+0000 7fe437fff700 1 -- 192.168.123.103:0/2164246283 shutdown_connections
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.823+0000 7fe437fff700 1 --2- 192.168.123.103:0/2164246283 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe438107d90 0x7fe4381081f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.823+0000 7fe437fff700 1 --2- 192.168.123.103:0/2164246283 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe43810d310 0x7fe43810d6f0 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.823+0000 7fe437fff700 1 -- 192.168.123.103:0/2164246283 >> 192.168.123.103:0/2164246283 conn(0x7fe43806ce20 msgr2=0x7fe43806d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.823+0000 7fe437fff700 1 -- 192.168.123.103:0/2164246283 shutdown_connections
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.823+0000 7fe437fff700 1 -- 192.168.123.103:0/2164246283 wait complete.
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.824+0000 7fe437fff700 1 Processor -- start
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.824+0000 7fe437fff700 1 -- start start
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.824+0000 7fe437fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe438107d90 0x7fe438133130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.824+0000 7fe437fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe438133670 0x7fe438133af0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.824+0000 7fe437fff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe43807ef70 con 0x7fe438133670
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.824+0000 7fe437fff700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe43807f0e0 con 0x7fe438107d90
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.824+0000 7fe436ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe438107d90 0x7fe438133130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.824+0000 7fe436ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe438107d90 0x7fe438133130 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:55540/0 (socket says 192.168.123.103:55540)
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.824+0000 7fe436ffd700 1 -- 192.168.123.103:0/1522140925 learned_addr learned my addr 192.168.123.103:0/1522140925 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.825+0000 7fe436ffd700 1 -- 192.168.123.103:0/1522140925 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe438133670 msgr2=0x7fe438133af0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.825+0000 7fe436ffd700 1 --2- 192.168.123.103:0/1522140925 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe438133670 0x7fe438133af0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.825+0000 7fe436ffd700 1 -- 192.168.123.103:0/1522140925 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe428007430 con 0x7fe438107d90
2026-03-09T00:05:34.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.825+0000 7fe436ffd700 1 --2- 192.168.123.103:0/1522140925 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe438107d90 0x7fe438133130 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7fe42800c9c0 tx=0x7fe42800caa0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:05:34.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.825+0000 7fe41ffff700 1 -- 192.168.123.103:0/1522140925 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe428021070 con 0x7fe438107d90
2026-03-09T00:05:34.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.825+0000 7fe437fff700 1 -- 192.168.123.103:0/1522140925 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe43807f310 con 0x7fe438107d90
2026-03-09T00:05:34.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.825+0000 7fe437fff700 1 -- 192.168.123.103:0/1522140925 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe43807f800 con 0x7fe438107d90
2026-03-09T00:05:34.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.826+0000 7fe41ffff700 1 -- 192.168.123.103:0/1522140925 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe428004500 con 0x7fe438107d90
2026-03-09T00:05:34.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.826+0000 7fe41ffff700 1 -- 192.168.123.103:0/1522140925 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe42800f050 con 0x7fe438107d90
2026-03-09T00:05:34.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.827+0000 7fe41ffff700 1 -- 192.168.123.103:0/1522140925 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 32) v1 ==== 99964+0+0 (secure 0 0 0) 0x7fe428004020 con 0x7fe438107d90
2026-03-09T00:05:34.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.830+0000 7fe41ffff700 1 --2- 192.168.123.103:0/1522140925 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fe420077590 0x7fe420079a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:05:34.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.831+0000 7fe41ffff700 1 -- 192.168.123.103:0/1522140925 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fe428099990 con 0x7fe438107d90
2026-03-09T00:05:34.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.831+0000 7fe437fff700 1 -- 192.168.123.103:0/1522140925 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe424005320 con 0x7fe438107d90
2026-03-09T00:05:34.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.832+0000 7fe4367fc700 1 --2- 192.168.123.103:0/1522140925 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fe420077590 0x7fe420079a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:05:34.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.833+0000 7fe4367fc700 1 --2- 192.168.123.103:0/1522140925 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fe420077590 0x7fe420079a50 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fe43000aab0 tx=0x7fe430009250 comp rx=0 tx=0).ready entity=mgr.24393 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:05:34.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:34.835+0000 7fe41ffff700 1 -- 192.168.123.103:0/1522140925 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fe428062640 con 0x7fe438107d90
2026-03-09T00:05:35.117 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:35.118+0000 7fe437fff700 1 -- 192.168.123.103:0/1522140925 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fe424005190 con 0x7fe438107d90
2026-03-09T00:05:35.118 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:35.118+0000 7fe41ffff700 1 -- 192.168.123.103:0/1522140925 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7fe428018020 con 0x7fe438107d90
2026-03-09T00:05:35.118 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data
2026-03-09T00:05:35.118 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data
2026-03-09T00:05:35.118 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled.
2026-03-09T00:05:35.120 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:35.121+0000 7fe41dffb700 1 -- 192.168.123.103:0/1522140925 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fe420077590 msgr2=0x7fe420079a50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:35.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:35.121+0000 7fe41dffb700 1 --2- 192.168.123.103:0/1522140925 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fe420077590 0x7fe420079a50 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fe43000aab0 tx=0x7fe430009250 comp rx=0 tx=0).stop
2026-03-09T00:05:35.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:35.121+0000 7fe41dffb700 1 -- 192.168.123.103:0/1522140925 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe438107d90 msgr2=0x7fe438133130 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:05:35.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:35.121+0000 7fe41dffb700 1 --2- 192.168.123.103:0/1522140925 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe438107d90 0x7fe438133130 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7fe42800c9c0 tx=0x7fe42800caa0 comp rx=0 tx=0).stop
2026-03-09T00:05:35.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:35.122+0000 7fe41dffb700 1 -- 192.168.123.103:0/1522140925 shutdown_connections
2026-03-09T00:05:35.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:35.122+0000 7fe41dffb700 1 --2- 192.168.123.103:0/1522140925 >> [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642] conn(0x7fe420077590 0x7fe420079a50 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:35.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:35.122+0000 7fe41dffb700 1 --2- 192.168.123.103:0/1522140925 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe438107d90 0x7fe438133130 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:35.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:35.122+0000 7fe41dffb700 1 --2- 192.168.123.103:0/1522140925 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe438133670 0x7fe438133af0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:05:35.121 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:35.122+0000 7fe41dffb700 1 -- 192.168.123.103:0/1522140925 >> 192.168.123.103:0/1522140925 conn(0x7fe43806ce20 msgr2=0x7fe4380710f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:05:35.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:35.123+0000 7fe41dffb700 1 -- 192.168.123.103:0/1522140925 shutdown_connections
2026-03-09T00:05:35.122 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:05:35.123+0000 7fe41dffb700 1 -- 192.168.123.103:0/1522140925 wait complete.
2026-03-09T00:05:35.389 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:35 vm03.local ceph-mon[52346]: from='client.24505 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:05:35.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:35 vm03.local ceph-mon[52346]: pgmap v50: 65 pgs: 65 active+clean; 304 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 977 KiB/s wr, 393 op/s
2026-03-09T00:05:35.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:35 vm03.local ceph-mon[52346]: from='client.14706 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:05:35.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:35 vm03.local ceph-mon[52346]: from='client.? 192.168.123.103:0/1692287381' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T00:05:35.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:35 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:35.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:35 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:05:35.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:35 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:35.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:35 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:35.390 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:35 vm03.local ceph-mon[52346]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "quorum_status"}]: dispatch
2026-03-09T00:05:35.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:35 vm06.local ceph-mon[58395]: from='client.24505 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:05:35.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:35 vm06.local ceph-mon[58395]: pgmap v50: 65 pgs: 65 active+clean; 304 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 977 KiB/s wr, 393 op/s
2026-03-09T00:05:35.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:35 vm06.local ceph-mon[58395]: from='client.14706 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:05:35.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:35 vm06.local ceph-mon[58395]: from='client.? 192.168.123.103:0/1692287381' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T00:05:35.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:35 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:35.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:35 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:05:35.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:35 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:05:35.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:35 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons'
2026-03-09T00:05:35.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:35 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "quorum_status"}]: dispatch
2026-03-09T00:05:36.294 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:36 vm03.local systemd[1]: Stopping Ceph mon.vm03 for ae8f0172-1b4a-11f1-916a-712b2ac006b7...
2026-03-09T00:05:36.294 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:36 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03[52342]: 2026-03-09T00:05:36.119+0000 7f1c356aa700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm03 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0
2026-03-09T00:05:36.294 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:36 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03[52342]: 2026-03-09T00:05:36.119+0000 7f1c356aa700 -1 mon.vm03@0(leader) e2 *** Got Signal Terminated ***
2026-03-09T00:05:36.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:36 vm03.local podman[129538]: 2026-03-09 00:05:36.293870683 +0000 UTC m=+0.198031417 container died f9863944dcfb0d091ecc36cb189641e022fde809f5586637fc314c55837f5195 (image=quay.io/ceph/ceph:v18.2.1, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, maintainer=Guillaume Abrioux , GIT_BRANCH=HEAD, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, org.label-schema.vendor=CentOS, GIT_CLEAN=True, RELEASE=HEAD, ceph=True, io.buildah.version=1.29.1, org.label-schema.license=GPLv2, CEPH_POINT_RELEASE=-18.2.1, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.build-date=20240222)
2026-03-09T00:05:36.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:36 vm03.local podman[129538]: 2026-03-09 00:05:36.317214057 +0000 UTC m=+0.221374791 container remove f9863944dcfb0d091ecc36cb189641e022fde809f5586637fc314c55837f5195 (image=quay.io/ceph/ceph:v18.2.1, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, CEPH_POINT_RELEASE=-18.2.1, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.label-schema.build-date=20240222, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS, GIT_CLEAN=True, org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, RELEASE=HEAD, io.buildah.version=1.29.1)
2026-03-09T00:05:36.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:36 vm03.local bash[129538]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03
2026-03-09T00:05:36.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:36 vm03.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mon.vm03.service: Deactivated successfully.
2026-03-09T00:05:36.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:36 vm03.local systemd[1]: Stopped Ceph mon.vm03 for ae8f0172-1b4a-11f1-916a-712b2ac006b7.
2026-03-09T00:05:36.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:36 vm03.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mon.vm03.service: Consumed 7.041s CPU time.
2026-03-09T00:05:36.918 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:36 vm03.local systemd[1]: Starting Ceph mon.vm03 for ae8f0172-1b4a-11f1-916a-712b2ac006b7...
2026-03-09T00:05:36.918 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:36 vm03.local podman[129655]: 2026-03-09 00:05:36.849470892 +0000 UTC m=+0.059486815 container create cafe87ec117d4a06b04edbb0a533db1e7f053ff2bb1fe4fd3a0662b859594874 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS)
2026-03-09T00:05:37.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:36 vm03.local podman[129655]: 2026-03-09 00:05:36.825662166 +0000 UTC m=+0.035678110 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-09T00:05:37.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:36 vm03.local podman[129655]: 2026-03-09 00:05:36.935554911 +0000 UTC m=+0.145570853 container init cafe87ec117d4a06b04edbb0a533db1e7f053ff2bb1fe4fd3a0662b859594874 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
2026-03-09T00:05:37.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:36 vm03.local podman[129655]: 2026-03-09 00:05:36.976872061 +0000 UTC m=+0.186887994 container start cafe87ec117d4a06b04edbb0a533db1e7f053ff2bb1fe4fd3a0662b859594874 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df)
2026-03-09T00:05:37.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:36 vm03.local bash[129655]: cafe87ec117d4a06b04edbb0a533db1e7f053ff2bb1fe4fd3a0662b859594874
2026-03-09T00:05:37.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:36 vm03.local systemd[1]: Started Ceph mon.vm03 for ae8f0172-1b4a-11f1-916a-712b2ac006b7.
2026-03-09T00:05:37.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: set uid:gid to 167:167 (ceph:ceph)
2026-03-09T00:05:37.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2
2026-03-09T00:05:37.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: pidfile_write: ignore empty --pid-file
2026-03-09T00:05:37.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: load: jerasure load: lrc
2026-03-09T00:05:37.341 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: RocksDB version: 7.9.2
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Git sha 0
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Compile date 2026-02-25 18:11:04
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: DB SUMMARY
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: DB Session ID: F71B36YTSR869K8BD3GA
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: CURRENT file: CURRENT
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: IDENTITY file: IDENTITY
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: MANIFEST file: MANIFEST-000015 size: 1020 Bytes
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm03/store.db dir, Total Num: 1, files: 000026.sst
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm03/store.db: 000024.log size: 13348 ;
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.create_if_missing: 0
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.paranoid_checks: 1
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.flush_verify_memtable_count: 1
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.env: 0x557ba838fdc0
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.fs: PosixFileSystem
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.info_log: 0x557ba8ded900
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_file_opening_threads: 16
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.statistics: (nil)
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.use_fsync: 0
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_log_file_size: 0
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_manifest_file_size: 1073741824
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.log_file_time_to_roll: 0
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.keep_log_file_num: 1000
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.recycle_log_file_num: 0
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.allow_fallocate: 1
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.allow_mmap_reads: 0
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.allow_mmap_writes: 0
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.use_direct_reads: 0
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.create_missing_column_families: 0
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.db_log_dir:
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.wal_dir:
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.table_cache_numshardbits: 6
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.WAL_ttl_seconds: 0
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.WAL_size_limit_MB: 0
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.manifest_preallocation_size: 4194304
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.is_fd_close_on_exec: 1
2026-03-09T00:05:37.342 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.advise_random_on_open: 1
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.db_write_buffer_size: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.write_buffer_manager: 0x557ba8df1900
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.access_hint_on_compaction_start: 1
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.random_access_max_buffer_size: 1048576
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.use_adaptive_mutex: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.rate_limiter: (nil)
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.wal_recovery_mode: 2
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.enable_thread_tracking: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.enable_pipelined_write: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.unordered_write: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.allow_concurrent_memtable_write: 1
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.write_thread_max_yield_usec: 100
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.write_thread_slow_yield_usec: 3
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.row_cache: None
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.wal_filter: None
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.avoid_flush_during_recovery: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.allow_ingest_behind: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.two_write_queues: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.manual_wal_flush: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.wal_compression: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.atomic_flush: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.persist_stats_to_disk: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.write_dbid_to_manifest: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.log_readahead_size: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.file_checksum_gen_factory: Unknown
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.best_efforts_recovery: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_bgerror_resume_count: 2147483647
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.allow_data_in_errors: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.db_host_id: __hostname__
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.enforce_single_del_contracts: true
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_background_jobs: 2
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_background_compactions: -1
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_subcompactions: 1
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.avoid_flush_during_shutdown: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.writable_file_max_buffer_size: 1048576
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.delayed_write_rate : 16777216
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_total_wal_size: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.stats_dump_period_sec: 600
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.stats_persist_period_sec: 600
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.stats_history_buffer_size: 1048576
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_open_files: -1
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.bytes_per_sync: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.wal_bytes_per_sync: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.strict_bytes_per_sync: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compaction_readahead_size: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_background_flushes: -1
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Compression algorithms supported:
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: kZSTD supported: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: kXpressCompression supported: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: kBZip2Compression supported: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: kZSTDNotFinalCompression supported: 0
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: kLZ4Compression supported: 1
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: kZlibCompression supported: 1
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: kLZ4HCCompression supported: 1
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: kSnappyCompression supported: 1
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Fast CRC32 supported: Supported on x86
2026-03-09T00:05:37.343 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: DMutex implementation: pthread_mutex_t
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm03/store.db/MANIFEST-000015
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.comparator: leveldb.BytewiseComparator
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.merge_operator:
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compaction_filter: None
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compaction_filter_factory: None
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.sst_partitioner_factory: None
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.memtable_factory: SkipListFactory
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.table_factory: BlockBasedTable
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557ba8ded580)
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: cache_index_and_filter_blocks: 1
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: cache_index_and_filter_blocks_with_high_priority: 0
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: pin_l0_filter_and_index_blocks_in_cache: 0
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: pin_top_level_index_and_filter: 1
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_type: 0
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: data_block_index_type: 0
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_shortening: 1
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: data_block_hash_table_util_ratio: 0.750000
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: checksum: 4
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: no_block_cache: 0
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache: 0x557ba8e109b0
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_name: BinnedLRUCache
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_options:
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: capacity : 536870912
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: num_shard_bits : 4
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: strict_capacity_limit : 0
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: high_pri_pool_ratio: 0.000
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_cache_compressed: (nil)
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: persistent_cache: (nil)
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_size: 4096
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_size_deviation: 10
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_restart_interval: 16
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: index_block_restart_interval: 1
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: metadata_block_size: 4096
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: partition_filters: 0
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: use_delta_encoding: 1
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: filter_policy: bloomfilter
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: whole_key_filtering: 1
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: verify_compression: 0
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: read_amp_bytes_per_bit: 0
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: format_version: 5
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: enable_index_compression: 1
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: block_align: 0
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: max_auto_readahead_size: 262144
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: prepopulate_block_cache: 0
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: initial_auto_readahead_size: 8192
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout: num_file_reads_for_auto_readahead: 2
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.write_buffer_size: 33554432
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_write_buffer_number: 2
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compression: NoCompression
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.bottommost_compression: Disabled
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.prefix_extractor: nullptr
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.num_levels: 7
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.min_write_buffer_number_to_merge: 1
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.bottommost_compression_opts.level: 32767
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.bottommost_compression_opts.strategy: 0
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
2026-03-09T00:05:37.344 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.bottommost_compression_opts.enabled: false
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compression_opts.window_bits: -14
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compression_opts.level: 32767
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compression_opts.strategy: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compression_opts.max_dict_bytes: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compression_opts.parallel_threads: 1
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compression_opts.enabled: false
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.level0_file_num_compaction_trigger: 4
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.level0_slowdown_writes_trigger: 20
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.level0_stop_writes_trigger: 36
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.target_file_size_base: 67108864
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.target_file_size_multiplier: 1
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_bytes_for_level_base: 268435456
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_sequential_skip_in_iterations: 8
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_compaction_bytes: 1677721600
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.arena_block_size: 1048576
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.disable_auto_compactions: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compaction_style: kCompactionStyleLevel
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compaction_options_universal.size_ratio: 1
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.inplace_update_support: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.inplace_update_num_locks: 10000
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.memtable_whole_key_filtering: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.memtable_huge_page_size: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.bloom_locality: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.max_successive_merges: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.optimize_filters_for_hits: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.paranoid_file_checks: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.force_consistency_checks: 1
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.report_bg_io_stats: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.ttl: 2592000
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.periodic_compaction_seconds: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.preclude_last_level_data_seconds: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.preserve_internal_time_seconds: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.enable_blob_files: false
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.min_blob_size: 0
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.blob_file_size: 268435456
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.blob_compression_type: NoCompression
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.enable_blob_garbage_collection: false
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
2026-03-09T00:05:37.345 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.blob_compaction_readahead_size: 0
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.blob_file_starting_level: 0
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm03/store.db/MANIFEST-000015 succeeded,manifest_file_number is 15, next_file_number is 28, last_sequence is 9961, log_number is 24,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 24
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 24
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: fd8aa2b9-ca97-4f1b-9ec0-1d302426f007
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773014737041719, "job": 1, "event": "recovery_started", "wal_files": [24]}
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #24 mode 2
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773014737043028, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 29, "file_size": 14075, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9962, "largest_seqno": 10006, "table_properties": {"data_size": 12831, "index_size": 82, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 133, "raw_key_size": 976, "raw_average_key_size": 28, "raw_value_size": 12042, "raw_average_value_size": 354, "num_data_blocks": 3, "num_entries": 34, "num_filter_entries": 34, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773014737, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fd8aa2b9-ca97-4f1b-9ec0-1d302426f007", "db_session_id": "F71B36YTSR869K8BD3GA", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773014737043091, "job": 1, "event": "recovery_finished"}
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: [db/version_set.cc:5047] Creating manifest 31
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm03/store.db/000024.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x557ba8e12e00
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: DB pointer 0x557ba8e22000
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: starting mon.vm03 rank 0 at public addrs [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] at bind addrs [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon_data /var/lib/ceph/mon/ceph-vm03 fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: rocksdb: [db/db_impl/db_impl.cc:1111]
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: ** DB Stats **
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Uptime(secs): 0.0 total, 0.0 interval
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: ** Compaction Stats [default] **
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: L0 1/0 13.75 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 13.4 0.00 0.00 1 0.001 0 0 0.0 0.0
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: L6 1/0 8.25 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Sum 2/0 8.26 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 13.4 0.00 0.00 1 0.001 0 0 0.0 0.0
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 13.4 0.00 0.00 1 0.001 0 0 0.0 0.0
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: ** Compaction Stats [default] **
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 13.4 0.00 0.00 1 0.001 0 0 0.0 0.0
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Uptime(secs): 0.0 total, 0.0 interval
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Flush(GB): cumulative 0.000, interval 0.000
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: AddFile(GB): cumulative 0.000, interval 0.000
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: AddFile(Total Files): cumulative 0, interval 0
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: AddFile(L0 Files): cumulative 0, interval 0
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: AddFile(Keys): cumulative 0, interval 0
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Cumulative compaction: 0.00 GB write, 1.22 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Interval compaction: 0.00 GB write, 1.22 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Block cache BinnedLRUCache@0x557ba8e109b0#2 capacity: 512.00 MB usage: 40.61 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 6e-06 secs_since: 0
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Block cache entry stats(count,size,portion): DataBlock(2,12.16 KB,0.00231862%) FilterBlock(2,8.20 KB,0.00156462%) IndexBlock(2,20.25 KB,0.00386238%) Misc(1,0.00 KB,0%)
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: ** File Read Latency Histogram By Level [default] **
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: mon.vm03@-1(???) e2 preinit fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: mon.vm03@-1(???).mds e11 new map
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: mon.vm03@-1(???).mds e11 print_map
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: e11
2026-03-09T00:05:37.346 INFO:journalctl@ceph.mon.vm03.vm03.stdout: btime 1970-01-01T00:00:00.000000+0000
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: enable_multiple, ever_enabled_multiple: 1,1
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: legacy client fscid: 1
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout:
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Filesystem 'cephfs' (1)
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: fs_name cephfs
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: epoch 11
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: created 2026-03-09T00:01:42.952984+0000
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: modified 2026-03-09T00:01:51.424075+0000
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: tableserver 0
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: root 0
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: session_timeout 60
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: session_autoclose 300
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: max_file_size 1099511627776
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: max_xattr_size 65536
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: required_client_features {}
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: last_failure 0
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: last_failure_osd_epoch 39
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: max_mds 1
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: in 0
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: up {0=14480}
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: failed
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: damaged
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: stopped
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: data_pools [3]
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: metadata_pool 2
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: inline_data enabled
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: balancer
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: bal_rank_mask -1
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: standby_count_wanted 1
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: qdb_cluster leader: 0 members:
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: [mds.cephfs.vm03.sejksk{0:14480} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout:
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout:
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: Standby daemons:
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout:
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: [mds.cephfs.vm03.ralade{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: [mds.cephfs.vm06.ixduim{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout: [mds.cephfs.vm06.vlrwtl{-1:24297} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6824/1090058295,v1:192.168.123.106:6825/1090058295] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: mon.vm03@-1(???).osd e43 crush map has features 3314933000852226048, adjusting msgr requires
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: mon.vm03@-1(???).osd e43 crush map has features 288514051259236352, adjusting msgr requires
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: mon.vm03@-1(???).osd e43 crush map has features 288514051259236352, adjusting msgr requires
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: mon.vm03@-1(???).osd e43 crush map has features 288514051259236352, adjusting msgr requires
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: mon.vm03@-1(???).paxosservice(auth 1..21) refresh upgraded, format 0 -> 3
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: mon.vm03@-1(???).mgr e0 loading version 32
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: mon.vm03@-1(???).mgr e32 active server: [v2:192.168.123.103:6800/3123605642,v1:192.168.123.103:6801/3123605642](24393)
2026-03-09T00:05:37.347 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: mon.vm03@-1(???).mgr e32 mkfs or daemon transitioned to available, loading commands
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: Upgrade: Updating mon.vm03
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: Deploying daemon mon.vm03 on vm03
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: mon.vm03 calling monitor election
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: mon.vm03 is new leader, mons vm03,vm06 in quorum (ranks 0,1)
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: monmap epoch 2
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: last_changed 2026-03-09T00:00:11.764667+0000
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: created 2026-03-08T23:58:55.232252+0000
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: min_mon_release 18 (reef)
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: election_strategy: 1
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: 0: [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon.vm03
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: 1: [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] mon.vm06
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: fsmap cephfs:1 {0=cephfs.vm03.sejksk=up:active} 3 up:standby
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: osdmap e43: 6 total, 6 up, 6 in
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: mgrmap e32: vm03.yvcons(active, since 98s), standbys: vm06.rzcvhn
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: fs cephfs has deprecated feature inline_data enabled.
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: mgrmap e33: vm03.yvcons(active, since 98s), standbys: vm06.rzcvhn
2026-03-09T00:05:38.015 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:37 vm06.local ceph-mon[58395]: from='mgr.24393 ' entity=''
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: Upgrade: Updating mon.vm03
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: Deploying daemon mon.vm03 on vm03
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: from='mgr.24393 192.168.123.103:0/3379783929' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: mon.vm03 calling monitor election
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: mon.vm03 is new leader, mons vm03,vm06 in quorum (ranks 0,1)
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: monmap epoch 2
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: last_changed 2026-03-09T00:00:11.764667+0000
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: created 2026-03-08T23:58:55.232252+0000
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: min_mon_release 18 (reef)
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: election_strategy: 1
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: 0: [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon.vm03
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: 1: [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] mon.vm06
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: fsmap cephfs:1 {0=cephfs.vm03.sejksk=up:active} 3 up:standby
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: osdmap e43: 6 total, 6 up, 6 in
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: mgrmap e32: vm03.yvcons(active, since 98s), standbys: vm06.rzcvhn
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data
2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37
vm03.local ceph-mon[129670]: fs cephfs has deprecated feature inline_data enabled. 2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: mgrmap e33: vm03.yvcons(active, since 98s), standbys: vm06.rzcvhn 2026-03-09T00:05:38.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:37 vm03.local ceph-mon[129670]: from='mgr.24393 ' entity='' 2026-03-09T00:05:43.772 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:43 vm06.local ceph-mon[58395]: Standby manager daemon vm06.rzcvhn restarted 2026-03-09T00:05:43.772 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:43 vm06.local ceph-mon[58395]: Standby manager daemon vm06.rzcvhn started 2026-03-09T00:05:43.772 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:43 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.106:0/1396575939' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/crt"}]: dispatch 2026-03-09T00:05:43.772 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:43 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.106:0/1396575939' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T00:05:43.772 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:43 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.106:0/1396575939' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/key"}]: dispatch 2026-03-09T00:05:43.772 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:43 vm06.local ceph-mon[58395]: from='mgr.? 192.168.123.106:0/1396575939' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T00:05:43.836 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:43 vm03.local ceph-mon[129670]: Standby manager daemon vm06.rzcvhn restarted 2026-03-09T00:05:43.836 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:43 vm03.local ceph-mon[129670]: Standby manager daemon vm06.rzcvhn started 2026-03-09T00:05:43.836 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:43 vm03.local ceph-mon[129670]: from='mgr.? 192.168.123.106:0/1396575939' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/crt"}]: dispatch 2026-03-09T00:05:43.836 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:43 vm03.local ceph-mon[129670]: from='mgr.? 192.168.123.106:0/1396575939' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-09T00:05:43.836 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:43 vm03.local ceph-mon[129670]: from='mgr.? 192.168.123.106:0/1396575939' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm06.rzcvhn/key"}]: dispatch 2026-03-09T00:05:43.836 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:43 vm03.local ceph-mon[129670]: from='mgr.? 
192.168.123.106:0/1396575939' entity='mgr.vm06.rzcvhn' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-09T00:05:44.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: Active manager daemon vm03.yvcons restarted 2026-03-09T00:05:44.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: Activating manager daemon vm03.yvcons 2026-03-09T00:05:44.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: mgrmap e34: vm03.yvcons(active, since 104s), standbys: vm06.rzcvhn 2026-03-09T00:05:44.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: osdmap e44: 6 total, 6 up, 6 in 2026-03-09T00:05:44.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: mgrmap e35: vm03.yvcons(active, starting, since 0.0249892s), standbys: vm06.rzcvhn 2026-03-09T00:05:44.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T00:05:44.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T00:05:44.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.sejksk"}]: dispatch 2026-03-09T00:05:44.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.ralade"}]: dispatch 2026-03-09T00:05:44.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.ixduim"}]: dispatch 2026-03-09T00:05:44.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vlrwtl"}]: dispatch 2026-03-09T00:05:44.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr metadata", "who": "vm03.yvcons", "id": "vm03.yvcons"}]: dispatch 2026-03-09T00:05:44.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr metadata", "who": "vm06.rzcvhn", "id": "vm06.rzcvhn"}]: dispatch 2026-03-09T00:05:44.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T00:05:44.841 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T00:05:44.842 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local 
ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T00:05:44.842 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:05:44.842 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:05:44.842 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:05:44.842 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T00:05:44.842 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T00:05:44.842 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T00:05:44.842 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: Manager daemon vm03.yvcons is now available 2026-03-09T00:05:44.842 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:05:44.842 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.yvcons/mirror_snapshot_schedule"}]: dispatch 2026-03-09T00:05:44.842 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:05:44.842 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.yvcons/trash_purge_schedule"}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: Active manager daemon vm03.yvcons restarted 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: Activating manager daemon vm03.yvcons 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: mgrmap e34: vm03.yvcons(active, since 104s), standbys: vm06.rzcvhn 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: osdmap e44: 6 total, 6 up, 6 in 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: mgrmap e35: vm03.yvcons(active, starting, since 0.0249892s), standbys: 
vm06.rzcvhn 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.sejksk"}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.ralade"}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.ixduim"}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vlrwtl"}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr metadata", "who": "vm03.yvcons", "id": "vm03.yvcons"}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr metadata", "who": "vm06.rzcvhn", "id": "vm06.rzcvhn"}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: 
from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: Manager daemon vm03.yvcons is now available 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.yvcons/mirror_snapshot_schedule"}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:05:44.861 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:44 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm03.yvcons/trash_purge_schedule"}]: dispatch 2026-03-09T00:05:45.741 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:45 vm03.local ceph-mon[129670]: mgrmap e36: vm03.yvcons(active, since 1.01675s), standbys: vm06.rzcvhn 2026-03-09T00:05:45.742 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:45 vm03.local ceph-mon[129670]: pgmap v3: 65 pgs: 65 active+clean; 282 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail 2026-03-09T00:05:45.923 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:45 vm06.local ceph-mon[58395]: mgrmap e36: vm03.yvcons(active, since 1.01675s), standbys: vm06.rzcvhn 2026-03-09T00:05:45.923 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:45 vm06.local ceph-mon[58395]: pgmap v3: 65 pgs: 65 active+clean; 282 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail 2026-03-09T00:05:46.827 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:46 vm06.local ceph-mon[58395]: pgmap v4: 65 pgs: 65 active+clean; 282 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail 2026-03-09T00:05:46.827 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:46 vm06.local ceph-mon[58395]: mgrmap e37: vm03.yvcons(active, since 2s), standbys: vm06.rzcvhn 2026-03-09T00:05:46.827 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:46 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:46.827 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:46 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:46.827 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:46 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:46.827 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:46 vm06.local 
ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:46.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:46 vm03.local ceph-mon[129670]: pgmap v4: 65 pgs: 65 active+clean; 282 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail 2026-03-09T00:05:46.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:46 vm03.local ceph-mon[129670]: mgrmap e37: vm03.yvcons(active, since 2s), standbys: vm06.rzcvhn 2026-03-09T00:05:46.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:46 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:46.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:46 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:46.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:46 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:46.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:46 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:47.828 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:47 vm03.local ceph-mon[129670]: [09/Mar/2026:00:05:46] ENGINE Bus STARTING 2026-03-09T00:05:47.828 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:47 vm03.local ceph-mon[129670]: [09/Mar/2026:00:05:46] ENGINE Serving on https://192.168.123.103:7150 2026-03-09T00:05:47.828 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:47 vm03.local ceph-mon[129670]: [09/Mar/2026:00:05:46] ENGINE Client ('192.168.123.103', 33296) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T00:05:47.828 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:47 vm03.local ceph-mon[129670]: [09/Mar/2026:00:05:46] ENGINE Serving on http://192.168.123.103:8765 2026-03-09T00:05:47.828 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:47 vm03.local ceph-mon[129670]: [09/Mar/2026:00:05:46] ENGINE Bus STARTED 2026-03-09T00:05:47.828 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:47 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:47.828 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:47 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:47.828 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:47 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:47.828 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:47 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:47.828 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:47 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T00:05:47.828 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:47 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:47.828 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:47 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:47.921 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:47 vm06.local ceph-mon[58395]: [09/Mar/2026:00:05:46] ENGINE Bus STARTING 2026-03-09T00:05:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:47 vm06.local ceph-mon[58395]: [09/Mar/2026:00:05:46] ENGINE Serving on https://192.168.123.103:7150 2026-03-09T00:05:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:47 vm06.local ceph-mon[58395]: [09/Mar/2026:00:05:46] ENGINE Client ('192.168.123.103', 33296) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-09T00:05:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:47 vm06.local ceph-mon[58395]: [09/Mar/2026:00:05:46] ENGINE Serving on http://192.168.123.103:8765 2026-03-09T00:05:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:47 vm06.local ceph-mon[58395]: [09/Mar/2026:00:05:46] ENGINE Bus STARTED 2026-03-09T00:05:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:47 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:47 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:47 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:47 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:47 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm06", "name": "osd_memory_target"}]: dispatch 2026-03-09T00:05:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:47 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:47 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:48.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:48 vm06.local ceph-mon[58395]: Detected new or changed devices on vm06 2026-03-09T00:05:48.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:48 vm06.local ceph-mon[58395]: pgmap v5: 65 pgs: 65 active+clean; 282 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail 2026-03-09T00:05:48.644 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:48 vm06.local ceph-mon[58395]: mgrmap e38: vm03.yvcons(active, since 4s), standbys: vm06.rzcvhn 2026-03-09T00:05:48.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:48 vm03.local ceph-mon[129670]: Detected new or changed devices on vm06 2026-03-09T00:05:48.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:48 vm03.local ceph-mon[129670]: pgmap v5: 65 pgs: 65 active+clean; 282 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail 2026-03-09T00:05:48.890 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:48 vm03.local ceph-mon[129670]: mgrmap e38: vm03.yvcons(active, since 4s), standbys: vm06.rzcvhn 2026-03-09T00:05:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:50 vm06.local ceph-mon[58395]: Detected new or changed devices on vm03 2026-03-09T00:05:50.421 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:50 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:50 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:50 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T00:05:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:50 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:05:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:50 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:05:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:50 vm06.local ceph-mon[58395]: Updating vm03:/etc/ceph/ceph.conf 2026-03-09T00:05:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:50 vm06.local ceph-mon[58395]: Updating vm06:/etc/ceph/ceph.conf 2026-03-09T00:05:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:50 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:50 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:50.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:50 vm03.local ceph-mon[129670]: Detected new or changed devices on vm03 2026-03-09T00:05:50.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:50 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:50.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:50 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:50.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:50 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "who": "osd/host:vm03", "name": "osd_memory_target"}]: dispatch 2026-03-09T00:05:50.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:50 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:05:50.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:50 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:05:50.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:50 vm03.local ceph-mon[129670]: Updating vm03:/etc/ceph/ceph.conf 2026-03-09T00:05:50.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:50 vm03.local ceph-mon[129670]: Updating vm06:/etc/ceph/ceph.conf 2026-03-09T00:05:50.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:50 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:50.588 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:50 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:51.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: Updating vm03:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf 2026-03-09T00:05:51.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: Updating vm06:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf 2026-03-09T00:05:51.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: Updating vm03:/etc/ceph/ceph.client.admin.keyring 2026-03-09T00:05:51.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: Updating vm06:/etc/ceph/ceph.client.admin.keyring 2026-03-09T00:05:51.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: Updating vm03:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.client.admin.keyring 2026-03-09T00:05:51.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: Updating vm06:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.client.admin.keyring 2026-03-09T00:05:51.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: pgmap v6: 65 pgs: 65 active+clean; 282 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail 2026-03-09T00:05:51.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:51.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:51.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:51.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:05:51.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:51.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-09T00:05:51.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: Upgrade: Updating mon.vm06 2026-03-09T00:05:51.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:51.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T00:05:51.299 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T00:05:51.300 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:05:51.300 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[58395]: Deploying daemon mon.vm06 on vm06 2026-03-09T00:05:51.300 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local systemd[1]: Stopping Ceph mon.vm06 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: Updating vm03:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: Updating vm06:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.conf 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: Updating vm03:/etc/ceph/ceph.client.admin.keyring 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: Updating vm06:/etc/ceph/ceph.client.admin.keyring 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: Updating vm03:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.client.admin.keyring 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: Updating vm06:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/config/ceph.client.admin.keyring 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: pgmap v6: 65 pgs: 65 active+clean; 282 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: Upgrade: Updating mon.vm06 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: from='mgr.34104 
192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:05:51.490 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:51 vm03.local ceph-mon[129670]: Deploying daemon mon.vm06 on vm06 2026-03-09T00:05:51.582 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm06[58391]: 2026-03-09T00:05:51.373+0000 7f7209e0f700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm06 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T00:05:51.582 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm06[58391]: 2026-03-09T00:05:51.373+0000 7f7209e0f700 -1 mon.vm06@1(peon) e2 *** Got Signal Terminated *** 2026-03-09T00:05:51.582 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local podman[106097]: 2026-03-09 00:05:51.399464417 +0000 UTC m=+0.041524037 container died 1e39c7ad3e9f412291db107cf7ec6a88747158c750861d4a3f0783943c1340fa (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm06, CEPH_POINT_RELEASE=-18.2.1, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, GIT_CLEAN=True, maintainer=Guillaume Abrioux , org.label-schema.build-date=20240222, RELEASE=HEAD, io.buildah.version=1.29.1, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0) 2026-03-09T00:05:51.582 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local podman[106097]: 2026-03-09 00:05:51.42672038 +0000 UTC m=+0.068780000 container remove 1e39c7ad3e9f412291db107cf7ec6a88747158c750861d4a3f0783943c1340fa (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm06, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.29.1, org.label-schema.vendor=CentOS, GIT_CLEAN=True, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.build-date=20240222, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.1, GIT_BRANCH=HEAD) 2026-03-09T00:05:51.582 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local bash[106097]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm06 2026-03-09T00:05:51.582 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mon.vm06.service: 
Deactivated successfully. 2026-03-09T00:05:51.582 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local systemd[1]: Stopped Ceph mon.vm06 for ae8f0172-1b4a-11f1-916a-712b2ac006b7. 2026-03-09T00:05:51.582 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mon.vm06.service: Consumed 3.831s CPU time. 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local systemd[1]: Starting Ceph mon.vm06 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local podman[106204]: 2026-03-09 00:05:51.803134499 +0000 UTC m=+0.022159004 container create 33df752aa193f37075d1e20764e59635f39a4d5e5274aa1c1bde4c7a8d1d9e3d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm06, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.build-date=20260223, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local podman[106204]: 2026-03-09 00:05:51.853233013 +0000 UTC m=+0.072257528 container init 33df752aa193f37075d1e20764e59635f39a4d5e5274aa1c1bde4c7a8d1d9e3d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm06, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local podman[106204]: 2026-03-09 00:05:51.856717846 +0000 UTC m=+0.075742361 container start 33df752aa193f37075d1e20764e59635f39a4d5e5274aa1c1bde4c7a8d1d9e3d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm06, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True) 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local bash[106204]: 33df752aa193f37075d1e20764e59635f39a4d5e5274aa1c1bde4c7a8d1d9e3d 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local podman[106204]: 2026-03-09 00:05:51.796000536 +0000 UTC m=+0.015025042 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local systemd[1]: Started Ceph mon.vm06 for ae8f0172-1b4a-11f1-916a-712b2ac006b7. 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: set uid:gid to 167:167 (ceph:ceph) 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: pidfile_write: ignore empty --pid-file 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: load: jerasure load: lrc 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: RocksDB version: 7.9.2 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Git sha 0 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Compile date 2026-02-25 18:11:04 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: DB SUMMARY 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: DB Session ID: 2DRREOBHKAXDV6Q9G523 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: CURRENT file: CURRENT 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: IDENTITY file: IDENTITY 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: MANIFEST file: MANIFEST-000010 size: 913 Bytes 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm06/store.db dir, Total Num: 1, files: 000021.sst 2026-03-09T00:05:52.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm06/store.db: 000019.log size: 3277203 ; 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.error_if_exists: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.create_if_missing: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.paranoid_checks: 1 2026-03-09T00:05:52.173 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.env: 0x55a1ff1b8dc0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.fs: PosixFileSystem 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.info_log: 0x55a2005ad900 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_file_opening_threads: 16 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.statistics: (nil) 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.use_fsync: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_log_file_size: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.keep_log_file_num: 1000 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.recycle_log_file_num: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.allow_fallocate: 1 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.allow_mmap_reads: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.allow_mmap_writes: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.use_direct_reads: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.create_missing_column_families: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.db_log_dir: 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.wal_dir: 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 
09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.advise_random_on_open: 1 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.db_write_buffer_size: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.write_buffer_manager: 0x55a2005b1900 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.rate_limiter: (nil) 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.wal_recovery_mode: 2 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.enable_thread_tracking: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.enable_pipelined_write: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.unordered_write: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.row_cache: None 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.wal_filter: None 2026-03-09T00:05:52.173 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.avoid_flush_during_recovery: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.allow_ingest_behind: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.two_write_queues: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.manual_wal_flush: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.wal_compression: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.atomic_flush: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.log_readahead_size: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.best_efforts_recovery: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.allow_data_in_errors: 0 2026-03-09T00:05:52.173 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.db_host_id: __hostname__ 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_background_jobs: 2 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_background_compactions: -1 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_subcompactions: 1 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.delayed_write_rate : 16777216 
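The run of journal entries above is the monitor's RocksDB DBOptions dump, emitted once at DB open as one "rocksdb: Options.<name>: <value>" pair per entry. A minimal standard-library sketch for pulling those pairs out of a journalctl capture so the option sets of two runs can be diffed (the regex and function name are illustrative, not part of the test harness):

    import re
    import sys

    # Matches the "rocksdb: Options.<name>: <value>" fragments embedded in the
    # journal lines; tolerates the stray space in "delayed_write_rate : ..." and
    # indexed names like max_bytes_for_level_multiplier_addtl[0].
    OPT_RE = re.compile(r"rocksdb: Options\.([A-Za-z0-9_.\[\]]+)\s*:\s*(\S+)")

    def parse_rocksdb_options(text):
        """Return {option name: value string} for every pair found in text."""
        return dict(OPT_RE.findall(text))

    if __name__ == "__main__":
        for name, value in sorted(parse_rocksdb_options(sys.stdin.read()).items()):
            print(name, "=", value)

Options dumped with empty values (db_log_dir and wal_dir above) will pick up the following token, so a real tool would anchor on entry boundaries; for eyeballing a config diff between runs the sketch is enough.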
2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_total_wal_size: 0 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_open_files: -1 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.bytes_per_sync: 0 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compaction_readahead_size: 0 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_background_flushes: -1 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Compression algorithms supported: 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: kZSTD supported: 0 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: kXpressCompression supported: 0 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: kBZip2Compression supported: 0 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: kLZ4Compression supported: 1 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: kZlibCompression supported: 1 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: kLZ4HCCompression supported: 1 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: kSnappyCompression supported: 1 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: [db/version_set.cc:5527] Recovering from manifest 
file: /var/lib/ceph/mon/ceph-vm06/store.db/MANIFEST-000010 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.merge_operator: 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compaction_filter: None 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compaction_filter_factory: None 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.sst_partitioner_factory: None 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a2005ad580) 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: cache_index_and_filter_blocks: 1 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: pin_top_level_index_and_filter: 1 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: index_type: 0 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: data_block_index_type: 0 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: index_shortening: 1 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: checksum: 4 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: no_block_cache: 0 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_cache: 0x55a2005d09b0 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_cache_name: BinnedLRUCache 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_cache_options: 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: capacity : 536870912 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: num_shard_bits : 4 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: strict_capacity_limit : 0 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: high_pri_pool_ratio: 0.000 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_cache_compressed: (nil) 2026-03-09T00:05:52.174 INFO:journalctl@ceph.mon.vm06.vm06.stdout: persistent_cache: (nil) 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_size: 4096 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_size_deviation: 10 
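The table_factory section just dumped describes the mon's block cache: a BinnedLRUCache with capacity 536870912 bytes and num_shard_bits 4. RocksDB's LRU caches are partitioned into 2^num_shard_bits independently locked shards, so the implied layout can be checked with a few lines (values copied from the log above):

    capacity = 536_870_912        # "capacity : 536870912" above, i.e. 512 MiB
    num_shard_bits = 4            # "num_shard_bits : 4"
    shards = 2 ** num_shard_bits  # cache is split into 2^4 = 16 shards
    per_shard = capacity // shards
    print(f"{shards} shards x {per_shard / 2**20:.0f} MiB")  # 16 shards x 32 MiB

So this mon is running a 512 MiB block cache split 16 ways, with index and filter blocks cached inside it (cache_index_and_filter_blocks: 1 above).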
2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_restart_interval: 16 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: index_block_restart_interval: 1 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: metadata_block_size: 4096 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: partition_filters: 0 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: use_delta_encoding: 1 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: filter_policy: bloomfilter 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: whole_key_filtering: 1 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: verify_compression: 0 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: read_amp_bytes_per_bit: 0 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: format_version: 5 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: enable_index_compression: 1 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: block_align: 0 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: max_auto_readahead_size: 262144 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: prepopulate_block_cache: 0 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: initial_auto_readahead_size: 8192 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout: num_file_reads_for_auto_readahead: 2 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.write_buffer_size: 33554432 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_write_buffer_number: 2 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compression: NoCompression 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.bottommost_compression: Disabled 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.prefix_extractor: nullptr 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.num_levels: 7 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.bottommost_compression_opts.level: 32767 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: 
Options.bottommost_compression_opts.strategy: 0 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compression_opts.level: 32767 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compression_opts.strategy: 0 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compression_opts.enabled: false 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.target_file_size_base: 67108864 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_bytes_for_level_base: 
268435456 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.arena_block_size: 1048576 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-09T00:05:52.175 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.disable_auto_compactions: 0 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: 
Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.inplace_update_support: 0 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.bloom_locality: 0 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.max_successive_merges: 0 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.paranoid_file_checks: 0 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.force_consistency_checks: 1 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.report_bg_io_stats: 0 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.ttl: 2592000 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: 
rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.enable_blob_files: false 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.min_blob_size: 0 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.blob_file_size: 268435456 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.blob_file_starting_level: 0 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
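The column-family options above carry the LSM sizing knobs: num_levels 7, max_bytes_for_level_base 268435456, and max_bytes_for_level_multiplier 10. The dump also shows level_compaction_dynamic_level_bytes: 1, which rescales level targets at runtime from the last level backwards (and is why the "More existing levels in DB than needed" warning can appear); ignoring that adjustment, the nominal static targets would be:

    base = 268_435_456   # max_bytes_for_level_base (256 MiB), from the dump above
    mult = 10.0          # max_bytes_for_level_multiplier
    levels = 7           # num_levels

    # Nominal static sizing only; level_compaction_dynamic_level_bytes=1 (as
    # logged above) redistributes these targets at runtime.
    for level in range(1, levels):
        print(f"L{level} target: {base * mult ** (level - 1) / 2**30:.2f} GiB")

i.e. 0.25 GiB at L1 growing tenfold per level, far beyond what a mon store ever fills (the stats below show about 10 MB across L0 and L6).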
2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm06/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 23, last_sequence is 10092, log_number is 19,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 19 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 19 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 34134aeb-a2ae-46ac-b465-2759b37b5985 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773014751890312, "job": 1, "event": "recovery_started", "wal_files": [19]} 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #19 mode 2 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773014751899843, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 24, "file_size": 2071745, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10097, "largest_seqno": 10564, "table_properties": {"data_size": 2068353, "index_size": 1711, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 645, "raw_key_size": 6179, "raw_average_key_size": 25, "raw_value_size": 2062706, "raw_average_value_size": 8384, "num_data_blocks": 78, "num_entries": 246, "num_filter_entries": 246, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773014751, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "34134aeb-a2ae-46ac-b465-2759b37b5985", "db_session_id": "2DRREOBHKAXDV6Q9G523", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}} 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773014751899982, "job": 1, "event": "recovery_finished"} 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: [db/version_set.cc:5047] Creating manifest 26 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
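The recovery lines above embed structured EVENT_LOG_v1 payloads (recovery_started, table_file_creation with full table_properties, recovery_finished) as plain JSON after a fixed prefix. Because the payloads nest objects, a raw_decode-based extractor is more robust than a regex; a small sketch (names are illustrative):

    import json

    _DEC = json.JSONDecoder()
    _PREFIX = "EVENT_LOG_v1 "

    def iter_events(text):
        """Yield every EVENT_LOG_v1 JSON object embedded in a log capture."""
        pos = text.find(_PREFIX)
        while pos != -1:
            # raw_decode parses exactly one JSON value starting at the given
            # index and reports where it ended, so nested table_properties
            # objects are handled without bracket counting.
            obj, _end = _DEC.raw_decode(text, pos + len(_PREFIX))
            yield obj
            pos = text.find(_PREFIX, pos + 1)

    # e.g. SST files written while replaying WAL #19:
    # [e["file_number"] for e in iter_events(log)
    #  if e.get("event") == "table_file_creation"]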
2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm06/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55a2005d2e00 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: DB pointer 0x55a2005e2000 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** DB Stats ** 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** Compaction Stats [default] ** 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-09T00:05:52.176 INFO:journalctl@ceph.mon.vm06.vm06.stdout: L0 1/0 1.98 MB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 367.9 0.01 0.00 1 0.005 0 0 0.0 0.0 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: L6 1/0 8.25 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Sum 2/0 10.22 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 367.9 0.01 0.00 1 0.005 0 0 0.0 0.0 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 367.9 0.01 0.00 1 0.005 0 0 0.0 0.0 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** Compaction Stats [default] ** 2026-03-09T00:05:52.177 
INFO:journalctl@ceph.mon.vm06.vm06.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 367.9 0.01 0.00 1 0.005 0 0 0.0 0.0 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Flush(GB): cumulative 0.002, interval 0.002 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Cumulative compaction: 0.00 GB write, 127.74 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Interval compaction: 0.00 GB write, 127.74 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Block cache BinnedLRUCache@0x55a2005d09b0#2 capacity: 512.00 MB usage: 2.55 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 9e-06 secs_since: 0 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Block cache entry stats(count,size,portion): FilterBlock(1,0.72 KB,0.000137091%) IndexBlock(1,1.83 KB,0.000348687%) Misc(1,0.00 KB,0%) 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: starting mon.vm06 rank 1 at public addrs [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] at bind addrs [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] mon_data /var/lib/ceph/mon/ceph-vm06 fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: mon.vm06@-1(???) 
e2 preinit fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: mon.vm06@-1(???).mds e11 new map 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: mon.vm06@-1(???).mds e11 print_map 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: e11 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: btime 1970-01-01T00:00:00:000000+0000 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: legacy client fscid: 1 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Filesystem 'cephfs' (1) 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: fs_name cephfs 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: epoch 11 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: created 2026-03-09T00:01:42.952984+0000 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: modified 2026-03-09T00:01:51.424075+0000 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: tableserver 0 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: root 0 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: session_timeout 60 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: session_autoclose 300 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: max_file_size 1099511627776 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: max_xattr_size 65536 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: required_client_features {} 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: last_failure 0 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: last_failure_osd_epoch 39 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: max_mds 1 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: in 0 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: up {0=14480} 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: failed 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: damaged 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: stopped 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: data_pools [3] 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: metadata_pool 2 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: inline_data enabled 2026-03-09T00:05:52.177 
INFO:journalctl@ceph.mon.vm06.vm06.stdout: balancer 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: bal_rank_mask -1 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: standby_count_wanted 1 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: qdb_cluster leader: 0 members: 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: [mds.cephfs.vm03.sejksk{0:14480} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T00:05:52.177 INFO:journalctl@ceph.mon.vm06.vm06.stdout: Standby daemons: 2026-03-09T00:05:52.178 INFO:journalctl@ceph.mon.vm06.vm06.stdout: 2026-03-09T00:05:52.178 INFO:journalctl@ceph.mon.vm06.vm06.stdout: [mds.cephfs.vm03.ralade{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T00:05:52.178 INFO:journalctl@ceph.mon.vm06.vm06.stdout: [mds.cephfs.vm06.ixduim{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T00:05:52.178 INFO:journalctl@ceph.mon.vm06.vm06.stdout: [mds.cephfs.vm06.vlrwtl{-1:24297} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6824/1090058295,v1:192.168.123.106:6825/1090058295] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T00:05:52.178 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: mon.vm06@-1(???).osd e44 crush map has features 3314933000852226048, adjusting msgr requires 2026-03-09T00:05:52.178 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: mon.vm06@-1(???).osd e44 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T00:05:52.178 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: mon.vm06@-1(???).osd e44 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T00:05:52.178 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: mon.vm06@-1(???).osd e44 crush map has features 288514051259236352, adjusting msgr requires 2026-03-09T00:05:52.178 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:51 vm06.local ceph-mon[106218]: mon.vm06@-1(???).paxosservice(auth 1..22) refresh upgraded, format 0 -> 3 2026-03-09T00:05:53.310 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm03"}]: dispatch 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: mon.vm06 calling monitor election 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: 
mon.vm03 calling monitor election 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: mon.vm03 is new leader, mons vm03,vm06 in quorum (ranks 0,1) 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: monmap epoch 3 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: last_changed 2026-03-09T00:05:52.289734+0000 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: created 2026-03-08T23:58:55.232252+0000 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: min_mon_release 19 (squid) 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: election_strategy: 1 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: 0: [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon.vm03 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: 1: [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] mon.vm06 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: fsmap cephfs:1 {0=cephfs.vm03.sejksk=up:active} 3 up:standby 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: osdmap e44: 6 total, 6 up, 6 in 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: mgrmap e38: vm03.yvcons(active, since 8s), standbys: vm06.rzcvhn 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: fs cephfs has deprecated feature inline_data enabled. 
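At this point mon.vm06 has rejoined, both monitors call an election, and mon.vm03 becomes leader with vm03,vm06 in quorum (ranks 0,1). The same state can be confirmed from the cluster rather than by reading journals; a hedged sketch shelling out to the standard quorum_status mon command:

    import json
    import subprocess

    def quorum():
        """Return (leader, members) as reported by `ceph quorum_status`."""
        out = subprocess.run(
            ["ceph", "quorum_status", "--format", "json"],
            check=True, capture_output=True, text=True,
        ).stdout
        status = json.loads(out)
        return status["quorum_leader_name"], status["quorum_names"]

    # Expected for the state logged above: ("vm03", ["vm03", "vm06"])

In a run like this the check would go through the same admin channel as the other orchestrator probes on host.a rather than a bare `ceph` binary.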
2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:53.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:53 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mon metadata", "id": "vm06"}]: dispatch 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: mon.vm06 calling monitor election 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: mon.vm03 calling monitor election 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: mon.vm03 is new leader, mons vm03,vm06 in quorum (ranks 0,1) 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: monmap epoch 3 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: last_changed 2026-03-09T00:05:52.289734+0000 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: created 2026-03-08T23:58:55.232252+0000 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: min_mon_release 19 (squid) 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: election_strategy: 1 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: 0: [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon.vm03 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: 1: [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] mon.vm06 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: fsmap cephfs:1 {0=cephfs.vm03.sejksk=up:active} 3 up:standby 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: osdmap e44: 6 total, 6 up, 6 in 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: mgrmap e38: vm03.yvcons(active, since 8s), standbys: vm06.rzcvhn 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: fs cephfs has deprecated feature inline_data enabled. 
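Both monitors now report HEALTH_WARN with FS_INLINE_DATA_DEPRECATED. That is expected here: inline_data was switched on deliberately earlier in the job, and warnings of this kind are tolerated via the job's log-ignorelist rather than failing the run. A toy sketch of that filtering idea (not teuthology's actual implementation, whose matching logic is more involved):

    import re

    # A few representative ignorelist patterns for this kind of job.
    IGNORELIST = [
        r"FS_INLINE_DATA_DEPRECATED",
        r"deprecated feature inline_data",
        r"POOL_APP_NOT_ENABLED",
    ]

    def is_ignored(line):
        """True if a health/cluster log line matches an ignorelisted pattern."""
        return any(re.search(pat, line) for pat in IGNORELIST)

    warn = "[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data"
    assert is_ignored(warn)  # this warning alone must not fail the run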
2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:53.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:53 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:05:54.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:54 vm06.local ceph-mon[106218]: pgmap v8: 65 pgs: 65 active+clean; 282 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 565 KiB/s wr, 216 op/s 2026-03-09T00:05:54.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:54 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:54.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:54 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:54.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:54 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:54.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:54 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:55.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:54 vm03.local ceph-mon[129670]: pgmap v8: 65 pgs: 65 active+clean; 282 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 565 KiB/s wr, 216 op/s 2026-03-09T00:05:55.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:54 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:55.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:54 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:55.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:54 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:55.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:54 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:56.568 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:56 vm03.local ceph-mon[129670]: pgmap v9: 65 pgs: 65 active+clean; 273 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 1.0 MiB/s wr, 582 op/s 2026-03-09T00:05:56.568 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:56 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:56.568 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:56 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:56.568 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:56 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:05:56.568 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:56 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 
cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:05:56.568 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:56 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:56.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:56 vm06.local ceph-mon[106218]: pgmap v9: 65 pgs: 65 active+clean; 273 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 37 KiB/s rd, 1.0 MiB/s wr, 582 op/s 2026-03-09T00:05:56.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:56 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:56.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:56 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:56.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:56 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:05:56.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:56 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:05:56.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:56 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: Reconfiguring mon.vm03 (monmap changed)... 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: Reconfiguring daemon mon.vm03 on vm03 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: Reconfiguring mgr.vm03.yvcons (monmap changed)... 
2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.yvcons", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: Reconfiguring daemon mgr.vm03.yvcons on vm03 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm03"}]: dispatch 2026-03-09T00:05:57.524 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: Reconfiguring mon.vm03 (monmap changed)... 
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: Reconfiguring daemon mon.vm03 on vm03
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: Reconfiguring mgr.vm03.yvcons (monmap changed)...
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm03.yvcons", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: Reconfiguring daemon mgr.vm03.yvcons on vm03
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm03"}]: dispatch
2026-03-09T00:05:57.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: Reconfiguring ceph-exporter.vm03 (monmap changed)...
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: Unable to update caps for client.ceph-exporter.vm03
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: Reconfiguring daemon ceph-exporter.vm03 on vm03
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: Reconfiguring crash.vm03 (monmap changed)...
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: Reconfiguring daemon crash.vm03 on vm03
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: pgmap v10: 65 pgs: 65 active+clean; 273 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 935 KiB/s wr, 529 op/s
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: Reconfiguring osd.0 (monmap changed)...
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: Reconfiguring daemon osd.0 on vm03
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
2026-03-09T00:05:58.569 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: Reconfiguring ceph-exporter.vm03 (monmap changed)...
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: Unable to update caps for client.ceph-exporter.vm03
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: Reconfiguring daemon ceph-exporter.vm03 on vm03
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: Reconfiguring crash.vm03 (monmap changed)...
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: Reconfiguring daemon crash.vm03 on vm03
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: pgmap v10: 65 pgs: 65 active+clean; 273 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 935 KiB/s wr, 529 op/s
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: Reconfiguring osd.0 (monmap changed)...
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: Reconfiguring daemon osd.0 on vm03
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
2026-03-09T00:05:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: Reconfiguring osd.1 (monmap changed)...
2026-03-09T00:05:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: Reconfiguring daemon osd.1 on vm03
2026-03-09T00:05:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: Reconfiguring osd.2 (monmap changed)...
2026-03-09T00:05:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
2026-03-09T00:05:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: Reconfiguring daemon osd.2 on vm03
2026-03-09T00:05:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:05:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: Reconfiguring mds.cephfs.vm03.sejksk (monmap changed)...
2026-03-09T00:05:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.sejksk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T00:05:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: Reconfiguring daemon mds.cephfs.vm03.sejksk on vm03
2026-03-09T00:05:59.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:59.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:59.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.ralade", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T00:05:59.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:05:59 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:59.727 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: Reconfiguring osd.1 (monmap changed)...
2026-03-09T00:05:59.727 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: Reconfiguring daemon osd.1 on vm03
2026-03-09T00:05:59.727 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:59.727 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:59.727 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: Reconfiguring osd.2 (monmap changed)...
2026-03-09T00:05:59.728 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
2026-03-09T00:05:59.728 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:59.728 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: Reconfiguring daemon osd.2 on vm03
2026-03-09T00:05:59.728 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:05:59.728 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:59.728 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:59.728 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: Reconfiguring mds.cephfs.vm03.sejksk (monmap changed)...
2026-03-09T00:05:59.728 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.sejksk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T00:05:59.728 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:05:59.728 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: Reconfiguring daemon mds.cephfs.vm03.sejksk on vm03
2026-03-09T00:05:59.728 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:59.728 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:05:59.728 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.ralade", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T00:05:59.728 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:05:59 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: Reconfiguring mds.cephfs.vm03.ralade (monmap changed)...
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: Reconfiguring daemon mds.cephfs.vm03.ralade on vm03
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: Reconfiguring ceph-exporter.vm06 (monmap changed)...
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: Unable to update caps for client.ceph-exporter.vm06
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm06"}]: dispatch
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: Reconfiguring daemon ceph-exporter.vm06 on vm06
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: pgmap v11: 65 pgs: 65 active+clean; 273 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 935 KiB/s wr, 529 op/s
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: Reconfiguring crash.vm06 (monmap changed)...
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: Reconfiguring daemon crash.vm06 on vm06
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.rzcvhn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T00:06:00.729 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: Reconfiguring mds.cephfs.vm03.ralade (monmap changed)...
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: Reconfiguring daemon mds.cephfs.vm03.ralade on vm03
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: Reconfiguring ceph-exporter.vm06 (monmap changed)...
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: Unable to update caps for client.ceph-exporter.vm06
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm06"}]: dispatch
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: Reconfiguring daemon ceph-exporter.vm06 on vm06
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: pgmap v11: 65 pgs: 65 active+clean; 273 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 935 KiB/s wr, 529 op/s
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: Reconfiguring crash.vm06 (monmap changed)...
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: Reconfiguring daemon crash.vm06 on vm06
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm06.rzcvhn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-09T00:06:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: Reconfiguring mgr.vm06.rzcvhn (monmap changed)...
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: Reconfiguring daemon mgr.vm06.rzcvhn on vm06
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: Reconfiguring mon.vm06 (monmap changed)...
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: Reconfiguring daemon mon.vm06 on vm06
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: Reconfiguring osd.3 (monmap changed)...
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: Reconfiguring daemon osd.3 on vm06
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch
2026-03-09T00:06:01.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:01 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: Reconfiguring mgr.vm06.rzcvhn (monmap changed)...
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: Reconfiguring daemon mgr.vm06.rzcvhn on vm06
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: Reconfiguring mon.vm06 (monmap changed)...
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: Reconfiguring daemon mon.vm06 on vm06
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: Reconfiguring osd.3 (monmap changed)...
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: Reconfiguring daemon osd.3 on vm06
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch
2026-03-09T00:06:01.870 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:01 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:02.633 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:02 vm06.local ceph-mon[106218]: Reconfiguring osd.4 (monmap changed)...
2026-03-09T00:06:02.633 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:02 vm06.local ceph-mon[106218]: Reconfiguring daemon osd.4 on vm06
2026-03-09T00:06:02.633 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:02 vm06.local ceph-mon[106218]: pgmap v12: 65 pgs: 65 active+clean; 268 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 43 KiB/s rd, 1.3 MiB/s wr, 670 op/s
2026-03-09T00:06:02.633 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:02 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:02.633 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:02 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:02.633 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:02 vm06.local ceph-mon[106218]: Reconfiguring osd.5 (monmap changed)...
2026-03-09T00:06:02.633 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:02 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch
2026-03-09T00:06:02.633 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:02 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:02.633 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:02 vm06.local ceph-mon[106218]: Reconfiguring daemon osd.5 on vm06
2026-03-09T00:06:02.633 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:02 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:02.633 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:02 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:02.633 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:02 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.vlrwtl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T00:06:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:02 vm03.local ceph-mon[129670]: Reconfiguring osd.4 (monmap changed)...
2026-03-09T00:06:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:02 vm03.local ceph-mon[129670]: Reconfiguring daemon osd.4 on vm06
2026-03-09T00:06:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:02 vm03.local ceph-mon[129670]: pgmap v12: 65 pgs: 65 active+clean; 268 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 43 KiB/s rd, 1.3 MiB/s wr, 670 op/s
2026-03-09T00:06:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:02 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:02 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:02 vm03.local ceph-mon[129670]: Reconfiguring osd.5 (monmap changed)...
2026-03-09T00:06:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:02 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch
2026-03-09T00:06:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:02 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:02 vm03.local ceph-mon[129670]: Reconfiguring daemon osd.5 on vm06
2026-03-09T00:06:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:02 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:02 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:02 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.vlrwtl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T00:06:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:02 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:02 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:02 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:02 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.ixduim", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T00:06:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:02 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:02.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:02 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:02.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:02 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:02.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:02 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:02.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:02 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.ixduim", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T00:06:02.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:02 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:04.154 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:03 vm03.local ceph-mon[129670]: Reconfiguring mds.cephfs.vm06.vlrwtl (monmap changed)...
2026-03-09T00:06:04.154 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:03 vm03.local ceph-mon[129670]: Reconfiguring daemon mds.cephfs.vm06.vlrwtl on vm06
2026-03-09T00:06:04.154 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:03 vm03.local ceph-mon[129670]: Reconfiguring mds.cephfs.vm06.ixduim (monmap changed)...
2026-03-09T00:06:04.154 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:03 vm03.local ceph-mon[129670]: Reconfiguring daemon mds.cephfs.vm06.ixduim on vm06
2026-03-09T00:06:04.154 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:03 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:04.154 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:03 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:04.154 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:03 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:06:04.154 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:03 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:06:04.154 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:03 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:06:04.154 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:03 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:04.154 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:03 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm03"}]: dispatch
2026-03-09T00:06:04.154 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:03 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm03"}]': finished
2026-03-09T00:06:04.154 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:03 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm06"}]: dispatch
2026-03-09T00:06:04.154 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:03 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm06"}]': finished
2026-03-09T00:06:04.154 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:03 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:04.154 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:03 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T00:06:04.155 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:03 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:04.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:03 vm06.local ceph-mon[106218]: Reconfiguring mds.cephfs.vm06.vlrwtl (monmap changed)...
2026-03-09T00:06:04.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:03 vm06.local ceph-mon[106218]: Reconfiguring daemon mds.cephfs.vm06.vlrwtl on vm06
2026-03-09T00:06:04.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:03 vm06.local ceph-mon[106218]: Reconfiguring mds.cephfs.vm06.ixduim (monmap changed)...
2026-03-09T00:06:04.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:03 vm06.local ceph-mon[106218]: Reconfiguring daemon mds.cephfs.vm06.ixduim on vm06
2026-03-09T00:06:04.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:03 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:04.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:03 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:04.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:03 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:06:04.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:03 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:06:04.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:03 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:06:04.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:03 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:04.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:03 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm03"}]: dispatch
2026-03-09T00:06:04.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:03 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm03"}]': finished
2026-03-09T00:06:04.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:03 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm06"}]: dispatch
2026-03-09T00:06:04.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:03 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm06"}]': finished
2026-03-09T00:06:04.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:03 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:04.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:03 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm03", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T00:06:04.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:03 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:05.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:04 vm03.local ceph-mon[129670]: Upgrade: Setting container_image for all mon
2026-03-09T00:06:05.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:04 vm03.local ceph-mon[129670]: Upgrade: Updating crash.vm03 (1/2)
2026-03-09T00:06:05.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:04 vm03.local ceph-mon[129670]: Deploying daemon crash.vm03 on vm03
2026-03-09T00:06:05.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:04 vm03.local ceph-mon[129670]: pgmap v13: 65 pgs: 65 active+clean; 268 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 939 KiB/s wr, 509 op/s
2026-03-09T00:06:05.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:04 vm06.local ceph-mon[106218]: Upgrade: Setting container_image for all mon
2026-03-09T00:06:05.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:04 vm06.local ceph-mon[106218]: Upgrade: Updating crash.vm03 (1/2)
2026-03-09T00:06:05.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:04 vm06.local ceph-mon[106218]: Deploying daemon crash.vm03 on vm03
2026-03-09T00:06:05.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:04 vm06.local ceph-mon[106218]: pgmap v13: 65 pgs: 65 active+clean; 268 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 939 KiB/s wr, 509 op/s
2026-03-09T00:06:05.403 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.403+0000 7f6c12e6a700 1 -- 192.168.123.103:0/76086590 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c0c107d90 msgr2=0x7f6c0c108210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:05.404 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.403+0000 7f6c12e6a700 1 --2- 192.168.123.103:0/76086590 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c0c107d90 0x7f6c0c108210 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f6c0400cd40 tx=0x7f6c0400a320 comp rx=0 tx=0).stop
2026-03-09T00:06:05.404 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.403+0000 7f6c12e6a700 1 -- 192.168.123.103:0/76086590 shutdown_connections
2026-03-09T00:06:05.404 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.403+0000 7f6c12e6a700 1 --2- 192.168.123.103:0/76086590 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c0c107d90 0x7f6c0c108210 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:05.404 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.403+0000 7f6c12e6a700 1 --2- 192.168.123.103:0/76086590 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c0c10d680 0x7f6c0c10da60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:05.404 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.403+0000 7f6c12e6a700 1 -- 192.168.123.103:0/76086590 >> 192.168.123.103:0/76086590 conn(0x7f6c0c06d1b0 msgr2=0x7f6c0c06d5c0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:06:05.404 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.403+0000 7f6c12e6a700 1 -- 192.168.123.103:0/76086590 shutdown_connections
2026-03-09T00:06:05.404 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.403+0000 7f6c12e6a700 1 -- 192.168.123.103:0/76086590 wait complete.
2026-03-09T00:06:05.404 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.403+0000 7f6c12e6a700 1 Processor -- start
2026-03-09T00:06:05.404 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.404+0000 7f6c12e6a700 1 -- start start
2026-03-09T00:06:05.404 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.404+0000 7f6c12e6a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c0c10d680 0x7f6c0c121050 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:06:05.404 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.404+0000 7f6c12e6a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c0c117ff0 0x7f6c0c118470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:06:05.404 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.404+0000 7f6c12e6a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6c0c121730 con 0x7f6c0c10d680
2026-03-09T00:06:05.404 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.404+0000 7f6c12e6a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6c0c1189b0 con 0x7f6c0c117ff0
2026-03-09T00:06:05.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.406+0000 7f6c0bfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c0c117ff0 0x7f6c0c118470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:06:05.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.406+0000 7f6c0bfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c0c117ff0 0x7f6c0c118470 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:48210/0 (socket says 192.168.123.103:48210)
2026-03-09T00:06:05.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.406+0000 7f6c0bfff700 1 -- 192.168.123.103:0/3427525798 learned_addr learned my addr 192.168.123.103:0/3427525798 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:06:05.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.406+0000 7f6c10c06700 1 --2- 192.168.123.103:0/3427525798 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c0c10d680 0x7f6c0c121050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:06:05.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.406+0000 7f6c0bfff700 1 -- 192.168.123.103:0/3427525798 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c0c10d680 msgr2=0x7f6c0c121050 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:05.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.406+0000 7f6c0bfff700 1 --2- 192.168.123.103:0/3427525798 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c0c10d680 0x7f6c0c121050 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:05.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.406+0000 7f6c0bfff700 1 -- 192.168.123.103:0/3427525798 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6c0400c9f0 con 0x7f6c0c117ff0
2026-03-09T00:06:05.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.407+0000 7f6c0bfff700 1 --2- 192.168.123.103:0/3427525798 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c0c117ff0 0x7f6c0c118470 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f6c040062a0 tx=0x7f6c0400baf0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:06:05.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.407+0000 7f6c09ffb700 1 -- 192.168.123.103:0/3427525798 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6c0400dea0 con 0x7f6c0c117ff0
2026-03-09T00:06:05.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.407+0000 7f6c12e6a700 1 -- 192.168.123.103:0/3427525798 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6c0c118c30 con 0x7f6c0c117ff0
2026-03-09T00:06:05.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.407+0000 7f6c12e6a700 1 -- 192.168.123.103:0/3427525798 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6c0c11cce0 con 0x7f6c0c117ff0
2026-03-09T00:06:05.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.408+0000 7f6c09ffb700 1 -- 192.168.123.103:0/3427525798 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6c04009d70 con 0x7f6c0c117ff0
2026-03-09T00:06:05.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.408+0000 7f6c09ffb700 1 -- 192.168.123.103:0/3427525798 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6c0401fa10 con 0x7f6c0c117ff0
2026-03-09T00:06:05.410 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.411+0000 7f6c12e6a700 1 -- 192.168.123.103:0/3427525798 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6c0c04f2e0 con 0x7f6c0c117ff0
2026-03-09T00:06:05.410 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.411+0000 7f6c09ffb700 1 -- 192.168.123.103:0/3427525798 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f6c04004820 con 0x7f6c0c117ff0
2026-03-09T00:06:05.410 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.411+0000 7f6c09ffb700 1 --2- 192.168.123.103:0/3427525798 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6bf4077a50 0x7f6bf4079f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:06:05.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.411+0000 7f6c09ffb700 1 -- 192.168.123.103:0/3427525798 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(44..44 src has 1..44) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f6c0409b210 con 0x7f6c0c117ff0
2026-03-09T00:06:05.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.413+0000 7f6c10c06700 1 --2- 192.168.123.103:0/3427525798 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6bf4077a50 0x7f6bf4079f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:06:05.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.413+0000 7f6c10c06700 1 --2- 192.168.123.103:0/3427525798 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6bf4077a50 0x7f6bf4079f10 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f6bfc005950 tx=0x7f6bfc00b410 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:06:05.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.414+0000 7f6c09ffb700 1 -- 192.168.123.103:0/3427525798 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6c04063820 con 0x7f6c0c117ff0
2026-03-09T00:06:05.562 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.562+0000 7f6c12e6a700 1 -- 192.168.123.103:0/3427525798 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6c0c119540 con 0x7f6bf4077a50
2026-03-09T00:06:05.563 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.564+0000 7f6c09ffb700 1 -- 192.168.123.103:0/3427525798 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+385 (secure 0 0 0) 0x7f6c0c119540 con 0x7f6bf4077a50
2026-03-09T00:06:05.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.571+0000 7f6c12e6a700 1 -- 192.168.123.103:0/3427525798 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6bf4077a50 msgr2=0x7f6bf4079f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:05.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.571+0000 7f6c12e6a700 1 --2- 192.168.123.103:0/3427525798 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6bf4077a50 0x7f6bf4079f10 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f6bfc005950 tx=0x7f6bfc00b410 comp rx=0 tx=0).stop
2026-03-09T00:06:05.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.571+0000 7f6c12e6a700 1 -- 192.168.123.103:0/3427525798 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c0c117ff0 msgr2=0x7f6c0c118470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:05.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.571+0000 7f6c12e6a700 1 --2- 192.168.123.103:0/3427525798 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c0c117ff0 0x7f6c0c118470 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7f6c040062a0 tx=0x7f6c0400baf0 comp rx=0 tx=0).stop
2026-03-09T00:06:05.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.571+0000 7f6c12e6a700 1 -- 192.168.123.103:0/3427525798 shutdown_connections
2026-03-09T00:06:05.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.571+0000 7f6c12e6a700 1 --2- 192.168.123.103:0/3427525798 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6bf4077a50 0x7f6bf4079f10 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:05.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.571+0000 7f6c12e6a700 1 --2- 192.168.123.103:0/3427525798 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c0c10d680
0x7f6c0c121050 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:05.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.571+0000 7f6c12e6a700 1 --2- 192.168.123.103:0/3427525798 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c0c117ff0 0x7f6c0c118470 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:05.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.571+0000 7f6c12e6a700 1 -- 192.168.123.103:0/3427525798 >> 192.168.123.103:0/3427525798 conn(0x7f6c0c06d1b0 msgr2=0x7f6c0c0710c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:05.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.571+0000 7f6c12e6a700 1 -- 192.168.123.103:0/3427525798 shutdown_connections 2026-03-09T00:06:05.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.572+0000 7f6c12e6a700 1 -- 192.168.123.103:0/3427525798 wait complete. 2026-03-09T00:06:05.582 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T00:06:05.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.664+0000 7fc68203f700 1 -- 192.168.123.103:0/4059060507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc67c1082d0 msgr2=0x7fc67c108750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:05.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.664+0000 7fc68203f700 1 --2- 192.168.123.103:0/4059060507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc67c1082d0 0x7fc67c108750 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fc670009b00 tx=0x7fc670009e10 comp rx=0 tx=0).stop 2026-03-09T00:06:05.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.665+0000 7fc68203f700 1 -- 192.168.123.103:0/4059060507 shutdown_connections 2026-03-09T00:06:05.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.665+0000 7fc68203f700 1 --2- 192.168.123.103:0/4059060507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc67c1082d0 0x7fc67c108750 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:05.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.665+0000 7fc68203f700 1 --2- 192.168.123.103:0/4059060507 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc67c10f660 0x7fc67c107d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:05.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.665+0000 7fc68203f700 1 -- 192.168.123.103:0/4059060507 >> 192.168.123.103:0/4059060507 conn(0x7fc67c06d0f0 msgr2=0x7fc67c06d500 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:05.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.665+0000 7fc68203f700 1 -- 192.168.123.103:0/4059060507 shutdown_connections 2026-03-09T00:06:05.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.666+0000 7fc68203f700 1 -- 192.168.123.103:0/4059060507 wait complete. 
2026-03-09T00:06:05.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.666+0000 7fc68203f700 1 Processor -- start 2026-03-09T00:06:05.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.666+0000 7fc68203f700 1 -- start start 2026-03-09T00:06:05.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.667+0000 7fc68203f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc67c10f660 0x7fc67c119eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:05.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.667+0000 7fc68203f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc67c114f00 0x7fc67c115380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:05.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.667+0000 7fc68203f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc67c1158c0 con 0x7fc67c114f00 2026-03-09T00:06:05.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.667+0000 7fc68203f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc67c115a00 con 0x7fc67c10f660 2026-03-09T00:06:05.666 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.667+0000 7fc67b7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc67c10f660 0x7fc67c119eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:05.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.667+0000 7fc67b7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc67c10f660 0x7fc67c119eb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:48226/0 (socket says 192.168.123.103:48226) 2026-03-09T00:06:05.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.667+0000 7fc67b7fe700 1 -- 192.168.123.103:0/286916830 learned_addr learned my addr 192.168.123.103:0/286916830 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:06:05.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.667+0000 7fc67b7fe700 1 -- 192.168.123.103:0/286916830 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc67c114f00 msgr2=0x7fc67c115380 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:05.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.667+0000 7fc67b7fe700 1 --2- 192.168.123.103:0/286916830 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc67c114f00 0x7fc67c115380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:05.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.667+0000 7fc67b7fe700 1 -- 192.168.123.103:0/286916830 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc6700097e0 con 0x7fc67c10f660 2026-03-09T00:06:05.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.668+0000 7fc67b7fe700 1 --2- 192.168.123.103:0/286916830 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc67c10f660 0x7fc67c119eb0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fc66c00d8d0 tx=0x7fc66c00dc90 comp rx=0 tx=0).ready entity=mon.1 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:06:05.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.668+0000 7fc678ff9700 1 -- 192.168.123.103:0/286916830 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc66c009940 con 0x7fc67c10f660 2026-03-09T00:06:05.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.668+0000 7fc68203f700 1 -- 192.168.123.103:0/286916830 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc67c115ce0 con 0x7fc67c10f660 2026-03-09T00:06:05.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.668+0000 7fc68203f700 1 -- 192.168.123.103:0/286916830 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc67c077260 con 0x7fc67c10f660 2026-03-09T00:06:05.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.668+0000 7fc678ff9700 1 -- 192.168.123.103:0/286916830 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fc66c010460 con 0x7fc67c10f660 2026-03-09T00:06:05.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.668+0000 7fc678ff9700 1 -- 192.168.123.103:0/286916830 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc66c00f5d0 con 0x7fc67c10f660 2026-03-09T00:06:05.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.670+0000 7fc678ff9700 1 -- 192.168.123.103:0/286916830 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fc66c009aa0 con 0x7fc67c10f660 2026-03-09T00:06:05.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.671+0000 7fc678ff9700 1 --2- 192.168.123.103:0/286916830 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc664077a50 0x7fc664079f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:05.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.672+0000 7fc68203f700 1 -- 192.168.123.103:0/286916830 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc668005320 con 0x7fc67c10f660 2026-03-09T00:06:05.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.672+0000 7fc678ff9700 1 -- 192.168.123.103:0/286916830 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(44..44 src has 1..44) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fc66c099bd0 con 0x7fc67c10f660 2026-03-09T00:06:05.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.673+0000 7fc67affd700 1 --2- 192.168.123.103:0/286916830 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc664077a50 0x7fc664079f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:05.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.673+0000 7fc67affd700 1 --2- 192.168.123.103:0/286916830 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc664077a50 0x7fc664079f10 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fc670009b00 tx=0x7fc67000b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:06:05.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.676+0000 7fc678ff9700 1 -- 192.168.123.103:0/286916830 <== mon.1 
v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc66c061b30 con 0x7fc67c10f660 2026-03-09T00:06:05.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.823+0000 7fc68203f700 1 -- 192.168.123.103:0/286916830 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc668000bf0 con 0x7fc664077a50 2026-03-09T00:06:05.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.827+0000 7fc678ff9700 1 -- 192.168.123.103:0/286916830 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+385 (secure 0 0 0) 0x7fc668000bf0 con 0x7fc664077a50 2026-03-09T00:06:05.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.830+0000 7fc6627fc700 1 -- 192.168.123.103:0/286916830 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc664077a50 msgr2=0x7fc664079f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:05.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.830+0000 7fc6627fc700 1 --2- 192.168.123.103:0/286916830 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc664077a50 0x7fc664079f10 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fc670009b00 tx=0x7fc67000b540 comp rx=0 tx=0).stop 2026-03-09T00:06:05.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.830+0000 7fc6627fc700 1 -- 192.168.123.103:0/286916830 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc67c10f660 msgr2=0x7fc67c119eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:05.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.830+0000 7fc6627fc700 1 --2- 192.168.123.103:0/286916830 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc67c10f660 0x7fc67c119eb0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fc66c00d8d0 tx=0x7fc66c00dc90 comp rx=0 tx=0).stop 2026-03-09T00:06:05.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.830+0000 7fc6627fc700 1 -- 192.168.123.103:0/286916830 shutdown_connections 2026-03-09T00:06:05.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.830+0000 7fc6627fc700 1 --2- 192.168.123.103:0/286916830 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc664077a50 0x7fc664079f10 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:05.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.830+0000 7fc6627fc700 1 --2- 192.168.123.103:0/286916830 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc67c10f660 0x7fc67c119eb0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:05.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.830+0000 7fc6627fc700 1 --2- 192.168.123.103:0/286916830 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc67c114f00 0x7fc67c115380 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:05.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.830+0000 7fc6627fc700 1 -- 192.168.123.103:0/286916830 >> 192.168.123.103:0/286916830 conn(0x7fc67c06d0f0 msgr2=0x7fc67c06f860 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:05.830 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.830+0000 7fc6627fc700 1 -- 192.168.123.103:0/286916830 shutdown_connections 2026-03-09T00:06:05.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.832+0000 7fc6627fc700 1 -- 192.168.123.103:0/286916830 wait complete. 2026-03-09T00:06:05.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.917+0000 7f6caa45c700 1 -- 192.168.123.103:0/957331025 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ca410f420 msgr2=0x7f6ca410f800 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:05.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.917+0000 7f6caa45c700 1 --2- 192.168.123.103:0/957331025 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ca410f420 0x7f6ca410f800 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f6c9c00b3a0 tx=0x7f6c9c00b6b0 comp rx=0 tx=0).stop 2026-03-09T00:06:05.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.918+0000 7f6caa45c700 1 -- 192.168.123.103:0/957331025 shutdown_connections 2026-03-09T00:06:05.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.918+0000 7f6caa45c700 1 --2- 192.168.123.103:0/957331025 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ca4107d90 0x7f6ca4108210 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:05.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.918+0000 7f6caa45c700 1 --2- 192.168.123.103:0/957331025 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ca410f420 0x7f6ca410f800 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:05.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.918+0000 7f6caa45c700 1 -- 192.168.123.103:0/957331025 >> 192.168.123.103:0/957331025 conn(0x7f6ca406ce20 msgr2=0x7f6ca406d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:05.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.918+0000 7f6caa45c700 1 -- 192.168.123.103:0/957331025 shutdown_connections 2026-03-09T00:06:05.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.918+0000 7f6caa45c700 1 -- 192.168.123.103:0/957331025 wait complete. 
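Note: the stderr churn above comes from a series of short-lived `ceph` CLI invocations made by the test; each one opens a fresh msgr2 client session to a monitor (BANNER_CONNECTING -> HELLO_CONNECTING -> AUTH_CONNECTING -> READY), issues its command, then marks every connection down and waits for shutdown to complete. When auditing a run like this, a small parser can condense the noise into per-connection state histories. A minimal sketch, assuming only the "conn(0x... s=STATE" line format visible above; the log path "teuthology.log" is a placeholder:

    #!/usr/bin/env python3
    # Condense msgr2 debug lines into per-connection state histories.
    # Illustrative sketch over the line format seen in this log;
    # "teuthology.log" is a hypothetical path, not from the test.
    import re
    from collections import defaultdict

    # capture the connection pointer and the s=STATE token inside conn(...)
    CONN_RE = re.compile(r"conn\((0x[0-9a-f]+)[^)]*?\bs=([A-Z_]+)")

    def state_histories(path):
        histories = defaultdict(list)
        with open(path) as fh:
            for line in fh:
                m = CONN_RE.search(line)
                if m:
                    ptr, state = m.groups()
                    # record transitions only, drop consecutive repeats
                    if not histories[ptr] or histories[ptr][-1] != state:
                        histories[ptr].append(state)
        return histories

    if __name__ == "__main__":
        for ptr, seq in sorted(state_histories("teuthology.log").items()):
            print(ptr, " -> ".join(seq))

Each CLI invocation shows up as a fresh set of connection pointers, so a pointer whose history never reaches READY is the quickest way to spot a handshake that stalled.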
2026-03-09T00:06:05.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.918+0000 7f6caa45c700 1 Processor -- start 2026-03-09T00:06:05.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.918+0000 7f6caa45c700 1 -- start start 2026-03-09T00:06:05.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.919+0000 7f6caa45c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ca4107d90 0x7f6ca4117ed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:05.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.919+0000 7f6caa45c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ca410f420 0x7f6ca4112ed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:05.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.919+0000 7f6caa45c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6ca41134a0 con 0x7f6ca410f420 2026-03-09T00:06:05.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.919+0000 7f6caa45c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6ca4113610 con 0x7f6ca4107d90 2026-03-09T00:06:05.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.919+0000 7f6ca3fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ca4107d90 0x7f6ca4117ed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:05.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.919+0000 7f6ca3fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ca4107d90 0x7f6ca4117ed0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:48242/0 (socket says 192.168.123.103:48242) 2026-03-09T00:06:05.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.919+0000 7f6ca3fff700 1 -- 192.168.123.103:0/1748855117 learned_addr learned my addr 192.168.123.103:0/1748855117 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:06:05.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.919+0000 7f6ca3fff700 1 -- 192.168.123.103:0/1748855117 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ca410f420 msgr2=0x7f6ca4112ed0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:05.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.919+0000 7f6ca37fe700 1 --2- 192.168.123.103:0/1748855117 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ca410f420 0x7f6ca4112ed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:05.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.920+0000 7f6ca3fff700 1 --2- 192.168.123.103:0/1748855117 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ca410f420 0x7f6ca4112ed0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:05.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.920+0000 7f6ca3fff700 1 -- 192.168.123.103:0/1748855117 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f6c9c00b050 con 0x7f6ca4107d90 2026-03-09T00:06:05.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.920+0000 7f6ca37fe700 1 --2- 192.168.123.103:0/1748855117 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ca410f420 0x7f6ca4112ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:06:05.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.920+0000 7f6ca3fff700 1 --2- 192.168.123.103:0/1748855117 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ca4107d90 0x7f6ca4117ed0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f6c9c015040 tx=0x7f6c9c012710 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:06:05.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.920+0000 7f6ca17fa700 1 -- 192.168.123.103:0/1748855117 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6c9c00e040 con 0x7f6ca4107d90 2026-03-09T00:06:05.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.920+0000 7f6caa45c700 1 -- 192.168.123.103:0/1748855117 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6ca4113890 con 0x7f6ca4107d90 2026-03-09T00:06:05.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.920+0000 7f6caa45c700 1 -- 192.168.123.103:0/1748855117 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6ca41a6430 con 0x7f6ca4107d90 2026-03-09T00:06:05.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.921+0000 7f6ca17fa700 1 -- 192.168.123.103:0/1748855117 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6c9c012e90 con 0x7f6ca4107d90 2026-03-09T00:06:05.921 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.922+0000 7f6ca17fa700 1 -- 192.168.123.103:0/1748855117 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6c9c004690 con 0x7f6ca4107d90 2026-03-09T00:06:05.921 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.922+0000 7f6ca17fa700 1 -- 192.168.123.103:0/1748855117 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f6c9c0048b0 con 0x7f6ca4107d90 2026-03-09T00:06:05.922 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.923+0000 7f6ca17fa700 1 --2- 192.168.123.103:0/1748855117 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6c8c077b20 0x7f6c8c079fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:05.922 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.923+0000 7f6ca17fa700 1 -- 192.168.123.103:0/1748855117 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(44..44 src has 1..44) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f6c9c09c320 con 0x7f6ca4107d90 2026-03-09T00:06:05.922 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.923+0000 7f6ca37fe700 1 --2- 192.168.123.103:0/1748855117 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6c8c077b20 0x7f6c8c079fe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:05.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.923+0000 7f6ca37fe700 1 --2- 192.168.123.103:0/1748855117 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6c8c077b20 0x7f6c8c079fe0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f6c98005fd0 tx=0x7f6c98005ee0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:06:05.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.923+0000 7f6caa45c700 1 -- 192.168.123.103:0/1748855117 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6c90005320 con 0x7f6ca4107d90
2026-03-09T00:06:05.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:05.928+0000 7f6ca17fa700 1 -- 192.168.123.103:0/1748855117 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6c9c0649b0 con 0x7f6ca4107d90
2026-03-09T00:06:06.058 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.058+0000 7f6caa45c700 1 -- 192.168.123.103:0/1748855117 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f6c90000bf0 con 0x7f6c8c077b20
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.067+0000 7f6ca17fa700 1 -- 192.168.123.103:0/1748855117 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f6c90000bf0 con 0x7f6c8c077b20
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (82s) 19s ago 6m 16.8M - 0.25.0 c8568f914cd2 6bc39b415ac6
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (6m) 19s ago 6m 8493k - 18.2.1 5be31c24972a b93f8a220f71
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (6m) 12s ago 6m 8644k - 18.2.1 5be31c24972a d06aea65065e
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 starting - - - -
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (5m) 12s ago 5m 7411k - 18.2.1 5be31c24972a d9eb9a54d81d
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (67s) 19s ago 6m 75.9M - 10.4.0 c8b91775d855 00a3394cdec9
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (4m) 19s ago 4m 17.5M - 18.2.1 5be31c24972a 404501ca3f76
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (4m) 19s ago 4m 237M - 18.2.1 5be31c24972a b71cb8823eff
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (4m) 12s ago 4m 19.6M - 18.2.1 5be31c24972a 868f24dd3b07
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 running (4m) 12s ago 4m 15.8M - 18.2.1 5be31c24972a 84dbd6c37a69
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:8443,9283,8765 running (2m) 19s ago 7m 592M - 19.2.3-678-ge911bdeb 654f31e6858e 5c5f89207f88
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (113s) 12s ago 5m 488M - 19.2.3-678-ge911bdeb 654f31e6858e e3d70135f3ac
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (29s) 19s ago 7m 45.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cafe87ec117d
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (14s) 12s ago 5m 34.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 33df752aa193
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (103s) 19s ago 6m 9407k - 1.7.0 72c9c2088986 0cdd6e671b4f
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (99s) 12s ago 5m 9407k - 1.7.0 72c9c2088986 848c5c72973d
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (5m) 19s ago 5m 395M 4096M 18.2.1 5be31c24972a 7582c56d43e3
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (5m) 19s ago 5m 404M 4096M 18.2.1 5be31c24972a 7bc729875521
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (5m) 19s ago 5m 342M 4096M 18.2.1 5be31c24972a 00566abbcc16
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (5m) 12s ago 5m 495M 4096M 18.2.1 5be31c24972a 1ece32056ab6
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (4m) 12s ago 4m 462M 4096M 18.2.1 5be31c24972a ee6260a1124c
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (4m) 12s ago 4m 382M 4096M 18.2.1 5be31c24972a f51e8cd94301
2026-03-09T00:06:06.066 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (87s) 19s ago 6m 53.6M - 2.51.0 1d3b7f56885b 16d6071e49fb
2026-03-09T00:06:06.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.069+0000 7f6caa45c700 1 -- 192.168.123.103:0/1748855117 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6c8c077b20 msgr2=0x7f6c8c079fe0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:06.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.069+0000 7f6caa45c700 1 --2- 192.168.123.103:0/1748855117 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6c8c077b20 0x7f6c8c079fe0 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f6c98005fd0 tx=0x7f6c98005ee0 comp rx=0 tx=0).stop
2026-03-09T00:06:06.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.069+0000 7f6caa45c700 1 -- 192.168.123.103:0/1748855117 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ca4107d90 msgr2=0x7f6ca4117ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:06.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.069+0000 7f6caa45c700 1 --2- 192.168.123.103:0/1748855117 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ca4107d90 0x7f6ca4117ed0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f6c9c015040 tx=0x7f6c9c012710 comp rx=0 tx=0).stop
2026-03-09T00:06:06.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.069+0000 7f6caa45c700 1 -- 192.168.123.103:0/1748855117 shutdown_connections
2026-03-09T00:06:06.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.069+0000 7f6caa45c700 1 --2- 192.168.123.103:0/1748855117 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6c8c077b20 0x7f6c8c079fe0 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:06.069
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.069+0000 7f6caa45c700 1 --2- 192.168.123.103:0/1748855117 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6ca4107d90 0x7f6ca4117ed0 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:06.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.069+0000 7f6caa45c700 1 --2- 192.168.123.103:0/1748855117 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6ca410f420 0x7f6ca4112ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:06.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.069+0000 7f6caa45c700 1 -- 192.168.123.103:0/1748855117 >> 192.168.123.103:0/1748855117 conn(0x7f6ca406ce20 msgr2=0x7f6ca410d1f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:06.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.070+0000 7f6caa45c700 1 -- 192.168.123.103:0/1748855117 shutdown_connections 2026-03-09T00:06:06.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.070+0000 7f6caa45c700 1 -- 192.168.123.103:0/1748855117 wait complete. 2026-03-09T00:06:06.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.173+0000 7ffb8f38d700 1 -- 192.168.123.103:0/616519478 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb881089a0 msgr2=0x7ffb8810be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:06.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.173+0000 7ffb8f38d700 1 --2- 192.168.123.103:0/616519478 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb881089a0 0x7ffb8810be70 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7ffb8000b600 tx=0x7ffb8000b910 comp rx=0 tx=0).stop 2026-03-09T00:06:06.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8f38d700 1 -- 192.168.123.103:0/616519478 shutdown_connections 2026-03-09T00:06:06.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8f38d700 1 --2- 192.168.123.103:0/616519478 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb881089a0 0x7ffb8810be70 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:06.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8f38d700 1 --2- 192.168.123.103:0/616519478 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ffb88107ff0 0x7ffb881083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:06.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8f38d700 1 -- 192.168.123.103:0/616519478 >> 192.168.123.103:0/616519478 conn(0x7ffb8806ce20 msgr2=0x7ffb8806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:06.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8f38d700 1 -- 192.168.123.103:0/616519478 shutdown_connections 2026-03-09T00:06:06.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8f38d700 1 -- 192.168.123.103:0/616519478 wait complete. 
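Note: the `ceph orch ps` listing above captures the staggered-upgrade state mid-flight: both mgrs and both mons already report 19.2.3-678-ge911bdeb, crash.vm03 is mid-redeploy ("starting"), and the OSD and MDS daemons still run 18.2.1. The same check is easier to script against JSON output than against the column layout. A minimal sketch, assuming `ceph orch ps --format json` is available and that each record carries "daemon_name" and "version" fields (recent cephadm emits both, but treat the field names as assumptions about the schema):

    #!/usr/bin/env python3
    # List daemons that have not yet reached the upgrade target, using
    # the orchestrator's JSON output instead of the column layout above.
    # "daemon_name"/"version" are assumed field names; adjust to what
    # your cephadm release actually emits.
    import json
    import subprocess

    TARGET = "19.2.3"  # target release prefix for this upgrade

    def pending_daemons():
        out = subprocess.check_output(
            ["ceph", "orch", "ps", "--format", "json"])
        return [d for d in json.loads(out)
                if not (d.get("version") or "").startswith(TARGET)]

    if __name__ == "__main__":
        for d in pending_daemons():
            print(d.get("daemon_name"), d.get("version"))

Against the table above this would report the crash, ceph-exporter, OSD, and MDS daemons, i.e. exactly the set the orchestrator has not yet walked in its staggered order.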
2026-03-09T00:06:06.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8f38d700 1 Processor -- start 2026-03-09T00:06:06.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8f38d700 1 -- start start 2026-03-09T00:06:06.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8f38d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ffb88107ff0 0x7ffb8807cfc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:06.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8f38d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb8807d500 0x7ffb8807d980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:06.186 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8f38d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffb88081bd0 con 0x7ffb8807d500 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8f38d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffb88081d10 con 0x7ffb88107ff0 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8c928700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb8807d500 0x7ffb8807d980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8c928700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb8807d500 0x7ffb8807d980 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:50348/0 (socket says 192.168.123.103:50348) 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8c928700 1 -- 192.168.123.103:0/2272248968 learned_addr learned my addr 192.168.123.103:0/2272248968 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8c928700 1 -- 192.168.123.103:0/2272248968 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ffb88107ff0 msgr2=0x7ffb8807cfc0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8c928700 1 --2- 192.168.123.103:0/2272248968 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ffb88107ff0 0x7ffb8807cfc0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.174+0000 7ffb8c928700 1 -- 192.168.123.103:0/2272248968 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ffb8000b050 con 0x7ffb8807d500 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.175+0000 7ffb8c928700 1 --2- 192.168.123.103:0/2272248968 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb8807d500 0x7ffb8807d980 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7ffb80009fd0 tx=0x7ffb80003ce0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.175+0000 7ffb7e7fc700 1 -- 192.168.123.103:0/2272248968 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffb8000e030 con 0x7ffb8807d500 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.175+0000 7ffb8f38d700 1 -- 192.168.123.103:0/2272248968 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ffb88081f90 con 0x7ffb8807d500 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.175+0000 7ffb8f38d700 1 -- 192.168.123.103:0/2272248968 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ffb880824e0 con 0x7ffb8807d500 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.177+0000 7ffb7e7fc700 1 -- 192.168.123.103:0/2272248968 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ffb800048e0 con 0x7ffb8807d500 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.177+0000 7ffb7e7fc700 1 -- 192.168.123.103:0/2272248968 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffb8001cd50 con 0x7ffb8807d500 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.177+0000 7ffb7e7fc700 1 -- 192.168.123.103:0/2272248968 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7ffb80012430 con 0x7ffb8807d500 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.177+0000 7ffb7e7fc700 1 --2- 192.168.123.103:0/2272248968 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ffb74079de0 0x7ffb7407c2a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.178+0000 7ffb8d129700 1 --2- 192.168.123.103:0/2272248968 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ffb74079de0 0x7ffb7407c2a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.178+0000 7ffb7e7fc700 1 -- 192.168.123.103:0/2272248968 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(44..44 src has 1..44) v4 ==== 6136+0+0 (secure 0 0 0) 0x7ffb8009c5c0 con 0x7ffb8807d500 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.178+0000 7ffb8f38d700 1 -- 192.168.123.103:0/2272248968 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ffb6c005320 con 0x7ffb8807d500 2026-03-09T00:06:06.187 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.179+0000 7ffb8d129700 1 --2- 192.168.123.103:0/2272248968 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ffb74079de0 0x7ffb7407c2a0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7ffb84009710 tx=0x7ffb84006c60 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:06:06.375 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.376+0000 7ffb7e7fc700 1 -- 192.168.123.103:0/2272248968 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ffb80064c50 con 0x7ffb8807d500 2026-03-09T00:06:06.510 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:06 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:06.510 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:06 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:06.510 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:06 vm06.local ceph-mon[106218]: pgmap v14: 65 pgs: 65 active+clean; 258 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 29 KiB/s rd, 1.4 MiB/s wr, 680 op/s 2026-03-09T00:06:06.510 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:06 vm06.local ceph-mon[106218]: from='client.44101 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:06:06.510 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:06 vm06.local ceph-mon[106218]: Upgrade: Updating crash.vm06 (2/2) 2026-03-09T00:06:06.510 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:06 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:06.510 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:06 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-09T00:06:06.510 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:06 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:06:06.511 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:06 vm06.local ceph-mon[106218]: Deploying daemon crash.vm06 on vm06 2026-03-09T00:06:06.511 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:06 vm06.local ceph-mon[106218]: from='client.44105 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.551+0000 7ffb8f38d700 1 -- 192.168.123.103:0/2272248968 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7ffb6c005cc0 con 0x7ffb8807d500 2026-03-09T00:06:06.551 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:06 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:06.551 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:06 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:06.551 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:06 vm03.local ceph-mon[129670]: pgmap v14: 65 pgs: 65 active+clean; 258 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 29 KiB/s rd, 1.4 MiB/s wr, 680 op/s 2026-03-09T00:06:06.551 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:06 vm03.local ceph-mon[129670]: from='client.44101 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:06:06.551 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:06 vm03.local ceph-mon[129670]: Upgrade: Updating crash.vm06 (2/2) 2026-03-09T00:06:06.551 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:06 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:06.551 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:06 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm06", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-09T00:06:06.551 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:06 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:06.551 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:06 vm03.local ceph-mon[129670]: Deploying daemon crash.vm06 on vm06
2026-03-09T00:06:06.551 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:06 vm03.local ceph-mon[129670]: from='client.44105 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.552+0000 7ffb7e7fc700 1 -- 192.168.123.103:0/2272248968 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+694 (secure 0 0 0) 0x7ffb800202d0 con 0x7ffb8807d500
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 10,
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-09T00:06:06.551 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:06:06.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.555+0000 7ffb6bfff700 1 -- 192.168.123.103:0/2272248968 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ffb74079de0 msgr2=0x7ffb7407c2a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:06.554
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.555+0000 7ffb6bfff700 1 --2- 192.168.123.103:0/2272248968 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ffb74079de0 0x7ffb7407c2a0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7ffb84009710 tx=0x7ffb84006c60 comp rx=0 tx=0).stop 2026-03-09T00:06:06.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.555+0000 7ffb6bfff700 1 -- 192.168.123.103:0/2272248968 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb8807d500 msgr2=0x7ffb8807d980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:06.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.555+0000 7ffb6bfff700 1 --2- 192.168.123.103:0/2272248968 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb8807d500 0x7ffb8807d980 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7ffb80009fd0 tx=0x7ffb80003ce0 comp rx=0 tx=0).stop 2026-03-09T00:06:06.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.555+0000 7ffb6bfff700 1 -- 192.168.123.103:0/2272248968 shutdown_connections 2026-03-09T00:06:06.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.555+0000 7ffb6bfff700 1 --2- 192.168.123.103:0/2272248968 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ffb74079de0 0x7ffb7407c2a0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:06.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.555+0000 7ffb6bfff700 1 --2- 192.168.123.103:0/2272248968 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ffb88107ff0 0x7ffb8807cfc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:06.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.555+0000 7ffb6bfff700 1 --2- 192.168.123.103:0/2272248968 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ffb8807d500 0x7ffb8807d980 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:06.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.555+0000 7ffb6bfff700 1 -- 192.168.123.103:0/2272248968 >> 192.168.123.103:0/2272248968 conn(0x7ffb8806ce20 msgr2=0x7ffb880706e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:06.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.555+0000 7ffb6bfff700 1 -- 192.168.123.103:0/2272248968 shutdown_connections 2026-03-09T00:06:06.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.555+0000 7ffb6bfff700 1 -- 192.168.123.103:0/2272248968 wait complete. 
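Note: the `ceph versions` JSON above is the signal this upgrade sequence converges on. Its "overall" map still lists two releases (18.2.1 reef on 10 daemons, 19.2.3 squid on 4, i.e. 2 mons + 2 mgrs of the 14 total), so only the mon and mgr stages of the staggered upgrade have completed. `ceph versions` prints JSON by default, which makes "wait until one version remains" a short loop. A minimal sketch; the timeout and poll interval are arbitrary choices, not values taken from the test:

    #!/usr/bin/env python3
    # Poll `ceph versions` until the "overall" map collapses to a single
    # release. Sketch only: timeout/interval are illustrative defaults.
    import json
    import subprocess
    import time

    def overall_versions():
        # `ceph versions` already emits JSON with an "overall" map
        out = subprocess.check_output(["ceph", "versions"])
        return json.loads(out)["overall"]

    def wait_single_version(timeout=3600, interval=30):
        deadline = time.time() + timeout
        overall = overall_versions()
        while len(overall) > 1:
            if time.time() > deadline:
                raise TimeoutError("cluster still mixed: %s" % sorted(overall))
            time.sleep(interval)
            overall = overall_versions()
        return next(iter(overall))

    if __name__ == "__main__":
        print("converged on", wait_single_version())

At this point in the log the loop would still be waiting: the OSD and MDS stages (and their "ceph version 18.2.1 ... reef" entries) have not yet been touched by the orchestrator.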
2026-03-09T00:06:06.624 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.625+0000 7f365ff16700 1 -- 192.168.123.103:0/3798500831 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3658107ff0 msgr2=0x7f36581083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:06.624 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.625+0000 7f365ff16700 1 --2- 192.168.123.103:0/3798500831 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3658107ff0 0x7f36581083d0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f3654007780 tx=0x7f365400c050 comp rx=0 tx=0).stop
2026-03-09T00:06:06.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.625+0000 7f365ff16700 1 -- 192.168.123.103:0/3798500831 shutdown_connections
2026-03-09T00:06:06.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.625+0000 7f365ff16700 1 --2- 192.168.123.103:0/3798500831 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f36581089a0 0x7f365810be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:06.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.625+0000 7f365ff16700 1 --2- 192.168.123.103:0/3798500831 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3658107ff0 0x7f36581083d0 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:06.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.625+0000 7f365ff16700 1 -- 192.168.123.103:0/3798500831 >> 192.168.123.103:0/3798500831 conn(0x7f365806ce20 msgr2=0x7f365806d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:06:06.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.625+0000 7f365ff16700 1 -- 192.168.123.103:0/3798500831 shutdown_connections
2026-03-09T00:06:06.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.625+0000 7f365ff16700 1 -- 192.168.123.103:0/3798500831 wait complete.
2026-03-09T00:06:06.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.626+0000 7f365ff16700 1 Processor -- start
2026-03-09T00:06:06.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.626+0000 7f365ff16700 1 -- start start
2026-03-09T00:06:06.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.626+0000 7f365ff16700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f36581089a0 0x7f3658133260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:06:06.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.626+0000 7f365ff16700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f36581337a0 0x7f3658133c20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:06:06.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.626+0000 7f365ff16700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f365807ef30 con 0x7f36581089a0
2026-03-09T00:06:06.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.626+0000 7f365ff16700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f365807f0a0 con 0x7f36581337a0
2026-03-09T00:06:06.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.626+0000 7f365d4b1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f36581337a0 0x7f3658133c20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:06:06.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.626+0000 7f365dcb2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f36581089a0 0x7f3658133260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:06:06.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.626+0000 7f365d4b1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f36581337a0 0x7f3658133c20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:48272/0 (socket says 192.168.123.103:48272)
2026-03-09T00:06:06.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.626+0000 7f365d4b1700 1 -- 192.168.123.103:0/2250972000 learned_addr learned my addr 192.168.123.103:0/2250972000 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:06:06.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.627+0000 7f365dcb2700 1 -- 192.168.123.103:0/2250972000 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f36581337a0 msgr2=0x7f3658133c20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:06.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.627+0000 7f365dcb2700 1 --2- 192.168.123.103:0/2250972000 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f36581337a0 0x7f3658133c20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:06.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.627+0000 7f365dcb2700 1 -- 192.168.123.103:0/2250972000 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3654007430 con 0x7f36581089a0
2026-03-09T00:06:06.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.627+0000 7f365dcb2700 1 --2- 192.168.123.103:0/2250972000 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f36581089a0 0x7f3658133260 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f365400afd0 tx=0x7f365400ca60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:06:06.626 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.627+0000 7f364effd700 1 -- 192.168.123.103:0/2250972000 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f365400f050 con 0x7f36581089a0
2026-03-09T00:06:06.627 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.627+0000 7f365ff16700 1 -- 192.168.123.103:0/2250972000 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f365807f2d0 con 0x7f36581089a0
2026-03-09T00:06:06.627 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.627+0000 7f365ff16700 1 -- 192.168.123.103:0/2250972000 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f365807f820 con 0x7f36581089a0
2026-03-09T00:06:06.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.628+0000 7f364effd700 1 -- 192.168.123.103:0/2250972000 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f365400ced0 con 0x7f36581089a0
2026-03-09T00:06:06.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.629+0000 7f364effd700 1 -- 192.168.123.103:0/2250972000 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3654008710 con 0x7f36581089a0
2026-03-09T00:06:06.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.629+0000 7f364effd700 1 -- 192.168.123.103:0/2250972000 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f365401a040 con 0x7f36581089a0
2026-03-09T00:06:06.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.630+0000 7f364effd700 1 --2- 192.168.123.103:0/2250972000 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3644077be0 0x7f364407a0a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:06:06.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.630+0000 7f365d4b1700 1 --2- 192.168.123.103:0/2250972000 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3644077be0 0x7f364407a0a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:06:06.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.630+0000 7f364effd700 1 -- 192.168.123.103:0/2250972000 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(44..44 src has 1..44) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f365409b480 con 0x7f36581089a0
2026-03-09T00:06:06.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.630+0000 7f365d4b1700 1 --2- 192.168.123.103:0/2250972000 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3644077be0 0x7f364407a0a0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f3650009f20 tx=0x7f3650009580 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:06:06.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.630+0000 7f365ff16700 1 -- 192.168.123.103:0/2250972000 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f363c005320 con 0x7f36581089a0
2026-03-09T00:06:06.633 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.633+0000 7f364effd700 1 -- 192.168.123.103:0/2250972000 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3654063b10 con 0x7f36581089a0
2026-03-09T00:06:06.777 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.778+0000 7f365ff16700 1 -- 192.168.123.103:0/2250972000 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f363c005cc0 con 0x7f36581089a0
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.778+0000 7f364effd700 1 -- 192.168.123.103:0/2250972000 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1919 (secure 0 0 0) 0x7f3654018070 con 0x7f36581089a0
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:e11
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1)
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:epoch 11
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T00:01:42.952984+0000
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T00:01:51.424075+0000
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:root 0
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {}
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 39
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:in 0
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14480}
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:failed
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:damaged
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:stopped
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3]
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-09T00:06:06.778 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1
2026-03-09T00:06:06.779 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members:
2026-03-09T00:06:06.779 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.sejksk{0:14480} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:06:06.779 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:06:06.779 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:06:06.779 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons:
2026-03-09T00:06:06.779 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:06:06.779 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.ralade{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:06:06.779 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.ixduim{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:06:06.779 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.vlrwtl{-1:24297} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6824/1090058295,v1:192.168.123.106:6825/1090058295] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:06:06.781 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.782+0000 7f364cff9700 1 -- 192.168.123.103:0/2250972000 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3644077be0 msgr2=0x7f364407a0a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:06.781 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.782+0000 7f364cff9700 1 --2- 192.168.123.103:0/2250972000 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3644077be0 0x7f364407a0a0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f3650009f20 tx=0x7f3650009580 comp rx=0 tx=0).stop
2026-03-09T00:06:06.781 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.782+0000 7f364cff9700 1 -- 192.168.123.103:0/2250972000 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f36581089a0 msgr2=0x7f3658133260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:06.781 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.782+0000 7f364cff9700 1 --2- 192.168.123.103:0/2250972000 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f36581089a0 0x7f3658133260 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f365400afd0 tx=0x7f365400ca60 comp rx=0 tx=0).stop
2026-03-09T00:06:06.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.783+0000 7f364cff9700 1 -- 192.168.123.103:0/2250972000 shutdown_connections
2026-03-09T00:06:06.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.783+0000 7f364cff9700 1 --2- 192.168.123.103:0/2250972000 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3644077be0 0x7f364407a0a0 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:06.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.783+0000 7f364cff9700 1 --2- 192.168.123.103:0/2250972000 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f36581089a0 0x7f3658133260 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:06.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.783+0000 7f364cff9700 1 --2- 192.168.123.103:0/2250972000 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f36581337a0 0x7f3658133c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:06.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.783+0000 7f364cff9700 1 -- 192.168.123.103:0/2250972000 >> 192.168.123.103:0/2250972000 conn(0x7f365806ce20 msgr2=0x7f3658070590 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:06:06.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.783+0000 7f364cff9700 1 -- 192.168.123.103:0/2250972000 shutdown_connections
2026-03-09T00:06:06.782 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.783+0000 7f364cff9700 1 -- 192.168.123.103:0/2250972000 wait complete.
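The fsmap dump above shows the expected pre-upgrade shape of the volume: epoch 11, max_mds 1 with a single active rank (mds.cephfs.vm03.sejksk at rank 0, up:active), three standbys spread across vm03 and vm06, and inline_data enabled. A sketch of an equivalent machine-readable assertion; the field names assume the JSON schema emitted by ceph fs dump --format json and are illustrative, not part of the suite:

    # Illustrative only: assert exactly one active rank and at least one
    # standby, mirroring the plain-text fsmap printed above.
    ceph fs dump --format json | jq -e '
        (.filesystems[0].mdsmap.up | length == 1) and
        (.standbys | length >= 1)' >/dev/null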
2026-03-09T00:06:06.783 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 11
2026-03-09T00:06:06.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.883+0000 7f8437402700 1 -- 192.168.123.103:0/2673965806 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8430107d90 msgr2=0x7f8430108210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:06.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.883+0000 7f8437402700 1 --2- 192.168.123.103:0/2673965806 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8430107d90 0x7f8430108210 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f842800b3a0 tx=0x7f842800b6b0 comp rx=0 tx=0).stop
2026-03-09T00:06:06.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.883+0000 7f8437402700 1 -- 192.168.123.103:0/2673965806 shutdown_connections
2026-03-09T00:06:06.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.883+0000 7f8437402700 1 --2- 192.168.123.103:0/2673965806 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8430107d90 0x7f8430108210 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:06.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.883+0000 7f8437402700 1 --2- 192.168.123.103:0/2673965806 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f843010f420 0x7f843010f800 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:06.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.883+0000 7f8437402700 1 -- 192.168.123.103:0/2673965806 >> 192.168.123.103:0/2673965806 conn(0x7f843006ce20 msgr2=0x7f843006d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:06:06.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.884+0000 7f8437402700 1 -- 192.168.123.103:0/2673965806 shutdown_connections
2026-03-09T00:06:06.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.884+0000 7f8437402700 1 -- 192.168.123.103:0/2673965806 wait complete.
2026-03-09T00:06:06.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.884+0000 7f8437402700 1 Processor -- start
2026-03-09T00:06:06.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.884+0000 7f8437402700 1 -- start start
2026-03-09T00:06:06.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.884+0000 7f8437402700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8430107d90 0x7f84301162f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:06:06.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.884+0000 7f8437402700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f843010f420 0x7f8430116830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:06:06.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.884+0000 7f8437402700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8430116d70 con 0x7f843010f420
2026-03-09T00:06:06.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.884+0000 7f8437402700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8430116ee0 con 0x7f8430107d90
2026-03-09T00:06:06.885 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.886+0000 7f843519e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8430107d90 0x7f84301162f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:06:06.885 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.886+0000 7f843519e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8430107d90 0x7f84301162f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:48288/0 (socket says 192.168.123.103:48288)
2026-03-09T00:06:06.885 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.886+0000 7f843519e700 1 -- 192.168.123.103:0/1988034285 learned_addr learned my addr 192.168.123.103:0/1988034285 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:06:06.885 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.886+0000 7f843499d700 1 --2- 192.168.123.103:0/1988034285 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f843010f420 0x7f8430116830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:06:06.886 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.887+0000 7f843519e700 1 -- 192.168.123.103:0/1988034285 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f843010f420 msgr2=0x7f8430116830 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:06.886 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.887+0000 7f843519e700 1 --2- 192.168.123.103:0/1988034285 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f843010f420 0x7f8430116830 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:06.886 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.887+0000 7f843519e700 1 -- 192.168.123.103:0/1988034285 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f842800b050 con 0x7f8430107d90
2026-03-09T00:06:06.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.887+0000 7f843519e700 1 --2- 192.168.123.103:0/1988034285 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8430107d90 0x7f84301162f0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f842c00eb10 tx=0x7f842c00eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:06:06.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.888+0000 7f84267fc700 1 -- 192.168.123.103:0/1988034285 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f842c00cca0 con 0x7f8430107d90
2026-03-09T00:06:06.887 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.888+0000 7f8437402700 1 -- 192.168.123.103:0/1988034285 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f84301b8100 con 0x7f8430107d90
2026-03-09T00:06:06.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.888+0000 7f8437402700 1 -- 192.168.123.103:0/1988034285 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f84301b8460 con 0x7f8430107d90
2026-03-09T00:06:06.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.888+0000 7f84267fc700 1 -- 192.168.123.103:0/1988034285 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f842c00ce00 con 0x7f8430107d90
2026-03-09T00:06:06.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.888+0000 7f84267fc700 1 -- 192.168.123.103:0/1988034285 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f842c018910 con 0x7f8430107d90
2026-03-09T00:06:06.889 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.889+0000 7f84267fc700 1 -- 192.168.123.103:0/1988034285 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f842c018b50 con 0x7f8430107d90
2026-03-09T00:06:06.889 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.889+0000 7f8437402700 1 -- 192.168.123.103:0/1988034285 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8414005320 con 0x7f8430107d90
2026-03-09T00:06:06.890 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.890+0000 7f84267fc700 1 --2- 192.168.123.103:0/1988034285 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f841c0777f0 0x7f841c079cb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:06:06.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.891+0000 7f84267fc700 1 -- 192.168.123.103:0/1988034285 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(44..44 src has 1..44) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f842c014070 con 0x7f8430107d90
2026-03-09T00:06:06.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.893+0000 7f843499d700 1 --2- 192.168.123.103:0/1988034285 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f841c0777f0 0x7f841c079cb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:06:06.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.896+0000 7f84267fc700 1 -- 192.168.123.103:0/1988034285 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f842c062c40 con 0x7f8430107d90
2026-03-09T00:06:06.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:06.897+0000 7f843499d700 1 --2- 192.168.123.103:0/1988034285 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f841c0777f0 0x7f841c079cb0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f8430117ae0 tx=0x7f842800ba00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:06:07.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.142+0000 7f8437402700 1 -- 192.168.123.103:0/1988034285 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8414000bf0 con 0x7f841c0777f0
2026-03-09T00:06:07.146 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.146+0000 7f84267fc700 1 -- 192.168.123.103:0/1988034285 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+385 (secure 0 0 0) 0x7f8414000bf0 con 0x7f841c0777f0
2026-03-09T00:06:07.146 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:06:07.146 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T00:06:07.146 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true,
2026-03-09T00:06:07.146 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-09T00:06:07.146 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [
2026-03-09T00:06:07.146 INFO:teuthology.orchestra.run.vm03.stdout: "mon",
2026-03-09T00:06:07.146 INFO:teuthology.orchestra.run.vm03.stdout: "mgr"
2026-03-09T00:06:07.146 INFO:teuthology.orchestra.run.vm03.stdout: ],
2026-03-09T00:06:07.146 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "4/23 daemons upgraded",
2026-03-09T00:06:07.146 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading crash daemons",
2026-03-09T00:06:07.146 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false
2026-03-09T00:06:07.146 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:06:07.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.154+0000 7f8437402700 1 -- 192.168.123.103:0/1988034285 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f841c0777f0 msgr2=0x7f841c079cb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:07.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.154+0000 7f8437402700 1 --2- 192.168.123.103:0/1988034285 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f841c0777f0 0x7f841c079cb0 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f8430117ae0 tx=0x7f842800ba00 comp rx=0 tx=0).stop
2026-03-09T00:06:07.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.154+0000 7f8437402700 1 -- 192.168.123.103:0/1988034285 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8430107d90 msgr2=0x7f84301162f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:07.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.154+0000 7f8437402700 1 --2- 192.168.123.103:0/1988034285 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8430107d90 0x7f84301162f0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f842c00eb10 tx=0x7f842c00eed0 comp rx=0 tx=0).stop
2026-03-09T00:06:07.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.155+0000 7f8437402700 1 -- 192.168.123.103:0/1988034285 shutdown_connections
2026-03-09T00:06:07.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.155+0000 7f8437402700 1 --2- 192.168.123.103:0/1988034285 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f841c0777f0 0x7f841c079cb0 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:07.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.155+0000 7f8437402700 1 --2- 192.168.123.103:0/1988034285 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8430107d90 0x7f84301162f0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:07.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.155+0000 7f8437402700 1 --2- 192.168.123.103:0/1988034285 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f843010f420 0x7f8430116830 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:07.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.155+0000 7f8437402700 1 -- 192.168.123.103:0/1988034285 >> 192.168.123.103:0/1988034285 conn(0x7f843006ce20 msgr2=0x7f843006f760 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:06:07.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.155+0000 7f8437402700 1 -- 192.168.123.103:0/1988034285 shutdown_connections
2026-03-09T00:06:07.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.155+0000 7f8437402700 1 -- 192.168.123.103:0/1988034285 wait complete.
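The orch upgrade status reply above shows the staggered upgrade mid-flight: mon and mgr are complete, 4 of 23 daemons are done, and the crash daemons are currently being redeployed. A sketch of how such a status could be watched until completion, using only the fields present in the JSON reply above (the polling loop itself is illustrative and not part of the suite):

    # Illustrative only: block until the orchestrator reports the upgrade
    # finished, printing coarse progress along the way.
    until ceph orch upgrade status | jq -e '.in_progress == false' >/dev/null; do
        ceph orch upgrade status | jq -r '"\(.progress): \(.message)"'
        sleep 30
    done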
2026-03-09T00:06:07.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.317+0000 7fdb0aa46700 1 -- 192.168.123.103:0/1422980227 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb041089a0 msgr2=0x7fdb0410be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:07.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.317+0000 7fdb0aa46700 1 --2- 192.168.123.103:0/1422980227 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb041089a0 0x7fdb0410be70 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7fdafc00d3f0 tx=0x7fdafc00d700 comp rx=0 tx=0).stop
2026-03-09T00:06:07.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.317+0000 7fdb0aa46700 1 -- 192.168.123.103:0/1422980227 shutdown_connections
2026-03-09T00:06:07.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.317+0000 7fdb0aa46700 1 --2- 192.168.123.103:0/1422980227 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb041089a0 0x7fdb0410be70 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:07.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.317+0000 7fdb0aa46700 1 --2- 192.168.123.103:0/1422980227 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb04107ff0 0x7fdb041083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:07.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.317+0000 7fdb0aa46700 1 -- 192.168.123.103:0/1422980227 >> 192.168.123.103:0/1422980227 conn(0x7fdb0406ce20 msgr2=0x7fdb0406d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:06:07.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.317+0000 7fdb0aa46700 1 -- 192.168.123.103:0/1422980227 shutdown_connections
2026-03-09T00:06:07.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.317+0000 7fdb0aa46700 1 -- 192.168.123.103:0/1422980227 wait complete.
2026-03-09T00:06:07.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.317+0000 7fdb0aa46700 1 Processor -- start
2026-03-09T00:06:07.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.317+0000 7fdb0aa46700 1 -- start start
2026-03-09T00:06:07.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.318+0000 7fdb0aa46700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb04107ff0 0x7fdb04133250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:06:07.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.318+0000 7fdb0aa46700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb04133790 0x7fdb04133c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:06:07.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.318+0000 7fdb0aa46700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdb0407ef10 con 0x7fdb04107ff0
2026-03-09T00:06:07.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.318+0000 7fdb0aa46700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdb0407f080 con 0x7fdb04133790
2026-03-09T00:06:07.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.318+0000 7fdb03fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb04107ff0 0x7fdb04133250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:06:07.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.318+0000 7fdb03fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb04107ff0 0x7fdb04133250 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:50396/0 (socket says 192.168.123.103:50396)
2026-03-09T00:06:07.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.318+0000 7fdb03fff700 1 -- 192.168.123.103:0/689389516 learned_addr learned my addr 192.168.123.103:0/689389516 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:06:07.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.318+0000 7fdb037fe700 1 --2- 192.168.123.103:0/689389516 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb04133790 0x7fdb04133c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:06:07.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.318+0000 7fdb03fff700 1 -- 192.168.123.103:0/689389516 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb04133790 msgr2=0x7fdb04133c10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:07.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.318+0000 7fdb03fff700 1 --2- 192.168.123.103:0/689389516 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb04133790 0x7fdb04133c10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:07.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.318+0000 7fdb03fff700 1 -- 192.168.123.103:0/689389516 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdafc007ed0 con 0x7fdb04107ff0
2026-03-09T00:06:07.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.318+0000 7fdb03fff700 1 --2- 192.168.123.103:0/689389516 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb04107ff0 0x7fdb04133250 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fdaf400d8d0 tx=0x7fdaf400dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:06:07.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.319+0000 7fdb017fa700 1 -- 192.168.123.103:0/689389516 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdaf4009880 con 0x7fdb04107ff0
2026-03-09T00:06:07.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.319+0000 7fdb0aa46700 1 -- 192.168.123.103:0/689389516 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdb0407f310 con 0x7fdb04107ff0
2026-03-09T00:06:07.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.319+0000 7fdb0aa46700 1 -- 192.168.123.103:0/689389516 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdb0407f860 con 0x7fdb04107ff0
2026-03-09T00:06:07.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.319+0000 7fdb017fa700 1 -- 192.168.123.103:0/689389516 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fdaf4010460 con 0x7fdb04107ff0
2026-03-09T00:06:07.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.320+0000 7fdb017fa700 1 -- 192.168.123.103:0/689389516 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdaf400f5d0 con 0x7fdb04107ff0
2026-03-09T00:06:07.319 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.320+0000 7fdaeaffd700 1 -- 192.168.123.103:0/689389516 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdaf0005320 con 0x7fdb04107ff0
2026-03-09T00:06:07.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.326+0000 7fdb017fa700 1 -- 192.168.123.103:0/689389516 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fdaf40099e0 con 0x7fdb04107ff0
2026-03-09T00:06:07.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.326+0000 7fdb017fa700 1 --2- 192.168.123.103:0/689389516 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fdaec077a40 0x7fdaec079f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:06:07.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.327+0000 7fdb037fe700 1 --2- 192.168.123.103:0/689389516 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fdaec077a40 0x7fdaec079f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:06:07.326 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.327+0000 7fdb037fe700 1 --2- 192.168.123.103:0/689389516 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fdaec077a40 0x7fdaec079f00 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fdafc00db80 tx=0x7fdafc006040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:06:07.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.329+0000 7fdb017fa700 1 -- 192.168.123.103:0/689389516 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(44..44 src has 1..44) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fdaf4066a30 con 0x7fdb04107ff0
2026-03-09T00:06:07.336 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.337+0000 7fdb017fa700 1 -- 192.168.123.103:0/689389516 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fdaf4062450 con 0x7fdb04107ff0
2026-03-09T00:06:07.431 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:07 vm03.local ceph-mon[129670]: from='client.44109 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:06:07.431 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:07 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2272248968' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:06:07.431 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:07 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2250972000' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T00:06:07.431 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:07 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:07.431 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:07 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:07.431 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:07 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:06:07.534 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:07 vm06.local ceph-mon[106218]: from='client.44109 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:06:07.534 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:07 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/2272248968' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:06:07.534 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:07 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/2250972000' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T00:06:07.534 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:07 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:07.534 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:07 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:07.534 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:07 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:06:07.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.620+0000 7fdaeaffd700 1 -- 192.168.123.103:0/689389516 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fdaf0005190 con 0x7fdb04107ff0
2026-03-09T00:06:07.620 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.620+0000 7fdb017fa700 1 -- 192.168.123.103:0/689389516 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7fdaf40160e0 con 0x7fdb04107ff0
2026-03-09T00:06:07.620 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data
2026-03-09T00:06:07.620 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data
2026-03-09T00:06:07.620 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled.
2026-03-09T00:06:07.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.623+0000 7fdb0aa46700 1 -- 192.168.123.103:0/689389516 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fdaec077a40 msgr2=0x7fdaec079f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:07.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.623+0000 7fdb0aa46700 1 --2- 192.168.123.103:0/689389516 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fdaec077a40 0x7fdaec079f00 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7fdafc00db80 tx=0x7fdafc006040 comp rx=0 tx=0).stop
2026-03-09T00:06:07.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.623+0000 7fdb0aa46700 1 -- 192.168.123.103:0/689389516 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb04107ff0 msgr2=0x7fdb04133250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:07.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.623+0000 7fdb0aa46700 1 --2- 192.168.123.103:0/689389516 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb04107ff0 0x7fdb04133250 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fdaf400d8d0 tx=0x7fdaf400dbe0 comp rx=0 tx=0).stop
2026-03-09T00:06:07.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.623+0000 7fdb0aa46700 1 -- 192.168.123.103:0/689389516 shutdown_connections
2026-03-09T00:06:07.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.623+0000 7fdb0aa46700 1 --2- 192.168.123.103:0/689389516 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fdaec077a40 0x7fdaec079f00 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:07.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.623+0000 7fdb0aa46700 1 --2- 192.168.123.103:0/689389516 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fdb04107ff0 0x7fdb04133250 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:07.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.623+0000 7fdb0aa46700 1 --2- 192.168.123.103:0/689389516 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fdb04133790 0x7fdb04133c10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:07.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.623+0000 7fdb0aa46700 1 -- 192.168.123.103:0/689389516 >> 192.168.123.103:0/689389516 conn(0x7fdb0406ce20 msgr2=0x7fdb040705f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:06:07.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.623+0000 7fdb0aa46700 1 -- 192.168.123.103:0/689389516 shutdown_connections
2026-03-09T00:06:07.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:07.623+0000 7fdb0aa46700 1 -- 192.168.123.103:0/689389516 wait complete.
2026-03-09T00:06:08.546 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:08 vm03.local ceph-mon[129670]: from='client.44115 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:06:08.546 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:08 vm03.local ceph-mon[129670]: pgmap v15: 65 pgs: 65 active+clean; 258 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 918 KiB/s wr, 311 op/s
2026-03-09T00:06:08.546 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:08 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/689389516' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T00:06:08.546 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:08 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:08.546 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:08 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:08.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:08 vm06.local ceph-mon[106218]: from='client.44115 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:06:08.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:08 vm06.local ceph-mon[106218]: pgmap v15: 65 pgs: 65 active+clean; 258 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 918 KiB/s wr, 311 op/s
2026-03-09T00:06:08.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:08 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/689389516' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T00:06:08.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:08 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:08.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:08 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:09.593 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:09 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:09.593 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:09 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:09.593 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:09 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:09.593 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:09 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:09.630 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:09 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:09.630 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:09 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:09.630 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:09 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:09.630 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:09 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:11.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:10 vm03.local ceph-mon[129670]: pgmap v16: 65 pgs: 65 active+clean; 258 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 917 KiB/s wr, 311 op/s
2026-03-09T00:06:11.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:10 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:11.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:10 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:11.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:10 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:11.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:10 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:11.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:10 vm06.local ceph-mon[106218]: pgmap v16: 65 pgs: 65 active+clean; 258 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 917 KiB/s wr, 311 op/s
2026-03-09T00:06:11.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:10 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:11.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:10 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:11.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:10 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:11.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:10 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:12.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:12 vm03.local ceph-mon[129670]: pgmap v17: 65 pgs: 65 active+clean; 261 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1.2 MiB/s wr, 455 op/s
2026-03-09T00:06:12.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:12 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:12.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:12 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:12.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:12 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:12.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:12 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:06:12.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:12 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:12.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:12 vm06.local ceph-mon[106218]: pgmap v17: 65 pgs: 65 active+clean; 261 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1.2 MiB/s wr, 455 op/s
2026-03-09T00:06:12.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:12 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:12.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:12 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:12.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:12 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:12.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:12 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:06:12.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:12 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: Upgrade: Setting container_image for all crash
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm03"}]: dispatch
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm03"}]': finished
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm06"}]: dispatch
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm06"}]': finished
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: Upgrade: osd.0 is safe to restart
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: Upgrade: Updating osd.0
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:06:13.107 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-mon[129670]: Deploying daemon osd.0 on vm03
2026-03-09T00:06:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:06:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:06:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:06:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:06:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: Upgrade: Setting container_image for all crash
2026-03-09T00:06:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:06:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm03"}]: dispatch
2026-03-09T00:06:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm03"}]': finished
2026-03-09T00:06:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm06"}]: dispatch
2026-03-09T00:06:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm06"}]': finished
2026-03-09T00:06:13.421
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-09T00:06:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-09T00:06:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: Upgrade: osd.0 is safe to restart 2026-03-09T00:06:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: Upgrade: Updating osd.0 2026-03-09T00:06:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-09T00:06:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:06:13.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:13 vm06.local ceph-mon[106218]: Deploying daemon osd.0 on vm03 2026-03-09T00:06:13.838 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:13 vm03.local systemd[1]: Stopping Ceph osd.0 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 2026-03-09T00:06:13.838 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0[69755]: 2026-03-09T00:06:13.669+0000 7f2560098700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T00:06:13.838 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0[69755]: 2026-03-09T00:06:13.669+0000 7f2560098700 -1 osd.0 44 *** Got signal Terminated *** 2026-03-09T00:06:13.838 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:13 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0[69755]: 2026-03-09T00:06:13.669+0000 7f2560098700 -1 osd.0 44 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T00:06:14.338 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local podman[138723]: 2026-03-09 00:06:14.179675345 +0000 UTC m=+0.532612334 container died 7582c56d43e394a1b5fb6cf569d46e4071e61295562ad6be1d542a9242f9a437 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0, org.label-schema.vendor=CentOS, GIT_CLEAN=True, org.label-schema.build-date=20240222, org.label-schema.license=GPLv2, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, ceph=True, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=-18.2.1, GIT_BRANCH=HEAD, org.label-schema.schema-version=1.0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, org.label-schema.name=CentOS Stream 8 Base Image) 2026-03-09T00:06:14.338 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local podman[138723]: 2026-03-09 00:06:14.220687988 +0000 
UTC m=+0.573624977 container remove 7582c56d43e394a1b5fb6cf569d46e4071e61295562ad6be1d542a9242f9a437 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0, maintainer=Guillaume Abrioux , GIT_BRANCH=HEAD, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.1, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, io.buildah.version=1.29.1, org.label-schema.build-date=20240222, GIT_CLEAN=True, ceph=True) 2026-03-09T00:06:14.338 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local bash[138723]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0 2026-03-09T00:06:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:14 vm03.local ceph-mon[129670]: pgmap v18: 65 pgs: 65 active+clean; 261 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 17 KiB/s rd, 836 KiB/s wr, 314 op/s 2026-03-09T00:06:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:14 vm03.local ceph-mon[129670]: osd.0 marked itself down and dead 2026-03-09T00:06:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:14 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:06:14.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:14 vm06.local ceph-mon[106218]: pgmap v18: 65 pgs: 65 active+clean; 261 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 17 KiB/s rd, 836 KiB/s wr, 314 op/s 2026-03-09T00:06:14.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:14 vm06.local ceph-mon[106218]: osd.0 marked itself down and dead 2026-03-09T00:06:14.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:14 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:06:14.678 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local podman[138787]: 2026-03-09 00:06:14.406341902 +0000 UTC m=+0.019940338 container create 4d7e77a8f565a59f82f589005f9a5feabf816b0d320df83529b65f002a98fe61 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-deactivate, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) 2026-03-09T00:06:14.679 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local podman[138787]: 2026-03-09 00:06:14.466023515 +0000 UTC m=+0.079621940 container init 4d7e77a8f565a59f82f589005f9a5feabf816b0d320df83529b65f002a98fe61 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.build-date=20260223) 2026-03-09T00:06:14.679 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local podman[138787]: 2026-03-09 00:06:14.47734984 +0000 UTC m=+0.090948276 container start 4d7e77a8f565a59f82f589005f9a5feabf816b0d320df83529b65f002a98fe61 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T00:06:14.679 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local podman[138787]: 2026-03-09 00:06:14.485438358 +0000 UTC m=+0.099036794 container attach 4d7e77a8f565a59f82f589005f9a5feabf816b0d320df83529b65f002a98fe61 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T00:06:14.679 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local podman[138787]: 2026-03-09 00:06:14.397537465 +0000 UTC m=+0.011135910 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T00:06:14.679 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local conmon[138799]: conmon 4d7e77a8f565a59f82f5 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4d7e77a8f565a59f82f589005f9a5feabf816b0d320df83529b65f002a98fe61.scope/container/memory.events 2026-03-09T00:06:14.679 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local podman[138787]: 2026-03-09 
00:06:14.620219947 +0000 UTC m=+0.233818383 container died 4d7e77a8f565a59f82f589005f9a5feabf816b0d320df83529b65f002a98fe61 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T00:06:14.679 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local podman[138787]: 2026-03-09 00:06:14.641171546 +0000 UTC m=+0.254769982 container remove 4d7e77a8f565a59f82f589005f9a5feabf816b0d320df83529b65f002a98fe61 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-deactivate, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T00:06:14.679 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.0.service: Deactivated successfully. 2026-03-09T00:06:14.679 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.0.service: Unit process 138799 (conmon) remains running after unit stopped. 2026-03-09T00:06:14.679 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.0.service: Unit process 138808 (podman) remains running after unit stopped. 2026-03-09T00:06:14.679 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local systemd[1]: Stopped Ceph osd.0 for ae8f0172-1b4a-11f1-916a-712b2ac006b7. 2026-03-09T00:06:14.679 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.0.service: Consumed 39.812s CPU time, 625.5M memory peak. 2026-03-09T00:06:14.963 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local systemd[1]: Starting Ceph osd.0 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 
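
The block above is cephadm's per-daemon restart during the staggered upgrade: the mgr confirms "osd ok-to-stop" for osd.0, drops the per-daemon container_image override, stops the systemd unit (osd_fast_shutdown makes the daemon exit immediately on SIGTERM), runs a short-lived -deactivate container, and then redeploys osd.0 from the new image. A minimal sketch of how to watch the same flow by hand, assuming root access on vm03; the fsid is the one appearing in this run's unit names:

    # Sketch: observe a cephadm-managed OSD redeploy on the host.
    FSID=ae8f0172-1b4a-11f1-916a-712b2ac006b7
    sudo systemctl status "ceph-${FSID}@osd.0.service"    # the per-daemon unit cephadm stops and starts
    sudo journalctl -fu "ceph-${FSID}@osd.0.service"      # same stream captured above as journalctl@ceph.osd.0
    sudo podman ps -a --filter "name=ceph-${FSID}-osd-0"  # container create/start/died/remove events
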
2026-03-09T00:06:15.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:15 vm03.local ceph-mon[129670]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T00:06:15.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:15 vm03.local ceph-mon[129670]: osdmap e45: 6 total, 5 up, 6 in 2026-03-09T00:06:15.338 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:14 vm03.local podman[138887]: 2026-03-09 00:06:14.963378273 +0000 UTC m=+0.018122996 container create 6572aafc0c7de61c26fd75abbc9324cdd323ca859c75a529a3faad64ee8d8ac5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, ceph=True) 2026-03-09T00:06:15.338 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local podman[138887]: 2026-03-09 00:06:15.044335843 +0000 UTC m=+0.099080575 container init 6572aafc0c7de61c26fd75abbc9324cdd323ca859c75a529a3faad64ee8d8ac5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T00:06:15.338 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local podman[138887]: 2026-03-09 00:06:15.047929818 +0000 UTC m=+0.102674541 container start 6572aafc0c7de61c26fd75abbc9324cdd323ca859c75a529a3faad64ee8d8ac5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T00:06:15.338 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local podman[138887]: 2026-03-09 
00:06:15.048830053 +0000 UTC m=+0.103574776 container attach 6572aafc0c7de61c26fd75abbc9324cdd323ca859c75a529a3faad64ee8d8ac5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T00:06:15.338 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local podman[138887]: 2026-03-09 00:06:14.95586323 +0000 UTC m=+0.010607953 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T00:06:15.338 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate[138900]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:06:15.338 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local bash[138887]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:06:15.339 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate[138900]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:06:15.339 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local bash[138887]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:06:15.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:15 vm06.local ceph-mon[106218]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T00:06:15.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:15 vm06.local ceph-mon[106218]: osdmap e45: 6 total, 5 up, 6 in 2026-03-09T00:06:16.258 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate[138900]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T00:06:16.258 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local bash[138887]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T00:06:16.258 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local bash[138887]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:06:16.259 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate[138900]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:06:16.259 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate[138900]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:06:16.259 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local bash[138887]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:06:16.259 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local 
ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate[138900]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-09T00:06:16.259 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local bash[138887]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-09T00:06:16.259 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate[138900]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-6f891384-88a6-43d3-8de2-3ac3b784b5c8/osd-block-1eefdd28-e5a7-4e98-a454-60c0bb654070 --path /var/lib/ceph/osd/ceph-0 --no-mon-config 2026-03-09T00:06:16.259 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:15 vm03.local bash[138887]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-6f891384-88a6-43d3-8de2-3ac3b784b5c8/osd-block-1eefdd28-e5a7-4e98-a454-60c0bb654070 --path /var/lib/ceph/osd/ceph-0 --no-mon-config 2026-03-09T00:06:16.588 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate[138900]: Running command: /usr/bin/ln -snf /dev/ceph-6f891384-88a6-43d3-8de2-3ac3b784b5c8/osd-block-1eefdd28-e5a7-4e98-a454-60c0bb654070 /var/lib/ceph/osd/ceph-0/block 2026-03-09T00:06:16.588 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local bash[138887]: Running command: /usr/bin/ln -snf /dev/ceph-6f891384-88a6-43d3-8de2-3ac3b784b5c8/osd-block-1eefdd28-e5a7-4e98-a454-60c0bb654070 /var/lib/ceph/osd/ceph-0/block 2026-03-09T00:06:16.588 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate[138900]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block 2026-03-09T00:06:16.588 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local bash[138887]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block 2026-03-09T00:06:16.588 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate[138900]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-09T00:06:16.588 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local bash[138887]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-09T00:06:16.588 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate[138900]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-09T00:06:16.588 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local bash[138887]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-09T00:06:16.588 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate[138900]: --> ceph-volume lvm activate successful for osd ID: 0 2026-03-09T00:06:16.588 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local bash[138887]: --> ceph-volume lvm activate successful for osd ID: 0 2026-03-09T00:06:16.588 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local podman[139123]: 2026-03-09 00:06:16.324457047 +0000 UTC m=+0.027616984 container died 6572aafc0c7de61c26fd75abbc9324cdd323ca859c75a529a3faad64ee8d8ac5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate, 
org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T00:06:16.588 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local podman[139123]: 2026-03-09 00:06:16.358713568 +0000 UTC m=+0.061873494 container remove 6572aafc0c7de61c26fd75abbc9324cdd323ca859c75a529a3faad64ee8d8ac5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS) 2026-03-09T00:06:16.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:16 vm03.local ceph-mon[129670]: osdmap e46: 6 total, 5 up, 6 in 2026-03-09T00:06:16.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:16 vm03.local ceph-mon[129670]: pgmap v21: 65 pgs: 34 peering, 31 active+clean; 250 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 17 KiB/s rd, 1.1 MiB/s wr, 468 op/s 2026-03-09T00:06:16.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:16 vm06.local ceph-mon[106218]: osdmap e46: 6 total, 5 up, 6 in 2026-03-09T00:06:16.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:16 vm06.local ceph-mon[106218]: pgmap v21: 65 pgs: 34 peering, 31 active+clean; 250 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 17 KiB/s rd, 1.1 MiB/s wr, 468 op/s 2026-03-09T00:06:16.951 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local podman[139163]: 2026-03-09 00:06:16.620347545 +0000 UTC m=+0.043889829 container create 7112eceae9ce23ba4a76bf3a63ec90108cba7b6f6f6affc826b6fe007a09b262 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2) 2026-03-09T00:06:16.951 
INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local podman[139163]: 2026-03-09 00:06:16.656514159 +0000 UTC m=+0.080056453 container init 7112eceae9ce23ba4a76bf3a63ec90108cba7b6f6f6affc826b6fe007a09b262 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T00:06:16.951 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local podman[139163]: 2026-03-09 00:06:16.660788198 +0000 UTC m=+0.084330482 container start 7112eceae9ce23ba4a76bf3a63ec90108cba7b6f6f6affc826b6fe007a09b262 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T00:06:16.951 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local bash[139163]: 7112eceae9ce23ba4a76bf3a63ec90108cba7b6f6f6affc826b6fe007a09b262 2026-03-09T00:06:16.951 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local podman[139163]: 2026-03-09 00:06:16.612059925 +0000 UTC m=+0.035602219 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T00:06:16.951 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local systemd[1]: Started Ceph osd.0 for ae8f0172-1b4a-11f1-916a-712b2ac006b7. 
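
Before the new osd.0 container comes up, an -activate helper rebuilds the daemon's data directory from LVM metadata: ceph-bluestore-tool prime-osd-dir, a block symlink to the osd-block logical volume, and chown to ceph:ceph. The earlier "Failed to activate via raw" line is the expected fallback order rather than an error: ceph-volume tries raw mode first, finds no matching raw OSD, then succeeds with "lvm activate". A sketch for inspecting the same layout, assuming cephadm is installed on the host (the host-side path below follows the usual cephadm convention for this fsid):

    # Sketch: inspect the LVM-backed OSD the activate step rebuilt (run on vm03).
    sudo cephadm shell -- ceph-volume lvm list   # LVs with their ceph.osd_id / ceph.osd_fsid tags
    sudo ls -l /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/osd.0/  # block -> LV symlink, keyring, etc.
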
2026-03-09T00:06:16.951 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:16 vm03.local ceph-osd[139177]: -- 192.168.123.103:0/753460053 <== mon.1 v2:192.168.123.106:3300/0 4 ==== auth_reply(proto 2 0 (0) Success) ==== 194+0+0 (secure 0 0 0) 0x5638f4aba960 con 0x5638f4a99c00 2026-03-09T00:06:17.284 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:17 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0[139173]: 2026-03-09T00:06:17.282+0000 7fc7cfd04740 -1 Falling back to public interface 2026-03-09T00:06:17.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:17 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:17.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:17 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:17.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:17 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:06:17.971 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:17 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:17.971 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:17 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:17.971 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:17 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:06:19.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:18 vm03.local ceph-mon[129670]: pgmap v22: 65 pgs: 34 peering, 31 active+clean; 250 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 17 KiB/s rd, 1.1 MiB/s wr, 468 op/s 2026-03-09T00:06:19.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:18 vm06.local ceph-mon[106218]: pgmap v22: 65 pgs: 34 peering, 31 active+clean; 250 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 17 KiB/s rd, 1.1 MiB/s wr, 468 op/s 2026-03-09T00:06:20.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:19 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:20.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:19 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:20.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:19 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:20.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:19 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:20.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:19 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:20.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:19 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:20.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:19 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:20.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 
09 00:06:19 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:21.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:20 vm06.local ceph-mon[106218]: pgmap v23: 65 pgs: 34 peering, 31 active+clean; 250 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 7.7 KiB/s rd, 603 KiB/s wr, 253 op/s 2026-03-09T00:06:21.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:21.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:21.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:06:21.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:06:21.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:21.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:20 vm03.local ceph-mon[129670]: pgmap v23: 65 pgs: 34 peering, 31 active+clean; 250 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 7.7 KiB/s rd, 603 KiB/s wr, 253 op/s 2026-03-09T00:06:21.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:21.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:21.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:06:21.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:06:21.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:21.962 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.1... 
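
With osd.0 barely back, the orchestrator immediately asks whether osd.1 may be stopped, and the mons refuse ("unsafe to stop osd(s) at this time (15 PGs are or would become offline)") because osd.0's PGs are still peering and degraded; the upgrade therefore waits and retries instead of taking a second OSD down. The same gate can be exercised manually, a sketch:

    # Sketch: the safety check gating each OSD of the staggered upgrade.
    ceph osd ok-to-stop 1     # returns an error while stopping osd.1 would leave PGs unavailable
    ceph -s                   # the peering/degraded PG counts behind the "unsafe" verdict
    ceph orch upgrade status  # shows the upgrade paused on the check rather than failed
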
2026-03-09T00:06:21.963 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.1 /home/ubuntu/cephtest/clone.client.1 2026-03-09T00:06:22.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:06:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:06:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:06:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:06:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:06:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:21 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:06:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:21 vm06.local ceph-mon[106218]: Upgrade: unsafe to stop osd(s) at this time (15 PGs are or would become offline) 2026-03-09T00:06:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:21 vm06.local ceph-mon[106218]: pgmap v24: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 237 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 23 KiB/s rd, 1.2 MiB/s wr, 590 op/s; 1155/7365 objects degraded (15.682%) 2026-03-09T00:06:22.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:06:22.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:06:22.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:06:22.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:06:22.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:06:22.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:21 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:06:22.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:21 vm03.local ceph-mon[129670]: Upgrade: unsafe to stop osd(s) at this time (15 PGs are or would become offline) 2026-03-09T00:06:22.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:21 vm03.local ceph-mon[129670]: pgmap v24: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 237 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 23 KiB/s rd, 1.2 MiB/s wr, 590 op/s; 1155/7365 objects degraded (15.682%) 2026-03-09T00:06:22.360 DEBUG:teuthology.parallel:result is None 2026-03-09T00:06:23.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:23 vm03.local ceph-mon[129670]: Health check failed: Degraded data redundancy: 1155/7365 objects degraded (15.682%), 34 pgs degraded (PG_DEGRADED) 2026-03-09T00:06:23.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:23 vm06.local ceph-mon[106218]: Health check failed: Degraded data redundancy: 1155/7365 objects degraded (15.682%), 34 pgs degraded (PG_DEGRADED) 2026-03-09T00:06:23.838 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:23 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0[139173]: 2026-03-09T00:06:23.340+0000 7fc7cfd04740 -1 osd.0 0 read_superblock omap replica is missing. 2026-03-09T00:06:24.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:24 vm03.local ceph-mon[129670]: pgmap v25: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 237 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 19 KiB/s rd, 1.0 MiB/s wr, 497 op/s; 1155/7365 objects degraded (15.682%) 2026-03-09T00:06:24.338 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:24 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0[139173]: 2026-03-09T00:06:24.129+0000 7fc7cfd04740 -1 osd.0 44 log_to_monitors true 2026-03-09T00:06:24.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:24 vm06.local ceph-mon[106218]: pgmap v25: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 237 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 19 KiB/s rd, 1.0 MiB/s wr, 497 op/s; 1155/7365 objects degraded (15.682%) 2026-03-09T00:06:26.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:25 vm06.local ceph-mon[106218]: from='osd.0 [v2:192.168.123.103:6802/3333106348,v1:192.168.123.103:6803/3333106348]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-09T00:06:26.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:25 vm06.local ceph-mon[106218]: from='osd.0 ' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-09T00:06:26.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:25 vm03.local ceph-mon[129670]: from='osd.0 [v2:192.168.123.103:6802/3333106348,v1:192.168.123.103:6803/3333106348]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-09T00:06:26.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:25 vm03.local ceph-mon[129670]: from='osd.0 ' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-09T00:06:27.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:26 vm06.local ceph-mon[106218]: pgmap v26: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 230 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1.1 MiB/s wr, 577 op/s; 591/3729 objects degraded (15.849%) 2026-03-09T00:06:27.171 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:26 vm06.local ceph-mon[106218]: from='osd.0 ' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-09T00:06:27.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:26 vm06.local ceph-mon[106218]: from='osd.0 [v2:192.168.123.103:6802/3333106348,v1:192.168.123.103:6803/3333106348]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T00:06:27.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:26 vm06.local ceph-mon[106218]: osdmap e47: 6 total, 5 up, 6 in 2026-03-09T00:06:27.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:26 vm06.local ceph-mon[106218]: from='osd.0 ' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T00:06:27.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:26 vm03.local ceph-mon[129670]: pgmap v26: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 230 MiB data, 3.4 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1.1 MiB/s wr, 577 op/s; 591/3729 objects degraded (15.849%) 2026-03-09T00:06:27.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:26 vm03.local ceph-mon[129670]: from='osd.0 ' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-09T00:06:27.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:26 vm03.local ceph-mon[129670]: from='osd.0 [v2:192.168.123.103:6802/3333106348,v1:192.168.123.103:6803/3333106348]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T00:06:27.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:26 vm03.local ceph-mon[129670]: osdmap e47: 6 total, 5 up, 6 in 2026-03-09T00:06:27.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:26 vm03.local ceph-mon[129670]: from='osd.0 ' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T00:06:28.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:27 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 591/3729 objects degraded (15.849%), 34 pgs degraded (PG_DEGRADED) 2026-03-09T00:06:28.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:27 vm03.local ceph-mon[129670]: from='osd.0 ' entity='osd.0' 2026-03-09T00:06:28.088 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:06:27 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0[139173]: 2026-03-09T00:06:27.809+0000 7fc7c729d640 -1 osd.0 44 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T00:06:28.093 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.0... 
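
The "osd crush set-device-class" / "osd crush create-or-move" pair is the normal OSD boot path: on startup osd.0 re-asserts its device class (hdd) and CRUSH position (weight 0.0195 under host=vm03, root=default) before reporting boot, at which point the mon clears OSD_DOWN and recovery starts draining the degraded objects. A quick sketch for confirming the result from any client node:

    # Sketch: confirm osd.0 re-registered after its restart.
    ceph osd tree       # osd.0 up again, class hdd, under host vm03
    ceph health detail  # PG_DEGRADED shrinks, then clears, as recovery completes
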
2026-03-09T00:06:28.093 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0 2026-03-09T00:06:28.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:27 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 591/3729 objects degraded (15.849%), 34 pgs degraded (PG_DEGRADED) 2026-03-09T00:06:28.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:27 vm06.local ceph-mon[106218]: from='osd.0 ' entity='osd.0' 2026-03-09T00:06:28.442 DEBUG:teuthology.parallel:result is None 2026-03-09T00:06:28.442 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0 2026-03-09T00:06:28.469 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0 2026-03-09T00:06:28.469 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1 2026-03-09T00:06:28.494 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.1/client.1 2026-03-09T00:06:28.494 DEBUG:teuthology.parallel:result is None 2026-03-09T00:06:29.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:28 vm03.local ceph-mon[129670]: pgmap v28: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 230 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 1.1 MiB/s wr, 605 op/s; 591/3729 objects degraded (15.849%) 2026-03-09T00:06:29.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:28 vm03.local ceph-mon[129670]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T00:06:29.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:28 vm03.local ceph-mon[129670]: osd.0 [v2:192.168.123.103:6802/3333106348,v1:192.168.123.103:6803/3333106348] boot 2026-03-09T00:06:29.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:28 vm03.local ceph-mon[129670]: osdmap e48: 6 total, 6 up, 6 in 2026-03-09T00:06:29.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:28 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-09T00:06:29.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:28 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:29.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:28 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:06:29.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:28 vm06.local ceph-mon[106218]: pgmap v28: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 230 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 1.1 MiB/s wr, 605 op/s; 591/3729 objects degraded (15.849%) 2026-03-09T00:06:29.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:28 vm06.local ceph-mon[106218]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T00:06:29.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:28 vm06.local ceph-mon[106218]: osd.0 [v2:192.168.123.103:6802/3333106348,v1:192.168.123.103:6803/3333106348] boot 2026-03-09T00:06:29.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:28 vm06.local ceph-mon[106218]: osdmap e48: 6 total, 6 up, 6 in 2026-03-09T00:06:29.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:28 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", 
"id": 0}]: dispatch 2026-03-09T00:06:29.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:28 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:06:29.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:28 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:06:30.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:29 vm03.local ceph-mon[129670]: osdmap e49: 6 total, 6 up, 6 in 2026-03-09T00:06:30.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:29 vm03.local ceph-mon[129670]: pgmap v31: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 230 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.0 MiB/s wr, 560 op/s; 591/3729 objects degraded (15.849%) 2026-03-09T00:06:30.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:29 vm03.local ceph-mon[129670]: osdmap e50: 6 total, 6 up, 6 in 2026-03-09T00:06:30.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:29 vm06.local ceph-mon[106218]: osdmap e49: 6 total, 6 up, 6 in 2026-03-09T00:06:30.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:29 vm06.local ceph-mon[106218]: pgmap v31: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 230 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.0 MiB/s wr, 560 op/s; 591/3729 objects degraded (15.849%) 2026-03-09T00:06:30.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:29 vm06.local ceph-mon[106218]: osdmap e50: 6 total, 6 up, 6 in 2026-03-09T00:06:32.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:32 vm06.local ceph-mon[106218]: pgmap v33: 65 pgs: 1 active+recovering+degraded, 16 remapped+peering, 14 active+recovery_wait+degraded, 34 active+clean; 216 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 740 KiB/s wr, 185 op/s; 196/1464 objects degraded (13.388%); 57 KiB/s, 1 keys/s, 5 objects/s recovering 2026-03-09T00:06:32.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:32 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 196/1464 objects degraded (13.388%), 15 pgs degraded (PG_DEGRADED) 2026-03-09T00:06:33.041 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:32 vm03.local ceph-mon[129670]: pgmap v33: 65 pgs: 1 active+recovering+degraded, 16 remapped+peering, 14 active+recovery_wait+degraded, 34 active+clean; 216 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 740 KiB/s wr, 185 op/s; 196/1464 objects degraded (13.388%); 57 KiB/s, 1 keys/s, 5 objects/s recovering 2026-03-09T00:06:33.041 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:32 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 196/1464 objects degraded (13.388%), 15 pgs degraded (PG_DEGRADED) 2026-03-09T00:06:34.978 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:34 vm06.local ceph-mon[106218]: pgmap v34: 65 pgs: 1 active+recovering+degraded, 16 remapped+peering, 14 active+recovery_wait+degraded, 34 active+clean; 216 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 19 KiB/s rd, 705 KiB/s wr, 176 op/s; 196/1464 objects degraded (13.388%); 54 KiB/s, 1 keys/s, 5 objects/s recovering 2026-03-09T00:06:35.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:34 vm03.local ceph-mon[129670]: pgmap v34: 65 pgs: 1 active+recovering+degraded, 16 remapped+peering, 14 active+recovery_wait+degraded, 34 active+clean; 216 MiB data, 3.3 GiB used, 117 GiB / 
120 GiB avail; 19 KiB/s rd, 705 KiB/s wr, 176 op/s; 196/1464 objects degraded (13.388%); 54 KiB/s, 1 keys/s, 5 objects/s recovering 2026-03-09T00:06:36.947 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:36 vm03.local ceph-mon[129670]: pgmap v35: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 11 active+recovery_wait+degraded, 1 active+recovering, 37 active+clean; 213 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 16 KiB/s rd, 563 KiB/s wr, 194 op/s; 1519/237 objects degraded (640.928%); 43 KiB/s, 31 keys/s, 11 objects/s recovering 2026-03-09T00:06:36.947 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:36 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:06:36.947 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:36 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:06:36.947 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:36 vm03.local ceph-mon[129670]: Upgrade: unsafe to stop osd(s) at this time (13 PGs are or would become offline) 2026-03-09T00:06:37.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:36 vm06.local ceph-mon[106218]: pgmap v35: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 11 active+recovery_wait+degraded, 1 active+recovering, 37 active+clean; 213 MiB data, 3.3 GiB used, 117 GiB / 120 GiB avail; 16 KiB/s rd, 563 KiB/s wr, 194 op/s; 1519/237 objects degraded (640.928%); 43 KiB/s, 31 keys/s, 11 objects/s recovering 2026-03-09T00:06:37.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:36 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:06:37.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:36 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:06:37.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:36 vm06.local ceph-mon[106218]: Upgrade: unsafe to stop osd(s) at this time (13 PGs are or would become offline) 2026-03-09T00:06:37.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.701+0000 7fa5da38a700 1 -- 192.168.123.103:0/4130618988 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5d41086b0 msgr2=0x7fa5d41139a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:37.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.701+0000 7fa5da38a700 1 --2- 192.168.123.103:0/4130618988 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5d41086b0 0x7fa5d41139a0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7fa5c4009a60 tx=0x7fa5c4009d70 comp rx=0 tx=0).stop 2026-03-09T00:06:37.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.702+0000 7fa5da38a700 1 -- 192.168.123.103:0/4130618988 shutdown_connections 2026-03-09T00:06:37.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.702+0000 7fa5da38a700 1 --2- 192.168.123.103:0/4130618988 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5d41086b0 0x7fa5d41139a0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:37.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.702+0000 7fa5da38a700 1 --2- 192.168.123.103:0/4130618988 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5d4107d90 0x7fa5d4108170 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:37.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.702+0000 7fa5da38a700 1 -- 192.168.123.103:0/4130618988 >> 192.168.123.103:0/4130618988 conn(0x7fa5d406ce20 msgr2=0x7fa5d406d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:37.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.702+0000 7fa5da38a700 1 -- 192.168.123.103:0/4130618988 shutdown_connections 2026-03-09T00:06:37.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.702+0000 7fa5da38a700 1 -- 192.168.123.103:0/4130618988 wait complete. 
2026-03-09T00:06:37.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.703+0000 7fa5da38a700 1 Processor -- start 2026-03-09T00:06:37.702 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.703+0000 7fa5da38a700 1 -- start start 2026-03-09T00:06:37.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.703+0000 7fa5da38a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5d4107d90 0x7fa5d419cef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:37.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.703+0000 7fa5da38a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5d41086b0 0x7fa5d419d430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:37.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.703+0000 7fa5da38a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa5d419db10 con 0x7fa5d4107d90 2026-03-09T00:06:37.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.703+0000 7fa5da38a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa5d41a18a0 con 0x7fa5d41086b0 2026-03-09T00:06:37.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.704+0000 7fa5d9388700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5d4107d90 0x7fa5d419cef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:37.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.704+0000 7fa5d8b87700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5d41086b0 0x7fa5d419d430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:37.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.704+0000 7fa5d8b87700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5d41086b0 0x7fa5d419d430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:46248/0 (socket says 192.168.123.103:46248) 2026-03-09T00:06:37.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.704+0000 7fa5d8b87700 1 -- 192.168.123.103:0/2154838421 learned_addr learned my addr 192.168.123.103:0/2154838421 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:06:37.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.704+0000 7fa5d9388700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5d4107d90 0x7fa5d419cef0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:40762/0 (socket says 192.168.123.103:40762) 2026-03-09T00:06:37.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.704+0000 7fa5d8b87700 1 -- 192.168.123.103:0/2154838421 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5d4107d90 msgr2=0x7fa5d419cef0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:37.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.704+0000 7fa5d8b87700 1 --2- 192.168.123.103:0/2154838421 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7fa5d4107d90 0x7fa5d419cef0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:37.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.704+0000 7fa5d8b87700 1 -- 192.168.123.103:0/2154838421 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa5c4009710 con 0x7fa5d41086b0 2026-03-09T00:06:37.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.704+0000 7fa5d8b87700 1 --2- 192.168.123.103:0/2154838421 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5d41086b0 0x7fa5d419d430 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fa5c400f690 tx=0x7fa5c400f770 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:06:37.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.705+0000 7fa5ca7fc700 1 -- 192.168.123.103:0/2154838421 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa5c401d070 con 0x7fa5d41086b0 2026-03-09T00:06:37.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.705+0000 7fa5da38a700 1 -- 192.168.123.103:0/2154838421 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa5d41a1b20 con 0x7fa5d41086b0 2026-03-09T00:06:37.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.705+0000 7fa5da38a700 1 -- 192.168.123.103:0/2154838421 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa5d41a2070 con 0x7fa5d41086b0 2026-03-09T00:06:37.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.705+0000 7fa5ca7fc700 1 -- 192.168.123.103:0/2154838421 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa5c400fd00 con 0x7fa5d41086b0 2026-03-09T00:06:37.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.705+0000 7fa5da38a700 1 -- 192.168.123.103:0/2154838421 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa5d41110a0 con 0x7fa5d41086b0 2026-03-09T00:06:37.705 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.706+0000 7fa5ca7fc700 1 -- 192.168.123.103:0/2154838421 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa5c4017600 con 0x7fa5d41086b0 2026-03-09T00:06:37.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.706+0000 7fa5ca7fc700 1 -- 192.168.123.103:0/2154838421 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fa5c4021410 con 0x7fa5d41086b0 2026-03-09T00:06:37.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.707+0000 7fa5ca7fc700 1 --2- 192.168.123.103:0/2154838421 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa5c007be60 0x7fa5c007e320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:37.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.707+0000 7fa5ca7fc700 1 -- 192.168.123.103:0/2154838421 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6600+0+0 (secure 0 0 0) 0x7fa5c409b6b0 con 0x7fa5d41086b0 2026-03-09T00:06:37.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.707+0000 7fa5d9388700 1 --2- 192.168.123.103:0/2154838421 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] 
conn(0x7fa5c007be60 0x7fa5c007e320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:37.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.708+0000 7fa5d9388700 1 --2- 192.168.123.103:0/2154838421 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa5c007be60 0x7fa5c007e320 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fa5d0005fd0 tx=0x7fa5d0009500 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:06:37.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.709+0000 7fa5ca7fc700 1 -- 192.168.123.103:0/2154838421 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa5c4063d40 con 0x7fa5d41086b0 2026-03-09T00:06:37.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.832+0000 7fa5da38a700 1 -- 192.168.123.103:0/2154838421 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fa5d419e360 con 0x7fa5c007be60 2026-03-09T00:06:37.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.833+0000 7fa5ca7fc700 1 -- 192.168.123.103:0/2154838421 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fa5d419e360 con 0x7fa5c007be60 2026-03-09T00:06:37.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.835+0000 7fa5da38a700 1 -- 192.168.123.103:0/2154838421 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa5c007be60 msgr2=0x7fa5c007e320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:37.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.835+0000 7fa5da38a700 1 --2- 192.168.123.103:0/2154838421 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa5c007be60 0x7fa5c007e320 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7fa5d0005fd0 tx=0x7fa5d0009500 comp rx=0 tx=0).stop 2026-03-09T00:06:37.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.836+0000 7fa5da38a700 1 -- 192.168.123.103:0/2154838421 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5d41086b0 msgr2=0x7fa5d419d430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:37.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.836+0000 7fa5da38a700 1 --2- 192.168.123.103:0/2154838421 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5d41086b0 0x7fa5d419d430 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fa5c400f690 tx=0x7fa5c400f770 comp rx=0 tx=0).stop 2026-03-09T00:06:37.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.836+0000 7fa5da38a700 1 -- 192.168.123.103:0/2154838421 shutdown_connections 2026-03-09T00:06:37.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.836+0000 7fa5da38a700 1 --2- 192.168.123.103:0/2154838421 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa5c007be60 0x7fa5c007e320 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:37.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.836+0000 7fa5da38a700 1 --2- 192.168.123.103:0/2154838421 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa5d4107d90 0x7fa5d419cef0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:37.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.836+0000 7fa5da38a700 1 --2- 192.168.123.103:0/2154838421 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa5d41086b0 0x7fa5d419d430 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:37.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.836+0000 7fa5da38a700 1 -- 192.168.123.103:0/2154838421 >> 192.168.123.103:0/2154838421 conn(0x7fa5d406ce20 msgr2=0x7fa5d410e1e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:37.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.836+0000 7fa5da38a700 1 -- 192.168.123.103:0/2154838421 shutdown_connections 2026-03-09T00:06:37.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.836+0000 7fa5da38a700 1 -- 192.168.123.103:0/2154838421 wait complete. 2026-03-09T00:06:37.845 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T00:06:37.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.903+0000 7f5cec72c700 1 -- 192.168.123.103:0/1705525161 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ce40731c0 msgr2=0x7f5ce40735a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:37.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.903+0000 7f5cec72c700 1 --2- 192.168.123.103:0/1705525161 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ce40731c0 0x7f5ce40735a0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f5cd4009b00 tx=0x7f5cd4009e10 comp rx=0 tx=0).stop 2026-03-09T00:06:37.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.903+0000 7f5cec72c700 1 -- 192.168.123.103:0/1705525161 shutdown_connections 2026-03-09T00:06:37.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.903+0000 7f5cec72c700 1 --2- 192.168.123.103:0/1705525161 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ce4073ae0 0x7f5ce410d170 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:37.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.903+0000 7f5cec72c700 1 --2- 192.168.123.103:0/1705525161 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ce40731c0 0x7f5ce40735a0 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:37.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.903+0000 7f5cec72c700 1 -- 192.168.123.103:0/1705525161 >> 192.168.123.103:0/1705525161 conn(0x7f5ce40fc920 msgr2=0x7f5ce40fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:37.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.903+0000 7f5cec72c700 1 -- 192.168.123.103:0/1705525161 shutdown_connections 2026-03-09T00:06:37.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.903+0000 7f5cec72c700 1 -- 192.168.123.103:0/1705525161 wait complete. 
2026-03-09T00:06:37.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.904+0000 7f5cec72c700 1 Processor -- start 2026-03-09T00:06:37.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.904+0000 7f5cec72c700 1 -- start start 2026-03-09T00:06:37.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.904+0000 7f5cec72c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ce4073ae0 0x7f5ce4198ff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:37.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.904+0000 7f5cec72c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ce4199530 0x7f5ce419d9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:37.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.904+0000 7f5cec72c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ce4199b50 con 0x7f5ce4073ae0 2026-03-09T00:06:37.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.904+0000 7f5cec72c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ce4199cc0 con 0x7f5ce4199530 2026-03-09T00:06:37.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.904+0000 7f5ce9cc7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ce4199530 0x7f5ce419d9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:37.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.904+0000 7f5ce9cc7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ce4199530 0x7f5ce419d9a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:46256/0 (socket says 192.168.123.103:46256) 2026-03-09T00:06:37.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.904+0000 7f5ce9cc7700 1 -- 192.168.123.103:0/2075052741 learned_addr learned my addr 192.168.123.103:0/2075052741 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:06:37.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.904+0000 7f5ce9cc7700 1 -- 192.168.123.103:0/2075052741 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ce4073ae0 msgr2=0x7f5ce4198ff0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:37.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.905+0000 7f5cea4c8700 1 --2- 192.168.123.103:0/2075052741 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ce4073ae0 0x7f5ce4198ff0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:37.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.905+0000 7f5ce9cc7700 1 --2- 192.168.123.103:0/2075052741 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ce4073ae0 0x7f5ce4198ff0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:37.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.905+0000 7f5ce9cc7700 1 -- 192.168.123.103:0/2075052741 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f5cd40097e0 con 0x7f5ce4199530 2026-03-09T00:06:37.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.905+0000 7f5cea4c8700 1 --2- 192.168.123.103:0/2075052741 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ce4073ae0 0x7f5ce4198ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:06:37.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.905+0000 7f5ce9cc7700 1 --2- 192.168.123.103:0/2075052741 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ce4199530 0x7f5ce419d9a0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f5ce000eb10 tx=0x7f5ce000eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:06:37.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.905+0000 7f5cdb7fe700 1 -- 192.168.123.103:0/2075052741 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ce000cca0 con 0x7f5ce4199530 2026-03-09T00:06:37.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.905+0000 7f5cdb7fe700 1 -- 192.168.123.103:0/2075052741 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5ce000ce00 con 0x7f5ce4199530 2026-03-09T00:06:37.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.905+0000 7f5cec72c700 1 -- 192.168.123.103:0/2075052741 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5ce419dfa0 con 0x7f5ce4199530 2026-03-09T00:06:37.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.905+0000 7f5cdb7fe700 1 -- 192.168.123.103:0/2075052741 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ce00189c0 con 0x7f5ce4199530 2026-03-09T00:06:37.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.905+0000 7f5cec72c700 1 -- 192.168.123.103:0/2075052741 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5ce419e4f0 con 0x7f5ce4199530 2026-03-09T00:06:37.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.906+0000 7f5cec72c700 1 -- 192.168.123.103:0/2075052741 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5ce410a870 con 0x7f5ce4199530 2026-03-09T00:06:37.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.907+0000 7f5cdb7fe700 1 -- 192.168.123.103:0/2075052741 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f5ce0018b20 con 0x7f5ce4199530 2026-03-09T00:06:37.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.907+0000 7f5cdb7fe700 1 --2- 192.168.123.103:0/2075052741 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5cd00779f0 0x7f5cd0079eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:37.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.908+0000 7f5cdb7fe700 1 -- 192.168.123.103:0/2075052741 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6600+0+0 (secure 0 0 0) 0x7f5ce0014070 con 0x7f5ce4199530 2026-03-09T00:06:37.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.908+0000 7f5cea4c8700 1 --2- 192.168.123.103:0/2075052741 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5cd00779f0 
0x7f5cd0079eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:37.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.908+0000 7f5cea4c8700 1 --2- 192.168.123.103:0/2075052741 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5cd00779f0 0x7f5cd0079eb0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f5cd400b5c0 tx=0x7f5cd4005fd0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:06:37.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:37.909+0000 7f5cdb7fe700 1 -- 192.168.123.103:0/2075052741 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5ce0063370 con 0x7f5ce4199530 2026-03-09T00:06:37.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:37 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 1519/237 objects degraded (640.928%), 27 pgs degraded (PG_DEGRADED) 2026-03-09T00:06:38.033 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.034+0000 7f5cec72c700 1 -- 192.168.123.103:0/2075052741 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f5ce4066ef0 con 0x7f5cd00779f0 2026-03-09T00:06:38.036 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:37 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 1519/237 objects degraded (640.928%), 27 pgs degraded (PG_DEGRADED) 2026-03-09T00:06:38.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.039+0000 7f5cdb7fe700 1 -- 192.168.123.103:0/2075052741 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f5ce4066ef0 con 0x7f5cd00779f0 2026-03-09T00:06:38.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.042+0000 7f5cec72c700 1 -- 192.168.123.103:0/2075052741 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5cd00779f0 msgr2=0x7f5cd0079eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:38.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.042+0000 7f5cec72c700 1 --2- 192.168.123.103:0/2075052741 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5cd00779f0 0x7f5cd0079eb0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f5cd400b5c0 tx=0x7f5cd4005fd0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.042+0000 7f5cec72c700 1 -- 192.168.123.103:0/2075052741 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ce4199530 msgr2=0x7f5ce419d9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:38.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.042+0000 7f5cec72c700 1 --2- 192.168.123.103:0/2075052741 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ce4199530 0x7f5ce419d9a0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f5ce000eb10 tx=0x7f5ce000eed0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.042+0000 7f5cec72c700 1 -- 192.168.123.103:0/2075052741 shutdown_connections 2026-03-09T00:06:38.042 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.042+0000 7f5cec72c700 1 --2- 192.168.123.103:0/2075052741 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5cd00779f0 0x7f5cd0079eb0 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.042+0000 7f5cec72c700 1 --2- 192.168.123.103:0/2075052741 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ce4073ae0 0x7f5ce4198ff0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.042+0000 7f5cec72c700 1 --2- 192.168.123.103:0/2075052741 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ce4199530 0x7f5ce419d9a0 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.042+0000 7f5cec72c700 1 -- 192.168.123.103:0/2075052741 >> 192.168.123.103:0/2075052741 conn(0x7f5ce40fc920 msgr2=0x7f5ce41079b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:38.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.043+0000 7f5cec72c700 1 -- 192.168.123.103:0/2075052741 shutdown_connections 2026-03-09T00:06:38.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.043+0000 7f5cec72c700 1 -- 192.168.123.103:0/2075052741 wait complete. 2026-03-09T00:06:38.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.107+0000 7f674cea3700 1 -- 192.168.123.103:0/1284270857 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6748068730 msgr2=0x7f6748068b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:38.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.107+0000 7f674cea3700 1 --2- 192.168.123.103:0/1284270857 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6748068730 0x7f6748068b10 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f6730009b00 tx=0x7f6730009e10 comp rx=0 tx=0).stop 2026-03-09T00:06:38.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.107+0000 7f674cea3700 1 -- 192.168.123.103:0/1284270857 shutdown_connections 2026-03-09T00:06:38.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.107+0000 7f674cea3700 1 --2- 192.168.123.103:0/1284270857 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f67480690e0 0x7f6748105b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.107+0000 7f674cea3700 1 --2- 192.168.123.103:0/1284270857 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6748068730 0x7f6748068b10 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.107+0000 7f674cea3700 1 -- 192.168.123.103:0/1284270857 >> 192.168.123.103:0/1284270857 conn(0x7f6748075960 msgr2=0x7f6748075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:38.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.107+0000 7f674cea3700 1 -- 192.168.123.103:0/1284270857 shutdown_connections 2026-03-09T00:06:38.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.107+0000 7f674cea3700 1 -- 192.168.123.103:0/1284270857 wait complete. 
2026-03-09T00:06:38.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.108+0000 7f674cea3700 1 Processor -- start 2026-03-09T00:06:38.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.108+0000 7f674cea3700 1 -- start start 2026-03-09T00:06:38.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.108+0000 7f674cea3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6748068730 0x7f6748100010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:38.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.108+0000 7f674cea3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f67480690e0 0x7f6748100550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:38.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.108+0000 7f674cea3700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6748100a90 con 0x7f6748068730 2026-03-09T00:06:38.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.108+0000 7f674cea3700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6748100bd0 con 0x7f67480690e0 2026-03-09T00:06:38.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.108+0000 7f6745d9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f67480690e0 0x7f6748100550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:38.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.108+0000 7f6745d9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f67480690e0 0x7f6748100550 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:46284/0 (socket says 192.168.123.103:46284) 2026-03-09T00:06:38.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.108+0000 7f6745d9b700 1 -- 192.168.123.103:0/1049958367 learned_addr learned my addr 192.168.123.103:0/1049958367 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:06:38.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.109+0000 7f674659c700 1 --2- 192.168.123.103:0/1049958367 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6748068730 0x7f6748100010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:38.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.109+0000 7f6745d9b700 1 -- 192.168.123.103:0/1049958367 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6748068730 msgr2=0x7f6748100010 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:38.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.109+0000 7f6745d9b700 1 --2- 192.168.123.103:0/1049958367 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6748068730 0x7f6748100010 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.109+0000 7f6745d9b700 1 -- 192.168.123.103:0/1049958367 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f67300097e0 con 0x7f67480690e0 2026-03-09T00:06:38.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.109+0000 7f674659c700 1 --2- 192.168.123.103:0/1049958367 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6748068730 0x7f6748100010 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T00:06:38.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.109+0000 7f6745d9b700 1 --2- 192.168.123.103:0/1049958367 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f67480690e0 0x7f6748100550 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f673800d8d0 tx=0x7f673800dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:06:38.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.109+0000 7f673f7fe700 1 -- 192.168.123.103:0/1049958367 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f67380098e0 con 0x7f67480690e0 2026-03-09T00:06:38.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.109+0000 7f674cea3700 1 -- 192.168.123.103:0/1049958367 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f67481a2df0 con 0x7f67480690e0 2026-03-09T00:06:38.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.110+0000 7f673f7fe700 1 -- 192.168.123.103:0/1049958367 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6738010460 con 0x7f67480690e0 2026-03-09T00:06:38.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.110+0000 7f673f7fe700 1 -- 192.168.123.103:0/1049958367 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f673800f5d0 con 0x7f67480690e0 2026-03-09T00:06:38.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.110+0000 7f674cea3700 1 -- 192.168.123.103:0/1049958367 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f67481a3200 con 0x7f67480690e0 2026-03-09T00:06:38.111 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.111+0000 7f673f7fe700 1 -- 192.168.123.103:0/1049958367 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f6738010ab0 con 0x7f67480690e0 2026-03-09T00:06:38.111 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.111+0000 7f674cea3700 1 -- 192.168.123.103:0/1049958367 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f67481094e0 con 0x7f67480690e0 2026-03-09T00:06:38.111 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.112+0000 7f673f7fe700 1 --2- 192.168.123.103:0/1049958367 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f67340779f0 0x7f6734079eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:38.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.112+0000 7f673f7fe700 1 -- 192.168.123.103:0/1049958367 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6600+0+0 (secure 0 0 0) 0x7f6738099aa0 con 0x7f67480690e0 2026-03-09T00:06:38.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.112+0000 7f674659c700 1 --2- 192.168.123.103:0/1049958367 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f67340779f0 
0x7f6734079eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:06:38.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.112+0000 7f674659c700 1 --2- 192.168.123.103:0/1049958367 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f67340779f0 0x7f6734079eb0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f6730006010 tx=0x7f673000b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:06:38.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.114+0000 7f673f7fe700 1 -- 192.168.123.103:0/1049958367 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6738061e60 con 0x7f67480690e0
2026-03-09T00:06:38.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.234+0000 7f674cea3700 1 -- 192.168.123.103:0/1049958367 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f6748101540 con 0x7f67340779f0
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.240+0000 7f673f7fe700 1 -- 192.168.123.103:0/1049958367 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f6748101540 con 0x7f67340779f0
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (114s) 19s ago 7m 24.0M - 0.25.0 c8568f914cd2 6bc39b415ac6
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (7m) 19s ago 7m 8786k - 18.2.1 5be31c24972a b93f8a220f71
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (6m) 30s ago 6m 8656k - 18.2.1 5be31c24972a d06aea65065e
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (33s) 19s ago 7m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e a76700a1f6bf
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (31s) 30s ago 6m 7860k - 19.2.3-678-ge911bdeb 654f31e6858e ab35f6a843c1
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (99s) 19s ago 6m 73.3M - 10.4.0 c8b91775d855 00a3394cdec9
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (4m) 19s ago 4m 18.1M - 18.2.1 5be31c24972a 404501ca3f76
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (4m) 19s ago 4m 218M - 18.2.1 5be31c24972a b71cb8823eff
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (4m) 30s ago 4m 19.7M - 18.2.1 5be31c24972a 868f24dd3b07
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 running (4m) 30s ago 4m 15.9M - 18.2.1 5be31c24972a 84dbd6c37a69
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:8443,9283,8765 running (2m) 19s ago 7m 619M - 19.2.3-678-ge911bdeb 654f31e6858e 5c5f89207f88
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (2m) 30s ago 6m 489M - 19.2.3-678-ge911bdeb 654f31e6858e e3d70135f3ac
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (61s) 19s ago 7m 57.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cafe87ec117d
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (46s) 30s ago 6m 48.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 33df752aa193
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (2m) 19s ago 7m 9445k - 1.7.0 72c9c2088986 0cdd6e671b4f
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (2m) 30s ago 6m 9420k - 1.7.0 72c9c2088986 848c5c72973d
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (21s) 19s ago 6m 30.2M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7112eceae9ce
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (5m) 19s ago 5m 451M 4096M 18.2.1 5be31c24972a 7bc729875521
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (5m) 19s ago 5m 370M 4096M 18.2.1 5be31c24972a 00566abbcc16
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (5m) 30s ago 5m 499M 4096M 18.2.1 5be31c24972a 1ece32056ab6
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (5m) 30s ago 5m 475M 4096M 18.2.1 5be31c24972a ee6260a1124c
2026-03-09T00:06:38.240 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (5m) 30s ago 5m 398M 4096M 18.2.1 5be31c24972a f51e8cd94301
2026-03-09T00:06:38.241 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (2m) 19s ago 6m 56.6M - 2.51.0 1d3b7f56885b 16d6071e49fb
2026-03-09T00:06:38.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.243+0000 7f674cea3700 1 -- 192.168.123.103:0/1049958367 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f67340779f0 msgr2=0x7f6734079eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:38.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.243+0000 7f674cea3700 1 --2- 192.168.123.103:0/1049958367 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f67340779f0 0x7f6734079eb0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f6730006010 tx=0x7f673000b540 comp rx=0 tx=0).stop
2026-03-09T00:06:38.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.243+0000 7f674cea3700 1 -- 192.168.123.103:0/1049958367 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f67480690e0 msgr2=0x7f6748100550 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:06:38.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.243+0000 7f674cea3700 1 --2- 192.168.123.103:0/1049958367 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f67480690e0 0x7f6748100550 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f673800d8d0 tx=0x7f673800dc90 comp rx=0 tx=0).stop
2026-03-09T00:06:38.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.243+0000 7f674cea3700 1 -- 192.168.123.103:0/1049958367 shutdown_connections
2026-03-09T00:06:38.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.243+0000 7f674cea3700 1 --2- 192.168.123.103:0/1049958367 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f67340779f0 0x7f6734079eb0 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.243+0000 7f674cea3700 1 --2- 192.168.123.103:0/1049958367 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6748068730 0x7f6748100010 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.243+0000 7f674cea3700 1 --2- 192.168.123.103:0/1049958367 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f67480690e0 0x7f6748100550 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.243+0000 7f674cea3700 1 -- 192.168.123.103:0/1049958367 >> 192.168.123.103:0/1049958367 conn(0x7f6748075960 msgr2=0x7f67480ff7b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:38.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.244+0000 7f674cea3700 1 -- 192.168.123.103:0/1049958367 shutdown_connections 2026-03-09T00:06:38.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.244+0000 7f674cea3700 1 -- 192.168.123.103:0/1049958367 wait complete. 2026-03-09T00:06:38.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.306+0000 7f834aa39700 1 -- 192.168.123.103:0/1983944404 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8344073a50 msgr2=0x7f8344111730 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:38.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.306+0000 7f834aa39700 1 --2- 192.168.123.103:0/1983944404 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8344073a50 0x7f8344111730 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f8334009a60 tx=0x7f8334009d70 comp rx=0 tx=0).stop 2026-03-09T00:06:38.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.308+0000 7f834aa39700 1 -- 192.168.123.103:0/1983944404 shutdown_connections 2026-03-09T00:06:38.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.308+0000 7f834aa39700 1 --2- 192.168.123.103:0/1983944404 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8344073a50 0x7f8344111730 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.308+0000 7f834aa39700 1 --2- 192.168.123.103:0/1983944404 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8344073130 0x7f8344073510 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.308+0000 7f834aa39700 1 -- 192.168.123.103:0/1983944404 >> 192.168.123.103:0/1983944404 conn(0x7f83440fc790 msgr2=0x7f83440febb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:38.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.310+0000 7f834aa39700 1 -- 192.168.123.103:0/1983944404 shutdown_connections 2026-03-09T00:06:38.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.310+0000 7f834aa39700 1 -- 192.168.123.103:0/1983944404 wait complete. 
2026-03-09T00:06:38.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.311+0000 7f834aa39700 1 Processor -- start 2026-03-09T00:06:38.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.311+0000 7f834aa39700 1 -- start start 2026-03-09T00:06:38.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.311+0000 7f834aa39700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8344073130 0x7f83440728e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:38.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.311+0000 7f834aa39700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8344073a50 0x7f834406d8e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:38.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.311+0000 7f834aa39700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f834406de20 con 0x7f8344073130 2026-03-09T00:06:38.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.311+0000 7f834aa39700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f834406df90 con 0x7f8344073a50 2026-03-09T00:06:38.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.311+0000 7f8343fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8344073130 0x7f83440728e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:38.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.311+0000 7f8343fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8344073130 0x7f83440728e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:40792/0 (socket says 192.168.123.103:40792) 2026-03-09T00:06:38.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.311+0000 7f8343fff700 1 -- 192.168.123.103:0/574179874 learned_addr learned my addr 192.168.123.103:0/574179874 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:06:38.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.311+0000 7f8343fff700 1 -- 192.168.123.103:0/574179874 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8344073a50 msgr2=0x7f834406d8e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:38.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.311+0000 7f83437fe700 1 --2- 192.168.123.103:0/574179874 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8344073a50 0x7f834406d8e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:38.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.312+0000 7f8343fff700 1 --2- 192.168.123.103:0/574179874 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8344073a50 0x7f834406d8e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.312+0000 7f8343fff700 1 -- 192.168.123.103:0/574179874 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f832c0097e0 con 
0x7f8344073130 2026-03-09T00:06:38.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.312+0000 7f8343fff700 1 --2- 192.168.123.103:0/574179874 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8344073130 0x7f83440728e0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f832c00c2d0 tx=0x7f832c00c690 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:06:38.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.312+0000 7f83417fa700 1 -- 192.168.123.103:0/574179874 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f832c004020 con 0x7f8344073130 2026-03-09T00:06:38.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.312+0000 7f83417fa700 1 -- 192.168.123.103:0/574179874 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f832c003680 con 0x7f8344073130 2026-03-09T00:06:38.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.312+0000 7f83437fe700 1 --2- 192.168.123.103:0/574179874 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8344073a50 0x7f834406d8e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:06:38.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.312+0000 7f83417fa700 1 -- 192.168.123.103:0/574179874 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f832c0107a0 con 0x7f8344073130 2026-03-09T00:06:38.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.313+0000 7f834aa39700 1 -- 192.168.123.103:0/574179874 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8334009710 con 0x7f8344073130 2026-03-09T00:06:38.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.313+0000 7f834aa39700 1 -- 192.168.123.103:0/574179874 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f834406e600 con 0x7f8344073130 2026-03-09T00:06:38.312 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.313+0000 7f834aa39700 1 -- 192.168.123.103:0/574179874 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f834410eeb0 con 0x7f8344073130 2026-03-09T00:06:38.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.316+0000 7f83417fa700 1 -- 192.168.123.103:0/574179874 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f832c0037f0 con 0x7f8344073130 2026-03-09T00:06:38.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.317+0000 7f83417fa700 1 --2- 192.168.123.103:0/574179874 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8330077ab0 0x7f8330079f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:38.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.317+0000 7f83417fa700 1 -- 192.168.123.103:0/574179874 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6600+0+0 (secure 0 0 0) 0x7f832c014070 con 0x7f8344073130 2026-03-09T00:06:38.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.317+0000 7f83417fa700 1 -- 192.168.123.103:0/574179874 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 
0 0) 0x7f832c09aa00 con 0x7f8344073130
2026-03-09T00:06:38.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.317+0000 7f83437fe700 1 --2- 192.168.123.103:0/574179874 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8330077ab0 0x7f8330079f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:06:38.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.317+0000 7f83437fe700 1 --2- 192.168.123.103:0/574179874 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8330077ab0 0x7f8330079f70 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f834406f0d0 tx=0x7f833400b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:06:38.474 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.474+0000 7f834aa39700 1 -- 192.168.123.103:0/574179874 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f834404f2e0 con 0x7f8344073130
2026-03-09T00:06:38.474 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.475+0000 7f83417fa700 1 -- 192.168.123.103:0/574179874 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f832c062c00 con 0x7f8344073130
2026-03-09T00:06:38.474 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:06:38.474 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T00:06:38.474 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:06:38.474 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:06:38.474 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T00:06:38.474 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:06:38.474 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:06:38.474 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T00:06:38.474 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 5,
2026-03-09T00:06:38.474 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-09T00:06:38.474 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:06:38.475 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T00:06:38.475 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-09T00:06:38.475 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:06:38.475 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T00:06:38.475 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 9,
2026-03-09T00:06:38.475 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5
2026-03-09T00:06:38.475 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-09T00:06:38.475 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:06:38.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.477+0000 7f834aa39700 1 -- 192.168.123.103:0/574179874 >>
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8330077ab0 msgr2=0x7f8330079f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:38.476 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.477+0000 7f834aa39700 1 --2- 192.168.123.103:0/574179874 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8330077ab0 0x7f8330079f70 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f834406f0d0 tx=0x7f833400b540 comp rx=0 tx=0).stop 2026-03-09T00:06:38.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.477+0000 7f834aa39700 1 -- 192.168.123.103:0/574179874 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8344073130 msgr2=0x7f83440728e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:38.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.477+0000 7f834aa39700 1 --2- 192.168.123.103:0/574179874 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8344073130 0x7f83440728e0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f832c00c2d0 tx=0x7f832c00c690 comp rx=0 tx=0).stop 2026-03-09T00:06:38.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.477+0000 7f834aa39700 1 -- 192.168.123.103:0/574179874 shutdown_connections 2026-03-09T00:06:38.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.477+0000 7f834aa39700 1 --2- 192.168.123.103:0/574179874 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8330077ab0 0x7f8330079f70 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.477+0000 7f834aa39700 1 --2- 192.168.123.103:0/574179874 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8344073130 0x7f83440728e0 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.477+0000 7f834aa39700 1 --2- 192.168.123.103:0/574179874 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8344073a50 0x7f834406d8e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.477+0000 7f834aa39700 1 -- 192.168.123.103:0/574179874 >> 192.168.123.103:0/574179874 conn(0x7f83440fc790 msgr2=0x7f8344103260 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:38.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.478+0000 7f834aa39700 1 -- 192.168.123.103:0/574179874 shutdown_connections 2026-03-09T00:06:38.477 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.478+0000 7f834aa39700 1 -- 192.168.123.103:0/574179874 wait complete. 
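[annotation] The `versions` reply above shows the mid-upgrade split this suite polls for: both mons and both mgrs already report 19.2.3 squid, while five of six OSDs and all four MDS daemons still report 18.2.1 reef. A minimal convergence check over the same output — `ceph versions` already emits JSON, as seen above; the 30 s interval and the "reef" substring match are assumptions, not part of the suite:

    # Poll until no daemon in the "overall" map still reports a reef build.
    while ceph versions | jq -e '.overall | keys | any(test("reef"))' >/dev/null; do
        sleep 30
    done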
2026-03-09T00:06:38.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.541+0000 7f4795e97700 1 -- 192.168.123.103:0/3322650233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f47900690e0 msgr2=0x7f4790105b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:38.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.541+0000 7f4795e97700 1 --2- 192.168.123.103:0/3322650233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f47900690e0 0x7f4790105b50 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f4780009b00 tx=0x7f4780009e10 comp rx=0 tx=0).stop 2026-03-09T00:06:38.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.541+0000 7f4795e97700 1 -- 192.168.123.103:0/3322650233 shutdown_connections 2026-03-09T00:06:38.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.541+0000 7f4795e97700 1 --2- 192.168.123.103:0/3322650233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f47900690e0 0x7f4790105b50 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.541+0000 7f4795e97700 1 --2- 192.168.123.103:0/3322650233 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4790068730 0x7f4790068b10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.541+0000 7f4795e97700 1 -- 192.168.123.103:0/3322650233 >> 192.168.123.103:0/3322650233 conn(0x7f4790075960 msgr2=0x7f4790075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:38.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.542+0000 7f4795e97700 1 -- 192.168.123.103:0/3322650233 shutdown_connections 2026-03-09T00:06:38.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.542+0000 7f4795e97700 1 -- 192.168.123.103:0/3322650233 wait complete. 
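[annotation] Each of these short bursts is one CLI invocation running a fresh msgr2 session: connect, banner exchange (supported=3 required=0), hello (the mon echoes back the socket address, hence learned_addr), secure READY bound to an entity such as mon.0 or mgr.34104, then mark_down / stop / shutdown_connections / "wait complete" once the command returns. To follow one client's handshake states through a capture like this, a throwaway filter on its nonce is enough — a sketch assuming the log is in a local file named teuthology.log (hypothetical path; 3322650233 is the client torn down just above):

    # Extract the ordered connection-state tokens for a single client.
    grep '0/3322650233' teuthology.log | grep -oE 's=[A-Z_]+' | uniq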
2026-03-09T00:06:38.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.542+0000 7f4795e97700 1 Processor -- start 2026-03-09T00:06:38.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.542+0000 7f4795e97700 1 -- start start 2026-03-09T00:06:38.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.542+0000 7f4795e97700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4790068730 0x7f4790198e60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:38.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.542+0000 7f4795e97700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f47900690e0 0x7f47901993a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:38.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.542+0000 7f4795e97700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f47901999f0 con 0x7f47900690e0 2026-03-09T00:06:38.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.542+0000 7f4795e97700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4790199b30 con 0x7f4790068730 2026-03-09T00:06:38.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.543+0000 7f478f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4790068730 0x7f4790198e60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:38.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.543+0000 7f478effd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f47900690e0 0x7f47901993a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:38.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.543+0000 7f478f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4790068730 0x7f4790198e60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:46324/0 (socket says 192.168.123.103:46324) 2026-03-09T00:06:38.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.543+0000 7f478f7fe700 1 -- 192.168.123.103:0/3346058630 learned_addr learned my addr 192.168.123.103:0/3346058630 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:06:38.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.543+0000 7f478effd700 1 -- 192.168.123.103:0/3346058630 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4790068730 msgr2=0x7f4790198e60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:38.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.543+0000 7f478effd700 1 --2- 192.168.123.103:0/3346058630 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4790068730 0x7f4790198e60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.543+0000 7f478effd700 1 -- 192.168.123.103:0/3346058630 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f47800097e0 con 0x7f47900690e0 
2026-03-09T00:06:38.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.543+0000 7f478effd700 1 --2- 192.168.123.103:0/3346058630 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f47900690e0 0x7f47901993a0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f4780000c00 tx=0x7f4780004930 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:06:38.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.544+0000 7f478cff9700 1 -- 192.168.123.103:0/3346058630 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f478001d070 con 0x7f47900690e0 2026-03-09T00:06:38.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.544+0000 7f4795e97700 1 -- 192.168.123.103:0/3346058630 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f479019d8b0 con 0x7f47900690e0 2026-03-09T00:06:38.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.544+0000 7f4795e97700 1 -- 192.168.123.103:0/3346058630 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f479019dd70 con 0x7f47900690e0 2026-03-09T00:06:38.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.544+0000 7f478cff9700 1 -- 192.168.123.103:0/3346058630 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f4780004b90 con 0x7f47900690e0 2026-03-09T00:06:38.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.544+0000 7f478cff9700 1 -- 192.168.123.103:0/3346058630 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f478000f670 con 0x7f47900690e0 2026-03-09T00:06:38.544 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.545+0000 7f4795e97700 1 -- 192.168.123.103:0/3346058630 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4770005320 con 0x7f47900690e0 2026-03-09T00:06:38.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.548+0000 7f478cff9700 1 -- 192.168.123.103:0/3346058630 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f478000bc50 con 0x7f47900690e0 2026-03-09T00:06:38.547 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.548+0000 7f478cff9700 1 --2- 192.168.123.103:0/3346058630 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f477c077a40 0x7f477c079f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:38.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.548+0000 7f478f7fe700 1 --2- 192.168.123.103:0/3346058630 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f477c077a40 0x7f477c079f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:38.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.549+0000 7f478cff9700 1 -- 192.168.123.103:0/3346058630 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6600+0+0 (secure 0 0 0) 0x7f478009bd90 con 0x7f47900690e0 2026-03-09T00:06:38.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.549+0000 7f478cff9700 1 -- 192.168.123.103:0/3346058630 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f47800cc9f0 con 0x7f47900690e0 2026-03-09T00:06:38.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.549+0000 7f478f7fe700 1 --2- 192.168.123.103:0/3346058630 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f477c077a40 0x7f477c079f00 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f477800ad20 tx=0x7f4778005f90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.684+0000 7f4795e97700 1 -- 192.168.123.103:0/3346058630 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f4770005cc0 con 0x7f47900690e0 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.685+0000 7f478cff9700 1 -- 192.168.123.103:0/3346058630 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1919 (secure 0 0 0) 0x7f4780027790 con 0x7f47900690e0 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:e11 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:epoch 11 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T00:01:42.952984+0000 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T00:01:51.424075+0000 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-09T00:06:38.684 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 39 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor 
table,9=file layout v2,10=snaprealm v2} 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:in 0 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14480} 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members: 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.sejksk{0:14480} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons: 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.ralade{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.ixduim{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T00:06:38.685 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.vlrwtl{-1:24297} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6824/1090058295,v1:192.168.123.106:6825/1090058295] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T00:06:38.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.688+0000 7f4795e97700 1 -- 192.168.123.103:0/3346058630 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f477c077a40 msgr2=0x7f477c079f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:38.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.688+0000 7f4795e97700 1 --2- 192.168.123.103:0/3346058630 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f477c077a40 0x7f477c079f00 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f477800ad20 tx=0x7f4778005f90 comp rx=0 tx=0).stop 2026-03-09T00:06:38.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.688+0000 7f4795e97700 1 -- 192.168.123.103:0/3346058630 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f47900690e0 msgr2=0x7f47901993a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:38.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.688+0000 7f4795e97700 1 --2- 192.168.123.103:0/3346058630 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f47900690e0 0x7f47901993a0 secure :-1 
s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f4780000c00 tx=0x7f4780004930 comp rx=0 tx=0).stop 2026-03-09T00:06:38.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.688+0000 7f4795e97700 1 -- 192.168.123.103:0/3346058630 shutdown_connections 2026-03-09T00:06:38.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.688+0000 7f4795e97700 1 --2- 192.168.123.103:0/3346058630 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f477c077a40 0x7f477c079f00 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.688+0000 7f4795e97700 1 --2- 192.168.123.103:0/3346058630 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4790068730 0x7f4790198e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.688+0000 7f4795e97700 1 --2- 192.168.123.103:0/3346058630 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f47900690e0 0x7f47901993a0 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.688+0000 7f4795e97700 1 -- 192.168.123.103:0/3346058630 >> 192.168.123.103:0/3346058630 conn(0x7f4790075960 msgr2=0x7f4790102b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:38.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.688+0000 7f4795e97700 1 -- 192.168.123.103:0/3346058630 shutdown_connections 2026-03-09T00:06:38.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.688+0000 7f4795e97700 1 -- 192.168.123.103:0/3346058630 wait complete. 
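[annotation] The fsmap dumped above is the shape the 1-ranks / standby-replay-off fragments expect: max_mds 1, rank 0 up:active on mds.cephfs.vm03.sejksk, inline_data enabled, and three up:standby daemons with join_fscid=1. A machine-checkable version of the same spot-check, assuming the squid `fs dump` JSON keeps `filesystems` and `standbys` as top-level arrays (an assumption; field names can differ by release):

    # One filesystem defined and at least one standby MDS available.
    ceph fs dump --format=json | jq -e '
      (.filesystems | length == 1) and (.standbys | length >= 1)'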
2026-03-09T00:06:38.695 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 11 2026-03-09T00:06:38.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.759+0000 7fcccc982700 1 -- 192.168.123.103:0/640282868 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fccc4102db0 msgr2=0x7fccc4103190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:38.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.759+0000 7fcccc982700 1 --2- 192.168.123.103:0/640282868 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fccc4102db0 0x7fccc4103190 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fccbc009b00 tx=0x7fccbc009e10 comp rx=0 tx=0).stop 2026-03-09T00:06:38.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.760+0000 7fcccc982700 1 -- 192.168.123.103:0/640282868 shutdown_connections 2026-03-09T00:06:38.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.760+0000 7fcccc982700 1 --2- 192.168.123.103:0/640282868 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fccc4069180 0x7fccc4069600 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.760+0000 7fcccc982700 1 --2- 192.168.123.103:0/640282868 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fccc4102db0 0x7fccc4103190 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.760+0000 7fcccc982700 1 -- 192.168.123.103:0/640282868 >> 192.168.123.103:0/640282868 conn(0x7fccc4076b70 msgr2=0x7fccc4076f80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:38.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.760+0000 7fcccc982700 1 -- 192.168.123.103:0/640282868 shutdown_connections 2026-03-09T00:06:38.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.760+0000 7fcccc982700 1 -- 192.168.123.103:0/640282868 wait complete. 
2026-03-09T00:06:38.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.760+0000 7fcccc982700 1 Processor -- start 2026-03-09T00:06:38.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.761+0000 7fcccc982700 1 -- start start 2026-03-09T00:06:38.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.761+0000 7fcccc982700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fccc4069180 0x7fccc4198e30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:38.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.761+0000 7fcccc982700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fccc4102db0 0x7fccc4199370 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:38.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.761+0000 7fcccc982700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fccc4199a50 con 0x7fccc4069180 2026-03-09T00:06:38.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.761+0000 7fcccc982700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fccc419d7e0 con 0x7fccc4102db0 2026-03-09T00:06:38.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.761+0000 7fccca71e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fccc4069180 0x7fccc4198e30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:38.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.761+0000 7fccca71e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fccc4069180 0x7fccc4198e30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:40828/0 (socket says 192.168.123.103:40828) 2026-03-09T00:06:38.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.761+0000 7fccca71e700 1 -- 192.168.123.103:0/509958647 learned_addr learned my addr 192.168.123.103:0/509958647 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:06:38.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.761+0000 7fccca71e700 1 -- 192.168.123.103:0/509958647 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fccc4102db0 msgr2=0x7fccc4199370 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T00:06:38.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.761+0000 7fccca71e700 1 --2- 192.168.123.103:0/509958647 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fccc4102db0 0x7fccc4199370 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.761+0000 7fccca71e700 1 -- 192.168.123.103:0/509958647 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fccbc0097e0 con 0x7fccc4069180 2026-03-09T00:06:38.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.761+0000 7fccca71e700 1 --2- 192.168.123.103:0/509958647 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fccc4069180 0x7fccc4198e30 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fccbc00b5c0 tx=0x7fccbc0049e0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:06:38.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.762+0000 7fccb77fe700 1 -- 192.168.123.103:0/509958647 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fccbc01d070 con 0x7fccc4069180 2026-03-09T00:06:38.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.762+0000 7fccb77fe700 1 -- 192.168.123.103:0/509958647 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fccbc00bc50 con 0x7fccc4069180 2026-03-09T00:06:38.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.762+0000 7fcccc982700 1 -- 192.168.123.103:0/509958647 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fccc419da60 con 0x7fccc4069180 2026-03-09T00:06:38.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.762+0000 7fccb77fe700 1 -- 192.168.123.103:0/509958647 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fccbc00f790 con 0x7fccc4069180 2026-03-09T00:06:38.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.762+0000 7fcccc982700 1 -- 192.168.123.103:0/509958647 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fccc419df20 con 0x7fccc4069180 2026-03-09T00:06:38.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.763+0000 7fccb77fe700 1 -- 192.168.123.103:0/509958647 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fccbc00f8f0 con 0x7fccc4069180 2026-03-09T00:06:38.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.763+0000 7fcccc982700 1 -- 192.168.123.103:0/509958647 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fccc4106330 con 0x7fccc4069180 2026-03-09T00:06:38.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.764+0000 7fccb77fe700 1 --2- 192.168.123.103:0/509958647 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fccb00777d0 0x7fccb0079c90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:38.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.764+0000 7fccb77fe700 1 -- 192.168.123.103:0/509958647 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6600+0+0 (secure 0 0 0) 0x7fccbc09b4f0 con 0x7fccc4069180 2026-03-09T00:06:38.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.764+0000 7fccc9f1d700 1 --2- 192.168.123.103:0/509958647 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fccb00777d0 0x7fccb0079c90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:38.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.765+0000 7fccc9f1d700 1 --2- 192.168.123.103:0/509958647 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fccb00777d0 0x7fccb0079c90 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fccc419a450 tx=0x7fccb800b3f0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:06:38.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.766+0000 7fccb77fe700 1 -- 192.168.123.103:0/509958647 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fccbc0639b0 con 0x7fccc4069180 2026-03-09T00:06:38.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.888+0000 7fcccc982700 1 -- 192.168.123.103:0/509958647 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fccc4066e80 con 0x7fccb00777d0 2026-03-09T00:06:38.888 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.889+0000 7fccb77fe700 1 -- 192.168.123.103:0/509958647 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fccc4066e80 con 0x7fccb00777d0 2026-03-09T00:06:38.889 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T00:06:38.889 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T00:06:38.889 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-09T00:06:38.889 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T00:06:38.889 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-09T00:06:38.889 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-09T00:06:38.889 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-09T00:06:38.889 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-09T00:06:38.889 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-09T00:06:38.889 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "7/23 daemons upgraded", 2026-03-09T00:06:38.889 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons", 2026-03-09T00:06:38.890 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-09T00:06:38.890 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T00:06:38.891 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.892+0000 7fcccc982700 1 -- 192.168.123.103:0/509958647 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fccb00777d0 msgr2=0x7fccb0079c90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:38.891 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.892+0000 7fcccc982700 1 --2- 192.168.123.103:0/509958647 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fccb00777d0 0x7fccb0079c90 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fccc419a450 tx=0x7fccb800b3f0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.891 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.892+0000 7fcccc982700 1 -- 192.168.123.103:0/509958647 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fccc4069180 msgr2=0x7fccc4198e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:38.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.892+0000 7fcccc982700 1 --2- 192.168.123.103:0/509958647 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fccc4069180 0x7fccc4198e30 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fccbc00b5c0 tx=0x7fccbc0049e0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.893+0000 7fcccc982700 1 -- 192.168.123.103:0/509958647 shutdown_connections 2026-03-09T00:06:38.892 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.893+0000 7fcccc982700 1 --2- 192.168.123.103:0/509958647 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fccb00777d0 0x7fccb0079c90 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.893+0000 7fcccc982700 1 --2- 192.168.123.103:0/509958647 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fccc4069180 0x7fccc4198e30 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.893+0000 7fcccc982700 1 --2- 192.168.123.103:0/509958647 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fccc4102db0 0x7fccc4199370 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.893+0000 7fcccc982700 1 -- 192.168.123.103:0/509958647 >> 192.168.123.103:0/509958647 conn(0x7fccc4076b70 msgr2=0x7fccc40fe0d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:38.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.893+0000 7fcccc982700 1 -- 192.168.123.103:0/509958647 shutdown_connections 2026-03-09T00:06:38.892 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.893+0000 7fcccc982700 1 -- 192.168.123.103:0/509958647 wait complete. 2026-03-09T00:06:38.957 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:38 vm03.local ceph-mon[129670]: pgmap v36: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 11 active+recovery_wait+degraded, 1 active+recovering, 37 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 14 KiB/s rd, 498 KiB/s wr, 172 op/s; 1519/231 objects degraded (657.576%); 38 KiB/s, 27 keys/s, 10 objects/s recovering 2026-03-09T00:06:38.957 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:38 vm03.local ceph-mon[129670]: from='client.44127 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:06:38.957 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:38 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/574179874' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:06:38.957 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:38 vm03.local ceph-mon[129670]: from='client.? 
192.168.123.103:0/3346058630' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T00:06:38.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.955+0000 7f6272e42700 1 -- 192.168.123.103:0/753624919 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f626c068df0 msgr2=0x7f626c10d5b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:38.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.955+0000 7f6272e42700 1 --2- 192.168.123.103:0/753624919 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f626c068df0 0x7f626c10d5b0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f6260009b30 tx=0x7f6260009e40 comp rx=0 tx=0).stop 2026-03-09T00:06:38.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.957+0000 7f6272e42700 1 -- 192.168.123.103:0/753624919 shutdown_connections 2026-03-09T00:06:38.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.957+0000 7f6272e42700 1 --2- 192.168.123.103:0/753624919 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f626c068df0 0x7f626c10d5b0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.957+0000 7f6272e42700 1 --2- 192.168.123.103:0/753624919 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f626c0684d0 0x7f626c0688b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.957+0000 7f6272e42700 1 -- 192.168.123.103:0/753624919 >> 192.168.123.103:0/753624919 conn(0x7f626c075960 msgr2=0x7f626c075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:06:38.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.959+0000 7f6272e42700 1 -- 192.168.123.103:0/753624919 shutdown_connections 2026-03-09T00:06:38.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.959+0000 7f6272e42700 1 -- 192.168.123.103:0/753624919 wait complete. 
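[annotation] The `orch upgrade status` reply a few records back confirms the staggered run is mid-flight: crash, mon, and mgr services complete, 7/23 daemons upgraded, and the orchestrator "Currently upgrading osd daemons". Since the command already returns JSON, a wait-for-completion gate can stay small — field names are taken from that reply; the polling interval is an assumption:

    # Block until the orchestrator reports the upgrade finished and unpaused.
    until ceph orch upgrade status | jq -e '.in_progress == false and .is_paused == false' >/dev/null; do
        sleep 30
    done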
2026-03-09T00:06:38.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.959+0000 7f6272e42700 1 Processor -- start 2026-03-09T00:06:38.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.959+0000 7f6272e42700 1 -- start start 2026-03-09T00:06:38.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.959+0000 7f6272e42700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f626c0684d0 0x7f626c10d090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:38.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.959+0000 7f6272e42700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f626c068df0 0x7f626c103fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:38.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.959+0000 7f6272e42700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f626c10d790 con 0x7f626c068df0 2026-03-09T00:06:38.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.959+0000 7f6272e42700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f626c10d900 con 0x7f626c0684d0 2026-03-09T00:06:38.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.960+0000 7f6270bde700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f626c0684d0 0x7f626c10d090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:38.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.960+0000 7f6270bde700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f626c0684d0 0x7f626c10d090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:46346/0 (socket says 192.168.123.103:46346) 2026-03-09T00:06:38.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.960+0000 7f6270bde700 1 -- 192.168.123.103:0/2749123083 learned_addr learned my addr 192.168.123.103:0/2749123083 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:06:38.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.960+0000 7f6270bde700 1 -- 192.168.123.103:0/2749123083 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f626c068df0 msgr2=0x7f626c103fe0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:38.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.960+0000 7f6270bde700 1 --2- 192.168.123.103:0/2749123083 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f626c068df0 0x7f626c103fe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:38.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.960+0000 7f6270bde700 1 -- 192.168.123.103:0/2749123083 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f62600097e0 con 0x7f626c0684d0 2026-03-09T00:06:38.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.961+0000 7f6270bde700 1 --2- 192.168.123.103:0/2749123083 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f626c0684d0 0x7f626c10d090 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f625c00b700 tx=0x7f625c00bac0 comp rx=0 tx=0).ready 
entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:06:38.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.962+0000 7f6269ffb700 1 -- 192.168.123.103:0/2749123083 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f625c010840 con 0x7f626c0684d0 2026-03-09T00:06:38.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.962+0000 7f6269ffb700 1 -- 192.168.123.103:0/2749123083 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f625c010e80 con 0x7f626c0684d0 2026-03-09T00:06:38.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.962+0000 7f6269ffb700 1 -- 192.168.123.103:0/2749123083 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f625c00d590 con 0x7f626c0684d0 2026-03-09T00:06:38.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.962+0000 7f6272e42700 1 -- 192.168.123.103:0/2749123083 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f626c104640 con 0x7f626c0684d0 2026-03-09T00:06:38.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.963+0000 7f6272e42700 1 -- 192.168.123.103:0/2749123083 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f626c104b60 con 0x7f626c0684d0 2026-03-09T00:06:38.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.963+0000 7f6272e42700 1 -- 192.168.123.103:0/2749123083 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f626c104e40 con 0x7f626c0684d0 2026-03-09T00:06:38.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.964+0000 7f6269ffb700 1 -- 192.168.123.103:0/2749123083 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f625c00f3e0 con 0x7f626c0684d0 2026-03-09T00:06:38.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.965+0000 7f6269ffb700 1 --2- 192.168.123.103:0/2749123083 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6254077a40 0x7f6254079f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:06:38.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.965+0000 7f6269ffb700 1 -- 192.168.123.103:0/2749123083 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(50..50 src has 1..50) v4 ==== 6600+0+0 (secure 0 0 0) 0x7f625c099f80 con 0x7f626c0684d0 2026-03-09T00:06:38.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.965+0000 7f626bfff700 1 --2- 192.168.123.103:0/2749123083 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6254077a40 0x7f6254079f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:06:38.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.965+0000 7f626bfff700 1 --2- 192.168.123.103:0/2749123083 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6254077a40 0x7f6254079f00 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f6260006010 tx=0x7f6260000bc0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:06:38.965 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:38.966+0000 7f6269ffb700 1 -- 192.168.123.103:0/2749123083 <== 
mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f625c062440 con 0x7f626c0684d0 2026-03-09T00:06:39.120 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:39.121+0000 7f6272e42700 1 -- 192.168.123.103:0/2749123083 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f626c04ea90 con 0x7f626c0684d0 2026-03-09T00:06:39.120 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:39.121+0000 7f6269ffb700 1 -- 192.168.123.103:0/2749123083 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+2302 (secure 0 0 0) 0x7f625c00f690 con 0x7f626c0684d0 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data; Degraded data redundancy: 1519/231 objects degraded (657.576%), 27 pgs degraded 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled. 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 1519/231 objects degraded (657.576%), 27 pgs degraded 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.0 is active+recovery_wait+degraded, acting [3,1,0] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.1 is active+recovery_wait+degraded, acting [2,1,0] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.2 is active+recovery_wait+degraded, acting [5,1,0] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.4 is active+recovery_wait+degraded, acting [1,0,4] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.5 is active+recovery_wait+degraded, acting [3,0,4] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.8 is active+recovery_wait+degraded, acting [3,5,0] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.9 is active+recovery_wait+degraded, acting [1,4,0] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.e is active+recovery_wait+undersized+degraded+remapped, acting [2,3] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.10 is active+recovery_wait+degraded, acting [2,1,0] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.12 is active+recovery_wait+degraded, acting [3,1,0] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.13 is active+recovery_wait+undersized+degraded+remapped, acting [2,4] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.15 is active+recovery_wait+degraded, acting [1,3,0] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.1e is active+recovery_wait+degraded, acting [2,0,5] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1 is active+recovery_wait+undersized+degraded+remapped, acting [2,4] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.3 is active+recovery_wait+undersized+degraded+remapped, acting [4,3] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.6 is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-09T00:06:39.121 
INFO:teuthology.orchestra.run.vm03.stdout: pg 3.b is active+recovery_wait+undersized+degraded+remapped, acting [1,4] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.c is active+recovery_wait+undersized+degraded+remapped, acting [5,3] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.f is active+recovery_wait+undersized+degraded+remapped, acting [5,3] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.10 is active+recovery_wait+undersized+degraded+remapped, acting [5,1] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.11 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.12 is active+recovery_wait+undersized+degraded+remapped, acting [1,3] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.15 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.17 is active+recovery_wait+undersized+degraded+remapped, acting [2,5] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.18 is active+recovery_wait+undersized+degraded+remapped, acting [2,1] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1b is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-09T00:06:39.121 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1f is active+recovery_wait+undersized+degraded+remapped, acting [2,3] 2026-03-09T00:06:39.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:39.124+0000 7f6272e42700 1 -- 192.168.123.103:0/2749123083 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6254077a40 msgr2=0x7f6254079f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:39.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:39.124+0000 7f6272e42700 1 --2- 192.168.123.103:0/2749123083 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6254077a40 0x7f6254079f00 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f6260006010 tx=0x7f6260000bc0 comp rx=0 tx=0).stop 2026-03-09T00:06:39.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:39.124+0000 7f6272e42700 1 -- 192.168.123.103:0/2749123083 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f626c0684d0 msgr2=0x7f626c10d090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:06:39.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:39.124+0000 7f6272e42700 1 --2- 192.168.123.103:0/2749123083 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f626c0684d0 0x7f626c10d090 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f625c00b700 tx=0x7f625c00bac0 comp rx=0 tx=0).stop 2026-03-09T00:06:39.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:39.124+0000 7f6272e42700 1 -- 192.168.123.103:0/2749123083 shutdown_connections 2026-03-09T00:06:39.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:39.124+0000 7f6272e42700 1 --2- 192.168.123.103:0/2749123083 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6254077a40 0x7f6254079f00 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:06:39.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:39.124+0000 7f6272e42700 1 --2- 192.168.123.103:0/2749123083 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7f626c0684d0 0x7f626c10d090 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:39.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:39.124+0000 7f6272e42700 1 --2- 192.168.123.103:0/2749123083 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f626c068df0 0x7f626c103fe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:06:39.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:39.124+0000 7f6272e42700 1 -- 192.168.123.103:0/2749123083 >> 192.168.123.103:0/2749123083 conn(0x7f626c075960 msgr2=0x7f626c0fe970 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:06:39.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:39.124+0000 7f6272e42700 1 -- 192.168.123.103:0/2749123083 shutdown_connections
2026-03-09T00:06:39.123 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:06:39.124+0000 7f6272e42700 1 -- 192.168.123.103:0/2749123083 wait complete.
2026-03-09T00:06:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:38 vm06.local ceph-mon[106218]: pgmap v36: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 11 active+recovery_wait+degraded, 1 active+recovering, 37 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 14 KiB/s rd, 498 KiB/s wr, 172 op/s; 1519/231 objects degraded (657.576%); 38 KiB/s, 27 keys/s, 10 objects/s recovering
2026-03-09T00:06:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:38 vm06.local ceph-mon[106218]: from='client.44127 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:06:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:38 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/574179874' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:06:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:38 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/3346058630' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T00:06:40.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:39 vm03.local ceph-mon[129670]: from='client.44131 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:06:40.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:39 vm03.local ceph-mon[129670]: from='client.44135 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:06:40.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:39 vm03.local ceph-mon[129670]: from='client.34172 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:06:40.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:39 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2749123083' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T00:06:40.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:39 vm06.local ceph-mon[106218]: from='client.44131 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:06:40.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:39 vm06.local ceph-mon[106218]: from='client.44135 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:06:40.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:39 vm06.local ceph-mon[106218]: from='client.34172 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:06:40.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:39 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/2749123083' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T00:06:41.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:40 vm03.local ceph-mon[129670]: pgmap v37: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 11 active+recovery_wait+degraded, 1 active+recovering, 37 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 12 KiB/s rd, 431 KiB/s wr, 149 op/s; 1519/231 objects degraded (657.576%); 33 KiB/s, 24 keys/s, 8 objects/s recovering
2026-03-09T00:06:41.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:40 vm06.local ceph-mon[106218]: pgmap v37: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 11 active+recovery_wait+degraded, 1 active+recovering, 37 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 12 KiB/s rd, 431 KiB/s wr, 149 op/s; 1519/231 objects degraded (657.576%); 33 KiB/s, 24 keys/s, 8 objects/s recovering
2026-03-09T00:06:42.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:42 vm06.local ceph-mon[106218]: pgmap v38: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 7 active+recovery_wait+degraded, 1 active+recovering, 41 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.0 KiB/s rd, 7.1 KiB/s wr, 37 op/s; 1480/231 objects degraded (640.693%); 7 B/s, 20 keys/s, 10 objects/s recovering
2026-03-09T00:06:42.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:42 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 1480/231 objects degraded (640.693%), 23 pgs degraded (PG_DEGRADED)
2026-03-09T00:06:43.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:42 vm03.local ceph-mon[129670]: pgmap v38: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 7 active+recovery_wait+degraded, 1 active+recovering, 41 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1.0 KiB/s rd, 7.1 KiB/s wr, 37 op/s; 1480/231 objects degraded (640.693%); 7 B/s, 20 keys/s, 10 objects/s recovering
2026-03-09T00:06:43.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:42 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 1480/231 objects degraded (640.693%), 23 pgs degraded (PG_DEGRADED)
2026-03-09T00:06:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:06:44.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:06:45.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:44 vm03.local ceph-mon[129670]: pgmap v39: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 7 active+recovery_wait+degraded, 1 active+recovering, 41 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1023 B/s rd, 6.9 KiB/s wr, 36 op/s; 1480/231 objects degraded (640.693%); 7 B/s, 19 keys/s, 8 objects/s recovering
2026-03-09T00:06:45.117 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:44 vm06.local ceph-mon[106218]: pgmap v39: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 7 active+recovery_wait+degraded, 1 active+recovering, 41 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1023 B/s rd, 6.9 KiB/s wr, 36 op/s; 1480/231 objects degraded (640.693%); 7 B/s, 19 keys/s, 8 objects/s recovering
2026-03-09T00:06:47.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:46 vm03.local ceph-mon[129670]: pgmap v40: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 3 active+recovery_wait+degraded, 1 active+recovering, 45 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1023 B/s rd, 6.9 KiB/s wr, 36 op/s; 1429/231 objects degraded (618.615%); 7 B/s, 19 keys/s, 12 objects/s recovering
2026-03-09T00:06:47.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:46 vm06.local ceph-mon[106218]: pgmap v40: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 3 active+recovery_wait+degraded, 1 active+recovering, 45 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1023 B/s rd, 6.9 KiB/s wr, 36 op/s; 1429/231 objects degraded (618.615%); 7 B/s, 19 keys/s, 12 objects/s recovering
2026-03-09T00:06:47.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:47 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 1429/231 objects degraded (618.615%), 19 pgs degraded (PG_DEGRADED)
2026-03-09T00:06:48.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:47 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 1429/231 objects degraded (618.615%), 19 pgs degraded (PG_DEGRADED)
2026-03-09T00:06:49.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:48 vm03.local ceph-mon[129670]: pgmap v41: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 3 active+recovery_wait+degraded, 1 active+recovering, 45 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 85 B/s wr, 0 op/s; 1429/231 objects degraded (618.615%); 0 B/s, 7 objects/s recovering
2026-03-09T00:06:49.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:48 vm06.local ceph-mon[106218]: pgmap v41: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 3 active+recovery_wait+degraded, 1 active+recovering, 45 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 85 B/s wr, 0 op/s; 1429/231 objects degraded (618.615%); 0 B/s, 7 objects/s recovering
2026-03-09T00:06:51.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:50 vm03.local ceph-mon[129670]: pgmap v42: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 3 active+recovery_wait+degraded, 1 active+recovering, 45 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 85 B/s wr, 0 op/s; 1429/231 objects degraded (618.615%); 0 B/s, 7 objects/s recovering
2026-03-09T00:06:51.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:50 vm06.local ceph-mon[106218]: pgmap v42: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 3 active+recovery_wait+degraded, 1 active+recovering, 45 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 85 B/s wr, 0 op/s; 1429/231 objects degraded (618.615%); 0 B/s, 7 objects/s recovering
2026-03-09T00:06:52.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:51 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:06:52.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:51 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:06:52.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:51 vm06.local ceph-mon[106218]: Upgrade: unsafe to stop osd(s) at this time (7 PGs are or would become offline)
2026-03-09T00:06:52.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:51 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:06:52.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:51 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:06:52.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:51 vm03.local ceph-mon[129670]: Upgrade: unsafe to stop osd(s) at this time (7 PGs are or would become offline)
2026-03-09T00:06:53.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:52 vm06.local ceph-mon[106218]: pgmap v43: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 48 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 85 B/s wr, 0 op/s; 1384/231 objects degraded (599.134%); 0 B/s, 11 objects/s recovering
2026-03-09T00:06:53.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:52 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 1384/231 objects degraded (599.134%), 16 pgs degraded (PG_DEGRADED)
2026-03-09T00:06:53.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:52 vm03.local ceph-mon[129670]: pgmap v43: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 48 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 85 B/s wr, 0 op/s; 1384/231 objects degraded (599.134%); 0 B/s, 11 objects/s recovering
2026-03-09T00:06:53.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:52 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 1384/231 objects degraded (599.134%), 16 pgs degraded (PG_DEGRADED)
2026-03-09T00:06:54.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:53 vm06.local ceph-mon[106218]: osdmap e51: 6 total, 6 up, 6 in
2026-03-09T00:06:54.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:53 vm03.local ceph-mon[129670]: osdmap e51: 6 total, 6 up, 6 in
2026-03-09T00:06:55.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:54 vm03.local ceph-mon[129670]: pgmap v45: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 48 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1384/231 objects degraded (599.134%); 0 B/s, 9 objects/s recovering
2026-03-09T00:06:55.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:54 vm03.local ceph-mon[129670]: osdmap e52: 6 total, 6 up, 6 in
2026-03-09T00:06:55.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:54 vm06.local ceph-mon[106218]: pgmap v45: 65 pgs: 16 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 48 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1384/231 objects degraded (599.134%); 0 B/s, 9 objects/s recovering
2026-03-09T00:06:55.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:54 vm06.local ceph-mon[106218]: osdmap e52: 6 total, 6 up, 6 in
2026-03-09T00:06:57.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:57 vm06.local ceph-mon[106218]: pgmap v47: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+recovering+undersized+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 50 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1266/231 objects degraded (548.052%); 1024 KiB/s, 13 objects/s recovering
2026-03-09T00:06:57.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:57 vm06.local ceph-mon[106218]: osdmap e53: 6 total, 6 up, 6 in
2026-03-09T00:06:57.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:57 vm03.local ceph-mon[129670]: pgmap v47: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 1 active+recovering+undersized+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 50 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1266/231 objects degraded (548.052%); 1024 KiB/s, 13 objects/s recovering
2026-03-09T00:06:57.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:57 vm03.local ceph-mon[129670]: osdmap e53: 6 total, 6 up, 6 in
2026-03-09T00:06:58.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:58 vm03.local ceph-mon[129670]: osdmap e54: 6 total, 6 up, 6 in
2026-03-09T00:06:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:58 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 1266/231 objects degraded (548.052%), 14 pgs degraded (PG_DEGRADED)
2026-03-09T00:06:58.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:58 vm03.local ceph-mon[129670]: pgmap v50: 65 pgs: 1 peering, 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 50 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1266/231 objects degraded (548.052%); 869 KiB/s, 9 objects/s recovering
2026-03-09T00:06:58.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:58 vm06.local ceph-mon[106218]: osdmap e54: 6 total, 6 up, 6 in
2026-03-09T00:06:58.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:58 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 1266/231 objects degraded (548.052%), 14 pgs degraded (PG_DEGRADED)
2026-03-09T00:06:58.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:58 vm06.local ceph-mon[106218]: pgmap v50: 65 pgs: 1 peering, 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 50 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1266/231 objects degraded (548.052%); 869 KiB/s, 9 objects/s recovering
2026-03-09T00:06:59.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:06:59 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:06:59.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:06:59 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:07:00.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:00 vm03.local ceph-mon[129670]: pgmap v51: 65 pgs: 1 peering, 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 50 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1266/231 objects degraded (548.052%); 683 KiB/s, 7 objects/s recovering
2026-03-09T00:07:00.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:00 vm06.local ceph-mon[106218]: pgmap v51: 65 pgs: 1 peering, 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 50 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1266/231 objects degraded (548.052%); 683 KiB/s, 7 objects/s recovering
2026-03-09T00:07:02.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:02 vm06.local ceph-mon[106218]: pgmap v52: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1266/231 objects degraded (548.052%); 1.0 MiB/s, 14 objects/s recovering
2026-03-09T00:07:03.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:02 vm03.local ceph-mon[129670]: pgmap v52: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1266/231 objects degraded (548.052%); 1.0 MiB/s, 14 objects/s recovering
2026-03-09T00:07:04.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:04 vm06.local ceph-mon[106218]: pgmap v53: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1266/231 objects degraded (548.052%); 0 B/s, 6 objects/s recovering
2026-03-09T00:07:04.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:04 vm06.local ceph-mon[106218]: osdmap e55: 6 total, 6 up, 6 in
2026-03-09T00:07:05.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:04 vm03.local ceph-mon[129670]: pgmap v53: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 13 active+recovery_wait+undersized+degraded+remapped, 51 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1266/231 objects degraded (548.052%); 0 B/s, 6 objects/s recovering
2026-03-09T00:07:05.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:04 vm03.local ceph-mon[129670]: osdmap e55: 6 total, 6 up, 6 in
2026-03-09T00:07:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:05 vm06.local ceph-mon[106218]: osdmap e56: 6 total, 6 up, 6 in
2026-03-09T00:07:06.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:05 vm03.local ceph-mon[129670]: osdmap e56: 6 total, 6 up, 6 in
2026-03-09T00:07:06.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:06 vm06.local ceph-mon[106218]: pgmap v56: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1167/231 objects degraded (505.195%); 512 KiB/s, 13 objects/s recovering
2026-03-09T00:07:06.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:06 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 1167/231 objects degraded (505.195%), 13 pgs degraded (PG_DEGRADED)
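The degraded ratios the mons report above can legitimately exceed 100%: the numerator counts degraded object copies (replicas) while the denominator is the raw object count, so 1519 degraded copies over 231 objects is reported as 657.576%. A quick check of the arithmetic behind the figures in this run:

    # Reproduce the percentages printed in the pgmap lines above.
    awk 'BEGIN { printf "%.3f%%\n", 1519 / 231 * 100 }'   # 657.576%
    awk 'BEGIN { printf "%.3f%%\n", 1266 / 231 * 100 }'   # 548.052%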
2026-03-09T00:07:06.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:06 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:07:06.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:06 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:07:06.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:06 vm06.local ceph-mon[106218]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline)
2026-03-09T00:07:06.947 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:06 vm03.local ceph-mon[129670]: pgmap v56: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1167/231 objects degraded (505.195%); 512 KiB/s, 13 objects/s recovering
2026-03-09T00:07:06.947 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:06 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 1167/231 objects degraded (505.195%), 13 pgs degraded (PG_DEGRADED)
2026-03-09T00:07:06.947 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:06 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:07:06.947 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:06 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:07:06.947 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:06 vm03.local ceph-mon[129670]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline)
2026-03-09T00:07:08.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:08 vm06.local ceph-mon[106218]: pgmap v57: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1167/231 objects degraded (505.195%); 512 KiB/s, 13 objects/s recovering
2026-03-09T00:07:09.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:08 vm03.local ceph-mon[129670]: pgmap v57: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1167/231 objects degraded (505.195%); 512 KiB/s, 13 objects/s recovering
2026-03-09T00:07:09.194 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.194+0000 7ff1e4832700 1 -- 192.168.123.103:0/709869398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff1dc069180 msgr2=0x7ff1dc069600 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:09.194 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.194+0000 7ff1e4832700 1 --2- 192.168.123.103:0/709869398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff1dc069180 0x7ff1dc069600 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7ff1d0009b00 tx=0x7ff1d0009e10 comp rx=0 tx=0).stop
2026-03-09T00:07:09.194 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.195+0000 7ff1e4832700 1 -- 192.168.123.103:0/709869398 shutdown_connections
2026-03-09T00:07:09.194 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.195+0000 7ff1e4832700 1 --2- 192.168.123.103:0/709869398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff1dc069180 0x7ff1dc069600 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.194 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.195+0000 7ff1e4832700 1 --2- 192.168.123.103:0/709869398 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff1dc102e30 0x7ff1dc103210 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.194 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.195+0000 7ff1e4832700 1 -- 192.168.123.103:0/709869398 >> 192.168.123.103:0/709869398 conn(0x7ff1dc076b70 msgr2=0x7ff1dc076f80 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:09.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.195+0000 7ff1e4832700 1 -- 192.168.123.103:0/709869398 shutdown_connections
2026-03-09T00:07:09.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.195+0000 7ff1e4832700 1 -- 192.168.123.103:0/709869398 wait complete.
2026-03-09T00:07:09.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.196+0000 7ff1e4832700 1 Processor -- start
2026-03-09T00:07:09.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.196+0000 7ff1e4832700 1 -- start start
2026-03-09T00:07:09.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.196+0000 7ff1e4832700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff1dc069180 0x7ff1dc198f80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:09.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.196+0000 7ff1e4832700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff1dc102e30 0x7ff1dc1994c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:09.195 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.196+0000 7ff1e4832700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff1dc199ba0 con 0x7ff1dc069180
2026-03-09T00:07:09.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.196+0000 7ff1e4832700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff1dc19d930 con 0x7ff1dc102e30
2026-03-09T00:07:09.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.197+0000 7ff1e1dcd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff1dc102e30 0x7ff1dc1994c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:09.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.197+0000 7ff1e1dcd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff1dc102e30 0x7ff1dc1994c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:37772/0 (socket says 192.168.123.103:37772)
2026-03-09T00:07:09.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.197+0000 7ff1e1dcd700 1 -- 192.168.123.103:0/1890914615 learned_addr learned my addr 192.168.123.103:0/1890914615 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:07:09.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.197+0000 7ff1e25ce700 1 --2- 192.168.123.103:0/1890914615 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff1dc069180 0x7ff1dc198f80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:09.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.197+0000 7ff1e1dcd700 1 -- 192.168.123.103:0/1890914615 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff1dc069180 msgr2=0x7ff1dc198f80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:09.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.197+0000 7ff1e1dcd700 1 --2- 192.168.123.103:0/1890914615 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff1dc069180 0x7ff1dc198f80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.196 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.197+0000 7ff1e1dcd700 1 -- 192.168.123.103:0/1890914615 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff1d00097e0 con 0x7ff1dc102e30
2026-03-09T00:07:09.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.197+0000 7ff1e25ce700 1 --2- 192.168.123.103:0/1890914615 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff1dc069180 0x7ff1dc198f80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed!
2026-03-09T00:07:09.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.198+0000 7ff1e1dcd700 1 --2- 192.168.123.103:0/1890914615 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff1dc102e30 0x7ff1dc1994c0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7ff1d0004900 tx=0x7ff1d0004930 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:07:09.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.198+0000 7ff1cf7fe700 1 -- 192.168.123.103:0/1890914615 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff1d001d070 con 0x7ff1dc102e30
2026-03-09T00:07:09.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.198+0000 7ff1e4832700 1 -- 192.168.123.103:0/1890914615 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff1dc19dbb0 con 0x7ff1dc102e30
2026-03-09T00:07:09.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.198+0000 7ff1e4832700 1 -- 192.168.123.103:0/1890914615 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff1dc19e0a0 con 0x7ff1dc102e30
2026-03-09T00:07:09.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.199+0000 7ff1cf7fe700 1 -- 192.168.123.103:0/1890914615 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff1d000bc50 con 0x7ff1dc102e30
2026-03-09T00:07:09.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.199+0000 7ff1cf7fe700 1 -- 192.168.123.103:0/1890914615 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff1d000f670 con 0x7ff1dc102e30
2026-03-09T00:07:09.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.200+0000 7ff1cf7fe700 1 -- 192.168.123.103:0/1890914615 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7ff1d000f7d0 con 0x7ff1dc102e30
2026-03-09T00:07:09.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.200+0000 7ff1e4832700 1 -- 192.168.123.103:0/1890914615 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff1dc10b5c0 con 0x7ff1dc102e30
2026-03-09T00:07:09.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.200+0000 7ff1cf7fe700 1 --2- 192.168.123.103:0/1890914615 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff1c8077990 0x7ff1c8079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:09.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.200+0000 7ff1cf7fe700 1 -- 192.168.123.103:0/1890914615 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6513+0+0 (secure 0 0 0) 0x7ff1d0068080 con 0x7ff1dc102e30
2026-03-09T00:07:09.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.201+0000 7ff1e25ce700 1 --2- 192.168.123.103:0/1890914615 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff1c8077990 0x7ff1c8079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:09.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.201+0000 7ff1e25ce700 1 --2- 192.168.123.103:0/1890914615 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff1c8077990 0x7ff1c8079e50 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7ff1d4006fd0 tx=0x7ff1d4009380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:07:09.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.204+0000 7ff1cf7fe700 1 -- 192.168.123.103:0/1890914615 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff1d00602e0 con 0x7ff1dc102e30
2026-03-09T00:07:09.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.340+0000 7ff1e4832700 1 -- 192.168.123.103:0/1890914615 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff1dc19a2e0 con 0x7ff1c8077990
2026-03-09T00:07:09.340 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.341+0000 7ff1cf7fe700 1 -- 192.168.123.103:0/1890914615 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7ff1dc19a2e0 con 0x7ff1c8077990
2026-03-09T00:07:09.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.344+0000 7ff1e4832700 1 -- 192.168.123.103:0/1890914615 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff1c8077990 msgr2=0x7ff1c8079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:09.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.344+0000 7ff1e4832700 1 --2- 192.168.123.103:0/1890914615 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff1c8077990 0x7ff1c8079e50 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7ff1d4006fd0 tx=0x7ff1d4009380 comp rx=0 tx=0).stop
2026-03-09T00:07:09.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.344+0000 7ff1e4832700 1 -- 192.168.123.103:0/1890914615 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff1dc102e30 msgr2=0x7ff1dc1994c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:09.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.344+0000 7ff1e4832700 1 --2- 192.168.123.103:0/1890914615 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff1dc102e30 0x7ff1dc1994c0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7ff1d0004900 tx=0x7ff1d0004930 comp rx=0 tx=0).stop
2026-03-09T00:07:09.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.345+0000 7ff1e4832700 1 -- 192.168.123.103:0/1890914615 shutdown_connections
2026-03-09T00:07:09.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.345+0000 7ff1e4832700 1 --2- 192.168.123.103:0/1890914615 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff1c8077990 0x7ff1c8079e50 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.345+0000 7ff1e4832700 1 --2- 192.168.123.103:0/1890914615 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff1dc069180 0x7ff1dc198f80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.345+0000 7ff1e4832700 1 --2- 192.168.123.103:0/1890914615 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff1dc102e30 0x7ff1dc1994c0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.345+0000 7ff1e4832700 1 -- 192.168.123.103:0/1890914615 >> 192.168.123.103:0/1890914615 conn(0x7ff1dc076b70 msgr2=0x7ff1dc1052f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:09.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.345+0000 7ff1e4832700 1 -- 192.168.123.103:0/1890914615 shutdown_connections
2026-03-09T00:07:09.344 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.345+0000 7ff1e4832700 1 -- 192.168.123.103:0/1890914615 wait complete.
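Both "unsafe to stop" refusals above are the upgrade's safety gate: before restarting osd.1 on the new image, the mgr asks the mons whether the OSD can be stopped without taking PGs offline, and simply retries until recovery has caught up. A rough manual equivalent (assuming your release supports the batching argument visible as "max": 16 in the dispatched command):

    # Keep asking until the mons agree osd.1 can be stopped safely,
    # mirroring the check cephadm repeats during a staggered upgrade.
    until ceph osd ok-to-stop 1 --max 16; do
        sleep 30   # PGs still degraded/recovering; try again later
    done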
2026-03-09T00:07:09.353 INFO:teuthology.orchestra.run.vm03.stdout:true
2026-03-09T00:07:09.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.411+0000 7f3f65307700 1 -- 192.168.123.103:0/2587571322 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f60103340 msgr2=0x7f3f60103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:09.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.411+0000 7f3f65307700 1 --2- 192.168.123.103:0/2587571322 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f60103340 0x7f3f60103720 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f3f50009b00 tx=0x7f3f50009e10 comp rx=0 tx=0).stop
2026-03-09T00:07:09.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.412+0000 7f3f65307700 1 -- 192.168.123.103:0/2587571322 shutdown_connections
2026-03-09T00:07:09.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.412+0000 7f3f65307700 1 --2- 192.168.123.103:0/2587571322 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3f60103cf0 0x7f3f60107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.412+0000 7f3f65307700 1 --2- 192.168.123.103:0/2587571322 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f60103340 0x7f3f60103720 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.412+0000 7f3f65307700 1 -- 192.168.123.103:0/2587571322 >> 192.168.123.103:0/2587571322 conn(0x7f3f600feb90 msgr2=0x7f3f60100fb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:09.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.412+0000 7f3f65307700 1 -- 192.168.123.103:0/2587571322 shutdown_connections
2026-03-09T00:07:09.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.412+0000 7f3f65307700 1 -- 192.168.123.103:0/2587571322 wait complete.
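The bare `true` on stdout at the start of this block is a verification predicate passing: the suite pipes a JSON query through `jq -e`, whose exit status drives a retry loop, and `jq` prints the boolean it evaluated. An illustrative loop in the same style (the filter below is an example, not the suite's exact check):

    # Illustrative: poll until every daemon reports one common version.
    while ! ceph versions | jq -e '.overall | length == 1'; do
        sleep 10
    done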
2026-03-09T00:07:09.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.412+0000 7f3f65307700 1 Processor -- start
2026-03-09T00:07:09.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.413+0000 7f3f65307700 1 -- start start
2026-03-09T00:07:09.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.413+0000 7f3f65307700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f60103cf0 0x7f3f60199070 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:09.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.413+0000 7f3f65307700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3f601995b0 0x7f3f6019da20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:09.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.413+0000 7f3f65307700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3f60199bd0 con 0x7f3f60103cf0
2026-03-09T00:07:09.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.413+0000 7f3f65307700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3f60199d40 con 0x7f3f601995b0
2026-03-09T00:07:09.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.413+0000 7f3f57fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3f601995b0 0x7f3f6019da20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:09.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.413+0000 7f3f57fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3f601995b0 0x7f3f6019da20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:37794/0 (socket says 192.168.123.103:37794)
2026-03-09T00:07:09.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.413+0000 7f3f57fff700 1 -- 192.168.123.103:0/1898297292 learned_addr learned my addr 192.168.123.103:0/1898297292 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:07:09.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.413+0000 7f3f57fff700 1 -- 192.168.123.103:0/1898297292 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f60103cf0 msgr2=0x7f3f60199070 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:09.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.413+0000 7f3f5effd700 1 --2- 192.168.123.103:0/1898297292 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f60103cf0 0x7f3f60199070 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:09.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.414+0000 7f3f57fff700 1 --2- 192.168.123.103:0/1898297292 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f60103cf0 0x7f3f60199070 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.414+0000 7f3f57fff700 1 -- 192.168.123.103:0/1898297292 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3f48009710 con 0x7f3f601995b0
2026-03-09T00:07:09.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.414+0000 7f3f5effd700 1 --2- 192.168.123.103:0/1898297292 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f60103cf0 0x7f3f60199070 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-09T00:07:09.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.414+0000 7f3f57fff700 1 --2- 192.168.123.103:0/1898297292 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3f601995b0 0x7f3f6019da20 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f3f4800eba0 tx=0x7f3f4800ef60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:07:09.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.414+0000 7f3f5cff9700 1 -- 192.168.123.103:0/1898297292 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3f4800cd20 con 0x7f3f601995b0
2026-03-09T00:07:09.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.414+0000 7f3f5cff9700 1 -- 192.168.123.103:0/1898297292 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3f4800ce80 con 0x7f3f601995b0
2026-03-09T00:07:09.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.414+0000 7f3f65307700 1 -- 192.168.123.103:0/1898297292 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3f500097e0 con 0x7f3f601995b0
2026-03-09T00:07:09.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.414+0000 7f3f5cff9700 1 -- 192.168.123.103:0/1898297292 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3f480053f0 con 0x7f3f601995b0
2026-03-09T00:07:09.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.414+0000 7f3f65307700 1 -- 192.168.123.103:0/1898297292 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3f6019e3e0 con 0x7f3f601995b0
2026-03-09T00:07:09.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.415+0000 7f3f65307700 1 -- 192.168.123.103:0/1898297292 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3f6010b6e0 con 0x7f3f601995b0
2026-03-09T00:07:09.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.416+0000 7f3f5cff9700 1 -- 192.168.123.103:0/1898297292 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f3f4801e030 con 0x7f3f601995b0
2026-03-09T00:07:09.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.417+0000 7f3f5cff9700 1 --2- 192.168.123.103:0/1898297292 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3f40077a00 0x7f3f40079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:09.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.417+0000 7f3f5effd700 1 --2- 192.168.123.103:0/1898297292 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3f40077a00 0x7f3f40079ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:09.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.417+0000 7f3f5cff9700 1 -- 192.168.123.103:0/1898297292 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6513+0+0 (secure 0 0 0) 0x7f3f48014070 con 0x7f3f601995b0
2026-03-09T00:07:09.417 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.417+0000 7f3f5effd700 1 --2- 192.168.123.103:0/1898297292 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3f40077a00 0x7f3f40079ec0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f3f50000c00 tx=0x7f3f50005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:07:09.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.419+0000 7f3f5cff9700 1 -- 192.168.123.103:0/1898297292 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3f48062890 con 0x7f3f601995b0
2026-03-09T00:07:09.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.544+0000 7f3f65307700 1 -- 192.168.123.103:0/1898297292 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3f6019a470 con 0x7f3f40077a00
2026-03-09T00:07:09.548 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.548+0000 7f3f5cff9700 1 -- 192.168.123.103:0/1898297292 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f3f6019a470 con 0x7f3f40077a00
2026-03-09T00:07:09.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.551+0000 7f3f65307700 1 -- 192.168.123.103:0/1898297292 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3f40077a00 msgr2=0x7f3f40079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:09.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.551+0000 7f3f65307700 1 --2- 192.168.123.103:0/1898297292 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3f40077a00 0x7f3f40079ec0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f3f50000c00 tx=0x7f3f50005fb0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.551+0000 7f3f65307700 1 -- 192.168.123.103:0/1898297292 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3f601995b0 msgr2=0x7f3f6019da20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:09.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.551+0000 7f3f65307700 1 --2- 192.168.123.103:0/1898297292 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3f601995b0 0x7f3f6019da20 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f3f4800eba0 tx=0x7f3f4800ef60 comp rx=0 tx=0).stop
2026-03-09T00:07:09.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.551+0000 7f3f65307700 1 -- 192.168.123.103:0/1898297292 shutdown_connections
2026-03-09T00:07:09.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.551+0000 7f3f65307700 1 --2- 192.168.123.103:0/1898297292 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3f40077a00 0x7f3f40079ec0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.550 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.551+0000 7f3f65307700 1 --2- 192.168.123.103:0/1898297292 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3f60103cf0 0x7f3f60199070 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.551+0000 7f3f65307700 1 --2- 192.168.123.103:0/1898297292 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3f601995b0 0x7f3f6019da20 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.551+0000 7f3f65307700 1 -- 192.168.123.103:0/1898297292 >> 192.168.123.103:0/1898297292 conn(0x7f3f600feb90 msgr2=0x7f3f601002c0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:09.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.551+0000 7f3f65307700 1 -- 192.168.123.103:0/1898297292 shutdown_connections
2026-03-09T00:07:09.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.551+0000 7f3f65307700 1 -- 192.168.123.103:0/1898297292 wait complete.
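The bursts of mark_down/stop messenger noise above come from short-lived CLI clients: each `ceph orch upgrade status` poll bootstraps a fresh RADOS client, fetches the mon/mgr/osd maps, issues one mgr_command, and tears every connection down again. Checked by hand it would look like this (the JSON field name is from memory and may vary by release):

    # One-shot upgrade progress check against the active mgr.
    ceph orch upgrade status
    # Scripted variant: emit just the in-flight flag.
    ceph orch upgrade status --format json | jq -r '.in_progress'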
2026-03-09T00:07:09.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.623+0000 7fd1b3e7e700 1 Processor -- start
2026-03-09T00:07:09.622 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.623+0000 7fd1b3e7e700 1 -- start start
2026-03-09T00:07:09.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.624+0000 7fd1b3e7e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1ac1032e0 0x7fd1ac198e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:09.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.624+0000 7fd1b3e7e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd1ac103c90 0x7fd1ac199360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:09.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.624+0000 7fd1b3e7e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd1ac199a40 con 0x7fd1ac1032e0
2026-03-09T00:07:09.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.624+0000 7fd1b3e7e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd1ac19d7d0 con 0x7fd1ac103c90
2026-03-09T00:07:09.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.624+0000 7fd1b1c1a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1ac1032e0 0x7fd1ac198e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:09.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.624+0000 7fd1b1c1a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1ac1032e0 0x7fd1ac198e20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:38082/0 (socket says 192.168.123.103:38082)
2026-03-09T00:07:09.623 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.624+0000 7fd1b1c1a700 1 -- 192.168.123.103:0/2719195822 learned_addr learned my addr 192.168.123.103:0/2719195822 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:07:09.624 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.624+0000 7fd1b1c1a700 1 -- 192.168.123.103:0/2719195822 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd1ac103c90 msgr2=0x7fd1ac199360 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-09T00:07:09.624 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.624+0000 7fd1b1c1a700 1 --2- 192.168.123.103:0/2719195822 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd1ac103c90 0x7fd1ac199360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.624 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.624+0000 7fd1b1c1a700 1 -- 192.168.123.103:0/2719195822 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd19c0097e0 con 0x7fd1ac1032e0
2026-03-09T00:07:09.624 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.625+0000 7fd1b1c1a700 1 --2- 192.168.123.103:0/2719195822 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1ac1032e0 0x7fd1ac198e20 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fd19c00efd0 tx=0x7fd19c00c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:07:09.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.625+0000 7fd1a2ffd700 1 -- 192.168.123.103:0/2719195822 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd19c004020 con 0x7fd1ac1032e0
2026-03-09T00:07:09.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.625+0000 7fd1a2ffd700 1 -- 192.168.123.103:0/2719195822 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fd19c003680 con 0x7fd1ac1032e0
2026-03-09T00:07:09.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.625+0000 7fd1a2ffd700 1 -- 192.168.123.103:0/2719195822 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd19c010740 con 0x7fd1ac1032e0
2026-03-09T00:07:09.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.625+0000 7fd1b3e7e700 1 -- 192.168.123.103:0/2719195822 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd1a8009710 con 0x7fd1ac1032e0
2026-03-09T00:07:09.625 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.625+0000 7fd1b3e7e700 1 -- 192.168.123.103:0/2719195822 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd1ac19dd90 con 0x7fd1ac1032e0
2026-03-09T00:07:09.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.627+0000 7fd1a2ffd700 1 -- 192.168.123.103:0/2719195822 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fd19c0109b0 con 0x7fd1ac1032e0
2026-03-09T00:07:09.629 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.627+0000 7fd1b3e7e700 1 -- 192.168.123.103:0/2719195822 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd1ac04ea90 con 0x7fd1ac1032e0
2026-03-09T00:07:09.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.630+0000 7fd1a2ffd700 1 --2- 192.168.123.103:0/2719195822 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd194077990 0x7fd194079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:09.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.630+0000 7fd1b1419700 1 --2- 192.168.123.103:0/2719195822 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd194077990 0x7fd194079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:09.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.631+0000 7fd1a2ffd700 1 -- 192.168.123.103:0/2719195822 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6513+0+0 (secure 0 0 0) 0x7fd19c014070 con 0x7fd1ac1032e0
2026-03-09T00:07:09.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.631+0000 7fd1b1419700 1 --2- 192.168.123.103:0/2719195822 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd194077990 0x7fd194079e50 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fd1ac19a440 tx=0x7fd1a800b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:07:09.630 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.631+0000 7fd1a2ffd700 1 -- 192.168.123.103:0/2719195822 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd19c05e2b0 con 0x7fd1ac1032e0
2026-03-09T00:07:09.752 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.753+0000 7fd1b3e7e700 1 -- 192.168.123.103:0/2719195822 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fd1ac19e070 con 0x7fd194077990
2026-03-09T00:07:09.757 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.758+0000 7fd1a2ffd700 1 -- 192.168.123.103:0/2719195822 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7fd1ac19e070 con 0x7fd194077990
2026-03-09T00:07:09.757 INFO:teuthology.orchestra.run.vm03.stdout:NAME                    HOST  PORTS             STATUS        REFRESHED  AGE  MEM USE  MEM LIM  VERSION               IMAGE ID      CONTAINER ID
2026-03-09T00:07:09.757 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03       vm03  *:9093,9094       running (2m)  50s ago    7m   24.0M    -        0.25.0                c8568f914cd2  6bc39b415ac6
2026-03-09T00:07:09.757 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03      vm03                    running (7m)  50s ago    7m   8786k    -        18.2.1                5be31c24972a  b93f8a220f71
2026-03-09T00:07:09.757 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06      vm06                    running (7m)  61s ago    7m   8656k    -        18.2.1                5be31c24972a  d06aea65065e
2026-03-09T00:07:09.757 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03              vm03                    running (64s) 50s ago    7m   7838k    -        19.2.3-678-ge911bdeb  654f31e6858e  a76700a1f6bf
2026-03-09T00:07:09.757 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06              vm06                    running (63s) 61s ago    7m   7860k    -        19.2.3-678-ge911bdeb  654f31e6858e  ab35f6a843c1
2026-03-09T00:07:09.757 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03            vm03  *:3000            running (2m)  50s ago    7m   73.3M    -        10.4.0                c8b91775d855  00a3394cdec9
2026-03-09T00:07:09.757 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade  vm03                    running (5m)  50s ago    5m   18.1M    -        18.2.1                5be31c24972a  404501ca3f76
2026-03-09T00:07:09.758 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk  vm03                    running (5m)  50s ago    5m   218M     -        18.2.1                5be31c24972a  b71cb8823eff
2026-03-09T00:07:09.758 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim  vm06                    running (5m)  61s ago    5m   19.7M    -        18.2.1                5be31c24972a  868f24dd3b07
2026-03-09T00:07:09.758 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl  vm06                    running (5m)  61s ago    5m   15.9M    -        18.2.1                5be31c24972a  84dbd6c37a69
2026-03-09T00:07:09.758 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons         vm03  *:8443,9283,8765  running (3m)  50s ago    8m   619M     -        19.2.3-678-ge911bdeb  654f31e6858e  5c5f89207f88
2026-03-09T00:07:09.758 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn         vm06  *:8443,9283,8765  running (2m)  61s ago    6m   489M     -        19.2.3-678-ge911bdeb  654f31e6858e  e3d70135f3ac
2026-03-09T00:07:09.758 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03                vm03                    running (92s) 50s ago    8m   57.6M    2048M    19.2.3-678-ge911bdeb  654f31e6858e  cafe87ec117d
2026-03-09T00:07:09.758 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06                vm06                    running (77s) 61s ago    6m   48.4M    2048M    19.2.3-678-ge911bdeb  654f31e6858e  33df752aa193
2026-03-09T00:07:09.758 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03      vm03  *:9100            running (2m)  50s ago    7m   9445k    -        1.7.0                 72c9c2088986  0cdd6e671b4f
2026-03-09T00:07:09.758 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06      vm06  *:9100            running (2m)  61s ago    7m   9420k    -        1.7.0                 72c9c2088986  848c5c72973d
2026-03-09T00:07:09.758 INFO:teuthology.orchestra.run.vm03.stdout:osd.0                   vm03                    running (53s) 50s ago    6m   30.2M    4096M    19.2.3-678-ge911bdeb  654f31e6858e  7112eceae9ce
2026-03-09T00:07:09.758 INFO:teuthology.orchestra.run.vm03.stdout:osd.1                   vm03                    running (6m)  50s ago    6m   451M     4096M    18.2.1                5be31c24972a  7bc729875521
2026-03-09T00:07:09.758 INFO:teuthology.orchestra.run.vm03.stdout:osd.2                   vm03                    running (6m)  50s ago    6m   370M     4096M    18.2.1                5be31c24972a  00566abbcc16
2026-03-09T00:07:09.758 INFO:teuthology.orchestra.run.vm03.stdout:osd.3                   vm06                    running (6m)  61s ago    6m   499M     4096M    18.2.1                5be31c24972a  1ece32056ab6
2026-03-09T00:07:09.758 INFO:teuthology.orchestra.run.vm03.stdout:osd.4                   vm06                    running (5m)  61s ago    5m   475M     4096M    18.2.1                5be31c24972a  ee6260a1124c
2026-03-09T00:07:09.758 INFO:teuthology.orchestra.run.vm03.stdout:osd.5                   vm06                    running (5m)  61s ago    5m   398M     4096M    18.2.1                5be31c24972a  f51e8cd94301
2026-03-09T00:07:09.758 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03         vm03  *:9095            running (2m)  50s ago    7m   56.6M    -        2.51.0                1d3b7f56885b  16d6071e49fb
2026-03-09T00:07:09.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.760+0000 7fd1b3e7e700 1 -- 192.168.123.103:0/2719195822 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd194077990 msgr2=0x7fd194079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:09.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.760+0000 7fd1b3e7e700 1 --2- 192.168.123.103:0/2719195822 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd194077990 0x7fd194079e50 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fd1ac19a440 tx=0x7fd1a800b540 comp rx=0 tx=0).stop
2026-03-09T00:07:09.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.761+0000 7fd1b3e7e700 1 -- 192.168.123.103:0/2719195822 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1ac1032e0 msgr2=0x7fd1ac198e20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:09.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.761+0000 7fd1b3e7e700 1 --2- 192.168.123.103:0/2719195822 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1ac1032e0 0x7fd1ac198e20 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fd19c00efd0 tx=0x7fd19c00c5b0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.761+0000 7fd1b3e7e700 1 -- 192.168.123.103:0/2719195822 shutdown_connections
2026-03-09T00:07:09.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.761+0000 7fd1b3e7e700 1 --2- 192.168.123.103:0/2719195822 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd194077990 0x7fd194079e50 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.761+0000 7fd1b3e7e700 1 --2- 192.168.123.103:0/2719195822 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd1ac1032e0 0x7fd1ac198e20 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.761+0000 7fd1b3e7e700 1 --2- 192.168.123.103:0/2719195822 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd1ac103c90 0x7fd1ac199360 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.761+0000 7fd1b3e7e700 1 -- 192.168.123.103:0/2719195822 >> 192.168.123.103:0/2719195822 conn(0x7fd1ac0feb50 msgr2=0x7fd1ac100140 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:09.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.762+0000 7fd1b3e7e700 1 -- 192.168.123.103:0/2719195822 shutdown_connections
2026-03-09T00:07:09.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.762+0000 7fd1b3e7e700 1 -- 192.168.123.103:0/2719195822 wait complete.
2026-03-09T00:07:09.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.829+0000 7f67d14ef700 1 -- 192.168.123.103:0/818487042 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f67cc1033c0 msgr2=0x7f67cc1037a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:09.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.829+0000 7f67d14ef700 1 --2- 192.168.123.103:0/818487042 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f67cc1033c0 0x7f67cc1037a0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f67c0009b00 tx=0x7f67c0009e10 comp rx=0 tx=0).stop
2026-03-09T00:07:09.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.830+0000 7f67d14ef700 1 -- 192.168.123.103:0/818487042 shutdown_connections
2026-03-09T00:07:09.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.830+0000 7f67d14ef700 1 --2- 192.168.123.103:0/818487042 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f67cc103d70 0x7f67cc107dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.830+0000 7f67d14ef700 1 --2- 192.168.123.103:0/818487042 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f67cc1033c0 0x7f67cc1037a0 secure :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f67c0009b00 tx=0x7f67c0009e10 comp rx=0 tx=0).stop
2026-03-09T00:07:09.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.830+0000 7f67d14ef700 1 -- 192.168.123.103:0/818487042 >> 192.168.123.103:0/818487042 conn(0x7f67cc0fec30 msgr2=0x7f67cc101050 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:09.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.832+0000 7f67d14ef700 1 -- 192.168.123.103:0/818487042 shutdown_connections
2026-03-09T00:07:09.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.832+0000 7f67d14ef700 1 -- 192.168.123.103:0/818487042 wait complete.
2026-03-09T00:07:09.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.832+0000 7f67d14ef700 1 Processor -- start
2026-03-09T00:07:09.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.832+0000 7f67d14ef700 1 -- start start
2026-03-09T00:07:09.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.832+0000 7f67d14ef700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f67cc103d70 0x7f67cc0752a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:09.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.832+0000 7f67d14ef700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f67cc0757e0 0x7f67cc075c60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:09.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.832+0000 7f67d14ef700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f67cc079320 con 0x7f67cc0757e0
2026-03-09T00:07:09.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.832+0000 7f67d14ef700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f67cc079490 con 0x7f67cc103d70
2026-03-09T00:07:09.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.833+0000 7f67ca7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f67cc0757e0 0x7f67cc075c60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:09.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.833+0000 7f67ca7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f67cc0757e0 0x7f67cc075c60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:38110/0 (socket says 192.168.123.103:38110)
2026-03-09T00:07:09.832 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.833+0000 7f67ca7fc700 1 -- 192.168.123.103:0/1131329763 learned_addr learned my addr 192.168.123.103:0/1131329763 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:07:09.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.833+0000 7f67ca7fc700 1 -- 192.168.123.103:0/1131329763 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f67cc103d70 msgr2=0x7f67cc0752a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-09T00:07:09.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.833+0000 7f67ca7fc700 1 --2- 192.168.123.103:0/1131329763 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f67cc103d70 0x7f67cc0752a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:09.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.834+0000 7f67ca7fc700 1 -- 192.168.123.103:0/1131329763 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f67c00097e0 con 0x7f67cc0757e0
2026-03-09T00:07:09.833 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.834+0000 7f67ca7fc700 1 --2- 192.168.123.103:0/1131329763 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f67cc0757e0 0x7f67cc075c60 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f67bc009fd0 tx=0x7f67bc00c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:07:09.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.836+0000 7f67b3fff700 1 -- 192.168.123.103:0/1131329763 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f67bc00cd70 con 0x7f67cc0757e0
2026-03-09T00:07:09.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.836+0000 7f67b3fff700 1 -- 192.168.123.103:0/1131329763 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f67bc004d10 con 0x7f67cc0757e0
2026-03-09T00:07:09.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.837+0000 7f67d14ef700 1 -- 192.168.123.103:0/1131329763 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f67cc071b10 con 0x7f67cc0757e0
2026-03-09T00:07:09.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.837+0000 7f67d14ef700 1 -- 192.168.123.103:0/1131329763 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f67cc072060 con 0x7f67cc0757e0
2026-03-09T00:07:09.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.836+0000 7f67b3fff700 1 -- 192.168.123.103:0/1131329763 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f67bc010640 con 0x7f67cc0757e0
2026-03-09T00:07:09.838 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.839+0000 7f67d14ef700 1 -- 192.168.123.103:0/1131329763 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f67cc04ea90 con 0x7f67cc0757e0
2026-03-09T00:07:09.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.841+0000 7f67b3fff700 1 -- 192.168.123.103:0/1131329763 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f67bc004750 con 0x7f67cc0757e0
2026-03-09T00:07:09.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.841+0000 7f67b3fff700 1 --2- 192.168.123.103:0/1131329763 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f67b4077ab0 0x7f67b4079f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:09.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.841+0000 7f67b3fff700 1 -- 192.168.123.103:0/1131329763 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6513+0+0 (secure 0 0 0) 0x7f67bc014070 con 0x7f67cc0757e0
2026-03-09T00:07:09.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.843+0000 7f67b3fff700 1 -- 192.168.123.103:0/1131329763 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f67bc0d09f0 con 0x7f67cc0757e0
2026-03-09T00:07:09.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.843+0000 7f67caffd700 1 --2- 192.168.123.103:0/1131329763 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f67b4077ab0 0x7f67b4079f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:09.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:09.844+0000 7f67caffd700 1 --2- 192.168.123.103:0/1131329763 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f67b4077ab0 0x7f67b4079f70 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f67c000b5c0 tx=0x7f67c0005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:07:10.014 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.014+0000 7f67d14ef700 1 -- 192.168.123.103:0/1131329763 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f67cc066e80 con 0x7f67cc0757e0
2026-03-09T00:07:10.015 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.015+0000 7f67b3fff700 1 -- 192.168.123.103:0/1131329763 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f67bc062700 con 0x7f67cc0757e0
2026-03-09T00:07:10.015 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:07:10.015 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T00:07:10.015 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:07:10.015 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:07:10.015 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T00:07:10.015 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:07:10.015 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:07:10.015 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T00:07:10.015 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 5,
2026-03-09T00:07:10.015 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-09T00:07:10.015 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:07:10.016 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T00:07:10.016 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-09T00:07:10.016 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:07:10.016 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T00:07:10.016 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 9,
2026-03-09T00:07:10.016 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5
2026-03-09T00:07:10.016 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-09T00:07:10.016 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:07:10.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.018+0000 7f67d14ef700 1 -- 192.168.123.103:0/1131329763 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f67b4077ab0 msgr2=0x7f67b4079f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:10.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.018+0000 7f67d14ef700 1 --2- 192.168.123.103:0/1131329763 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f67b4077ab0 0x7f67b4079f70 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f67c000b5c0 tx=0x7f67c0005fb0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.018+0000 7f67d14ef700 1 -- 192.168.123.103:0/1131329763 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f67cc0757e0 msgr2=0x7f67cc075c60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:10.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.018+0000 7f67d14ef700 1 --2- 192.168.123.103:0/1131329763 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f67cc0757e0 0x7f67cc075c60 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f67bc009fd0 tx=0x7f67bc00c5b0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.019+0000 7f67d14ef700 1 -- 192.168.123.103:0/1131329763 shutdown_connections
2026-03-09T00:07:10.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.019+0000 7f67d14ef700 1 --2- 192.168.123.103:0/1131329763 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f67b4077ab0 0x7f67b4079f70 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.019+0000 7f67d14ef700 1 --2- 192.168.123.103:0/1131329763 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f67cc103d70 0x7f67cc0752a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.019+0000 7f67d14ef700 1 --2- 192.168.123.103:0/1131329763 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f67cc0757e0 0x7f67cc075c60 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.019+0000 7f67d14ef700 1 -- 192.168.123.103:0/1131329763 >> 192.168.123.103:0/1131329763 conn(0x7f67cc0fec30 msgr2=0x7f67cc1002e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:10.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.019+0000 7f67d14ef700 1 -- 192.168.123.103:0/1131329763 shutdown_connections
2026-03-09T00:07:10.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.019+0000 7f67d14ef700 1 -- 192.168.123.103:0/1131329763 wait complete.
2026-03-09T00:07:10.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.084+0000 7f8187884700 1 -- 192.168.123.103:0/1536121740 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8180102db0 msgr2=0x7f8180103190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:10.085 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.084+0000 7f8187884700 1 --2- 192.168.123.103:0/1536121740 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8180102db0 0x7f8180103190 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f8170009b00 tx=0x7f8170009e10 comp rx=0 tx=0).stop
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.085+0000 7f8187884700 1 -- 192.168.123.103:0/1536121740 shutdown_connections
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.085+0000 7f8187884700 1 --2- 192.168.123.103:0/1536121740 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8180069180 0x7f8180069600 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.085+0000 7f8187884700 1 --2- 192.168.123.103:0/1536121740 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8180102db0 0x7f8180103190 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.085+0000 7f8187884700 1 -- 192.168.123.103:0/1536121740 >> 192.168.123.103:0/1536121740 conn(0x7f8180076b70 msgr2=0x7f8180076f80 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.085+0000 7f8187884700 1 -- 192.168.123.103:0/1536121740 shutdown_connections
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.085+0000 7f8187884700 1 -- 192.168.123.103:0/1536121740 wait complete.
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.085+0000 7f8187884700 1 Processor -- start
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.085+0000 7f8187884700 1 -- start start
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.085+0000 7f8187884700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8180069180 0x7f818019dc00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.085+0000 7f8187884700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f818010fc40 0x7f81801100c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.085+0000 7f8187884700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8180110600 con 0x7f818010fc40
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.085+0000 7f8187884700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8180110770 con 0x7f8180069180
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.085+0000 7f8184e1f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f818010fc40 0x7f81801100c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.086+0000 7f8184e1f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f818010fc40 0x7f81801100c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:38130/0 (socket says 192.168.123.103:38130)
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.086+0000 7f8184e1f700 1 -- 192.168.123.103:0/1085485231 learned_addr learned my addr 192.168.123.103:0/1085485231 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.086+0000 7f8184e1f700 1 -- 192.168.123.103:0/1085485231 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8180069180 msgr2=0x7f818019dc00 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.086+0000 7f8184e1f700 1 --2- 192.168.123.103:0/1085485231 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8180069180 0x7f818019dc00 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.086+0000 7f8184e1f700 1 -- 192.168.123.103:0/1085485231 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f81700097e0 con 0x7f818010fc40
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.086+0000 7f8184e1f700 1 --2- 192.168.123.103:0/1085485231 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f818010fc40 0x7f81801100c0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f817c00cc90 tx=0x7f817c0074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:07:10.086 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.087+0000 7f81767fc700 1 -- 192.168.123.103:0/1085485231 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f817c007ab0 con 0x7f818010fc40
2026-03-09T00:07:10.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.087+0000 7f81767fc700 1 -- 192.168.123.103:0/1085485231 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f817c004d10 con 0x7f818010fc40
2026-03-09T00:07:10.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.087+0000 7f81767fc700 1 -- 192.168.123.103:0/1085485231 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f817c0056a0 con 0x7f818010fc40
2026-03-09T00:07:10.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.087+0000 7f8187884700 1 -- 192.168.123.103:0/1085485231 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f81801108b0 con 0x7f818010fc40
2026-03-09T00:07:10.087 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.087+0000 7f8187884700 1 -- 192.168.123.103:0/1085485231 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8180199880 con 0x7f818010fc40
2026-03-09T00:07:10.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.088+0000 7f81767fc700 1 -- 192.168.123.103:0/1085485231 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f817c004750 con 0x7f818010fc40
2026-03-09T00:07:10.088 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.089+0000 7f8187884700 1 -- 192.168.123.103:0/1085485231 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f818004ea90 con 0x7f818010fc40
2026-03-09T00:07:10.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.091+0000 7f81767fc700 1 --2- 192.168.123.103:0/1085485231 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f816c0779f0 0x7f816c079eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:10.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.091+0000 7f8185620700 1 --2- 192.168.123.103:0/1085485231 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f816c0779f0 0x7f816c079eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:10.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.091+0000 7f81767fc700 1 -- 192.168.123.103:0/1085485231 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6513+0+0 (secure 0 0 0) 0x7f817c0999a0 con 0x7f818010fc40
2026-03-09T00:07:10.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.091+0000 7f8185620700 1 --2- 192.168.123.103:0/1085485231 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f816c0779f0 0x7f816c079eb0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f817000b5c0 tx=0x7f8170005dc0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:07:10.091 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.092+0000 7f81767fc700 1 -- 192.168.123.103:0/1085485231 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f817c061eb0 con 0x7f818010fc40
2026-03-09T00:07:10.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.227+0000 7f8187884700 1 -- 192.168.123.103:0/1085485231 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f8180196070 con 0x7f818010fc40
2026-03-09T00:07:10.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.228+0000 7f81767fc700 1 -- 192.168.123.103:0/1085485231 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1919 (secure 0 0 0) 0x7f817c01d020 con 0x7f818010fc40
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:e11
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1)
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:epoch 11
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T00:01:42.952984+0000
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T00:01:51.424075+0000
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:root 0
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {}
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 39
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:in 0
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14480}
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:failed
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:damaged
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:stopped
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3]
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members:
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.sejksk{0:14480} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons:
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.ralade{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.ixduim{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:07:10.228 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.vlrwtl{-1:24297} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6824/1090058295,v1:192.168.123.106:6825/1090058295] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:07:10.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.230+0000 7f8187884700 1 -- 192.168.123.103:0/1085485231 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f816c0779f0 msgr2=0x7f816c079eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:10.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.230+0000 7f8187884700 1 --2- 192.168.123.103:0/1085485231 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f816c0779f0 0x7f816c079eb0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f817000b5c0 tx=0x7f8170005dc0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.230+0000 7f8187884700 1 -- 192.168.123.103:0/1085485231 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f818010fc40 msgr2=0x7f81801100c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:10.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.230+0000 7f8187884700 1 --2- 192.168.123.103:0/1085485231 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f818010fc40 0x7f81801100c0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f817c00cc90 tx=0x7f817c0074a0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.231+0000 7f8187884700 1 -- 192.168.123.103:0/1085485231 shutdown_connections
2026-03-09T00:07:10.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.231+0000 7f8187884700 1 --2- 192.168.123.103:0/1085485231 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f816c0779f0 0x7f816c079eb0 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.231+0000 7f8187884700 1 --2- 192.168.123.103:0/1085485231 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8180069180 0x7f818019dc00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.231+0000 7f8187884700 1 --2- 192.168.123.103:0/1085485231 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f818010fc40 0x7f81801100c0 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.231+0000 7f8187884700 1 -- 192.168.123.103:0/1085485231 >> 192.168.123.103:0/1085485231 conn(0x7f8180076b70 msgr2=0x7f81800fe110 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:10.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.231+0000 7f8187884700 1 -- 192.168.123.103:0/1085485231 shutdown_connections
2026-03-09T00:07:10.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.231+0000 7f8187884700 1 -- 192.168.123.103:0/1085485231 wait complete.
2026-03-09T00:07:10.231 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 11
2026-03-09T00:07:10.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.294+0000 7f1335704700 1 -- 192.168.123.103:0/1870486481 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13301020c0 msgr2=0x7f13301024a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:10.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.294+0000 7f1335704700 1 --2- 192.168.123.103:0/1870486481 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13301020c0 0x7f13301024a0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f1318009b30 tx=0x7f1318009e40 comp rx=0 tx=0).stop
2026-03-09T00:07:10.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.295+0000 7f1335704700 1 -- 192.168.123.103:0/1870486481 shutdown_connections
2026-03-09T00:07:10.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.295+0000 7f1335704700 1 --2- 192.168.123.103:0/1870486481 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f13301029e0 0x7f133010aed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.295+0000 7f1335704700 1 --2- 192.168.123.103:0/1870486481 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13301020c0 0x7f13301024a0 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.295+0000 7f1335704700 1 -- 192.168.123.103:0/1870486481 >> 192.168.123.103:0/1870486481 conn(0x7f13300fb830 msgr2=0x7f13300fdc50 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:10.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.295+0000 7f1335704700 1 -- 192.168.123.103:0/1870486481 shutdown_connections
2026-03-09T00:07:10.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.295+0000 7f1335704700 1 -- 192.168.123.103:0/1870486481 wait complete.
2026-03-09T00:07:10.294 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.295+0000 7f1335704700 1 Processor -- start
2026-03-09T00:07:10.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.295+0000 7f1335704700 1 -- start start
2026-03-09T00:07:10.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.296+0000 7f1335704700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f13301020c0 0x7f1330198dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:10.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.296+0000 7f1335704700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13301029e0 0x7f1330199310 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:10.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.296+0000 7f1335704700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f13301999f0 con 0x7f13301029e0
2026-03-09T00:07:10.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.296+0000 7f1335704700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f133019d710 con 0x7f13301020c0
2026-03-09T00:07:10.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.296+0000 7f132e7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13301029e0 0x7f1330199310 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:10.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.296+0000 7f132e7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13301029e0 0x7f1330199310 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:38148/0 (socket says 192.168.123.103:38148)
2026-03-09T00:07:10.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.296+0000 7f132e7fc700 1 -- 192.168.123.103:0/1656799908 learned_addr learned my addr 192.168.123.103:0/1656799908 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:07:10.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.296+0000 7f132effd700 1 --2- 192.168.123.103:0/1656799908 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f13301020c0 0x7f1330198dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:10.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.296+0000 7f132e7fc700 1 -- 192.168.123.103:0/1656799908 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f13301020c0 msgr2=0x7f1330198dd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:10.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.296+0000 7f132e7fc700 1 --2- 192.168.123.103:0/1656799908 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f13301020c0 0x7f1330198dd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.296+0000 7f132e7fc700 1 -- 192.168.123.103:0/1656799908 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f13180097e0 con 0x7f13301029e0
2026-03-09T00:07:10.296 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.296+0000 7f132e7fc700 1 --2- 192.168.123.103:0/1656799908 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13301029e0 0x7f1330199310 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f132000cc60 tx=0x7f13200074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:07:10.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.297+0000 7f1327fff700 1 -- 192.168.123.103:0/1656799908 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1320007af0 con 0x7f13301029e0
2026-03-09T00:07:10.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.297+0000 7f1327fff700 1 -- 192.168.123.103:0/1656799908 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f1320007c50 con 0x7f13301029e0
2026-03-09T00:07:10.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.297+0000 7f1327fff700 1 -- 192.168.123.103:0/1656799908 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1320018700 con 0x7f13301029e0
2026-03-09T00:07:10.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.297+0000 7f1335704700 1 -- 192.168.123.103:0/1656799908 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f133019d9f0 con 0x7f13301029e0
2026-03-09T00:07:10.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.297+0000 7f1335704700 1 -- 192.168.123.103:0/1656799908 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f133019df40 con 0x7f13301029e0
2026-03-09T00:07:10.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.298+0000 7f1335704700 1 -- 192.168.123.103:0/1656799908 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f133010aa30 con 0x7f13301029e0
2026-03-09T00:07:10.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.298+0000 7f1327fff700 1 -- 192.168.123.103:0/1656799908 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f132001f030 con 0x7f13301029e0
2026-03-09T00:07:10.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.299+0000 7f1327fff700 1 --2- 192.168.123.103:0/1656799908 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f131c077ab0 0x7f131c079f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:10.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.299+0000 7f1327fff700 1 -- 192.168.123.103:0/1656799908 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6513+0+0 (secure 0 0 0) 0x7f1320099e50 con 0x7f13301029e0
2026-03-09T00:07:10.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.300+0000 7f132effd700 1 --2- 192.168.123.103:0/1656799908 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f131c077ab0 0x7f131c079f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:10.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.300+0000 7f132effd700 1 --2- 192.168.123.103:0/1656799908 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f131c077ab0 0x7f131c079f70 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f131800b580 tx=0x7f1318005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:07:10.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.301+0000 7f1327fff700 1 -- 192.168.123.103:0/1656799908 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1320062360 con 0x7f13301029e0
2026-03-09T00:07:10.421 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.421+0000 7f1335704700 1 -- 192.168.123.103:0/1656799908 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f133019e220 con 0x7f131c077ab0
2026-03-09T00:07:10.422 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.422+0000 7f1327fff700 1 -- 192.168.123.103:0/1656799908 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f133019e220 con 0x7f131c077ab0
2026-03-09T00:07:10.422 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:07:10.422 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T00:07:10.422 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true,
2026-03-09T00:07:10.422 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-09T00:07:10.422 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [
2026-03-09T00:07:10.422 INFO:teuthology.orchestra.run.vm03.stdout: "crash",
2026-03-09T00:07:10.422 INFO:teuthology.orchestra.run.vm03.stdout: "mon",
2026-03-09T00:07:10.422 INFO:teuthology.orchestra.run.vm03.stdout: "mgr"
2026-03-09T00:07:10.422 INFO:teuthology.orchestra.run.vm03.stdout: ],
2026-03-09T00:07:10.422 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "7/23 daemons upgraded",
2026-03-09T00:07:10.422 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons",
2026-03-09T00:07:10.422 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false
2026-03-09T00:07:10.422 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:07:10.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.425+0000 7f1335704700 1 -- 192.168.123.103:0/1656799908 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f131c077ab0 msgr2=0x7f131c079f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:10.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.425+0000 7f1335704700 1 --2- 192.168.123.103:0/1656799908 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f131c077ab0 0x7f131c079f70 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f131800b580 tx=0x7f1318005fb0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.425+0000 7f1335704700 1 -- 192.168.123.103:0/1656799908 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13301029e0 msgr2=0x7f1330199310 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:10.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.425+0000 7f1335704700 1 --2- 192.168.123.103:0/1656799908 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13301029e0 0x7f1330199310 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f132000cc60 tx=0x7f13200074a0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.425+0000 7f1335704700 1 -- 192.168.123.103:0/1656799908 shutdown_connections
2026-03-09T00:07:10.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.425+0000 7f1335704700 1 --2- 192.168.123.103:0/1656799908 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f131c077ab0 0x7f131c079f70 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.425+0000 7f1335704700 1 --2- 192.168.123.103:0/1656799908 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f13301020c0 0x7f1330198dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.425+0000 7f1335704700 1 --2- 192.168.123.103:0/1656799908 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f13301029e0 0x7f1330199310 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.425+0000 7f1335704700 1 -- 192.168.123.103:0/1656799908 >> 192.168.123.103:0/1656799908 conn(0x7f13300fb830 msgr2=0x7f13301004e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:10.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.425+0000 7f1335704700 1 -- 192.168.123.103:0/1656799908 shutdown_connections
2026-03-09T00:07:10.424 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.425+0000 7f1335704700 1 -- 192.168.123.103:0/1656799908 wait complete.
2026-03-09T00:07:10.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.488+0000 7fae749de700 1 -- 192.168.123.103:0/2880907479 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae6c073ae0 msgr2=0x7fae6c10d170 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:10.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.488+0000 7fae749de700 1 --2- 192.168.123.103:0/2880907479 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae6c073ae0 0x7fae6c10d170 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7fae60009b50 tx=0x7fae60009e60 comp rx=0 tx=0).stop
2026-03-09T00:07:10.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.488+0000 7fae749de700 1 -- 192.168.123.103:0/2880907479 shutdown_connections
2026-03-09T00:07:10.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.488+0000 7fae749de700 1 --2- 192.168.123.103:0/2880907479 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae6c073ae0 0x7fae6c10d170 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.488+0000 7fae749de700 1 --2- 192.168.123.103:0/2880907479 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae6c0731c0 0x7fae6c0735a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.488+0000 7fae749de700 1 -- 192.168.123.103:0/2880907479 >> 192.168.123.103:0/2880907479 conn(0x7fae6c0fc920 msgr2=0x7fae6c0fed40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:10.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.488+0000 7fae749de700 1 -- 192.168.123.103:0/2880907479 shutdown_connections
2026-03-09T00:07:10.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.488+0000 7fae749de700 1 -- 192.168.123.103:0/2880907479 wait complete.
2026-03-09T00:07:10.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.489+0000 7fae749de700 1 Processor -- start 2026-03-09T00:07:10.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.489+0000 7fae749de700 1 -- start start 2026-03-09T00:07:10.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.489+0000 7fae749de700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae6c0731c0 0x7fae6c198d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:10.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.489+0000 7fae749de700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae6c073ae0 0x7fae6c1992c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:10.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.489+0000 7fae749de700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fae6c1999a0 con 0x7fae6c073ae0 2026-03-09T00:07:10.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.489+0000 7fae749de700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fae6c19d730 con 0x7fae6c0731c0 2026-03-09T00:07:10.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.490+0000 7fae71f79700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae6c073ae0 0x7fae6c1992c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:10.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.490+0000 7fae71f79700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae6c073ae0 0x7fae6c1992c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:38168/0 (socket says 192.168.123.103:38168) 2026-03-09T00:07:10.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.490+0000 7fae71f79700 1 -- 192.168.123.103:0/2899626898 learned_addr learned my addr 192.168.123.103:0/2899626898 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:07:10.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.490+0000 7fae7277a700 1 --2- 192.168.123.103:0/2899626898 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae6c0731c0 0x7fae6c198d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:10.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.490+0000 7fae71f79700 1 -- 192.168.123.103:0/2899626898 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae6c0731c0 msgr2=0x7fae6c198d80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:10.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.490+0000 7fae71f79700 1 --2- 192.168.123.103:0/2899626898 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae6c0731c0 0x7fae6c198d80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:10.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.490+0000 7fae71f79700 1 -- 192.168.123.103:0/2899626898 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fae600097e0 con 0x7fae6c073ae0 2026-03-09T00:07:10.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.490+0000 7fae7277a700 1 --2- 192.168.123.103:0/2899626898 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae6c0731c0 0x7fae6c198d80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:07:10.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.490+0000 7fae71f79700 1 --2- 192.168.123.103:0/2899626898 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae6c073ae0 0x7fae6c1992c0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fae60004c30 tx=0x7fae60004e80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:07:10.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.490+0000 7fae677fe700 1 -- 192.168.123.103:0/2899626898 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fae6001d070 con 0x7fae6c073ae0 2026-03-09T00:07:10.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.490+0000 7fae677fe700 1 -- 192.168.123.103:0/2899626898 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fae60022470 con 0x7fae6c073ae0 2026-03-09T00:07:10.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.490+0000 7fae677fe700 1 -- 192.168.123.103:0/2899626898 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fae6000f670 con 0x7fae6c073ae0 2026-03-09T00:07:10.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.490+0000 7fae749de700 1 -- 192.168.123.103:0/2899626898 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fae6c19d9b0 con 0x7fae6c073ae0 2026-03-09T00:07:10.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.490+0000 7fae749de700 1 -- 192.168.123.103:0/2899626898 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fae6c19dea0 con 0x7fae6c073ae0 2026-03-09T00:07:10.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.491+0000 7fae749de700 1 -- 192.168.123.103:0/2899626898 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fae6c10a870 con 0x7fae6c073ae0 2026-03-09T00:07:10.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.492+0000 7fae677fe700 1 -- 192.168.123.103:0/2899626898 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 38) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fae60022ae0 con 0x7fae6c073ae0 2026-03-09T00:07:10.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.493+0000 7fae677fe700 1 --2- 192.168.123.103:0/2899626898 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fae5c077b00 0x7fae5c079fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:10.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.493+0000 7fae677fe700 1 -- 192.168.123.103:0/2899626898 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(56..56 src has 1..56) v4 ==== 6513+0+0 (secure 0 0 0) 0x7fae6009b580 con 0x7fae6c073ae0 2026-03-09T00:07:10.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.495+0000 7fae7277a700 1 --2- 192.168.123.103:0/2899626898 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fae5c077b00 
0x7fae5c079fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:10.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.495+0000 7fae677fe700 1 -- 192.168.123.103:0/2899626898 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fae60063a90 con 0x7fae6c073ae0 2026-03-09T00:07:10.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.495+0000 7fae7277a700 1 --2- 192.168.123.103:0/2899626898 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fae5c077b00 0x7fae5c079fc0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fae58009dd0 tx=0x7fae58009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:07:10.649 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.650+0000 7fae749de700 1 -- 192.168.123.103:0/2899626898 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fae6c04ea90 con 0x7fae6c073ae0 2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.650+0000 7fae677fe700 1 -- 192.168.123.103:0/2899626898 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1400 (secure 0 0 0) 0x7fae60027590 con 0x7fae6c073ae0 2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data; Degraded data redundancy: 1167/231 objects degraded (505.195%), 13 pgs degraded 2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled. 
2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 1167/231 objects degraded (505.195%), 13 pgs degraded
2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1 is active+recovery_wait+undersized+degraded+remapped, acting [2,4]
2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.6 is active+recovery_wait+undersized+degraded+remapped, acting [1,4]
2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.b is active+recovery_wait+undersized+degraded+remapped, acting [1,4]
2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.c is active+recovery_wait+undersized+degraded+remapped, acting [5,3]
2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.f is active+recovering+undersized+degraded+remapped, acting [5,3]
2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.10 is active+recovery_wait+undersized+degraded+remapped, acting [5,1]
2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.11 is active+recovery_wait+undersized+degraded+remapped, acting [3,4]
2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.12 is active+recovery_wait+undersized+degraded+remapped, acting [1,3]
2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.15 is active+recovery_wait+undersized+degraded+remapped, acting [3,4]
2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.17 is active+recovery_wait+undersized+degraded+remapped, acting [2,5]
2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.18 is active+recovery_wait+undersized+degraded+remapped, acting [2,1]
2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1b is active+recovery_wait+undersized+degraded+remapped, acting [3,4]
2026-03-09T00:07:10.650 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1f is active+recovery_wait+undersized+degraded+remapped, acting [2,3]
2026-03-09T00:07:10.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.653+0000 7fae749de700 1 -- 192.168.123.103:0/2899626898 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fae5c077b00 msgr2=0x7fae5c079fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:10.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.653+0000 7fae749de700 1 --2- 192.168.123.103:0/2899626898 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fae5c077b00 0x7fae5c079fc0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fae58009dd0 tx=0x7fae58009450 comp rx=0 tx=0).stop
2026-03-09T00:07:10.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.653+0000 7fae749de700 1 -- 192.168.123.103:0/2899626898 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae6c073ae0 msgr2=0x7fae6c1992c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:10.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.653+0000 7fae749de700 1 --2- 192.168.123.103:0/2899626898 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae6c073ae0 0x7fae6c1992c0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fae60004c30 tx=0x7fae60004e80 comp rx=0 tx=0).stop
2026-03-09T00:07:10.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.653+0000 7fae749de700 1 -- 192.168.123.103:0/2899626898 shutdown_connections
2026-03-09T00:07:10.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.653+0000 7fae749de700 1 --2- 192.168.123.103:0/2899626898 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fae5c077b00 0x7fae5c079fc0 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.653+0000 7fae749de700 1 --2- 192.168.123.103:0/2899626898 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fae6c0731c0 0x7fae6c198d80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.653+0000 7fae749de700 1 --2- 192.168.123.103:0/2899626898 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fae6c073ae0 0x7fae6c1992c0 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:10.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.653+0000 7fae749de700 1 -- 192.168.123.103:0/2899626898 >> 192.168.123.103:0/2899626898 conn(0x7fae6c0fc920 msgr2=0x7fae6c1079b0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:10.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.653+0000 7fae749de700 1 -- 192.168.123.103:0/2899626898 shutdown_connections
2026-03-09T00:07:10.653 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:10.653+0000 7fae749de700 1 -- 192.168.123.103:0/2899626898 wait complete.
2026-03-09T00:07:10.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:10 vm03.local ceph-mon[129670]: from='client.44145 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:07:10.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:10 vm03.local ceph-mon[129670]: from='client.44149 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:07:10.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:10 vm03.local ceph-mon[129670]: pgmap v58: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1167/231 objects degraded (505.195%); 0 B/s, 5 objects/s recovering
2026-03-09T00:07:10.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:10 vm03.local ceph-mon[129670]: from='client.34184 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:07:10.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:10 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/1131329763' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:07:10.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:10 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/1085485231' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T00:07:10.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:10 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2899626898' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T00:07:10.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:10 vm06.local ceph-mon[106218]: from='client.44145 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:07:10.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:10 vm06.local ceph-mon[106218]: from='client.44149 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:07:10.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:10 vm06.local ceph-mon[106218]: pgmap v58: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1167/231 objects degraded (505.195%); 0 B/s, 5 objects/s recovering
2026-03-09T00:07:10.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:10 vm06.local ceph-mon[106218]: from='client.34184 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:07:10.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:10 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/1131329763' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:07:10.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:10 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/1085485231' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T00:07:10.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:10 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/2899626898' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T00:07:11.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:11 vm06.local ceph-mon[106218]: from='client.34196 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:07:12.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:11 vm03.local ceph-mon[129670]: from='client.34196 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:07:12.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:12 vm06.local ceph-mon[106218]: pgmap v59: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1167/231 objects degraded (505.195%); 0 B/s, 11 objects/s recovering
2026-03-09T00:07:13.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:12 vm03.local ceph-mon[129670]: pgmap v59: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1167/231 objects degraded (505.195%); 0 B/s, 11 objects/s recovering
2026-03-09T00:07:15.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:14 vm03.local ceph-mon[129670]: pgmap v60: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1167/231 objects degraded (505.195%); 0 B/s, 9 objects/s recovering
2026-03-09T00:07:15.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:14 vm03.local ceph-mon[129670]: osdmap e57: 6 total, 6 up, 6 in
2026-03-09T00:07:15.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:14 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:07:15.115 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:14 vm06.local ceph-mon[106218]: pgmap v60: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 12 active+recovery_wait+undersized+degraded+remapped, 52 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1167/231 objects degraded (505.195%); 0 B/s, 9 objects/s recovering
2026-03-09T00:07:15.115 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:14 vm06.local ceph-mon[106218]: osdmap e57: 6 total, 6 up, 6 in
2026-03-09T00:07:15.115 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:14 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:07:16.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:15 vm03.local ceph-mon[129670]: osdmap e58: 6 total, 6 up, 6 in
2026-03-09T00:07:16.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:15 vm06.local ceph-mon[106218]: osdmap e58: 6 total, 6 up, 6 in
2026-03-09T00:07:16.948 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:16 vm03.local ceph-mon[129670]: pgmap v63: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 383 B/s wr, 0 op/s; 1085/231 objects degraded (469.697%); 0 B/s, 11 objects/s recovering
2026-03-09T00:07:16.948 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:16 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 1085/231 objects degraded (469.697%), 12 pgs degraded (PG_DEGRADED)
2026-03-09T00:07:16.948 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:16 vm03.local ceph-mon[129670]: mgrmap e39: vm03.yvcons(active, since 92s), standbys: vm06.rzcvhn
2026-03-09T00:07:17.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:16 vm06.local ceph-mon[106218]: pgmap v63: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 383 B/s wr, 0 op/s; 1085/231 objects degraded (469.697%); 0 B/s, 11 objects/s recovering
2026-03-09T00:07:17.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:16 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 1085/231 objects degraded (469.697%), 12 pgs degraded (PG_DEGRADED)
2026-03-09T00:07:17.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:16 vm06.local ceph-mon[106218]: mgrmap e39: vm03.yvcons(active, since 92s), standbys: vm06.rzcvhn
2026-03-09T00:07:19.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:18 vm03.local ceph-mon[129670]: pgmap v64: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 383 B/s wr, 0 op/s; 1085/231 objects degraded (469.697%); 0 B/s, 11 objects/s recovering
2026-03-09T00:07:19.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:18 vm06.local ceph-mon[106218]: pgmap v64: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 383 B/s wr, 0 op/s; 1085/231 objects degraded (469.697%); 0 B/s, 11 objects/s recovering
2026-03-09T00:07:21.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:20 vm03.local ceph-mon[129670]: pgmap v65: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 383 B/s wr, 0 op/s; 1085/231 objects degraded (469.697%); 0 B/s, 5 objects/s recovering
2026-03-09T00:07:21.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:20 vm06.local ceph-mon[106218]: pgmap v65: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 383 B/s wr, 0 op/s; 1085/231 objects degraded (469.697%); 0 B/s, 5 objects/s recovering
2026-03-09T00:07:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:07:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:07:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:07:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:07:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:07:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:07:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:07:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:07:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:07:22.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:07:22.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:07:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:07:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:07:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:07:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:07:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:07:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:07:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:07:23.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:22 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:07:23.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:22 vm03.local ceph-mon[129670]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline)
2026-03-09T00:07:23.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:22 vm03.local ceph-mon[129670]: pgmap v66: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 383 B/s wr, 0 op/s; 1085/231 objects degraded (469.697%); 0 B/s, 11 objects/s recovering
2026-03-09T00:07:23.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:22 vm03.local ceph-mon[129670]: osdmap e59: 6 total, 6 up, 6 in
2026-03-09T00:07:23.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:22 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:07:23.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:22 vm06.local ceph-mon[106218]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline)
2026-03-09T00:07:23.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:22 vm06.local ceph-mon[106218]: pgmap v66: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 383 B/s wr, 0 op/s; 1085/231 objects degraded (469.697%); 0 B/s, 11 objects/s recovering
2026-03-09T00:07:23.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:22 vm06.local ceph-mon[106218]: osdmap e59: 6 total, 6 up, 6 in
2026-03-09T00:07:24.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:24 vm03.local ceph-mon[129670]: osdmap e60: 6 total, 6 up, 6 in
2026-03-09T00:07:24.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:24 vm03.local ceph-mon[129670]: pgmap v69: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1085/231 objects degraded (469.697%); 0 B/s, 5 objects/s recovering
2026-03-09T00:07:24.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:24 vm06.local ceph-mon[106218]: osdmap e60: 6 total, 6 up, 6 in
2026-03-09T00:07:24.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:24 vm06.local ceph-mon[106218]: pgmap v69: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 11 active+recovery_wait+undersized+degraded+remapped, 53 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 1085/231 objects degraded (469.697%); 0 B/s, 5 objects/s recovering
2026-03-09T00:07:26.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:25 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 997/231 objects degraded (431.602%), 11 pgs degraded (PG_DEGRADED)
2026-03-09T00:07:26.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:25 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 997/231 objects degraded (431.602%), 11 pgs degraded (PG_DEGRADED)
2026-03-09T00:07:26.947 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:26 vm03.local ceph-mon[129670]: pgmap v70: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 997/231 objects degraded (431.602%); 0 B/s, 11 objects/s recovering
2026-03-09T00:07:27.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:26 vm06.local ceph-mon[106218]: pgmap v70: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 997/231 objects degraded (431.602%); 0 B/s, 11 objects/s recovering
2026-03-09T00:07:29.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:28 vm03.local ceph-mon[129670]: pgmap v71: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 997/231 objects degraded (431.602%); 0 B/s, 11 objects/s recovering
2026-03-09T00:07:29.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:28 vm06.local ceph-mon[106218]: pgmap v71: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 997/231 objects degraded (431.602%); 0 B/s, 11 objects/s recovering
2026-03-09T00:07:29.952 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:29 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:07:30.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:29 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:07:31.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:30 vm03.local ceph-mon[129670]: pgmap v72: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 997/231 objects degraded (431.602%); 0 B/s, 5 objects/s recovering
2026-03-09T00:07:31.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:30 vm06.local ceph-mon[106218]: pgmap v72: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 997/231 objects degraded (431.602%); 0 B/s, 5 objects/s recovering
2026-03-09T00:07:32.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:31 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 997/231 objects degraded (431.602%), 11 pgs degraded, 11 pgs undersized (PG_DEGRADED)
2026-03-09T00:07:32.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:31 vm03.local ceph-mon[129670]: osdmap e61: 6 total, 6 up, 6 in
2026-03-09T00:07:32.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:31 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 997/231 objects degraded (431.602%), 11 pgs degraded, 11 pgs undersized (PG_DEGRADED)
2026-03-09T00:07:32.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:31 vm06.local ceph-mon[106218]: osdmap e61: 6 total, 6 up, 6 in
2026-03-09T00:07:33.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:32 vm03.local ceph-mon[129670]: pgmap v73: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 997/231 objects degraded (431.602%); 0 B/s, 9 objects/s recovering
2026-03-09T00:07:33.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:32 vm03.local ceph-mon[129670]: osdmap e62: 6 total, 6 up, 6 in
2026-03-09T00:07:33.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:32 vm06.local ceph-mon[106218]: pgmap v73: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 997/231 objects degraded (431.602%); 0 B/s, 9 objects/s recovering
2026-03-09T00:07:33.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:32 vm06.local ceph-mon[106218]: osdmap e62: 6 total, 6 up, 6 in
2026-03-09T00:07:35.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:34 vm03.local ceph-mon[129670]: pgmap v76: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 997/231 objects degraded (431.602%); 0 B/s, 6 objects/s recovering
2026-03-09T00:07:35.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:34 vm06.local ceph-mon[106218]: pgmap v76: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 10 active+recovery_wait+undersized+degraded+remapped, 54 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 997/231 objects degraded (431.602%); 0 B/s, 6 objects/s recovering
2026-03-09T00:07:37.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:36 vm03.local ceph-mon[129670]: pgmap v77: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 907/231 objects degraded (392.641%); 0 B/s, 11 objects/s recovering
2026-03-09T00:07:37.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:36 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:07:37.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:36 vm06.local ceph-mon[106218]: pgmap v77: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 907/231 objects degraded (392.641%); 0 B/s, 11 objects/s recovering
2026-03-09T00:07:37.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:36 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:07:38.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:37 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:07:38.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:37 vm03.local ceph-mon[129670]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline)
2026-03-09T00:07:38.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:37 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 907/231 objects degraded (392.641%), 10 pgs degraded, 10 pgs undersized (PG_DEGRADED)
2026-03-09T00:07:38.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:37 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:07:38.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:37 vm06.local ceph-mon[106218]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline)
2026-03-09T00:07:38.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:37 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 907/231 objects degraded (392.641%), 10 pgs degraded, 10 pgs undersized (PG_DEGRADED)
2026-03-09T00:07:39.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:38 vm03.local ceph-mon[129670]: pgmap v78: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 907/231 objects degraded (392.641%); 0 B/s, 11 objects/s recovering
2026-03-09T00:07:39.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:38 vm06.local ceph-mon[106218]: pgmap v78: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 907/231 objects degraded (392.641%); 0 B/s, 11 objects/s recovering
2026-03-09T00:07:40.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.718+0000 7f7826af8700 1 -- 192.168.123.103:0/2289037222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7820101120 msgr2=0x7f7820101500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:40.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.718+0000 7f7826af8700 1 --2- 192.168.123.103:0/2289037222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7820101120 0x7f7820101500 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f7810009b50 tx=0x7f7810009e60 comp rx=0 tx=0).stop
2026-03-09T00:07:40.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.719+0000 7f7826af8700 1 -- 192.168.123.103:0/2289037222 shutdown_connections
2026-03-09T00:07:40.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.719+0000 7f7826af8700 1 --2- 192.168.123.103:0/2289037222 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7820101ad0 0x7f7820105b20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:40.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.719+0000 7f7826af8700 1 --2- 192.168.123.103:0/2289037222 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7820101120 0x7f7820101500 secure :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f7810009b50 tx=0x7f7810009e60 comp rx=0 tx=0).stop
2026-03-09T00:07:40.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.719+0000 7f7826af8700 1 -- 192.168.123.103:0/2289037222 >> 192.168.123.103:0/2289037222 conn(0x7f78200fc9b0 msgr2=0x7f78200fedd0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:40.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.719+0000 7f7826af8700 1 -- 192.168.123.103:0/2289037222 shutdown_connections
2026-03-09T00:07:40.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.719+0000 7f7826af8700 1 -- 192.168.123.103:0/2289037222 wait complete.
2026-03-09T00:07:40.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.720+0000 7f7826af8700 1 Processor -- start
2026-03-09T00:07:40.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.720+0000 7f7826af8700 1 -- start start
2026-03-09T00:07:40.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.720+0000 7f7826af8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7820101ad0 0x7f782019aa60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:40.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.720+0000 7f7826af8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f782019afa0 0x7f7820194ae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:40.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.720+0000 7f7826af8700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f782019b570 con 0x7f782019afa0
2026-03-09T00:07:40.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.720+0000 7f7826af8700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7820195020 con 0x7f7820101ad0
2026-03-09T00:07:40.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.720+0000 7f781ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f782019afa0 0x7f7820194ae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:40.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.720+0000 7f781ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f782019afa0 0x7f7820194ae0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57378/0 (socket says 192.168.123.103:57378)
2026-03-09T00:07:40.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.720+0000 7f781ffff700 1 -- 192.168.123.103:0/2436369936 learned_addr learned my addr 192.168.123.103:0/2436369936 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:07:40.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.721+0000 7f781ffff700 1 -- 192.168.123.103:0/2436369936 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7820101ad0 msgr2=0x7f782019aa60 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down
2026-03-09T00:07:40.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.721+0000 7f781ffff700 1 --2- 192.168.123.103:0/2436369936 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7820101ad0 0x7f782019aa60 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:40.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.721+0000 7f781ffff700 1 -- 192.168.123.103:0/2436369936 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f78100097e0 con 0x7f782019afa0
2026-03-09T00:07:40.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.721+0000 7f781ffff700 1 --2- 192.168.123.103:0/2436369936 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f782019afa0 0x7f7820194ae0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f781400ead0 tx=0x7f781400ee90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:07:40.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.721+0000 7f781dffb700 1 -- 192.168.123.103:0/2436369936 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f781400cca0 con 0x7f782019afa0
2026-03-09T00:07:40.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.721+0000 7f781dffb700 1 -- 192.168.123.103:0/2436369936 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f781400ce00 con 0x7f782019afa0
2026-03-09T00:07:40.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.721+0000 7f781dffb700 1 -- 192.168.123.103:0/2436369936 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7814010640 con 0x7f782019afa0
2026-03-09T00:07:40.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.721+0000 7f7826af8700 1 -- 192.168.123.103:0/2436369936 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7820195300 con 0x7f782019afa0
2026-03-09T00:07:40.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.722+0000 7f7826af8700 1 -- 192.168.123.103:0/2436369936 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7820195820 con 0x7f782019afa0
2026-03-09T00:07:40.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.722+0000 7f781dffb700 1 -- 192.168.123.103:0/2436369936 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f78140107a0 con 0x7f782019afa0
2026-03-09T00:07:40.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.723+0000 7f781dffb700 1 --2- 192.168.123.103:0/2436369936 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f780807bcd0 0x7f780807e190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:40.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.723+0000 7f781dffb700 1 -- 192.168.123.103:0/2436369936 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(62..62 src has 1..62) v4 ==== 6426+0+0 (secure 0 0 0) 0x7f7814014070 con 0x7f782019afa0
2026-03-09T00:07:40.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.723+0000 7f7826af8700 1 -- 192.168.123.103:0/2436369936 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f78201094c0 con 0x7f782019afa0
2026-03-09T00:07:40.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.724+0000 7f7824894700 1 --2- 192.168.123.103:0/2436369936 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f780807bcd0 0x7f780807e190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:40.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.724+0000 7f7824894700 1 --2- 192.168.123.103:0/2436369936 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f780807bcd0 0x7f780807e190 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f7810000c00 tx=0x7f7810005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:07:40.727 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.727+0000 7f781dffb700 1 -- 192.168.123.103:0/2436369936 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f781409e050 con 0x7f782019afa0
2026-03-09T00:07:40.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.852+0000 7f7826af8700 1 -- 192.168.123.103:0/2436369936 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7820066e80 con 0x7f780807bcd0
2026-03-09T00:07:40.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.853+0000 7f781dffb700 1 -- 192.168.123.103:0/2436369936 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f7820066e80 con 0x7f780807bcd0
2026-03-09T00:07:40.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.856+0000 7f7826af8700 1 -- 192.168.123.103:0/2436369936 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f780807bcd0 msgr2=0x7f780807e190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:40.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.856+0000 7f7826af8700 1 --2- 192.168.123.103:0/2436369936 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f780807bcd0 0x7f780807e190 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f7810000c00 tx=0x7f7810005fb0 comp rx=0 tx=0).stop
2026-03-09T00:07:40.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.856+0000 7f7826af8700 1 -- 192.168.123.103:0/2436369936 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f782019afa0 msgr2=0x7f7820194ae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:40.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.856+0000 7f7826af8700 1 --2- 192.168.123.103:0/2436369936 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f782019afa0 0x7f7820194ae0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f781400ead0 tx=0x7f781400ee90 comp rx=0 tx=0).stop
2026-03-09T00:07:40.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.856+0000 7f7826af8700 1 -- 192.168.123.103:0/2436369936 shutdown_connections
2026-03-09T00:07:40.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.856+0000 7f7826af8700 1 --2- 192.168.123.103:0/2436369936 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f780807bcd0 0x7f780807e190 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:40.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.857+0000 7f7826af8700 1 --2- 192.168.123.103:0/2436369936 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7820101ad0 0x7f782019aa60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:40.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.857+0000 7f7826af8700 1 --2- 192.168.123.103:0/2436369936 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f782019afa0 0x7f7820194ae0 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:40.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.857+0000 7f7826af8700 1 -- 192.168.123.103:0/2436369936 >> 192.168.123.103:0/2436369936 conn(0x7f78200fc9b0 msgr2=0x7f78200fe120 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:40.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.857+0000 7f7826af8700 1 -- 192.168.123.103:0/2436369936 shutdown_connections
2026-03-09T00:07:40.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.857+0000 7f7826af8700 1 -- 192.168.123.103:0/2436369936 wait complete.
2026-03-09T00:07:40.866 INFO:teuthology.orchestra.run.vm03.stdout:true
2026-03-09T00:07:40.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.925+0000 7f8c895be700 1 -- 192.168.123.103:0/3658749389 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8c84073ae0 msgr2=0x7f8c8410d170 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:40.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.925+0000 7f8c895be700 1 --2- 192.168.123.103:0/3658749389 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8c84073ae0 0x7f8c8410d170 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f8c74009b50 tx=0x7f8c74009e60 comp rx=0 tx=0).stop
2026-03-09T00:07:40.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.925+0000 7f8c895be700 1 -- 192.168.123.103:0/3658749389 shutdown_connections
2026-03-09T00:07:40.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.925+0000 7f8c895be700 1 --2- 192.168.123.103:0/3658749389 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8c84073ae0 0x7f8c8410d170 secure :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f8c74009b50 tx=0x7f8c74009e60 comp rx=0 tx=0).stop
2026-03-09T00:07:40.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.925+0000 7f8c895be700 1 --2- 192.168.123.103:0/3658749389 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8c840731c0 0x7f8c840735a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:40.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.925+0000 7f8c895be700 1 -- 192.168.123.103:0/3658749389 >> 192.168.123.103:0/3658749389 conn(0x7f8c840fc920 msgr2=0x7f8c840fed40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:40.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.926+0000 7f8c895be700 1 -- 192.168.123.103:0/3658749389 shutdown_connections
2026-03-09T00:07:40.926 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.926+0000 7f8c895be700 1 -- 192.168.123.103:0/3658749389 wait complete.
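[editor's note] Each "wait complete." above marks the teardown of a short-lived messenger: every polled CLI command boots a fresh client, subscribes to monmap/mgrmap/osdmap, fetches get_command_descriptions, runs a single command, and disconnects, which is what produces these repeated handshake and teardown bursts with new client nonces. When polling programmatically, one long-lived connection can issue all of the same mon commands seen from the throwaway clients in this log without re-handshaking; a minimal sketch with python-rados (conffile path assumed as before):

import json

import rados

cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")
cluster.connect()  # one messenger boot instead of one per command

# The same mon commands the log shows arriving from separate throwaway
# clients ("versions", "fs dump", "health detail"), reused over one session.
for cmd in ({"prefix": "versions"},
            {"prefix": "fs dump"},
            {"prefix": "health", "detail": "detail"}):
    ret, outbuf, outs = cluster.mon_command(json.dumps(cmd), b"")
    print(cmd["prefix"], "->", ret, outbuf[:80])

cluster.shutdown()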
2026-03-09T00:07:40.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.926+0000 7f8c895be700 1 Processor -- start
2026-03-09T00:07:40.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.926+0000 7f8c895be700 1 -- start start
2026-03-09T00:07:40.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.927+0000 7f8c895be700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8c840731c0 0x7f8c84198fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:40.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.927+0000 7f8c895be700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8c84199520 0x7f8c8419d990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:40.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.927+0000 7f8c895be700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8c84199b40 con 0x7f8c84199520
2026-03-09T00:07:40.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.927+0000 7f8c895be700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8c84199cb0 con 0x7f8c840731c0
2026-03-09T00:07:40.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.927+0000 7f8c82ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8c840731c0 0x7f8c84198fe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:40.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.927+0000 7f8c82ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8c840731c0 0x7f8c84198fe0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:53410/0 (socket says 192.168.123.103:53410)
2026-03-09T00:07:40.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.927+0000 7f8c82ffd700 1 -- 192.168.123.103:0/3376158098 learned_addr learned my addr 192.168.123.103:0/3376158098 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:07:40.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.927+0000 7f8c82ffd700 1 -- 192.168.123.103:0/3376158098 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8c84199520 msgr2=0x7f8c8419d990 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-09T00:07:40.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.927+0000 7f8c827fc700 1 --2- 192.168.123.103:0/3376158098 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8c84199520 0x7f8c8419d990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:40.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.928+0000 7f8c82ffd700 1 --2- 192.168.123.103:0/3376158098 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8c84199520 0x7f8c8419d990 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:40.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.928+0000 7f8c82ffd700 1 -- 192.168.123.103:0/3376158098 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8c740097e0 con 0x7f8c840731c0
2026-03-09T00:07:40.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.928+0000 7f8c827fc700 1 --2- 192.168.123.103:0/3376158098 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8c84199520 0x7f8c8419d990 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-09T00:07:40.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.928+0000 7f8c82ffd700 1 --2- 192.168.123.103:0/3376158098 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8c840731c0 0x7f8c84198fe0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f8c6c00eb10 tx=0x7f8c6c00eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:07:40.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.928+0000 7f8c7bfff700 1 -- 192.168.123.103:0/3376158098 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8c6c00cca0 con 0x7f8c840731c0
2026-03-09T00:07:40.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.928+0000 7f8c895be700 1 -- 192.168.123.103:0/3376158098 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8c8419df90 con 0x7f8c840731c0
2026-03-09T00:07:40.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.928+0000 7f8c895be700 1 -- 192.168.123.103:0/3376158098 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8c8419e4e0 con 0x7f8c840731c0
2026-03-09T00:07:40.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.928+0000 7f8c7bfff700 1 -- 192.168.123.103:0/3376158098 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f8c6c00ce00 con 0x7f8c840731c0
2026-03-09T00:07:40.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.928+0000 7f8c7bfff700 1 -- 192.168.123.103:0/3376158098 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8c6c0189e0 con 0x7f8c840731c0
2026-03-09T00:07:40.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.929+0000 7f8c7bfff700 1 -- 192.168.123.103:0/3376158098 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8c6c018b40 con 0x7f8c840731c0
2026-03-09T00:07:40.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.930+0000 7f8c895be700 1 -- 192.168.123.103:0/3376158098 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8c8410a870 con 0x7f8c840731c0
2026-03-09T00:07:40.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.930+0000 7f8c7bfff700 1 --2- 192.168.123.103:0/3376158098 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8c70080130 0x7f8c700825f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:07:40.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.930+0000 7f8c7bfff700 1 -- 192.168.123.103:0/3376158098 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(62..62 src has 1..62) v4 ==== 6426+0+0 (secure 0 0 0) 0x7f8c6c014070 con 0x7f8c840731c0
2026-03-09T00:07:40.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.930+0000 7f8c827fc700 1 --2- 192.168.123.103:0/3376158098 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8c70080130 0x7f8c700825f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:07:40.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.930+0000 7f8c827fc700 1 --2- 192.168.123.103:0/3376158098 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8c70080130 0x7f8c700825f0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f8c7400b5c0 tx=0x7f8c74005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:07:40.933 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:40.933+0000 7f8c7bfff700 1 -- 192.168.123.103:0/3376158098 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8c6c063230 con 0x7f8c840731c0
2026-03-09T00:07:41.071 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:40 vm03.local ceph-mon[129670]: pgmap v79: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 907/231 objects degraded (392.641%); 0 B/s, 5 objects/s recovering
2026-03-09T00:07:41.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.070+0000 7f8c895be700 1 -- 192.168.123.103:0/3376158098 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8c84066e80 con 0x7f8c70080130
2026-03-09T00:07:41.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.075+0000 7f8c7bfff700 1 -- 192.168.123.103:0/3376158098 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f8c84066e80 con 0x7f8c70080130
2026-03-09T00:07:41.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.078+0000 7f8c895be700 1 -- 192.168.123.103:0/3376158098 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8c70080130 msgr2=0x7f8c700825f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:41.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.078+0000 7f8c895be700 1 --2- 192.168.123.103:0/3376158098 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8c70080130 0x7f8c700825f0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f8c7400b5c0 tx=0x7f8c74005fb0 comp rx=0 tx=0).stop
2026-03-09T00:07:41.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.078+0000 7f8c895be700 1 -- 192.168.123.103:0/3376158098 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8c840731c0 msgr2=0x7f8c84198fe0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:41.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.078+0000 7f8c895be700 1 --2- 192.168.123.103:0/3376158098 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8c840731c0 0x7f8c84198fe0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f8c6c00eb10 tx=0x7f8c6c00eed0 comp rx=0 tx=0).stop
2026-03-09T00:07:41.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.078+0000 7f8c895be700 1 -- 192.168.123.103:0/3376158098 shutdown_connections
2026-03-09T00:07:41.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.078+0000 7f8c895be700 1 --2- 192.168.123.103:0/3376158098 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8c70080130 0x7f8c700825f0 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:41.078 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.078+0000 7f8c895be700 1 --2- 192.168.123.103:0/3376158098 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8c840731c0 0x7f8c84198fe0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:41.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.078+0000 7f8c895be700 1 --2- 192.168.123.103:0/3376158098 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8c84199520 0x7f8c8419d990 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:41.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.078+0000 7f8c895be700 1 -- 192.168.123.103:0/3376158098 >> 192.168.123.103:0/3376158098 conn(0x7f8c840fc920 msgr2=0x7f8c841079b0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:41.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.078+0000 7f8c895be700 1 -- 192.168.123.103:0/3376158098 shutdown_connections
2026-03-09T00:07:41.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.078+0000 7f8c895be700 1 -- 192.168.123.103:0/3376158098 wait complete.
2026-03-09T00:07:41.146 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.145+0000 7fe83fbbe700 1 -- 192.168.123.103:0/1653046594 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe838102db0 msgr2=0x7fe838103190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:07:41.146 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.145+0000 7fe83fbbe700 1 --2- 192.168.123.103:0/1653046594 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe838102db0 0x7fe838103190 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fe828009a60 tx=0x7fe828009d70 comp rx=0 tx=0).stop
2026-03-09T00:07:41.146 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.145+0000 7fe83fbbe700 1 -- 192.168.123.103:0/1653046594 shutdown_connections
2026-03-09T00:07:41.146 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.145+0000 7fe83fbbe700 1 --2- 192.168.123.103:0/1653046594 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe838069180 0x7fe838069600 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:41.146 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.145+0000 7fe83fbbe700 1 --2- 192.168.123.103:0/1653046594 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe838102db0 0x7fe838103190 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:07:41.146 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.145+0000 7fe83fbbe700 1 -- 192.168.123.103:0/1653046594 >> 192.168.123.103:0/1653046594 conn(0x7fe838076b70 msgr2=0x7fe838076f80 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:07:41.146 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.146+0000 7fe83fbbe700 1 -- 192.168.123.103:0/1653046594 shutdown_connections
2026-03-09T00:07:41.146 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.146+0000 7fe83fbbe700 1 -- 192.168.123.103:0/1653046594 wait complete.
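[editor's note] The s=<STATE> tokens threaded through these stderr lines trace the msgr2 client handshake in order: NONE on .connect, BANNER_CONNECTING once the banner exchange reports supported=3 required=0, HELLO_CONNECTING when the peer echoes the client's socket address (recorded by learned_addr), AUTH_CONNECTING, then READY when the "ready entity=mon.N" line fires; mark_down/stop drive the same connection to CLOSED, and the "send_auth_request state changed!" lines above show an auth attempt racing a teardown. A toy Python reconstruction of that happy path, useful when eyeballing such logs (state names copied from the lines above, transitions deliberately simplified; the real machine lives in Ceph's src/msg/async/ProtocolV2.cc):

# Illustrative only: linear happy path of the msgr2 client states seen here.
HAPPY_PATH = [
    "NONE",               # .connect issued
    "BANNER_CONNECTING",  # _handle_peer_banner_payload supported=3 required=0
    "HELLO_CONNECTING",   # handle_hello: peer tells the client its own addr
    "AUTH_CONNECTING",    # auth exchange (CLOSED here => raced a mark_down)
    "READY",              # .ready entity=mon.N, session established
]

def explain(state: str) -> str:
    """Map an s=<STATE> token from a conn(...) log line to its place."""
    if state == "CLOSED":
        return "torn down via mark_down/stop"
    if state in HAPPY_PATH:
        return f"step {HAPPY_PATH.index(state) + 1} of {len(HAPPY_PATH)}"
    return "not on the simplified happy path"

for s in ("NONE", "BANNER_CONNECTING", "AUTH_CONNECTING", "READY", "CLOSED"):
    print(f"s={s}: {explain(s)}")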
2026-03-09T00:07:41.146 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.146+0000 7fe83fbbe700 1 Processor -- start 2026-03-09T00:07:41.146 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.146+0000 7fe83fbbe700 1 -- start start 2026-03-09T00:07:41.147 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.146+0000 7fe83fbbe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe838069180 0x7fe838100010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:41.147 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.146+0000 7fe83fbbe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe838102db0 0x7fe838100550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:41.147 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.146+0000 7fe83fbbe700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe838100a90 con 0x7fe838069180 2026-03-09T00:07:41.147 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.146+0000 7fe83fbbe700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe838100bd0 con 0x7fe838102db0 2026-03-09T00:07:41.147 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.147+0000 7fe83d95a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe838069180 0x7fe838100010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:41.147 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.147+0000 7fe83d95a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe838069180 0x7fe838100010 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57416/0 (socket says 192.168.123.103:57416) 2026-03-09T00:07:41.147 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.147+0000 7fe83d95a700 1 -- 192.168.123.103:0/3236616398 learned_addr learned my addr 192.168.123.103:0/3236616398 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:07:41.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.147+0000 7fe83d95a700 1 -- 192.168.123.103:0/3236616398 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe838102db0 msgr2=0x7fe838100550 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:07:41.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.147+0000 7fe83d159700 1 --2- 192.168.123.103:0/3236616398 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe838102db0 0x7fe838100550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:41.148 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.147+0000 7fe83d95a700 1 --2- 192.168.123.103:0/3236616398 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe838102db0 0x7fe838100550 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.147+0000 7fe83d95a700 1 -- 192.168.123.103:0/3236616398 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe8340097e0 con 
0x7fe838069180 2026-03-09T00:07:41.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.147+0000 7fe83d95a700 1 --2- 192.168.123.103:0/3236616398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe838069180 0x7fe838100010 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fe828005c50 tx=0x7fe82800f740 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:07:41.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.147+0000 7fe83d159700 1 --2- 192.168.123.103:0/3236616398 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe838102db0 0x7fe838100550 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:07:41.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.148+0000 7fe82effd700 1 -- 192.168.123.103:0/3236616398 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe82801d070 con 0x7fe838069180 2026-03-09T00:07:41.152 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.148+0000 7fe82effd700 1 -- 192.168.123.103:0/3236616398 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe82800fd90 con 0x7fe838069180 2026-03-09T00:07:41.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.148+0000 7fe83fbbe700 1 -- 192.168.123.103:0/3236616398 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe828009710 con 0x7fe838069180 2026-03-09T00:07:41.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.148+0000 7fe83fbbe700 1 -- 192.168.123.103:0/3236616398 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe8381a2f80 con 0x7fe838069180 2026-03-09T00:07:41.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.149+0000 7fe82effd700 1 -- 192.168.123.103:0/3236616398 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe828017750 con 0x7fe838069180 2026-03-09T00:07:41.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.149+0000 7fe82effd700 1 -- 192.168.123.103:0/3236616398 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe8280179d0 con 0x7fe838069180 2026-03-09T00:07:41.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.149+0000 7fe83fbbe700 1 -- 192.168.123.103:0/3236616398 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe81c005320 con 0x7fe838069180 2026-03-09T00:07:41.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.152+0000 7fe82effd700 1 --2- 192.168.123.103:0/3236616398 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe8240779e0 0x7fe824079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:41.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.152+0000 7fe82effd700 1 -- 192.168.123.103:0/3236616398 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(62..62 src has 1..62) v4 ==== 6426+0+0 (secure 0 0 0) 0x7fe82809b5a0 con 0x7fe838069180 2026-03-09T00:07:41.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.152+0000 7fe82effd700 1 -- 192.168.123.103:0/3236616398 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+195034 (secure 0 0 0) 0x7fe8280649d0 con 0x7fe838069180 2026-03-09T00:07:41.153 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.153+0000 7fe83d159700 1 --2- 192.168.123.103:0/3236616398 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe8240779e0 0x7fe824079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:41.156 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.156+0000 7fe83d159700 1 --2- 192.168.123.103:0/3236616398 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe8240779e0 0x7fe824079ea0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fe838101800 tx=0x7fe834009500 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:07:41.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:40 vm06.local ceph-mon[106218]: pgmap v79: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 907/231 objects degraded (392.641%); 0 B/s, 5 objects/s recovering 2026-03-09T00:07:41.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.272+0000 7fe83fbbe700 1 -- 192.168.123.103:0/3236616398 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fe81c000bf0 con 0x7fe8240779e0 2026-03-09T00:07:41.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.278+0000 7fe82effd700 1 -- 192.168.123.103:0/3236616398 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fe81c000bf0 con 0x7fe8240779e0 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (2m) 82s ago 8m 24.0M - 0.25.0 c8568f914cd2 6bc39b415ac6 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (8m) 82s ago 8m 8786k - 18.2.1 5be31c24972a b93f8a220f71 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (7m) 93s ago 7m 8656k - 18.2.1 5be31c24972a d06aea65065e 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (96s) 82s ago 8m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e a76700a1f6bf 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (94s) 93s ago 7m 7860k - 19.2.3-678-ge911bdeb 654f31e6858e ab35f6a843c1 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (2m) 82s ago 7m 73.3M - 10.4.0 c8b91775d855 00a3394cdec9 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (5m) 82s ago 5m 18.1M - 18.2.1 5be31c24972a 404501ca3f76 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (5m) 82s ago 5m 218M - 18.2.1 5be31c24972a b71cb8823eff 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (5m) 93s ago 5m 19.7M - 18.2.1 5be31c24972a 868f24dd3b07 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 
running (5m) 93s ago 5m 15.9M - 18.2.1 5be31c24972a 84dbd6c37a69 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:8443,9283,8765 running (3m) 82s ago 8m 619M - 19.2.3-678-ge911bdeb 654f31e6858e 5c5f89207f88 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (3m) 93s ago 7m 489M - 19.2.3-678-ge911bdeb 654f31e6858e e3d70135f3ac 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (2m) 82s ago 8m 57.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cafe87ec117d 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (109s) 93s ago 7m 48.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 33df752aa193 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (3m) 82s ago 8m 9445k - 1.7.0 72c9c2088986 0cdd6e671b4f 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (3m) 93s ago 7m 9420k - 1.7.0 72c9c2088986 848c5c72973d 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (84s) 82s ago 7m 30.2M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7112eceae9ce 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (7m) 82s ago 7m 451M 4096M 18.2.1 5be31c24972a 7bc729875521 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (6m) 82s ago 6m 370M 4096M 18.2.1 5be31c24972a 00566abbcc16 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (6m) 93s ago 6m 499M 4096M 18.2.1 5be31c24972a 1ece32056ab6 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (6m) 93s ago 6m 475M 4096M 18.2.1 5be31c24972a ee6260a1124c 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (6m) 93s ago 6m 398M 4096M 18.2.1 5be31c24972a f51e8cd94301 2026-03-09T00:07:41.280 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (3m) 82s ago 7m 56.6M - 2.51.0 1d3b7f56885b 16d6071e49fb 2026-03-09T00:07:41.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.282+0000 7fe83fbbe700 1 -- 192.168.123.103:0/3236616398 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe8240779e0 msgr2=0x7fe824079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:41.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.282+0000 7fe83fbbe700 1 --2- 192.168.123.103:0/3236616398 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe8240779e0 0x7fe824079ea0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fe838101800 tx=0x7fe834009500 comp rx=0 tx=0).stop 2026-03-09T00:07:41.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.282+0000 7fe83fbbe700 1 -- 192.168.123.103:0/3236616398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe838069180 msgr2=0x7fe838100010 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:41.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.282+0000 7fe83fbbe700 1 --2- 192.168.123.103:0/3236616398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe838069180 0x7fe838100010 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fe828005c50 tx=0x7fe82800f740 comp rx=0 tx=0).stop 2026-03-09T00:07:41.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.282+0000 7fe83fbbe700 1 -- 
192.168.123.103:0/3236616398 shutdown_connections 2026-03-09T00:07:41.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.282+0000 7fe83fbbe700 1 --2- 192.168.123.103:0/3236616398 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe8240779e0 0x7fe824079ea0 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.282+0000 7fe83fbbe700 1 --2- 192.168.123.103:0/3236616398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe838069180 0x7fe838100010 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.282+0000 7fe83fbbe700 1 --2- 192.168.123.103:0/3236616398 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe838102db0 0x7fe838100550 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.282+0000 7fe83fbbe700 1 -- 192.168.123.103:0/3236616398 >> 192.168.123.103:0/3236616398 conn(0x7fe838076b70 msgr2=0x7fe8380fde50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:07:41.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.283+0000 7fe83fbbe700 1 -- 192.168.123.103:0/3236616398 shutdown_connections 2026-03-09T00:07:41.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.283+0000 7fe83fbbe700 1 -- 192.168.123.103:0/3236616398 wait complete. 2026-03-09T00:07:41.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.353+0000 7fd95312d700 1 -- 192.168.123.103:0/1283163792 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd94c0ffc50 msgr2=0x7fd94c100030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:41.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.353+0000 7fd95312d700 1 --2- 192.168.123.103:0/1283163792 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd94c0ffc50 0x7fd94c100030 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fd93c009b30 tx=0x7fd93c009e40 comp rx=0 tx=0).stop 2026-03-09T00:07:41.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.353+0000 7fd95312d700 1 -- 192.168.123.103:0/1283163792 shutdown_connections 2026-03-09T00:07:41.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.353+0000 7fd95312d700 1 --2- 192.168.123.103:0/1283163792 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd94c100600 0x7fd94c10d2c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.353+0000 7fd95312d700 1 --2- 192.168.123.103:0/1283163792 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd94c0ffc50 0x7fd94c100030 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.353+0000 7fd95312d700 1 -- 192.168.123.103:0/1283163792 >> 192.168.123.103:0/1283163792 conn(0x7fd94c0fb830 msgr2=0x7fd94c0fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:07:41.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.353+0000 7fd95312d700 1 -- 192.168.123.103:0/1283163792 shutdown_connections 2026-03-09T00:07:41.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.353+0000 
7fd95312d700 1 -- 192.168.123.103:0/1283163792 wait complete. 2026-03-09T00:07:41.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.354+0000 7fd95312d700 1 Processor -- start 2026-03-09T00:07:41.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.354+0000 7fd95312d700 1 -- start start 2026-03-09T00:07:41.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.354+0000 7fd95312d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd94c0ffc50 0x7fd94c198d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:41.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.354+0000 7fd95312d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd94c100600 0x7fd94c199270 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:41.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.354+0000 7fd95312d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd94c199950 con 0x7fd94c0ffc50 2026-03-09T00:07:41.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.354+0000 7fd95312d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd94c19d6e0 con 0x7fd94c100600 2026-03-09T00:07:41.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.354+0000 7fd950ec9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd94c0ffc50 0x7fd94c198d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:41.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.354+0000 7fd950ec9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd94c0ffc50 0x7fd94c198d30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57434/0 (socket says 192.168.123.103:57434) 2026-03-09T00:07:41.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.354+0000 7fd950ec9700 1 -- 192.168.123.103:0/608792192 learned_addr learned my addr 192.168.123.103:0/608792192 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:07:41.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.354+0000 7fd950ec9700 1 -- 192.168.123.103:0/608792192 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd94c100600 msgr2=0x7fd94c199270 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:07:41.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.354+0000 7fd94bfff700 1 --2- 192.168.123.103:0/608792192 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd94c100600 0x7fd94c199270 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:41.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.355+0000 7fd950ec9700 1 --2- 192.168.123.103:0/608792192 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd94c100600 0x7fd94c199270 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.355+0000 7fd950ec9700 1 -- 192.168.123.103:0/608792192 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd940009710 con 0x7fd94c0ffc50 2026-03-09T00:07:41.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.355+0000 7fd950ec9700 1 --2- 192.168.123.103:0/608792192 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd94c0ffc50 0x7fd94c198d30 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fd93c009b00 tx=0x7fd93c00f740 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:07:41.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.355+0000 7fd949ffb700 1 -- 192.168.123.103:0/608792192 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd93c01d070 con 0x7fd94c0ffc50 2026-03-09T00:07:41.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.355+0000 7fd949ffb700 1 -- 192.168.123.103:0/608792192 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fd93c00fda0 con 0x7fd94c0ffc50 2026-03-09T00:07:41.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.355+0000 7fd949ffb700 1 -- 192.168.123.103:0/608792192 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd93c017890 con 0x7fd94c0ffc50 2026-03-09T00:07:41.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.355+0000 7fd95312d700 1 -- 192.168.123.103:0/608792192 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd93c0097e0 con 0x7fd94c0ffc50 2026-03-09T00:07:41.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.355+0000 7fd95312d700 1 -- 192.168.123.103:0/608792192 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd94c19dc40 con 0x7fd94c0ffc50 2026-03-09T00:07:41.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.357+0000 7fd949ffb700 1 -- 192.168.123.103:0/608792192 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd93c0179f0 con 0x7fd94c0ffc50 2026-03-09T00:07:41.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.357+0000 7fd949ffb700 1 --2- 192.168.123.103:0/608792192 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd9340779e0 0x7fd934079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:41.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.357+0000 7fd949ffb700 1 -- 192.168.123.103:0/608792192 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(62..62 src has 1..62) v4 ==== 6426+0+0 (secure 0 0 0) 0x7fd93c09c0b0 con 0x7fd94c0ffc50 2026-03-09T00:07:41.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.358+0000 7fd94bfff700 1 --2- 192.168.123.103:0/608792192 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd9340779e0 0x7fd934079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:41.360 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.358+0000 7fd95312d700 1 -- 192.168.123.103:0/608792192 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd94c04ea90 con 0x7fd94c0ffc50 2026-03-09T00:07:41.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.361+0000 7fd94bfff700 1 --2- 192.168.123.103:0/608792192 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd9340779e0 0x7fd934079ea0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fd94c19a350 tx=0x7fd940009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:07:41.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.361+0000 7fd949ffb700 1 -- 192.168.123.103:0/608792192 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd93c0648f0 con 0x7fd94c0ffc50 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.526+0000 7fd95312d700 1 -- 192.168.123.103:0/608792192 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fd94c19a150 con 0x7fd94c0ffc50 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.527+0000 7fd949ffb700 1 -- 192.168.123.103:0/608792192 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7fd93c027070 con 0x7fd94c0ffc50 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 5, 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 9, 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-09T00:07:41.527 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T00:07:41.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.530+0000 7fd95312d700 1 -- 192.168.123.103:0/608792192 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd9340779e0 msgr2=0x7fd934079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:41.530 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.530+0000 7fd95312d700 1 --2- 192.168.123.103:0/608792192 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd9340779e0 0x7fd934079ea0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fd94c19a350 tx=0x7fd940009450 comp rx=0 tx=0).stop 2026-03-09T00:07:41.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.530+0000 7fd95312d700 1 -- 192.168.123.103:0/608792192 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd94c0ffc50 msgr2=0x7fd94c198d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:41.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.530+0000 7fd95312d700 1 --2- 192.168.123.103:0/608792192 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd94c0ffc50 0x7fd94c198d30 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fd93c009b00 tx=0x7fd93c00f740 comp rx=0 tx=0).stop 2026-03-09T00:07:41.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.530+0000 7fd95312d700 1 -- 192.168.123.103:0/608792192 shutdown_connections 2026-03-09T00:07:41.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.530+0000 7fd95312d700 1 --2- 192.168.123.103:0/608792192 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd9340779e0 0x7fd934079ea0 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.530+0000 7fd95312d700 1 --2- 192.168.123.103:0/608792192 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd94c0ffc50 0x7fd94c198d30 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.530 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.530+0000 7fd95312d700 1 --2- 192.168.123.103:0/608792192 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd94c100600 0x7fd94c199270 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.530+0000 7fd95312d700 1 -- 192.168.123.103:0/608792192 >> 192.168.123.103:0/608792192 conn(0x7fd94c0fb830 msgr2=0x7fd94c0fcdd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:07:41.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.531+0000 7fd95312d700 1 -- 192.168.123.103:0/608792192 shutdown_connections 2026-03-09T00:07:41.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.531+0000 7fd95312d700 1 -- 192.168.123.103:0/608792192 wait complete. 
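[annotation] The `ceph orch ps` listing above is the orchestrator's per-daemon inventory, and at this point in the run it shows the staggered split cleanly: crash, mon, mgr, and osd.0 already report 19.2.3-678-ge911bdeb while the remaining five OSDs and all four MDS daemons are still on 18.2.1. A sketch of checking that split programmatically, assuming a reachable cluster with the `ceph` CLI on PATH; the JSON field names (`daemon_type`, `daemon_id`, `version`) are assumptions based on cephadm's daemon description format.

```python
# Sketch, assuming a reachable cluster and the `ceph` CLI on PATH.
# Field names below are assumptions based on cephadm's daemon description.
import json
import subprocess
from collections import defaultdict

def daemons_not_on(target_version: str):
    out = subprocess.check_output(["ceph", "orch", "ps", "--format", "json"])
    lagging = defaultdict(list)
    for d in json.loads(out):
        version = d.get("version") or "unknown"
        if not version.startswith(target_version):
            lagging[d["daemon_type"]].append(d["daemon_id"])
    return dict(lagging)

# At this point in the run this would report five osds and four mds daemons.
print(daemons_not_on("19.2.3"))
```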
2026-03-09T00:07:41.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.601+0000 7f21b63ee700 1 -- 192.168.123.103:0/166612654 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f21b0103340 msgr2=0x7f21b0103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:41.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.601+0000 7f21b63ee700 1 --2- 192.168.123.103:0/166612654 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f21b0103340 0x7f21b0103720 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f21a0009b00 tx=0x7f21a0009e10 comp rx=0 tx=0).stop 2026-03-09T00:07:41.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.601+0000 7f21b63ee700 1 -- 192.168.123.103:0/166612654 shutdown_connections 2026-03-09T00:07:41.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.601+0000 7f21b63ee700 1 --2- 192.168.123.103:0/166612654 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21b0103cf0 0x7f21b0107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.601+0000 7f21b63ee700 1 --2- 192.168.123.103:0/166612654 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f21b0103340 0x7f21b0103720 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.601+0000 7f21b63ee700 1 -- 192.168.123.103:0/166612654 >> 192.168.123.103:0/166612654 conn(0x7f21b00feb90 msgr2=0x7f21b0100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:07:41.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.601+0000 7f21b63ee700 1 -- 192.168.123.103:0/166612654 shutdown_connections 2026-03-09T00:07:41.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.601+0000 7f21b63ee700 1 -- 192.168.123.103:0/166612654 wait complete. 
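[annotation] The `ceph versions` payload a few entries back is the aggregate view of the same thing: for each daemon type, a map of full version string to daemon count, plus an "overall" rollup (here 9 daemons on reef 18.2.1 and 5 on squid 19.2.3). That JSON shape, which is exactly what the log shows, is enough to compute how far the upgrade still has to go; a sketch assuming only the `ceph` CLI:

```python
# Sketch: count daemons still reporting the pre-upgrade release, using the
# JSON shape of `ceph versions` shown in the log above.
import json
import subprocess

def pending_by_type(old_release: str = "18.2.1"):
    out = subprocess.check_output(["ceph", "versions", "--format", "json"])
    pending = {}
    for daemon_type, counts in json.loads(out).items():
        if daemon_type == "overall":
            continue
        stale = sum(n for ver, n in counts.items() if old_release in ver)
        if stale:
            pending[daemon_type] = stale
    return pending

print(pending_by_type())  # at this point in the run: {'osd': 5, 'mds': 4}
```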
2026-03-09T00:07:41.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.602+0000 7f21b63ee700 1 Processor -- start 2026-03-09T00:07:41.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.602+0000 7f21b63ee700 1 -- start start 2026-03-09T00:07:41.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.602+0000 7f21b63ee700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f21b0103340 0x7f21b0198d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:41.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.602+0000 7f21b63ee700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21b0103cf0 0x7f21b01992b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:41.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.602+0000 7f21b63ee700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f21b0199990 con 0x7f21b0103cf0 2026-03-09T00:07:41.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.602+0000 7f21b63ee700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f21b019d720 con 0x7f21b0103340 2026-03-09T00:07:41.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.602+0000 7f21a7fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21b0103cf0 0x7f21b01992b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:41.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.602+0000 7f21affff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f21b0103340 0x7f21b0198d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:41.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.602+0000 7f21affff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f21b0103340 0x7f21b0198d70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:53456/0 (socket says 192.168.123.103:53456) 2026-03-09T00:07:41.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.603+0000 7f21affff700 1 -- 192.168.123.103:0/1135943867 learned_addr learned my addr 192.168.123.103:0/1135943867 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:07:41.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.605+0000 7f21affff700 1 -- 192.168.123.103:0/1135943867 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21b0103cf0 msgr2=0x7f21b01992b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:41.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.605+0000 7f21affff700 1 --2- 192.168.123.103:0/1135943867 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21b0103cf0 0x7f21b01992b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.605+0000 7f21affff700 1 -- 192.168.123.103:0/1135943867 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f21a00097e0 con 0x7f21b0103340 
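[annotation] The entries above also show the monclient "hunting" behaviour that precedes every one of these commands: the client opens connections to both monitors (mon.0 at 192.168.123.103:3300 and mon.1 at 192.168.123.106:3300) in parallel, keeps whichever completes its handshake first, and marks the other down mid-handshake (here the .103 connection is stopped in AUTH_CONNECTING once mon.1 answers). A toy first-responder-wins sketch of that pattern, with a random sleep standing in for the real banner/hello/auth exchange:

```python
# Toy illustration of the hunting pattern: race a handshake against every
# monitor, keep the first to finish, cancel (mark_down) the rest.
# The addresses are the two mons from this run; the sleep is a stand-in
# for the real banner/hello/auth exchange.
import asyncio
import random

MONS = ["192.168.123.103:3300", "192.168.123.106:3300"]

async def handshake(addr: str) -> str:
    await asyncio.sleep(random.uniform(0.01, 0.05))
    return addr

async def hunt() -> str:
    tasks = [asyncio.create_task(handshake(m)) for m in MONS]
    done, pending = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
    for t in pending:
        t.cancel()  # the mark_down of the slower connection
    return done.pop().result()

print("session with", asyncio.run(hunt()))
```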
2026-03-09T00:07:41.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.605+0000 7f21a7fff700 1 --2- 192.168.123.103:0/1135943867 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21b0103cf0 0x7f21b01992b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T00:07:41.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.605+0000 7f21affff700 1 --2- 192.168.123.103:0/1135943867 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f21b0103340 0x7f21b0198d70 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f21a000c010 tx=0x7f21a000ba00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:07:41.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.605+0000 7f21adffb700 1 -- 192.168.123.103:0/1135943867 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f21a001d070 con 0x7f21b0103340 2026-03-09T00:07:41.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.606+0000 7f21adffb700 1 -- 192.168.123.103:0/1135943867 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f21a000f460 con 0x7f21b0103340 2026-03-09T00:07:41.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.606+0000 7f21adffb700 1 -- 192.168.123.103:0/1135943867 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f21a001d070 con 0x7f21b0103340 2026-03-09T00:07:41.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.607+0000 7f21b63ee700 1 -- 192.168.123.103:0/1135943867 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f21b019d9a0 con 0x7f21b0103340 2026-03-09T00:07:41.607 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.607+0000 7f21b63ee700 1 -- 192.168.123.103:0/1135943867 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f21b019de10 con 0x7f21b0103340 2026-03-09T00:07:41.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.607+0000 7f21b63ee700 1 -- 192.168.123.103:0/1135943867 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f21b010b670 con 0x7f21b0103340 2026-03-09T00:07:41.610 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.609+0000 7f21adffb700 1 -- 192.168.123.103:0/1135943867 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f21a0003a40 con 0x7f21b0103340 2026-03-09T00:07:41.610 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.609+0000 7f21adffb700 1 --2- 192.168.123.103:0/1135943867 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f219c077990 0x7f219c079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:41.610 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.610+0000 7f21adffb700 1 -- 192.168.123.103:0/1135943867 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(62..62 src has 1..62) v4 ==== 6426+0+0 (secure 0 0 0) 0x7f21a006c6b0 con 0x7f21b0103340 2026-03-09T00:07:41.610 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.610+0000 7f21a7fff700 1 --2- 192.168.123.103:0/1135943867 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f219c077990 0x7f219c079e50 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:41.611 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.610+0000 7f21a7fff700 1 --2- 192.168.123.103:0/1135943867 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f219c077990 0x7f219c079e50 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f21b019a390 tx=0x7f2198009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:07:41.611 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.611+0000 7f21adffb700 1 -- 192.168.123.103:0/1135943867 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f21a005f9f0 con 0x7f21b0103340 2026-03-09T00:07:41.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.758+0000 7f21b63ee700 1 -- 192.168.123.103:0/1135943867 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f21b019a1d0 con 0x7f21b0103340 2026-03-09T00:07:41.759 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.758+0000 7f21adffb700 1 -- 192.168.123.103:0/1135943867 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1919 (secure 0 0 0) 0x7f21a005f9f0 con 0x7f21b0103340 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:e11 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:epoch 11 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T00:01:42.952984+0000 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T00:01:51.424075+0000 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-09T00:07:41.760 
INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 39 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:in 0 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14480} 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members: 2026-03-09T00:07:41.760 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.sejksk{0:14480} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T00:07:41.761 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:07:41.761 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:07:41.761 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons: 2026-03-09T00:07:41.761 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:07:41.761 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.ralade{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T00:07:41.761 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.ixduim{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T00:07:41.761 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.vlrwtl{-1:24297} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6824/1090058295,v1:192.168.123.106:6825/1090058295] compat {c=[1],r=[1],i=[7ff]}] 2026-03-09T00:07:41.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.762+0000 7f21b63ee700 1 -- 192.168.123.103:0/1135943867 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f219c077990 msgr2=0x7f219c079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:41.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.762+0000 7f21b63ee700 1 --2- 192.168.123.103:0/1135943867 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f219c077990 0x7f219c079e50 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f21b019a390 tx=0x7f2198009450 comp rx=0 tx=0).stop 2026-03-09T00:07:41.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.762+0000 7f21b63ee700 1 -- 192.168.123.103:0/1135943867 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f21b0103340 msgr2=0x7f21b0198d70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:41.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.762+0000 7f21b63ee700 1 --2- 192.168.123.103:0/1135943867 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f21b0103340 0x7f21b0198d70 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f21a000c010 tx=0x7f21a000ba00 comp rx=0 tx=0).stop 2026-03-09T00:07:41.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.763+0000 7f21b63ee700 1 -- 192.168.123.103:0/1135943867 shutdown_connections 2026-03-09T00:07:41.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.763+0000 7f21b63ee700 1 --2- 192.168.123.103:0/1135943867 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f219c077990 0x7f219c079e50 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.763+0000 7f21b63ee700 1 --2- 192.168.123.103:0/1135943867 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f21b0103340 0x7f21b0198d70 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.763+0000 7f21b63ee700 1 --2- 192.168.123.103:0/1135943867 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f21b0103cf0 0x7f21b01992b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.763+0000 7f21b63ee700 1 -- 192.168.123.103:0/1135943867 >> 192.168.123.103:0/1135943867 conn(0x7f21b00feb90 msgr2=0x7f21b0100f30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:07:41.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.763+0000 7f21b63ee700 1 -- 192.168.123.103:0/1135943867 shutdown_connections 2026-03-09T00:07:41.763 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.763+0000 7f21b63ee700 1 -- 192.168.123.103:0/1135943867 wait complete. 
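[annotation] The `fs dump` above is the state the volume-creation steps are expected to leave behind: a single filesystem with max_mds 1, rank 0 up:active on mds.cephfs.vm03.sejksk, three up:standby daemons, and inline_data enabled. A sketch of asserting that from the JSON form of the dump; `filesystems` and `standbys` are top-level keys of `ceph fs dump --format json`, though the exact mdsmap field names should be treated as assumptions.

```python
# Sketch, assuming the `ceph` CLI; mdsmap field names are assumptions.
import json
import subprocess

dump = json.loads(
    subprocess.check_output(["ceph", "fs", "dump", "--format", "json"])
)

assert len(dump["filesystems"]) == 1, "expected exactly one filesystem"
mdsmap = dump["filesystems"][0]["mdsmap"]
assert mdsmap["max_mds"] == 1, "suite pins a single active rank"
assert len(mdsmap["in"]) == 1, "rank 0 should be the only in rank"
assert len(dump.get("standbys", [])) >= 1, "standby daemons expected"
print("fsmap epoch", dump["epoch"], "matches expectations")
```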
2026-03-09T00:07:41.764 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 11 2026-03-09T00:07:41.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.835+0000 7efdf9e4a700 1 -- 192.168.123.103:0/3751853649 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdf4101a80 msgr2=0x7efdf4105ad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:41.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.835+0000 7efdf9e4a700 1 --2- 192.168.123.103:0/3751853649 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdf4101a80 0x7efdf4105ad0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7efde4009b00 tx=0x7efde4009e10 comp rx=0 tx=0).stop 2026-03-09T00:07:41.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.835+0000 7efdf9e4a700 1 -- 192.168.123.103:0/3751853649 shutdown_connections 2026-03-09T00:07:41.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.835+0000 7efdf9e4a700 1 --2- 192.168.123.103:0/3751853649 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdf4101a80 0x7efdf4105ad0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.835+0000 7efdf9e4a700 1 --2- 192.168.123.103:0/3751853649 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efdf41010d0 0x7efdf41014b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.835+0000 7efdf9e4a700 1 -- 192.168.123.103:0/3751853649 >> 192.168.123.103:0/3751853649 conn(0x7efdf40fc920 msgr2=0x7efdf40fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:07:41.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.835+0000 7efdf9e4a700 1 -- 192.168.123.103:0/3751853649 shutdown_connections 2026-03-09T00:07:41.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.835+0000 7efdf9e4a700 1 -- 192.168.123.103:0/3751853649 wait complete. 
2026-03-09T00:07:41.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.836+0000 7efdf9e4a700 1 Processor -- start 2026-03-09T00:07:41.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.836+0000 7efdf9e4a700 1 -- start start 2026-03-09T00:07:41.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.836+0000 7efdf9e4a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdf41010d0 0x7efdf4198d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:41.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.836+0000 7efdf9e4a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efdf4101a80 0x7efdf41992d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:41.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.836+0000 7efdf9e4a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efdf41999b0 con 0x7efdf41010d0 2026-03-09T00:07:41.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.836+0000 7efdf9e4a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efdf419d740 con 0x7efdf4101a80 2026-03-09T00:07:41.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.837+0000 7efdf37fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdf41010d0 0x7efdf4198d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:41.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.837+0000 7efdf37fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdf41010d0 0x7efdf4198d90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57480/0 (socket says 192.168.123.103:57480) 2026-03-09T00:07:41.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.837+0000 7efdf37fe700 1 -- 192.168.123.103:0/2034045565 learned_addr learned my addr 192.168.123.103:0/2034045565 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:07:41.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.837+0000 7efdf2ffd700 1 --2- 192.168.123.103:0/2034045565 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efdf4101a80 0x7efdf41992d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:41.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.837+0000 7efdf37fe700 1 -- 192.168.123.103:0/2034045565 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efdf4101a80 msgr2=0x7efdf41992d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:41.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.837+0000 7efdf37fe700 1 --2- 192.168.123.103:0/2034045565 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efdf4101a80 0x7efdf41992d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.837+0000 7efdf37fe700 1 -- 192.168.123.103:0/2034045565 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7efde40097e0 con 0x7efdf41010d0 2026-03-09T00:07:41.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.837+0000 7efdf2ffd700 1 --2- 192.168.123.103:0/2034045565 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efdf4101a80 0x7efdf41992d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:07:41.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.837+0000 7efdf37fe700 1 --2- 192.168.123.103:0/2034045565 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdf41010d0 0x7efdf4198d90 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7efddc00dc40 tx=0x7efddc00be10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:07:41.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.837+0000 7efdf0ff9700 1 -- 192.168.123.103:0/2034045565 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efddc0099a0 con 0x7efdf41010d0 2026-03-09T00:07:41.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.837+0000 7efdf0ff9700 1 -- 192.168.123.103:0/2034045565 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7efddc010460 con 0x7efdf41010d0 2026-03-09T00:07:41.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.837+0000 7efdf0ff9700 1 -- 192.168.123.103:0/2034045565 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efddc00f6f0 con 0x7efdf41010d0 2026-03-09T00:07:41.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.837+0000 7efdf9e4a700 1 -- 192.168.123.103:0/2034045565 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efdf419da20 con 0x7efdf41010d0 2026-03-09T00:07:41.839 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.838+0000 7efdf9e4a700 1 -- 192.168.123.103:0/2034045565 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efdf419df70 con 0x7efdf41010d0 2026-03-09T00:07:41.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.839+0000 7efdf0ff9700 1 -- 192.168.123.103:0/2034045565 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7efddc00f850 con 0x7efdf41010d0 2026-03-09T00:07:41.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.839+0000 7efdf9e4a700 1 -- 192.168.123.103:0/2034045565 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efdf404ea90 con 0x7efdf41010d0 2026-03-09T00:07:41.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.840+0000 7efdf0ff9700 1 --2- 192.168.123.103:0/2034045565 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7efde0077870 0x7efde0079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:41.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.840+0000 7efdf0ff9700 1 -- 192.168.123.103:0/2034045565 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(62..62 src has 1..62) v4 ==== 6426+0+0 (secure 0 0 0) 0x7efddc0999a0 con 0x7efdf41010d0 2026-03-09T00:07:41.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.840+0000 7efdf2ffd700 1 --2- 192.168.123.103:0/2034045565 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7efde0077870 
0x7efde0079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:41.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.841+0000 7efdf2ffd700 1 --2- 192.168.123.103:0/2034045565 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7efde0077870 0x7efde0079d30 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7efde4009fd0 tx=0x7efde4005fd0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:07:41.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.842+0000 7efdf0ff9700 1 -- 192.168.123.103:0/2034045565 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7efddc0620b0 con 0x7efdf41010d0 2026-03-09T00:07:41.972 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:41 vm03.local ceph-mon[129670]: from='client.34204 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:07:41.972 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:41 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/608792192' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:07:41.972 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:41 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/1135943867' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T00:07:41.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.971+0000 7efdf9e4a700 1 -- 192.168.123.103:0/2034045565 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7efdf419e250 con 0x7efde0077870 2026-03-09T00:07:41.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.976+0000 7efdf0ff9700 1 -- 192.168.123.103:0/2034045565 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7efdf419e250 con 0x7efde0077870 2026-03-09T00:07:41.977 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T00:07:41.977 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T00:07:41.977 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-09T00:07:41.977 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T00:07:41.977 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-09T00:07:41.977 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-09T00:07:41.977 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-09T00:07:41.977 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-09T00:07:41.977 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-09T00:07:41.977 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "7/23 daemons upgraded", 2026-03-09T00:07:41.977 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons", 2026-03-09T00:07:41.977 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-09T00:07:41.977 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T00:07:41.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.979+0000 7efdf9e4a700 1 -- 192.168.123.103:0/2034045565 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7efde0077870 msgr2=0x7efde0079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:41.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.979+0000 7efdf9e4a700 1 --2- 192.168.123.103:0/2034045565 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7efde0077870 0x7efde0079d30 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7efde4009fd0 tx=0x7efde4005fd0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.979+0000 7efdf9e4a700 1 -- 192.168.123.103:0/2034045565 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdf41010d0 msgr2=0x7efdf4198d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:41.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.979+0000 7efdf9e4a700 1 --2- 192.168.123.103:0/2034045565 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdf41010d0 0x7efdf4198d90 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7efddc00dc40 tx=0x7efddc00be10 comp rx=0 tx=0).stop 2026-03-09T00:07:41.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.979+0000 7efdf9e4a700 1 -- 192.168.123.103:0/2034045565 shutdown_connections 2026-03-09T00:07:41.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.979+0000 7efdf9e4a700 1 --2- 192.168.123.103:0/2034045565 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7efde0077870 0x7efde0079d30 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.979+0000 7efdf9e4a700 1 --2- 192.168.123.103:0/2034045565 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efdf41010d0 0x7efdf4198d90 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.979+0000 7efdf9e4a700 1 --2- 192.168.123.103:0/2034045565 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efdf4101a80 0x7efdf41992d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:41.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.979+0000 7efdf9e4a700 1 -- 192.168.123.103:0/2034045565 >> 192.168.123.103:0/2034045565 conn(0x7efdf40fc920 msgr2=0x7efdf4104ab0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:07:41.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.980+0000 7efdf9e4a700 1 -- 192.168.123.103:0/2034045565 shutdown_connections 2026-03-09T00:07:41.980 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:41.980+0000 7efdf9e4a700 1 -- 192.168.123.103:0/2034045565 wait complete. 
2026-03-09T00:07:42.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.046+0000 7f127b51e700 1 -- 192.168.123.103:0/3738022108 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1274073130 msgr2=0x7f1274073510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:42.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.046+0000 7f127b51e700 1 --2- 192.168.123.103:0/3738022108 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1274073130 0x7f1274073510 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f1264009b00 tx=0x7f1264009e10 comp rx=0 tx=0).stop 2026-03-09T00:07:42.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.047+0000 7f127b51e700 1 -- 192.168.123.103:0/3738022108 shutdown_connections 2026-03-09T00:07:42.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.047+0000 7f127b51e700 1 --2- 192.168.123.103:0/3738022108 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1274073a50 0x7f1274111960 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:42.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.047+0000 7f127b51e700 1 --2- 192.168.123.103:0/3738022108 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1274073130 0x7f1274073510 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:42.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.047+0000 7f127b51e700 1 -- 192.168.123.103:0/3738022108 >> 192.168.123.103:0/3738022108 conn(0x7f12740fc920 msgr2=0x7f12740fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:07:42.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.047+0000 7f127b51e700 1 -- 192.168.123.103:0/3738022108 shutdown_connections 2026-03-09T00:07:42.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.047+0000 7f127b51e700 1 -- 192.168.123.103:0/3738022108 wait complete. 
2026-03-09T00:07:42.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.047+0000 7f127b51e700 1 Processor -- start 2026-03-09T00:07:42.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.048+0000 7f127b51e700 1 -- start start 2026-03-09T00:07:42.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.048+0000 7f127b51e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1274073130 0x7f127419d210 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:42.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.048+0000 7f127b51e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1274073a50 0x7f127419d750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:42.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.048+0000 7f127b51e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f127419de30 con 0x7f1274073130 2026-03-09T00:07:42.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.048+0000 7f127b51e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f12741a1bc0 con 0x7f1274073a50 2026-03-09T00:07:42.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.048+0000 7f12792ba700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1274073130 0x7f127419d210 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:42.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.048+0000 7f12792ba700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1274073130 0x7f127419d210 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57510/0 (socket says 192.168.123.103:57510) 2026-03-09T00:07:42.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.048+0000 7f12792ba700 1 -- 192.168.123.103:0/4291230757 learned_addr learned my addr 192.168.123.103:0/4291230757 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:07:42.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.048+0000 7f1278ab9700 1 --2- 192.168.123.103:0/4291230757 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1274073a50 0x7f127419d750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:42.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.048+0000 7f12792ba700 1 -- 192.168.123.103:0/4291230757 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1274073a50 msgr2=0x7f127419d750 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:42.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.048+0000 7f12792ba700 1 --2- 192.168.123.103:0/4291230757 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1274073a50 0x7f127419d750 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:42.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.048+0000 7f12792ba700 1 -- 192.168.123.103:0/4291230757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f12640097e0 con 0x7f1274073130 2026-03-09T00:07:42.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.048+0000 7f1278ab9700 1 --2- 192.168.123.103:0/4291230757 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1274073a50 0x7f127419d750 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:07:42.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.049+0000 7f12792ba700 1 --2- 192.168.123.103:0/4291230757 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1274073130 0x7f127419d210 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f12640048c0 tx=0x7f12640049a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:07:42.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.049+0000 7f126a7fc700 1 -- 192.168.123.103:0/4291230757 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f126401d070 con 0x7f1274073130 2026-03-09T00:07:42.049 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.049+0000 7f126a7fc700 1 -- 192.168.123.103:0/4291230757 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f126400bc50 con 0x7f1274073130 2026-03-09T00:07:42.050 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.049+0000 7f127b51e700 1 -- 192.168.123.103:0/4291230757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f12741a1ea0 con 0x7f1274073130 2026-03-09T00:07:42.050 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.049+0000 7f126a7fc700 1 -- 192.168.123.103:0/4291230757 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f126400f630 con 0x7f1274073130 2026-03-09T00:07:42.050 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.049+0000 7f127b51e700 1 -- 192.168.123.103:0/4291230757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f12741a23f0 con 0x7f1274073130 2026-03-09T00:07:42.051 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.050+0000 7f126a7fc700 1 -- 192.168.123.103:0/4291230757 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f126400f7f0 con 0x7f1274073130 2026-03-09T00:07:42.054 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.051+0000 7f125ffff700 1 -- 192.168.123.103:0/4291230757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f127404ea90 con 0x7f1274073130 2026-03-09T00:07:42.054 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.052+0000 7f126a7fc700 1 --2- 192.168.123.103:0/4291230757 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f12600778c0 0x7f1260079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:07:42.054 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.052+0000 7f126a7fc700 1 -- 192.168.123.103:0/4291230757 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(62..62 src has 1..62) v4 ==== 6426+0+0 (secure 0 0 0) 0x7f126409b0d0 con 0x7f1274073130 2026-03-09T00:07:42.054 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.053+0000 7f1278ab9700 1 --2- 192.168.123.103:0/4291230757 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f12600778c0 
0x7f1260079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:07:42.054 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.054+0000 7f1278ab9700 1 --2- 192.168.123.103:0/4291230757 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f12600778c0 0x7f1260079d80 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f127419e830 tx=0x7f1270008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:07:42.054 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.054+0000 7f126a7fc700 1 -- 192.168.123.103:0/4291230757 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f12640a0050 con 0x7f1274073130 2026-03-09T00:07:42.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:41 vm06.local ceph-mon[106218]: from='client.34204 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:07:42.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:41 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/608792192' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:07:42.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:41 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/1135943867' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T00:07:42.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.219+0000 7f125ffff700 1 -- 192.168.123.103:0/4291230757 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f1274066e80 con 0x7f1274073130 2026-03-09T00:07:42.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.220+0000 7f126a7fc700 1 -- 192.168.123.103:0/4291230757 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1651 (secure 0 0 0) 0x7f1264027690 con 0x7f1274073130 2026-03-09T00:07:42.221 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data; Degraded data redundancy: 907/231 objects degraded (392.641%), 10 pgs degraded, 10 pgs undersized 2026-03-09T00:07:42.221 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T00:07:42.221 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled. 
2026-03-09T00:07:42.221 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 907/231 objects degraded (392.641%), 10 pgs degraded, 10 pgs undersized 2026-03-09T00:07:42.221 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1 is stuck undersized for 71s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,4] 2026-03-09T00:07:42.221 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.6 is stuck undersized for 71s, current state active+recovery_wait+undersized+degraded+remapped, last acting [1,4] 2026-03-09T00:07:42.221 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.c is stuck undersized for 71s, current state active+recovery_wait+undersized+degraded+remapped, last acting [5,3] 2026-03-09T00:07:42.221 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.10 is stuck undersized for 71s, current state active+recovery_wait+undersized+degraded+remapped, last acting [5,1] 2026-03-09T00:07:42.221 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.12 is stuck undersized for 71s, current state active+recovery_wait+undersized+degraded+remapped, last acting [1,3] 2026-03-09T00:07:42.221 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.15 is stuck undersized for 71s, current state active+recovery_wait+undersized+degraded+remapped, last acting [3,4] 2026-03-09T00:07:42.221 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.17 is stuck undersized for 71s, current state active+recovering+undersized+degraded+remapped, last acting [2,5] 2026-03-09T00:07:42.221 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.18 is stuck undersized for 71s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,1] 2026-03-09T00:07:42.221 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1b is stuck undersized for 71s, current state active+recovery_wait+undersized+degraded+remapped, last acting [3,4] 2026-03-09T00:07:42.221 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1f is stuck undersized for 71s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,3] 2026-03-09T00:07:42.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.222+0000 7f125ffff700 1 -- 192.168.123.103:0/4291230757 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f12600778c0 msgr2=0x7f1260079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:42.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.223+0000 7f125ffff700 1 --2- 192.168.123.103:0/4291230757 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f12600778c0 0x7f1260079d80 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f127419e830 tx=0x7f1270008040 comp rx=0 tx=0).stop 2026-03-09T00:07:42.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.223+0000 7f125ffff700 1 -- 192.168.123.103:0/4291230757 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1274073130 msgr2=0x7f127419d210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:07:42.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.223+0000 7f125ffff700 1 --2- 192.168.123.103:0/4291230757 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1274073130 0x7f127419d210 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f12640048c0 tx=0x7f12640049a0 comp rx=0 tx=0).stop 2026-03-09T00:07:42.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.223+0000 7f125ffff700 1 -- 192.168.123.103:0/4291230757 shutdown_connections 2026-03-09T00:07:42.224 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.224+0000 7f125ffff700 1 --2- 192.168.123.103:0/4291230757 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f12600778c0 0x7f1260079d80 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:42.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.224+0000 7f125ffff700 1 --2- 192.168.123.103:0/4291230757 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1274073130 0x7f127419d210 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:42.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.224+0000 7f125ffff700 1 --2- 192.168.123.103:0/4291230757 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1274073a50 0x7f127419d750 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:07:42.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.224+0000 7f125ffff700 1 -- 192.168.123.103:0/4291230757 >> 192.168.123.103:0/4291230757 conn(0x7f12740fc920 msgr2=0x7f1274103470 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:07:42.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.224+0000 7f125ffff700 1 -- 192.168.123.103:0/4291230757 shutdown_connections 2026-03-09T00:07:42.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:07:42.225+0000 7f125ffff700 1 -- 192.168.123.103:0/4291230757 wait complete. 2026-03-09T00:07:43.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:42 vm03.local ceph-mon[129670]: from='client.44161 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:07:43.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:42 vm03.local ceph-mon[129670]: from='client.34210 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:07:43.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:42 vm03.local ceph-mon[129670]: pgmap v80: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 907/231 objects degraded (392.641%); 0 B/s, 9 objects/s recovering 2026-03-09T00:07:43.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:42 vm03.local ceph-mon[129670]: from='client.? 
192.168.123.103:0/4291230757' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T00:07:43.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:42 vm03.local ceph-mon[129670]: osdmap e63: 6 total, 6 up, 6 in 2026-03-09T00:07:43.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:42 vm06.local ceph-mon[106218]: from='client.44161 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:07:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:42 vm06.local ceph-mon[106218]: from='client.34210 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:07:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:42 vm06.local ceph-mon[106218]: pgmap v80: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 907/231 objects degraded (392.641%); 0 B/s, 9 objects/s recovering 2026-03-09T00:07:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:42 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/4291230757' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T00:07:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:42 vm06.local ceph-mon[106218]: osdmap e63: 6 total, 6 up, 6 in 2026-03-09T00:07:44.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:43 vm03.local ceph-mon[129670]: from='client.34222 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:07:44.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:43 vm03.local ceph-mon[129670]: osdmap e64: 6 total, 6 up, 6 in 2026-03-09T00:07:44.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:07:44.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:43 vm06.local ceph-mon[106218]: from='client.34222 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:07:44.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:43 vm06.local ceph-mon[106218]: osdmap e64: 6 total, 6 up, 6 in 2026-03-09T00:07:44.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:07:45.040 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:44 vm03.local ceph-mon[129670]: pgmap v83: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 907/231 objects degraded (392.641%); 0 B/s, 5 objects/s recovering 2026-03-09T00:07:45.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:44 vm06.local ceph-mon[106218]: pgmap v83: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 9 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 907/231 objects degraded (392.641%); 0 B/s, 5 objects/s recovering 2026-03-09T00:07:46.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:45 vm06.local ceph-mon[106218]: Health 
check update: Degraded data redundancy: 822/231 objects degraded (355.844%), 10 pgs degraded, 10 pgs undersized (PG_DEGRADED) 2026-03-09T00:07:46.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:45 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 822/231 objects degraded (355.844%), 10 pgs degraded, 10 pgs undersized (PG_DEGRADED) 2026-03-09T00:07:47.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:46 vm06.local ceph-mon[106218]: pgmap v84: 65 pgs: 2 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 822/231 objects degraded (355.844%); 0 B/s, 9 objects/s recovering 2026-03-09T00:07:47.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:46 vm03.local ceph-mon[129670]: pgmap v84: 65 pgs: 2 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 55 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 822/231 objects degraded (355.844%); 0 B/s, 9 objects/s recovering 2026-03-09T00:07:49.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:48 vm06.local ceph-mon[106218]: pgmap v85: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 812/231 objects degraded (351.515%); 0 B/s, 11 objects/s recovering 2026-03-09T00:07:49.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:48 vm03.local ceph-mon[129670]: pgmap v85: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 812/231 objects degraded (351.515%); 0 B/s, 11 objects/s recovering 2026-03-09T00:07:51.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:50 vm06.local ceph-mon[106218]: pgmap v86: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 812/231 objects degraded (351.515%); 0 B/s, 5 objects/s recovering 2026-03-09T00:07:51.174 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:50 vm03.local ceph-mon[129670]: pgmap v86: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 812/231 objects degraded (351.515%); 0 B/s, 5 objects/s recovering 2026-03-09T00:07:52.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:51 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:07:52.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:51 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:07:53.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:52 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:07:53.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:52 vm06.local ceph-mon[106218]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-09T00:07:53.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:52 vm06.local ceph-mon[106218]: pgmap v87: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 812/231 objects degraded (351.515%); 0 B/s, 10 objects/s recovering 2026-03-09T00:07:53.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:52 vm06.local ceph-mon[106218]: osdmap e65: 6 total, 6 up, 6 in 2026-03-09T00:07:53.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:52 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 812/231 objects degraded (351.515%), 9 pgs degraded, 9 pgs undersized (PG_DEGRADED) 2026-03-09T00:07:53.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:52 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:07:53.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:52 vm03.local ceph-mon[129670]: Upgrade: unsafe to stop osd(s) at this time (4 PGs are or would become offline) 2026-03-09T00:07:53.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:52 vm03.local ceph-mon[129670]: pgmap v87: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 812/231 objects degraded (351.515%); 0 B/s, 10 objects/s recovering 2026-03-09T00:07:53.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:52 vm03.local ceph-mon[129670]: osdmap e65: 6 total, 6 up, 6 in 2026-03-09T00:07:53.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:52 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 812/231 objects degraded (351.515%), 9 pgs degraded, 9 pgs undersized (PG_DEGRADED) 2026-03-09T00:07:54.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:53 vm06.local ceph-mon[106218]: osdmap e66: 6 total, 6 up, 6 in 2026-03-09T00:07:54.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:53 vm03.local ceph-mon[129670]: osdmap e66: 6 total, 6 up, 6 in 2026-03-09T00:07:55.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:54 vm03.local ceph-mon[129670]: pgmap v90: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 812/231 objects degraded (351.515%); 0 B/s, 8 objects/s recovering 2026-03-09T00:07:55.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:54 vm06.local ceph-mon[106218]: pgmap v90: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 8 active+recovery_wait+undersized+degraded+remapped, 56 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 812/231 objects degraded (351.515%); 0 B/s, 8 objects/s recovering 2026-03-09T00:07:56.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:55 vm03.local ceph-mon[129670]: pgmap v91: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 711/231 objects degraded (307.792%); 0 B/s, 11 
objects/s recovering 2026-03-09T00:07:56.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:55 vm06.local ceph-mon[106218]: pgmap v91: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 711/231 objects degraded (307.792%); 0 B/s, 11 objects/s recovering 2026-03-09T00:07:57.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:57 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 711/231 objects degraded (307.792%), 8 pgs degraded, 8 pgs undersized (PG_DEGRADED) 2026-03-09T00:07:57.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:57 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 711/231 objects degraded (307.792%), 8 pgs degraded, 8 pgs undersized (PG_DEGRADED) 2026-03-09T00:07:58.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:58 vm06.local ceph-mon[106218]: pgmap v92: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 711/231 objects degraded (307.792%); 0 B/s, 11 objects/s recovering 2026-03-09T00:07:58.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:58 vm03.local ceph-mon[129670]: pgmap v92: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 711/231 objects degraded (307.792%); 0 B/s, 11 objects/s recovering 2026-03-09T00:07:59.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:07:59 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:07:59.727 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:07:59 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:08:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:00 vm06.local ceph-mon[106218]: pgmap v93: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 711/231 objects degraded (307.792%); 0 B/s, 5 objects/s recovering 2026-03-09T00:08:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:00 vm03.local ceph-mon[129670]: pgmap v93: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 711/231 objects degraded (307.792%); 0 B/s, 5 objects/s recovering 2026-03-09T00:08:02.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:02 vm06.local ceph-mon[106218]: pgmap v94: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 711/231 objects degraded (307.792%); 0 B/s, 9 objects/s recovering 2026-03-09T00:08:03.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:02 vm03.local ceph-mon[129670]: pgmap v94: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 711/231 objects degraded 
(307.792%); 0 B/s, 9 objects/s recovering 2026-03-09T00:08:04.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:04 vm06.local ceph-mon[106218]: pgmap v95: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 711/231 objects degraded (307.792%); 0 B/s, 8 objects/s recovering 2026-03-09T00:08:04.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:04 vm06.local ceph-mon[106218]: osdmap e67: 6 total, 6 up, 6 in 2026-03-09T00:08:05.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:04 vm03.local ceph-mon[129670]: pgmap v95: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 7 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 711/231 objects degraded (307.792%); 0 B/s, 8 objects/s recovering 2026-03-09T00:08:05.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:04 vm03.local ceph-mon[129670]: osdmap e67: 6 total, 6 up, 6 in 2026-03-09T00:08:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:05 vm06.local ceph-mon[106218]: osdmap e68: 6 total, 6 up, 6 in 2026-03-09T00:08:06.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:05 vm03.local ceph-mon[129670]: osdmap e68: 6 total, 6 up, 6 in 2026-03-09T00:08:06.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:06 vm06.local ceph-mon[106218]: pgmap v98: 65 pgs: 2 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 634/231 objects degraded (274.459%); 0 B/s, 8 objects/s recovering 2026-03-09T00:08:06.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:06 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 634/231 objects degraded (274.459%), 8 pgs degraded, 8 pgs undersized (PG_DEGRADED) 2026-03-09T00:08:06.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:06 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:08:06.948 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:06 vm03.local ceph-mon[129670]: pgmap v98: 65 pgs: 2 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 57 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 634/231 objects degraded (274.459%); 0 B/s, 8 objects/s recovering 2026-03-09T00:08:06.948 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:06 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 634/231 objects degraded (274.459%), 8 pgs degraded, 8 pgs undersized (PG_DEGRADED) 2026-03-09T00:08:06.948 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:06 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:08:07.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:07 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:08:07.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:07 vm06.local ceph-mon[106218]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-09T00:08:08.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:07 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:08:08.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:07 vm03.local ceph-mon[129670]: Upgrade: unsafe to stop osd(s) at this time (3 PGs are or would become offline) 2026-03-09T00:08:09.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:08 vm03.local ceph-mon[129670]: pgmap v99: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 622/231 objects degraded (269.264%); 0 B/s, 11 objects/s recovering 2026-03-09T00:08:09.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:08 vm06.local ceph-mon[106218]: pgmap v99: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 622/231 objects degraded (269.264%); 0 B/s, 11 objects/s recovering 2026-03-09T00:08:11.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:10 vm03.local ceph-mon[129670]: pgmap v100: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 622/231 objects degraded (269.264%); 0 B/s, 5 objects/s recovering 2026-03-09T00:08:11.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:10 vm06.local ceph-mon[106218]: pgmap v100: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 622/231 objects degraded (269.264%); 0 B/s, 5 objects/s recovering 2026-03-09T00:08:12.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.308+0000 7f8d3d04e700 1 -- 192.168.123.103:0/729173988 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d38069180 msgr2=0x7f8d38069560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:08:12.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.308+0000 7f8d3d04e700 1 --2- 192.168.123.103:0/729173988 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d38069180 0x7f8d38069560 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f8d20009b50 tx=0x7f8d20009e60 comp rx=0 tx=0).stop 2026-03-09T00:08:12.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.308+0000 7f8d3d04e700 1 -- 192.168.123.103:0/729173988 shutdown_connections 2026-03-09T00:08:12.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.308+0000 7f8d3d04e700 1 --2- 192.168.123.103:0/729173988 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d38069aa0 0x7f8d3810d5b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:12.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.308+0000 7f8d3d04e700 1 --2- 192.168.123.103:0/729173988 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d38069180 0x7f8d38069560 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:12.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.308+0000 7f8d3d04e700 1 -- 192.168.123.103:0/729173988 >> 192.168.123.103:0/729173988 conn(0x7f8d38076b70 msgr2=0x7f8d38076f80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:08:12.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.309+0000 7f8d3d04e700 1 -- 192.168.123.103:0/729173988 shutdown_connections 2026-03-09T00:08:12.309 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.309+0000 7f8d3d04e700 1 -- 192.168.123.103:0/729173988 wait complete. 2026-03-09T00:08:12.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.309+0000 7f8d3d04e700 1 Processor -- start 2026-03-09T00:08:12.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.310+0000 7f8d3d04e700 1 -- start start 2026-03-09T00:08:12.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.310+0000 7f8d3d04e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d38069aa0 0x7f8d38199120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:12.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.310+0000 7f8d3d04e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d38199660 0x7f8d3819dad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:12.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.310+0000 7f8d3d04e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8d38199c80 con 0x7f8d38069aa0 2026-03-09T00:08:12.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.310+0000 7f8d3d04e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8d38199df0 con 0x7f8d38199660 2026-03-09T00:08:12.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.310+0000 7f8d3659c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d38199660 0x7f8d3819dad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:08:12.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.310+0000 7f8d3659c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d38199660 0x7f8d3819dad0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:47398/0 (socket says 192.168.123.103:47398) 2026-03-09T00:08:12.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.310+0000 7f8d3659c700 1 -- 192.168.123.103:0/3399303566 learned_addr learned my addr 192.168.123.103:0/3399303566 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:08:12.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.311+0000 7f8d3659c700 1 -- 192.168.123.103:0/3399303566 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d38069aa0 msgr2=0x7f8d38199120 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:08:12.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.311+0000 7f8d36d9d700 1 --2- 192.168.123.103:0/3399303566 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d38069aa0 0x7f8d38199120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T00:08:12.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.311+0000 7f8d3659c700 1 --2- 192.168.123.103:0/3399303566 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d38069aa0 0x7f8d38199120 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:12.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.311+0000 7f8d3659c700 1 -- 192.168.123.103:0/3399303566 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8d200097e0 con 0x7f8d38199660 2026-03-09T00:08:12.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.311+0000 7f8d36d9d700 1 --2- 192.168.123.103:0/3399303566 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d38069aa0 0x7f8d38199120 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:08:12.311 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.311+0000 7f8d3659c700 1 --2- 192.168.123.103:0/3399303566 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d38199660 0x7f8d3819dad0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f8d2800eb10 tx=0x7f8d2800eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:08:12.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.367+0000 7f8d2ffff700 1 -- 192.168.123.103:0/3399303566 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8d2800cca0 con 0x7f8d38199660 2026-03-09T00:08:12.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.367+0000 7f8d2ffff700 1 -- 192.168.123.103:0/3399303566 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f8d2800ce00 con 0x7f8d38199660 2026-03-09T00:08:12.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.367+0000 7f8d2ffff700 1 -- 192.168.123.103:0/3399303566 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8d280189c0 con 0x7f8d38199660 2026-03-09T00:08:12.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.367+0000 7f8d3d04e700 1 -- 192.168.123.103:0/3399303566 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8d3819e0d0 con 0x7f8d38199660 2026-03-09T00:08:12.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.367+0000 7f8d3d04e700 1 -- 192.168.123.103:0/3399303566 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8d3819e5d0 con 0x7f8d38199660 2026-03-09T00:08:12.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.368+0000 7f8d3d04e700 1 -- 192.168.123.103:0/3399303566 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8d3810ad20 con 0x7f8d38199660 2026-03-09T00:08:12.370 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.369+0000 7f8d2ffff700 1 -- 192.168.123.103:0/3399303566 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8d28018b20 con 0x7f8d38199660 2026-03-09T00:08:12.371 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.370+0000 7f8d2ffff700 1 --2- 192.168.123.103:0/3399303566 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8d24077990 0x7f8d24079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:12.371 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.371+0000 7f8d2ffff700 1 -- 192.168.123.103:0/3399303566 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(68..68 src has 1..68) v4 ==== 6339+0+0 (secure 0 0 0) 0x7f8d28014070 con 0x7f8d38199660
2026-03-09T00:08:12.371 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.371+0000 7f8d36d9d700 1 --2- 192.168.123.103:0/3399303566 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8d24077990 0x7f8d24079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:12.372 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.371+0000 7f8d36d9d700 1 --2- 192.168.123.103:0/3399303566 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8d24077990 0x7f8d24079e50 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f8d20006010 tx=0x7f8d200058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:12.372 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.372+0000 7f8d2ffff700 1 -- 192.168.123.103:0/3399303566 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8d28063410 con 0x7f8d38199660
2026-03-09T00:08:12.517 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.516+0000 7f8d3d04e700 1 -- 192.168.123.103:0/3399303566 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8d3819e8b0 con 0x7f8d24077990
2026-03-09T00:08:12.518 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.518+0000 7f8d2ffff700 1 -- 192.168.123.103:0/3399303566 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f8d3819e8b0 con 0x7f8d24077990
2026-03-09T00:08:12.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.521+0000 7f8d3d04e700 1 -- 192.168.123.103:0/3399303566 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8d24077990 msgr2=0x7f8d24079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:12.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.521+0000 7f8d3d04e700 1 --2- 192.168.123.103:0/3399303566 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8d24077990 0x7f8d24079e50 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f8d20006010 tx=0x7f8d200058e0 comp rx=0 tx=0).stop
2026-03-09T00:08:12.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.521+0000 7f8d3d04e700 1 -- 192.168.123.103:0/3399303566 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d38199660 msgr2=0x7f8d3819dad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:12.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.521+0000 7f8d3d04e700 1 --2- 192.168.123.103:0/3399303566 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d38199660 0x7f8d3819dad0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f8d2800eb10 tx=0x7f8d2800eed0 comp rx=0 tx=0).stop
2026-03-09T00:08:12.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.521+0000 7f8d3d04e700 1 -- 192.168.123.103:0/3399303566 shutdown_connections
2026-03-09T00:08:12.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.521+0000 7f8d3d04e700 1 --2- 192.168.123.103:0/3399303566 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f8d24077990 0x7f8d24079e50 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:12.521 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.521+0000 7f8d3d04e700 1 --2- 192.168.123.103:0/3399303566 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f8d38069aa0 0x7f8d38199120 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:12.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.521+0000 7f8d3d04e700 1 --2- 192.168.123.103:0/3399303566 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f8d38199660 0x7f8d3819dad0 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:12.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.521+0000 7f8d3d04e700 1 -- 192.168.123.103:0/3399303566 >> 192.168.123.103:0/3399303566 conn(0x7f8d38076b70 msgr2=0x7f8d380febc0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:08:12.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.521+0000 7f8d3d04e700 1 -- 192.168.123.103:0/3399303566 shutdown_connections
2026-03-09T00:08:12.522 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.522+0000 7f8d3d04e700 1 -- 192.168.123.103:0/3399303566 wait complete.
2026-03-09T00:08:12.532 INFO:teuthology.orchestra.run.vm03.stdout:true
2026-03-09T00:08:12.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.600+0000 7f6c5e4f1700 1 -- 192.168.123.103:0/1563685330 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c58103340 msgr2=0x7f6c58103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:12.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.600+0000 7f6c577fe700 1 -- 192.168.123.103:0/1563685330 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6c4000ba40 con 0x7f6c58103340
2026-03-09T00:08:12.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.600+0000 7f6c5e4f1700 1 --2- 192.168.123.103:0/1563685330 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c58103340 0x7f6c58103720 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f6c40009b00 tx=0x7f6c40009e10 comp rx=0 tx=0).stop
2026-03-09T00:08:12.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.600+0000 7f6c5e4f1700 1 -- 192.168.123.103:0/1563685330 shutdown_connections
2026-03-09T00:08:12.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.600+0000 7f6c5e4f1700 1 --2- 192.168.123.103:0/1563685330 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c58103cf0 0x7f6c58107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:12.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.600+0000 7f6c5e4f1700 1 --2- 192.168.123.103:0/1563685330 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c58103340 0x7f6c58103720 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:12.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.601+0000 7f6c5e4f1700 1 -- 192.168.123.103:0/1563685330 >> 192.168.123.103:0/1563685330 conn(0x7f6c580feb90 msgr2=0x7f6c58100fb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:08:12.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.601+0000 7f6c5e4f1700 1 -- 192.168.123.103:0/1563685330 shutdown_connections
2026-03-09T00:08:12.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.601+0000 7f6c5e4f1700 1 -- 192.168.123.103:0/1563685330 wait complete.
2026-03-09T00:08:12.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.601+0000 7f6c5e4f1700 1 Processor -- start
2026-03-09T00:08:12.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.601+0000 7f6c5e4f1700 1 -- start start
2026-03-09T00:08:12.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.602+0000 7f6c5e4f1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c58103340 0x7f6c58198ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:12.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.602+0000 7f6c5e4f1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c58103cf0 0x7f6c581993e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:12.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.602+0000 7f6c5e4f1700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6c58199ac0 con 0x7f6c58103cf0
2026-03-09T00:08:12.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.602+0000 7f6c5e4f1700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6c5819d850 con 0x7f6c58103340
2026-03-09T00:08:12.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.602+0000 7f6c4ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c58103cf0 0x7f6c581993e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:12.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.602+0000 7f6c4ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c58103cf0 0x7f6c581993e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:45794/0 (socket says 192.168.123.103:45794)
2026-03-09T00:08:12.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.602+0000 7f6c4ffff700 1 -- 192.168.123.103:0/637930638 learned_addr learned my addr 192.168.123.103:0/637930638 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:08:12.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.602+0000 7f6c57fff700 1 --2- 192.168.123.103:0/637930638 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c58103340 0x7f6c58198ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:12.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.602+0000 7f6c4ffff700 1 -- 192.168.123.103:0/637930638 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c58103340 msgr2=0x7f6c58198ea0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:12.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.602+0000 7f6c4ffff700 1 --2- 192.168.123.103:0/637930638 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c58103340 0x7f6c58198ea0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:12.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.602+0000 7f6c4ffff700 1 -- 192.168.123.103:0/637930638 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6c400097e0 con 0x7f6c58103cf0
2026-03-09T00:08:12.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.603+0000 7f6c4ffff700 1 --2- 192.168.123.103:0/637930638 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c58103cf0 0x7f6c581993e0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f6c4800b700 tx=0x7f6c4800bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:12.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.603+0000 7f6c55ffb700 1 -- 192.168.123.103:0/637930638 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6c48010820 con 0x7f6c58103cf0
2026-03-09T00:08:12.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.603+0000 7f6c55ffb700 1 -- 192.168.123.103:0/637930638 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6c48010e60 con 0x7f6c58103cf0
2026-03-09T00:08:12.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.603+0000 7f6c55ffb700 1 -- 192.168.123.103:0/637930638 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6c48017570 con 0x7f6c58103cf0
2026-03-09T00:08:12.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.603+0000 7f6c5e4f1700 1 -- 192.168.123.103:0/637930638 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6c5819db30 con 0x7f6c58103cf0
2026-03-09T00:08:12.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.603+0000 7f6c5e4f1700 1 -- 192.168.123.103:0/637930638 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6c5819e080 con 0x7f6c58103cf0
2026-03-09T00:08:12.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.605+0000 7f6c55ffb700 1 -- 192.168.123.103:0/637930638 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6c480176d0 con 0x7f6c58103cf0
2026-03-09T00:08:12.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.605+0000 7f6c55ffb700 1 --2- 192.168.123.103:0/637930638 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6c380779e0 0x7f6c38079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:12.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.605+0000 7f6c55ffb700 1 -- 192.168.123.103:0/637930638 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(69..69 src has 1..69) v4 ==== 6310+0+0 (secure 0 0 0) 0x7f6c4809a520 con 0x7f6c58103cf0
2026-03-09T00:08:12.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.605+0000 7f6c57fff700 1 --2- 192.168.123.103:0/637930638 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6c380779e0 0x7f6c38079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:12.606 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.606+0000 7f6c5e4f1700 1 -- 192.168.123.103:0/637930638 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6c5819dcc0 con 0x7f6c58103cf0
2026-03-09T00:08:12.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.607+0000 7f6c57fff700 1 --2- 192.168.123.103:0/637930638 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6c380779e0 0x7f6c38079ea0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f6c40009fd0 tx=0x7f6c4000b560 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:12.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.609+0000 7f6c55ffb700 1 -- 192.168.123.103:0/637930638 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6c5819dcc0 con 0x7f6c58103cf0
2026-03-09T00:08:12.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.768+0000 7f6c5e4f1700 1 -- 192.168.123.103:0/637930638 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6c5819dcc0 con 0x7f6c380779e0
2026-03-09T00:08:12.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.770+0000 7f6c55ffb700 1 -- 192.168.123.103:0/637930638 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f6c5819dcc0 con 0x7f6c380779e0
2026-03-09T00:08:12.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.775+0000 7f6c5e4f1700 1 -- 192.168.123.103:0/637930638 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6c380779e0 msgr2=0x7f6c38079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:12.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.775+0000 7f6c5e4f1700 1 --2- 192.168.123.103:0/637930638 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6c380779e0 0x7f6c38079ea0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f6c40009fd0 tx=0x7f6c4000b560 comp rx=0 tx=0).stop
2026-03-09T00:08:12.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.775+0000 7f6c5e4f1700 1 -- 192.168.123.103:0/637930638 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c58103cf0 msgr2=0x7f6c581993e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:12.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.775+0000 7f6c5e4f1700 1 --2- 192.168.123.103:0/637930638 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c58103cf0 0x7f6c581993e0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f6c4800b700 tx=0x7f6c4800bac0 comp rx=0 tx=0).stop
2026-03-09T00:08:12.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.777+0000 7f6c5e4f1700 1 -- 192.168.123.103:0/637930638 shutdown_connections
2026-03-09T00:08:12.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.777+0000 7f6c5e4f1700 1 --2- 192.168.123.103:0/637930638 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6c380779e0 0x7f6c38079ea0 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
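The two mgr_command round-trips above are the test polling "ceph orch upgrade status" against the mon-mgr target between daemon redeployments. A minimal sketch of that kind of poll loop, assuming (not shown in this log) that the command's JSON output carries an "in_progress" boolean as in current cephadm releases:

  # Sketch only, not the suite's own code: wait until cephadm reports
  # the staggered upgrade finished. Assumes "ceph orch upgrade status"
  # emits JSON with an "in_progress" boolean.
  while ceph orch upgrade status | jq -e '.in_progress == true' >/dev/null; do
      sleep 30
  done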
2026-03-09T00:08:12.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.777+0000 7f6c5e4f1700 1 --2- 192.168.123.103:0/637930638 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6c58103340 0x7f6c58198ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:12.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.777+0000 7f6c5e4f1700 1 --2- 192.168.123.103:0/637930638 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6c58103cf0 0x7f6c581993e0 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:12.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.777+0000 7f6c5e4f1700 1 -- 192.168.123.103:0/637930638 >> 192.168.123.103:0/637930638 conn(0x7f6c580feb90 msgr2=0x7f6c58100f80 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:08:12.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.778+0000 7f6c5e4f1700 1 -- 192.168.123.103:0/637930638 shutdown_connections
2026-03-09T00:08:12.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.778+0000 7f6c5e4f1700 1 -- 192.168.123.103:0/637930638 wait complete.
2026-03-09T00:08:12.862 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:12 vm03.local ceph-mon[129670]: pgmap v101: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 622/231 objects degraded (269.264%); 0 B/s, 11 objects/s recovering
2026-03-09T00:08:12.862 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:12 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 622/231 objects degraded (269.264%), 7 pgs degraded, 7 pgs undersized (PG_DEGRADED)
2026-03-09T00:08:12.862 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:12 vm03.local ceph-mon[129670]: osdmap e69: 6 total, 6 up, 6 in
2026-03-09T00:08:12.863 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.861+0000 7f6e78f7c700 1 -- 192.168.123.103:0/3466467884 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6e74103d70 msgr2=0x7f6e74107dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:12.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.861+0000 7f6e78f7c700 1 --2- 192.168.123.103:0/3466467884 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6e74103d70 0x7f6e74107dc0 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f6e5c009b50 tx=0x7f6e5c009e60 comp rx=0 tx=0).stop
2026-03-09T00:08:12.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.862+0000 7f6e78f7c700 1 -- 192.168.123.103:0/3466467884 shutdown_connections
2026-03-09T00:08:12.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.862+0000 7f6e78f7c700 1 --2- 192.168.123.103:0/3466467884 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6e74103d70 0x7f6e74107dc0 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:12.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.862+0000 7f6e78f7c700 1 --2- 192.168.123.103:0/3466467884 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e741033c0 0x7f6e741037a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:12.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.862+0000 7f6e78f7c700 1 -- 192.168.123.103:0/3466467884 >> 192.168.123.103:0/3466467884 conn(0x7f6e740fec30 msgr2=0x7f6e74101050 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:08:12.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.862+0000 7f6e78f7c700 1 -- 192.168.123.103:0/3466467884 shutdown_connections
2026-03-09T00:08:12.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.862+0000 7f6e78f7c700 1 -- 192.168.123.103:0/3466467884 wait complete.
2026-03-09T00:08:12.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.866+0000 7f6e78f7c700 1 Processor -- start
2026-03-09T00:08:12.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.866+0000 7f6e78f7c700 1 -- start start
2026-03-09T00:08:12.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.866+0000 7f6e78f7c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e741033c0 0x7f6e74198e80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:12.866 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.866+0000 7f6e78f7c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6e74103d70 0x7f6e741993c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:12.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.866+0000 7f6e78f7c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6e74199aa0 con 0x7f6e74103d70
2026-03-09T00:08:12.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.866+0000 7f6e78f7c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6e7419d830 con 0x7f6e741033c0
2026-03-09T00:08:12.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.867+0000 7f6e7259c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e741033c0 0x7f6e74198e80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:12.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.867+0000 7f6e7259c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e741033c0 0x7f6e74198e80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:47424/0 (socket says 192.168.123.103:47424)
2026-03-09T00:08:12.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.867+0000 7f6e7259c700 1 -- 192.168.123.103:0/3844168731 learned_addr learned my addr 192.168.123.103:0/3844168731 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:08:12.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.867+0000 7f6e6bfff700 1 --2- 192.168.123.103:0/3844168731 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6e74103d70 0x7f6e741993c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:12.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.867+0000 7f6e7259c700 1 -- 192.168.123.103:0/3844168731 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6e74103d70 msgr2=0x7f6e741993c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:12.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.867+0000 7f6e7259c700 1 --2- 192.168.123.103:0/3844168731 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6e74103d70 0x7f6e741993c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:12.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.867+0000 7f6e7259c700 1 -- 192.168.123.103:0/3844168731 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6e5c0097e0 con 0x7f6e741033c0
2026-03-09T00:08:12.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.867+0000 7f6e6bfff700 1 --2- 192.168.123.103:0/3844168731 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6e74103d70 0x7f6e741993c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed!
2026-03-09T00:08:12.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.868+0000 7f6e7259c700 1 --2- 192.168.123.103:0/3844168731 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e741033c0 0x7f6e74198e80 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f6e6400ebf0 tx=0x7f6e6400ef00 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:12.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.868+0000 7f6e6b7fe700 1 -- 192.168.123.103:0/3844168731 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6e6400cc30 con 0x7f6e741033c0
2026-03-09T00:08:12.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.868+0000 7f6e6b7fe700 1 -- 192.168.123.103:0/3844168731 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6e6400cd90 con 0x7f6e741033c0
2026-03-09T00:08:12.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.868+0000 7f6e78f7c700 1 -- 192.168.123.103:0/3844168731 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6e7419db10 con 0x7f6e741033c0
2026-03-09T00:08:12.869 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.868+0000 7f6e6b7fe700 1 -- 192.168.123.103:0/3844168731 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6e64010640 con 0x7f6e741033c0
2026-03-09T00:08:12.869 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.868+0000 7f6e78f7c700 1 -- 192.168.123.103:0/3844168731 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6e7419e060 con 0x7f6e741033c0
2026-03-09T00:08:12.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.870+0000 7f6e6b7fe700 1 -- 192.168.123.103:0/3844168731 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6e64004750 con 0x7f6e741033c0
2026-03-09T00:08:12.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.870+0000 7f6e78f7c700 1 -- 192.168.123.103:0/3844168731 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6e7410b6d0 con 0x7f6e741033c0
2026-03-09T00:08:12.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.870+0000 7f6e6b7fe700 1 --2- 192.168.123.103:0/3844168731 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6e60077870 0x7f6e60079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
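The journalctl lines above show the monitors flagging PG_DEGRADED while the freshly redeployed OSD backfills; the degraded ratio can exceed 100% (622/231 objects, 269.264%) because the numerator counts missing object copies across replicas rather than whole objects. A hedged sketch of blocking until that warning clears before touching the next daemon (the 10-second interval is an arbitrary illustration):

  # Sketch only: wait for the degraded-redundancy warning to clear.
  while ceph health detail | grep -q PG_DEGRADED; do
      sleep 10
  done
  ceph pg stat   # expect all 65 pgs active+clean once recovery completes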
2026-03-09T00:08:12.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.871+0000 7f6e6bfff700 1 --2- 192.168.123.103:0/3844168731 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6e60077870 0x7f6e60079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:12.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.871+0000 7f6e6b7fe700 1 -- 192.168.123.103:0/3844168731 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(69..69 src has 1..69) v4 ==== 6310+0+0 (secure 0 0 0) 0x7f6e64014070 con 0x7f6e741033c0
2026-03-09T00:08:12.871 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.871+0000 7f6e6bfff700 1 --2- 192.168.123.103:0/3844168731 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6e60077870 0x7f6e60079d30 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f6e7419a4a0 tx=0x7f6e5c00b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:12.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:12.873+0000 7f6e6b7fe700 1 -- 192.168.123.103:0/3844168731 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6e640625f0 con 0x7f6e741033c0
2026-03-09T00:08:13.020 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.019+0000 7f6e78f7c700 1 -- 192.168.123.103:0/3844168731 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f6e7419a1e0 con 0x7f6e60077870
2026-03-09T00:08:13.025 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.025+0000 7f6e6b7fe700 1 -- 192.168.123.103:0/3844168731 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f6e7419a1e0 con 0x7f6e60077870
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (3m) 114s ago 8m 24.0M - 0.25.0 c8568f914cd2 6bc39b415ac6
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (8m) 114s ago 8m 8786k - 18.2.1 5be31c24972a b93f8a220f71
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (8m) 2m ago 8m 8656k - 18.2.1 5be31c24972a d06aea65065e
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (2m) 114s ago 8m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e a76700a1f6bf
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (2m) 2m ago 8m 7860k - 19.2.3-678-ge911bdeb 654f31e6858e ab35f6a843c1
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (3m) 114s ago 8m 73.3M - 10.4.0 c8b91775d855 00a3394cdec9
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (6m) 114s ago 6m 18.1M - 18.2.1 5be31c24972a 404501ca3f76
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (6m) 114s ago 6m 218M - 18.2.1 5be31c24972a b71cb8823eff
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (6m) 2m ago 6m 19.7M - 18.2.1 5be31c24972a 868f24dd3b07
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 running (6m) 2m ago 6m 15.9M - 18.2.1 5be31c24972a 84dbd6c37a69
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:8443,9283,8765 running (4m) 114s ago 9m 619M - 19.2.3-678-ge911bdeb 654f31e6858e 5c5f89207f88
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (4m) 2m ago 8m 489M - 19.2.3-678-ge911bdeb 654f31e6858e e3d70135f3ac
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (2m) 114s ago 9m 57.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cafe87ec117d
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (2m) 2m ago 8m 48.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 33df752aa193
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (3m) 114s ago 8m 9445k - 1.7.0 72c9c2088986 0cdd6e671b4f
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (3m) 2m ago 8m 9420k - 1.7.0 72c9c2088986 848c5c72973d
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (116s) 114s ago 7m 30.2M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7112eceae9ce
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (7m) 114s ago 7m 451M 4096M 18.2.1 5be31c24972a 7bc729875521
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (7m) 114s ago 7m 370M 4096M 18.2.1 5be31c24972a 00566abbcc16
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (7m) 2m ago 7m 499M 4096M 18.2.1 5be31c24972a 1ece32056ab6
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (7m) 2m ago 7m 475M 4096M 18.2.1 5be31c24972a ee6260a1124c
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (6m) 2m ago 6m 398M 4096M 18.2.1 5be31c24972a f51e8cd94301
2026-03-09T00:08:13.026 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (3m) 114s ago 8m 56.6M - 2.51.0 1d3b7f56885b 16d6071e49fb
2026-03-09T00:08:13.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.028+0000 7f6e78f7c700 1 -- 192.168.123.103:0/3844168731 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6e60077870 msgr2=0x7f6e60079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:13.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.028+0000 7f6e78f7c700 1 --2- 192.168.123.103:0/3844168731 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6e60077870 0x7f6e60079d30 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f6e7419a4a0 tx=0x7f6e5c00b540 comp rx=0 tx=0).stop
2026-03-09T00:08:13.028 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.028+0000 7f6e78f7c700 1 -- 192.168.123.103:0/3844168731 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e741033c0 msgr2=0x7f6e74198e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:13.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.028+0000 7f6e78f7c700 1 --2- 192.168.123.103:0/3844168731 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e741033c0 0x7f6e74198e80 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f6e6400ebf0 tx=0x7f6e6400ef00 comp rx=0 tx=0).stop
2026-03-09T00:08:13.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.029+0000 7f6e78f7c700 1 -- 192.168.123.103:0/3844168731 shutdown_connections
2026-03-09T00:08:13.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.029+0000 7f6e78f7c700 1 --2- 192.168.123.103:0/3844168731 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6e60077870 0x7f6e60079d30 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:13.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.029+0000 7f6e78f7c700 1 --2- 192.168.123.103:0/3844168731 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6e741033c0 0x7f6e74198e80 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:13.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.029+0000 7f6e78f7c700 1 --2- 192.168.123.103:0/3844168731 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6e74103d70 0x7f6e741993c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:13.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.029+0000 7f6e78f7c700 1 -- 192.168.123.103:0/3844168731 >> 192.168.123.103:0/3844168731 conn(0x7f6e740fec30 msgr2=0x7f6e74107630 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:08:13.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.029+0000 7f6e78f7c700 1 -- 192.168.123.103:0/3844168731 shutdown_connections
2026-03-09T00:08:13.030 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.030+0000 7f6e78f7c700 1 -- 192.168.123.103:0/3844168731 wait complete.
2026-03-09T00:08:13.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.102+0000 7fa823a2a700 1 -- 192.168.123.103:0/2656873748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa81c103c90 msgr2=0x7fa81c107ce0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:13.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.102+0000 7fa823a2a700 1 --2- 192.168.123.103:0/2656873748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa81c103c90 0x7fa81c107ce0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fa818009b00 tx=0x7fa818009e10 comp rx=0 tx=0).stop
2026-03-09T00:08:13.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.103+0000 7fa823a2a700 1 -- 192.168.123.103:0/2656873748 shutdown_connections
2026-03-09T00:08:13.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.103+0000 7fa823a2a700 1 --2- 192.168.123.103:0/2656873748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa81c103c90 0x7fa81c107ce0 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:13.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.103+0000 7fa823a2a700 1 --2- 192.168.123.103:0/2656873748 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa81c1032e0 0x7fa81c1036c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:13.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.103+0000 7fa823a2a700 1 -- 192.168.123.103:0/2656873748 >> 192.168.123.103:0/2656873748 conn(0x7fa81c0feb50 msgr2=0x7fa81c100f70 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:08:13.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.103+0000 7fa823a2a700 1 -- 192.168.123.103:0/2656873748 shutdown_connections
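The orch ps table above captures the staggered state mid-upgrade: both mons, both mgrs, the crash daemons and osd.0 already report 19.2.3-678-ge911bdeb, while osd.1 through osd.5 and all four MDS daemons still run 18.2.1. One way to summarize that split mechanically (a sketch, assuming only the documented --format json output of orch ps, whose records carry daemon_type and version fields):

  # Sketch only: tally cephadm daemons by running version to track
  # staggered-upgrade progress.
  ceph orch ps --format json \
      | jq -r '.[] | "\(.daemon_type) \(.version)"' | sort | uniq -c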
2026-03-09T00:08:13.103 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.103+0000 7fa823a2a700 1 -- 192.168.123.103:0/2656873748 wait complete.
2026-03-09T00:08:13.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.103+0000 7fa823a2a700 1 Processor -- start
2026-03-09T00:08:13.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.104+0000 7fa823a2a700 1 -- start start
2026-03-09T00:08:13.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.104+0000 7fa823a2a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa81c1032e0 0x7fa81c198ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:13.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.104+0000 7fa823a2a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa81c103c90 0x7fa81c199420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:13.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.104+0000 7fa823a2a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa81c199b00 con 0x7fa81c1032e0
2026-03-09T00:08:13.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.104+0000 7fa823a2a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa81c19d890 con 0x7fa81c103c90
2026-03-09T00:08:13.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.104+0000 7fa8217c6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa81c1032e0 0x7fa81c198ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:13.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.104+0000 7fa820fc5700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa81c103c90 0x7fa81c199420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:13.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.104+0000 7fa8217c6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa81c1032e0 0x7fa81c198ee0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:45840/0 (socket says 192.168.123.103:45840)
2026-03-09T00:08:13.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.104+0000 7fa8217c6700 1 -- 192.168.123.103:0/1740730946 learned_addr learned my addr 192.168.123.103:0/1740730946 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:08:13.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.105+0000 7fa8217c6700 1 -- 192.168.123.103:0/1740730946 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa81c103c90 msgr2=0x7fa81c199420 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:13.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.105+0000 7fa8217c6700 1 --2- 192.168.123.103:0/1740730946 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa81c103c90 0x7fa81c199420 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:13.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.105+0000 7fa8217c6700 1 -- 192.168.123.103:0/1740730946 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa8180097e0 con 0x7fa81c1032e0
2026-03-09T00:08:13.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.105+0000 7fa8217c6700 1 --2- 192.168.123.103:0/1740730946 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa81c1032e0 0x7fa81c198ee0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fa80c00ba70 tx=0x7fa80c00bd80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:13.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.105+0000 7fa8127fc700 1 -- 192.168.123.103:0/1740730946 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa80c00c700 con 0x7fa81c1032e0
2026-03-09T00:08:13.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.105+0000 7fa8127fc700 1 -- 192.168.123.103:0/1740730946 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa80c00cd40 con 0x7fa81c1032e0
2026-03-09T00:08:13.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.105+0000 7fa8127fc700 1 -- 192.168.123.103:0/1740730946 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa80c012340 con 0x7fa81c1032e0
2026-03-09T00:08:13.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.105+0000 7fa823a2a700 1 -- 192.168.123.103:0/1740730946 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa81c19db70 con 0x7fa81c1032e0
2026-03-09T00:08:13.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.105+0000 7fa823a2a700 1 -- 192.168.123.103:0/1740730946 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa81c19e090 con 0x7fa81c1032e0
2026-03-09T00:08:13.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.107+0000 7fa8127fc700 1 -- 192.168.123.103:0/1740730946 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa80c014440 con 0x7fa81c1032e0
2026-03-09T00:08:13.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.107+0000 7fa823a2a700 1 -- 192.168.123.103:0/1740730946 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa81c10b630 con 0x7fa81c1032e0
2026-03-09T00:08:13.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.107+0000 7fa8127fc700 1 --2- 192.168.123.103:0/1740730946 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa808077870 0x7fa808079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:13.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.107+0000 7fa8127fc700 1 -- 192.168.123.103:0/1740730946 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(69..69 src has 1..69) v4 ==== 6310+0+0 (secure 0 0 0) 0x7fa80c098910 con 0x7fa81c1032e0
2026-03-09T00:08:13.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.108+0000 7fa820fc5700 1 --2- 192.168.123.103:0/1740730946 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa808077870 0x7fa808079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:13.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.109+0000 7fa820fc5700 1 --2- 192.168.123.103:0/1740730946 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa808077870 0x7fa808079d30 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fa818009ad0 tx=0x7fa818005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:13.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.110+0000 7fa8127fc700 1 -- 192.168.123.103:0/1740730946 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa80c061250 con 0x7fa81c1032e0
2026-03-09T00:08:13.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:12 vm06.local ceph-mon[106218]: pgmap v101: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 622/231 objects degraded (269.264%); 0 B/s, 11 objects/s recovering
2026-03-09T00:08:13.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:12 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 622/231 objects degraded (269.264%), 7 pgs degraded, 7 pgs undersized (PG_DEGRADED)
2026-03-09T00:08:13.122 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:12 vm06.local ceph-mon[106218]: osdmap e69: 6 total, 6 up, 6 in
2026-03-09T00:08:13.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.290+0000 7fa823a2a700 1 -- 192.168.123.103:0/1740730946 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fa81c19a240 con 0x7fa81c1032e0
2026-03-09T00:08:13.295 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.295+0000 7fa8127fc700 1 -- 192.168.123.103:0/1740730946 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7fa80c0609a0 con 0x7fa81c1032e0
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 5,
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 9,
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-09T00:08:13.296 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:08:13.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.299+0000 7fa823a2a700 1 -- 192.168.123.103:0/1740730946 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa808077870 msgr2=0x7fa808079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:13.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.299+0000 7fa823a2a700 1 --2- 192.168.123.103:0/1740730946 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa808077870 0x7fa808079d30 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7fa818009ad0 tx=0x7fa818005fb0 comp rx=0 tx=0).stop
2026-03-09T00:08:13.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.299+0000 7fa823a2a700 1 -- 192.168.123.103:0/1740730946 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa81c1032e0 msgr2=0x7fa81c198ee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:13.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.299+0000 7fa823a2a700 1 --2- 192.168.123.103:0/1740730946 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa81c1032e0 0x7fa81c198ee0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fa80c00ba70 tx=0x7fa80c00bd80 comp rx=0 tx=0).stop
2026-03-09T00:08:13.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.299+0000 7fa823a2a700 1 -- 192.168.123.103:0/1740730946 shutdown_connections
2026-03-09T00:08:13.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.299+0000 7fa823a2a700 1 --2- 192.168.123.103:0/1740730946 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa808077870 0x7fa808079d30 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:13.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.299+0000 7fa823a2a700 1 --2- 192.168.123.103:0/1740730946 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa81c1032e0 0x7fa81c198ee0 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:13.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.299+0000 7fa823a2a700 1 --2- 192.168.123.103:0/1740730946 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa81c103c90 0x7fa81c199420 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:13.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.299+0000 7fa823a2a700 1 -- 192.168.123.103:0/1740730946 >> 192.168.123.103:0/1740730946 conn(0x7fa81c0feb50 msgr2=0x7fa81c1001d0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:08:13.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.299+0000 7fa823a2a700 1 -- 192.168.123.103:0/1740730946 shutdown_connections
2026-03-09T00:08:13.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.299+0000 7fa823a2a700 1 -- 192.168.123.103:0/1740730946 wait complete.
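The versions output confirms the same split from the cluster-map side: two mons and two mgrs on squid, five of six OSDs and all four MDS daemons still on reef. For a staggered run, the useful assertion before the MDS phase is that the mon and mgr sections have collapsed to a single common version; a hedged jq check along those lines:

  # Sketch only: assert every mon and mgr reports one and the same version.
  ceph versions | jq -e '(.mon | length) == 1
      and (.mgr | length) == 1
      and ((.mon | keys) == (.mgr | keys))'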
2026-03-09T00:08:13.387 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.386+0000 7f61775c7700 1 -- 192.168.123.103:0/4252604121 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6170073130 msgr2=0x7f6170073510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:13.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.386+0000 7f61775c7700 1 --2- 192.168.123.103:0/4252604121 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6170073130 0x7f6170073510 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f6164009b00 tx=0x7f6164009e10 comp rx=0 tx=0).stop
2026-03-09T00:08:13.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.387+0000 7f61775c7700 1 -- 192.168.123.103:0/4252604121 shutdown_connections
2026-03-09T00:08:13.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.387+0000 7f61775c7700 1 --2- 192.168.123.103:0/4252604121 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6170073a50 0x7f6170111940 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:13.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.387+0000 7f61775c7700 1 --2- 192.168.123.103:0/4252604121 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6170073130 0x7f6170073510 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:13.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.387+0000 7f61775c7700 1 -- 192.168.123.103:0/4252604121 >> 192.168.123.103:0/4252604121 conn(0x7f61700fc920 msgr2=0x7f61700fed40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:08:13.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.387+0000 7f61775c7700 1 -- 192.168.123.103:0/4252604121 shutdown_connections
2026-03-09T00:08:13.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.388+0000 7f61775c7700 1 -- 192.168.123.103:0/4252604121 wait complete.
2026-03-09T00:08:13.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.388+0000 7f61775c7700 1 Processor -- start
2026-03-09T00:08:13.389 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.389+0000 7f61775c7700 1 -- start start
2026-03-09T00:08:13.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.389+0000 7f61775c7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6170073130 0x7f617019d170 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:13.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.389+0000 7f61775c7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6170073a50 0x7f617019d6b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:13.390 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.390+0000 7f61775c7700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f617019dd90 con 0x7f6170073130
2026-03-09T00:08:13.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.390+0000 7f61775c7700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f61701a1b20 con 0x7f6170073a50
2026-03-09T00:08:13.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.390+0000 7f6175363700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6170073130 0x7f617019d170 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:13.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.390+0000 7f6175363700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6170073130 0x7f617019d170 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:45854/0 (socket says 192.168.123.103:45854)
2026-03-09T00:08:13.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.390+0000 7f6175363700 1 -- 192.168.123.103:0/2133975515 learned_addr learned my addr 192.168.123.103:0/2133975515 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:08:13.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.390+0000 7f6175363700 1 -- 192.168.123.103:0/2133975515 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6170073a50 msgr2=0x7f617019d6b0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down
2026-03-09T00:08:13.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.390+0000 7f6175363700 1 --2- 192.168.123.103:0/2133975515 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6170073a50 0x7f617019d6b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:13.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.390+0000 7f6175363700 1 -- 192.168.123.103:0/2133975515 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f61640097e0 con 0x7f6170073130
2026-03-09T00:08:13.391 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.390+0000 7f6175363700 1 --2- 192.168.123.103:0/2133975515 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6170073130 0x7f617019d170 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f6164009ad0 tx=0x7f61640052e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:13.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.391+0000 7f61627fc700 1 -- 192.168.123.103:0/2133975515 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f616401d070 con 0x7f6170073130
2026-03-09T00:08:13.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.391+0000 7f61627fc700 1 -- 192.168.123.103:0/2133975515 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f616400bc50 con 0x7f6170073130
2026-03-09T00:08:13.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.391+0000 7f61627fc700 1 -- 192.168.123.103:0/2133975515 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f616400f7d0 con 0x7f6170073130
2026-03-09T00:08:13.392 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.392+0000 7f61775c7700 1 -- 192.168.123.103:0/2133975515 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f61701a1e00 con 0x7f6170073130
2026-03-09T00:08:13.394 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.392+0000 7f61775c7700 1 -- 192.168.123.103:0/2133975515 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f61701a2320 con 0x7f6170073130
2026-03-09T00:08:13.398 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.395+0000 7f61775c7700 1 -- 192.168.123.103:0/2133975515 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f617010f0c0 con 0x7f6170073130
2026-03-09T00:08:13.398 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.395+0000 7f61627fc700 1 -- 192.168.123.103:0/2133975515 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f616400f930 con 0x7f6170073130
2026-03-09T00:08:13.398 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.395+0000 7f61627fc700 1 --2- 192.168.123.103:0/2133975515 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f615c0779e0 0x7f615c079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:13.398 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.396+0000 7f61627fc700 1 -- 192.168.123.103:0/2133975515 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(70..70 src has 1..70) v4 ==== 6310+0+0 (secure 0 0 0) 0x7f616409b1a0 con 0x7f6170073130
2026-03-09T00:08:13.398 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.396+0000 7f6174b62700 1 --2- 192.168.123.103:0/2133975515 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f615c0779e0 0x7f615c079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:13.398 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.397+0000 7f6174b62700 1 --2- 192.168.123.103:0/2133975515 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f615c0779e0 0x7f615c079ea0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f617019e790 tx=0x7f616c006cb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:13.398 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.398+0000 7f61627fc700 1 -- 192.168.123.103:0/2133975515 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f61640639a0 con 0x7f6170073130
2026-03-09T00:08:13.555 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.554+0000 7f61775c7700 1 -- 192.168.123.103:0/2133975515 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f617019e4d0 con 0x7f6170073130
2026-03-09T00:08:13.559 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.558+0000 7f61627fc700 1 -- 192.168.123.103:0/2133975515 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1919 (secure 0 0 0) 0x7f61640630f0 con 0x7f6170073130
2026-03-09T00:08:13.560 INFO:teuthology.orchestra.run.vm03.stdout:e11
2026-03-09T00:08:13.560 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000
2026-03-09T00:08:13.560 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-09T00:08:13.560 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:08:13.560 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1
2026-03-09T00:08:13.560 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:08:13.560 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1)
2026-03-09T00:08:13.560 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs
2026-03-09T00:08:13.560 INFO:teuthology.orchestra.run.vm03.stdout:epoch 11
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T00:01:42.952984+0000
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T00:01:51.424075+0000
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:root 0
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {}
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 39
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:in 0
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14480}
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:failed
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:damaged
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:stopped
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3]
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members:
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.sejksk{0:14480} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons:
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.ralade{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.ixduim{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:08:13.561 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.vlrwtl{-1:24297} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6824/1090058295,v1:192.168.123.106:6825/1090058295] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:08:13.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.563+0000 7f61775c7700 1 -- 192.168.123.103:0/2133975515 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f615c0779e0 msgr2=0x7f615c079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:13.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.563+0000 7f61775c7700 1 --2- 192.168.123.103:0/2133975515 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f615c0779e0 0x7f615c079ea0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f617019e790 tx=0x7f616c006cb0 comp rx=0 tx=0).stop
2026-03-09T00:08:13.564 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.564+0000 7f61775c7700 1 -- 192.168.123.103:0/2133975515 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6170073130 msgr2=0x7f617019d170 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:13.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.564+0000 7f61775c7700 1 --2- 192.168.123.103:0/2133975515 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6170073130 0x7f617019d170 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f6164009ad0 tx=0x7f61640052e0 comp rx=0 tx=0).stop
2026-03-09T00:08:13.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.564+0000 7f61775c7700 1 -- 192.168.123.103:0/2133975515 shutdown_connections
7f61775c7700 1 --2- 192.168.123.103:0/2133975515 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f615c0779e0 0x7f615c079ea0 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:13.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.564+0000 7f61775c7700 1 --2- 192.168.123.103:0/2133975515 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f6170073130 0x7f617019d170 secure :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f6164009ad0 tx=0x7f61640052e0 comp rx=0 tx=0).stop 2026-03-09T00:08:13.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.564+0000 7f61775c7700 1 --2- 192.168.123.103:0/2133975515 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f6170073a50 0x7f617019d6b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:13.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.564+0000 7f61775c7700 1 -- 192.168.123.103:0/2133975515 >> 192.168.123.103:0/2133975515 conn(0x7f61700fc920 msgr2=0x7f6170103450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:08:13.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.564+0000 7f61775c7700 1 -- 192.168.123.103:0/2133975515 shutdown_connections 2026-03-09T00:08:13.565 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.564+0000 7f61775c7700 1 -- 192.168.123.103:0/2133975515 wait complete. 2026-03-09T00:08:13.566 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 11 2026-03-09T00:08:13.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.645+0000 7fc922a04700 1 -- 192.168.123.103:0/3016988532 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc91c068df0 msgr2=0x7fc91c10d5b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:08:13.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.645+0000 7fc922a04700 1 --2- 192.168.123.103:0/3016988532 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc91c068df0 0x7fc91c10d5b0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7fc90c009b30 tx=0x7fc90c009e40 comp rx=0 tx=0).stop 2026-03-09T00:08:13.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.645+0000 7fc922a04700 1 -- 192.168.123.103:0/3016988532 shutdown_connections 2026-03-09T00:08:13.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.645+0000 7fc922a04700 1 --2- 192.168.123.103:0/3016988532 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc91c068df0 0x7fc91c10d5b0 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:13.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.645+0000 7fc922a04700 1 --2- 192.168.123.103:0/3016988532 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc91c0684d0 0x7fc91c0688b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:13.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.645+0000 7fc922a04700 1 -- 192.168.123.103:0/3016988532 >> 192.168.123.103:0/3016988532 conn(0x7fc91c075960 msgr2=0x7fc91c075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:08:13.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.646+0000 7fc922a04700 1 -- 192.168.123.103:0/3016988532 shutdown_connections 2026-03-09T00:08:13.646 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.646+0000 7fc922a04700 1 -- 
192.168.123.103:0/3016988532 wait complete. 2026-03-09T00:08:13.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.647+0000 7fc922a04700 1 Processor -- start 2026-03-09T00:08:13.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.647+0000 7fc922a04700 1 -- start start 2026-03-09T00:08:13.647 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.647+0000 7fc922a04700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc91c0684d0 0x7fc91c198d60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:13.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.647+0000 7fc922a04700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc91c068df0 0x7fc91c1992a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:13.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.647+0000 7fc922a04700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc91c199980 con 0x7fc91c068df0 2026-03-09T00:08:13.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.647+0000 7fc922a04700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc91c19d710 con 0x7fc91c0684d0 2026-03-09T00:08:13.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.647+0000 7fc91b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc91c068df0 0x7fc91c1992a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:08:13.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.647+0000 7fc91b7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc91c068df0 0x7fc91c1992a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:45876/0 (socket says 192.168.123.103:45876) 2026-03-09T00:08:13.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.647+0000 7fc91b7fe700 1 -- 192.168.123.103:0/578466969 learned_addr learned my addr 192.168.123.103:0/578466969 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:08:13.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.648+0000 7fc91b7fe700 1 -- 192.168.123.103:0/578466969 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc91c0684d0 msgr2=0x7fc91c198d60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:08:13.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.648+0000 7fc91b7fe700 1 --2- 192.168.123.103:0/578466969 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc91c0684d0 0x7fc91c198d60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:13.648 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.648+0000 7fc91b7fe700 1 -- 192.168.123.103:0/578466969 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc90c0097e0 con 0x7fc91c068df0 2026-03-09T00:08:13.649 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.648+0000 7fc91b7fe700 1 --2- 192.168.123.103:0/578466969 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc91c068df0 0x7fc91c1992a0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fc90c000c00 
tx=0x7fc90c0049b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:08:13.649 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.648+0000 7fc9197fa700 1 -- 192.168.123.103:0/578466969 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc90c01d070 con 0x7fc91c068df0 2026-03-09T00:08:13.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.648+0000 7fc9197fa700 1 -- 192.168.123.103:0/578466969 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fc90c004b90 con 0x7fc91c068df0 2026-03-09T00:08:13.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.648+0000 7fc9197fa700 1 -- 192.168.123.103:0/578466969 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc90c00f650 con 0x7fc91c068df0 2026-03-09T00:08:13.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.649+0000 7fc922a04700 1 -- 192.168.123.103:0/578466969 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc91c19d9f0 con 0x7fc91c068df0 2026-03-09T00:08:13.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.649+0000 7fc922a04700 1 -- 192.168.123.103:0/578466969 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc91c19def0 con 0x7fc91c068df0 2026-03-09T00:08:13.657 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.651+0000 7fc922a04700 1 -- 192.168.123.103:0/578466969 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc91c10ad20 con 0x7fc91c068df0 2026-03-09T00:08:13.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.657+0000 7fc9197fa700 1 -- 192.168.123.103:0/578466969 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc90c0229e0 con 0x7fc91c068df0 2026-03-09T00:08:13.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.657+0000 7fc9197fa700 1 --2- 192.168.123.103:0/578466969 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc908077870 0x7fc908079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:13.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.657+0000 7fc9197fa700 1 -- 192.168.123.103:0/578466969 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(70..70 src has 1..70) v4 ==== 6310+0+0 (secure 0 0 0) 0x7fc90c09bba0 con 0x7fc91c068df0 2026-03-09T00:08:13.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.658+0000 7fc9197fa700 1 -- 192.168.123.103:0/578466969 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc90c09c0f0 con 0x7fc91c068df0 2026-03-09T00:08:13.659 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.659+0000 7fc91bfff700 1 --2- 192.168.123.103:0/578466969 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc908077870 0x7fc908079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:08:13.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.659+0000 7fc91bfff700 1 --2- 192.168.123.103:0/578466969 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] 
conn(0x7fc908077870 0x7fc908079d30 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fc904005950 tx=0x7fc9040058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:13.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:13 vm03.local ceph-mon[129670]: from='client.44175 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:08:13.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:13 vm03.local ceph-mon[129670]: from='client.34232 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:08:13.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:13 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/1740730946' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:08:13.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:13 vm03.local ceph-mon[129670]: osdmap e70: 6 total, 6 up, 6 in
2026-03-09T00:08:13.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:13 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2133975515' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T00:08:13.812 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:08:13.813 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.811+0000 7fc922a04700 1 -- 192.168.123.103:0/578466969 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc91c19a1d0 con 0x7fc908077870
2026-03-09T00:08:13.817 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.816+0000 7fc9197fa700 1 -- 192.168.123.103:0/578466969 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fc91c19a1d0 con 0x7fc908077870
2026-03-09T00:08:13.817 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:08:13.817 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T00:08:13.817 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true,
2026-03-09T00:08:13.817 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-09T00:08:13.817 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [
2026-03-09T00:08:13.817 INFO:teuthology.orchestra.run.vm03.stdout: "crash",
2026-03-09T00:08:13.817 INFO:teuthology.orchestra.run.vm03.stdout: "mon",
2026-03-09T00:08:13.817 INFO:teuthology.orchestra.run.vm03.stdout: "mgr"
2026-03-09T00:08:13.817 INFO:teuthology.orchestra.run.vm03.stdout: ],
2026-03-09T00:08:13.817 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "7/23 daemons upgraded",
2026-03-09T00:08:13.817 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons",
2026-03-09T00:08:13.817 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false
2026-03-09T00:08:13.817 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:08:13.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.820+0000 7fc922a04700 1 -- 192.168.123.103:0/578466969 >>
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc908077870 msgr2=0x7fc908079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:08:13.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.820+0000 7fc922a04700 1 --2- 192.168.123.103:0/578466969 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc908077870 0x7fc908079d30 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7fc904005950 tx=0x7fc9040058e0 comp rx=0 tx=0).stop 2026-03-09T00:08:13.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.820+0000 7fc922a04700 1 -- 192.168.123.103:0/578466969 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc91c068df0 msgr2=0x7fc91c1992a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:08:13.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.820+0000 7fc922a04700 1 --2- 192.168.123.103:0/578466969 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc91c068df0 0x7fc91c1992a0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fc90c000c00 tx=0x7fc90c0049b0 comp rx=0 tx=0).stop 2026-03-09T00:08:13.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.821+0000 7fc922a04700 1 -- 192.168.123.103:0/578466969 shutdown_connections 2026-03-09T00:08:13.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.821+0000 7fc922a04700 1 --2- 192.168.123.103:0/578466969 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc908077870 0x7fc908079d30 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:13.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.821+0000 7fc922a04700 1 --2- 192.168.123.103:0/578466969 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc91c0684d0 0x7fc91c198d60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:13.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.821+0000 7fc922a04700 1 --2- 192.168.123.103:0/578466969 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc91c068df0 0x7fc91c1992a0 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:13.822 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.821+0000 7fc922a04700 1 -- 192.168.123.103:0/578466969 >> 192.168.123.103:0/578466969 conn(0x7fc91c075960 msgr2=0x7fc91c0fe960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:08:13.822 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.821+0000 7fc922a04700 1 -- 192.168.123.103:0/578466969 shutdown_connections 2026-03-09T00:08:13.822 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.822+0000 7fc922a04700 1 -- 192.168.123.103:0/578466969 wait complete. 
2026-03-09T00:08:13.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.910+0000 7f5f99be5700 1 -- 192.168.123.103:0/3380023519 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f94107500 msgr2=0x7f5f94107980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:08:13.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.910+0000 7f5f99be5700 1 --2- 192.168.123.103:0/3380023519 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f94107500 0x7f5f94107980 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f5f84009b50 tx=0x7f5f84009e60 comp rx=0 tx=0).stop 2026-03-09T00:08:13.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.911+0000 7f5f99be5700 1 -- 192.168.123.103:0/3380023519 shutdown_connections 2026-03-09T00:08:13.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.911+0000 7f5f99be5700 1 --2- 192.168.123.103:0/3380023519 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f94107500 0x7f5f94107980 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:13.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.911+0000 7f5f99be5700 1 --2- 192.168.123.103:0/3380023519 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5f9410d5b0 0x7f5f9410d990 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:13.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.911+0000 7f5f99be5700 1 -- 192.168.123.103:0/3380023519 >> 192.168.123.103:0/3380023519 conn(0x7f5f94075840 msgr2=0x7f5f94105bb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:08:13.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.911+0000 7f5f99be5700 1 -- 192.168.123.103:0/3380023519 shutdown_connections 2026-03-09T00:08:13.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.911+0000 7f5f99be5700 1 -- 192.168.123.103:0/3380023519 wait complete. 
2026-03-09T00:08:13.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.912+0000 7f5f99be5700 1 Processor -- start 2026-03-09T00:08:13.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.912+0000 7f5f99be5700 1 -- start start 2026-03-09T00:08:13.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.912+0000 7f5f99be5700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5f94107500 0x7f5f9419d1b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:13.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.912+0000 7f5f99be5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f9410d5b0 0x7f5f9419d6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:13.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.912+0000 7f5f99be5700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5f9419ddd0 con 0x7f5f9410d5b0 2026-03-09T00:08:13.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.912+0000 7f5f92ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f9410d5b0 0x7f5f9419d6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:08:13.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.913+0000 7f5f92ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f9410d5b0 0x7f5f9419d6f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:45888/0 (socket says 192.168.123.103:45888) 2026-03-09T00:08:13.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.913+0000 7f5f92ffd700 1 -- 192.168.123.103:0/120135479 learned_addr learned my addr 192.168.123.103:0/120135479 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:08:13.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.913+0000 7f5f937fe700 1 --2- 192.168.123.103:0/120135479 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5f94107500 0x7f5f9419d1b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:08:13.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.912+0000 7f5f99be5700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5f941a1b60 con 0x7f5f94107500 2026-03-09T00:08:13.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.913+0000 7f5f92ffd700 1 -- 192.168.123.103:0/120135479 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5f94107500 msgr2=0x7f5f9419d1b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:08:13.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.913+0000 7f5f92ffd700 1 --2- 192.168.123.103:0/120135479 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5f94107500 0x7f5f9419d1b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:13.913 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.913+0000 7f5f92ffd700 1 -- 192.168.123.103:0/120135479 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5f840097e0 con 
0x7f5f9410d5b0 2026-03-09T00:08:13.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.913+0000 7f5f92ffd700 1 --2- 192.168.123.103:0/120135479 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f9410d5b0 0x7f5f9419d6f0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f5f84005950 tx=0x7f5f84004c30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:08:13.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.914+0000 7f5f90ff9700 1 -- 192.168.123.103:0/120135479 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5f8401d070 con 0x7f5f9410d5b0 2026-03-09T00:08:13.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.914+0000 7f5f90ff9700 1 -- 192.168.123.103:0/120135479 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5f84022470 con 0x7f5f9410d5b0 2026-03-09T00:08:13.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.914+0000 7f5f99be5700 1 -- 192.168.123.103:0/120135479 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5f941a1de0 con 0x7f5f9410d5b0 2026-03-09T00:08:13.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.914+0000 7f5f99be5700 1 -- 192.168.123.103:0/120135479 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5f941a22d0 con 0x7f5f9410d5b0 2026-03-09T00:08:13.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.915+0000 7f5f90ff9700 1 -- 192.168.123.103:0/120135479 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5f8400bbd0 con 0x7f5f9410d5b0 2026-03-09T00:08:13.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.915+0000 7f5f99be5700 1 -- 192.168.123.103:0/120135479 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5f9410f9e0 con 0x7f5f9410d5b0 2026-03-09T00:08:13.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.919+0000 7f5f90ff9700 1 -- 192.168.123.103:0/120135479 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5f8400bd30 con 0x7f5f9410d5b0 2026-03-09T00:08:13.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.919+0000 7f5f90ff9700 1 --2- 192.168.123.103:0/120135479 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5f80077960 0x7f5f80079e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:13.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.919+0000 7f5f90ff9700 1 -- 192.168.123.103:0/120135479 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(70..70 src has 1..70) v4 ==== 6310+0+0 (secure 0 0 0) 0x7f5f8409bbe0 con 0x7f5f9410d5b0 2026-03-09T00:08:13.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.919+0000 7f5f90ff9700 1 -- 192.168.123.103:0/120135479 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5f840cba90 con 0x7f5f9410d5b0 2026-03-09T00:08:13.920 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.920+0000 7f5f937fe700 1 --2- 192.168.123.103:0/120135479 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5f80077960 0x7f5f80079e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:13.921 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:13.920+0000 7f5f937fe700 1 --2- 192.168.123.103:0/120135479 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5f80077960 0x7f5f80079e20 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f5f7c007c30 tx=0x7f5f7c0073d0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:14.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:14.107+0000 7f5f99be5700 1 -- 192.168.123.103:0/120135479 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f5f9404ea90 con 0x7f5f9410d5b0
2026-03-09T00:08:14.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:14.109+0000 7f5f90ff9700 1 -- 192.168.123.103:0/120135479 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1282 (secure 0 0 0) 0x7f5f84027090 con 0x7f5f9410d5b0
2026-03-09T00:08:14.109 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data; Degraded data redundancy: 622/231 objects degraded (269.264%), 7 pgs degraded, 7 pgs undersized
2026-03-09T00:08:14.109 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data
2026-03-09T00:08:14.109 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled.
2026-03-09T00:08:14.110 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 622/231 objects degraded (269.264%), 7 pgs degraded, 7 pgs undersized
2026-03-09T00:08:14.110 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1 is stuck undersized for 101s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,4]
2026-03-09T00:08:14.110 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.6 is stuck undersized for 101s, current state active+recovery_wait+undersized+degraded+remapped, last acting [1,4]
2026-03-09T00:08:14.110 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.c is stuck undersized for 101s, current state active+recovery_wait+undersized+degraded+remapped, last acting [5,3]
2026-03-09T00:08:14.110 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.15 is stuck undersized for 101s, current state active+recovering+undersized+degraded+remapped, last acting [3,4]
2026-03-09T00:08:14.110 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.18 is stuck undersized for 101s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,1]
2026-03-09T00:08:14.110 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1b is stuck undersized for 101s, current state active+recovery_wait+undersized+degraded+remapped, last acting [3,4]
2026-03-09T00:08:14.110 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1f is stuck undersized for 101s, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,3]
2026-03-09T00:08:14.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:14.112+0000 7f5f99be5700 1 -- 192.168.123.103:0/120135479 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5f80077960 msgr2=0x7f5f80079e20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:14.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:14.113+0000 7f5f99be5700 1 --2-
192.168.123.103:0/120135479 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5f80077960 0x7f5f80079e20 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f5f7c007c30 tx=0x7f5f7c0073d0 comp rx=0 tx=0).stop 2026-03-09T00:08:14.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:14.113+0000 7f5f99be5700 1 -- 192.168.123.103:0/120135479 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f9410d5b0 msgr2=0x7f5f9419d6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:08:14.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:14.113+0000 7f5f99be5700 1 --2- 192.168.123.103:0/120135479 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f9410d5b0 0x7f5f9419d6f0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f5f84005950 tx=0x7f5f84004c30 comp rx=0 tx=0).stop 2026-03-09T00:08:14.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:14.113+0000 7f5f99be5700 1 -- 192.168.123.103:0/120135479 shutdown_connections 2026-03-09T00:08:14.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:14.113+0000 7f5f99be5700 1 --2- 192.168.123.103:0/120135479 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5f80077960 0x7f5f80079e20 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:14.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:14.113+0000 7f5f99be5700 1 --2- 192.168.123.103:0/120135479 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5f94107500 0x7f5f9419d1b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:14.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:14.113+0000 7f5f99be5700 1 --2- 192.168.123.103:0/120135479 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5f9410d5b0 0x7f5f9419d6f0 secure :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f5f84005950 tx=0x7f5f84004c30 comp rx=0 tx=0).stop 2026-03-09T00:08:14.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:14.113+0000 7f5f99be5700 1 -- 192.168.123.103:0/120135479 >> 192.168.123.103:0/120135479 conn(0x7f5f94075840 msgr2=0x7f5f94076aa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:08:14.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:14.113+0000 7f5f99be5700 1 -- 192.168.123.103:0/120135479 shutdown_connections 2026-03-09T00:08:14.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:14.114+0000 7f5f99be5700 1 -- 192.168.123.103:0/120135479 wait complete. 2026-03-09T00:08:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:13 vm06.local ceph-mon[106218]: from='client.44175 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:08:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:13 vm06.local ceph-mon[106218]: from='client.34232 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:08:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:13 vm06.local ceph-mon[106218]: from='client.? 
192.168.123.103:0/1740730946' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:08:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:13 vm06.local ceph-mon[106218]: osdmap e70: 6 total, 6 up, 6 in 2026-03-09T00:08:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:13 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/2133975515' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T00:08:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:08:15.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:14 vm03.local ceph-mon[129670]: from='client.44181 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:08:15.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:14 vm03.local ceph-mon[129670]: pgmap v104: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 622/231 objects degraded (269.264%); 0 B/s, 9 objects/s recovering 2026-03-09T00:08:15.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:14 vm03.local ceph-mon[129670]: from='client.34248 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:08:15.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:14 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/120135479' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T00:08:15.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:14 vm06.local ceph-mon[106218]: from='client.44181 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:08:15.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:14 vm06.local ceph-mon[106218]: pgmap v104: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 6 active+recovery_wait+undersized+degraded+remapped, 58 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 622/231 objects degraded (269.264%); 0 B/s, 9 objects/s recovering 2026-03-09T00:08:15.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:14 vm06.local ceph-mon[106218]: from='client.34248 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:08:15.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:14 vm06.local ceph-mon[106218]: from='client.? 
192.168.123.103:0/120135479' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T00:08:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:16 vm03.local ceph-mon[129670]: pgmap v105: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 489/231 objects degraded (211.688%); 0 B/s, 10 objects/s recovering 2026-03-09T00:08:17.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:16 vm06.local ceph-mon[106218]: pgmap v105: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 489/231 objects degraded (211.688%); 0 B/s, 10 objects/s recovering 2026-03-09T00:08:18.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:17 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 489/231 objects degraded (211.688%), 6 pgs degraded, 6 pgs undersized (PG_DEGRADED) 2026-03-09T00:08:18.153 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:17 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 489/231 objects degraded (211.688%), 6 pgs degraded, 6 pgs undersized (PG_DEGRADED) 2026-03-09T00:08:19.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:18 vm03.local ceph-mon[129670]: pgmap v106: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 489/231 objects degraded (211.688%); 0 B/s, 10 objects/s recovering 2026-03-09T00:08:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:18 vm06.local ceph-mon[106218]: pgmap v106: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 489/231 objects degraded (211.688%); 0 B/s, 10 objects/s recovering 2026-03-09T00:08:21.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:20 vm03.local ceph-mon[129670]: pgmap v107: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 489/231 objects degraded (211.688%); 0 B/s, 4 objects/s recovering 2026-03-09T00:08:21.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:20 vm06.local ceph-mon[106218]: pgmap v107: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 489/231 objects degraded (211.688%); 0 B/s, 4 objects/s recovering 2026-03-09T00:08:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:08:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:08:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 
2026-03-09T00:08:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:08:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:08:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:08:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:08:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:08:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:08:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:08:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:08:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:08:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:08:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:08:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:08:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:08:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:08:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:08:23.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:22 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:08:23.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:22 vm03.local ceph-mon[129670]: Upgrade: unsafe to stop osd(s) at this time (2 PGs are or would become offline) 2026-03-09T00:08:23.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:22 vm03.local ceph-mon[129670]: pgmap v108: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 489/231 objects degraded (211.688%); 0 B/s, 9 objects/s recovering 2026-03-09T00:08:23.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:22 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-09T00:08:23.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:22 vm06.local ceph-mon[106218]: Upgrade: unsafe to stop osd(s) at this time (2 PGs are or would become offline) 2026-03-09T00:08:23.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:22 vm06.local ceph-mon[106218]: pgmap v108: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 489/231 objects degraded (211.688%); 0 B/s, 9 objects/s recovering 2026-03-09T00:08:25.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:24 vm03.local ceph-mon[129670]: pgmap v109: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 489/231 objects degraded (211.688%); 0 B/s, 8 objects/s recovering 2026-03-09T00:08:25.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:24 vm06.local ceph-mon[106218]: pgmap v109: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 489/231 objects degraded (211.688%); 0 B/s, 8 objects/s recovering 2026-03-09T00:08:27.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:26 vm06.local ceph-mon[106218]: pgmap v110: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 489/231 objects degraded (211.688%); 0 B/s, 11 objects/s recovering 2026-03-09T00:08:27.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:26 vm03.local ceph-mon[129670]: pgmap v110: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 489/231 objects degraded (211.688%); 0 B/s, 11 objects/s recovering 2026-03-09T00:08:28.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:28 vm06.local ceph-mon[106218]: osdmap e71: 6 total, 6 up, 6 in 2026-03-09T00:08:28.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:28 vm06.local ceph-mon[106218]: pgmap v112: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 489/231 objects degraded (211.688%); 0 B/s, 9 objects/s recovering 2026-03-09T00:08:28.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:28 vm03.local ceph-mon[129670]: osdmap e71: 6 total, 6 up, 6 in 
2026-03-09T00:08:28.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:28 vm03.local ceph-mon[129670]: pgmap v112: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 489/231 objects degraded (211.688%); 0 B/s, 9 objects/s recovering 2026-03-09T00:08:29.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:29 vm06.local ceph-mon[106218]: osdmap e72: 6 total, 6 up, 6 in 2026-03-09T00:08:29.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:29 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:08:29.727 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:29 vm03.local ceph-mon[129670]: osdmap e72: 6 total, 6 up, 6 in 2026-03-09T00:08:29.728 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:29 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:08:30.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:30 vm03.local ceph-mon[129670]: pgmap v114: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 489/231 objects degraded (211.688%); 0 B/s, 5 objects/s recovering 2026-03-09T00:08:30.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:30 vm06.local ceph-mon[106218]: pgmap v114: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 5 active+recovery_wait+undersized+degraded+remapped, 59 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 489/231 objects degraded (211.688%); 0 B/s, 5 objects/s recovering 2026-03-09T00:08:32.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:31 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 382/231 objects degraded (165.368%), 5 pgs degraded, 5 pgs undersized (PG_DEGRADED) 2026-03-09T00:08:32.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:31 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 382/231 objects degraded (165.368%), 5 pgs degraded, 5 pgs undersized (PG_DEGRADED) 2026-03-09T00:08:33.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:32 vm03.local ceph-mon[129670]: pgmap v115: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 382/231 objects degraded (165.368%); 0 B/s, 11 objects/s recovering 2026-03-09T00:08:33.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:32 vm06.local ceph-mon[106218]: pgmap v115: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 382/231 objects degraded (165.368%); 0 B/s, 11 objects/s recovering 2026-03-09T00:08:35.064 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:34 vm03.local ceph-mon[129670]: pgmap v116: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 382/231 objects degraded (165.368%); 0 B/s, 5 objects/s recovering 2026-03-09T00:08:35.171 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:34 vm06.local ceph-mon[106218]: pgmap v116: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 382/231 objects degraded (165.368%); 0 B/s, 5 objects/s recovering
2026-03-09T00:08:37.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:36 vm03.local ceph-mon[129670]: pgmap v117: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 382/231 objects degraded (165.368%); 0 B/s, 11 objects/s recovering
2026-03-09T00:08:37.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:36 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:08:37.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:36 vm06.local ceph-mon[106218]: pgmap v117: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 382/231 objects degraded (165.368%); 0 B/s, 11 objects/s recovering
2026-03-09T00:08:37.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:36 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:08:38.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:37 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:08:38.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:37 vm03.local ceph-mon[129670]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline)
2026-03-09T00:08:38.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:37 vm03.local ceph-mon[129670]: osdmap e73: 6 total, 6 up, 6 in
2026-03-09T00:08:38.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:37 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:08:38.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:37 vm06.local ceph-mon[106218]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline)
2026-03-09T00:08:38.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:37 vm06.local ceph-mon[106218]: osdmap e73: 6 total, 6 up, 6 in
2026-03-09T00:08:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:38 vm06.local ceph-mon[106218]: pgmap v119: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 382/231 objects degraded (165.368%); 0 B/s, 10 objects/s recovering
2026-03-09T00:08:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:38 vm06.local ceph-mon[106218]: osdmap e74: 6 total, 6 up, 6 in
2026-03-09T00:08:39.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:38 vm03.local ceph-mon[129670]: pgmap v119: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 382/231 objects degraded (165.368%); 0 B/s, 10 objects/s recovering
2026-03-09T00:08:39.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:38 vm03.local ceph-mon[129670]: osdmap e74: 6 total, 6 up, 6 in
2026-03-09T00:08:41.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:40 vm06.local ceph-mon[106218]: pgmap v121: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 382/231 objects degraded (165.368%); 0 B/s, 5 objects/s recovering
2026-03-09T00:08:41.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:40 vm03.local ceph-mon[129670]: pgmap v121: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+undersized+degraded+remapped, 60 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 382/231 objects degraded (165.368%); 0 B/s, 5 objects/s recovering
2026-03-09T00:08:42.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:41 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 296/231 objects degraded (128.139%), 4 pgs degraded, 4 pgs undersized (PG_DEGRADED)
2026-03-09T00:08:42.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:41 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 296/231 objects degraded (128.139%), 4 pgs degraded, 4 pgs undersized (PG_DEGRADED)
2026-03-09T00:08:43.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:42 vm06.local ceph-mon[106218]: pgmap v122: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 296/231 objects degraded (128.139%); 0 B/s, 11 objects/s recovering
2026-03-09T00:08:43.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:42 vm03.local ceph-mon[129670]: pgmap v122: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 296/231 objects degraded (128.139%); 0 B/s, 11 objects/s recovering
2026-03-09T00:08:44.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:08:44.199 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:08:44.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.197+0000 7fb4930a6700 1 -- 192.168.123.103:0/2814155423 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb48c103340 msgr2=0x7fb48c103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:44.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.197+0000 7fb4930a6700 1 --2- 192.168.123.103:0/2814155423 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb48c103340 0x7fb48c103720 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7fb47c009b00 tx=0x7fb47c009e10 comp rx=0 tx=0).stop
2026-03-09T00:08:44.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.200+0000 7fb4930a6700 1 -- 192.168.123.103:0/2814155423 shutdown_connections
2026-03-09T00:08:44.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.200+0000 7fb4930a6700 1 --2- 192.168.123.103:0/2814155423 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb48c103cf0 0x7fb48c107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.200+0000 7fb4930a6700 1 --2- 192.168.123.103:0/2814155423 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb48c103340 0x7fb48c103720 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.200+0000 7fb4930a6700 1 -- 192.168.123.103:0/2814155423 >> 192.168.123.103:0/2814155423 conn(0x7fb48c0feb90 msgr2=0x7fb48c100fb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:08:44.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.200+0000 7fb4930a6700 1 -- 192.168.123.103:0/2814155423 shutdown_connections
2026-03-09T00:08:44.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.200+0000 7fb4930a6700 1 -- 192.168.123.103:0/2814155423 wait complete.
2026-03-09T00:08:44.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.201+0000 7fb4930a6700 1 Processor -- start
2026-03-09T00:08:44.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.201+0000 7fb4930a6700 1 -- start start
2026-03-09T00:08:44.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.201+0000 7fb4930a6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb48c103340 0x7fb48c19d230 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:44.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.201+0000 7fb4930a6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb48c103cf0 0x7fb48c19d770 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:44.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.201+0000 7fb4930a6700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb48c19de50 con 0x7fb48c103340
2026-03-09T00:08:44.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.201+0000 7fb4930a6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb48c1a1be0 con 0x7fb48c103cf0
2026-03-09T00:08:44.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.202+0000 7fb490e42700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb48c103340 0x7fb48c19d230 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:44.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.202+0000 7fb490e42700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb48c103340 0x7fb48c19d230 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41762/0 (socket says 192.168.123.103:41762)
2026-03-09T00:08:44.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.202+0000 7fb490e42700 1 -- 192.168.123.103:0/3783614482 learned_addr learned my addr 192.168.123.103:0/3783614482 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:08:44.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.202+0000 7fb490e42700 1 -- 192.168.123.103:0/3783614482 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb48c103cf0 msgr2=0x7fb48c19d770 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-09T00:08:44.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.202+0000 7fb490e42700 1 --2- 192.168.123.103:0/3783614482 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb48c103cf0 0x7fb48c19d770 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.202+0000 7fb490e42700 1 -- 192.168.123.103:0/3783614482 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb47c0097e0 con 0x7fb48c103340
2026-03-09T00:08:44.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.202+0000 7fb490e42700 1 --2- 192.168.123.103:0/3783614482 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb48c103340 0x7fb48c19d230 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7fb47c005b40 tx=0x7fb47c004ab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:44.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.203+0000 7fb489ffb700 1 -- 192.168.123.103:0/3783614482 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb47c01d070 con 0x7fb48c103340
2026-03-09T00:08:44.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.203+0000 7fb489ffb700 1 -- 192.168.123.103:0/3783614482 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fb47c00bc50 con 0x7fb48c103340
2026-03-09T00:08:44.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.203+0000 7fb4930a6700 1 -- 192.168.123.103:0/3783614482 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb48c1a1e60 con 0x7fb48c103340
2026-03-09T00:08:44.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.203+0000 7fb4930a6700 1 -- 192.168.123.103:0/3783614482 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb48c1a2350 con 0x7fb48c103340
2026-03-09T00:08:44.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.205+0000 7fb489ffb700 1 -- 192.168.123.103:0/3783614482 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb47c021760 con 0x7fb48c103340
2026-03-09T00:08:44.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.205+0000 7fb4930a6700 1 -- 192.168.123.103:0/3783614482 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb48c04ea90 con 0x7fb48c103340
2026-03-09T00:08:44.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.208+0000 7fb489ffb700 1 -- 192.168.123.103:0/3783614482 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb47c02b430 con 0x7fb48c103340
2026-03-09T00:08:44.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.208+0000 7fb489ffb700 1 --2- 192.168.123.103:0/3783614482 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb474077700 0x7fb474079bc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:44.208 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.208+0000 7fb489ffb700 1 -- 192.168.123.103:0/3783614482 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(74..74 src has 1..74) v4 ==== 6252+0+0 (secure 0 0 0) 0x7fb47c09c120 con 0x7fb48c103340
2026-03-09T00:08:44.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.209+0000 7fb48bfff700 1 --2- 192.168.123.103:0/3783614482 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb474077700 0x7fb474079bc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:44.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.209+0000 7fb489ffb700 1 -- 192.168.123.103:0/3783614482 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb47c0d29f0 con 0x7fb48c103340
2026-03-09T00:08:44.209 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.209+0000 7fb48bfff700 1 --2- 192.168.123.103:0/3783614482 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb474077700 0x7fb474079bc0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fb48c19e850 tx=0x7fb480006cb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:44.350 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.349+0000 7fb4930a6700 1 -- 192.168.123.103:0/3783614482 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fb48c1a2630 con 0x7fb474077700
2026-03-09T00:08:44.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.351+0000 7fb489ffb700 1 -- 192.168.123.103:0/3783614482 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fb48c1a2630 con 0x7fb474077700
2026-03-09T00:08:44.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.353+0000 7fb4930a6700 1 -- 192.168.123.103:0/3783614482 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb474077700 msgr2=0x7fb474079bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:44.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.354+0000 7fb4930a6700 1 --2- 192.168.123.103:0/3783614482 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb474077700 0x7fb474079bc0 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fb48c19e850 tx=0x7fb480006cb0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.354+0000 7fb4930a6700 1 -- 192.168.123.103:0/3783614482 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb48c103340 msgr2=0x7fb48c19d230 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:44.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.354+0000 7fb4930a6700 1 --2- 192.168.123.103:0/3783614482 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb48c103340 0x7fb48c19d230 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7fb47c005b40 tx=0x7fb47c004ab0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.354+0000 7fb4930a6700 1 -- 192.168.123.103:0/3783614482 shutdown_connections
2026-03-09T00:08:44.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.354+0000 7fb4930a6700 1 --2- 192.168.123.103:0/3783614482 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb474077700 0x7fb474079bc0 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.354+0000 7fb4930a6700 1 --2- 192.168.123.103:0/3783614482 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb48c103340 0x7fb48c19d230 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.354+0000 7fb4930a6700 1 --2- 192.168.123.103:0/3783614482 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb48c103cf0 0x7fb48c19d770 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.354+0000 7fb4930a6700 1 -- 192.168.123.103:0/3783614482 >> 192.168.123.103:0/3783614482 conn(0x7fb48c0feb90 msgr2=0x7fb48c1001b0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:08:44.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.355+0000 7fb4930a6700 1 -- 192.168.123.103:0/3783614482 shutdown_connections
2026-03-09T00:08:44.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.355+0000 7fb4930a6700 1 -- 192.168.123.103:0/3783614482 wait complete.
2026-03-09T00:08:44.365 INFO:teuthology.orchestra.run.vm03.stdout:true
2026-03-09T00:08:44.434 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.434+0000 7fde8b06c700 1 -- 192.168.123.103:0/1152887254 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde841032e0 msgr2=0x7fde841036c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:44.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.434+0000 7fde8b06c700 1 --2- 192.168.123.103:0/1152887254 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde841032e0 0x7fde841036c0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fde70009b00 tx=0x7fde70009e10 comp rx=0 tx=0).stop
2026-03-09T00:08:44.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.434+0000 7fde8b06c700 1 -- 192.168.123.103:0/1152887254 shutdown_connections
2026-03-09T00:08:44.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.434+0000 7fde8b06c700 1 --2- 192.168.123.103:0/1152887254 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fde84103c90 0x7fde84107ce0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.434+0000 7fde8b06c700 1 --2- 192.168.123.103:0/1152887254 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde841032e0 0x7fde841036c0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.434+0000 7fde8b06c700 1 -- 192.168.123.103:0/1152887254 >> 192.168.123.103:0/1152887254 conn(0x7fde840feb50 msgr2=0x7fde84100f70 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:08:44.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.434+0000 7fde8b06c700 1 -- 192.168.123.103:0/1152887254 shutdown_connections
2026-03-09T00:08:44.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.434+0000 7fde8b06c700 1 -- 192.168.123.103:0/1152887254 wait complete.
2026-03-09T00:08:44.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.435+0000 7fde8b06c700 1 Processor -- start
2026-03-09T00:08:44.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.435+0000 7fde8b06c700 1 -- start start
2026-03-09T00:08:44.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.435+0000 7fde8b06c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde841032e0 0x7fde84198db0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:44.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.435+0000 7fde8b06c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fde84103c90 0x7fde841992f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:44.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.435+0000 7fde8b06c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fde841999d0 con 0x7fde841032e0
2026-03-09T00:08:44.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.435+0000 7fde8b06c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fde8419d760 con 0x7fde84103c90
2026-03-09T00:08:44.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.436+0000 7fde88e08700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde841032e0 0x7fde84198db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:44.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.436+0000 7fde88e08700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde841032e0 0x7fde84198db0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41782/0 (socket says 192.168.123.103:41782)
2026-03-09T00:08:44.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.436+0000 7fde88e08700 1 -- 192.168.123.103:0/2604402056 learned_addr learned my addr 192.168.123.103:0/2604402056 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:08:44.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.436+0000 7fde83fff700 1 --2- 192.168.123.103:0/2604402056 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fde84103c90 0x7fde841992f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:44.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.436+0000 7fde88e08700 1 -- 192.168.123.103:0/2604402056 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fde84103c90 msgr2=0x7fde841992f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:44.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.436+0000 7fde88e08700 1 --2- 192.168.123.103:0/2604402056 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fde84103c90 0x7fde841992f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.436+0000 7fde88e08700 1 -- 192.168.123.103:0/2604402056 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fde700097e0 con 0x7fde841032e0
2026-03-09T00:08:44.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.437+0000 7fde88e08700 1 --2- 192.168.123.103:0/2604402056 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde841032e0 0x7fde84198db0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fde7000b5c0 tx=0x7fde700048c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:44.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.437+0000 7fde81ffb700 1 -- 192.168.123.103:0/2604402056 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fde7001d070 con 0x7fde841032e0
2026-03-09T00:08:44.437 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.437+0000 7fde81ffb700 1 -- 192.168.123.103:0/2604402056 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fde70022470 con 0x7fde841032e0
2026-03-09T00:08:44.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.438+0000 7fde81ffb700 1 -- 192.168.123.103:0/2604402056 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fde7000f700 con 0x7fde841032e0
2026-03-09T00:08:44.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.438+0000 7fde8b06c700 1 -- 192.168.123.103:0/2604402056 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fde8419d9e0 con 0x7fde841032e0
2026-03-09T00:08:44.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.438+0000 7fde8b06c700 1 -- 192.168.123.103:0/2604402056 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fde8419dea0 con 0x7fde841032e0
2026-03-09T00:08:44.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.439+0000 7fde81ffb700 1 -- 192.168.123.103:0/2604402056 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fde70022ac0 con 0x7fde841032e0
2026-03-09T00:08:44.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.440+0000 7fde8b06c700 1 -- 192.168.123.103:0/2604402056 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fde8410b680 con 0x7fde841032e0
2026-03-09T00:08:44.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.440+0000 7fde81ffb700 1 --2- 192.168.123.103:0/2604402056 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fde74077870 0x7fde74079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:44.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.441+0000 7fde83fff700 1 --2- 192.168.123.103:0/2604402056 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fde74077870 0x7fde74079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:44.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.441+0000 7fde81ffb700 1 -- 192.168.123.103:0/2604402056 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(74..74 src has 1..74) v4 ==== 6252+0+0 (secure 0 0 0) 0x7fde7006ce40 con 0x7fde841032e0
2026-03-09T00:08:44.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.441+0000 7fde83fff700 1 --2- 192.168.123.103:0/2604402056 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fde74077870 0x7fde74079d30 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7fde8419a3d0 tx=0x7fde7800b3c0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:44.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.444+0000 7fde81ffb700 1 -- 192.168.123.103:0/2604402056 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fde70060180 con 0x7fde841032e0
2026-03-09T00:08:44.580 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.579+0000 7fde8b06c700 1 -- 192.168.123.103:0/2604402056 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fde8419a110 con 0x7fde74077870
2026-03-09T00:08:44.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.585+0000 7fde81ffb700 1 -- 192.168.123.103:0/2604402056 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fde8419a110 con 0x7fde74077870
2026-03-09T00:08:44.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.588+0000 7fde8b06c700 1 -- 192.168.123.103:0/2604402056 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fde74077870 msgr2=0x7fde74079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:44.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.589+0000 7fde8b06c700 1 --2- 192.168.123.103:0/2604402056 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fde74077870 0x7fde74079d30 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7fde8419a3d0 tx=0x7fde7800b3c0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.589+0000 7fde8b06c700 1 -- 192.168.123.103:0/2604402056 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde841032e0 msgr2=0x7fde84198db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:44.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.589+0000 7fde8b06c700 1 --2- 192.168.123.103:0/2604402056 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde841032e0 0x7fde84198db0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fde7000b5c0 tx=0x7fde700048c0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.589 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.589+0000 7fde8b06c700 1 -- 192.168.123.103:0/2604402056 shutdown_connections
2026-03-09T00:08:44.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.589+0000 7fde8b06c700 1 --2- 192.168.123.103:0/2604402056 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fde74077870 0x7fde74079d30 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.589+0000 7fde8b06c700 1 --2- 192.168.123.103:0/2604402056 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fde841032e0 0x7fde84198db0 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.590+0000 7fde8b06c700 1 --2- 192.168.123.103:0/2604402056 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fde84103c90 0x7fde841992f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.590+0000 7fde8b06c700 1 -- 192.168.123.103:0/2604402056 >> 192.168.123.103:0/2604402056 conn(0x7fde840feb50 msgr2=0x7fde84100f50 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:08:44.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.590+0000 7fde8b06c700 1 -- 192.168.123.103:0/2604402056 shutdown_connections
2026-03-09T00:08:44.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.590+0000 7fde8b06c700 1 -- 192.168.123.103:0/2604402056 wait complete.
2026-03-09T00:08:44.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.667+0000 7fa8c9739700 1 -- 192.168.123.103:0/3466329250 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8c4101280 msgr2=0x7fa8c4101660 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:44.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.667+0000 7fa8c9739700 1 --2- 192.168.123.103:0/3466329250 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8c4101280 0x7fa8c4101660 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fa8b4009b00 tx=0x7fa8b4009e10 comp rx=0 tx=0).stop
2026-03-09T00:08:44.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.667+0000 7fa8c9739700 1 -- 192.168.123.103:0/3466329250 shutdown_connections
2026-03-09T00:08:44.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.667+0000 7fa8c9739700 1 --2- 192.168.123.103:0/3466329250 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa8c4101c30 0x7fa8c4105bd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.668+0000 7fa8c9739700 1 --2- 192.168.123.103:0/3466329250 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8c4101280 0x7fa8c4101660 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.668+0000 7fa8c9739700 1 -- 192.168.123.103:0/3466329250 >> 192.168.123.103:0/3466329250 conn(0x7fa8c4078ed0 msgr2=0x7fa8c40792e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:08:44.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.668+0000 7fa8c9739700 1 -- 192.168.123.103:0/3466329250 shutdown_connections
2026-03-09T00:08:44.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.668+0000 7fa8c9739700 1 -- 192.168.123.103:0/3466329250 wait complete.
2026-03-09T00:08:44.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.668+0000 7fa8c9739700 1 Processor -- start
2026-03-09T00:08:44.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.668+0000 7fa8c9739700 1 -- start start
2026-03-09T00:08:44.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.669+0000 7fa8c9739700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa8c4101280 0x7fa8c4073090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:44.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.669+0000 7fa8c9739700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8c4101c30 0x7fa8c40735d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:44.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.669+0000 7fa8c9739700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa8c4073b10 con 0x7fa8c4101c30
2026-03-09T00:08:44.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.669+0000 7fa8c9739700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa8c4073c50 con 0x7fa8c4101280
2026-03-09T00:08:44.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.669+0000 7fa8c27fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8c4101c30 0x7fa8c40735d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:44.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.669+0000 7fa8c2ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa8c4101280 0x7fa8c4073090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:44.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.669+0000 7fa8c2ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa8c4101280 0x7fa8c4073090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:40322/0 (socket says 192.168.123.103:40322)
2026-03-09T00:08:44.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.669+0000 7fa8c2ffd700 1 -- 192.168.123.103:0/3202071265 learned_addr learned my addr 192.168.123.103:0/3202071265 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:08:44.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.669+0000 7fa8c27fc700 1 -- 192.168.123.103:0/3202071265 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa8c4101280 msgr2=0x7fa8c4073090 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:44.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.669+0000 7fa8c27fc700 1 --2- 192.168.123.103:0/3202071265 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa8c4101280 0x7fa8c4073090 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.669+0000 7fa8c27fc700 1 -- 192.168.123.103:0/3202071265 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa8b40097e0 con 0x7fa8c4101c30
2026-03-09T00:08:44.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.670+0000 7fa8c27fc700 1 --2- 192.168.123.103:0/3202071265 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8c4101c30 0x7fa8c40735d0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fa8c406a410 tx=0x7fa8b800dc20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:08:44.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.670+0000 7fa8abfff700 1 -- 192.168.123.103:0/3202071265 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa8b80098e0 con 0x7fa8c4101c30 2026-03-09T00:08:44.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.670+0000 7fa8c9739700 1 -- 192.168.123.103:0/3202071265 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa8c41a2e30 con 0x7fa8c4101c30 2026-03-09T00:08:44.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.670+0000 7fa8abfff700 1 -- 192.168.123.103:0/3202071265 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa8b800de60 con 0x7fa8c4101c30 2026-03-09T00:08:44.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.670+0000 7fa8abfff700 1 -- 192.168.123.103:0/3202071265 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa8b800f3c0 con 0x7fa8c4101c30 2026-03-09T00:08:44.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.671+0000 7fa8c9739700 1 -- 192.168.123.103:0/3202071265 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa8c41a3240 con 0x7fa8c4101c30 2026-03-09T00:08:44.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.672+0000 7fa8c9739700 1 -- 192.168.123.103:0/3202071265 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa8c4109540 con 0x7fa8c4101c30 2026-03-09T00:08:44.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.672+0000 7fa8abfff700 1 -- 192.168.123.103:0/3202071265 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa8b8009a40 con 0x7fa8c4101c30 2026-03-09T00:08:44.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.673+0000 7fa8abfff700 1 --2- 192.168.123.103:0/3202071265 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa8ac0778c0 0x7fa8ac079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:44.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.673+0000 7fa8abfff700 1 -- 192.168.123.103:0/3202071265 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(74..74 src has 1..74) v4 ==== 6252+0+0 (secure 0 0 0) 0x7fa8b8098e80 con 0x7fa8c4101c30 2026-03-09T00:08:44.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.673+0000 7fa8c2ffd700 1 --2- 192.168.123.103:0/3202071265 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa8ac0778c0 0x7fa8ac079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:08:44.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.673+0000 7fa8c2ffd700 1 --2- 192.168.123.103:0/3202071265 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] 
conn(0x7fa8ac0778c0 0x7fa8ac079d80 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7fa8b4006010 tx=0x7fa8b400b560 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:08:44.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.676+0000 7fa8abfff700 1 -- 192.168.123.103:0/3202071265 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa8b80616c0 con 0x7fa8c4101c30 2026-03-09T00:08:44.808 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.807+0000 7fa8c9739700 1 -- 192.168.123.103:0/3202071265 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fa8c40745c0 con 0x7fa8ac0778c0 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.813+0000 7fa8abfff700 1 -- 192.168.123.103:0/3202071265 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3528 (secure 0 0 0) 0x7fa8c40745c0 con 0x7fa8ac0778c0 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (4m) 2m ago 9m 24.0M - 0.25.0 c8568f914cd2 6bc39b415ac6 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (9m) 2m ago 9m 8786k - 18.2.1 5be31c24972a b93f8a220f71 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (8m) 2m ago 8m 8656k - 18.2.1 5be31c24972a d06aea65065e 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (2m) 2m ago 9m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e a76700a1f6bf 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (2m) 2m ago 8m 7860k - 19.2.3-678-ge911bdeb 654f31e6858e ab35f6a843c1 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (3m) 2m ago 8m 73.3M - 10.4.0 c8b91775d855 00a3394cdec9 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (6m) 2m ago 6m 18.1M - 18.2.1 5be31c24972a 404501ca3f76 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (7m) 2m ago 7m 218M - 18.2.1 5be31c24972a b71cb8823eff 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (6m) 2m ago 6m 19.7M - 18.2.1 5be31c24972a 868f24dd3b07 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 running (6m) 2m ago 6m 15.9M - 18.2.1 5be31c24972a 84dbd6c37a69 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:8443,9283,8765 running (4m) 2m ago 9m 619M - 19.2.3-678-ge911bdeb 654f31e6858e 5c5f89207f88 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (4m) 2m ago 8m 489M - 19.2.3-678-ge911bdeb 654f31e6858e e3d70135f3ac 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (3m) 2m ago 9m 57.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cafe87ec117d 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (2m) 2m ago 8m 48.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 
33df752aa193 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (4m) 2m ago 9m 9445k - 1.7.0 72c9c2088986 0cdd6e671b4f 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (4m) 2m ago 8m 9420k - 1.7.0 72c9c2088986 848c5c72973d 2026-03-09T00:08:44.814 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (2m) 2m ago 8m 30.2M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7112eceae9ce 2026-03-09T00:08:44.815 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (8m) 2m ago 8m 451M 4096M 18.2.1 5be31c24972a 7bc729875521 2026-03-09T00:08:44.815 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (7m) 2m ago 7m 370M 4096M 18.2.1 5be31c24972a 00566abbcc16 2026-03-09T00:08:44.815 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (7m) 2m ago 7m 499M 4096M 18.2.1 5be31c24972a 1ece32056ab6 2026-03-09T00:08:44.815 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (7m) 2m ago 7m 475M 4096M 18.2.1 5be31c24972a ee6260a1124c 2026-03-09T00:08:44.815 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (7m) 2m ago 7m 398M 4096M 18.2.1 5be31c24972a f51e8cd94301 2026-03-09T00:08:44.815 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (4m) 2m ago 8m 56.6M - 2.51.0 1d3b7f56885b 16d6071e49fb 2026-03-09T00:08:44.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.817+0000 7fa8c9739700 1 -- 192.168.123.103:0/3202071265 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa8ac0778c0 msgr2=0x7fa8ac079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:08:44.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.817+0000 7fa8c9739700 1 --2- 192.168.123.103:0/3202071265 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa8ac0778c0 0x7fa8ac079d80 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7fa8b4006010 tx=0x7fa8b400b560 comp rx=0 tx=0).stop 2026-03-09T00:08:44.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.817+0000 7fa8c9739700 1 -- 192.168.123.103:0/3202071265 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8c4101c30 msgr2=0x7fa8c40735d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:08:44.818 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.817+0000 7fa8c9739700 1 --2- 192.168.123.103:0/3202071265 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8c4101c30 0x7fa8c40735d0 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fa8c406a410 tx=0x7fa8b800dc20 comp rx=0 tx=0).stop 2026-03-09T00:08:44.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.818+0000 7fa8c9739700 1 -- 192.168.123.103:0/3202071265 shutdown_connections 2026-03-09T00:08:44.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.818+0000 7fa8c9739700 1 --2- 192.168.123.103:0/3202071265 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa8ac0778c0 0x7fa8ac079d80 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:44.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.818+0000 7fa8c9739700 1 --2- 192.168.123.103:0/3202071265 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa8c4101280 0x7fa8c4073090 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:44.819 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.818+0000 7fa8c9739700 1 --2- 192.168.123.103:0/3202071265 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa8c4101c30 0x7fa8c40735d0 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:44.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.818+0000 7fa8c9739700 1 -- 192.168.123.103:0/3202071265 >> 192.168.123.103:0/3202071265 conn(0x7fa8c4078ed0 msgr2=0x7fa8c4100890 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:08:44.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.819+0000 7fa8c9739700 1 -- 192.168.123.103:0/3202071265 shutdown_connections 2026-03-09T00:08:44.819 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.819+0000 7fa8c9739700 1 -- 192.168.123.103:0/3202071265 wait complete. 2026-03-09T00:08:44.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.904+0000 7f68b36ee700 1 -- 192.168.123.103:0/2646176972 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f68ac1032e0 msgr2=0x7f68ac1036c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:08:44.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.904+0000 7f68b36ee700 1 --2- 192.168.123.103:0/2646176972 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f68ac1032e0 0x7f68ac1036c0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f689c009b00 tx=0x7f689c009e10 comp rx=0 tx=0).stop 2026-03-09T00:08:44.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.905+0000 7f68b36ee700 1 -- 192.168.123.103:0/2646176972 shutdown_connections 2026-03-09T00:08:44.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.905+0000 7f68b36ee700 1 --2- 192.168.123.103:0/2646176972 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68ac103c90 0x7f68ac107ce0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:44.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.905+0000 7f68b36ee700 1 --2- 192.168.123.103:0/2646176972 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f68ac1032e0 0x7f68ac1036c0 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:44.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.905+0000 7f68b36ee700 1 -- 192.168.123.103:0/2646176972 >> 192.168.123.103:0/2646176972 conn(0x7f68ac0feb50 msgr2=0x7f68ac100f70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:08:44.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.905+0000 7f68b36ee700 1 -- 192.168.123.103:0/2646176972 shutdown_connections 2026-03-09T00:08:44.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.905+0000 7f68b36ee700 1 -- 192.168.123.103:0/2646176972 wait complete. 
2026-03-09T00:08:44.906 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.906+0000 7f68b36ee700 1 Processor -- start
2026-03-09T00:08:44.906 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.906+0000 7f68b36ee700 1 -- start start
2026-03-09T00:08:44.906 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.906+0000 7f68b36ee700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68ac1032e0 0x7f68ac198dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:44.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.906+0000 7f68b36ee700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f68ac103c90 0x7f68ac199300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:44.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.906+0000 7f68b36ee700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f68ac1999e0 con 0x7f68ac103c90
2026-03-09T00:08:44.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.906+0000 7f68b36ee700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f68ac19d770 con 0x7f68ac1032e0
2026-03-09T00:08:44.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.906+0000 7f68b0c89700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f68ac103c90 0x7f68ac199300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:44.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.907+0000 7f68b0c89700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f68ac103c90 0x7f68ac199300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41450/0 (socket says 192.168.123.103:41450)
2026-03-09T00:08:44.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.907+0000 7f68b0c89700 1 -- 192.168.123.103:0/463211056 learned_addr learned my addr 192.168.123.103:0/463211056 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:08:44.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.907+0000 7f68b148a700 1 --2- 192.168.123.103:0/463211056 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68ac1032e0 0x7f68ac198dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:44.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.907+0000 7f68b0c89700 1 -- 192.168.123.103:0/463211056 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68ac1032e0 msgr2=0x7f68ac198dc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:44.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.907+0000 7f68b0c89700 1 --2- 192.168.123.103:0/463211056 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68ac1032e0 0x7f68ac198dc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:44.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.907+0000 7f68b0c89700 1 -- 192.168.123.103:0/463211056 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f689c0097e0 con 0x7f68ac103c90
2026-03-09T00:08:44.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.907+0000 7f68b0c89700 1 --2- 192.168.123.103:0/463211056 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f68ac103c90 0x7f68ac199300 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f68a800b700 tx=0x7f68a800bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:44.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.907+0000 7f68a27fc700 1 -- 192.168.123.103:0/463211056 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f68a8010820 con 0x7f68ac103c90
2026-03-09T00:08:44.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.908+0000 7f68b36ee700 1 -- 192.168.123.103:0/463211056 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f68ac19da50 con 0x7f68ac103c90
2026-03-09T00:08:44.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.908+0000 7f68b36ee700 1 -- 192.168.123.103:0/463211056 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f68ac19dfa0 con 0x7f68ac103c90
2026-03-09T00:08:44.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.909+0000 7f68a27fc700 1 -- 192.168.123.103:0/463211056 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f68a8010e60 con 0x7f68ac103c90
2026-03-09T00:08:44.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.909+0000 7f68a27fc700 1 -- 192.168.123.103:0/463211056 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f68a8017570 con 0x7f68ac103c90
2026-03-09T00:08:44.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.911+0000 7f68a27fc700 1 -- 192.168.123.103:0/463211056 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f68a80176d0 con 0x7f68ac103c90
2026-03-09T00:08:44.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.911+0000 7f68b36ee700 1 -- 192.168.123.103:0/463211056 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6890005320 con 0x7f68ac103c90
2026-03-09T00:08:44.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.911+0000 7f68a27fc700 1 --2- 192.168.123.103:0/463211056 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6898077870 0x7f6898079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:08:44.914 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.911+0000 7f68a27fc700 1 -- 192.168.123.103:0/463211056 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(74..74 src has 1..74) v4 ==== 6252+0+0 (secure 0 0 0) 0x7f68a8099360 con 0x7f68ac103c90
2026-03-09T00:08:44.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.915+0000 7f68b148a700 1 --2- 192.168.123.103:0/463211056 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6898077870 0x7f6898079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:08:44.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.915+0000 7f68b148a700 1 --2- 192.168.123.103:0/463211056 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6898077870 0x7f6898079d30 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f689c006010 tx=0x7f689c00b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:44.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:44.915+0000 7f68a27fc700 1 -- 192.168.123.103:0/463211056 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f68a8061470 con 0x7f68ac103c90
2026-03-09T00:08:45.089 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.089+0000 7f68b36ee700 1 -- 192.168.123.103:0/463211056 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f6890006200 con 0x7f68ac103c90
2026-03-09T00:08:45.090 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.090+0000 7f68a27fc700 1 -- 192.168.123.103:0/463211056 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f68a800fed0 con 0x7f68ac103c90
2026-03-09T00:08:45.090 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:08:45.090 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T00:08:45.090 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:08:45.090 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:08:45.090 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T00:08:45.090 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:08:45.090 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:08:45.090 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T00:08:45.090 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 5,
2026-03-09T00:08:45.090 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-09T00:08:45.090 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:08:45.090 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T00:08:45.090 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-09T00:08:45.090 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:08:45.090 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T00:08:45.090 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 9,
2026-03-09T00:08:45.091 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5
2026-03-09T00:08:45.091 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-09T00:08:45.091 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:08:45.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.093+0000 7f68b36ee700 1 -- 192.168.123.103:0/463211056 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6898077870 msgr2=0x7f6898079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:45.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.093+0000 7f68b36ee700 1 --2- 192.168.123.103:0/463211056 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6898077870 0x7f6898079d30 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f689c006010 tx=0x7f689c00b540 comp rx=0 tx=0).stop
2026-03-09T00:08:45.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.093+0000 7f68b36ee700 1 -- 192.168.123.103:0/463211056 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f68ac103c90 msgr2=0x7f68ac199300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:45.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.093+0000 7f68b36ee700 1 --2- 192.168.123.103:0/463211056 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f68ac103c90 0x7f68ac199300 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f68a800b700 tx=0x7f68a800bac0 comp rx=0 tx=0).stop
2026-03-09T00:08:45.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.093+0000 7f68b36ee700 1 -- 192.168.123.103:0/463211056 shutdown_connections
2026-03-09T00:08:45.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.093+0000 7f68b36ee700 1 --2- 192.168.123.103:0/463211056 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f6898077870 0x7f6898079d30 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:45.093 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.093+0000 7f68b36ee700 1 --2- 192.168.123.103:0/463211056 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f68ac1032e0 0x7f68ac198dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:45.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.093+0000 7f68b36ee700 1 --2- 192.168.123.103:0/463211056 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f68ac103c90 0x7f68ac199300 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:08:45.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.093+0000 7f68b36ee700 1 -- 192.168.123.103:0/463211056 >> 192.168.123.103:0/463211056 conn(0x7f68ac0feb50 msgr2=0x7f68ac100f50 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:08:45.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.093+0000 7f68b36ee700 1 -- 192.168.123.103:0/463211056 shutdown_connections
2026-03-09T00:08:45.094 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.094+0000 7f68b36ee700 1 -- 192.168.123.103:0/463211056 wait complete.
2026-03-09T00:08:45.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:44 vm06.local ceph-mon[106218]: pgmap v123: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 296/231 objects degraded (128.139%); 0 B/s, 5 objects/s recovering 2026-03-09T00:08:45.172 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:44 vm03.local ceph-mon[129670]: pgmap v123: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 296/231 objects degraded (128.139%); 0 B/s, 5 objects/s recovering 2026-03-09T00:08:45.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.170+0000 7f14c04de700 1 -- 192.168.123.103:0/187922371 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14b8103340 msgr2=0x7f14b8103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:08:45.172 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.170+0000 7f14c04de700 1 --2- 192.168.123.103:0/187922371 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14b8103340 0x7f14b8103720 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f14ac009a60 tx=0x7f14ac009d70 comp rx=0 tx=0).stop 2026-03-09T00:08:45.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.173+0000 7f14c04de700 1 -- 192.168.123.103:0/187922371 shutdown_connections 2026-03-09T00:08:45.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.173+0000 7f14c04de700 1 --2- 192.168.123.103:0/187922371 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14b8103cf0 0x7f14b8107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.173+0000 7f14c04de700 1 --2- 192.168.123.103:0/187922371 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14b8103340 0x7f14b8103720 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.173+0000 7f14c04de700 1 -- 192.168.123.103:0/187922371 >> 192.168.123.103:0/187922371 conn(0x7f14b80feb90 msgr2=0x7f14b8100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:08:45.174 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.173+0000 7f14c04de700 1 -- 192.168.123.103:0/187922371 shutdown_connections 2026-03-09T00:08:45.174 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.174+0000 7f14c04de700 1 -- 192.168.123.103:0/187922371 wait complete. 
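
[editor's note] The 128.139% in these pgmap lines is just 296/231. The percentage can exceed 100% while PGs are both undersized and remapped, since object copies that still have to be recovered onto the new acting set are counted on top of the missing replicas; reading the counter as degraded-instances over total copies is stated here as an assumption. A one-liner to verify the arithmetic, with the values copied from the log:

degraded, total = 296, 231  # from "296/231 objects degraded (128.139%)" above
print(f"{degraded}/{total} = {degraded / total:.3%}")  # -> 128.139%
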
2026-03-09T00:08:45.174 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.174+0000 7f14c04de700 1 Processor -- start 2026-03-09T00:08:45.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.174+0000 7f14c04de700 1 -- start start 2026-03-09T00:08:45.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.174+0000 7f14c04de700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14b8103340 0x7f14b819d930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:45.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.174+0000 7f14c04de700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14b8103cf0 0x7f14b8078300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:45.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.174+0000 7f14c04de700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f14b819dfc0 con 0x7f14b8103340 2026-03-09T00:08:45.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.174+0000 7f14c04de700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f14b819e130 con 0x7f14b8103cf0 2026-03-09T00:08:45.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.175+0000 7f14bda79700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14b8103cf0 0x7f14b8078300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:08:45.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.175+0000 7f14bda79700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14b8103cf0 0x7f14b8078300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:40356/0 (socket says 192.168.123.103:40356) 2026-03-09T00:08:45.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.175+0000 7f14be27a700 1 --2- 192.168.123.103:0/233292016 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14b8103340 0x7f14b819d930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:08:45.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.175+0000 7f14bda79700 1 -- 192.168.123.103:0/233292016 learned_addr learned my addr 192.168.123.103:0/233292016 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:08:45.176 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.175+0000 7f14be27a700 1 -- 192.168.123.103:0/233292016 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14b8103cf0 msgr2=0x7f14b8078300 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:08:45.176 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.175+0000 7f14be27a700 1 --2- 192.168.123.103:0/233292016 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14b8103cf0 0x7f14b8078300 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.176 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.175+0000 7f14be27a700 1 -- 192.168.123.103:0/233292016 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f14b40097e0 con 
0x7f14b8103340 2026-03-09T00:08:45.176 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.176+0000 7f14be27a700 1 --2- 192.168.123.103:0/233292016 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14b8103340 0x7f14b819d930 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f14ac009a60 tx=0x7f14ac00f690 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:08:45.176 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.176+0000 7f14ab7fe700 1 -- 192.168.123.103:0/233292016 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f14ac01d070 con 0x7f14b8103340 2026-03-09T00:08:45.177 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.176+0000 7f14ab7fe700 1 -- 192.168.123.103:0/233292016 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f14ac0037c0 con 0x7f14b8103340 2026-03-09T00:08:45.177 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.177+0000 7f14ab7fe700 1 -- 192.168.123.103:0/233292016 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f14ac017600 con 0x7f14b8103340 2026-03-09T00:08:45.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.177+0000 7f14c04de700 1 -- 192.168.123.103:0/233292016 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f14ac009710 con 0x7f14b8103340 2026-03-09T00:08:45.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.177+0000 7f14c04de700 1 -- 192.168.123.103:0/233292016 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f14b8078b80 con 0x7f14b8103340 2026-03-09T00:08:45.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.179+0000 7f14ab7fe700 1 -- 192.168.123.103:0/233292016 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f14ac00fc70 con 0x7f14b8103340 2026-03-09T00:08:45.180 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.180+0000 7f14ab7fe700 1 --2- 192.168.123.103:0/233292016 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f14a4077710 0x7f14a4079bd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:45.180 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.180+0000 7f14ab7fe700 1 -- 192.168.123.103:0/233292016 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(74..74 src has 1..74) v4 ==== 6252+0+0 (secure 0 0 0) 0x7f14ac09ab50 con 0x7f14b8103340 2026-03-09T00:08:45.180 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.180+0000 7f14bda79700 1 --2- 192.168.123.103:0/233292016 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f14a4077710 0x7f14a4079bd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:08:45.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.180+0000 7f14c04de700 1 -- 192.168.123.103:0/233292016 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f14b804ea90 con 0x7f14b8103340 2026-03-09T00:08:45.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.180+0000 7f14bda79700 1 --2- 192.168.123.103:0/233292016 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] 
conn(0x7f14a4077710 0x7f14a4079bd0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f14b80ff360 tx=0x7f14b4009500 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:45.184 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.183+0000 7f14ab7fe700 1 -- 192.168.123.103:0/233292016 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f14ac063390 con 0x7f14b8103340
2026-03-09T00:08:45.337 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.337+0000 7f14c04de700 1 -- 192.168.123.103:0/233292016 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f14b8195d40 con 0x7f14b8103340
2026-03-09T00:08:45.338 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.338+0000 7f14ab7fe700 1 -- 192.168.123.103:0/233292016 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1919 (secure 0 0 0) 0x7f14ac026020 con 0x7f14b8103340
2026-03-09T00:08:45.338 INFO:teuthology.orchestra.run.vm03.stdout:e11
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1)
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:epoch 11
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T00:01:42.952984+0000
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T00:01:51.424075+0000
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:root 0
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {}
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 39
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:in 0
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14480}
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:failed
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:damaged
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:stopped
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3]
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members:
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.sejksk{0:14480} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons:
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.ralade{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.ixduim{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:08:45.339 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.vlrwtl{-1:24297} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6824/1090058295,v1:192.168.123.106:6825/1090058295] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:08:45.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.341+0000 7f14c04de700 1 -- 192.168.123.103:0/233292016 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f14a4077710 msgr2=0x7f14a4079bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:45.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.341+0000 7f14c04de700 1 --2- 192.168.123.103:0/233292016 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f14a4077710 0x7f14a4079bd0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f14b80ff360 tx=0x7f14b4009500 comp rx=0 tx=0).stop
2026-03-09T00:08:45.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.341+0000 7f14c04de700 1 -- 192.168.123.103:0/233292016 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14b8103340 msgr2=0x7f14b819d930 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:45.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.341+0000 7f14c04de700 1 --2- 192.168.123.103:0/233292016 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14b8103340 0x7f14b819d930 secure :-1 s=READY pgs=90
cs=0 l=1 rev1=1 crypto rx=0x7f14ac009a60 tx=0x7f14ac00f690 comp rx=0 tx=0).stop 2026-03-09T00:08:45.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.342+0000 7f14c04de700 1 -- 192.168.123.103:0/233292016 shutdown_connections 2026-03-09T00:08:45.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.342+0000 7f14c04de700 1 --2- 192.168.123.103:0/233292016 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f14a4077710 0x7f14a4079bd0 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.342+0000 7f14c04de700 1 --2- 192.168.123.103:0/233292016 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f14b8103340 0x7f14b819d930 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.342+0000 7f14c04de700 1 --2- 192.168.123.103:0/233292016 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f14b8103cf0 0x7f14b8078300 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.342 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.342+0000 7f14c04de700 1 -- 192.168.123.103:0/233292016 >> 192.168.123.103:0/233292016 conn(0x7f14b80feb90 msgr2=0x7f14b8100f70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:08:45.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.342+0000 7f14c04de700 1 -- 192.168.123.103:0/233292016 shutdown_connections 2026-03-09T00:08:45.343 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.342+0000 7f14c04de700 1 -- 192.168.123.103:0/233292016 wait complete. 
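
[editor's note] The fs dump above shows the volume still in its pre-MDS-upgrade shape: max_mds 1, a single active rank (up {0=14480}, mds.cephfs.vm03.sejksk), inline_data enabled, and three standbys. The same check can be made machine-readable; in the sketch below the JSON field names (filesystems, mdsmap, in, standbys) are assumptions inferred from the text dump, not verified against this build.

import json
import subprocess

# Illustrative sketch: assert the fsmap shape seen in the dump above.
dump = json.loads(subprocess.check_output(
    ["ceph", "fs", "dump", "--format", "json"]))
mdsmap = dump["filesystems"][0]["mdsmap"]   # assumed field names
assert mdsmap["max_mds"] == 1
assert len(mdsmap["in"]) == 1               # one active rank, "up {0=14480}"
assert len(dump["standbys"]) == 3           # ralade, ixduim, vlrwtl above
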
2026-03-09T00:08:45.344 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 11 2026-03-09T00:08:45.423 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.422+0000 7f0026c96700 1 -- 192.168.123.103:0/1454918429 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00201033c0 msgr2=0x7f00201037a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:08:45.423 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.422+0000 7f0026c96700 1 --2- 192.168.123.103:0/1454918429 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00201033c0 0x7f00201037a0 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f001c009b00 tx=0x7f001c009e10 comp rx=0 tx=0).stop 2026-03-09T00:08:45.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.425+0000 7f0026c96700 1 -- 192.168.123.103:0/1454918429 shutdown_connections 2026-03-09T00:08:45.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.425+0000 7f0026c96700 1 --2- 192.168.123.103:0/1454918429 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0020103d70 0x7f0020107dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.425+0000 7f0026c96700 1 --2- 192.168.123.103:0/1454918429 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00201033c0 0x7f00201037a0 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.425+0000 7f0026c96700 1 -- 192.168.123.103:0/1454918429 >> 192.168.123.103:0/1454918429 conn(0x7f00200fec30 msgr2=0x7f0020101050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:08:45.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.426+0000 7f0026c96700 1 -- 192.168.123.103:0/1454918429 shutdown_connections 2026-03-09T00:08:45.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.426+0000 7f0026c96700 1 -- 192.168.123.103:0/1454918429 wait complete. 
2026-03-09T00:08:45.429 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.428+0000 7f0026c96700 1 Processor -- start 2026-03-09T00:08:45.429 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.428+0000 7f0026c96700 1 -- start start 2026-03-09T00:08:45.429 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.428+0000 7f0026c96700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00201033c0 0x7f00200752a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:45.429 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.428+0000 7f0026c96700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0020103d70 0x7f00200757e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:45.429 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.428+0000 7f0026c96700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0020079430 con 0x7f00201033c0 2026-03-09T00:08:45.429 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.428+0000 7f0026c96700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0020075d20 con 0x7f0020103d70 2026-03-09T00:08:45.431 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.431+0000 7f0017fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0020103d70 0x7f00200757e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:08:45.431 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.431+0000 7f0024a32700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00201033c0 0x7f00200752a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:08:45.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.431+0000 7f0024a32700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00201033c0 0x7f00200752a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41484/0 (socket says 192.168.123.103:41484) 2026-03-09T00:08:45.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.431+0000 7f0024a32700 1 -- 192.168.123.103:0/1671611240 learned_addr learned my addr 192.168.123.103:0/1671611240 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:08:45.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.431+0000 7f0024a32700 1 -- 192.168.123.103:0/1671611240 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0020103d70 msgr2=0x7f00200757e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:08:45.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.431+0000 7f0024a32700 1 --2- 192.168.123.103:0/1671611240 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0020103d70 0x7f00200757e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.431+0000 7f0024a32700 1 -- 192.168.123.103:0/1671611240 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f001c0097e0 con 0x7f00201033c0 
2026-03-09T00:08:45.432 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.432+0000 7f0024a32700 1 --2- 192.168.123.103:0/1671611240 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00201033c0 0x7f00200752a0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f001c006010 tx=0x7f001c004a00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:08:45.433 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.432+0000 7f0015ffb700 1 -- 192.168.123.103:0/1671611240 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f001c01d070 con 0x7f00201033c0 2026-03-09T00:08:45.434 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.433+0000 7f0026c96700 1 -- 192.168.123.103:0/1671611240 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0020075fa0 con 0x7f00201033c0 2026-03-09T00:08:45.434 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.433+0000 7f0026c96700 1 -- 192.168.123.103:0/1671611240 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0020071a70 con 0x7f00201033c0 2026-03-09T00:08:45.435 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.435+0000 7f0026c96700 1 -- 192.168.123.103:0/1671611240 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f002004ea90 con 0x7f00201033c0 2026-03-09T00:08:45.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.438+0000 7f0015ffb700 1 -- 192.168.123.103:0/1671611240 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f001c00bc50 con 0x7f00201033c0 2026-03-09T00:08:45.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.439+0000 7f0015ffb700 1 -- 192.168.123.103:0/1671611240 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f001c021860 con 0x7f00201033c0 2026-03-09T00:08:45.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.439+0000 7f0015ffb700 1 -- 192.168.123.103:0/1671611240 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f001c02b430 con 0x7f00201033c0 2026-03-09T00:08:45.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.439+0000 7f0015ffb700 1 --2- 192.168.123.103:0/1671611240 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0008077a60 0x7f0008079f20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:45.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.440+0000 7f0017fff700 1 --2- 192.168.123.103:0/1671611240 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0008077a60 0x7f0008079f20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:08:45.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.440+0000 7f0015ffb700 1 -- 192.168.123.103:0/1671611240 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(74..74 src has 1..74) v4 ==== 6252+0+0 (secure 0 0 0) 0x7f001c09c120 con 0x7f00201033c0 2026-03-09T00:08:45.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.440+0000 7f0017fff700 1 --2- 192.168.123.103:0/1671611240 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] 
conn(0x7f0008077a60 0x7f0008079f20 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f0010006fd0 tx=0x7f0010008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:08:45.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.441+0000 7f0015ffb700 1 -- 192.168.123.103:0/1671611240 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f001c09e660 con 0x7f00201033c0
2026-03-09T00:08:45.585 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.585+0000 7f0026c96700 1 -- 192.168.123.103:0/1671611240 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f0020100f90 con 0x7f0008077a60
2026-03-09T00:08:45.587 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.586+0000 7f0015ffb700 1 -- 192.168.123.103:0/1671611240 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f0020100f90 con 0x7f0008077a60
2026-03-09T00:08:45.587 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:08:45.587 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T00:08:45.587 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true,
2026-03-09T00:08:45.587 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-09T00:08:45.587 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [
2026-03-09T00:08:45.587 INFO:teuthology.orchestra.run.vm03.stdout: "crash",
2026-03-09T00:08:45.587 INFO:teuthology.orchestra.run.vm03.stdout: "mon",
2026-03-09T00:08:45.587 INFO:teuthology.orchestra.run.vm03.stdout: "mgr"
2026-03-09T00:08:45.587 INFO:teuthology.orchestra.run.vm03.stdout: ],
2026-03-09T00:08:45.587 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "7/23 daemons upgraded",
2026-03-09T00:08:45.587 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons",
2026-03-09T00:08:45.587 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false
2026-03-09T00:08:45.587 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:08:45.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.589+0000 7f0026c96700 1 -- 192.168.123.103:0/1671611240 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0008077a60 msgr2=0x7f0008079f20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:45.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.589+0000 7f0026c96700 1 --2- 192.168.123.103:0/1671611240 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0008077a60 0x7f0008079f20 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f0010006fd0 tx=0x7f0010008040 comp rx=0 tx=0).stop
2026-03-09T00:08:45.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.589+0000 7f0026c96700 1 -- 192.168.123.103:0/1671611240 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00201033c0 msgr2=0x7f00200752a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:45.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.589+0000 7f0026c96700 1 --2- 192.168.123.103:0/1671611240 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0]
conn(0x7f00201033c0 0x7f00200752a0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f001c006010 tx=0x7f001c004a00 comp rx=0 tx=0).stop 2026-03-09T00:08:45.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.589+0000 7f0026c96700 1 -- 192.168.123.103:0/1671611240 shutdown_connections 2026-03-09T00:08:45.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.589+0000 7f0026c96700 1 --2- 192.168.123.103:0/1671611240 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0008077a60 0x7f0008079f20 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.589+0000 7f0026c96700 1 --2- 192.168.123.103:0/1671611240 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00201033c0 0x7f00200752a0 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.589+0000 7f0026c96700 1 --2- 192.168.123.103:0/1671611240 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0020103d70 0x7f00200757e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.589+0000 7f0026c96700 1 -- 192.168.123.103:0/1671611240 >> 192.168.123.103:0/1671611240 conn(0x7f00200fec30 msgr2=0x7f0020100270 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:08:45.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.589+0000 7f0026c96700 1 -- 192.168.123.103:0/1671611240 shutdown_connections 2026-03-09T00:08:45.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.590+0000 7f0026c96700 1 -- 192.168.123.103:0/1671611240 wait complete. 
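
[editor's note] The orch upgrade status reply above is already JSON, so it can be polled and parsed directly: in_progress flips to false when cephadm finishes, and services_complete grows as daemon types finish (crash, mon and mgr are done here, with 7/23 daemons upgraded and OSDs in flight). A minimal polling sketch, with the 30 s interval an arbitrary choice:

import json
import subprocess
import time

while True:
    status = json.loads(subprocess.check_output(
        ["ceph", "orch", "upgrade", "status"]))  # JSON output, as shown above
    print(status.get("progress"), "-", status.get("message"))
    if not status["in_progress"]:
        break
    time.sleep(30)
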
2026-03-09T00:08:45.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.662+0000 7f1653e4d700 1 -- 192.168.123.103:0/3442822313 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f164c1032e0 msgr2=0x7f164c1036c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:08:45.663 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.662+0000 7f1653e4d700 1 --2- 192.168.123.103:0/3442822313 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f164c1032e0 0x7f164c1036c0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f1640009b00 tx=0x7f1640009e10 comp rx=0 tx=0).stop 2026-03-09T00:08:45.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.663+0000 7f1653e4d700 1 -- 192.168.123.103:0/3442822313 shutdown_connections 2026-03-09T00:08:45.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.663+0000 7f1653e4d700 1 --2- 192.168.123.103:0/3442822313 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f164c103c90 0x7f164c107ce0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.663+0000 7f1653e4d700 1 --2- 192.168.123.103:0/3442822313 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f164c1032e0 0x7f164c1036c0 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.663+0000 7f1653e4d700 1 -- 192.168.123.103:0/3442822313 >> 192.168.123.103:0/3442822313 conn(0x7f164c0feb50 msgr2=0x7f164c100f70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:08:45.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.663+0000 7f1653e4d700 1 -- 192.168.123.103:0/3442822313 shutdown_connections 2026-03-09T00:08:45.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.663+0000 7f1653e4d700 1 -- 192.168.123.103:0/3442822313 wait complete. 
2026-03-09T00:08:45.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.664+0000 7f1653e4d700 1 Processor -- start 2026-03-09T00:08:45.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.664+0000 7f1653e4d700 1 -- start start 2026-03-09T00:08:45.664 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.664+0000 7f1653e4d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f164c1032e0 0x7f164c198dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:45.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.664+0000 7f1653e4d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f164c103c90 0x7f164c199310 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:45.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.664+0000 7f1653e4d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f164c1999f0 con 0x7f164c1032e0 2026-03-09T00:08:45.665 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.664+0000 7f1653e4d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f164c19d780 con 0x7f164c103c90 2026-03-09T00:08:45.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.664+0000 7f1651be9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f164c1032e0 0x7f164c198dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:08:45.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.664+0000 7f1651be9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f164c1032e0 0x7f164c198dd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:41504/0 (socket says 192.168.123.103:41504) 2026-03-09T00:08:45.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.664+0000 7f1651be9700 1 -- 192.168.123.103:0/3449880969 learned_addr learned my addr 192.168.123.103:0/3449880969 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:08:45.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.665+0000 7f1651be9700 1 -- 192.168.123.103:0/3449880969 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f164c103c90 msgr2=0x7f164c199310 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T00:08:45.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.665+0000 7f1651be9700 1 --2- 192.168.123.103:0/3449880969 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f164c103c90 0x7f164c199310 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.665+0000 7f1651be9700 1 -- 192.168.123.103:0/3449880969 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f16400097e0 con 0x7f164c1032e0 2026-03-09T00:08:45.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.665+0000 7f1651be9700 1 --2- 192.168.123.103:0/3449880969 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f164c1032e0 0x7f164c198dd0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f1640005330 tx=0x7f16400049e0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:08:45.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.665+0000 7f163effd700 1 -- 192.168.123.103:0/3449880969 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f164000ba40 con 0x7f164c1032e0 2026-03-09T00:08:45.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.665+0000 7f163effd700 1 -- 192.168.123.103:0/3449880969 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f1640027070 con 0x7f164c1032e0 2026-03-09T00:08:45.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.665+0000 7f1653e4d700 1 -- 192.168.123.103:0/3449880969 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f164c19da00 con 0x7f164c1032e0 2026-03-09T00:08:45.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.665+0000 7f1653e4d700 1 -- 192.168.123.103:0/3449880969 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f164c19def0 con 0x7f164c1032e0 2026-03-09T00:08:45.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.667+0000 7f163effd700 1 -- 192.168.123.103:0/3449880969 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f164000f9a0 con 0x7f164c1032e0 2026-03-09T00:08:45.668 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.667+0000 7f163effd700 1 -- 192.168.123.103:0/3449880969 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1640029030 con 0x7f164c1032e0 2026-03-09T00:08:45.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.668+0000 7f163effd700 1 --2- 192.168.123.103:0/3449880969 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1638077ab0 0x7f1638079f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:08:45.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.668+0000 7f163effd700 1 -- 192.168.123.103:0/3449880969 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(74..74 src has 1..74) v4 ==== 6252+0+0 (secure 0 0 0) 0x7f164009c090 con 0x7f164c1032e0 2026-03-09T00:08:45.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.669+0000 7f16513e8700 1 --2- 192.168.123.103:0/3449880969 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1638077ab0 0x7f1638079f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:08:45.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.669+0000 7f1653e4d700 1 -- 192.168.123.103:0/3449880969 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f164c10b630 con 0x7f164c1032e0 2026-03-09T00:08:45.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.670+0000 7f16513e8700 1 --2- 192.168.123.103:0/3449880969 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1638077ab0 0x7f1638079f70 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f164c19a3f0 tx=0x7f1648005c10 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:08:45.677 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.673+0000 7f163effd700 1 -- 192.168.123.103:0/3449880969 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f16400648d0 con 0x7f164c1032e0
2026-03-09T00:08:45.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.850+0000 7f1653e4d700 1 -- 192.168.123.103:0/3449880969 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f164c066eb0 con 0x7f164c1032e0
2026-03-09T00:08:45.852 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.851+0000 7f163effd700 1 -- 192.168.123.103:0/3449880969 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+900 (secure 0 0 0) 0x7f1640064020 con 0x7f164c1032e0
2026-03-09T00:08:45.852 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data; Degraded data redundancy: 296/231 objects degraded (128.139%), 4 pgs degraded, 4 pgs undersized
2026-03-09T00:08:45.852 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data
2026-03-09T00:08:45.852 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled.
2026-03-09T00:08:45.852 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 296/231 objects degraded (128.139%), 4 pgs degraded, 4 pgs undersized
2026-03-09T00:08:45.852 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1 is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,4]
2026-03-09T00:08:45.852 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.6 is stuck undersized for 2m, current state active+recovering+undersized+degraded+remapped, last acting [1,4]
2026-03-09T00:08:45.852 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1b is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [3,4]
2026-03-09T00:08:45.852 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1f is stuck undersized for 2m, current state active+recovery_wait+undersized+degraded+remapped, last acting [2,3]
2026-03-09T00:08:45.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.854+0000 7f1653e4d700 1 -- 192.168.123.103:0/3449880969 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1638077ab0 msgr2=0x7f1638079f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:45.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.855+0000 7f1653e4d700 1 --2- 192.168.123.103:0/3449880969 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1638077ab0 0x7f1638079f70 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f164c19a3f0 tx=0x7f1648005c10 comp rx=0 tx=0).stop
2026-03-09T00:08:45.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.855+0000 7f1653e4d700 1 -- 192.168.123.103:0/3449880969 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f164c1032e0 msgr2=0x7f164c198dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:08:45.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.855+0000 7f1653e4d700 1 --2- 192.168.123.103:0/3449880969 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f164c1032e0 0x7f164c198dd0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f1640005330 tx=0x7f16400049e0 comp rx=0 tx=0).stop
2026-03-09T00:08:45.855
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.855+0000 7f1653e4d700 1 -- 192.168.123.103:0/3449880969 shutdown_connections 2026-03-09T00:08:45.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.855+0000 7f1653e4d700 1 --2- 192.168.123.103:0/3449880969 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1638077ab0 0x7f1638079f70 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.855+0000 7f1653e4d700 1 --2- 192.168.123.103:0/3449880969 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f164c1032e0 0x7f164c198dd0 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.855+0000 7f1653e4d700 1 --2- 192.168.123.103:0/3449880969 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f164c103c90 0x7f164c199310 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:08:45.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.855+0000 7f1653e4d700 1 -- 192.168.123.103:0/3449880969 >> 192.168.123.103:0/3449880969 conn(0x7f164c0feb50 msgr2=0x7f164c100200 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:08:45.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.855+0000 7f1653e4d700 1 -- 192.168.123.103:0/3449880969 shutdown_connections 2026-03-09T00:08:45.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:08:45.856+0000 7f1653e4d700 1 -- 192.168.123.103:0/3449880969 wait complete. 2026-03-09T00:08:46.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:45 vm03.local ceph-mon[129670]: from='client.34256 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:08:46.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:45 vm03.local ceph-mon[129670]: from='client.34260 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:08:46.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:45 vm03.local ceph-mon[129670]: from='client.34264 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:08:46.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:45 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/463211056' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:08:46.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:45 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/233292016' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T00:08:46.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:45 vm03.local ceph-mon[129670]: from='client.? 
192.168.123.103:0/3449880969' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T00:08:46.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:45 vm06.local ceph-mon[106218]: from='client.34256 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:08:46.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:45 vm06.local ceph-mon[106218]: from='client.34260 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:08:46.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:45 vm06.local ceph-mon[106218]: from='client.34264 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:08:46.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:45 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/463211056' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:08:46.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:45 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/233292016' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T00:08:46.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:45 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/3449880969' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T00:08:47.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:46 vm03.local ceph-mon[129670]: from='client.34276 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:08:47.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:46 vm03.local ceph-mon[129670]: pgmap v124: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 296/231 objects degraded (128.139%); 0 B/s, 11 objects/s recovering 2026-03-09T00:08:47.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:46 vm06.local ceph-mon[106218]: from='client.34276 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:08:47.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:46 vm06.local ceph-mon[106218]: pgmap v124: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 296/231 objects degraded (128.139%); 0 B/s, 11 objects/s recovering 2026-03-09T00:08:49.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:48 vm03.local ceph-mon[129670]: pgmap v125: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 296/231 objects degraded (128.139%); 0 B/s, 9 objects/s recovering 2026-03-09T00:08:49.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:48 vm03.local ceph-mon[129670]: osdmap e75: 6 total, 6 up, 6 in 2026-03-09T00:08:49.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:48 vm06.local ceph-mon[106218]: pgmap v125: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 296/231 objects degraded (128.139%); 0 B/s, 9 objects/s recovering 
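
[editor's note] Both health warnings in the detail output above are tolerated by this run: FS_INLINE_DATA_DEPRECATED because the filesystem has inline_data enabled (as the fs dump showed), and PG_DEGRADED because PGs are expected to be undersized while OSDs restart. Programmatically that tolerance is an allowlist match over the structured health output; the sketch below assumes the usual "checks" mapping of "ceph health detail --format json".

import json
import re
import subprocess

# Warnings this run tolerates; anything else would be a real failure.
ALLOWED = [r"FS_INLINE_DATA_DEPRECATED", r"PG_DEGRADED"]

health = json.loads(subprocess.check_output(
    ["ceph", "health", "detail", "--format", "json"]))
unexpected = [check for check in health.get("checks", {})
              if not any(re.search(pat, check) for pat in ALLOWED)]
if unexpected:
    raise RuntimeError(f"unexpected health checks: {unexpected}")
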
2026-03-09T00:08:49.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:48 vm06.local ceph-mon[106218]: osdmap e75: 6 total, 6 up, 6 in
2026-03-09T00:08:50.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:49 vm03.local ceph-mon[129670]: osdmap e76: 6 total, 6 up, 6 in
2026-03-09T00:08:50.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:49 vm03.local ceph-mon[129670]: pgmap v128: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 296/231 objects degraded (128.139%); 0 B/s, 5 objects/s recovering
2026-03-09T00:08:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:49 vm06.local ceph-mon[106218]: osdmap e76: 6 total, 6 up, 6 in
2026-03-09T00:08:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:49 vm06.local ceph-mon[106218]: pgmap v128: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 3 active+recovery_wait+undersized+degraded+remapped, 61 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 296/231 objects degraded (128.139%); 0 B/s, 5 objects/s recovering
2026-03-09T00:08:51.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:51 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:08:52.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:51 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:08:52.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:52 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:08:52.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:52 vm06.local ceph-mon[106218]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline)
2026-03-09T00:08:52.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:52 vm06.local ceph-mon[106218]: pgmap v129: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 196/231 objects degraded (84.848%); 0 B/s, 11 objects/s recovering
2026-03-09T00:08:52.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:52 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 196/231 objects degraded (84.848%), 3 pgs degraded, 3 pgs undersized (PG_DEGRADED)
2026-03-09T00:08:53.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:52 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:08:53.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:52 vm03.local ceph-mon[129670]: Upgrade: unsafe to stop osd(s) at this time (1 PGs are or would become offline)
2026-03-09T00:08:53.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:52 vm03.local ceph-mon[129670]: pgmap v129: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 196/231 objects degraded (84.848%); 0 B/s, 11 objects/s recovering
2026-03-09T00:08:53.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:52 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 196/231 objects degraded (84.848%), 3 pgs degraded, 3 pgs undersized (PG_DEGRADED)
2026-03-09T00:08:54.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:54 vm06.local ceph-mon[106218]: pgmap v130: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 196/231 objects degraded (84.848%); 0 B/s, 5 objects/s recovering
2026-03-09T00:08:55.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:54 vm03.local ceph-mon[129670]: pgmap v130: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 196/231 objects degraded (84.848%); 0 B/s, 5 objects/s recovering
2026-03-09T00:08:56.947 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:56 vm03.local ceph-mon[129670]: pgmap v131: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 196/231 objects degraded (84.848%); 0 B/s, 11 objects/s recovering
2026-03-09T00:08:57.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:56 vm06.local ceph-mon[106218]: pgmap v131: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 196/231 objects degraded (84.848%); 0 B/s, 11 objects/s recovering
2026-03-09T00:08:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:58 vm03.local ceph-mon[129670]: pgmap v132: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 196/231 objects degraded (84.848%); 0 B/s, 9 objects/s recovering
2026-03-09T00:08:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:58 vm03.local ceph-mon[129670]: osdmap e77: 6 total, 6 up, 6 in
2026-03-09T00:08:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:58 vm06.local ceph-mon[106218]: pgmap v132: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 196/231 objects degraded (84.848%); 0 B/s, 9 objects/s recovering
2026-03-09T00:08:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:58 vm06.local ceph-mon[106218]: osdmap e77: 6 total, 6 up, 6 in
2026-03-09T00:09:00.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:59 vm03.local ceph-mon[129670]: osdmap e78: 6 total, 6 up, 6 in
2026-03-09T00:09:00.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:08:59 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:09:00.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:59 vm06.local ceph-mon[106218]: osdmap e78: 6 total, 6 up, 6 in
2026-03-09T00:09:00.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:08:59 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:09:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:00 vm03.local ceph-mon[129670]: pgmap v135: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 196/231 objects degraded (84.848%); 0 B/s, 5 objects/s recovering
2026-03-09T00:09:01.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:00 vm06.local ceph-mon[106218]: pgmap v135: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 2 active+recovery_wait+undersized+degraded+remapped, 62 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 196/231 objects degraded (84.848%); 0 B/s, 5 objects/s recovering
2026-03-09T00:09:02.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:01 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 96/231 objects degraded (41.558%), 2 pgs degraded, 2 pgs undersized (PG_DEGRADED)
2026-03-09T00:09:02.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:01 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 96/231 objects degraded (41.558%), 2 pgs degraded, 2 pgs undersized (PG_DEGRADED)
2026-03-09T00:09:03.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:02 vm03.local ceph-mon[129670]: pgmap v136: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 96/231 objects degraded (41.558%); 0 B/s, 11 objects/s recovering
2026-03-09T00:09:03.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:02 vm06.local ceph-mon[106218]: pgmap v136: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 96/231 objects degraded (41.558%); 0 B/s, 11 objects/s recovering
2026-03-09T00:09:05.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:04 vm03.local ceph-mon[129670]: pgmap v137: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 96/231 objects degraded (41.558%); 0 B/s, 5 objects/s recovering
2026-03-09T00:09:05.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:04 vm06.local ceph-mon[106218]: pgmap v137: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 96/231 objects degraded (41.558%); 0 B/s, 5 objects/s recovering
2026-03-09T00:09:07.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:06 vm03.local ceph-mon[129670]: pgmap v138: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 96/231 objects degraded (41.558%); 0 B/s, 11 objects/s recovering
2026-03-09T00:09:07.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:06 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:09:07.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:06 vm06.local ceph-mon[106218]: pgmap v138: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 96/231 objects degraded (41.558%); 0 B/s, 11 objects/s recovering
2026-03-09T00:09:07.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:06 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:09:08.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:07 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:09:08.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:07 vm03.local ceph-mon[129670]: Upgrade: osd.1 is safe to restart
2026-03-09T00:09:08.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:07 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:08.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:07 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
2026-03-09T00:09:08.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:07 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:09:08.089 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:07 vm03.local systemd[1]: Stopping Ceph osd.1 for ae8f0172-1b4a-11f1-916a-712b2ac006b7...
2026-03-09T00:09:08.089 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:07 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1[76366]: 2026-03-09T00:09:07.812+0000 7fb677e71700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0
2026-03-09T00:09:08.089 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:07 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1[76366]: 2026-03-09T00:09:07.812+0000 7fb677e71700 -1 osd.1 78 *** Got signal Terminated ***
2026-03-09T00:09:08.089 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:07 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1[76366]: 2026-03-09T00:09:07.812+0000 7fb677e71700 -1 osd.1 78 *** Immediate shutdown (osd_fast_shutdown=true) ***
2026-03-09T00:09:08.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:07 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-09T00:09:08.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:07 vm06.local ceph-mon[106218]: Upgrade: osd.1 is safe to restart
2026-03-09T00:09:08.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:07 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:08.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:07 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
2026-03-09T00:09:08.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:07 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:09:09.078 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:08 vm03.local ceph-mon[129670]: Upgrade: Updating osd.1
2026-03-09T00:09:09.078 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:08 vm03.local ceph-mon[129670]: Deploying daemon osd.1 on vm03
2026-03-09T00:09:09.078 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:08 vm03.local ceph-mon[129670]: pgmap v139: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 96/231 objects degraded (41.558%); 0 B/s, 9 objects/s recovering
2026-03-09T00:09:09.078 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:08 vm03.local ceph-mon[129670]: osd.1 marked itself down and dead
2026-03-09T00:09:09.078 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:08 vm03.local podman[144641]: 2026-03-09 00:09:08.870310677 +0000 UTC m=+1.074667624 container died 7bc72987552108f39e304de8a0e4f50d2d7dac0a3a994ee1bfa1fa2c683bc7c2 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1, io.buildah.version=1.29.1, org.label-schema.build-date=20240222, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.1, GIT_BRANCH=HEAD, ceph=True, org.label-schema.vendor=CentOS, GIT_CLEAN=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, GIT_REPO=https://github.com/ceph/ceph-container.git)
2026-03-09T00:09:09.078 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:08 vm03.local podman[144641]: 2026-03-09 00:09:08.898118088 +0000 UTC m=+1.102475035 container remove 7bc72987552108f39e304de8a0e4f50d2d7dac0a3a994ee1bfa1fa2c683bc7c2 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , org.label-schema.build-date=20240222, org.label-schema.license=GPLv2, GIT_CLEAN=True, org.label-schema.vendor=CentOS, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, ceph=True, org.label-schema.name=CentOS Stream 8 Base Image, CEPH_POINT_RELEASE=-18.2.1, GIT_BRANCH=HEAD, RELEASE=HEAD, io.buildah.version=1.29.1, org.label-schema.schema-version=1.0)
2026-03-09T00:09:09.078 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:08 vm03.local bash[144641]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1
2026-03-09T00:09:09.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:08 vm06.local ceph-mon[106218]: Upgrade: Updating osd.1
2026-03-09T00:09:09.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:08 vm06.local ceph-mon[106218]: Deploying daemon osd.1 on vm03
2026-03-09T00:09:09.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:08 vm06.local ceph-mon[106218]: pgmap v139: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 63 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 96/231 objects degraded (41.558%); 0 B/s, 9 objects/s recovering
2026-03-09T00:09:09.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:08 vm06.local ceph-mon[106218]: osd.1 marked itself down and dead
2026-03-09T00:09:09.332 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local podman[144708]: 2026-03-09 00:09:09.078280285 +0000 UTC m=+0.022101643 container create 9fb517d220792ff430eb3f910a167a627413b9ea6d53c38e1ec0a0733025443d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df)
2026-03-09T00:09:09.332 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local podman[144708]: 2026-03-09 00:09:09.125082514 +0000 UTC m=+0.068903883 container init 9fb517d220792ff430eb3f910a167a627413b9ea6d53c38e1ec0a0733025443d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-deactivate, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image)
2026-03-09T00:09:09.332 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local podman[144708]: 2026-03-09 00:09:09.138509823 +0000 UTC m=+0.082331181 container start 9fb517d220792ff430eb3f910a167a627413b9ea6d53c38e1ec0a0733025443d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-deactivate, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
2026-03-09T00:09:09.332 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local podman[144708]: 2026-03-09 00:09:09.143992503 +0000 UTC m=+0.087813861 container attach 9fb517d220792ff430eb3f910a167a627413b9ea6d53c38e1ec0a0733025443d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, ceph=True, OSD_FLAVOR=default)
2026-03-09T00:09:09.332 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local podman[144708]: 2026-03-09 00:09:09.070525753 +0000 UTC m=+0.014347111 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-09T00:09:09.332 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local podman[144708]: 2026-03-09 00:09:09.302494625 +0000 UTC m=+0.246315983 container died 9fb517d220792ff430eb3f910a167a627413b9ea6d53c38e1ec0a0733025443d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
2026-03-09T00:09:09.332 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local podman[144708]: 2026-03-09 00:09:09.332217409 +0000 UTC m=+0.276038757 container remove 9fb517d220792ff430eb3f910a167a627413b9ea6d53c38e1ec0a0733025443d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-deactivate, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
2026-03-09T00:09:09.588 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.1.service: Deactivated successfully.
2026-03-09T00:09:09.588 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.1.service: Unit process 144719 (conmon) remains running after unit stopped.
2026-03-09T00:09:09.588 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.1.service: Unit process 144728 (podman) remains running after unit stopped.
2026-03-09T00:09:09.588 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local systemd[1]: Stopped Ceph osd.1 for ae8f0172-1b4a-11f1-916a-712b2ac006b7.
2026-03-09T00:09:09.588 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.1.service: Consumed 46.358s CPU time, 709.5M memory peak.
2026-03-09T00:09:09.850 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:09 vm03.local ceph-mon[129670]: Health check failed: 1 osds down (OSD_DOWN)
2026-03-09T00:09:09.850 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:09 vm03.local ceph-mon[129670]: osdmap e79: 6 total, 5 up, 6 in
2026-03-09T00:09:09.850 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local systemd[1]: Starting Ceph osd.1 for ae8f0172-1b4a-11f1-916a-712b2ac006b7...
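The exchange above is cephadm's per-OSD safety gate during a staggered upgrade: the mgr repeatedly dispatches `osd ok-to-stop` for osd.1, the mons answer "unsafe" while any PG would go offline, and only once recovery has caught up do they answer "osd.1 is safe to restart", after which the daemon is stopped (immediately, per `osd_fast_shutdown=true`). A minimal sketch of running the same gate by hand, assuming an admin keyring on the host; the `--max 16` batching argument mirrors the `"max": 16` field in the dispatched command:

    # Exits 0 only if stopping osd.1 (plus any extra OSDs batched with it,
    # up to 16) would leave no PG offline; otherwise exits non-zero and
    # prints the reason, like the "unsafe to stop osd(s)" line above.
    ceph osd ok-to-stop 1 --max 16
    # Wait for the gate to open, roughly as the upgrade loop does:
    until ceph osd ok-to-stop 1; do sleep 5; done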
2026-03-09T00:09:09.850 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local podman[144820]: 2026-03-09 00:09:09.743101093 +0000 UTC m=+0.024269961 container create 9620ec624856651031acdb2c611e9c979788bceee4a8a1c63d5c12fa5a0f5dcb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
2026-03-09T00:09:09.850 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local podman[144820]: 2026-03-09 00:09:09.825859013 +0000 UTC m=+0.107027891 container init 9620ec624856651031acdb2c611e9c979788bceee4a8a1c63d5c12fa5a0f5dcb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default)
2026-03-09T00:09:09.850 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local podman[144820]: 2026-03-09 00:09:09.73228413 +0000 UTC m=+0.013453008 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-09T00:09:10.164 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local podman[144820]: 2026-03-09 00:09:09.850573444 +0000 UTC m=+0.131742312 container start 9620ec624856651031acdb2c611e9c979788bceee4a8a1c63d5c12fa5a0f5dcb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
2026-03-09T00:09:10.164 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local podman[144820]: 2026-03-09 00:09:09.859971212 +0000 UTC m=+0.141140080 container attach 9620ec624856651031acdb2c611e9c979788bceee4a8a1c63d5c12fa5a0f5dcb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True)
2026-03-09T00:09:10.164 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate[144832]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:09:10.164 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local bash[144820]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:09:10.164 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate[144832]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:09:10.164 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:09 vm03.local bash[144820]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:09:10.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:09 vm06.local ceph-mon[106218]: Health check failed: 1 osds down (OSD_DOWN)
2026-03-09T00:09:10.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:09 vm06.local ceph-mon[106218]: osdmap e79: 6 total, 5 up, 6 in
2026-03-09T00:09:10.803 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate[144832]: --> Failed to activate via raw: did not find any matching OSD to activate
2026-03-09T00:09:10.803 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate[144832]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:09:10.803 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local bash[144820]: --> Failed to activate via raw: did not find any matching OSD to activate
2026-03-09T00:09:10.803 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local bash[144820]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:09:10.803 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate[144832]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:09:10.803 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local bash[144820]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:09:10.803 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate[144832]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
2026-03-09T00:09:10.803 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local bash[144820]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
2026-03-09T00:09:10.803 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate[144832]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-116f56af-32f6-413a-91fb-d831183de3b0/osd-block-6a8d6e5f-c441-499a-a4bb-8d9bc046a85f --path /var/lib/ceph/osd/ceph-1 --no-mon-config
2026-03-09T00:09:10.803 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local bash[144820]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-116f56af-32f6-413a-91fb-d831183de3b0/osd-block-6a8d6e5f-c441-499a-a4bb-8d9bc046a85f --path /var/lib/ceph/osd/ceph-1 --no-mon-config
2026-03-09T00:09:11.088 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate[144832]: Running command: /usr/bin/ln -snf /dev/ceph-116f56af-32f6-413a-91fb-d831183de3b0/osd-block-6a8d6e5f-c441-499a-a4bb-8d9bc046a85f /var/lib/ceph/osd/ceph-1/block
2026-03-09T00:09:11.089 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local bash[144820]: Running command: /usr/bin/ln -snf /dev/ceph-116f56af-32f6-413a-91fb-d831183de3b0/osd-block-6a8d6e5f-c441-499a-a4bb-8d9bc046a85f /var/lib/ceph/osd/ceph-1/block
2026-03-09T00:09:11.089 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate[144832]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
2026-03-09T00:09:11.089 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local bash[144820]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
2026-03-09T00:09:11.089 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate[144832]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
2026-03-09T00:09:11.089 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local bash[144820]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
2026-03-09T00:09:11.089 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate[144832]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
2026-03-09T00:09:11.089 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local bash[144820]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
2026-03-09T00:09:11.089 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate[144832]: --> ceph-volume lvm activate successful for osd ID: 1
2026-03-09T00:09:11.089 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local bash[144820]: --> ceph-volume lvm activate successful for osd ID: 1
2026-03-09T00:09:11.089 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local podman[144820]: 2026-03-09 00:09:10.958625595 +0000 UTC m=+1.239794463 container died 9620ec624856651031acdb2c611e9c979788bceee4a8a1c63d5c12fa5a0f5dcb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git)
2026-03-09T00:09:11.089 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:10 vm03.local podman[144820]: 2026-03-09 00:09:10.979674277 +0000 UTC m=+1.260843145 container remove 9620ec624856651031acdb2c611e9c979788bceee4a8a1c63d5c12fa5a0f5dcb (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team )
2026-03-09T00:09:11.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:10 vm03.local ceph-mon[129670]: pgmap v141: 65 pgs: 12 stale+active+clean, 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 51 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 96/231 objects degraded (41.558%); 0 B/s, 9 objects/s recovering
2026-03-09T00:09:11.089 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:10 vm03.local ceph-mon[129670]: osdmap e80: 6 total, 5 up, 6 in
2026-03-09T00:09:11.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:10 vm06.local ceph-mon[106218]: pgmap v141: 65 pgs: 12 stale+active+clean, 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 51 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 96/231 objects degraded (41.558%); 0 B/s, 9 objects/s recovering
2026-03-09T00:09:11.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:10 vm06.local ceph-mon[106218]: osdmap e80: 6 total, 5 up, 6 in
2026-03-09T00:09:11.493 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:11 vm03.local podman[145079]: 2026-03-09 00:09:11.105191008 +0000 UTC m=+0.021671188 container create e70d2f37c6d1d0c84724d13a480b876202cc45fad647f054e72bd753d31225b0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
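The `-osd-1-deactivate` and `-osd-1-activate` containers above are cephadm redeploying the daemon on the new squid image; the "Failed to activate via raw" line is ceph-volume probing raw-mode activation and falling through to the LVM path, which then succeeds ("ceph-volume lvm activate successful for osd ID: 1"), so it is not a fatal error. For reference, a sketch of the same activation done by hand, with the device paths taken verbatim from the log (normally `ceph-volume lvm activate` performs these steps itself inside the container):

    # Rebuild the OSD dir from bluestore metadata, then fix the block symlink
    # and ownership -- the same commands the activate container ran above.
    /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir \
        --dev /dev/ceph-116f56af-32f6-413a-91fb-d831183de3b0/osd-block-6a8d6e5f-c441-499a-a4bb-8d9bc046a85f \
        --path /var/lib/ceph/osd/ceph-1 --no-mon-config
    /usr/bin/ln -snf /dev/ceph-116f56af-32f6-413a-91fb-d831183de3b0/osd-block-6a8d6e5f-c441-499a-a4bb-8d9bc046a85f \
        /var/lib/ceph/osd/ceph-1/block
    /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
    /usr/bin/chown -R ceph:ceph /dev/dm-1 /var/lib/ceph/osd/ceph-1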
2026-03-09T00:09:11.493 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:11 vm03.local podman[145079]: 2026-03-09 00:09:11.144785637 +0000 UTC m=+0.061265829 container init e70d2f37c6d1d0c84724d13a480b876202cc45fad647f054e72bd753d31225b0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True)
2026-03-09T00:09:11.493 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:11 vm03.local podman[145079]: 2026-03-09 00:09:11.15041928 +0000 UTC m=+0.066899471 container start e70d2f37c6d1d0c84724d13a480b876202cc45fad647f054e72bd753d31225b0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
2026-03-09T00:09:11.493 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:11 vm03.local bash[145079]: e70d2f37c6d1d0c84724d13a480b876202cc45fad647f054e72bd753d31225b0
2026-03-09T00:09:11.493 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:11 vm03.local podman[145079]: 2026-03-09 00:09:11.097355894 +0000 UTC m=+0.013836096 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-09T00:09:11.493 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:11 vm03.local systemd[1]: Started Ceph osd.1 for ae8f0172-1b4a-11f1-916a-712b2ac006b7.
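With the unit started, osd.1 is now running from the quay.ceph.io/ceph-ci squid image while the rest of the cluster catches up. One way to confirm the restarted daemon and the per-daemon version spread from any admin host (a sketch; assumes the cephadm orchestrator module is active):

    ceph orch ps --daemon-type osd --refresh   # daemon status and image per OSD
    ceph versions                              # version counts shift as each OSD is redone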
2026-03-09T00:09:11.494 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:11 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1[145089]: 2026-03-09T00:09:11.490+0000 7f87a767e740 -1 Falling back to public interface
2026-03-09T00:09:12.443 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:12 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:12.443 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:12 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:12.443 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:12 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:09:12.443 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:12 vm03.local ceph-mon[129670]: pgmap v143: 65 pgs: 8 activating+undersized, 14 peering, 1 active+recovering+undersized+degraded+remapped, 9 active+undersized, 3 active+undersized+degraded, 30 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 18/231 objects degraded (7.792%)
2026-03-09T00:09:12.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:12 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:12.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:12 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:12.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:12 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:09:12.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:12 vm06.local ceph-mon[106218]: pgmap v143: 65 pgs: 8 activating+undersized, 14 peering, 1 active+recovering+undersized+degraded+remapped, 9 active+undersized, 3 active+undersized+degraded, 30 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 18/231 objects degraded (7.792%)
2026-03-09T00:09:13.493 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:13 vm06.local ceph-mon[106218]: Health check failed: Reduced data availability: 6 pgs peering (PG_AVAILABILITY)
2026-03-09T00:09:13.493 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:13 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 18/231 objects degraded (7.792%), 4 pgs degraded, 1 pg undersized (PG_DEGRADED)
2026-03-09T00:09:13.493 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:13.493 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:13.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:13 vm03.local ceph-mon[129670]: Health check failed: Reduced data availability: 6 pgs peering (PG_AVAILABILITY)
2026-03-09T00:09:13.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:13 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 18/231 objects degraded (7.792%), 4 pgs degraded, 1 pg undersized (PG_DEGRADED)
2026-03-09T00:09:13.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:13.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:14.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:14 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:14.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:14 vm06.local ceph-mon[106218]: pgmap v144: 65 pgs: 8 activating+undersized, 14 peering, 1 active+recovering+undersized+degraded+remapped, 9 active+undersized, 3 active+undersized+degraded, 30 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 18/231 objects degraded (7.792%)
2026-03-09T00:09:14.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:14 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:14.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:14 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:14.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:14 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:09:14.942 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:14 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:14.942 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:14 vm03.local ceph-mon[129670]: pgmap v144: 65 pgs: 8 activating+undersized, 14 peering, 1 active+recovering+undersized+degraded+remapped, 9 active+undersized, 3 active+undersized+degraded, 30 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 18/231 objects degraded (7.792%)
2026-03-09T00:09:14.942 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:14 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:14.942 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:14 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:14.942 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:14 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:09:15.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.949+0000 7f15264a3700 1 -- 192.168.123.103:0/2251544161 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152010d0f0 msgr2=0x7f152010d570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:15.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.949+0000 7f15264a3700 1 --2- 192.168.123.103:0/2251544161 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152010d0f0 0x7f152010d570 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f151801c580 tx=0x7f151801c890 comp rx=0 tx=0).stop
2026-03-09T00:09:15.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.949+0000 7f15264a3700 1 -- 192.168.123.103:0/2251544161 shutdown_connections
2026-03-09T00:09:15.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.949+0000 7f15264a3700 1 --2- 192.168.123.103:0/2251544161 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152010d0f0 0x7f152010d570 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:15.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.949+0000 7f15264a3700 1 --2- 192.168.123.103:0/2251544161 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f152010f340 0x7f152010f720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:15.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.949+0000 7f15264a3700 1 -- 192.168.123.103:0/2251544161 >> 192.168.123.103:0/2251544161 conn(0x7f152006ce20 msgr2=0x7f152006d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:09:15.950 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:15 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1[145089]: 2026-03-09T00:09:15.855+0000 7f87a767e740 -1 osd.1 0 read_superblock omap replica is missing.
2026-03-09T00:09:15.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.949+0000 7f15264a3700 1 -- 192.168.123.103:0/2251544161 shutdown_connections
2026-03-09T00:09:15.950 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.950+0000 7f15264a3700 1 -- 192.168.123.103:0/2251544161 wait complete.
2026-03-09T00:09:15.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.951+0000 7f15264a3700 1 Processor -- start
2026-03-09T00:09:15.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.951+0000 7f15264a3700 1 -- start start
2026-03-09T00:09:15.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.951+0000 7f15264a3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152010d0f0 0x7f152019d280 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:09:15.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.951+0000 7f15264a3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f152010f340 0x7f152019d7c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:09:15.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.951+0000 7f15264a3700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f152019dea0 con 0x7f152010d0f0
2026-03-09T00:09:15.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.951+0000 7f15264a3700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f15201a1c30 con 0x7f152010f340
2026-03-09T00:09:15.951 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.951+0000 7f15254a1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152010d0f0 0x7f152019d280 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:09:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.951+0000 7f15254a1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152010d0f0 0x7f152019d280 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:45884/0 (socket says 192.168.123.103:45884)
2026-03-09T00:09:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.951+0000 7f15254a1700 1 -- 192.168.123.103:0/1353507957 learned_addr learned my addr 192.168.123.103:0/1353507957 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:09:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.952+0000 7f1524ca0700 1 --2- 192.168.123.103:0/1353507957 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f152010f340 0x7f152019d7c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:09:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.952+0000 7f15254a1700 1 -- 192.168.123.103:0/1353507957 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f152010f340 msgr2=0x7f152019d7c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.952+0000 7f15254a1700 1 --2- 192.168.123.103:0/1353507957 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f152010f340 0x7f152019d7c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.952+0000 7f15254a1700 1 -- 192.168.123.103:0/1353507957 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f151801c060 con 0x7f152010d0f0
2026-03-09T00:09:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.952+0000 7f15254a1700 1 --2- 192.168.123.103:0/1353507957 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152010d0f0 0x7f152019d280 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f151c00b700 tx=0x7f151c00bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:09:15.952 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.952+0000 7f15167fc700 1 -- 192.168.123.103:0/1353507957 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f151c010820 con 0x7f152010d0f0
2026-03-09T00:09:15.953 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.952+0000 7f15264a3700 1 -- 192.168.123.103:0/1353507957 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f15201a1f10 con 0x7f152010d0f0
2026-03-09T00:09:15.953 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.952+0000 7f15264a3700 1 -- 192.168.123.103:0/1353507957 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f15201a2460 con 0x7f152010d0f0
2026-03-09T00:09:15.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.955+0000 7f15167fc700 1 -- 192.168.123.103:0/1353507957 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f151c010e60 con 0x7f152010d0f0
2026-03-09T00:09:15.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.955+0000 7f15167fc700 1 -- 192.168.123.103:0/1353507957 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f151c017570 con 0x7f152010d0f0
2026-03-09T00:09:15.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.955+0000 7f15167fc700 1 -- 192.168.123.103:0/1353507957 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f151c017790 con 0x7f152010d0f0
2026-03-09T00:09:15.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.956+0000 7f15167fc700 1 --2- 192.168.123.103:0/1353507957 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f150c077850 0x7f150c079d10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:09:15.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.956+0000 7f1524ca0700 1 --2- 192.168.123.103:0/1353507957 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f150c077850 0x7f150c079d10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:09:15.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.956+0000 7f15167fc700 1 -- 192.168.123.103:0/1353507957 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6165+0+0 (secure 0 0 0) 0x7f151c09a4d0 con 0x7f152010d0f0
2026-03-09T00:09:15.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.957+0000 7f15264a3700 1 -- 192.168.123.103:0/1353507957 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1504005320 con 0x7f152010d0f0
2026-03-09T00:09:15.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.960+0000 7f1524ca0700 1 --2- 192.168.123.103:0/1353507957 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f150c077850 0x7f150c079d10 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f1518007e60 tx=0x7f15180061f0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:09:15.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:15.961+0000 7f15167fc700 1 -- 192.168.123.103:0/1353507957 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f151c062d60 con 0x7f152010d0f0
2026-03-09T00:09:16.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.104+0000 7f15264a3700 1 -- 192.168.123.103:0/1353507957 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1504000bf0 con 0x7f150c077850
2026-03-09T00:09:16.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.105+0000 7f15167fc700 1 -- 192.168.123.103:0/1353507957 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f1504000bf0 con 0x7f150c077850
2026-03-09T00:09:16.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.109+0000 7f150bfff700 1 -- 192.168.123.103:0/1353507957 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f150c077850 msgr2=0x7f150c079d10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:16.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.109+0000 7f150bfff700 1 --2- 192.168.123.103:0/1353507957 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f150c077850 0x7f150c079d10 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f1518007e60 tx=0x7f15180061f0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.109+0000 7f150bfff700 1 -- 192.168.123.103:0/1353507957 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152010d0f0 msgr2=0x7f152019d280 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:16.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.109+0000 7f150bfff700 1 --2- 192.168.123.103:0/1353507957 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152010d0f0 0x7f152019d280 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f151c00b700 tx=0x7f151c00bac0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.109+0000 7f150bfff700 1 -- 192.168.123.103:0/1353507957 shutdown_connections
2026-03-09T00:09:16.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.109+0000 7f150bfff700 1 --2- 192.168.123.103:0/1353507957 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f150c077850 0x7f150c079d10 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.109+0000 7f150bfff700 1 --2- 192.168.123.103:0/1353507957 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152010d0f0 0x7f152019d280 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.109+0000 7f150bfff700 1 --2- 192.168.123.103:0/1353507957 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f152010f340 0x7f152019d7c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.109+0000 7f150bfff700 1 -- 192.168.123.103:0/1353507957 >> 192.168.123.103:0/1353507957 conn(0x7f152006ce20 msgr2=0x7f152010b620 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:09:16.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.110+0000 7f150bfff700 1 -- 192.168.123.103:0/1353507957 shutdown_connections
2026-03-09T00:09:16.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.110+0000 7f150bfff700 1 -- 192.168.123.103:0/1353507957 wait complete.
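The stderr block above is a short-lived `ceph` CLI client (with messenger debugging enabled) connecting to the mon and mgr just to issue `orch upgrade status`, then tearing its connections down. The same progress check is scriptable from any admin host; a sketch, assuming the JSON report keeps its `in_progress` field:

    ceph orch upgrade status
    # Poll until the orchestrator reports the upgrade finished:
    while ceph orch upgrade status --format json | jq -e '.in_progress' >/dev/null; do
        sleep 30
    done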
2026-03-09T00:09:16.121 INFO:teuthology.orchestra.run.vm03.stdout:true
2026-03-09T00:09:16.199 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.198+0000 7f1531903700 1 -- 192.168.123.103:0/2590266233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152c10d0f0 msgr2=0x7f152c10d570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:16.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.198+0000 7f1531903700 1 --2- 192.168.123.103:0/2590266233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152c10d0f0 0x7f152c10d570 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f1520009b00 tx=0x7f1520009e10 comp rx=0 tx=0).stop
2026-03-09T00:09:16.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.198+0000 7f1531903700 1 -- 192.168.123.103:0/2590266233 shutdown_connections
2026-03-09T00:09:16.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.198+0000 7f1531903700 1 --2- 192.168.123.103:0/2590266233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152c10d0f0 0x7f152c10d570 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.198+0000 7f1531903700 1 --2- 192.168.123.103:0/2590266233 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f152c10f340 0x7f152c10f720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.198+0000 7f1531903700 1 -- 192.168.123.103:0/2590266233 >> 192.168.123.103:0/2590266233 conn(0x7f152c06ce20 msgr2=0x7f152c06d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:09:16.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.198+0000 7f1531903700 1 -- 192.168.123.103:0/2590266233 shutdown_connections
2026-03-09T00:09:16.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.198+0000 7f1531903700 1 -- 192.168.123.103:0/2590266233 wait complete.
2026-03-09T00:09:16.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.199+0000 7f1531903700 1 Processor -- start
2026-03-09T00:09:16.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.199+0000 7f1531903700 1 -- start start
2026-03-09T00:09:16.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.199+0000 7f1531903700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f152c10f340 0x7f152c11bf10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:09:16.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.199+0000 7f1531903700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152c116f10 0x7f152c117390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:09:16.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.199+0000 7f1531903700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f152c117960 con 0x7f152c116f10
2026-03-09T00:09:16.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.199+0000 7f1531903700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f152c117ad0 con 0x7f152c10f340
2026-03-09T00:09:16.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.199+0000 7f1530901700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f152c10f340 0x7f152c11bf10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:09:16.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.199+0000 7f1530901700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f152c10f340 0x7f152c11bf10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:36000/0 (socket says 192.168.123.103:36000)
2026-03-09T00:09:16.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.199+0000 7f1530901700 1 -- 192.168.123.103:0/2625943388 learned_addr learned my addr 192.168.123.103:0/2625943388 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:09:16.200 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.199+0000 7f152bfff700 1 --2- 192.168.123.103:0/2625943388 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152c116f10 0x7f152c117390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:09:16.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.200+0000 7f1530901700 1 -- 192.168.123.103:0/2625943388 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152c116f10 msgr2=0x7f152c117390 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:16.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.200+0000 7f1530901700 1 --2- 192.168.123.103:0/2625943388 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152c116f10 0x7f152c117390 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.200+0000 7f1530901700 1 -- 192.168.123.103:0/2625943388 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f15200097e0 con 0x7f152c10f340
2026-03-09T00:09:16.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:16 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:16.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:16 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:16.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:16 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:09:16.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:16 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:09:16.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:16 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:16.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:16 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:09:16.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:16 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:09:16.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:16 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:09:16.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:16 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:09:16.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:16 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-09T00:09:16.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:16 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-09T00:09:16.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:16 vm03.local ceph-mon[129670]: Upgrade: unsafe to stop osd(s) at this time (13 PGs are or would become offline)
2026-03-09T00:09:16.201 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:16 vm03.local ceph-mon[129670]: pgmap v145: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 21 active+undersized, 13 active+undersized+degraded, 30 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 45/231 objects degraded (19.481%)
2026-03-09T00:09:16.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.200+0000 7f1530901700 1 --2- 192.168.123.103:0/2625943388 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f152c10f340 0x7f152c11bf10 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f152400c3b0 tx=0x7f152400c6c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:09:16.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.200+0000 7f1529ffb700 1 -- 192.168.123.103:0/2625943388 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f152400e030 con 0x7f152c10f340
2026-03-09T00:09:16.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.200+0000 7f1531903700 1 -- 192.168.123.103:0/2625943388 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f152c1b8420 con 0x7f152c10f340
2026-03-09T00:09:16.201 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.200+0000 7f1531903700 1 -- 192.168.123.103:0/2625943388 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f152c1b8780 con 0x7f152c10f340
2026-03-09T00:09:16.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.201+0000 7f1529ffb700 1 -- 192.168.123.103:0/2625943388 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f152400f040 con 0x7f152c10f340
2026-03-09T00:09:16.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.201+0000 7f1529ffb700 1 -- 192.168.123.103:0/2625943388 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1524007df0 con 0x7f152c10f340
2026-03-09T00:09:16.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.202+0000 7f15137fe700 1 -- 192.168.123.103:0/2625943388 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1518005320 con 0x7f152c10f340
2026-03-09T00:09:16.204 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.204+0000 7f1529ffb700 1 -- 192.168.123.103:0/2625943388 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1524004ad0 con 0x7f152c10f340
2026-03-09T00:09:16.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.205+0000 7f1529ffb700 1 --2- 192.168.123.103:0/2625943388 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1514077910 0x7f1514079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:09:16.205 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.205+0000 7f1529ffb700 1 -- 192.168.123.103:0/2625943388 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6165+0+0 (secure 0 0 0) 0x7f1524099fa0 con 0x7f152c10f340
2026-03-09T00:09:16.206 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.206+0000 7f1529ffb700 1 -- 192.168.123.103:0/2625943388 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1524062830 con 0x7f152c10f340
2026-03-09T00:09:16.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.217+0000 7f152bfff700 1 --2- 192.168.123.103:0/2625943388 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1514077910 0x7f1514079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:09:16.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.221+0000 7f152bfff700 1 --2- 192.168.123.103:0/2625943388 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1514077910 0x7f1514079dd0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f1520000c00 tx=0x7f1520019040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:09:16.345 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.344+0000 7f15137fe700 1 -- 192.168.123.103:0/2625943388 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1518000bf0 con 0x7f1514077910
2026-03-09T00:09:16.346 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.346+0000 7f1529ffb700 1 -- 192.168.123.103:0/2625943388 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f1518000bf0 con 0x7f1514077910
2026-03-09T00:09:16.350 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.350+0000 7f1531903700 1 -- 192.168.123.103:0/2625943388 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1514077910 msgr2=0x7f1514079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:16.350 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.350+0000 7f1531903700 1 --2- 192.168.123.103:0/2625943388 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1514077910 0x7f1514079dd0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f1520000c00 tx=0x7f1520019040 comp rx=0 tx=0).stop
2026-03-09T00:09:16.350 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.350+0000 7f1531903700 1 -- 192.168.123.103:0/2625943388 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f152c10f340 msgr2=0x7f152c11bf10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:16.350 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.350+0000 7f1531903700 1 --2- 192.168.123.103:0/2625943388 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f152c10f340 0x7f152c11bf10 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f152400c3b0 tx=0x7f152400c6c0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.350+0000 7f1531903700 1 -- 192.168.123.103:0/2625943388 shutdown_connections
2026-03-09T00:09:16.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.350+0000 7f1531903700 1 --2- 192.168.123.103:0/2625943388 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1514077910 0x7f1514079dd0 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.350+0000 7f1531903700 1 --2- 192.168.123.103:0/2625943388 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f152c10f340 0x7f152c11bf10 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.351+0000 7f1531903700 1 --2- 192.168.123.103:0/2625943388 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f152c116f10 0x7f152c117390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.351+0000 7f1531903700 1 -- 192.168.123.103:0/2625943388 >> 192.168.123.103:0/2625943388 conn(0x7f152c06ce20 msgr2=0x7f152c10ae50 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:09:16.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.351+0000 7f1531903700 1 -- 192.168.123.103:0/2625943388 shutdown_connections
2026-03-09T00:09:16.351 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.351+0000 7f1531903700 1 -- 192.168.123.103:0/2625943388 wait complete.
2026-03-09T00:09:16.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:16 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:16.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:16 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:16.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:16 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:09:16.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:16 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:09:16.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:16 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:16.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:16 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:09:16.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:16 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:09:16.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:16 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:09:16.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:16 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:09:16.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:16 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-09T00:09:16.422 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:16 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-09T00:09:16.422 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:16 vm06.local ceph-mon[106218]: Upgrade: unsafe to stop osd(s) at this time (13 PGs are or would become offline)
2026-03-09T00:09:16.422 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:16 vm06.local ceph-mon[106218]: pgmap v145: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 21 active+undersized, 13 active+undersized+degraded, 30 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 45/231 objects degraded (19.481%)
2026-03-09T00:09:16.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.435+0000 7f050e5f5700 1 -- 192.168.123.103:0/1521165707 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f050810f420 msgr2=0x7f050810f800 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:16.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.435+0000 7f050e5f5700 1 --2- 192.168.123.103:0/1521165707 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f050810f420 0x7f050810f800 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f04f8009b00 tx=0x7f04f8009e10 comp rx=0 tx=0).stop
2026-03-09T00:09:16.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.435+0000 7f050e5f5700 1 -- 192.168.123.103:0/1521165707 shutdown_connections
2026-03-09T00:09:16.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.435+0000 7f050e5f5700 1 --2- 192.168.123.103:0/1521165707 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0508107d90 0x7f0508108210 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.435+0000 7f050e5f5700 1 --2- 192.168.123.103:0/1521165707 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f050810f420 0x7f050810f800 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.436 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.435+0000 7f050e5f5700 1 -- 192.168.123.103:0/1521165707 >> 192.168.123.103:0/1521165707 conn(0x7f050806ce20 msgr2=0x7f050806d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:09:16.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.438+0000 7f050e5f5700 1 -- 192.168.123.103:0/1521165707 shutdown_connections
2026-03-09T00:09:16.438 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.438+0000 7f050e5f5700 1 -- 192.168.123.103:0/1521165707 wait complete.
2026-03-09T00:09:16.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.439+0000 7f050e5f5700 1 Processor -- start
2026-03-09T00:09:16.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.439+0000 7f050e5f5700 1 -- start start
2026-03-09T00:09:16.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.439+0000 7f050e5f5700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0508107d90 0x7f050819ce80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:09:16.439 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.439+0000 7f050e5f5700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f050819d3c0 0x7f05081a1830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:09:16.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.439+0000 7f050e5f5700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f050819d9e0 con 0x7f0508107d90
2026-03-09T00:09:16.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.439+0000 7f050e5f5700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f050819db50 con 0x7f050819d3c0
2026-03-09T00:09:16.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.440+0000 7f05077fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f050819d3c0 0x7f05081a1830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:09:16.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.440+0000 7f05077fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f050819d3c0 0x7f05081a1830 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:36032/0 (socket says 192.168.123.103:36032)
2026-03-09T00:09:16.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.440+0000 7f05077fe700 1 -- 192.168.123.103:0/1847981100 learned_addr learned my addr 192.168.123.103:0/1847981100 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:09:16.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.440+0000 7f0507fff700 1 --2- 192.168.123.103:0/1847981100 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0508107d90 0x7f050819ce80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:09:16.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.440+0000 7f05077fe700 1 -- 192.168.123.103:0/1847981100 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0508107d90 msgr2=0x7f050819ce80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:16.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.440+0000 7f05077fe700 1 --2- 192.168.123.103:0/1847981100 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0508107d90 0x7f050819ce80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.440+0000 7f05077fe700 1 -- 192.168.123.103:0/1847981100 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f04f80097e0 con 0x7f050819d3c0
2026-03-09T00:09:16.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.441+0000 7f05077fe700 1 --2- 192.168.123.103:0/1847981100 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f050819d3c0 0x7f05081a1830 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f050000d350 tx=0x7f050000d710 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:09:16.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.442+0000 7f05057fa700 1 -- 192.168.123.103:0/1847981100 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f05000155b0 con 0x7f050819d3c0
2026-03-09T00:09:16.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.442+0000 7f050e5f5700 1 -- 192.168.123.103:0/1847981100 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f05081a1e30 con 0x7f050819d3c0
2026-03-09T00:09:16.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.442+0000 7f050e5f5700 1 -- 192.168.123.103:0/1847981100 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f05081a2380 con 0x7f050819d3c0
2026-03-09T00:09:16.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.443+0000 7f05057fa700 1 -- 192.168.123.103:0/1847981100 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f050000f040 con 0x7f050819d3c0
2026-03-09T00:09:16.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.443+0000 7f050e5f5700 1 -- 192.168.123.103:0/1847981100 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f05081a2660 con 0x7f050819d3c0
2026-03-09T00:09:16.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.443+0000 7f05057fa700 1 -- 192.168.123.103:0/1847981100 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f05000149c0 con 0x7f050819d3c0
2026-03-09T00:09:16.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.445+0000 7f05057fa700 1 -- 192.168.123.103:0/1847981100 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0500014c20 con 0x7f050819d3c0
2026-03-09T00:09:16.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.446+0000 7f05057fa700 1 --2- 192.168.123.103:0/1847981100 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f04f00778c0 0x7f04f0079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:09:16.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.446+0000 7f05057fa700 1 -- 192.168.123.103:0/1847981100 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6165+0+0 (secure 0 0 0) 0x7f0500099e20 con 0x7f050819d3c0
2026-03-09T00:09:16.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.446+0000 7f0507fff700 1 --2- 192.168.123.103:0/1847981100 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f04f00778c0 0x7f04f0079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:09:16.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.447+0000 7f0507fff700 1 --2- 192.168.123.103:0/1847981100 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f04f00778c0 0x7f04f0079d80 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f04f8009fd0 tx=0x7f04f802b040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:09:16.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.449+0000 7f05057fa700 1 -- 192.168.123.103:0/1847981100 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0500062760 con 0x7f050819d3c0
2026-03-09T00:09:16.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.603+0000 7f050e5f5700 1 -- 192.168.123.103:0/1847981100 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f050819e2c0 con 0x7f04f00778c0
2026-03-09T00:09:16.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.609+0000 7f05057fa700 1 -- 192.168.123.103:0/1847981100 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3528 (secure 0 0 0) 0x7f050819e2c0 con 0x7f04f00778c0
2026-03-09T00:09:16.609 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T00:09:16.609 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (4m) 3s ago 9m 24.2M - 0.25.0 c8568f914cd2 6bc39b415ac6
2026-03-09T00:09:16.609 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (9m) 3s ago 9m 9315k - 18.2.1 5be31c24972a b93f8a220f71
2026-03-09T00:09:16.609 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (9m) 3m ago 9m 8656k - 18.2.1 5be31c24972a d06aea65065e
2026-03-09T00:09:16.609 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (3m) 3s ago 9m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e a76700a1f6bf
2026-03-09T00:09:16.609 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (3m) 3m ago 9m 7860k - 19.2.3-678-ge911bdeb 654f31e6858e ab35f6a843c1
2026-03-09T00:09:16.609 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (4m) 3s ago 9m 84.1M - 10.4.0 c8b91775d855 00a3394cdec9
2026-03-09T00:09:16.609 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (7m) 3s ago 7m 19.3M - 18.2.1 5be31c24972a 404501ca3f76
2026-03-09T00:09:16.609 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (7m) 3s ago 7m 188M - 18.2.1 5be31c24972a b71cb8823eff
2026-03-09T00:09:16.609 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (7m) 3m ago 7m 19.7M - 18.2.1 5be31c24972a 868f24dd3b07
2026-03-09T00:09:16.610 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 running (7m) 3m ago 7m 15.9M - 18.2.1 5be31c24972a 84dbd6c37a69
2026-03-09T00:09:16.610 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:8443,9283,8765 running (5m) 3s ago 10m 624M - 19.2.3-678-ge911bdeb 654f31e6858e 5c5f89207f88
2026-03-09T00:09:16.610 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (5m) 3m ago 9m 489M - 19.2.3-678-ge911bdeb 654f31e6858e e3d70135f3ac
2026-03-09T00:09:16.610 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (3m) 3s ago 10m 60.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cafe87ec117d
2026-03-09T00:09:16.610 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (3m) 3m ago 9m 48.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 33df752aa193
2026-03-09T00:09:16.610 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (4m) 3s ago 9m 9977k - 1.7.0 72c9c2088986 0cdd6e671b4f
2026-03-09T00:09:16.610 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (4m) 3m ago 9m 9420k - 1.7.0 72c9c2088986 848c5c72973d
2026-03-09T00:09:16.610 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (2m) 3s ago 8m 178M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7112eceae9ce
2026-03-09T00:09:16.610 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (5s) 3s ago 8m 12.9M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e70d2f37c6d1
2026-03-09T00:09:16.610 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (8m) 3s ago 8m 392M 4096M 18.2.1 5be31c24972a 00566abbcc16
2026-03-09T00:09:16.610 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (8m) 3m ago 8m 499M 4096M 18.2.1 5be31c24972a 1ece32056ab6
2026-03-09T00:09:16.610 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (8m) 3m ago 8m 475M 4096M 18.2.1 5be31c24972a ee6260a1124c
2026-03-09T00:09:16.610 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (7m) 3m ago 7m 398M 4096M 18.2.1 5be31c24972a f51e8cd94301
2026-03-09T00:09:16.610 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (4m) 3s ago 9m 57.1M - 2.51.0 1d3b7f56885b 16d6071e49fb
2026-03-09T00:09:16.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.612+0000 7f04eeffd700 1 -- 192.168.123.103:0/1847981100 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f04f00778c0 msgr2=0x7f04f0079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:16.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.612+0000 7f04eeffd700 1 --2- 192.168.123.103:0/1847981100 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f04f00778c0 0x7f04f0079d80 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f04f8009fd0 tx=0x7f04f802b040 comp rx=0 tx=0).stop
2026-03-09T00:09:16.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.612+0000 7f04eeffd700 1 -- 192.168.123.103:0/1847981100 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f050819d3c0 msgr2=0x7f05081a1830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:16.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.612+0000 7f04eeffd700 1 --2- 192.168.123.103:0/1847981100 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f050819d3c0 0x7f05081a1830 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f050000d350 tx=0x7f050000d710 comp rx=0 tx=0).stop
2026-03-09T00:09:16.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.612+0000 7f04eeffd700 1 -- 192.168.123.103:0/1847981100 shutdown_connections
2026-03-09T00:09:16.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.612+0000 7f04eeffd700 1 --2- 192.168.123.103:0/1847981100 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f04f00778c0 0x7f04f0079d80 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.612+0000 7f04eeffd700 1 --2- 192.168.123.103:0/1847981100 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0508107d90 0x7f050819ce80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.612+0000 7f04eeffd700 1 --2- 192.168.123.103:0/1847981100 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f050819d3c0 0x7f05081a1830 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.612+0000 7f04eeffd700 1 -- 192.168.123.103:0/1847981100 >> 192.168.123.103:0/1847981100 conn(0x7f050806ce20 msgr2=0x7f050806f580 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:09:16.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.612+0000 7f04eeffd700 1 -- 192.168.123.103:0/1847981100 shutdown_connections
2026-03-09T00:09:16.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.612+0000 7f04eeffd700 1 -- 192.168.123.103:0/1847981100 wait complete.
2026-03-09T00:09:16.695 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.694+0000 7fd6e665e700 1 -- 192.168.123.103:0/2741851029 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6e010f340 msgr2=0x7fd6e010f720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:16.695 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.694+0000 7fd6e665e700 1 --2- 192.168.123.103:0/2741851029 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6e010f340 0x7fd6e010f720 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7fd6d0009b50 tx=0x7fd6d0009e60 comp rx=0 tx=0).stop
2026-03-09T00:09:16.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.695+0000 7fd6e665e700 1 -- 192.168.123.103:0/2741851029 shutdown_connections
2026-03-09T00:09:16.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.695+0000 7fd6e665e700 1 --2- 192.168.123.103:0/2741851029 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6e010d0f0 0x7fd6e010d570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.695+0000 7fd6e665e700 1 --2- 192.168.123.103:0/2741851029 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6e010f340 0x7fd6e010f720 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.695+0000 7fd6e665e700 1 -- 192.168.123.103:0/2741851029 >> 192.168.123.103:0/2741851029 conn(0x7fd6e006ce20 msgr2=0x7fd6e006d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:09:16.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.695+0000 7fd6e665e700 1 -- 192.168.123.103:0/2741851029 shutdown_connections
2026-03-09T00:09:16.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.695+0000 7fd6e665e700 1 -- 192.168.123.103:0/2741851029 wait complete.
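The `ceph orch ps` listing above captures the staggered state mid-upgrade: the mgr, mon, crash and osd.0/osd.1 daemons already report 19.2.3-678-ge911bdeb, while the mds daemons, ceph-exporter and the remaining osds are still on 18.2.1. One hedged way to pull just the stragglers out of the same data; the jq filter and the assumption that the JSON form exposes `daemon_type`/`daemon_id`/`version` fields are illustrative, not taken from the suite:

    # List daemons still reporting a reef (18.x) version.
    ceph orch ps --format json \
      | jq -r '.[] | select(.version != null and (.version | startswith("18."))) | "\(.daemon_type).\(.daemon_id) \(.version)"'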
2026-03-09T00:09:16.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.695+0000 7fd6e665e700 1 Processor -- start
2026-03-09T00:09:16.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.696+0000 7fd6e665e700 1 -- start start
2026-03-09T00:09:16.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.696+0000 7fd6e665e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6e010d0f0 0x7fd6e01ab510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:09:16.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.696+0000 7fd6e665e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6e01aba50 0x7fd6e01a5590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:09:16.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.696+0000 7fd6e665e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd6e01ac020 con 0x7fd6e010d0f0
2026-03-09T00:09:16.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.696+0000 7fd6e665e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd6e01a5ad0 con 0x7fd6e01aba50
2026-03-09T00:09:16.696 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.696+0000 7fd6e4e5b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6e01aba50 0x7fd6e01a5590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:09:16.697 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.696+0000 7fd6e4e5b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6e01aba50 0x7fd6e01a5590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:36058/0 (socket says 192.168.123.103:36058)
2026-03-09T00:09:16.697 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.696+0000 7fd6e4e5b700 1 -- 192.168.123.103:0/4104941679 learned_addr learned my addr 192.168.123.103:0/4104941679 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:09:16.697 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.696+0000 7fd6e565c700 1 --2- 192.168.123.103:0/4104941679 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6e010d0f0 0x7fd6e01ab510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:09:16.697 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.697+0000 7fd6e4e5b700 1 -- 192.168.123.103:0/4104941679 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6e010d0f0 msgr2=0x7fd6e01ab510 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:16.697 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.697+0000 7fd6e4e5b700 1 --2- 192.168.123.103:0/4104941679 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6e010d0f0 0x7fd6e01ab510 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.697 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.697+0000 7fd6e4e5b700 1 -- 192.168.123.103:0/4104941679 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd6d00097e0 con 0x7fd6e01aba50
2026-03-09T00:09:16.697 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.697+0000 7fd6e565c700 1 --2- 192.168.123.103:0/4104941679 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6e010d0f0 0x7fd6e01ab510 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-09T00:09:16.697 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.697+0000 7fd6e4e5b700 1 --2- 192.168.123.103:0/4104941679 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6e01aba50 0x7fd6e01a5590 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fd6d800c3b0 tx=0x7fd6d800c770 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:09:16.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.697+0000 7fd6d67fc700 1 -- 192.168.123.103:0/4104941679 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd6d800e030 con 0x7fd6e01aba50
2026-03-09T00:09:16.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.698+0000 7fd6e665e700 1 -- 192.168.123.103:0/4104941679 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd6e01a5db0 con 0x7fd6e01aba50
2026-03-09T00:09:16.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.698+0000 7fd6e665e700 1 -- 192.168.123.103:0/4104941679 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd6e01a6300 con 0x7fd6e01aba50
2026-03-09T00:09:16.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.698+0000 7fd6d67fc700 1 -- 192.168.123.103:0/4104941679 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fd6d800f040 con 0x7fd6e01aba50
2026-03-09T00:09:16.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.698+0000 7fd6e665e700 1 -- 192.168.123.103:0/4104941679 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd6e004f2e0 con 0x7fd6e01aba50
2026-03-09T00:09:16.699 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.699+0000 7fd6d67fc700 1 -- 192.168.123.103:0/4104941679 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd6d8007e50 con 0x7fd6e01aba50
2026-03-09T00:09:16.700 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.700+0000 7fd6d67fc700 1 -- 192.168.123.103:0/4104941679 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd6d8009160 con 0x7fd6e01aba50
2026-03-09T00:09:16.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.700+0000 7fd6d67fc700 1 --2- 192.168.123.103:0/4104941679 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd6cc077910 0x7fd6cc079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:09:16.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.701+0000 7fd6e565c700 1 --2- 192.168.123.103:0/4104941679 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd6cc077910 0x7fd6cc079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:09:16.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.701+0000 7fd6d67fc700 1 -- 192.168.123.103:0/4104941679 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6165+0+0 (secure 0 0 0) 0x7fd6d809a760 con 0x7fd6e01aba50
2026-03-09T00:09:16.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.701+0000 7fd6e565c700 1 --2- 192.168.123.103:0/4104941679 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd6cc077910 0x7fd6cc079dd0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fd6d000b5c0 tx=0x7fd6d0011040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:09:16.704 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.704+0000 7fd6d67fc700 1 -- 192.168.123.103:0/4104941679 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd6d8062ff0 con 0x7fd6e01aba50
2026-03-09T00:09:16.879 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.878+0000 7fd6e665e700 1 -- 192.168.123.103:0/4104941679 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fd6e01a6b00 con 0x7fd6e01aba50
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.882+0000 7fd6d67fc700 1 -- 192.168.123.103:0/4104941679 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7fd6d8062740 con 0x7fd6e01aba50
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4,
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 8,
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-09T00:09:16.883 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:09:16.890 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.889+0000 7fd6cbfff700 1 -- 192.168.123.103:0/4104941679 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd6cc077910 msgr2=0x7fd6cc079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:16.890 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.889+0000 7fd6cbfff700 1 --2- 192.168.123.103:0/4104941679 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd6cc077910 0x7fd6cc079dd0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fd6d000b5c0 tx=0x7fd6d0011040 comp rx=0 tx=0).stop
2026-03-09T00:09:16.890 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.889+0000 7fd6cbfff700 1 -- 192.168.123.103:0/4104941679 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6e01aba50 msgr2=0x7fd6e01a5590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:16.890 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.889+0000 7fd6cbfff700 1 --2- 192.168.123.103:0/4104941679 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6e01aba50 0x7fd6e01a5590 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fd6d800c3b0 tx=0x7fd6d800c770 comp rx=0 tx=0).stop
2026-03-09T00:09:16.890 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.889+0000 7fd6cbfff700 1 -- 192.168.123.103:0/4104941679 shutdown_connections
2026-03-09T00:09:16.890 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.889+0000 7fd6cbfff700 1 --2- 192.168.123.103:0/4104941679 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd6cc077910 0x7fd6cc079dd0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.890 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.889+0000 7fd6cbfff700 1 --2- 192.168.123.103:0/4104941679 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd6e010d0f0 0x7fd6e01ab510 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.890 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.889+0000 7fd6cbfff700 1 --2- 192.168.123.103:0/4104941679 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd6e01aba50 0x7fd6e01a5590 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.890 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.889+0000 7fd6cbfff700 1 -- 192.168.123.103:0/4104941679 >> 192.168.123.103:0/4104941679 conn(0x7fd6e006ce20 msgr2=0x7fd6e00700a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:09:16.890 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.890+0000 7fd6cbfff700 1 -- 192.168.123.103:0/4104941679 shutdown_connections
2026-03-09T00:09:16.890 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.890+0000 7fd6cbfff700 1 -- 192.168.123.103:0/4104941679 wait complete.
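The `ceph versions` output above gives the cluster-wide view of the same split: both mons and both mgrs are already on squid (19.2.3), one osd has been upgraded, and the four mds daemons plus four osds still run reef (18.2.1). A sketch of how a caller could wait for convergence on the new release, in the spirit of the suite's jq checks; the exact filter is illustrative:

    # Poll until no daemon reports a reef (18.x) version any more;
    # `ceph versions` prints the JSON shown above.
    while ceph versions | jq -e '.overall | keys | any(startswith("ceph version 18."))' >/dev/null; do
        sleep 30
    done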
2026-03-09T00:09:16.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.984+0000 7fcf55958700 1 -- 192.168.123.103:0/2646657927 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcf5010f660 msgr2=0x7fcf50107d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:16.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.984+0000 7fcf55958700 1 --2- 192.168.123.103:0/2646657927 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcf5010f660 0x7fcf50107d90 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7fcf4800b3a0 tx=0x7fcf4800b6b0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.985 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.985+0000 7fcf55958700 1 -- 192.168.123.103:0/2646657927 shutdown_connections
2026-03-09T00:09:16.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.985+0000 7fcf55958700 1 --2- 192.168.123.103:0/2646657927 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcf501082d0 0x7fcf50108750 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.985+0000 7fcf55958700 1 --2- 192.168.123.103:0/2646657927 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcf5010f660 0x7fcf50107d90 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:16.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.985+0000 7fcf55958700 1 -- 192.168.123.103:0/2646657927 >> 192.168.123.103:0/2646657927 conn(0x7fcf5006d0f0 msgr2=0x7fcf5006d500 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:09:16.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.985+0000 7fcf55958700 1 -- 192.168.123.103:0/2646657927 shutdown_connections
2026-03-09T00:09:16.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.985+0000 7fcf55958700 1 -- 192.168.123.103:0/2646657927 wait complete.
2026-03-09T00:09:16.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.985+0000 7fcf55958700 1 Processor -- start 2026-03-09T00:09:16.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.985+0000 7fcf55958700 1 -- start start 2026-03-09T00:09:16.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.986+0000 7fcf55958700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcf501082d0 0x7fcf501130e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:16.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.986+0000 7fcf55958700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcf5010f660 0x7fcf50113620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:16.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.986+0000 7fcf55958700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcf50118280 con 0x7fcf501082d0 2026-03-09T00:09:16.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.986+0000 7fcf55958700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcf501183f0 con 0x7fcf5010f660 2026-03-09T00:09:16.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.986+0000 7fcf4e7fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcf5010f660 0x7fcf50113620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:16.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.986+0000 7fcf4e7fc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcf5010f660 0x7fcf50113620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:36066/0 (socket says 192.168.123.103:36066) 2026-03-09T00:09:16.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.986+0000 7fcf4e7fc700 1 -- 192.168.123.103:0/723259394 learned_addr learned my addr 192.168.123.103:0/723259394 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:09:16.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.986+0000 7fcf4e7fc700 1 -- 192.168.123.103:0/723259394 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcf501082d0 msgr2=0x7fcf501130e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:16.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.986+0000 7fcf4e7fc700 1 --2- 192.168.123.103:0/723259394 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcf501082d0 0x7fcf501130e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:16.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.986+0000 7fcf4e7fc700 1 -- 192.168.123.103:0/723259394 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcf4800b050 con 0x7fcf5010f660 2026-03-09T00:09:16.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.986+0000 7fcf4e7fc700 1 --2- 192.168.123.103:0/723259394 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcf5010f660 0x7fcf50113620 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7fcf4000eb10 tx=0x7fcf4000eed0 comp rx=0 tx=0).ready entity=mon.1 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:09:16.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.988+0000 7fcf54956700 1 -- 192.168.123.103:0/723259394 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcf4000cca0 con 0x7fcf5010f660 2026-03-09T00:09:16.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.988+0000 7fcf54956700 1 -- 192.168.123.103:0/723259394 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fcf4000ce00 con 0x7fcf5010f660 2026-03-09T00:09:16.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.988+0000 7fcf54956700 1 -- 192.168.123.103:0/723259394 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcf40018910 con 0x7fcf5010f660 2026-03-09T00:09:16.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.989+0000 7fcf55958700 1 -- 192.168.123.103:0/723259394 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcf50113c80 con 0x7fcf5010f660 2026-03-09T00:09:16.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.989+0000 7fcf55958700 1 -- 192.168.123.103:0/723259394 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcf501b8860 con 0x7fcf5010f660 2026-03-09T00:09:16.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.990+0000 7fcf367fc700 1 -- 192.168.123.103:0/723259394 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcf5004ea90 con 0x7fcf5010f660 2026-03-09T00:09:16.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.991+0000 7fcf54956700 1 -- 192.168.123.103:0/723259394 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcf40018a70 con 0x7fcf5010f660 2026-03-09T00:09:16.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.992+0000 7fcf54956700 1 --2- 192.168.123.103:0/723259394 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fcf38077870 0x7fcf38079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:16.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.992+0000 7fcf54956700 1 -- 192.168.123.103:0/723259394 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6165+0+0 (secure 0 0 0) 0x7fcf40014070 con 0x7fcf5010f660 2026-03-09T00:09:16.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.993+0000 7fcf4effd700 1 --2- 192.168.123.103:0/723259394 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fcf38077870 0x7fcf38079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:16.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.993+0000 7fcf4effd700 1 --2- 192.168.123.103:0/723259394 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fcf38077870 0x7fcf38079d30 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fcf48015040 tx=0x7fcf4800ba00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:09:16.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:16.994+0000 7fcf54956700 1 -- 192.168.123.103:0/723259394 <== mon.1 
v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fcf400631d0 con 0x7fcf5010f660
2026-03-09T00:09:17.149 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:17 vm03.local ceph-mon[129670]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 6 pgs peering)
2026-03-09T00:09:17.149 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:17 vm03.local ceph-mon[129670]: from='client.34286 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:09:17.149 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:17 vm03.local ceph-mon[129670]: from='client.44209 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:09:17.149 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:17 vm03.local ceph-mon[129670]: from='client.44213 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:09:17.149 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:17 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/4104941679' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:09:17.149 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.148+0000 7fcf367fc700 1 -- 192.168.123.103:0/723259394 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fcf50066e80 con 0x7fcf5010f660
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.154+0000 7fcf54956700 1 -- 192.168.123.103:0/723259394 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1919 (secure 0 0 0) 0x7fcf40062920 con 0x7fcf5010f660
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:e11
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1)
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:epoch 11
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T00:01:42.952984+0000
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T00:01:51.424075+0000
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:root 0
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {}
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 39
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1
2026-03-09T00:09:17.155 INFO:teuthology.orchestra.run.vm03.stdout:in 0
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14480}
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:failed
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:damaged
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:stopped
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3]
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members:
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.sejksk{0:14480} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons:
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.ralade{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.ixduim{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:09:17.156 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.vlrwtl{-1:24297} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6824/1090058295,v1:192.168.123.106:6825/1090058295] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:09:17.160 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.160+0000 7fcf55958700 1 -- 192.168.123.103:0/723259394 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fcf38077870 msgr2=0x7fcf38079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:17.160 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.160+0000 7fcf55958700 1 --2- 192.168.123.103:0/723259394 >>
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fcf38077870 0x7fcf38079d30 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fcf48015040 tx=0x7fcf4800ba00 comp rx=0 tx=0).stop 2026-03-09T00:09:17.160 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.160+0000 7fcf55958700 1 -- 192.168.123.103:0/723259394 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcf5010f660 msgr2=0x7fcf50113620 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:17.160 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.160+0000 7fcf55958700 1 --2- 192.168.123.103:0/723259394 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcf5010f660 0x7fcf50113620 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7fcf4000eb10 tx=0x7fcf4000eed0 comp rx=0 tx=0).stop 2026-03-09T00:09:17.160 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.160+0000 7fcf55958700 1 -- 192.168.123.103:0/723259394 shutdown_connections 2026-03-09T00:09:17.160 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.160+0000 7fcf55958700 1 --2- 192.168.123.103:0/723259394 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fcf38077870 0x7fcf38079d30 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:17.160 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.160+0000 7fcf55958700 1 --2- 192.168.123.103:0/723259394 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcf501082d0 0x7fcf501130e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:17.160 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.160+0000 7fcf55958700 1 --2- 192.168.123.103:0/723259394 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcf5010f660 0x7fcf50113620 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:17.160 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.160+0000 7fcf55958700 1 -- 192.168.123.103:0/723259394 >> 192.168.123.103:0/723259394 conn(0x7fcf5006d0f0 msgr2=0x7fcf5010d510 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:17.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.160+0000 7fcf55958700 1 -- 192.168.123.103:0/723259394 shutdown_connections 2026-03-09T00:09:17.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.161+0000 7fcf55958700 1 -- 192.168.123.103:0/723259394 wait complete. 
2026-03-09T00:09:17.162 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 11 2026-03-09T00:09:17.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.252+0000 7ff216979700 1 -- 192.168.123.103:0/2767732768 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21010f340 msgr2=0x7ff21010f720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:17.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.252+0000 7ff216979700 1 --2- 192.168.123.103:0/2767732768 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21010f340 0x7ff21010f720 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7ff200009b00 tx=0x7ff200009e10 comp rx=0 tx=0).stop 2026-03-09T00:09:17.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.253+0000 7ff216979700 1 -- 192.168.123.103:0/2767732768 shutdown_connections 2026-03-09T00:09:17.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.253+0000 7ff216979700 1 --2- 192.168.123.103:0/2767732768 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff21010d0f0 0x7ff21010d570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:17.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.253+0000 7ff216979700 1 --2- 192.168.123.103:0/2767732768 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21010f340 0x7ff21010f720 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:17.253 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.253+0000 7ff216979700 1 -- 192.168.123.103:0/2767732768 >> 192.168.123.103:0/2767732768 conn(0x7ff21006ce20 msgr2=0x7ff21006d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:17.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.253+0000 7ff216979700 1 -- 192.168.123.103:0/2767732768 shutdown_connections 2026-03-09T00:09:17.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.253+0000 7ff216979700 1 -- 192.168.123.103:0/2767732768 wait complete. 
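The plain-text fsmap above (epoch 11, one active rank in up {0=14480}, max_mds 1, three standbys, inline_data enabled) is the state the suite checks between upgrade steps. A minimal sketch of the same check done programmatically, assuming an admin keyring and the `ceph` CLI on the host; the JSON field names follow what recent Ceph releases emit for `fs dump --format json`, so treat them as an assumption to verify against your version:

    import json
    import subprocess

    # Fetch the fsmap as JSON rather than scraping the plain dump above.
    fsmap = json.loads(subprocess.check_output(
        ["ceph", "fs", "dump", "--format", "json"]))

    mdsmap = fsmap["filesystems"][0]["mdsmap"]
    assert mdsmap["fs_name"] == "cephfs"   # expected values mirror the dump in this log
    assert mdsmap["max_mds"] == 1
    assert len(mdsmap["in"]) == 1          # exactly one active rank
    print("fsmap epoch", fsmap["epoch"], "looks sane")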
2026-03-09T00:09:17.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.253+0000 7ff216979700 1 Processor -- start 2026-03-09T00:09:17.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.253+0000 7ff216979700 1 -- start start 2026-03-09T00:09:17.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.254+0000 7ff216979700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21010d0f0 0x7ff21011bf80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:17.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.254+0000 7ff216979700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff21010f340 0x7ff210116f80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:17.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.254+0000 7ff216979700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff210117550 con 0x7ff21010d0f0 2026-03-09T00:09:17.254 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.254+0000 7ff216979700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff2101176c0 con 0x7ff21010f340 2026-03-09T00:09:17.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.254+0000 7ff215176700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff21010f340 0x7ff210116f80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:17.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.254+0000 7ff215176700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff21010f340 0x7ff210116f80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:36076/0 (socket says 192.168.123.103:36076) 2026-03-09T00:09:17.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.254+0000 7ff215176700 1 -- 192.168.123.103:0/865823544 learned_addr learned my addr 192.168.123.103:0/865823544 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:09:17.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.254+0000 7ff215977700 1 --2- 192.168.123.103:0/865823544 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21010d0f0 0x7ff21011bf80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:17.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.255+0000 7ff215977700 1 -- 192.168.123.103:0/865823544 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff21010f340 msgr2=0x7ff210116f80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:17.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.255+0000 7ff215977700 1 --2- 192.168.123.103:0/865823544 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff21010f340 0x7ff210116f80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:17.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.255+0000 7ff215977700 1 -- 192.168.123.103:0/865823544 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff2000097e0 con 
0x7ff21010d0f0 2026-03-09T00:09:17.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.255+0000 7ff215977700 1 --2- 192.168.123.103:0/865823544 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21010d0f0 0x7ff21011bf80 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7ff200000c00 tx=0x7ff20000bfd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:09:17.255 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.255+0000 7ff206ffd700 1 -- 192.168.123.103:0/865823544 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff20001d070 con 0x7ff21010d0f0 2026-03-09T00:09:17.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.255+0000 7ff216979700 1 -- 192.168.123.103:0/865823544 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff210117940 con 0x7ff21010d0f0 2026-03-09T00:09:17.256 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.255+0000 7ff216979700 1 -- 192.168.123.103:0/865823544 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff2101b8420 con 0x7ff21010d0f0 2026-03-09T00:09:17.257 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.256+0000 7ff206ffd700 1 -- 192.168.123.103:0/865823544 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff200003d20 con 0x7ff21010d0f0 2026-03-09T00:09:17.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.257+0000 7ff206ffd700 1 -- 192.168.123.103:0/865823544 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff200017840 con 0x7ff21010d0f0 2026-03-09T00:09:17.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.258+0000 7ff206ffd700 1 -- 192.168.123.103:0/865823544 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff200017a60 con 0x7ff21010d0f0 2026-03-09T00:09:17.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.259+0000 7ff206ffd700 1 --2- 192.168.123.103:0/865823544 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff1fc077ab0 0x7ff1fc079f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:17.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.259+0000 7ff215176700 1 --2- 192.168.123.103:0/865823544 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff1fc077ab0 0x7ff1fc079f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:17.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.259+0000 7ff206ffd700 1 -- 192.168.123.103:0/865823544 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6165+0+0 (secure 0 0 0) 0x7ff20009bf10 con 0x7ff21010d0f0 2026-03-09T00:09:17.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.259+0000 7ff215176700 1 --2- 192.168.123.103:0/865823544 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff1fc077ab0 0x7ff1fc079f70 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7ff210118770 tx=0x7ff20800b040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:09:17.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.260+0000 
7ff216979700 1 -- 192.168.123.103:0/865823544 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff1f40052f0 con 0x7ff21010d0f0
2026-03-09T00:09:17.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.264+0000 7ff206ffd700 1 -- 192.168.123.103:0/865823544 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff2000647a0 con 0x7ff21010d0f0
2026-03-09T00:09:17.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.406+0000 7ff216979700 1 -- 192.168.123.103:0/865823544 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff1f4000bc0 con 0x7ff1fc077ab0
2026-03-09T00:09:17.407 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:17 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1[145089]: 2026-03-09T00:09:17.294+0000 7f87a767e740 -1 osd.1 78 log_to_monitors true
2026-03-09T00:09:17.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.410+0000 7ff206ffd700 1 -- 192.168.123.103:0/865823544 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7ff1f4000bc0 con 0x7ff1fc077ab0
2026-03-09T00:09:17.411 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:09:17.411 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T00:09:17.411 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true,
2026-03-09T00:09:17.411 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-09T00:09:17.411 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [
2026-03-09T00:09:17.411 INFO:teuthology.orchestra.run.vm03.stdout: "crash",
2026-03-09T00:09:17.411 INFO:teuthology.orchestra.run.vm03.stdout: "mon",
2026-03-09T00:09:17.411 INFO:teuthology.orchestra.run.vm03.stdout: "mgr"
2026-03-09T00:09:17.411 INFO:teuthology.orchestra.run.vm03.stdout: ],
2026-03-09T00:09:17.411 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "8/23 daemons upgraded",
2026-03-09T00:09:17.411 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons",
2026-03-09T00:09:17.411 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false
2026-03-09T00:09:17.411 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:09:17.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.413+0000 7ff216979700 1 -- 192.168.123.103:0/865823544 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff1fc077ab0 msgr2=0x7ff1fc079f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:17.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.413+0000 7ff216979700 1 --2- 192.168.123.103:0/865823544 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff1fc077ab0 0x7ff1fc079f70 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7ff210118770 tx=0x7ff20800b040 comp rx=0 tx=0).stop
2026-03-09T00:09:17.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.413+0000 7ff216979700 1 -- 192.168.123.103:0/865823544 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21010d0f0 msgr2=0x7ff21011bf80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
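The JSON reply above is a mid-upgrade `orch upgrade status`: crash, mon and mgr are finished and the orchestrator is working through the OSDs (8/23 daemons). Since the command already prints JSON, a wait-for-completion gate reduces to polling `in_progress`; a rough sketch under the same CLI assumption as above (the polling loop and interval are illustrative, not part of this run):

    import json
    import subprocess
    import time

    def upgrade_status():
        # `ceph orch upgrade status` prints the JSON document seen above.
        out = subprocess.check_output(["ceph", "orch", "upgrade", "status"])
        return json.loads(out)

    status = upgrade_status()
    while status.get("in_progress"):
        # e.g. "8/23 daemons upgraded" / "Currently upgrading osd daemons"
        print(status.get("progress"), "-", status.get("message"))
        time.sleep(30)
        status = upgrade_status()
    print("upgrade done; services complete:", status.get("services_complete"))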
2026-03-09T00:09:17.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.413+0000 7ff216979700 1 --2- 192.168.123.103:0/865823544 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21010d0f0 0x7ff21011bf80 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7ff200000c00 tx=0x7ff20000bfd0 comp rx=0 tx=0).stop 2026-03-09T00:09:17.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.413+0000 7ff216979700 1 -- 192.168.123.103:0/865823544 shutdown_connections 2026-03-09T00:09:17.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.413+0000 7ff216979700 1 --2- 192.168.123.103:0/865823544 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff1fc077ab0 0x7ff1fc079f70 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:17.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.413+0000 7ff216979700 1 --2- 192.168.123.103:0/865823544 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff21010d0f0 0x7ff21011bf80 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:17.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.413+0000 7ff216979700 1 --2- 192.168.123.103:0/865823544 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff21010f340 0x7ff210116f80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:17.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.413+0000 7ff216979700 1 -- 192.168.123.103:0/865823544 >> 192.168.123.103:0/865823544 conn(0x7ff21006ce20 msgr2=0x7ff210070340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:17.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.414+0000 7ff216979700 1 -- 192.168.123.103:0/865823544 shutdown_connections 2026-03-09T00:09:17.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.414+0000 7ff216979700 1 -- 192.168.123.103:0/865823544 wait complete. 2026-03-09T00:09:17.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:17 vm06.local ceph-mon[106218]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 6 pgs peering) 2026-03-09T00:09:17.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:17 vm06.local ceph-mon[106218]: from='client.34286 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:09:17.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:17 vm06.local ceph-mon[106218]: from='client.44209 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:09:17.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:17 vm06.local ceph-mon[106218]: from='client.44213 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:09:17.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:17 vm06.local ceph-mon[106218]: from='client.? 
192.168.123.103:0/4104941679' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:09:17.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.489+0000 7fb395ee1700 1 -- 192.168.123.103:0/544965836 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb390069b20 msgr2=0x7fb39010d640 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:17.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.489+0000 7fb395ee1700 1 --2- 192.168.123.103:0/544965836 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb390069b20 0x7fb39010d640 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fb378009b50 tx=0x7fb378009e60 comp rx=0 tx=0).stop 2026-03-09T00:09:17.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.490+0000 7fb395ee1700 1 -- 192.168.123.103:0/544965836 shutdown_connections 2026-03-09T00:09:17.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.490+0000 7fb395ee1700 1 --2- 192.168.123.103:0/544965836 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb390069b20 0x7fb39010d640 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:17.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.490+0000 7fb395ee1700 1 --2- 192.168.123.103:0/544965836 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb390069200 0x7fb3900695e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:17.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.490+0000 7fb395ee1700 1 -- 192.168.123.103:0/544965836 >> 192.168.123.103:0/544965836 conn(0x7fb390076b30 msgr2=0x7fb390076f40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:17.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.490+0000 7fb395ee1700 1 -- 192.168.123.103:0/544965836 shutdown_connections 2026-03-09T00:09:17.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.490+0000 7fb395ee1700 1 -- 192.168.123.103:0/544965836 wait complete. 
2026-03-09T00:09:17.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.491+0000 7fb395ee1700 1 Processor -- start 2026-03-09T00:09:17.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.491+0000 7fb395ee1700 1 -- start start 2026-03-09T00:09:17.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.491+0000 7fb395ee1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb390069200 0x7fb390198f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:17.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.491+0000 7fb395ee1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb390069b20 0x7fb390199490 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:17.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.491+0000 7fb395ee1700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb390199b70 con 0x7fb390069b20 2026-03-09T00:09:17.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.491+0000 7fb395ee1700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb39019d900 con 0x7fb390069200 2026-03-09T00:09:17.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.491+0000 7fb386dff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb390069b20 0x7fb390199490 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:17.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.492+0000 7fb38f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb390069200 0x7fb390198f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:17.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.492+0000 7fb38f7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb390069200 0x7fb390198f50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:36102/0 (socket says 192.168.123.103:36102) 2026-03-09T00:09:17.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.492+0000 7fb38f7fe700 1 -- 192.168.123.103:0/2734168569 learned_addr learned my addr 192.168.123.103:0/2734168569 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:09:17.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.492+0000 7fb386dff700 1 -- 192.168.123.103:0/2734168569 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb390069200 msgr2=0x7fb390198f50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:17.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.492+0000 7fb386dff700 1 --2- 192.168.123.103:0/2734168569 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb390069200 0x7fb390198f50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:17.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.492+0000 7fb386dff700 1 -- 192.168.123.103:0/2734168569 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb3780097e0 con 0x7fb390069b20 
2026-03-09T00:09:17.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.492+0000 7fb386dff700 1 --2- 192.168.123.103:0/2734168569 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb390069b20 0x7fb390199490 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7fb378009b50 tx=0x7fb378004c80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:09:17.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.493+0000 7fb38d7fa700 1 -- 192.168.123.103:0/2734168569 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb37801d070 con 0x7fb390069b20 2026-03-09T00:09:17.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.493+0000 7fb38d7fa700 1 -- 192.168.123.103:0/2734168569 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fb378022470 con 0x7fb390069b20 2026-03-09T00:09:17.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.493+0000 7fb395ee1700 1 -- 192.168.123.103:0/2734168569 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb39019db80 con 0x7fb390069b20 2026-03-09T00:09:17.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.493+0000 7fb38d7fa700 1 -- 192.168.123.103:0/2734168569 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb37800f650 con 0x7fb390069b20 2026-03-09T00:09:17.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.493+0000 7fb395ee1700 1 -- 192.168.123.103:0/2734168569 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb39019e070 con 0x7fb390069b20 2026-03-09T00:09:17.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.494+0000 7fb38d7fa700 1 -- 192.168.123.103:0/2734168569 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb378022a80 con 0x7fb390069b20 2026-03-09T00:09:17.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.494+0000 7fb395ee1700 1 -- 192.168.123.103:0/2734168569 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb39010adb0 con 0x7fb390069b20 2026-03-09T00:09:17.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.497+0000 7fb38d7fa700 1 --2- 192.168.123.103:0/2734168569 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb37c077870 0x7fb37c079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:17.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.497+0000 7fb38d7fa700 1 -- 192.168.123.103:0/2734168569 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6165+0+0 (secure 0 0 0) 0x7fb37809af10 con 0x7fb390069b20 2026-03-09T00:09:17.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.497+0000 7fb38f7fe700 1 --2- 192.168.123.103:0/2734168569 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb37c077870 0x7fb37c079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:17.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.498+0000 7fb38f7fe700 1 --2- 192.168.123.103:0/2734168569 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] 
conn(0x7fb37c077870 0x7fb37c079d30 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7fb38000bd30 tx=0x7fb38000b480 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:09:17.498 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.498+0000 7fb38d7fa700 1 -- 192.168.123.103:0/2734168569 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb3780637d0 con 0x7fb390069b20
2026-03-09T00:09:17.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.678+0000 7fb395ee1700 1 -- 192.168.123.103:0/2734168569 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fb390066e80 con 0x7fb390069b20
2026-03-09T00:09:17.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.679+0000 7fb38d7fa700 1 -- 192.168.123.103:0/2734168569 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1331 (secure 0 0 0) 0x7fb378027550 con 0x7fb390069b20
2026-03-09T00:09:17.681 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data; 1 osds down; Degraded data redundancy: 45/231 objects degraded (19.481%), 14 pgs degraded, 1 pg undersized
2026-03-09T00:09:17.681 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data
2026-03-09T00:09:17.681 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled.
2026-03-09T00:09:17.681 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] OSD_DOWN: 1 osds down
2026-03-09T00:09:17.681 INFO:teuthology.orchestra.run.vm03.stdout: osd.1 (root=default,host=vm03) is down
2026-03-09T00:09:17.681 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 45/231 objects degraded (19.481%), 14 pgs degraded, 1 pg undersized
2026-03-09T00:09:17.681 INFO:teuthology.orchestra.run.vm03.stdout: pg 1.0 is active+undersized+degraded, acting [3,0]
2026-03-09T00:09:17.682 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.0 is active+undersized+degraded, acting [3,0]
2026-03-09T00:09:17.682 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.1 is active+undersized+degraded, acting [2,0]
2026-03-09T00:09:17.682 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.2 is active+undersized+degraded, acting [5,0]
2026-03-09T00:09:17.682 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.3 is active+undersized+degraded, acting [5,2]
2026-03-09T00:09:17.682 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.4 is active+undersized+degraded, acting [0,4]
2026-03-09T00:09:17.682 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.6 is active+undersized+degraded, acting [3,4]
2026-03-09T00:09:17.682 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.9 is active+undersized+degraded, acting [4,0]
2026-03-09T00:09:17.682 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.a is active+undersized+degraded, acting [4,3]
2026-03-09T00:09:17.682 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.d is active+undersized+degraded, acting [3,2]
2026-03-09T00:09:17.682 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.10 is active+undersized+degraded, acting [2,0]
2026-03-09T00:09:17.682 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.15 is active+undersized+degraded, acting [3,0]
2026-03-09T00:09:17.682 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.1b is active+undersized+degraded, acting [0,5]
2026-03-09T00:09:17.682 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.1f is stuck undersized for 2m, current state active+recovering+undersized+degraded+remapped, last acting [2,3]
2026-03-09T00:09:17.684 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.683+0000 7fb395ee1700 1 -- 192.168.123.103:0/2734168569 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb37c077870 msgr2=0x7fb37c079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:17.684 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.683+0000 7fb395ee1700 1 --2- 192.168.123.103:0/2734168569 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb37c077870 0x7fb37c079d30 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7fb38000bd30 tx=0x7fb38000b480 comp rx=0 tx=0).stop
2026-03-09T00:09:17.684 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.684+0000 7fb395ee1700 1 -- 192.168.123.103:0/2734168569 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb390069b20 msgr2=0x7fb390199490 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:17.684 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.684+0000 7fb395ee1700 1 --2- 192.168.123.103:0/2734168569 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb390069b20 0x7fb390199490 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7fb378009b50 tx=0x7fb378004c80 comp rx=0 tx=0).stop
2026-03-09T00:09:17.684 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.684+0000 7fb395ee1700 1 -- 192.168.123.103:0/2734168569 shutdown_connections
2026-03-09T00:09:17.684 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.684+0000 7fb395ee1700 1 --2- 192.168.123.103:0/2734168569 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb37c077870 0x7fb37c079d30 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:17.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.684+0000 7fb395ee1700 1 --2- 192.168.123.103:0/2734168569 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb390069200 0x7fb390198f50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:17.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.684+0000 7fb395ee1700 1 --2- 192.168.123.103:0/2734168569 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb390069b20 0x7fb390199490 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:17.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.685+0000 7fb395ee1700 1 -- 192.168.123.103:0/2734168569 >> 192.168.123.103:0/2734168569 conn(0x7fb390076b30 msgr2=0x7fb3900feb50 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:09:17.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.685+0000 7fb395ee1700 1 -- 192.168.123.103:0/2734168569 shutdown_connections
2026-03-09T00:09:17.685 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:17.685+0000 7fb395ee1700 1 -- 192.168.123.103:0/2734168569 wait complete.
2026-03-09T00:09:18.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:18 vm03.local ceph-mon[129670]: from='client.?
192.168.123.103:0/723259394' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T00:09:18.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:18 vm03.local ceph-mon[129670]: from='osd.1 [v2:192.168.123.103:6810/82548100,v1:192.168.123.103:6811/82548100]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-09T00:09:18.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:18 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 45/231 objects degraded (19.481%), 14 pgs degraded, 1 pg undersized (PG_DEGRADED) 2026-03-09T00:09:18.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:18 vm03.local ceph-mon[129670]: from='client.34302 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:09:18.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:18 vm03.local ceph-mon[129670]: pgmap v146: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 21 active+undersized, 13 active+undersized+degraded, 30 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 45/231 objects degraded (19.481%) 2026-03-09T00:09:18.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:18 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2734168569' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T00:09:18.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:18 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/723259394' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T00:09:18.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:18 vm06.local ceph-mon[106218]: from='osd.1 [v2:192.168.123.103:6810/82548100,v1:192.168.123.103:6811/82548100]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-09T00:09:18.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:18 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 45/231 objects degraded (19.481%), 14 pgs degraded, 1 pg undersized (PG_DEGRADED) 2026-03-09T00:09:18.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:18 vm06.local ceph-mon[106218]: from='client.34302 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:09:18.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:18 vm06.local ceph-mon[106218]: pgmap v146: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 21 active+undersized, 13 active+undersized+degraded, 30 active+clean; 209 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 45/231 objects degraded (19.481%) 2026-03-09T00:09:18.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:18 vm06.local ceph-mon[106218]: from='client.? 
192.168.123.103:0/2734168569' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T00:09:19.338 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:09:19 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1[145089]: 2026-03-09T00:09:19.135+0000 7f879ec17640 -1 osd.1 78 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T00:09:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:19 vm03.local ceph-mon[129670]: from='osd.1 [v2:192.168.123.103:6810/82548100,v1:192.168.123.103:6811/82548100]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-09T00:09:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:19 vm03.local ceph-mon[129670]: osdmap e81: 6 total, 5 up, 6 in 2026-03-09T00:09:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:19 vm03.local ceph-mon[129670]: from='osd.1 [v2:192.168.123.103:6810/82548100,v1:192.168.123.103:6811/82548100]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T00:09:19.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:19 vm06.local ceph-mon[106218]: from='osd.1 [v2:192.168.123.103:6810/82548100,v1:192.168.123.103:6811/82548100]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-09T00:09:19.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:19 vm06.local ceph-mon[106218]: osdmap e81: 6 total, 5 up, 6 in 2026-03-09T00:09:19.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:19 vm06.local ceph-mon[106218]: from='osd.1 [v2:192.168.123.103:6810/82548100,v1:192.168.123.103:6811/82548100]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T00:09:20.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:20 vm03.local ceph-mon[129670]: osdmap e82: 6 total, 5 up, 6 in 2026-03-09T00:09:20.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:20 vm03.local ceph-mon[129670]: from='osd.1 [v2:192.168.123.103:6810/82548100,v1:192.168.123.103:6811/82548100]' entity='osd.1' 2026-03-09T00:09:20.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:20 vm03.local ceph-mon[129670]: pgmap v149: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 21 active+undersized, 13 active+undersized+degraded, 30 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 45/231 objects degraded (19.481%); 0 B/s, 5 objects/s recovering 2026-03-09T00:09:20.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:20 vm06.local ceph-mon[106218]: osdmap e82: 6 total, 5 up, 6 in 2026-03-09T00:09:20.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:20 vm06.local ceph-mon[106218]: from='osd.1 [v2:192.168.123.103:6810/82548100,v1:192.168.123.103:6811/82548100]' entity='osd.1' 2026-03-09T00:09:20.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:20 vm06.local ceph-mon[106218]: pgmap v149: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 21 active+undersized, 13 active+undersized+degraded, 30 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 45/231 objects degraded (19.481%); 0 B/s, 5 objects/s recovering 2026-03-09T00:09:21.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:21 vm03.local ceph-mon[129670]: Health check cleared: OSD_DOWN (was: 1 osds down) 
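This stretch is the expected churn for one redeployed OSD: OSD_DOWN and PG_DEGRADED are raised while osd.1 restarts on the new image, the daemon re-registers its CRUSH device class and location, and OSD_DOWN clears here at 00:09:21 once it boots (the degraded-object count then drains over the following pgmap versions). A gate that waits for this churn to settle before touching the next daemon could look like the sketch below, again assuming the `ceph` CLI; the health-check names are the ones raised in this log, and the timeout is an arbitrary illustration:

    import json
    import subprocess
    import time

    def active_checks():
        # `ceph health --format json` lists raised checks keyed by name.
        out = subprocess.check_output(["ceph", "health", "--format", "json"])
        return json.loads(out).get("checks", {})

    deadline = time.time() + 600  # illustrative 10-minute budget
    while time.time() < deadline:
        if not ({"OSD_DOWN", "PG_DEGRADED", "PG_AVAILABILITY"} & active_checks().keys()):
            break                 # churn from the restarted OSD has cleared
        time.sleep(5)
    else:
        raise TimeoutError("cluster did not settle after OSD restart")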
2026-03-09T00:09:21.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:21 vm03.local ceph-mon[129670]: osd.1 [v2:192.168.123.103:6810/82548100,v1:192.168.123.103:6811/82548100] boot 2026-03-09T00:09:21.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:21 vm03.local ceph-mon[129670]: osdmap e83: 6 total, 6 up, 6 in 2026-03-09T00:09:21.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T00:09:21.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:21 vm06.local ceph-mon[106218]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T00:09:21.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:21 vm06.local ceph-mon[106218]: osd.1 [v2:192.168.123.103:6810/82548100,v1:192.168.123.103:6811/82548100] boot 2026-03-09T00:09:21.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:21 vm06.local ceph-mon[106218]: osdmap e83: 6 total, 6 up, 6 in 2026-03-09T00:09:21.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-09T00:09:22.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:22 vm03.local ceph-mon[129670]: osdmap e84: 6 total, 6 up, 6 in 2026-03-09T00:09:22.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:22 vm03.local ceph-mon[129670]: pgmap v152: 65 pgs: 2 peering, 19 active+undersized, 13 active+undersized+degraded, 31 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 32/231 objects degraded (13.853%); 0 B/s, 6 objects/s recovering 2026-03-09T00:09:22.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:22 vm06.local ceph-mon[106218]: osdmap e84: 6 total, 6 up, 6 in 2026-03-09T00:09:22.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:22 vm06.local ceph-mon[106218]: pgmap v152: 65 pgs: 2 peering, 19 active+undersized, 13 active+undersized+degraded, 31 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 32/231 objects degraded (13.853%); 0 B/s, 6 objects/s recovering 2026-03-09T00:09:23.553 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:23 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 32/231 objects degraded (13.853%), 13 pgs degraded (PG_DEGRADED) 2026-03-09T00:09:23.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:23 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 32/231 objects degraded (13.853%), 13 pgs degraded (PG_DEGRADED) 2026-03-09T00:09:24.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:24 vm03.local ceph-mon[129670]: pgmap v153: 65 pgs: 2 peering, 14 active+undersized, 6 active+undersized+degraded, 43 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 16/231 objects degraded (6.926%); 0 B/s, 4 objects/s recovering 2026-03-09T00:09:24.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:24 vm06.local ceph-mon[106218]: pgmap v153: 65 pgs: 2 peering, 14 active+undersized, 6 active+undersized+degraded, 43 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 16/231 objects degraded (6.926%); 0 B/s, 4 objects/s recovering 2026-03-09T00:09:26.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:25 vm03.local ceph-mon[129670]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 16/231 objects degraded (6.926%), 
6 pgs degraded)
2026-03-09T00:09:26.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:25 vm06.local ceph-mon[106218]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 16/231 objects degraded (6.926%), 6 pgs degraded)
2026-03-09T00:09:27.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:26 vm03.local ceph-mon[129670]: pgmap v154: 65 pgs: 2 peering, 63 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 3 objects/s recovering
2026-03-09T00:09:27.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:26 vm06.local ceph-mon[106218]: pgmap v154: 65 pgs: 2 peering, 63 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 3 objects/s recovering
2026-03-09T00:09:28.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:28 vm06.local ceph-mon[106218]: pgmap v155: 65 pgs: 65 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 3 objects/s recovering
2026-03-09T00:09:29.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:28 vm03.local ceph-mon[129670]: pgmap v155: 65 pgs: 65 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 3 objects/s recovering
2026-03-09T00:09:30.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:29 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:30.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:29 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:09:30.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:29 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:30.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:29 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:09:30.916 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:30 vm03.local ceph-mon[129670]: pgmap v156: 65 pgs: 65 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 2 objects/s recovering
2026-03-09T00:09:30.916 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:30 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-09T00:09:30.916 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:30 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:30.916 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:30 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
2026-03-09T00:09:30.916 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:30 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:09:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:30 vm06.local ceph-mon[106218]: pgmap v156: 65 pgs: 65 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 0 B/s, 2 objects/s recovering
2026-03-09T00:09:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:30 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-09T00:09:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:30 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:30 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
2026-03-09T00:09:31.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:30 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:09:31.743 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:31 vm03.local systemd[1]: Stopping Ceph osd.2 for ae8f0172-1b4a-11f1-916a-712b2ac006b7...
2026-03-09T00:09:32.012 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:31 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2[83418]: 2026-03-09T00:09:31.741+0000 7f54fbbe8700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0
2026-03-09T00:09:32.012 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:31 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2[83418]: 2026-03-09T00:09:31.741+0000 7f54fbbe8700 -1 osd.2 84 *** Got signal Terminated ***
2026-03-09T00:09:32.012 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:31 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2[83418]: 2026-03-09T00:09:31.741+0000 7f54fbbe8700 -1 osd.2 84 *** Immediate shutdown (osd_fast_shutdown=true) ***
2026-03-09T00:09:32.012 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:31 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-09T00:09:32.012 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:31 vm03.local ceph-mon[129670]: Upgrade: osd.2 is safe to restart
2026-03-09T00:09:32.012 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:31 vm03.local ceph-mon[129670]: Upgrade: Updating osd.2
2026-03-09T00:09:32.012 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:31 vm03.local ceph-mon[129670]: Deploying daemon osd.2 on vm03
2026-03-09T00:09:32.012 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:31 vm03.local ceph-mon[129670]: osd.2 marked itself down and dead
2026-03-09T00:09:32.270 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local podman[149482]: 2026-03-09 00:09:32.011969288 +0000 UTC m=+0.285781628 container died 00566abbcc16c85e9305605fb8d9b26a07ca6b3bbe120b894b12e2bb9c54e182 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2, org.label-schema.build-date=20240222, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.1, GIT_CLEAN=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, RELEASE=HEAD, ceph=True, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.29.1)
2026-03-09T00:09:32.270 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local podman[149482]: 2026-03-09 00:09:32.034729604 +0000 UTC m=+0.308541924 container remove 00566abbcc16c85e9305605fb8d9b26a07ca6b3bbe120b894b12e2bb9c54e182 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2, ceph=True, io.buildah.version=1.29.1, org.label-schema.build-date=20240222, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.1, GIT_CLEAN=True)
2026-03-09T00:09:32.270 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local bash[149482]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2
2026-03-09T00:09:32.270 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local podman[149549]: 2026-03-09 00:09:32.178892658 +0000 UTC m=+0.018931962 container create 1e5167288b9386e38c2986b29771da80d66088c66df8209d6b51e5ec16c1adc7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
2026-03-09T00:09:32.270 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local podman[149549]: 2026-03-09 00:09:32.220581851 +0000 UTC m=+0.060621165 container init 1e5167288b9386e38c2986b29771da80d66088c66df8209d6b51e5ec16c1adc7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-deactivate, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
2026-03-09T00:09:32.270 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local podman[149549]: 2026-03-09 00:09:32.223779114 +0000 UTC m=+0.063818418 container start 1e5167288b9386e38c2986b29771da80d66088c66df8209d6b51e5ec16c1adc7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git)
2026-03-09T00:09:32.270 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local podman[149549]: 2026-03-09 00:09:32.228412225 +0000 UTC m=+0.068451529 container attach 1e5167288b9386e38c2986b29771da80d66088c66df8209d6b51e5ec16c1adc7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-deactivate, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3)
2026-03-09T00:09:32.271 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local podman[149549]: 2026-03-09 00:09:32.172103853 +0000 UTC m=+0.012143168 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-09T00:09:32.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:31 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch
2026-03-09T00:09:32.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:31 vm06.local ceph-mon[106218]: Upgrade: osd.2 is safe to restart
2026-03-09T00:09:32.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:31 vm06.local ceph-mon[106218]: Upgrade: Updating osd.2
2026-03-09T00:09:32.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:31 vm06.local ceph-mon[106218]: Deploying daemon osd.2 on vm03
2026-03-09T00:09:32.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:31 vm06.local ceph-mon[106218]: osd.2 marked itself down and dead
2026-03-09T00:09:32.565 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local podman[149549]: 2026-03-09 00:09:32.35766745 +0000 UTC m=+0.197706754 container died 1e5167288b9386e38c2986b29771da80d66088c66df8209d6b51e5ec16c1adc7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-deactivate, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
2026-03-09T00:09:32.565 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local podman[149549]: 2026-03-09 00:09:32.376538647 +0000 UTC m=+0.216577951 container remove 1e5167288b9386e38c2986b29771da80d66088c66df8209d6b51e5ec16c1adc7 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
2026-03-09T00:09:32.565 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.2.service: Deactivated successfully.
2026-03-09T00:09:32.565 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local systemd[1]: Stopped Ceph osd.2 for ae8f0172-1b4a-11f1-916a-712b2ac006b7.
2026-03-09T00:09:32.565 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.2.service: Consumed 38.542s CPU time.
2026-03-09T00:09:32.822 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local systemd[1]: Starting Ceph osd.2 for ae8f0172-1b4a-11f1-916a-712b2ac006b7...
2026-03-09T00:09:32.822 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local podman[149653]: 2026-03-09 00:09:32.681579147 +0000 UTC m=+0.023318952 container create 52dd9b0860f9686e345d46f2c8c280ea1b84ecbe41fe82796fef99f9983fb5d6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True)
2026-03-09T00:09:32.822 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local podman[149653]: 2026-03-09 00:09:32.7265761 +0000 UTC m=+0.068315905 container init 52dd9b0860f9686e345d46f2c8c280ea1b84ecbe41fe82796fef99f9983fb5d6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df)
2026-03-09T00:09:32.822 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local podman[149653]: 2026-03-09 00:09:32.73067986 +0000 UTC m=+0.072419665 container start 52dd9b0860f9686e345d46f2c8c280ea1b84ecbe41fe82796fef99f9983fb5d6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
2026-03-09T00:09:32.822 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local podman[149653]: 2026-03-09 00:09:32.735399304 +0000 UTC m=+0.077139118 container attach 52dd9b0860f9686e345d46f2c8c280ea1b84ecbe41fe82796fef99f9983fb5d6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223)
2026-03-09T00:09:32.822 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local podman[149653]: 2026-03-09 00:09:32.671042599 +0000 UTC m=+0.012782404 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-09T00:09:32.822 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate[149664]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:09:33.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:32 vm03.local ceph-mon[129670]: pgmap v157: 65 pgs: 65 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail
2026-03-09T00:09:33.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:32 vm03.local ceph-mon[129670]: Health check failed: 1 osds down (OSD_DOWN)
2026-03-09T00:09:33.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:32 vm03.local ceph-mon[129670]: osdmap e85: 6 total, 5 up, 6 in
2026-03-09T00:09:33.088 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local bash[149653]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:09:33.088 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate[149664]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:09:33.089 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:32 vm03.local bash[149653]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:09:33.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:32 vm06.local ceph-mon[106218]: pgmap v157: 65 pgs: 65 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail
2026-03-09T00:09:33.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:32 vm06.local ceph-mon[106218]: Health check failed: 1 osds down (OSD_DOWN)
2026-03-09T00:09:33.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:32 vm06.local ceph-mon[106218]: osdmap e85: 6 total, 5 up, 6 in
2026-03-09T00:09:33.693 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate[149664]: --> Failed to activate via raw: did not find any matching OSD to activate
2026-03-09T00:09:33.693 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate[149664]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:09:33.693 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local bash[149653]: --> Failed to activate via raw: did not find any matching OSD to activate
2026-03-09T00:09:33.693 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local bash[149653]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:09:33.693 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate[149664]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:09:33.693 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local bash[149653]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:09:33.693 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate[149664]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
2026-03-09T00:09:33.693 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local bash[149653]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
2026-03-09T00:09:33.693 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate[149664]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-25f6f163-2df8-4d26-9084-0b86fe620a73/osd-block-57704364-d509-479a-8dff-0b9f590cc6d0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
2026-03-09T00:09:33.693 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local bash[149653]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-25f6f163-2df8-4d26-9084-0b86fe620a73/osd-block-57704364-d509-479a-8dff-0b9f590cc6d0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
2026-03-09T00:09:33.693 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate[149664]: Running command: /usr/bin/ln -snf /dev/ceph-25f6f163-2df8-4d26-9084-0b86fe620a73/osd-block-57704364-d509-479a-8dff-0b9f590cc6d0 /var/lib/ceph/osd/ceph-2/block
2026-03-09T00:09:33.955 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local bash[149653]: Running command: /usr/bin/ln -snf /dev/ceph-25f6f163-2df8-4d26-9084-0b86fe620a73/osd-block-57704364-d509-479a-8dff-0b9f590cc6d0 /var/lib/ceph/osd/ceph-2/block
2026-03-09T00:09:33.955 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate[149664]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
2026-03-09T00:09:33.956 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local bash[149653]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
2026-03-09T00:09:33.956 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate[149664]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2
2026-03-09T00:09:33.956 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local bash[149653]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2
2026-03-09T00:09:33.956 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate[149664]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
2026-03-09T00:09:33.956 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local bash[149653]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
2026-03-09T00:09:33.956 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate[149664]: --> ceph-volume lvm activate successful for osd ID: 2
2026-03-09T00:09:33.956 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local bash[149653]: --> ceph-volume lvm activate successful for osd ID: 2
2026-03-09T00:09:33.956 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local podman[149653]: 2026-03-09 00:09:33.729221945 +0000 UTC m=+1.070961740 container died 52dd9b0860f9686e345d46f2c8c280ea1b84ecbe41fe82796fef99f9983fb5d6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image)
2026-03-09T00:09:33.956 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local podman[149653]: 2026-03-09 00:09:33.754314646 +0000 UTC m=+1.096054451 container remove 52dd9b0860f9686e345d46f2c8c280ea1b84ecbe41fe82796fef99f9983fb5d6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.build-date=20260223)
2026-03-09T00:09:33.956 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local podman[149923]: 2026-03-09 00:09:33.864569782 +0000 UTC m=+0.019800619 container create e7841e7307ae35f7cca41e5e3e7f3843934706ec142a1bdc9863e0a94fc0a2db (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.build-date=20260223, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
2026-03-09T00:09:33.956 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local podman[149923]: 2026-03-09 00:09:33.916648708 +0000 UTC m=+0.071879556 container init e7841e7307ae35f7cca41e5e3e7f3843934706ec142a1bdc9863e0a94fc0a2db (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
2026-03-09T00:09:33.956 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local podman[149923]: 2026-03-09 00:09:33.919498882 +0000 UTC m=+0.074729719 container start e7841e7307ae35f7cca41e5e3e7f3843934706ec142a1bdc9863e0a94fc0a2db (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223)
2026-03-09T00:09:33.956 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local bash[149923]: e7841e7307ae35f7cca41e5e3e7f3843934706ec142a1bdc9863e0a94fc0a2db
2026-03-09T00:09:33.956 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local podman[149923]: 2026-03-09 00:09:33.856673222 +0000 UTC m=+0.011904070 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-09T00:09:33.956 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:33 vm03.local systemd[1]: Started Ceph osd.2 for ae8f0172-1b4a-11f1-916a-712b2ac006b7.
2026-03-09T00:09:33.956 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:33 vm03.local ceph-mon[129670]: osdmap e86: 6 total, 5 up, 6 in
2026-03-09T00:09:33.956 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:33 vm03.local ceph-mon[129670]: pgmap v160: 65 pgs: 4 active+undersized, 5 stale+active+clean, 3 active+undersized+degraded, 53 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 13/231 objects degraded (5.628%)
2026-03-09T00:09:33.956 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:34.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:33 vm06.local ceph-mon[106218]: osdmap e86: 6 total, 5 up, 6 in
2026-03-09T00:09:34.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:33 vm06.local ceph-mon[106218]: pgmap v160: 65 pgs: 4 active+undersized, 5 stale+active+clean, 3 active+undersized+degraded, 53 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 13/231 objects degraded (5.628%)
2026-03-09T00:09:34.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:34.739 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:34 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2[149933]: 2026-03-09T00:09:34.511+0000 7fec6b62b740 -1 Falling back to public interface
2026-03-09T00:09:35.019 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:34 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:35.019 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:34 vm03.local ceph-mon[129670]: Health check failed: Degraded data redundancy: 13/231 objects degraded (5.628%), 3 pgs degraded (PG_DEGRADED)
2026-03-09T00:09:35.019 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:34 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:35.019 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:34 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:09:35.019 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:34 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:35.019 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:34 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:35.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:34 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:35.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:34 vm06.local ceph-mon[106218]: Health check failed: Degraded data redundancy: 13/231 objects degraded (5.628%), 3 pgs degraded (PG_DEGRADED)
2026-03-09T00:09:35.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:34 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:35.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:34 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:09:35.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:34 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:35.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:34 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:36.833 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:36 vm03.local ceph-mon[129670]: pgmap v161: 65 pgs: 11 active+undersized, 3 stale+active+clean, 8 active+undersized+degraded, 43 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 23/231 objects degraded (9.957%)
2026-03-09T00:09:36.833 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:36 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:36.833 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:36 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:36.833 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:36 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:36.833 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:36 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:36.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:36 vm06.local ceph-mon[106218]: pgmap v161: 65 pgs: 11 active+undersized, 3 stale+active+clean, 8 active+undersized+degraded, 43 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 23/231 objects degraded (9.957%)
2026-03-09T00:09:36.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:36 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:36.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:36 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:36.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:36 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:36.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:36 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:38 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:38 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:38 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:09:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:38 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:09:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:38 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:09:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:38 vm03.local ceph-mon[129670]: pgmap v162: 65 pgs: 16 active+undersized, 11 active+undersized+degraded, 38 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 32/231 objects degraded (13.853%)
2026-03-09T00:09:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:38 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:09:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:38 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:09:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:38 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:09:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:38 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:09:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:38 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch
2026-03-09T00:09:38.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:38 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch
2026-03-09T00:09:38.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:38 vm03.local ceph-mon[129670]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline)
2026-03-09T00:09:38.839 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:38 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2[149933]: 2026-03-09T00:09:38.508+0000 7fec6b62b740 -1 osd.2 0 read_superblock omap replica is missing.
2026-03-09T00:09:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:38 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:38 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:38 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:09:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:38 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:09:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:38 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:38 vm06.local ceph-mon[106218]: pgmap v162: 65 pgs: 16 active+undersized, 11 active+undersized+degraded, 38 active+clean; 209 MiB data, 2.5 GiB used, 117 GiB / 120 GiB avail; 32/231 objects degraded (13.853%) 2026-03-09T00:09:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:38 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:09:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:38 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:09:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:38 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:09:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:38 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:09:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:38 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T00:09:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:38 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T00:09:38.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:38 vm06.local ceph-mon[106218]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline) 2026-03-09T00:09:39.338 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:38 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2[149933]: 2026-03-09T00:09:38.893+0000 7fec6b62b740 -1 osd.2 84 log_to_monitors true 2026-03-09T00:09:39.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:39 vm03.local ceph-mon[129670]: from='osd.2 [v2:192.168.123.103:6818/1046798432,v1:192.168.123.103:6819/1046798432]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-09T00:09:39.838 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:09:39 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2[149933]: 2026-03-09T00:09:39.570+0000 7fec633c5640 -1 osd.2 84 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T00:09:39.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:39 vm06.local ceph-mon[106218]: from='osd.2 [v2:192.168.123.103:6818/1046798432,v1:192.168.123.103:6819/1046798432]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-09T00:09:40.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:40 vm03.local ceph-mon[129670]: from='osd.2 [v2:192.168.123.103:6818/1046798432,v1:192.168.123.103:6819/1046798432]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-09T00:09:40.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:40 vm03.local ceph-mon[129670]: osdmap e87: 6 total, 5 up, 6 in 2026-03-09T00:09:40.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:40 vm03.local ceph-mon[129670]: from='osd.2 [v2:192.168.123.103:6818/1046798432,v1:192.168.123.103:6819/1046798432]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T00:09:40.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:40 vm03.local ceph-mon[129670]: pgmap v164: 65 pgs: 16 active+undersized, 11 active+undersized+degraded, 38 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 32/231 objects degraded (13.853%) 2026-03-09T00:09:40.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:40 vm06.local ceph-mon[106218]: from='osd.2 [v2:192.168.123.103:6818/1046798432,v1:192.168.123.103:6819/1046798432]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-09T00:09:40.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:40 vm06.local ceph-mon[106218]: osdmap e87: 6 total, 5 up, 6 in 2026-03-09T00:09:40.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:40 vm06.local ceph-mon[106218]: from='osd.2 [v2:192.168.123.103:6818/1046798432,v1:192.168.123.103:6819/1046798432]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm03", "root=default"]}]: dispatch 2026-03-09T00:09:40.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:40 vm06.local ceph-mon[106218]: pgmap v164: 65 pgs: 16 active+undersized, 11 active+undersized+degraded, 38 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 32/231 objects degraded (13.853%) 2026-03-09T00:09:41.838 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:41 vm03.local ceph-mon[129670]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T00:09:41.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:41 vm03.local ceph-mon[129670]: osd.2 [v2:192.168.123.103:6818/1046798432,v1:192.168.123.103:6819/1046798432] boot 2026-03-09T00:09:41.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:41 vm03.local ceph-mon[129670]: osdmap e88: 6 total, 6 up, 6 in 2026-03-09T00:09:41.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T00:09:41.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:41 vm06.local ceph-mon[106218]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T00:09:41.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:41 vm06.local ceph-mon[106218]: osd.2 [v2:192.168.123.103:6818/1046798432,v1:192.168.123.103:6819/1046798432] boot 2026-03-09T00:09:41.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:41 vm06.local ceph-mon[106218]: osdmap e88: 6 total, 6 up, 6 in 2026-03-09T00:09:41.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-09T00:09:42.922 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:42 vm06.local ceph-mon[106218]: osdmap e89: 6 total, 6 up, 6 in 2026-03-09T00:09:42.922 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:42 vm06.local ceph-mon[106218]: pgmap v167: 65 pgs: 5 peering, 12 active+undersized, 10 active+undersized+degraded, 38 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 29/231 objects degraded (12.554%) 2026-03-09T00:09:42.922 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:42 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 32/231 objects degraded (13.853%), 11 pgs degraded (PG_DEGRADED) 2026-03-09T00:09:43.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:42 vm03.local ceph-mon[129670]: osdmap e89: 6 total, 6 up, 6 in 2026-03-09T00:09:43.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:42 vm03.local ceph-mon[129670]: pgmap v167: 65 pgs: 5 peering, 12 active+undersized, 10 active+undersized+degraded, 38 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 29/231 objects degraded (12.554%) 2026-03-09T00:09:43.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:42 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 32/231 objects degraded (13.853%), 11 pgs degraded (PG_DEGRADED) 2026-03-09T00:09:45.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:44 vm03.local ceph-mon[129670]: pgmap v168: 65 pgs: 5 peering, 10 active+undersized, 9 active+undersized+degraded, 41 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 26/231 objects degraded (11.255%) 2026-03-09T00:09:45.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:45.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:09:45.171 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:44 vm06.local ceph-mon[106218]: pgmap v168: 65 pgs: 5 peering, 10 active+undersized, 9 active+undersized+degraded, 41 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 26/231 objects degraded (11.255%) 2026-03-09T00:09:45.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:44 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:45.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:44 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:09:46.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:45 vm03.local ceph-mon[129670]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 26/231 objects degraded (11.255%), 9 pgs degraded) 2026-03-09T00:09:46.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:45 vm06.local ceph-mon[106218]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 26/231 objects degraded (11.255%), 9 pgs degraded) 2026-03-09T00:09:47.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:46 vm03.local ceph-mon[129670]: pgmap v169: 65 pgs: 5 peering, 60 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:09:47.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:46 vm06.local ceph-mon[106218]: pgmap v169: 65 pgs: 5 peering, 60 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:09:47.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.792+0000 7fd23df80700 1 -- 192.168.123.103:0/2605624377 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd238102a00 msgr2=0x7fd23810aef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:47.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.792+0000 7fd23df80700 1 --2- 192.168.123.103:0/2605624377 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd238102a00 0x7fd23810aef0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fd228009b80 tx=0x7fd228009e90 comp rx=0 tx=0).stop 2026-03-09T00:09:47.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.793+0000 7fd23df80700 1 -- 192.168.123.103:0/2605624377 shutdown_connections 2026-03-09T00:09:47.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.793+0000 7fd23df80700 1 --2- 192.168.123.103:0/2605624377 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd238102a00 0x7fd23810aef0 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:47.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.793+0000 7fd23df80700 1 --2- 192.168.123.103:0/2605624377 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd2381020e0 0x7fd2381024c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:47.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.793+0000 7fd23df80700 1 -- 192.168.123.103:0/2605624377 >> 192.168.123.103:0/2605624377 conn(0x7fd238076e80 msgr2=0x7fd238077290 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:47.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.794+0000 7fd23df80700 1 -- 192.168.123.103:0/2605624377 shutdown_connections 2026-03-09T00:09:47.794 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.794+0000 7fd23df80700 1 -- 
192.168.123.103:0/2605624377 wait complete. 2026-03-09T00:09:47.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.795+0000 7fd23df80700 1 Processor -- start 2026-03-09T00:09:47.795 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.795+0000 7fd23df80700 1 -- start start 2026-03-09T00:09:47.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.795+0000 7fd23df80700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd2381020e0 0x7fd23819d2b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:47.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.795+0000 7fd23df80700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd238102a00 0x7fd23819d7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:47.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.795+0000 7fd23df80700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd23819ded0 con 0x7fd2381020e0 2026-03-09T00:09:47.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.795+0000 7fd23df80700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd2381a1c60 con 0x7fd238102a00 2026-03-09T00:09:47.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.795+0000 7fd22ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd238102a00 0x7fd23819d7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:47.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.796+0000 7fd22ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd238102a00 0x7fd23819d7f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:49300/0 (socket says 192.168.123.103:49300) 2026-03-09T00:09:47.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.796+0000 7fd22ffff700 1 -- 192.168.123.103:0/3367765231 learned_addr learned my addr 192.168.123.103:0/3367765231 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:09:47.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.796+0000 7fd22ffff700 1 -- 192.168.123.103:0/3367765231 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd2381020e0 msgr2=0x7fd23819d2b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:09:47.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.796+0000 7fd2377fe700 1 --2- 192.168.123.103:0/3367765231 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd2381020e0 0x7fd23819d2b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:47.796 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.796+0000 7fd22ffff700 1 --2- 192.168.123.103:0/3367765231 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd2381020e0 0x7fd23819d2b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:47.797 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.796+0000 7fd22ffff700 1 -- 192.168.123.103:0/3367765231 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd2280097e0 con 0x7fd238102a00 2026-03-09T00:09:47.797 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.796+0000 7fd2377fe700 1 --2- 192.168.123.103:0/3367765231 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd2381020e0 0x7fd23819d2b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:09:47.797 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.797+0000 7fd22ffff700 1 --2- 192.168.123.103:0/3367765231 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd238102a00 0x7fd23819d7f0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fd2280048c0 tx=0x7fd2280049a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:09:47.804 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.803+0000 7fd2357fa700 1 -- 192.168.123.103:0/3367765231 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd22801d070 con 0x7fd238102a00 2026-03-09T00:09:47.804 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.803+0000 7fd2357fa700 1 -- 192.168.123.103:0/3367765231 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fd22800bc70 con 0x7fd238102a00 2026-03-09T00:09:47.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.805+0000 7fd2357fa700 1 -- 192.168.123.103:0/3367765231 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd22800f780 con 0x7fd238102a00 2026-03-09T00:09:47.806 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.805+0000 7fd23df80700 1 -- 192.168.123.103:0/3367765231 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd2381a1e00 con 0x7fd238102a00 2026-03-09T00:09:47.806 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.806+0000 7fd23df80700 1 -- 192.168.123.103:0/3367765231 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd2381a22c0 con 0x7fd238102a00 2026-03-09T00:09:47.807 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.807+0000 7fd23df80700 1 -- 192.168.123.103:0/3367765231 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd2381065e0 con 0x7fd238102a00 2026-03-09T00:09:47.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.808+0000 7fd2357fa700 1 -- 192.168.123.103:0/3367765231 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd228022470 con 0x7fd238102a00 2026-03-09T00:09:47.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.808+0000 7fd2357fa700 1 --2- 192.168.123.103:0/3367765231 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd2180778c0 0x7fd218079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:47.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.808+0000 7fd2357fa700 1 -- 192.168.123.103:0/3367765231 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(89..89 src has 1..89) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fd22809bb60 con 0x7fd238102a00 2026-03-09T00:09:47.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.809+0000 7fd2377fe700 1 --2- 192.168.123.103:0/3367765231 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd2180778c0 0x7fd218079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:47.810 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.809+0000 7fd2377fe700 1 --2- 192.168.123.103:0/3367765231 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd2180778c0 0x7fd218079d80 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7fd220005fd0 tx=0x7fd220005ee0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:09:47.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.811+0000 7fd2357fa700 1 -- 192.168.123.103:0/3367765231 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd228064410 con 0x7fd238102a00 2026-03-09T00:09:47.939 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.938+0000 7fd23df80700 1 -- 192.168.123.103:0/3367765231 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd23819e680 con 0x7fd2180778c0 2026-03-09T00:09:47.940 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.940+0000 7fd2357fa700 1 -- 192.168.123.103:0/3367765231 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fd23819e680 con 0x7fd2180778c0 2026-03-09T00:09:47.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.943+0000 7fd23df80700 1 -- 192.168.123.103:0/3367765231 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd2180778c0 msgr2=0x7fd218079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:47.943 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.943+0000 7fd23df80700 1 --2- 192.168.123.103:0/3367765231 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd2180778c0 0x7fd218079d80 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7fd220005fd0 tx=0x7fd220005ee0 comp rx=0 tx=0).stop 2026-03-09T00:09:47.944 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.944+0000 7fd23df80700 1 -- 192.168.123.103:0/3367765231 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd238102a00 msgr2=0x7fd23819d7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:47.944 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.944+0000 7fd23df80700 1 --2- 192.168.123.103:0/3367765231 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd238102a00 0x7fd23819d7f0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7fd2280048c0 tx=0x7fd2280049a0 comp rx=0 tx=0).stop 2026-03-09T00:09:47.944 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.944+0000 7fd23df80700 1 -- 192.168.123.103:0/3367765231 shutdown_connections 2026-03-09T00:09:47.944 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.944+0000 7fd23df80700 1 --2- 192.168.123.103:0/3367765231 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd2180778c0 0x7fd218079d80 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:47.944 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.944+0000 7fd23df80700 1 --2- 
192.168.123.103:0/3367765231 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd2381020e0 0x7fd23819d2b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:47.944 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.944+0000 7fd23df80700 1 --2- 192.168.123.103:0/3367765231 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd238102a00 0x7fd23819d7f0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:47.944 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.944+0000 7fd23df80700 1 -- 192.168.123.103:0/3367765231 >> 192.168.123.103:0/3367765231 conn(0x7fd238076e80 msgr2=0x7fd238103720 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:47.945 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.944+0000 7fd23df80700 1 -- 192.168.123.103:0/3367765231 shutdown_connections 2026-03-09T00:09:47.945 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:47.945+0000 7fd23df80700 1 -- 192.168.123.103:0/3367765231 wait complete. 2026-03-09T00:09:47.957 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T00:09:48.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.041+0000 7f84f4a2b700 1 -- 192.168.123.103:0/3876741930 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f0101c50 msgr2=0x7f84f0102030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:48.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.041+0000 7f84f4a2b700 1 --2- 192.168.123.103:0/3876741930 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f0101c50 0x7f84f0102030 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f84d8009b00 tx=0x7f84d8009e10 comp rx=0 tx=0).stop 2026-03-09T00:09:48.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.042+0000 7f84f4a2b700 1 -- 192.168.123.103:0/3876741930 shutdown_connections 2026-03-09T00:09:48.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.042+0000 7f84f4a2b700 1 --2- 192.168.123.103:0/3876741930 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84f00ff6d0 0x7f84f00ffb50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.042+0000 7f84f4a2b700 1 --2- 192.168.123.103:0/3876741930 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f0101c50 0x7f84f0102030 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.042 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.042+0000 7f84f4a2b700 1 -- 192.168.123.103:0/3876741930 >> 192.168.123.103:0/3876741930 conn(0x7f84f0075010 msgr2=0x7f84f0075420 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:48.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.042+0000 7f84f4a2b700 1 -- 192.168.123.103:0/3876741930 shutdown_connections 2026-03-09T00:09:48.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.042+0000 7f84f4a2b700 1 -- 192.168.123.103:0/3876741930 wait complete. 
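
The stderr trace above is one complete `ceph` CLI invocation at `debug ms: 1`: the client races connections to both mons, keeps the first to complete auth, subscribes to the monmap and config, pulls `get_command_descriptions`, dispatches its command, and tears everything down. A minimal sketch of the same round trip through the librados Python binding follows; the conffile path is an assumption for illustration, not taken from this job:

    import json

    import rados  # librados Python binding

    # Assumed default conffile path; the job's client config may differ.
    cluster = rados.Rados(conffile='/etc/ceph/ceph.conf')
    cluster.connect()  # msgr2 banner/hello/auth handshake, as traced above
    try:
        # The same request the CLI sends before dispatching any real command.
        ret, outbuf, outs = cluster.mon_command(
            json.dumps({'prefix': 'get_command_descriptions'}), b'')
        assert ret == 0, outs
    finally:
        cluster.shutdown()  # mark_down / shutdown_connections, as traced above
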
2026-03-09T00:09:48.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.043+0000 7f84f4a2b700 1 Processor -- start 2026-03-09T00:09:48.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.043+0000 7f84f4a2b700 1 -- start start 2026-03-09T00:09:48.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.043+0000 7f84f4a2b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f00ff6d0 0x7f84f0102ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:48.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.043+0000 7f84f4a2b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84f0101c50 0x7f84f0103000 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:48.043 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.043+0000 7f84f4a2b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f84f0106c50 con 0x7f84f00ff6d0 2026-03-09T00:09:48.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.043+0000 7f84edd9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84f0101c50 0x7f84f0103000 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:48.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.043+0000 7f84edd9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84f0101c50 0x7f84f0103000 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:49316/0 (socket says 192.168.123.103:49316) 2026-03-09T00:09:48.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.043+0000 7f84edd9b700 1 -- 192.168.123.103:0/2232958593 learned_addr learned my addr 192.168.123.103:0/2232958593 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:09:48.045 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.044+0000 7f84f4a2b700 1 -- 192.168.123.103:0/2232958593 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f84f0103540 con 0x7f84f0101c50 2026-03-09T00:09:48.045 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.044+0000 7f84ee59c700 1 --2- 192.168.123.103:0/2232958593 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f00ff6d0 0x7f84f0102ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:48.045 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.045+0000 7f84ee59c700 1 -- 192.168.123.103:0/2232958593 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84f0101c50 msgr2=0x7f84f0103000 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:48.045 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.045+0000 7f84ee59c700 1 --2- 192.168.123.103:0/2232958593 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84f0101c50 0x7f84f0103000 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.046 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.045+0000 7f84ee59c700 1 -- 192.168.123.103:0/2232958593 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f84d80097e0 con 0x7f84f00ff6d0 2026-03-09T00:09:48.046 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.045+0000 7f84ee59c700 1 --2- 192.168.123.103:0/2232958593 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f00ff6d0 0x7f84f0102ac0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f84d8009b00 tx=0x7f84d8004970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:09:48.046 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.045+0000 7f84e77fe700 1 -- 192.168.123.103:0/2232958593 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f84d801d070 con 0x7f84f00ff6d0 2026-03-09T00:09:48.046 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.045+0000 7f84e77fe700 1 -- 192.168.123.103:0/2232958593 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f84d800bc50 con 0x7f84f00ff6d0 2026-03-09T00:09:48.046 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.045+0000 7f84e77fe700 1 -- 192.168.123.103:0/2232958593 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f84d800f670 con 0x7f84f00ff6d0 2026-03-09T00:09:48.046 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.046+0000 7f84edd9b700 1 --2- 192.168.123.103:0/2232958593 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84f0101c50 0x7f84f0103000 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:09:48.046 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.046+0000 7f84f4a2b700 1 -- 192.168.123.103:0/2232958593 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f84f01037c0 con 0x7f84f00ff6d0 2026-03-09T00:09:48.046 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.046+0000 7f84f4a2b700 1 -- 192.168.123.103:0/2232958593 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f84f01a75b0 con 0x7f84f00ff6d0 2026-03-09T00:09:48.047 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.047+0000 7f84e77fe700 1 -- 192.168.123.103:0/2232958593 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f84d800f7d0 con 0x7f84f00ff6d0 2026-03-09T00:09:48.048 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.047+0000 7f84f4a2b700 1 -- 192.168.123.103:0/2232958593 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f84f004ea90 con 0x7f84f00ff6d0 2026-03-09T00:09:48.051 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.049+0000 7f84e77fe700 1 --2- 192.168.123.103:0/2232958593 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f84dc0778c0 0x7f84dc079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:48.051 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.049+0000 7f84e77fe700 1 -- 192.168.123.103:0/2232958593 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(89..89 src has 1..89) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f84d809af40 con 0x7f84f00ff6d0 2026-03-09T00:09:48.051 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.051+0000 7f84edd9b700 1 --2- 192.168.123.103:0/2232958593 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f84dc0778c0 0x7f84dc079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:48.051 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.051+0000 7f84e77fe700 1 -- 192.168.123.103:0/2232958593 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f84d80649f0 con 0x7f84f00ff6d0 2026-03-09T00:09:48.051 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.051+0000 7f84edd9b700 1 --2- 192.168.123.103:0/2232958593 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f84dc0778c0 0x7f84dc079d80 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f84f01042b0 tx=0x7f84e0008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:09:48.197 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.196+0000 7f84f4a2b700 1 -- 192.168.123.103:0/2232958593 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f84f00fdc50 con 0x7f84dc0778c0 2026-03-09T00:09:48.198 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.198+0000 7f84e77fe700 1 -- 192.168.123.103:0/2232958593 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f84f00fdc50 con 0x7f84dc0778c0 2026-03-09T00:09:48.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.201+0000 7f84f4a2b700 1 -- 192.168.123.103:0/2232958593 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f84dc0778c0 msgr2=0x7f84dc079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:48.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.201+0000 7f84f4a2b700 1 --2- 192.168.123.103:0/2232958593 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f84dc0778c0 0x7f84dc079d80 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f84f01042b0 tx=0x7f84e0008040 comp rx=0 tx=0).stop 2026-03-09T00:09:48.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.202+0000 7f84f4a2b700 1 -- 192.168.123.103:0/2232958593 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f00ff6d0 msgr2=0x7f84f0102ac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:48.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.202+0000 7f84f4a2b700 1 --2- 192.168.123.103:0/2232958593 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f00ff6d0 0x7f84f0102ac0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f84d8009b00 tx=0x7f84d8004970 comp rx=0 tx=0).stop 2026-03-09T00:09:48.202 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.202+0000 7f84f4a2b700 1 -- 192.168.123.103:0/2232958593 shutdown_connections 2026-03-09T00:09:48.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.202+0000 7f84f4a2b700 1 --2- 192.168.123.103:0/2232958593 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f84dc0778c0 0x7f84dc079d80 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.202+0000 7f84f4a2b700 1 --2- 
192.168.123.103:0/2232958593 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f84f00ff6d0 0x7f84f0102ac0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.202+0000 7f84f4a2b700 1 --2- 192.168.123.103:0/2232958593 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f84f0101c50 0x7f84f0103000 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.203+0000 7f84f4a2b700 1 -- 192.168.123.103:0/2232958593 >> 192.168.123.103:0/2232958593 conn(0x7f84f0075010 msgr2=0x7f84f00fd540 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:48.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.203+0000 7f84f4a2b700 1 -- 192.168.123.103:0/2232958593 shutdown_connections 2026-03-09T00:09:48.203 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.203+0000 7f84f4a2b700 1 -- 192.168.123.103:0/2232958593 wait complete. 2026-03-09T00:09:48.282 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.281+0000 7fbf56916700 1 -- 192.168.123.103:0/3152845469 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf500684d0 msgr2=0x7fbf500688b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:48.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.281+0000 7fbf56916700 1 --2- 192.168.123.103:0/3152845469 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf500684d0 0x7fbf500688b0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fbf38009b30 tx=0x7fbf38009e40 comp rx=0 tx=0).stop 2026-03-09T00:09:48.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.282+0000 7fbf56916700 1 -- 192.168.123.103:0/3152845469 shutdown_connections 2026-03-09T00:09:48.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.282+0000 7fbf56916700 1 --2- 192.168.123.103:0/3152845469 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf50068df0 0x7fbf5010d5b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.282+0000 7fbf56916700 1 --2- 192.168.123.103:0/3152845469 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf500684d0 0x7fbf500688b0 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.282+0000 7fbf56916700 1 -- 192.168.123.103:0/3152845469 >> 192.168.123.103:0/3152845469 conn(0x7fbf50075960 msgr2=0x7fbf50075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:48.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.282+0000 7fbf56916700 1 -- 192.168.123.103:0/3152845469 shutdown_connections 2026-03-09T00:09:48.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.282+0000 7fbf56916700 1 -- 192.168.123.103:0/3152845469 wait complete. 
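
Note that `orch upgrade status` in the traffic above is dispatched as a mgr_command, not a mon_command: the client learns the active mgr (mgr.34104) from the mgrmap it just received, opens a second connection to it on port 6800, and sends the command there with target ["mon-mgr", ""]. A sketch of the equivalent librados call, again assuming a default conffile:

    import json

    import rados

    cluster = rados.Rados(conffile='/etc/ceph/ceph.conf')  # assumed path
    cluster.connect()
    try:
        # Routed to the active mgr, mirroring mgr_command(tid 0: ...) above.
        ret, outbuf, outs = cluster.mgr_command(
            json.dumps({'prefix': 'orch upgrade status', 'format': 'json'}), b'')
        print(outbuf.decode() or outs)
    finally:
        cluster.shutdown()
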
2026-03-09T00:09:48.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.283+0000 7fbf56916700 1 Processor -- start 2026-03-09T00:09:48.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.283+0000 7fbf56916700 1 -- start start 2026-03-09T00:09:48.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.283+0000 7fbf56916700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf500684d0 0x7fbf50198e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:48.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.283+0000 7fbf56916700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf50068df0 0x7fbf501993b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:48.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.283+0000 7fbf56916700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf50199a90 con 0x7fbf500684d0 2026-03-09T00:09:48.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.283+0000 7fbf56916700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf5019d820 con 0x7fbf50068df0 2026-03-09T00:09:48.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.284+0000 7fbf4ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf500684d0 0x7fbf50198e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:48.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.284+0000 7fbf4ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf500684d0 0x7fbf50198e70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36226/0 (socket says 192.168.123.103:36226) 2026-03-09T00:09:48.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.284+0000 7fbf4ffff700 1 -- 192.168.123.103:0/741333883 learned_addr learned my addr 192.168.123.103:0/741333883 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:09:48.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.284+0000 7fbf4f7fe700 1 --2- 192.168.123.103:0/741333883 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf50068df0 0x7fbf501993b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:48.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.284+0000 7fbf4ffff700 1 -- 192.168.123.103:0/741333883 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf50068df0 msgr2=0x7fbf501993b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:48.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.284+0000 7fbf4ffff700 1 --2- 192.168.123.103:0/741333883 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf50068df0 0x7fbf501993b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.284+0000 7fbf4ffff700 1 -- 192.168.123.103:0/741333883 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbf380097e0 con 
0x7fbf500684d0 2026-03-09T00:09:48.285 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.285+0000 7fbf4ffff700 1 --2- 192.168.123.103:0/741333883 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf500684d0 0x7fbf50198e70 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fbf38004c30 tx=0x7fbf38004c60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:09:48.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.285+0000 7fbf4d7fa700 1 -- 192.168.123.103:0/741333883 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf3801d070 con 0x7fbf500684d0 2026-03-09T00:09:48.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.285+0000 7fbf4d7fa700 1 -- 192.168.123.103:0/741333883 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fbf3800bcd0 con 0x7fbf500684d0 2026-03-09T00:09:48.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.285+0000 7fbf4d7fa700 1 -- 192.168.123.103:0/741333883 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf380216d0 con 0x7fbf500684d0 2026-03-09T00:09:48.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.285+0000 7fbf56916700 1 -- 192.168.123.103:0/741333883 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbf5019daa0 con 0x7fbf500684d0 2026-03-09T00:09:48.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.285+0000 7fbf56916700 1 -- 192.168.123.103:0/741333883 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbf5019dff0 con 0x7fbf500684d0 2026-03-09T00:09:48.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.287+0000 7fbf4d7fa700 1 -- 192.168.123.103:0/741333883 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fbf3802b430 con 0x7fbf500684d0 2026-03-09T00:09:48.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.288+0000 7fbf4d7fa700 1 --2- 192.168.123.103:0/741333883 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fbf3c0778c0 0x7fbf3c079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:48.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.288+0000 7fbf4d7fa700 1 -- 192.168.123.103:0/741333883 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(89..89 src has 1..89) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fbf3809beb0 con 0x7fbf500684d0 2026-03-09T00:09:48.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.289+0000 7fbf4f7fe700 1 --2- 192.168.123.103:0/741333883 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fbf3c0778c0 0x7fbf3c079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:48.289 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.289+0000 7fbf56916700 1 -- 192.168.123.103:0/741333883 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbf5010ad20 con 0x7fbf500684d0 2026-03-09T00:09:48.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.292+0000 7fbf4f7fe700 1 --2- 192.168.123.103:0/741333883 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] 
conn(0x7fbf3c0778c0 0x7fbf3c079d80 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fbf5019a490 tx=0x7fbf40009380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:09:48.293 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.293+0000 7fbf4d7fa700 1 -- 192.168.123.103:0/741333883 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbf380646e0 con 0x7fbf500684d0
2026-03-09T00:09:48.418 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.417+0000 7fbf56916700 1 -- 192.168.123.103:0/741333883 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fbf5019a1d0 con 0x7fbf3c0778c0
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.422+0000 7fbf4d7fa700 1 -- 192.168.123.103:0/741333883 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7fbf5019a1d0 con 0x7fbf3c0778c0
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (5m) 12s ago 10m 24.4M - 0.25.0 c8568f914cd2 6bc39b415ac6
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (10m) 12s ago 10m 9499k - 18.2.1 5be31c24972a b93f8a220f71
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (9m) 3m ago 9m 8656k - 18.2.1 5be31c24972a d06aea65065e
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (3m) 12s ago 10m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e a76700a1f6bf
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (3m) 3m ago 9m 7860k - 19.2.3-678-ge911bdeb 654f31e6858e ab35f6a843c1
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (4m) 12s ago 9m 85.2M - 10.4.0 c8b91775d855 00a3394cdec9
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (8m) 12s ago 8m 19.5M - 18.2.1 5be31c24972a 404501ca3f76
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (8m) 12s ago 8m 188M - 18.2.1 5be31c24972a b71cb8823eff
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (8m) 3m ago 8m 19.7M - 18.2.1 5be31c24972a 868f24dd3b07
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 running (8m) 3m ago 8m 15.9M - 18.2.1 5be31c24972a 84dbd6c37a69
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:8443,9283,8765 running (5m) 12s ago 10m 625M - 19.2.3-678-ge911bdeb 654f31e6858e 5c5f89207f88
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (5m) 3m ago 9m 489M - 19.2.3-678-ge911bdeb 654f31e6858e e3d70135f3ac
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (4m) 12s ago 10m 61.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cafe87ec117d
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (3m) 3m ago 9m 48.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 33df752aa193
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (5m) 12s ago 10m 9643k - 1.7.0 72c9c2088986 0cdd6e671b4f
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (5m) 3m ago 9m 9420k - 1.7.0 72c9c2088986 848c5c72973d
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (3m) 12s ago 9m 178M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7112eceae9ce
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (37s) 12s ago 9m 142M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e70d2f37c6d1
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (14s) 12s ago 8m 12.3M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e7841e7307ae
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (8m) 3m ago 8m 499M 4096M 18.2.1 5be31c24972a 1ece32056ab6
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (8m) 3m ago 8m 475M 4096M 18.2.1 5be31c24972a ee6260a1124c
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (8m) 3m ago 8m 398M 4096M 18.2.1 5be31c24972a f51e8cd94301
2026-03-09T00:09:48.423 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (5m) 12s ago 9m 58.0M - 2.51.0 1d3b7f56885b 16d6071e49fb
2026-03-09T00:09:48.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.425+0000 7fbf56916700 1 -- 192.168.123.103:0/741333883 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fbf3c0778c0 msgr2=0x7fbf3c079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:48.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.425+0000 7fbf56916700 1 --2- 192.168.123.103:0/741333883 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fbf3c0778c0 0x7fbf3c079d80 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fbf5019a490 tx=0x7fbf40009380 comp rx=0 tx=0).stop
2026-03-09T00:09:48.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.425+0000 7fbf56916700 1 -- 192.168.123.103:0/741333883 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf500684d0 msgr2=0x7fbf50198e70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:09:48.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.425+0000 7fbf56916700 1 --2- 192.168.123.103:0/741333883 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf500684d0 0x7fbf50198e70 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fbf38004c30 tx=0x7fbf38004c60 comp rx=0 tx=0).stop
2026-03-09T00:09:48.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.426+0000 7fbf56916700 1 -- 192.168.123.103:0/741333883 shutdown_connections
2026-03-09T00:09:48.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.426+0000 7fbf56916700 1 --2- 192.168.123.103:0/741333883 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fbf3c0778c0 0x7fbf3c079d80 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:48.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.426+0000 7fbf56916700 1 --2- 192.168.123.103:0/741333883 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbf500684d0 0x7fbf50198e70 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:09:48.426 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.426+0000 7fbf56916700 1 --2- 192.168.123.103:0/741333883 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbf50068df0 0x7fbf501993b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.426+0000 7fbf56916700 1 -- 192.168.123.103:0/741333883 >> 192.168.123.103:0/741333883 conn(0x7fbf50075960 msgr2=0x7fbf500fe950 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:48.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.426+0000 7fbf56916700 1 -- 192.168.123.103:0/741333883 shutdown_connections 2026-03-09T00:09:48.427 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.426+0000 7fbf56916700 1 -- 192.168.123.103:0/741333883 wait complete. 2026-03-09T00:09:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.500+0000 7fa17a4d3700 1 -- 192.168.123.103:0/979747233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa174068df0 msgr2=0x7fa17410d5b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.500+0000 7fa17a4d3700 1 --2- 192.168.123.103:0/979747233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa174068df0 0x7fa17410d5b0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fa164009b30 tx=0x7fa164009e40 comp rx=0 tx=0).stop 2026-03-09T00:09:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.500+0000 7fa17a4d3700 1 -- 192.168.123.103:0/979747233 shutdown_connections 2026-03-09T00:09:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.500+0000 7fa17a4d3700 1 --2- 192.168.123.103:0/979747233 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa174068df0 0x7fa17410d5b0 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.500+0000 7fa17a4d3700 1 --2- 192.168.123.103:0/979747233 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa1740684d0 0x7fa1740688b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.500+0000 7fa17a4d3700 1 -- 192.168.123.103:0/979747233 >> 192.168.123.103:0/979747233 conn(0x7fa174075960 msgr2=0x7fa174075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:48.501 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.500+0000 7fa17a4d3700 1 -- 192.168.123.103:0/979747233 shutdown_connections 2026-03-09T00:09:48.501 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.500+0000 7fa17a4d3700 1 -- 192.168.123.103:0/979747233 wait complete. 
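
The `orch ps` table above captures the staggered state mid-upgrade: mon, mgr, crash, and osd.0-2 already report 19.2.3-678-ge911bdeb, while the MDS daemons and osd.3-5 on vm06 are still on 18.2.1. A sketch of the same check in machine-readable form; the JSON field names (`daemon_name`, `version`) follow recent cephadm output and should be treated as an assumption:

    import json
    import subprocess

    # `ceph orch ps --format json` returns one object per daemon.
    daemons = json.loads(subprocess.check_output(
        ['ceph', 'orch', 'ps', '--format', 'json']))
    stale = [d['daemon_name'] for d in daemons
             if not d.get('version', '').startswith('19.2.3')]
    print(f'{len(stale)} daemon(s) still on the old version:', sorted(stale))
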
2026-03-09T00:09:48.501 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.501+0000 7fa17a4d3700 1 Processor -- start 2026-03-09T00:09:48.501 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.501+0000 7fa17a4d3700 1 -- start start 2026-03-09T00:09:48.501 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.501+0000 7fa17a4d3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1740684d0 0x7fa174198e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:48.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.501+0000 7fa17a4d3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa174068df0 0x7fa1741993b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:48.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.501+0000 7fa17a4d3700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa174199a90 con 0x7fa1740684d0 2026-03-09T00:09:48.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.501+0000 7fa17a4d3700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa17419d820 con 0x7fa174068df0 2026-03-09T00:09:48.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.502+0000 7fa1737fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa174068df0 0x7fa1741993b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:48.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.502+0000 7fa1737fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa174068df0 0x7fa1741993b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:49354/0 (socket says 192.168.123.103:49354) 2026-03-09T00:09:48.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.502+0000 7fa1737fe700 1 -- 192.168.123.103:0/1943584304 learned_addr learned my addr 192.168.123.103:0/1943584304 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:09:48.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.502+0000 7fa173fff700 1 --2- 192.168.123.103:0/1943584304 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1740684d0 0x7fa174198e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:48.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.502+0000 7fa1737fe700 1 -- 192.168.123.103:0/1943584304 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1740684d0 msgr2=0x7fa174198e70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:48.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.502+0000 7fa1737fe700 1 --2- 192.168.123.103:0/1943584304 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1740684d0 0x7fa174198e70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.502+0000 7fa1737fe700 1 -- 192.168.123.103:0/1943584304 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fa1640097e0 con 0x7fa174068df0 2026-03-09T00:09:48.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.502+0000 7fa173fff700 1 --2- 192.168.123.103:0/1943584304 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1740684d0 0x7fa174198e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T00:09:48.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.502+0000 7fa1737fe700 1 --2- 192.168.123.103:0/1943584304 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa174068df0 0x7fa1741993b0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fa164004a30 tx=0x7fa164004b10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:09:48.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.503+0000 7fa1717fa700 1 -- 192.168.123.103:0/1943584304 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa16401d070 con 0x7fa174068df0 2026-03-09T00:09:48.503 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.503+0000 7fa17a4d3700 1 -- 192.168.123.103:0/1943584304 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa17419db00 con 0x7fa174068df0 2026-03-09T00:09:48.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.503+0000 7fa17a4d3700 1 -- 192.168.123.103:0/1943584304 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa17419e050 con 0x7fa174068df0 2026-03-09T00:09:48.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.503+0000 7fa1717fa700 1 -- 192.168.123.103:0/1943584304 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa16400bcd0 con 0x7fa174068df0 2026-03-09T00:09:48.504 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.503+0000 7fa1717fa700 1 -- 192.168.123.103:0/1943584304 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa16400f810 con 0x7fa174068df0 2026-03-09T00:09:48.505 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.504+0000 7fa1717fa700 1 -- 192.168.123.103:0/1943584304 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa16400f970 con 0x7fa174068df0 2026-03-09T00:09:48.505 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.505+0000 7fa17a4d3700 1 -- 192.168.123.103:0/1943584304 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa17410ad20 con 0x7fa174068df0 2026-03-09T00:09:48.505 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.505+0000 7fa1717fa700 1 --2- 192.168.123.103:0/1943584304 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa160077870 0x7fa160079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:48.506 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.505+0000 7fa1717fa700 1 -- 192.168.123.103:0/1943584304 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(89..89 src has 1..89) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fa16409b080 con 0x7fa174068df0 2026-03-09T00:09:48.506 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.505+0000 7fa173fff700 1 --2- 192.168.123.103:0/1943584304 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa160077870 
0x7fa160079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:09:48.506 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.506+0000 7fa173fff700 1 --2- 192.168.123.103:0/1943584304 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa160077870 0x7fa160079d30 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fa15c009a20 tx=0x7fa15c008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:09:48.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.508+0000 7fa1717fa700 1 -- 192.168.123.103:0/1943584304 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa164063960 con 0x7fa174068df0
2026-03-09T00:09:48.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.674+0000 7fa17a4d3700 1 -- 192.168.123.103:0/1943584304 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fa17404ea90 con 0x7fa174068df0
2026-03-09T00:09:48.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.675+0000 7fa1717fa700 1 -- 192.168.123.103:0/1943584304 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7fa164005c00 con 0x7fa174068df0
2026-03-09T00:09:48.677 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:09:48.677 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T00:09:48.677 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:09:48.677 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:09:48.677 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T00:09:48.677 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:09:48.677 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:09:48.677 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T00:09:48.677 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 3,
2026-03-09T00:09:48.677 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 3
2026-03-09T00:09:48.677 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:09:48.677 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T00:09:48.677 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-09T00:09:48.678 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:09:48.678 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T00:09:48.678 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 7,
2026-03-09T00:09:48.678 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 7
2026-03-09T00:09:48.678 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-09T00:09:48.678 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:09:48.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.680+0000
7fa17a4d3700 1 -- 192.168.123.103:0/1943584304 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa160077870 msgr2=0x7fa160079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:48.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.680+0000 7fa17a4d3700 1 --2- 192.168.123.103:0/1943584304 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa160077870 0x7fa160079d30 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fa15c009a20 tx=0x7fa15c008040 comp rx=0 tx=0).stop 2026-03-09T00:09:48.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.680+0000 7fa17a4d3700 1 -- 192.168.123.103:0/1943584304 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa174068df0 msgr2=0x7fa1741993b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:48.681 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.680+0000 7fa17a4d3700 1 --2- 192.168.123.103:0/1943584304 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa174068df0 0x7fa1741993b0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fa164004a30 tx=0x7fa164004b10 comp rx=0 tx=0).stop 2026-03-09T00:09:48.681 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.680+0000 7fa17a4d3700 1 -- 192.168.123.103:0/1943584304 shutdown_connections 2026-03-09T00:09:48.681 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.680+0000 7fa17a4d3700 1 --2- 192.168.123.103:0/1943584304 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa160077870 0x7fa160079d30 secure :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fa15c009a20 tx=0x7fa15c008040 comp rx=0 tx=0).stop 2026-03-09T00:09:48.681 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.680+0000 7fa17a4d3700 1 --2- 192.168.123.103:0/1943584304 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa1740684d0 0x7fa174198e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.681 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.680+0000 7fa17a4d3700 1 --2- 192.168.123.103:0/1943584304 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa174068df0 0x7fa1741993b0 secure :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fa164004a30 tx=0x7fa164004b10 comp rx=0 tx=0).stop 2026-03-09T00:09:48.681 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.680+0000 7fa17a4d3700 1 -- 192.168.123.103:0/1943584304 >> 192.168.123.103:0/1943584304 conn(0x7fa174075960 msgr2=0x7fa1740fe950 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:48.681 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.681+0000 7fa17a4d3700 1 -- 192.168.123.103:0/1943584304 shutdown_connections 2026-03-09T00:09:48.681 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.681+0000 7fa17a4d3700 1 -- 192.168.123.103:0/1943584304 wait complete. 
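
The `ceph versions` JSON above is the usual convergence check for a staggered upgrade: the run is finished only when the `overall` map collapses to a single version entry, and here it is still split 7 reef / 7 squid. A minimal sketch of that check (note `ceph versions` already emits JSON, no format flag needed):

    import json
    import subprocess

    overall = json.loads(subprocess.check_output(['ceph', 'versions']))['overall']
    if len(overall) == 1:
        print('converged on', next(iter(overall)))
    else:
        print('still mixed:', overall)
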
2026-03-09T00:09:48.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.764+0000 7f4a1e2ff700 1 -- 192.168.123.103:0/3378917217 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a18101a80 msgr2=0x7f4a18105ad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:48.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.764+0000 7f4a1e2ff700 1 --2- 192.168.123.103:0/3378917217 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a18101a80 0x7f4a18105ad0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f4a08009b00 tx=0x7f4a08009e10 comp rx=0 tx=0).stop 2026-03-09T00:09:48.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.764+0000 7f4a1e2ff700 1 -- 192.168.123.103:0/3378917217 shutdown_connections 2026-03-09T00:09:48.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.764+0000 7f4a1e2ff700 1 --2- 192.168.123.103:0/3378917217 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a18101a80 0x7f4a18105ad0 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.764+0000 7f4a1e2ff700 1 --2- 192.168.123.103:0/3378917217 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a181010d0 0x7f4a181014b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.764+0000 7f4a1e2ff700 1 -- 192.168.123.103:0/3378917217 >> 192.168.123.103:0/3378917217 conn(0x7f4a180fc920 msgr2=0x7f4a180fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:48.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.765+0000 7f4a1e2ff700 1 -- 192.168.123.103:0/3378917217 shutdown_connections 2026-03-09T00:09:48.765 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.765+0000 7f4a1e2ff700 1 -- 192.168.123.103:0/3378917217 wait complete. 
2026-03-09T00:09:48.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.766+0000 7f4a1e2ff700 1 Processor -- start 2026-03-09T00:09:48.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.766+0000 7f4a1e2ff700 1 -- start start 2026-03-09T00:09:48.766 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.766+0000 7f4a1e2ff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a181010d0 0x7f4a1819c9e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:48.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.766+0000 7f4a17fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a181010d0 0x7f4a1819c9e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:48.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.766+0000 7f4a17fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a181010d0 0x7f4a1819c9e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36266/0 (socket says 192.168.123.103:36266) 2026-03-09T00:09:48.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.766+0000 7f4a1e2ff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a18101a80 0x7f4a1819cf20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:48.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.767+0000 7f4a17fff700 1 -- 192.168.123.103:0/4162042539 learned_addr learned my addr 192.168.123.103:0/4162042539 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:09:48.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.767+0000 7f4a177fe700 1 --2- 192.168.123.103:0/4162042539 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a18101a80 0x7f4a1819cf20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:48.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.767+0000 7f4a1e2ff700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4a1819d5b0 con 0x7f4a181010d0 2026-03-09T00:09:48.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.767+0000 7f4a1e2ff700 1 -- 192.168.123.103:0/4162042539 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4a18196a60 con 0x7f4a18101a80 2026-03-09T00:09:48.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.767+0000 7f4a17fff700 1 -- 192.168.123.103:0/4162042539 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a18101a80 msgr2=0x7f4a1819cf20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:48.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.767+0000 7f4a17fff700 1 --2- 192.168.123.103:0/4162042539 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a18101a80 0x7f4a1819cf20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.767+0000 7f4a17fff700 1 -- 192.168.123.103:0/4162042539 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4a080097e0 con 0x7f4a181010d0 2026-03-09T00:09:48.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.767+0000 7f4a17fff700 1 --2- 192.168.123.103:0/4162042539 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a181010d0 0x7f4a1819c9e0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f4a0000b700 tx=0x7f4a0000ba10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:09:48.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.768+0000 7f4a157fa700 1 -- 192.168.123.103:0/4162042539 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4a00011840 con 0x7f4a181010d0 2026-03-09T00:09:48.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.768+0000 7f4a157fa700 1 -- 192.168.123.103:0/4162042539 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f4a00011e80 con 0x7f4a181010d0 2026-03-09T00:09:48.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.768+0000 7f4a1e2ff700 1 -- 192.168.123.103:0/4162042539 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4a18196d40 con 0x7f4a181010d0 2026-03-09T00:09:48.769 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.768+0000 7f4a1e2ff700 1 -- 192.168.123.103:0/4162042539 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4a18197290 con 0x7f4a181010d0 2026-03-09T00:09:48.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.769+0000 7f4a157fa700 1 -- 192.168.123.103:0/4162042539 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4a00011b50 con 0x7f4a181010d0 2026-03-09T00:09:48.770 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.770+0000 7f4a157fa700 1 -- 192.168.123.103:0/4162042539 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4a0001a400 con 0x7f4a181010d0 2026-03-09T00:09:48.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.770+0000 7f4a157fa700 1 --2- 192.168.123.103:0/4162042539 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4a040778c0 0x7f4a04079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:48.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.771+0000 7f4a157fa700 1 -- 192.168.123.103:0/4162042539 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(89..89 src has 1..89) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f4a00099730 con 0x7f4a181010d0 2026-03-09T00:09:48.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.771+0000 7f4a177fe700 1 --2- 192.168.123.103:0/4162042539 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4a040778c0 0x7f4a04079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:48.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.771+0000 7f4a1e2ff700 1 -- 192.168.123.103:0/4162042539 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4a18196ed0 con 0x7f4a181010d0 2026-03-09T00:09:48.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.771+0000 7f4a177fe700 1 --2- 192.168.123.103:0/4162042539 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4a040778c0 0x7f4a04079d80 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f4a180fd0a0 tx=0x7f4a0800b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:09:48.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.774+0000 7f4a157fa700 1 -- 192.168.123.103:0/4162042539 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4a18196ed0 con 0x7f4a181010d0
2026-03-09T00:09:48.922 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.922+0000 7f4a1e2ff700 1 -- 192.168.123.103:0/4162042539 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f4a181094d0 con 0x7f4a181010d0
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.923+0000 7f4a157fa700 1 -- 192.168.123.103:0/4162042539 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1919 (secure 0 0 0) 0x7f4a00015070 con 0x7f4a181010d0
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:e11
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1)
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:epoch 11
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T00:01:42.952984+0000
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T00:01:51.424075+0000
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:root 0
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {}
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0
2026-03-09T00:09:48.924 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 39
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:in 0
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14480}
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:failed
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:damaged
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:stopped
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3]
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members:
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.sejksk{0:14480} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons:
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.ralade{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.ixduim{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:09:48.925 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.vlrwtl{-1:24297} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6824/1090058295,v1:192.168.123.106:6825/1090058295] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:09:48.925 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:48 vm03.local ceph-mon[129670]: pgmap v170: 65 pgs: 65 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail
2026-03-09T00:09:48.925 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:48 vm03.local ceph-mon[129670]: from='client.44237 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:09:48.925 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:48 vm03.local ceph-mon[129670]: from='client.?
192.168.123.103:0/1943584304' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:09:48.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.929+0000 7f4a1e2ff700 1 -- 192.168.123.103:0/4162042539 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4a040778c0 msgr2=0x7f4a04079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:48.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.929+0000 7f4a1e2ff700 1 --2- 192.168.123.103:0/4162042539 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4a040778c0 0x7f4a04079d80 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f4a180fd0a0 tx=0x7f4a0800b540 comp rx=0 tx=0).stop 2026-03-09T00:09:48.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.929+0000 7f4a1e2ff700 1 -- 192.168.123.103:0/4162042539 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a181010d0 msgr2=0x7f4a1819c9e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:48.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.929+0000 7f4a1e2ff700 1 --2- 192.168.123.103:0/4162042539 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a181010d0 0x7f4a1819c9e0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f4a0000b700 tx=0x7f4a0000ba10 comp rx=0 tx=0).stop 2026-03-09T00:09:48.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.930+0000 7f4a1e2ff700 1 -- 192.168.123.103:0/4162042539 shutdown_connections 2026-03-09T00:09:48.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.930+0000 7f4a1e2ff700 1 --2- 192.168.123.103:0/4162042539 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4a040778c0 0x7f4a04079d80 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.930+0000 7f4a1e2ff700 1 --2- 192.168.123.103:0/4162042539 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4a181010d0 0x7f4a1819c9e0 secure :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f4a0000b700 tx=0x7f4a0000ba10 comp rx=0 tx=0).stop 2026-03-09T00:09:48.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.930+0000 7f4a1e2ff700 1 --2- 192.168.123.103:0/4162042539 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4a18101a80 0x7f4a1819cf20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:48.930 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.930+0000 7f4a1e2ff700 1 -- 192.168.123.103:0/4162042539 >> 192.168.123.103:0/4162042539 conn(0x7f4a180fc920 msgr2=0x7f4a180fdee0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:48.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.930+0000 7f4a1e2ff700 1 -- 192.168.123.103:0/4162042539 shutdown_connections 2026-03-09T00:09:48.931 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:48.931+0000 7f4a1e2ff700 1 -- 192.168.123.103:0/4162042539 wait complete. 
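[editor's note] The fs dump above can be checked non-interactively instead of eyeballing the pretty-printed map. A minimal sketch, assuming the JSON form of the dump mirrors the fields printed above (filesystems, mdsmap.max_mds, mdsmap.inline_data, standbys):

    # confirm a single active rank, inline data on, and three standbys
    ceph --format=json fs dump | jq -e '.filesystems[0].mdsmap.max_mds == 1'
    ceph --format=json fs dump | jq -e '.filesystems[0].mdsmap.inline_data'
    ceph --format=json fs dump | jq -e '.standbys | length == 3'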
2026-03-09T00:09:48.932 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 11 2026-03-09T00:09:49.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.017+0000 7fe0abe36700 1 -- 192.168.123.103:0/2539973102 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0a4068df0 msgr2=0x7fe0a410d480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:49.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.017+0000 7fe0abe36700 1 --2- 192.168.123.103:0/2539973102 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0a4068df0 0x7fe0a410d480 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7fe0a0009b50 tx=0x7fe0a0009e60 comp rx=0 tx=0).stop 2026-03-09T00:09:49.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.017+0000 7fe0abe36700 1 -- 192.168.123.103:0/2539973102 shutdown_connections 2026-03-09T00:09:49.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.017+0000 7fe0abe36700 1 --2- 192.168.123.103:0/2539973102 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0a4068df0 0x7fe0a410d480 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:49.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.017+0000 7fe0abe36700 1 --2- 192.168.123.103:0/2539973102 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0a40684d0 0x7fe0a40688b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:49.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.017+0000 7fe0abe36700 1 -- 192.168.123.103:0/2539973102 >> 192.168.123.103:0/2539973102 conn(0x7fe0a4076950 msgr2=0x7fe0a4076d60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:49.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.017+0000 7fe0abe36700 1 -- 192.168.123.103:0/2539973102 shutdown_connections 2026-03-09T00:09:49.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.018+0000 7fe0abe36700 1 -- 192.168.123.103:0/2539973102 wait complete. 
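[editor's note] Each one-shot ceph CLI call in this run builds a short-lived RADOS client, which is why the same connect / mon_subscribe / get_command_descriptions / mark_down / "wait complete" sequence repeats around every command. A sketch of reproducing the trace by hand, assuming the debug override is accepted on the command line as a client config option:

    # debug_ms=1 on the client side emits the '-->' / '<==' messenger lines
    # seen above; stderr carries the trace, stdout the command output
    ceph --debug-ms 1 fs dump 2>/tmp/msgr.trace
    grep -c mark_down /tmp/msgr.trace    # connections torn down at exit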
2026-03-09T00:09:49.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.018+0000 7fe0abe36700 1 Processor -- start 2026-03-09T00:09:49.018 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.018+0000 7fe0abe36700 1 -- start start 2026-03-09T00:09:49.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.018+0000 7fe0abe36700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0a40684d0 0x7fe0a4198d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:49.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.018+0000 7fe0abe36700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0a41992c0 0x7fe0a419d730 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:49.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.018+0000 7fe0abe36700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe0a41998e0 con 0x7fe0a40684d0 2026-03-09T00:09:49.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.018+0000 7fe0abe36700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe0a4199a50 con 0x7fe0a41992c0 2026-03-09T00:09:49.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.019+0000 7fe0a9bd2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0a40684d0 0x7fe0a4198d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:49.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.019+0000 7fe0a9bd2700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0a40684d0 0x7fe0a4198d80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:36282/0 (socket says 192.168.123.103:36282) 2026-03-09T00:09:49.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.019+0000 7fe0a9bd2700 1 -- 192.168.123.103:0/1652162598 learned_addr learned my addr 192.168.123.103:0/1652162598 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:09:49.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.019+0000 7fe0a93d1700 1 --2- 192.168.123.103:0/1652162598 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0a41992c0 0x7fe0a419d730 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:49.019 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.019+0000 7fe0a9bd2700 1 -- 192.168.123.103:0/1652162598 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0a41992c0 msgr2=0x7fe0a419d730 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:49.020 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.019+0000 7fe0a9bd2700 1 --2- 192.168.123.103:0/1652162598 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0a41992c0 0x7fe0a419d730 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:49.020 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.019+0000 7fe0a9bd2700 1 -- 192.168.123.103:0/1652162598 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fe0a00097e0 con 0x7fe0a40684d0 2026-03-09T00:09:49.020 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.019+0000 7fe0a93d1700 1 --2- 192.168.123.103:0/1652162598 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0a41992c0 0x7fe0a419d730 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:09:49.020 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.020+0000 7fe0a9bd2700 1 --2- 192.168.123.103:0/1652162598 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0a40684d0 0x7fe0a4198d80 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fe09400d8d0 tx=0x7fe09400dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:09:49.020 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.020+0000 7fe09affd700 1 -- 192.168.123.103:0/1652162598 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe094009940 con 0x7fe0a40684d0 2026-03-09T00:09:49.020 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.020+0000 7fe0abe36700 1 -- 192.168.123.103:0/1652162598 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe0a419dd30 con 0x7fe0a40684d0 2026-03-09T00:09:49.020 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.020+0000 7fe0abe36700 1 -- 192.168.123.103:0/1652162598 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe0a419e280 con 0x7fe0a40684d0 2026-03-09T00:09:49.022 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.020+0000 7fe09affd700 1 -- 192.168.123.103:0/1652162598 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe094010460 con 0x7fe0a40684d0 2026-03-09T00:09:49.022 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.021+0000 7fe09affd700 1 -- 192.168.123.103:0/1652162598 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe09400f5d0 con 0x7fe0a40684d0 2026-03-09T00:09:49.022 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.022+0000 7fe09affd700 1 -- 192.168.123.103:0/1652162598 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe09400f7f0 con 0x7fe0a40684d0 2026-03-09T00:09:49.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.023+0000 7fe09affd700 1 --2- 192.168.123.103:0/1652162598 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe0900779e0 0x7fe090079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:49.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.023+0000 7fe09affd700 1 -- 192.168.123.103:0/1652162598 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(89..89 src has 1..89) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fe09409a8f0 con 0x7fe0a40684d0 2026-03-09T00:09:49.023 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.023+0000 7fe0a93d1700 1 --2- 192.168.123.103:0/1652162598 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe0900779e0 0x7fe090079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:49.024 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.023+0000 7fe0a93d1700 1 --2- 192.168.123.103:0/1652162598 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe0900779e0 0x7fe090079ea0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7fe0a000b5c0 tx=0x7fe0a0005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:09:49.024 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.024+0000 7fe0abe36700 1 -- 192.168.123.103:0/1652162598 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe088005320 con 0x7fe0a40684d0 2026-03-09T00:09:49.029 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.029+0000 7fe09affd700 1 -- 192.168.123.103:0/1652162598 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe094062ad0 con 0x7fe0a40684d0 2026-03-09T00:09:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:48 vm06.local ceph-mon[106218]: pgmap v170: 65 pgs: 65 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:09:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:48 vm06.local ceph-mon[106218]: from='client.44237 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:09:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:48 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/1943584304' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:09:49.173 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.173+0000 7fe0abe36700 1 -- 192.168.123.103:0/1652162598 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fe088000bf0 con 0x7fe0900779e0 2026-03-09T00:09:49.175 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.174+0000 7fe09affd700 1 -- 192.168.123.103:0/1652162598 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fe088000bf0 con 0x7fe0900779e0 2026-03-09T00:09:49.178 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T00:09:49.178 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-09T00:09:49.178 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true, 2026-03-09T00:09:49.178 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-09T00:09:49.178 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [ 2026-03-09T00:09:49.178 INFO:teuthology.orchestra.run.vm03.stdout: "crash", 2026-03-09T00:09:49.178 INFO:teuthology.orchestra.run.vm03.stdout: "mon", 2026-03-09T00:09:49.178 INFO:teuthology.orchestra.run.vm03.stdout: "mgr" 2026-03-09T00:09:49.178 INFO:teuthology.orchestra.run.vm03.stdout: ], 2026-03-09T00:09:49.178 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "9/23 daemons upgraded", 2026-03-09T00:09:49.178 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons", 2026-03-09T00:09:49.178 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-09T00:09:49.178 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T00:09:49.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.178+0000 7fe0abe36700 1 -- 192.168.123.103:0/1652162598 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe0900779e0 msgr2=0x7fe090079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:49.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.178+0000 7fe0abe36700 1 --2- 192.168.123.103:0/1652162598 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe0900779e0 0x7fe090079ea0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7fe0a000b5c0 tx=0x7fe0a0005fb0 comp rx=0 tx=0).stop 2026-03-09T00:09:49.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.178+0000 7fe0abe36700 1 -- 192.168.123.103:0/1652162598 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0a40684d0 msgr2=0x7fe0a4198d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:49.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.178+0000 7fe0abe36700 1 --2- 192.168.123.103:0/1652162598 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0a40684d0 0x7fe0a4198d80 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7fe09400d8d0 tx=0x7fe09400dc90 comp rx=0 tx=0).stop 2026-03-09T00:09:49.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.179+0000 7fe0abe36700 1 -- 192.168.123.103:0/1652162598 shutdown_connections 2026-03-09T00:09:49.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.179+0000 7fe0abe36700 1 --2- 192.168.123.103:0/1652162598 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe0900779e0 0x7fe090079ea0 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:49.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.179+0000 7fe0abe36700 1 --2- 192.168.123.103:0/1652162598 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe0a40684d0 0x7fe0a4198d80 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:49.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.179+0000 7fe0abe36700 1 --2- 192.168.123.103:0/1652162598 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe0a41992c0 0x7fe0a419d730 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:49.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.179+0000 7fe0abe36700 1 -- 192.168.123.103:0/1652162598 >> 192.168.123.103:0/1652162598 conn(0x7fe0a4076950 msgr2=0x7fe0a40fe9e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:49.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.179+0000 7fe0abe36700 1 -- 192.168.123.103:0/1652162598 shutdown_connections 2026-03-09T00:09:49.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.179+0000 7fe0abe36700 1 -- 192.168.123.103:0/1652162598 wait complete. 
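[editor's note] "ceph orch upgrade status" already returns the JSON printed above, so progress can be polled directly. A minimal polling sketch using only the fields shown there (in_progress, progress, message):

    # loop until the orchestrator reports the upgrade finished
    while ceph orch upgrade status | jq -e '.in_progress' >/dev/null; do
        ceph orch upgrade status | jq -r '"\(.progress): \(.message)"'
        sleep 30
    done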
2026-03-09T00:09:49.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.268+0000 7f2ad53d1700 1 -- 192.168.123.103:0/4264063887 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ad0101d40 msgr2=0x7f2ad0102120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:49.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.268+0000 7f2ad53d1700 1 --2- 192.168.123.103:0/4264063887 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ad0101d40 0x7f2ad0102120 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f2ac0009b00 tx=0x7f2ac0009e10 comp rx=0 tx=0).stop 2026-03-09T00:09:49.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.268+0000 7f2ad53d1700 1 -- 192.168.123.103:0/4264063887 shutdown_connections 2026-03-09T00:09:49.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.268+0000 7f2ad53d1700 1 --2- 192.168.123.103:0/4264063887 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ad00ff7b0 0x7f2ad00ffc30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:49.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.268+0000 7f2ad53d1700 1 --2- 192.168.123.103:0/4264063887 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ad0101d40 0x7f2ad0102120 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:49.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.268+0000 7f2ad53d1700 1 -- 192.168.123.103:0/4264063887 >> 192.168.123.103:0/4264063887 conn(0x7f2ad0075050 msgr2=0x7f2ad0075460 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:49.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.269+0000 7f2ad53d1700 1 -- 192.168.123.103:0/4264063887 shutdown_connections 2026-03-09T00:09:49.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.269+0000 7f2ad53d1700 1 -- 192.168.123.103:0/4264063887 wait complete. 
2026-03-09T00:09:49.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.269+0000 7f2ad53d1700 1 Processor -- start 2026-03-09T00:09:49.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.269+0000 7f2ad53d1700 1 -- start start 2026-03-09T00:09:49.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.270+0000 7f2ad53d1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ad00ff7b0 0x7f2ad0198f20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:49.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.270+0000 7f2ad53d1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ad0101d40 0x7f2ad0199460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:49.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.270+0000 7f2ad53d1700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2ad0199b40 con 0x7f2ad0101d40 2026-03-09T00:09:49.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.270+0000 7f2ad53d1700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2ad019d830 con 0x7f2ad00ff7b0 2026-03-09T00:09:49.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.270+0000 7f2aceffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ad00ff7b0 0x7f2ad0198f20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:49.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.270+0000 7f2ac7fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ad0101d40 0x7f2ad0199460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:49.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.270+0000 7f2aceffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ad00ff7b0 0x7f2ad0198f20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:49384/0 (socket says 192.168.123.103:49384) 2026-03-09T00:09:49.270 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.270+0000 7f2aceffd700 1 -- 192.168.123.103:0/2473897249 learned_addr learned my addr 192.168.123.103:0/2473897249 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:09:49.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.271+0000 7f2aceffd700 1 -- 192.168.123.103:0/2473897249 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ad0101d40 msgr2=0x7f2ad0199460 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:49.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.271+0000 7f2aceffd700 1 --2- 192.168.123.103:0/2473897249 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ad0101d40 0x7f2ad0199460 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:49.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.271+0000 7f2aceffd700 1 -- 192.168.123.103:0/2473897249 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2ac00097e0 con 0x7f2ad00ff7b0 
2026-03-09T00:09:49.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.271+0000 7f2ac7fff700 1 --2- 192.168.123.103:0/2473897249 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ad0101d40 0x7f2ad0199460 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T00:09:49.271 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.271+0000 7f2aceffd700 1 --2- 192.168.123.103:0/2473897249 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ad00ff7b0 0x7f2ad0198f20 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f2ac00094d0 tx=0x7f2ac0004930 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:09:49.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.271+0000 7f2accff9700 1 -- 192.168.123.103:0/2473897249 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2ac001d070 con 0x7f2ad00ff7b0 2026-03-09T00:09:49.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.271+0000 7f2accff9700 1 -- 192.168.123.103:0/2473897249 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2ac0022470 con 0x7f2ad00ff7b0 2026-03-09T00:09:49.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.271+0000 7f2accff9700 1 -- 192.168.123.103:0/2473897249 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2ac000f650 con 0x7f2ad00ff7b0 2026-03-09T00:09:49.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.271+0000 7f2ad53d1700 1 -- 192.168.123.103:0/2473897249 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2ad019dab0 con 0x7f2ad00ff7b0 2026-03-09T00:09:49.273 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.272+0000 7f2ad53d1700 1 -- 192.168.123.103:0/2473897249 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2ad019dfa0 con 0x7f2ad00ff7b0 2026-03-09T00:09:49.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.274+0000 7f2accff9700 1 -- 192.168.123.103:0/2473897249 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2ac000f830 con 0x7f2ad00ff7b0 2026-03-09T00:09:49.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.274+0000 7f2ad53d1700 1 -- 192.168.123.103:0/2473897249 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2ad010b6a0 con 0x7f2ad00ff7b0 2026-03-09T00:09:49.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.274+0000 7f2accff9700 1 --2- 192.168.123.103:0/2473897249 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f2ab0077870 0x7f2ab0079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:09:49.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.275+0000 7f2accff9700 1 -- 192.168.123.103:0/2473897249 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(89..89 src has 1..89) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f2ac009afe0 con 0x7f2ad00ff7b0 2026-03-09T00:09:49.275 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.275+0000 7f2ac7fff700 1 --2- 192.168.123.103:0/2473897249 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f2ab0077870 0x7f2ab0079d30 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:09:49.276 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.275+0000 7f2ac7fff700 1 --2- 192.168.123.103:0/2473897249 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f2ab0077870 0x7f2ab0079d30 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f2ad019a4a0 tx=0x7f2ab8009380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:09:49.279 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.278+0000 7f2accff9700 1 -- 192.168.123.103:0/2473897249 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2ac0064a00 con 0x7f2ad00ff7b0 2026-03-09T00:09:49.453 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.452+0000 7f2ad53d1700 1 -- 192.168.123.103:0/2473897249 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f2ad004ea90 con 0x7f2ad00ff7b0 2026-03-09T00:09:49.456 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.456+0000 7f2accff9700 1 -- 192.168.123.103:0/2473897249 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7f2ac0060030 con 0x7f2ad00ff7b0 2026-03-09T00:09:49.457 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-09T00:09:49.457 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T00:09:49.457 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled. 
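[editor's note] The HEALTH_WARN above follows directly from the "inline_data enabled" flag in the fsmap earlier in this log; it is a deprecation notice, not a failure. If the feature were to be switched off, the setter is the fs-level flag (it is an assumption here whether the extra confirmation flag is still required when disabling):

    ceph health detail    # shows FS_INLINE_DATA_DEPRECATED, as above
    ceph fs set cephfs inline_data false --yes-i-really-really-mean-it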
2026-03-09T00:09:49.460 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.459+0000 7f2ad53d1700 1 -- 192.168.123.103:0/2473897249 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f2ab0077870 msgr2=0x7f2ab0079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:49.460 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.459+0000 7f2ad53d1700 1 --2- 192.168.123.103:0/2473897249 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f2ab0077870 0x7f2ab0079d30 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f2ad019a4a0 tx=0x7f2ab8009380 comp rx=0 tx=0).stop 2026-03-09T00:09:49.460 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.459+0000 7f2ad53d1700 1 -- 192.168.123.103:0/2473897249 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ad00ff7b0 msgr2=0x7f2ad0198f20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:09:49.460 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.459+0000 7f2ad53d1700 1 --2- 192.168.123.103:0/2473897249 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ad00ff7b0 0x7f2ad0198f20 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f2ac00094d0 tx=0x7f2ac0004930 comp rx=0 tx=0).stop 2026-03-09T00:09:49.460 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.460+0000 7f2ad53d1700 1 -- 192.168.123.103:0/2473897249 shutdown_connections 2026-03-09T00:09:49.460 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.460+0000 7f2ad53d1700 1 --2- 192.168.123.103:0/2473897249 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f2ab0077870 0x7f2ab0079d30 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:49.460 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.460+0000 7f2ad53d1700 1 --2- 192.168.123.103:0/2473897249 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2ad00ff7b0 0x7f2ad0198f20 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:49.460 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.460+0000 7f2ad53d1700 1 --2- 192.168.123.103:0/2473897249 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2ad0101d40 0x7f2ad0199460 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:09:49.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.460+0000 7f2ad53d1700 1 -- 192.168.123.103:0/2473897249 >> 192.168.123.103:0/2473897249 conn(0x7f2ad0075050 msgr2=0x7f2ad00fd760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:09:49.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.460+0000 7f2ad53d1700 1 -- 192.168.123.103:0/2473897249 shutdown_connections 2026-03-09T00:09:49.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:09:49.460+0000 7f2ad53d1700 1 -- 192.168.123.103:0/2473897249 wait complete. 
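[editor's note] The "orch ps" and "versions" dispatches recorded above and below are the test's own progress checks during the staggered upgrade; the same view is available interactively:

    ceph orch ps     # per-daemon status and container image
    ceph versions    # version counts per daemon type; mixed counts are
                     # expected mid-upgrade (9/23 daemons done above)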
2026-03-09T00:09:50.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:49 vm03.local ceph-mon[129670]: from='client.34318 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:09:50.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:49 vm03.local ceph-mon[129670]: from='client.34322 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:09:50.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:49 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/4162042539' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T00:09:50.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:49 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2473897249' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T00:09:50.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:49 vm06.local ceph-mon[106218]: from='client.34318 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:09:50.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:49 vm06.local ceph-mon[106218]: from='client.34322 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:09:50.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:49 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/4162042539' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T00:09:50.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:49 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/2473897249' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T00:09:51.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:50 vm03.local ceph-mon[129670]: from='client.34334 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:09:51.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:50 vm03.local ceph-mon[129670]: pgmap v171: 65 pgs: 65 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:09:51.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:50 vm06.local ceph-mon[106218]: from='client.34334 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:09:51.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:50 vm06.local ceph-mon[106218]: pgmap v171: 65 pgs: 65 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:09:53.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:52 vm06.local ceph-mon[106218]: pgmap v172: 65 pgs: 65 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:09:53.141 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:52 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T00:09:53.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:52 vm03.local ceph-mon[129670]: pgmap v172: 65 pgs: 65 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:09:53.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:52 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], 
"max": 16}]: dispatch 2026-03-09T00:09:54.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:53 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T00:09:54.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:53 vm06.local ceph-mon[106218]: Upgrade: osd.3 is safe to restart 2026-03-09T00:09:54.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:53 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:54.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:53 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T00:09:54.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:53 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:09:54.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:53 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:54.172 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:53 vm06.local ceph-mon[106218]: osd.3 marked itself down and dead 2026-03-09T00:09:54.172 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:53 vm06.local systemd[1]: Stopping Ceph osd.3 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 2026-03-09T00:09:54.172 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:53 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3[64254]: 2026-03-09T00:09:53.795+0000 7fc409ea1700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T00:09:54.172 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:53 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3[64254]: 2026-03-09T00:09:53.795+0000 7fc409ea1700 -1 osd.3 89 *** Got signal Terminated *** 2026-03-09T00:09:54.172 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:53 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3[64254]: 2026-03-09T00:09:53.796+0000 7fc409ea1700 -1 osd.3 89 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T00:09:54.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:53 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-09T00:09:54.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:53 vm03.local ceph-mon[129670]: Upgrade: osd.3 is safe to restart 2026-03-09T00:09:54.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:53 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:54.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:53 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-09T00:09:54.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:53 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:09:54.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:53 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:54.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:53 vm03.local ceph-mon[129670]: osd.3 marked itself down and dead 2026-03-09T00:09:55.074 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:54 vm06.local ceph-mon[106218]: Upgrade: Updating osd.3 2026-03-09T00:09:55.074 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:54 vm06.local ceph-mon[106218]: Deploying daemon osd.3 on vm06 2026-03-09T00:09:55.074 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:54 vm06.local ceph-mon[106218]: pgmap v173: 65 pgs: 65 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:09:55.074 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:54 vm06.local ceph-mon[106218]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T00:09:55.074 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:54 vm06.local ceph-mon[106218]: osdmap e90: 6 total, 5 up, 6 in 2026-03-09T00:09:55.075 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:54 vm06.local podman[113761]: 2026-03-09 00:09:54.820616092 +0000 UTC m=+1.037878816 container died 1ece32056ab68d699994dd966b9a39bc9c3a2251eabc085efa4c405629aa2ed5 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3, org.label-schema.name=CentOS Stream 8 Base Image, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, CEPH_POINT_RELEASE=-18.2.1, io.buildah.version=1.29.1, org.label-schema.vendor=CentOS, GIT_CLEAN=True, org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, ceph=True, org.label-schema.build-date=20240222) 2026-03-09T00:09:55.075 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:54 vm06.local podman[113761]: 2026-03-09 00:09:54.847241423 +0000 UTC m=+1.064504147 container remove 1ece32056ab68d699994dd966b9a39bc9c3a2251eabc085efa4c405629aa2ed5 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3, io.buildah.version=1.29.1, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.1, GIT_BRANCH=HEAD, GIT_CLEAN=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.schema-version=1.0, RELEASE=HEAD, ceph=True, maintainer=Guillaume Abrioux , 
org.label-schema.build-date=20240222, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image) 2026-03-09T00:09:55.075 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:54 vm06.local bash[113761]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3 2026-03-09T00:09:55.075 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:54 vm06.local podman[113828]: 2026-03-09 00:09:54.984010049 +0000 UTC m=+0.019722665 container create 29dbb3cfbb7d68032a7ce2b2956d54c7837cd3a3bcf5bee4ee9679d46b3eeb19 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-deactivate, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T00:09:55.075 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local podman[113828]: 2026-03-09 00:09:55.031973357 +0000 UTC m=+0.067685982 container init 29dbb3cfbb7d68032a7ce2b2956d54c7837cd3a3bcf5bee4ee9679d46b3eeb19 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-deactivate, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS) 2026-03-09T00:09:55.075 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local podman[113828]: 2026-03-09 00:09:55.035054342 +0000 UTC m=+0.070766958 container start 29dbb3cfbb7d68032a7ce2b2956d54c7837cd3a3bcf5bee4ee9679d46b3eeb19 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T00:09:55.075 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local podman[113828]: 2026-03-09 00:09:55.040117268 +0000 
UTC m=+0.075829875 container attach 29dbb3cfbb7d68032a7ce2b2956d54c7837cd3a3bcf5bee4ee9679d46b3eeb19 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T00:09:55.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:54 vm03.local ceph-mon[129670]: Upgrade: Updating osd.3 2026-03-09T00:09:55.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:54 vm03.local ceph-mon[129670]: Deploying daemon osd.3 on vm06 2026-03-09T00:09:55.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:54 vm03.local ceph-mon[129670]: pgmap v173: 65 pgs: 65 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:09:55.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:54 vm03.local ceph-mon[129670]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T00:09:55.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:54 vm03.local ceph-mon[129670]: osdmap e90: 6 total, 5 up, 6 in 2026-03-09T00:09:55.372 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local podman[113828]: 2026-03-09 00:09:54.976963169 +0000 UTC m=+0.012675785 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T00:09:55.372 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local conmon[113840]: conmon 29dbb3cfbb7d68032a7c : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-29dbb3cfbb7d68032a7ce2b2956d54c7837cd3a3bcf5bee4ee9679d46b3eeb19.scope/container/memory.events 2026-03-09T00:09:55.372 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local podman[113828]: 2026-03-09 00:09:55.161237195 +0000 UTC m=+0.196949811 container died 29dbb3cfbb7d68032a7ce2b2956d54c7837cd3a3bcf5bee4ee9679d46b3eeb19 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-deactivate, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default) 2026-03-09T00:09:55.372 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local podman[113828]: 2026-03-09 00:09:55.181225496 +0000 UTC m=+0.216938112 container remove 
29dbb3cfbb7d68032a7ce2b2956d54c7837cd3a3bcf5bee4ee9679d46b3eeb19 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-deactivate, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T00:09:55.372 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.3.service: Deactivated successfully. 2026-03-09T00:09:55.372 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.3.service: Unit process 113840 (conmon) remains running after unit stopped. 2026-03-09T00:09:55.372 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.3.service: Unit process 113848 (podman) remains running after unit stopped. 2026-03-09T00:09:55.372 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local systemd[1]: Stopped Ceph osd.3 for ae8f0172-1b4a-11f1-916a-712b2ac006b7. 2026-03-09T00:09:55.372 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.3.service: Consumed 52.853s CPU time, 1.3G memory peak. 2026-03-09T00:09:55.629 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local systemd[1]: Starting Ceph osd.3 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 
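[editor's note] The osd.3 restart above follows the orchestrator's standard gate: "osd ok-to-stop" is dispatched to the mons, the daemon marks itself down and dead, the old container is removed, and a one-shot activate container primes the data directory before the new unit starts. The "Failed to activate via raw" line further down is the expected fallback probe before "ceph-volume lvm activate successful". The same safety check can be run by hand:

    # ask the mons whether stopping this OSD would leave PGs unservable
    ceph osd ok-to-stop 3
    ceph -s    # OSD_DOWN / peering warnings should clear once osd.3 rejoins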
2026-03-09T00:09:55.629 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local podman[113932]: 2026-03-09 00:09:55.484619195 +0000 UTC m=+0.021335012 container create 68a1b364da30345e7097e710a5ba5461441657ba31fcd812df2e5561d7800c21 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate, OSD_FLAVOR=default, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) 2026-03-09T00:09:55.629 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local podman[113932]: 2026-03-09 00:09:55.530259966 +0000 UTC m=+0.066975783 container init 68a1b364da30345e7097e710a5ba5461441657ba31fcd812df2e5561d7800c21 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True) 2026-03-09T00:09:55.629 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local podman[113932]: 2026-03-09 00:09:55.533402346 +0000 UTC m=+0.070118163 container start 68a1b364da30345e7097e710a5ba5461441657ba31fcd812df2e5561d7800c21 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T00:09:55.630 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local podman[113932]: 2026-03-09 00:09:55.538124144 +0000 UTC m=+0.074839961 container attach 68a1b364da30345e7097e710a5ba5461441657ba31fcd812df2e5561d7800c21 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate, 
org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T00:09:55.630 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local podman[113932]: 2026-03-09 00:09:55.475476443 +0000 UTC m=+0.012192271 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T00:09:55.630 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate[113943]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:09:55.630 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local bash[113932]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:09:55.630 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate[113943]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:09:55.630 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:55 vm06.local bash[113932]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:09:56.421 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate[113943]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T00:09:56.421 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate[113943]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:09:56.421 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local bash[113932]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T00:09:56.421 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local bash[113932]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:09:56.421 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate[113943]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:09:56.421 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local bash[113932]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:09:56.421 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate[113943]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-09T00:09:56.421 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local bash[113932]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-09T00:09:56.421 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate[113943]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev 
/dev/ceph-c5fb3503-0e0e-4b01-9f26-4e2e31866a48/osd-block-8b49bccb-fd91-44f4-831e-a401044f0e64 --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-09T00:09:56.421 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local bash[113932]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-c5fb3503-0e0e-4b01-9f26-4e2e31866a48/osd-block-8b49bccb-fd91-44f4-831e-a401044f0e64 --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-09T00:09:56.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:56 vm06.local ceph-mon[106218]: pgmap v175: 65 pgs: 18 peering, 6 stale+active+clean, 41 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:09:56.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:56 vm06.local ceph-mon[106218]: osdmap e91: 6 total, 5 up, 6 in 2026-03-09T00:09:56.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:56 vm06.local ceph-mon[106218]: Health check failed: Reduced data availability: 10 pgs peering (PG_AVAILABILITY) 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate[113943]: Running command: /usr/bin/ln -snf /dev/ceph-c5fb3503-0e0e-4b01-9f26-4e2e31866a48/osd-block-8b49bccb-fd91-44f4-831e-a401044f0e64 /var/lib/ceph/osd/ceph-3/block 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local bash[113932]: Running command: /usr/bin/ln -snf /dev/ceph-c5fb3503-0e0e-4b01-9f26-4e2e31866a48/osd-block-8b49bccb-fd91-44f4-831e-a401044f0e64 /var/lib/ceph/osd/ceph-3/block 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate[113943]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local bash[113932]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate[113943]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local bash[113932]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate[113943]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local bash[113932]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate[113943]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local bash[113932]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local podman[114156]: 2026-03-09 00:09:56.553861138 +0000 UTC m=+0.012010188 container died 68a1b364da30345e7097e710a5ba5461441657ba31fcd812df2e5561d7800c21 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, ceph=True) 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local podman[114156]: 2026-03-09 00:09:56.572154136 +0000 UTC m=+0.030303186 container remove 68a1b364da30345e7097e710a5ba5461441657ba31fcd812df2e5561d7800c21 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-activate, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local podman[114193]: 2026-03-09 00:09:56.675194014 +0000 UTC m=+0.018590745 container create 8e61be6171398e1b405adf5b066946b3ea009a52ba01924d5d95b99f781bbec1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local podman[114193]: 2026-03-09 00:09:56.723083544 +0000 UTC m=+0.066480294 container init 8e61be6171398e1b405adf5b066946b3ea009a52ba01924d5d95b99f781bbec1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0) 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local podman[114193]: 2026-03-09 00:09:56.725602268 +0000 UTC m=+0.068999009 container start 8e61be6171398e1b405adf5b066946b3ea009a52ba01924d5d95b99f781bbec1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local bash[114193]: 8e61be6171398e1b405adf5b066946b3ea009a52ba01924d5d95b99f781bbec1 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local podman[114193]: 2026-03-09 00:09:56.668676425 +0000 UTC m=+0.012073177 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T00:09:56.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:56 vm06.local systemd[1]: Started Ceph osd.3 for ae8f0172-1b4a-11f1-916a-712b2ac006b7. 
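After the raw-mode attempt fails ("Failed to activate via raw: did not find any matching OSD to activate"), the activate container falls back to ceph-volume's LVM path, and each step is echoed above as a "Running command: ..." line. A sketch that replays the core steps for osd.3, with the device path and OSD directory copied verbatim from the log; this only illustrates the sequence and is not cephadm's own implementation:

    import subprocess

    # Both values copied verbatim from the "Running command:" lines above.
    DEV = ("/dev/ceph-c5fb3503-0e0e-4b01-9f26-4e2e31866a48/"
           "osd-block-8b49bccb-fd91-44f4-831e-a401044f0e64")
    OSD_DIR = "/var/lib/ceph/osd/ceph-3"

    STEPS = [
        # Rebuild the OSD dir from the BlueStore label, without the mons.
        ["/usr/bin/ceph-bluestore-tool", "--cluster=ceph", "prime-osd-dir",
         "--dev", DEV, "--path", OSD_DIR, "--no-mon-config"],
        # Point the block symlink at the LV, then fix ownership.
        ["/usr/bin/ln", "-snf", DEV, OSD_DIR + "/block"],
        ["/usr/bin/chown", "-h", "ceph:ceph", OSD_DIR + "/block"],
        ["/usr/bin/chown", "-R", "ceph:ceph", OSD_DIR],
    ]

    for cmd in STEPS:
        subprocess.run(cmd, check=True)  # mirrors "Running command: ..."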
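The podman events interleaved above also make cephadm's two-container pattern visible: a short-lived "-activate" container runs create, init, start, attach, died, remove, and only then is the long-lived osd container started by the systemd unit. A small sketch, assuming the log text is already in hand as one string, that folds those events into per-container lifecycles (the EVT regex and the lifecycles helper are illustrative, not part of podman or teuthology):

    import re
    from collections import defaultdict

    # Matches podman event lines of the shape seen above, e.g.
    #   "container start 68a1b3...0c21 (image=..., name=ceph-...-osd-3, ...)"
    EVT = re.compile(r'container (create|init|start|attach|died|remove) '
                     r'[0-9a-f]{64} \(image=[^,]+, name=([^,)]+)')

    def lifecycles(log_text: str) -> dict:
        seen = defaultdict(list)
        for verb, name in EVT.findall(log_text):
            seen[name].append(verb)
        return dict(seen)

    # Fed the entries above, this yields roughly:
    #   {'ceph-...-osd-3-activate': ['create', 'init', 'start',
    #                                'attach', 'died', 'remove'],
    #    'ceph-...-osd-3': ['create', 'init', 'start']}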
2026-03-09T00:09:57.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:56 vm03.local ceph-mon[129670]: pgmap v175: 65 pgs: 18 peering, 6 stale+active+clean, 41 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:09:57.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:56 vm03.local ceph-mon[129670]: osdmap e91: 6 total, 5 up, 6 in 2026-03-09T00:09:57.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:56 vm03.local ceph-mon[129670]: Health check failed: Reduced data availability: 10 pgs peering (PG_AVAILABILITY) 2026-03-09T00:09:57.671 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:09:57 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3[114205]: 2026-03-09T00:09:57.326+0000 7fecd88ae740 -1 Falling back to public interface 2026-03-09T00:09:58.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:58.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:58.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:57 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:09:58.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:58.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:58.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:57 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:09:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:58 vm03.local ceph-mon[129670]: pgmap v177: 65 pgs: 4 active+undersized, 18 peering, 3 stale+active+clean, 3 active+undersized+degraded, 37 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 9/231 objects degraded (3.896%) 2026-03-09T00:09:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:58 vm03.local ceph-mon[129670]: Health check failed: Degraded data redundancy: 9/231 objects degraded (3.896%), 3 pgs degraded (PG_DEGRADED) 2026-03-09T00:09:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:09:58 vm03.local 
ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:09:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:58 vm06.local ceph-mon[106218]: pgmap v177: 65 pgs: 4 active+undersized, 18 peering, 3 stale+active+clean, 3 active+undersized+degraded, 37 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 9/231 objects degraded (3.896%) 2026-03-09T00:09:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:58 vm06.local ceph-mon[106218]: Health check failed: Degraded data redundancy: 9/231 objects degraded (3.896%), 3 pgs degraded (PG_DEGRADED) 2026-03-09T00:09:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:09:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:09:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": 
"versions"}]: dispatch 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: Upgrade: unsafe to stop osd(s) at this time (14 PGs are or would become offline) 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: pgmap v178: 65 pgs: 10 active+undersized, 18 peering, 10 active+undersized+degraded, 27 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 26/231 objects degraded (11.255%) 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data; 1 osds down; Reduced data availability: 10 pgs peering; Degraded data redundancy: 9/231 objects degraded (3.896%), 3 pgs degraded 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: fs cephfs has deprecated feature inline_data enabled. 
2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: [WRN] OSD_DOWN: 1 osds down 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: osd.3 (root=default,host=vm06) is down 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: [WRN] PG_AVAILABILITY: Reduced data availability: 10 pgs peering 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: pg 2.7 is stuck peering for 3m, current state peering, last acting [4,2] 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: pg 2.8 is stuck peering for 3m, current state peering, last acting [5,0] 2026-03-09T00:10:00.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: pg 2.b is stuck peering for 3m, current state peering, last acting [4,5] 2026-03-09T00:10:00.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: pg 2.14 is stuck peering for 2m, current state peering, last acting [4,5] 2026-03-09T00:10:00.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: pg 2.1a is stuck peering for 3m, current state peering, last acting [4,5] 2026-03-09T00:10:00.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: pg 2.1d is stuck peering for 3m, current state peering, last acting [5,0] 2026-03-09T00:10:00.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: pg 3.a is stuck peering for 3m, current state peering, last acting [4,1] 2026-03-09T00:10:00.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: pg 3.11 is stuck peering for 3m, current state peering, last acting [4,0] 2026-03-09T00:10:00.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: pg 3.13 is stuck peering for 3m, current state peering, last acting [5,2] 2026-03-09T00:10:00.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: pg 3.14 is stuck peering for 3m, current state peering, last acting [4,5] 2026-03-09T00:10:00.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: [WRN] PG_DEGRADED: Degraded data redundancy: 9/231 objects degraded (3.896%), 3 pgs degraded 2026-03-09T00:10:00.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: pg 1.0 is active+undersized+degraded, acting [0,1] 2026-03-09T00:10:00.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: pg 2.5 is active+undersized+degraded, acting [0,4] 2026-03-09T00:10:00.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:00 vm06.local ceph-mon[106218]: pg 2.1f is active+undersized+degraded, acting [0,4] 2026-03-09T00:10:00.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 
cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: Upgrade: unsafe to stop osd(s) at this time (14 PGs are or would become offline) 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: pgmap v178: 65 pgs: 10 active+undersized, 18 peering, 10 active+undersized+degraded, 27 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 26/231 objects degraded (11.255%) 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data; 1 osds down; Reduced data availability: 10 pgs peering; Degraded data redundancy: 9/231 objects degraded (3.896%), 3 pgs degraded 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: fs cephfs has deprecated feature inline_data enabled. 
2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: [WRN] OSD_DOWN: 1 osds down 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: osd.3 (root=default,host=vm06) is down 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: [WRN] PG_AVAILABILITY: Reduced data availability: 10 pgs peering 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: pg 2.7 is stuck peering for 3m, current state peering, last acting [4,2] 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: pg 2.8 is stuck peering for 3m, current state peering, last acting [5,0] 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: pg 2.b is stuck peering for 3m, current state peering, last acting [4,5] 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: pg 2.14 is stuck peering for 2m, current state peering, last acting [4,5] 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: pg 2.1a is stuck peering for 3m, current state peering, last acting [4,5] 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: pg 2.1d is stuck peering for 3m, current state peering, last acting [5,0] 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: pg 3.a is stuck peering for 3m, current state peering, last acting [4,1] 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: pg 3.11 is stuck peering for 3m, current state peering, last acting [4,0] 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: pg 3.13 is stuck peering for 3m, current state peering, last acting [5,2] 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: pg 3.14 is stuck peering for 3m, current state peering, last acting [4,5] 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: [WRN] PG_DEGRADED: Degraded data redundancy: 9/231 objects degraded (3.896%), 3 pgs degraded 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: pg 1.0 is active+undersized+degraded, acting [0,1] 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: pg 2.5 is active+undersized+degraded, acting [0,4] 2026-03-09T00:10:00.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:00 vm03.local ceph-mon[129670]: pg 2.1f is active+undersized+degraded, acting [0,4] 2026-03-09T00:10:01.912 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:01 vm06.local ceph-mon[106218]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 10 pgs peering) 2026-03-09T00:10:01.913 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:10:01 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3[114205]: 2026-03-09T00:10:01.525+0000 7fecd88ae740 -1 osd.3 0 read_superblock omap replica is missing. 
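Both monitors emit the same health detail, one "pg ..." line per problem PG. A sketch that turns those detail lines into structured records (pgid, stuck state and duration when present, current state, acting set); the two line shapes handled here are exactly the ones in the dump above, and parse_pg_detail is an illustrative helper, not a Ceph API:

    import re

    PG_RE = re.compile(
        r'pg (?P<pgid>\d+\.[0-9a-f]+) is '
        r'(?:stuck (?P<stuck>\w+) for (?P<dur>\S+), current state )?'
        r'(?P<state>[a-z+_]+), (?:last )?acting \[(?P<acting>[\d,]+)\]'
    )

    def parse_pg_detail(line: str):
        """Parse one health-detail PG line into a dict, or None."""
        m = PG_RE.search(line)
        if not m:
            return None
        rec = m.groupdict()
        rec["acting"] = [int(x) for x in rec["acting"].split(",")]
        return rec

    print(parse_pg_detail(
        "pg 2.7 is stuck peering for 3m, current state peering, "
        "last acting [4,2]"))
    print(parse_pg_detail(
        "pg 1.0 is active+undersized+degraded, acting [0,1]"))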
2026-03-09T00:10:01.913 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:10:01 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3[114205]: 2026-03-09T00:10:01.912+0000 7fecd88ae740 -1 osd.3 89 log_to_monitors true 2026-03-09T00:10:02.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:01 vm03.local ceph-mon[129670]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 10 pgs peering) 2026-03-09T00:10:03.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:02 vm03.local ceph-mon[129670]: pgmap v179: 65 pgs: 21 active+undersized, 17 active+undersized+degraded, 27 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 51/231 objects degraded (22.078%) 2026-03-09T00:10:03.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:02 vm03.local ceph-mon[129670]: from='osd.3 [v2:192.168.123.106:6800/4242539988,v1:192.168.123.106:6801/4242539988]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T00:10:03.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:02 vm03.local ceph-mon[129670]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T00:10:03.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:02 vm06.local ceph-mon[106218]: pgmap v179: 65 pgs: 21 active+undersized, 17 active+undersized+degraded, 27 active+clean; 209 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 51/231 objects degraded (22.078%) 2026-03-09T00:10:03.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:02 vm06.local ceph-mon[106218]: from='osd.3 [v2:192.168.123.106:6800/4242539988,v1:192.168.123.106:6801/4242539988]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T00:10:03.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:02 vm06.local ceph-mon[106218]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-09T00:10:03.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:03 vm06.local ceph-mon[106218]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-09T00:10:03.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:03 vm06.local ceph-mon[106218]: from='osd.3 [v2:192.168.123.106:6800/4242539988,v1:192.168.123.106:6801/4242539988]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:10:03.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:03 vm06.local ceph-mon[106218]: osdmap e92: 6 total, 5 up, 6 in 2026-03-09T00:10:03.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:03 vm06.local ceph-mon[106218]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:10:03.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:03 vm06.local ceph-mon[106218]: from='osd.3 ' entity='osd.3' 2026-03-09T00:10:03.921 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:10:03 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3[114205]: 2026-03-09T00:10:03.648+0000 7feccfe47640 -1 osd.3 89 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T00:10:04.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:03 vm03.local 
ceph-mon[129670]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-09T00:10:04.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:03 vm03.local ceph-mon[129670]: from='osd.3 [v2:192.168.123.106:6800/4242539988,v1:192.168.123.106:6801/4242539988]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:10:04.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:03 vm03.local ceph-mon[129670]: osdmap e92: 6 total, 5 up, 6 in 2026-03-09T00:10:04.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:03 vm03.local ceph-mon[129670]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:10:04.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:03 vm03.local ceph-mon[129670]: from='osd.3 ' entity='osd.3' 2026-03-09T00:10:05.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:04 vm03.local ceph-mon[129670]: pgmap v181: 65 pgs: 21 active+undersized, 17 active+undersized+degraded, 27 active+clean; 209 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 51/231 objects degraded (22.078%) 2026-03-09T00:10:05.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:04 vm03.local ceph-mon[129670]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T00:10:05.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:04 vm03.local ceph-mon[129670]: osd.3 [v2:192.168.123.106:6800/4242539988,v1:192.168.123.106:6801/4242539988] boot 2026-03-09T00:10:05.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:04 vm03.local ceph-mon[129670]: osdmap e93: 6 total, 6 up, 6 in 2026-03-09T00:10:05.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:04 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:10:05.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:04 vm06.local ceph-mon[106218]: pgmap v181: 65 pgs: 21 active+undersized, 17 active+undersized+degraded, 27 active+clean; 209 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 51/231 objects degraded (22.078%) 2026-03-09T00:10:05.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:04 vm06.local ceph-mon[106218]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T00:10:05.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:04 vm06.local ceph-mon[106218]: osd.3 [v2:192.168.123.106:6800/4242539988,v1:192.168.123.106:6801/4242539988] boot 2026-03-09T00:10:05.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:04 vm06.local ceph-mon[106218]: osdmap e93: 6 total, 6 up, 6 in 2026-03-09T00:10:05.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:04 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-09T00:10:06.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:06 vm06.local ceph-mon[106218]: pgmap v183: 65 pgs: 8 peering, 15 active+undersized, 15 active+undersized+degraded, 27 active+clean; 209 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 43/231 objects degraded (18.615%) 2026-03-09T00:10:06.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:06 vm06.local ceph-mon[106218]: osdmap e94: 6 total, 6 up, 6 in 2026-03-09T00:10:06.921 
INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:06 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 43/231 objects degraded (18.615%), 15 pgs degraded (PG_DEGRADED) 2026-03-09T00:10:07.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:06 vm03.local ceph-mon[129670]: pgmap v183: 65 pgs: 8 peering, 15 active+undersized, 15 active+undersized+degraded, 27 active+clean; 209 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 43/231 objects degraded (18.615%) 2026-03-09T00:10:07.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:06 vm03.local ceph-mon[129670]: osdmap e94: 6 total, 6 up, 6 in 2026-03-09T00:10:07.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:06 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 43/231 objects degraded (18.615%), 15 pgs degraded (PG_DEGRADED) 2026-03-09T00:10:09.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:08 vm03.local ceph-mon[129670]: pgmap v185: 65 pgs: 8 peering, 12 active+undersized, 14 active+undersized+degraded, 31 active+clean; 209 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 40/231 objects degraded (17.316%) 2026-03-09T00:10:09.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:08 vm06.local ceph-mon[106218]: pgmap v185: 65 pgs: 8 peering, 12 active+undersized, 14 active+undersized+degraded, 31 active+clean; 209 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 40/231 objects degraded (17.316%) 2026-03-09T00:10:10.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:09 vm03.local ceph-mon[129670]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 40/231 objects degraded (17.316%), 14 pgs degraded) 2026-03-09T00:10:10.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:09 vm06.local ceph-mon[106218]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 40/231 objects degraded (17.316%), 14 pgs degraded) 2026-03-09T00:10:11.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:10 vm03.local ceph-mon[129670]: pgmap v186: 65 pgs: 8 peering, 57 active+clean; 209 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:10:11.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:10 vm06.local ceph-mon[106218]: pgmap v186: 65 pgs: 8 peering, 57 active+clean; 209 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:10:13.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:12 vm03.local ceph-mon[129670]: pgmap v187: 65 pgs: 65 active+clean; 209 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:10:13.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:12 vm06.local ceph-mon[106218]: pgmap v187: 65 pgs: 65 active+clean; 209 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:10:14.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:14 vm06.local ceph-mon[106218]: pgmap v188: 65 pgs: 65 active+clean; 209 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:10:14.922 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:14 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:14.922 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:14 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:10:14.922 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:14 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T00:10:15.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:14 vm03.local ceph-mon[129670]: pgmap v188: 65 pgs: 65 active+clean; 209 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:10:15.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:14 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:15.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:14 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:10:15.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:14 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T00:10:15.820 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:15 vm06.local systemd[1]: Stopping Ceph osd.4 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 2026-03-09T00:10:15.820 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:15 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4[70662]: 2026-03-09T00:10:15.676+0000 7f78c7360700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T00:10:15.820 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:15 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4[70662]: 2026-03-09T00:10:15.676+0000 7f78c7360700 -1 osd.4 94 *** Got signal Terminated *** 2026-03-09T00:10:15.820 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:15 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4[70662]: 2026-03-09T00:10:15.676+0000 7f78c7360700 -1 osd.4 94 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T00:10:16.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:15 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T00:10:16.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:15 vm03.local ceph-mon[129670]: Upgrade: osd.4 is safe to restart 2026-03-09T00:10:16.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:15 vm03.local ceph-mon[129670]: Upgrade: Updating osd.4 2026-03-09T00:10:16.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:15 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:16.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:15 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T00:10:16.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:15 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:10:16.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:15 vm03.local ceph-mon[129670]: Deploying daemon osd.4 on vm06 2026-03-09T00:10:16.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:15 vm03.local ceph-mon[129670]: osd.4 marked itself down and dead 2026-03-09T00:10:16.095 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:15 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-09T00:10:16.095 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:15 vm06.local ceph-mon[106218]: Upgrade: osd.4 is safe to restart 2026-03-09T00:10:16.095 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:15 vm06.local ceph-mon[106218]: Upgrade: Updating osd.4 2026-03-09T00:10:16.095 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:15 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:16.095 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:15 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-09T00:10:16.095 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:15 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:10:16.095 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:15 vm06.local ceph-mon[106218]: Deploying daemon osd.4 on vm06 2026-03-09T00:10:16.095 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:15 vm06.local ceph-mon[106218]: osd.4 marked itself down and dead 2026-03-09T00:10:16.095 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:15 vm06.local podman[118206]: 2026-03-09 00:10:15.917592786 +0000 UTC m=+0.257679782 container died ee6260a1124c3569e069a2b5e660ee88abf6f3c5688561762ac15be82b16813c (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4, io.buildah.version=1.29.1, org.label-schema.license=GPLv2, GIT_BRANCH=HEAD, RELEASE=HEAD, ceph=True, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.1, GIT_CLEAN=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , org.label-schema.build-date=20240222, 
org.label-schema.vendor=CentOS) 2026-03-09T00:10:16.095 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:15 vm06.local podman[118206]: 2026-03-09 00:10:15.935307221 +0000 UTC m=+0.275394217 container remove ee6260a1124c3569e069a2b5e660ee88abf6f3c5688561762ac15be82b16813c (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.1, ceph=True, maintainer=Guillaume Abrioux , org.label-schema.build-date=20240222, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, GIT_CLEAN=True, io.buildah.version=1.29.1, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD) 2026-03-09T00:10:16.095 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:15 vm06.local bash[118206]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4 2026-03-09T00:10:16.357 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local podman[118272]: 2026-03-09 00:10:16.096194765 +0000 UTC m=+0.020377719 container create 9c80ab0bb3f7e8be9187a6ef4fbb5a9082bb584938934e8c9bd23545c0725cf3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0) 2026-03-09T00:10:16.357 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local podman[118272]: 2026-03-09 00:10:16.13853493 +0000 UTC m=+0.062717894 container init 9c80ab0bb3f7e8be9187a6ef4fbb5a9082bb584938934e8c9bd23545c0725cf3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-deactivate, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T00:10:16.357 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local podman[118272]: 2026-03-09 00:10:16.144520803 +0000 UTC m=+0.068703757 container start 9c80ab0bb3f7e8be9187a6ef4fbb5a9082bb584938934e8c9bd23545c0725cf3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-deactivate, org.label-schema.vendor=CentOS, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T00:10:16.357 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local podman[118272]: 2026-03-09 00:10:16.145838479 +0000 UTC m=+0.070021433 container attach 9c80ab0bb3f7e8be9187a6ef4fbb5a9082bb584938934e8c9bd23545c0725cf3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-deactivate, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T00:10:16.357 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local podman[118272]: 2026-03-09 00:10:16.088447266 +0000 UTC m=+0.012630231 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T00:10:16.357 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local podman[118272]: 2026-03-09 00:10:16.292542679 +0000 UTC m=+0.216725623 container died 9c80ab0bb3f7e8be9187a6ef4fbb5a9082bb584938934e8c9bd23545c0725cf3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T00:10:16.357 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local podman[118272]: 2026-03-09 00:10:16.330874811 +0000 UTC m=+0.255057765 container remove 9c80ab0bb3f7e8be9187a6ef4fbb5a9082bb584938934e8c9bd23545c0725cf3 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T00:10:16.357 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.4.service: Deactivated successfully. 2026-03-09T00:10:16.357 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local systemd[1]: Stopped Ceph osd.4 for ae8f0172-1b4a-11f1-916a-712b2ac006b7. 2026-03-09T00:10:16.357 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.4.service: Consumed 44.622s CPU time. 2026-03-09T00:10:16.638 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local systemd[1]: Starting Ceph osd.4 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 2026-03-09T00:10:16.922 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:16 vm06.local ceph-mon[106218]: pgmap v189: 65 pgs: 65 active+clean; 209 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail 2026-03-09T00:10:16.922 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:16 vm06.local ceph-mon[106218]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T00:10:16.922 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:16 vm06.local ceph-mon[106218]: osdmap e95: 6 total, 5 up, 6 in 2026-03-09T00:10:16.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local podman[118375]: 2026-03-09 00:10:16.63836117 +0000 UTC m=+0.019981609 container create 4a03820bdb3a4a9918a5706820f6a8d176814b542e2b4c5d33e50f3d2cc1ac3c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) 2026-03-09T00:10:16.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local podman[118375]: 2026-03-09 00:10:16.691938254 +0000 UTC m=+0.073558693 container init 4a03820bdb3a4a9918a5706820f6a8d176814b542e2b4c5d33e50f3d2cc1ac3c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3) 2026-03-09T00:10:16.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local podman[118375]: 2026-03-09 00:10:16.694824467 +0000 UTC m=+0.076444906 container start 4a03820bdb3a4a9918a5706820f6a8d176814b542e2b4c5d33e50f3d2cc1ac3c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T00:10:16.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local podman[118375]: 2026-03-09 00:10:16.699197401 +0000 UTC m=+0.080817851 container attach 4a03820bdb3a4a9918a5706820f6a8d176814b542e2b4c5d33e50f3d2cc1ac3c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T00:10:16.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local podman[118375]: 2026-03-09 00:10:16.628891296 +0000 UTC m=+0.010511746 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T00:10:16.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate[118387]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:10:16.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local bash[118375]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:10:16.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate[118387]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:10:16.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:16 vm06.local bash[118375]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:10:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 
00:10:16 vm03.local ceph-mon[129670]: pgmap v189: 65 pgs: 65 active+clean; 209 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail
2026-03-09T00:10:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:16 vm03.local ceph-mon[129670]: Health check failed: 1 osds down (OSD_DOWN)
2026-03-09T00:10:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:16 vm03.local ceph-mon[129670]: osdmap e95: 6 total, 5 up, 6 in
2026-03-09T00:10:17.662 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate[118387]: --> Failed to activate via raw: did not find any matching OSD to activate
2026-03-09T00:10:17.662 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate[118387]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:10:17.662 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local bash[118375]: --> Failed to activate via raw: did not find any matching OSD to activate
2026-03-09T00:10:17.662 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local bash[118375]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:10:17.662 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate[118387]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:10:17.662 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local bash[118375]: Running command: /usr/bin/ceph-authtool --gen-print-key
2026-03-09T00:10:17.662 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate[118387]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
2026-03-09T00:10:17.662 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local bash[118375]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
2026-03-09T00:10:17.662 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate[118387]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-753dde67-ded6-4a33-bc24-c6eb78d96e11/osd-block-8c3a4d00-bb0a-4f59-b53b-83364e99627b --path /var/lib/ceph/osd/ceph-4 --no-mon-config
2026-03-09T00:10:17.662 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local bash[118375]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-753dde67-ded6-4a33-bc24-c6eb78d96e11/osd-block-8c3a4d00-bb0a-4f59-b53b-83364e99627b --path /var/lib/ceph/osd/ceph-4 --no-mon-config
2026-03-09T00:10:17.922 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:17 vm06.local ceph-mon[106218]: osdmap e96: 6 total, 5 up, 6 in
2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate[118387]: Running command: /usr/bin/ln -snf /dev/ceph-753dde67-ded6-4a33-bc24-c6eb78d96e11/osd-block-8c3a4d00-bb0a-4f59-b53b-83364e99627b /var/lib/ceph/osd/ceph-4/block
2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local bash[118375]: Running command: /usr/bin/ln -snf /dev/ceph-753dde67-ded6-4a33-bc24-c6eb78d96e11/osd-block-8c3a4d00-bb0a-4f59-b53b-83364e99627b /var/lib/ceph/osd/ceph-4/block
2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local
ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate[118387]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block 2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local bash[118375]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block 2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate[118387]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local bash[118375]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate[118387]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local bash[118375]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate[118387]: --> ceph-volume lvm activate successful for osd ID: 4 2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local bash[118375]: --> ceph-volume lvm activate successful for osd ID: 4 2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local conmon[118387]: conmon 4a03820bdb3a4a9918a5 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4a03820bdb3a4a9918a5706820f6a8d176814b542e2b4c5d33e50f3d2cc1ac3c.scope/container/memory.events 2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local podman[118375]: 2026-03-09 00:10:17.703613347 +0000 UTC m=+1.085233786 container died 4a03820bdb3a4a9918a5706820f6a8d176814b542e2b4c5d33e50f3d2cc1ac3c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, io.buildah.version=1.41.3) 2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local podman[118375]: 2026-03-09 00:10:17.72617864 +0000 UTC m=+1.107799079 container remove 4a03820bdb3a4a9918a5706820f6a8d176814b542e2b4c5d33e50f3d2cc1ac3c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4-activate, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph 
Release Team , OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS) 2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local podman[118624]: 2026-03-09 00:10:17.835621011 +0000 UTC m=+0.023404303 container create 21cf4dc588992eb75797fe8a0a695f9a620964a430a336f42cda0c916573077e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3) 2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local podman[118624]: 2026-03-09 00:10:17.872705328 +0000 UTC m=+0.060488620 container init 21cf4dc588992eb75797fe8a0a695f9a620964a430a336f42cda0c916573077e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223) 2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local podman[118624]: 2026-03-09 00:10:17.876747884 +0000 UTC m=+0.064531176 container start 21cf4dc588992eb75797fe8a0a695f9a620964a430a336f42cda0c916573077e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local bash[118624]: 
21cf4dc588992eb75797fe8a0a695f9a620964a430a336f42cda0c916573077e
2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local podman[118624]: 2026-03-09 00:10:17.824838251 +0000 UTC m=+0.012621543 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-09T00:10:17.922 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:17 vm06.local systemd[1]: Started Ceph osd.4 for ae8f0172-1b4a-11f1-916a-712b2ac006b7.
2026-03-09T00:10:18.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:17 vm03.local ceph-mon[129670]: osdmap e96: 6 total, 5 up, 6 in
2026-03-09T00:10:18.806 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:18 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4[118634]: 2026-03-09T00:10:18.459+0000 7fb8acca7740 -1 Falling back to public interface
2026-03-09T00:10:19.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:18 vm03.local ceph-mon[129670]: pgmap v192: 65 pgs: 8 peering, 4 stale+active+clean, 53 active+clean; 209 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail
2026-03-09T00:10:19.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:18 vm03.local ceph-mon[129670]: Health check failed: Reduced data availability: 1 pg inactive, 2 pgs peering (PG_AVAILABILITY)
2026-03-09T00:10:19.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:18 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:19.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:18 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:19.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:18 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:10:19.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:18 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:19.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:18 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:19.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:18 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:18 vm06.local ceph-mon[106218]: pgmap v192: 65 pgs: 8 peering, 4 stale+active+clean, 53 active+clean; 209 MiB data, 2.1 GiB used, 118 GiB / 120 GiB avail
2026-03-09T00:10:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:18 vm06.local ceph-mon[106218]: Health check failed: Reduced data availability: 1 pg inactive, 2 pgs peering (PG_AVAILABILITY)
2026-03-09T00:10:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:18 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:18 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:18 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format":
"json"}]: dispatch 2026-03-09T00:10:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:18 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:18 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:18 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:19.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.536+0000 7f31080b8700 1 -- 192.168.123.103:0/1579980059 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f31001033c0 msgr2=0x7f31001037a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:19.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.536+0000 7f31080b8700 1 --2- 192.168.123.103:0/1579980059 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f31001033c0 0x7f31001037a0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f30fc009b00 tx=0x7f30fc009e10 comp rx=0 tx=0).stop 2026-03-09T00:10:19.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.537+0000 7f31080b8700 1 -- 192.168.123.103:0/1579980059 shutdown_connections 2026-03-09T00:10:19.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.537+0000 7f31080b8700 1 --2- 192.168.123.103:0/1579980059 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3100103d70 0x7f3100107dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:19.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.537+0000 7f31080b8700 1 --2- 192.168.123.103:0/1579980059 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f31001033c0 0x7f31001037a0 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:19.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.537+0000 7f31080b8700 1 -- 192.168.123.103:0/1579980059 >> 192.168.123.103:0/1579980059 conn(0x7f31000fec30 msgr2=0x7f3100101050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:19.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.537+0000 7f31080b8700 1 -- 192.168.123.103:0/1579980059 shutdown_connections 2026-03-09T00:10:19.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.537+0000 7f31080b8700 1 -- 192.168.123.103:0/1579980059 wait complete. 
2026-03-09T00:10:19.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.538+0000 7f31080b8700 1 Processor -- start 2026-03-09T00:10:19.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.538+0000 7f31080b8700 1 -- start start 2026-03-09T00:10:19.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.538+0000 7f31080b8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f31001033c0 0x7f3100198f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:19.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.538+0000 7f31080b8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3100103d70 0x7f3100199480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:19.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.539+0000 7f3105653700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3100103d70 0x7f3100199480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:19.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.539+0000 7f31080b8700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3100199b60 con 0x7f3100103d70 2026-03-09T00:10:19.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.539+0000 7f3105653700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3100103d70 0x7f3100199480 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:47622/0 (socket says 192.168.123.103:47622) 2026-03-09T00:10:19.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.539+0000 7f3105653700 1 -- 192.168.123.103:0/18877835 learned_addr learned my addr 192.168.123.103:0/18877835 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:10:19.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.539+0000 7f31080b8700 1 -- 192.168.123.103:0/18877835 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f310019d8f0 con 0x7f31001033c0 2026-03-09T00:10:19.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.539+0000 7f3105653700 1 -- 192.168.123.103:0/18877835 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f31001033c0 msgr2=0x7f3100198f40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:19.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.539+0000 7f3105653700 1 --2- 192.168.123.103:0/18877835 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f31001033c0 0x7f3100198f40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:19.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.539+0000 7f3105653700 1 -- 192.168.123.103:0/18877835 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f30fc0097e0 con 0x7f3100103d70 2026-03-09T00:10:19.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.540+0000 7f3105653700 1 --2- 192.168.123.103:0/18877835 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3100103d70 0x7f3100199480 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f30f400ba70 tx=0x7f30f400be30 comp rx=0 
tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:10:19.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.540+0000 7f30f2ffd700 1 -- 192.168.123.103:0/18877835 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f30f400c760 con 0x7f3100103d70 2026-03-09T00:10:19.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.540+0000 7f30f2ffd700 1 -- 192.168.123.103:0/18877835 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f30f400cda0 con 0x7f3100103d70 2026-03-09T00:10:19.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.540+0000 7f30f2ffd700 1 -- 192.168.123.103:0/18877835 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f30f4012550 con 0x7f3100103d70 2026-03-09T00:10:19.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.540+0000 7f31080b8700 1 -- 192.168.123.103:0/18877835 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f310019dbd0 con 0x7f3100103d70 2026-03-09T00:10:19.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.540+0000 7f31080b8700 1 -- 192.168.123.103:0/18877835 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f310019e0a0 con 0x7f3100103d70 2026-03-09T00:10:19.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.541+0000 7f31080b8700 1 -- 192.168.123.103:0/18877835 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f310010b6f0 con 0x7f3100103d70 2026-03-09T00:10:19.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.543+0000 7f30f2ffd700 1 -- 192.168.123.103:0/18877835 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f30f400c8c0 con 0x7f3100103d70 2026-03-09T00:10:19.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.543+0000 7f30f2ffd700 1 --2- 192.168.123.103:0/18877835 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f30ec077750 0x7f30ec079c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:19.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.543+0000 7f3105e54700 1 --2- 192.168.123.103:0/18877835 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f30ec077750 0x7f30ec079c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:19.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.544+0000 7f30f2ffd700 1 -- 192.168.123.103:0/18877835 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(96..96 src has 1..96) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f30f4098ac0 con 0x7f3100103d70 2026-03-09T00:10:19.545 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.544+0000 7f3105e54700 1 --2- 192.168.123.103:0/18877835 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f30ec077750 0x7f30ec079c10 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f30fc009ad0 tx=0x7f30fc009f90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:10:19.546 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.546+0000 7f30f2ffd700 1 -- 192.168.123.103:0/18877835 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f30f409ae60 con 0x7f3100103d70 2026-03-09T00:10:19.681 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.681+0000 7f31080b8700 1 -- 192.168.123.103:0/18877835 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3100066f20 con 0x7f30ec077750 2026-03-09T00:10:19.683 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.682+0000 7f30f2ffd700 1 -- 192.168.123.103:0/18877835 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f3100066f20 con 0x7f30ec077750 2026-03-09T00:10:19.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.685+0000 7f31080b8700 1 -- 192.168.123.103:0/18877835 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f30ec077750 msgr2=0x7f30ec079c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:19.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.685+0000 7f31080b8700 1 --2- 192.168.123.103:0/18877835 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f30ec077750 0x7f30ec079c10 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f30fc009ad0 tx=0x7f30fc009f90 comp rx=0 tx=0).stop 2026-03-09T00:10:19.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.685+0000 7f31080b8700 1 -- 192.168.123.103:0/18877835 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3100103d70 msgr2=0x7f3100199480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:19.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.685+0000 7f31080b8700 1 --2- 192.168.123.103:0/18877835 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3100103d70 0x7f3100199480 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f30f400ba70 tx=0x7f30f400be30 comp rx=0 tx=0).stop 2026-03-09T00:10:19.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.686+0000 7f31080b8700 1 -- 192.168.123.103:0/18877835 shutdown_connections 2026-03-09T00:10:19.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.686+0000 7f31080b8700 1 --2- 192.168.123.103:0/18877835 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f30ec077750 0x7f30ec079c10 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:19.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.686+0000 7f31080b8700 1 --2- 192.168.123.103:0/18877835 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f31001033c0 0x7f3100198f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:19.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.686+0000 7f31080b8700 1 --2- 192.168.123.103:0/18877835 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3100103d70 0x7f3100199480 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:19.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.686+0000 7f31080b8700 1 -- 192.168.123.103:0/18877835 >> 192.168.123.103:0/18877835 conn(0x7f31000fec30 msgr2=0x7f3100101030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:19.686 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.686+0000 7f31080b8700 1 -- 192.168.123.103:0/18877835 shutdown_connections 2026-03-09T00:10:19.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.686+0000 7f31080b8700 1 -- 192.168.123.103:0/18877835 wait complete. 2026-03-09T00:10:19.697 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T00:10:19.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.759+0000 7facc867d700 1 -- 192.168.123.103:0/3171383646 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7facc0101a80 msgr2=0x7facc0105ad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:19.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.759+0000 7facc867d700 1 --2- 192.168.123.103:0/3171383646 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7facc0101a80 0x7facc0105ad0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7facbc009b00 tx=0x7facbc009e10 comp rx=0 tx=0).stop 2026-03-09T00:10:19.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.759+0000 7facc867d700 1 -- 192.168.123.103:0/3171383646 shutdown_connections 2026-03-09T00:10:19.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.759+0000 7facc867d700 1 --2- 192.168.123.103:0/3171383646 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7facc0101a80 0x7facc0105ad0 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:19.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.759+0000 7facc867d700 1 --2- 192.168.123.103:0/3171383646 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7facc01010d0 0x7facc01014b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:19.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.759+0000 7facc867d700 1 -- 192.168.123.103:0/3171383646 >> 192.168.123.103:0/3171383646 conn(0x7facc00fc920 msgr2=0x7facc00fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:19.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.760+0000 7facc867d700 1 -- 192.168.123.103:0/3171383646 shutdown_connections 2026-03-09T00:10:19.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.760+0000 7facc867d700 1 -- 192.168.123.103:0/3171383646 wait complete. 
2026-03-09T00:10:19.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.760+0000 7facc867d700 1 Processor -- start 2026-03-09T00:10:19.760 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.760+0000 7facc867d700 1 -- start start 2026-03-09T00:10:19.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.760+0000 7facc867d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7facc01010d0 0x7facc0198dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:19.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.760+0000 7facc867d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7facc0101a80 0x7facc0199300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:19.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.760+0000 7facc867d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7facc01999e0 con 0x7facc01010d0 2026-03-09T00:10:19.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.760+0000 7facc867d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7facc019d770 con 0x7facc0101a80 2026-03-09T00:10:19.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.761+0000 7facc5c18700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7facc0101a80 0x7facc0199300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:19.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.761+0000 7facc5c18700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7facc0101a80 0x7facc0199300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:55610/0 (socket says 192.168.123.103:55610) 2026-03-09T00:10:19.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.761+0000 7facc5c18700 1 -- 192.168.123.103:0/180036395 learned_addr learned my addr 192.168.123.103:0/180036395 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:10:19.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.761+0000 7facc6419700 1 --2- 192.168.123.103:0/180036395 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7facc01010d0 0x7facc0198dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:19.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.761+0000 7facc6419700 1 -- 192.168.123.103:0/180036395 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7facc0101a80 msgr2=0x7facc0199300 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:19.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.761+0000 7facc6419700 1 --2- 192.168.123.103:0/180036395 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7facc0101a80 0x7facc0199300 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:19.761 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.761+0000 7facc6419700 1 -- 192.168.123.103:0/180036395 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7facbc0097e0 con 
0x7facc01010d0 2026-03-09T00:10:19.762 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.761+0000 7facc6419700 1 --2- 192.168.123.103:0/180036395 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7facc01010d0 0x7facc0198dc0 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7facb000b840 tx=0x7facb000bb50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:10:19.762 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.762+0000 7facb77fe700 1 -- 192.168.123.103:0/180036395 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7facb000d610 con 0x7facc01010d0 2026-03-09T00:10:19.762 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.762+0000 7facc867d700 1 -- 192.168.123.103:0/180036395 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7facc019da50 con 0x7facc01010d0 2026-03-09T00:10:19.762 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.762+0000 7facc867d700 1 -- 192.168.123.103:0/180036395 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7facc019dfa0 con 0x7facc01010d0 2026-03-09T00:10:19.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.763+0000 7facb77fe700 1 -- 192.168.123.103:0/180036395 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7facb000dc50 con 0x7facc01010d0 2026-03-09T00:10:19.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.763+0000 7facb77fe700 1 -- 192.168.123.103:0/180036395 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7facb0017400 con 0x7facc01010d0 2026-03-09T00:10:19.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.763+0000 7facb77fe700 1 -- 192.168.123.103:0/180036395 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7facb0017620 con 0x7facc01010d0 2026-03-09T00:10:19.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.764+0000 7facb77fe700 1 --2- 192.168.123.103:0/180036395 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7facac077a60 0x7facac079f20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:19.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.764+0000 7facb77fe700 1 -- 192.168.123.103:0/180036395 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(96..96 src has 1..96) v4 ==== 6136+0+0 (secure 0 0 0) 0x7facb009a400 con 0x7facc01010d0 2026-03-09T00:10:19.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.764+0000 7facc5c18700 1 --2- 192.168.123.103:0/180036395 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7facac077a60 0x7facac079f20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:19.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.764+0000 7facc5c18700 1 --2- 192.168.123.103:0/180036395 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7facac077a60 0x7facac079f20 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7facc019a3e0 tx=0x7facbc00b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:10:19.764 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.764+0000 
7facc867d700 1 -- 192.168.123.103:0/180036395 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faca4005320 con 0x7facc01010d0 2026-03-09T00:10:19.767 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.767+0000 7facb77fe700 1 -- 192.168.123.103:0/180036395 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7facb0062cb0 con 0x7facc01010d0 2026-03-09T00:10:19.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.908+0000 7facc867d700 1 -- 192.168.123.103:0/180036395 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7faca4000bf0 con 0x7facac077a60 2026-03-09T00:10:19.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.909+0000 7facb77fe700 1 -- 192.168.123.103:0/180036395 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7faca4000bf0 con 0x7facac077a60 2026-03-09T00:10:19.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.911+0000 7facc867d700 1 -- 192.168.123.103:0/180036395 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7facac077a60 msgr2=0x7facac079f20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:19.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.911+0000 7facc867d700 1 --2- 192.168.123.103:0/180036395 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7facac077a60 0x7facac079f20 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7facc019a3e0 tx=0x7facbc00b540 comp rx=0 tx=0).stop 2026-03-09T00:10:19.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.911+0000 7facc867d700 1 -- 192.168.123.103:0/180036395 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7facc01010d0 msgr2=0x7facc0198dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:19.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.911+0000 7facc867d700 1 --2- 192.168.123.103:0/180036395 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7facc01010d0 0x7facc0198dc0 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7facb000b840 tx=0x7facb000bb50 comp rx=0 tx=0).stop 2026-03-09T00:10:19.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.912+0000 7facc867d700 1 -- 192.168.123.103:0/180036395 shutdown_connections 2026-03-09T00:10:19.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.912+0000 7facc867d700 1 --2- 192.168.123.103:0/180036395 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7facac077a60 0x7facac079f20 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:19.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.912+0000 7facc867d700 1 --2- 192.168.123.103:0/180036395 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7facc01010d0 0x7facc0198dc0 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:19.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.912+0000 7facc867d700 1 --2- 192.168.123.103:0/180036395 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7facc0101a80 0x7facc0199300 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:19.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.912+0000 7facc867d700 1 -- 192.168.123.103:0/180036395 >> 192.168.123.103:0/180036395 conn(0x7facc00fc920 msgr2=0x7facc00fdfb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:19.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.912+0000 7facc867d700 1 -- 192.168.123.103:0/180036395 shutdown_connections 2026-03-09T00:10:19.912 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.912+0000 7facc867d700 1 -- 192.168.123.103:0/180036395 wait complete. 2026-03-09T00:10:19.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.989+0000 7f98a4945700 1 -- 192.168.123.103:0/1727436214 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f989c073130 msgr2=0x7f989c073510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:19.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.989+0000 7f98a4945700 1 --2- 192.168.123.103:0/1727436214 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f989c073130 0x7f989c073510 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f9894009b00 tx=0x7f9894009e10 comp rx=0 tx=0).stop 2026-03-09T00:10:19.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.989+0000 7f98a4945700 1 -- 192.168.123.103:0/1727436214 shutdown_connections 2026-03-09T00:10:19.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.989+0000 7f98a4945700 1 --2- 192.168.123.103:0/1727436214 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f989c073a50 0x7f989c111940 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:19.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.989+0000 7f98a4945700 1 --2- 192.168.123.103:0/1727436214 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f989c073130 0x7f989c073510 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:19.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.989+0000 7f98a4945700 1 -- 192.168.123.103:0/1727436214 >> 192.168.123.103:0/1727436214 conn(0x7f989c0fc920 msgr2=0x7f989c0fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:19.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.989+0000 7f98a4945700 1 -- 192.168.123.103:0/1727436214 shutdown_connections 2026-03-09T00:10:19.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.989+0000 7f98a4945700 1 -- 192.168.123.103:0/1727436214 wait complete. 
2026-03-09T00:10:19.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.990+0000 7f98a4945700 1 Processor -- start 2026-03-09T00:10:19.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.990+0000 7f98a4945700 1 -- start start 2026-03-09T00:10:19.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.990+0000 7f98a4945700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f989c073130 0x7f989c19d0e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:19.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.990+0000 7f98a4945700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f989c073a50 0x7f989c19d620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:19.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.990+0000 7f98a4945700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f989c19dd00 con 0x7f989c073a50 2026-03-09T00:10:19.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.991+0000 7f98a1ee0700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f989c073a50 0x7f989c19d620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:19.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.990+0000 7f98a26e1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f989c073130 0x7f989c19d0e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:19.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.991+0000 7f98a26e1700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f989c073130 0x7f989c19d0e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:55636/0 (socket says 192.168.123.103:55636) 2026-03-09T00:10:19.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.991+0000 7f98a26e1700 1 -- 192.168.123.103:0/1539227245 learned_addr learned my addr 192.168.123.103:0/1539227245 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:10:19.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.990+0000 7f98a4945700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f989c1a1a90 con 0x7f989c073130 2026-03-09T00:10:19.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.991+0000 7f98a26e1700 1 -- 192.168.123.103:0/1539227245 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f989c073a50 msgr2=0x7f989c19d620 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:19.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.991+0000 7f98a26e1700 1 --2- 192.168.123.103:0/1539227245 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f989c073a50 0x7f989c19d620 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:19.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.991+0000 7f98a26e1700 1 -- 192.168.123.103:0/1539227245 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f98940097e0 con 0x7f989c073130 
2026-03-09T00:10:19.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.991+0000 7f98a26e1700 1 --2- 192.168.123.103:0/1539227245 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f989c073130 0x7f989c19d0e0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f9894005b40 tx=0x7f989400bfd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:10:19.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.991+0000 7f988f7fe700 1 -- 192.168.123.103:0/1539227245 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f989401d070 con 0x7f989c073130 2026-03-09T00:10:19.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.992+0000 7f98a4945700 1 -- 192.168.123.103:0/1539227245 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f989c1a1d70 con 0x7f989c073130 2026-03-09T00:10:19.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.992+0000 7f98a4945700 1 -- 192.168.123.103:0/1539227245 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f989c1a22c0 con 0x7f989c073130 2026-03-09T00:10:19.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.992+0000 7f988f7fe700 1 -- 192.168.123.103:0/1539227245 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f989400f460 con 0x7f989c073130 2026-03-09T00:10:19.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.992+0000 7f988f7fe700 1 -- 192.168.123.103:0/1539227245 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9894021620 con 0x7f989c073130 2026-03-09T00:10:19.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.994+0000 7f988f7fe700 1 -- 192.168.123.103:0/1539227245 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9894003a40 con 0x7f989c073130 2026-03-09T00:10:19.994 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.994+0000 7f988f7fe700 1 --2- 192.168.123.103:0/1539227245 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f9888077870 0x7f9888079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:19.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.994+0000 7f98a1ee0700 1 --2- 192.168.123.103:0/1539227245 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f9888077870 0x7f9888079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:19.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.995+0000 7f988f7fe700 1 -- 192.168.123.103:0/1539227245 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(96..96 src has 1..96) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f989409ac00 con 0x7f989c073130 2026-03-09T00:10:19.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.995+0000 7f98a1ee0700 1 --2- 192.168.123.103:0/1539227245 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f9888077870 0x7f9888079d30 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f989c19e700 tx=0x7f9890009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:10:19.995 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.995+0000 
7f98a4945700 1 -- 192.168.123.103:0/1539227245 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9880005320 con 0x7f989c073130
2026-03-09T00:10:19.999 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:19.999+0000 7f988f7fe700 1 -- 192.168.123.103:0/1539227245 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f98940634f0 con 0x7f989c073130
2026-03-09T00:10:20.131 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.131+0000 7f98a4945700 1 -- 192.168.123.103:0/1539227245 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f9880000bf0 con 0x7f9888077870
2026-03-09T00:10:20.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.137+0000 7f988f7fe700 1 -- 192.168.123.103:0/1539227245 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f9880000bf0 con 0x7f9888077870
2026-03-09T00:10:20.137 INFO:teuthology.orchestra.run.vm03.stdout:NAME                    HOST  PORTS             STATUS         REFRESHED  AGE  MEM USE  MEM LIM  VERSION               IMAGE ID      CONTAINER ID
2026-03-09T00:10:20.137 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03       vm03  *:9093,9094       running (5m)   44s ago    10m  24.4M    -        0.25.0                c8568f914cd2  6bc39b415ac6
2026-03-09T00:10:20.137 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03      vm03                    running (10m)  44s ago    10m  9499k    -        18.2.1                5be31c24972a  b93f8a220f71
2026-03-09T00:10:20.137 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06      vm06                    running (10m)  0s ago     10m  9151k    -        18.2.1                5be31c24972a  d06aea65065e
2026-03-09T00:10:20.137 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03              vm03                    running (4m)   44s ago    10m  7838k    -        19.2.3-678-ge911bdeb  654f31e6858e  a76700a1f6bf
2026-03-09T00:10:20.137 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06              vm06                    running (4m)   0s ago     10m  7860k    -        19.2.3-678-ge911bdeb  654f31e6858e  ab35f6a843c1
2026-03-09T00:10:20.137 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03            vm03  *:3000            running (5m)   44s ago    10m  85.2M    -        10.4.0                c8b91775d855  00a3394cdec9
2026-03-09T00:10:20.137 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade  vm03                    running (8m)   44s ago    8m   19.5M    -        18.2.1                5be31c24972a  404501ca3f76
2026-03-09T00:10:20.137 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk  vm03                    running (8m)   44s ago    8m   188M     -        18.2.1                5be31c24972a  b71cb8823eff
2026-03-09T00:10:20.138 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim  vm06                    running (8m)   0s ago     8m   21.6M    -        18.2.1                5be31c24972a  868f24dd3b07
2026-03-09T00:10:20.138 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl  vm06                    running (8m)   0s ago     8m   18.0M    -        18.2.1                5be31c24972a  84dbd6c37a69
2026-03-09T00:10:20.138 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons         vm03  *:8443,9283,8765  running (6m)   44s ago    11m  625M     -        19.2.3-678-ge911bdeb  654f31e6858e  5c5f89207f88
2026-03-09T00:10:20.138 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn         vm06  *:8443,9283,8765  running (6m)   0s ago     10m  491M     -        19.2.3-678-ge911bdeb  654f31e6858e  e3d70135f3ac
2026-03-09T00:10:20.138 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03                vm03                    running (4m)   44s ago    11m  61.6M    2048M    19.2.3-678-ge911bdeb  654f31e6858e  cafe87ec117d
2026-03-09T00:10:20.138 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06                vm06                    running (4m)   0s ago     10m  55.6M    2048M    19.2.3-678-ge911bdeb  654f31e6858e  33df752aa193
2026-03-09T00:10:20.138 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03      vm03  *:9100            running (5m)   44s ago    10m  9643k    -        1.7.0                 72c9c2088986  0cdd6e671b4f
2026-03-09T00:10:20.138 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06      vm06  *:9100            running (5m)   0s ago     10m  9655k    -        1.7.0                 72c9c2088986  848c5c72973d
2026-03-09T00:10:20.138 INFO:teuthology.orchestra.run.vm03.stdout:osd.0                   vm03                    running (4m)   44s ago    9m   178M     4096M    19.2.3-678-ge911bdeb  654f31e6858e  7112eceae9ce
2026-03-09T00:10:20.138 INFO:teuthology.orchestra.run.vm03.stdout:osd.1                   vm03                    running (69s)  44s ago    9m   142M     4096M    19.2.3-678-ge911bdeb  654f31e6858e  e70d2f37c6d1
2026-03-09T00:10:20.138 INFO:teuthology.orchestra.run.vm03.stdout:osd.2                   vm03                    running (46s)  44s ago    9m   12.3M    4096M    19.2.3-678-ge911bdeb  654f31e6858e  e7841e7307ae
2026-03-09T00:10:20.138 INFO:teuthology.orchestra.run.vm03.stdout:osd.3                   vm06                    running (23s)  0s ago     9m   174M     4096M    19.2.3-678-ge911bdeb  654f31e6858e  8e61be617139
2026-03-09T00:10:20.138 INFO:teuthology.orchestra.run.vm03.stdout:osd.4                   vm06                    running (2s)   0s ago     9m   13.2M    4096M    19.2.3-678-ge911bdeb  654f31e6858e  21cf4dc58899
2026-03-09T00:10:20.138 INFO:teuthology.orchestra.run.vm03.stdout:osd.5                   vm06                    running (8m)   0s ago     8m   418M     4096M    18.2.1                5be31c24972a  f51e8cd94301
2026-03-09T00:10:20.138 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03         vm03  *:9095            running (5m)   44s ago    10m  58.0M    -        2.51.0                1d3b7f56885b  16d6071e49fb
2026-03-09T00:10:20.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.140+0000 7f98a4945700 1 -- 192.168.123.103:0/1539227245 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f9888077870 msgr2=0x7f9888079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:20.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.140+0000 7f98a4945700 1 --2- 192.168.123.103:0/1539227245 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f9888077870 0x7f9888079d30 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f989c19e700 tx=0x7f9890009450 comp rx=0 tx=0).stop
2026-03-09T00:10:20.141 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.140+0000 7f98a4945700 1 -- 192.168.123.103:0/1539227245 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f989c073130 msgr2=0x7f989c19d0e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:20.141 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.140+0000 7f98a4945700 1 --2- 192.168.123.103:0/1539227245 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f989c073130 0x7f989c19d0e0 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f9894005b40 tx=0x7f989400bfd0 comp rx=0 tx=0).stop
2026-03-09T00:10:20.141 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.140+0000 7f98a4945700 1 -- 192.168.123.103:0/1539227245 shutdown_connections
2026-03-09T00:10:20.141 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.140+0000 7f98a4945700 1 --2- 192.168.123.103:0/1539227245 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f9888077870 0x7f9888079d30 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:20.141 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.140+0000 7f98a4945700 1 --2- 192.168.123.103:0/1539227245 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f989c073130 0x7f989c19d0e0 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.141 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.140+0000 7f98a4945700 1 --2- 192.168.123.103:0/1539227245 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f989c073a50 0x7f989c19d620 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.141 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.140+0000 7f98a4945700 1 -- 192.168.123.103:0/1539227245 >> 192.168.123.103:0/1539227245 conn(0x7f989c0fc920 msgr2=0x7f989c103450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:20.141 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.141+0000 7f98a4945700 1 -- 192.168.123.103:0/1539227245 shutdown_connections 2026-03-09T00:10:20.141 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.141+0000 7f98a4945700 1 -- 192.168.123.103:0/1539227245 wait complete. 2026-03-09T00:10:20.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.216+0000 7fa78f88c700 1 -- 192.168.123.103:0/1069755445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa788103d70 msgr2=0x7fa788107dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:20.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.216+0000 7fa78f88c700 1 --2- 192.168.123.103:0/1069755445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa788103d70 0x7fa788107dc0 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7fa78000b3a0 tx=0x7fa78000b6b0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.216+0000 7fa78f88c700 1 -- 192.168.123.103:0/1069755445 shutdown_connections 2026-03-09T00:10:20.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.216+0000 7fa78f88c700 1 --2- 192.168.123.103:0/1069755445 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa788103d70 0x7fa788107dc0 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.216+0000 7fa78f88c700 1 --2- 192.168.123.103:0/1069755445 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa7881033c0 0x7fa7881037a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.216+0000 7fa78f88c700 1 -- 192.168.123.103:0/1069755445 >> 192.168.123.103:0/1069755445 conn(0x7fa7880fec30 msgr2=0x7fa788101050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:20.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.220+0000 7fa78f88c700 1 -- 192.168.123.103:0/1069755445 shutdown_connections 2026-03-09T00:10:20.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.220+0000 7fa78f88c700 1 -- 192.168.123.103:0/1069755445 wait complete. 
2026-03-09T00:10:20.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.220+0000 7fa78f88c700 1 Processor -- start 2026-03-09T00:10:20.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.220+0000 7fa78f88c700 1 -- start start 2026-03-09T00:10:20.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.220+0000 7fa78f88c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7881033c0 0x7fa788198f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:20.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.221+0000 7fa78d628700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7881033c0 0x7fa788198f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:20.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.221+0000 7fa78d628700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7881033c0 0x7fa788198f00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:47678/0 (socket says 192.168.123.103:47678) 2026-03-09T00:10:20.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.221+0000 7fa78f88c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa788103d70 0x7fa788199440 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:20.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.221+0000 7fa78f88c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa788199b20 con 0x7fa7881033c0 2026-03-09T00:10:20.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.221+0000 7fa78f88c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa78819d8b0 con 0x7fa788103d70 2026-03-09T00:10:20.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.221+0000 7fa78d628700 1 -- 192.168.123.103:0/2909918332 learned_addr learned my addr 192.168.123.103:0/2909918332 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:10:20.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.221+0000 7fa78ce27700 1 --2- 192.168.123.103:0/2909918332 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa788103d70 0x7fa788199440 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:20.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.221+0000 7fa78d628700 1 -- 192.168.123.103:0/2909918332 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa788103d70 msgr2=0x7fa788199440 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:20.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.221+0000 7fa78d628700 1 --2- 192.168.123.103:0/2909918332 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa788103d70 0x7fa788199440 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.221+0000 7fa78d628700 1 -- 192.168.123.103:0/2909918332 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fa78000b050 con 0x7fa7881033c0 2026-03-09T00:10:20.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.222+0000 7fa78d628700 1 --2- 192.168.123.103:0/2909918332 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7881033c0 0x7fa788198f00 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7fa78400ba70 tx=0x7fa78400be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:10:20.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.222+0000 7fa77e7fc700 1 -- 192.168.123.103:0/2909918332 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa78400c780 con 0x7fa7881033c0 2026-03-09T00:10:20.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.222+0000 7fa77e7fc700 1 -- 192.168.123.103:0/2909918332 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa78400cdc0 con 0x7fa7881033c0 2026-03-09T00:10:20.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.222+0000 7fa78f88c700 1 -- 192.168.123.103:0/2909918332 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa78819db90 con 0x7fa7881033c0 2026-03-09T00:10:20.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.223+0000 7fa77e7fc700 1 -- 192.168.123.103:0/2909918332 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa784012550 con 0x7fa7881033c0 2026-03-09T00:10:20.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.223+0000 7fa78f88c700 1 -- 192.168.123.103:0/2909918332 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa78819e0e0 con 0x7fa7881033c0 2026-03-09T00:10:20.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.224+0000 7fa77e7fc700 1 -- 192.168.123.103:0/2909918332 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa78400c8e0 con 0x7fa7881033c0 2026-03-09T00:10:20.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.224+0000 7fa78f88c700 1 -- 192.168.123.103:0/2909918332 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa78810b710 con 0x7fa7881033c0 2026-03-09T00:10:20.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.227+0000 7fa77e7fc700 1 --2- 192.168.123.103:0/2909918332 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa7740776c0 0x7fa774079b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:20.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.227+0000 7fa78ce27700 1 --2- 192.168.123.103:0/2909918332 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa7740776c0 0x7fa774079b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:20.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.227+0000 7fa78ce27700 1 --2- 192.168.123.103:0/2909918332 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa7740776c0 0x7fa774079b80 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7fa780009250 tx=0x7fa78000bf90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:10:20.228 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.227+0000 7fa77e7fc700 1 -- 192.168.123.103:0/2909918332 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(96..96 src has 1..96) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fa784098810 con 0x7fa7881033c0
2026-03-09T00:10:20.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.228+0000 7fa77e7fc700 1 -- 192.168.123.103:0/2909918332 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa784061160 con 0x7fa7881033c0
2026-03-09T00:10:20.364 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:20.364 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:20.365 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:20 vm06.local ceph-mon[106218]: pgmap v193: 65 pgs: 12 active+undersized, 8 peering, 1 stale+active+clean, 9 active+undersized+degraded, 35 active+clean; 209 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 25/231 objects degraded (10.823%)
2026-03-09T00:10:20.365 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:20 vm06.local ceph-mon[106218]: from='client.34348 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:10:20.365 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:20.365 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:20.365 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:20 vm06.local ceph-mon[106218]: from='client.34352 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:10:20.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.404+0000 7fa78f88c700 1 -- 192.168.123.103:0/2909918332 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fa78804ea90 con 0x7fa7881033c0
2026-03-09T00:10:20.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.405+0000 7fa77e7fc700 1 -- 192.168.123.103:0/2909918332 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7fa7840608b0 con 0x7fa7881033c0
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 1,
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 5,
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 8
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-09T00:10:20.406 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:10:20.408 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:20.408 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:20.408 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:20 vm03.local ceph-mon[129670]: pgmap v193: 65 pgs: 12 active+undersized, 8 peering, 1 stale+active+clean, 9 active+undersized+degraded, 35 active+clean; 209 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 25/231 objects degraded (10.823%)
2026-03-09T00:10:20.408 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:20 vm03.local ceph-mon[129670]: from='client.34348 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:10:20.408 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:20.408 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:20.408 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:20 vm03.local ceph-mon[129670]: from='client.34352 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:10:20.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.411+0000 7fa78f88c700 1 -- 192.168.123.103:0/2909918332 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa7740776c0 msgr2=0x7fa774079b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:20.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.411+0000 7fa78f88c700 1 --2- 192.168.123.103:0/2909918332 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa7740776c0 0x7fa774079b80 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7fa780009250 tx=0x7fa78000bf90 comp rx=0 tx=0).stop
2026-03-09T00:10:20.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.411+0000 7fa78f88c700 1 -- 192.168.123.103:0/2909918332 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7881033c0 msgr2=0x7fa788198f00
secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:20.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.411+0000 7fa78f88c700 1 --2- 192.168.123.103:0/2909918332 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7881033c0 0x7fa788198f00 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7fa78400ba70 tx=0x7fa78400be30 comp rx=0 tx=0).stop 2026-03-09T00:10:20.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.412+0000 7fa78f88c700 1 -- 192.168.123.103:0/2909918332 shutdown_connections 2026-03-09T00:10:20.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.412+0000 7fa78f88c700 1 --2- 192.168.123.103:0/2909918332 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa7740776c0 0x7fa774079b80 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.412+0000 7fa78f88c700 1 --2- 192.168.123.103:0/2909918332 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa7881033c0 0x7fa788198f00 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.412+0000 7fa78f88c700 1 --2- 192.168.123.103:0/2909918332 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa788103d70 0x7fa788199440 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.412+0000 7fa78f88c700 1 -- 192.168.123.103:0/2909918332 >> 192.168.123.103:0/2909918332 conn(0x7fa7880fec30 msgr2=0x7fa7881002d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:20.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.413+0000 7fa78f88c700 1 -- 192.168.123.103:0/2909918332 shutdown_connections 2026-03-09T00:10:20.413 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.413+0000 7fa78f88c700 1 -- 192.168.123.103:0/2909918332 wait complete. 
2026-03-09T00:10:20.486 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.486+0000 7f5191aec700 1 -- 192.168.123.103:0/1689847398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f518c1033c0 msgr2=0x7f518c1037a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:20.486 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.486+0000 7f5191aec700 1 --2- 192.168.123.103:0/1689847398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f518c1033c0 0x7f518c1037a0 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f517c009b50 tx=0x7f517c009e60 comp rx=0 tx=0).stop 2026-03-09T00:10:20.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.486+0000 7f5191aec700 1 -- 192.168.123.103:0/1689847398 shutdown_connections 2026-03-09T00:10:20.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.486+0000 7f5191aec700 1 --2- 192.168.123.103:0/1689847398 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f518c103d70 0x7f518c107dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.486+0000 7f5191aec700 1 --2- 192.168.123.103:0/1689847398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f518c1033c0 0x7f518c1037a0 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.486+0000 7f5191aec700 1 -- 192.168.123.103:0/1689847398 >> 192.168.123.103:0/1689847398 conn(0x7f518c0fec30 msgr2=0x7f518c101050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:20.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.487+0000 7f5191aec700 1 -- 192.168.123.103:0/1689847398 shutdown_connections 2026-03-09T00:10:20.487 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.487+0000 7f5191aec700 1 -- 192.168.123.103:0/1689847398 wait complete. 
2026-03-09T00:10:20.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.488+0000 7f5191aec700 1 Processor -- start 2026-03-09T00:10:20.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.488+0000 7f5191aec700 1 -- start start 2026-03-09T00:10:20.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.488+0000 7f5191aec700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f518c1033c0 0x7f518c078560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:20.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.488+0000 7f5191aec700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f518c103d70 0x7f518c078ac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:20.488 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.488+0000 7f5191aec700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f518c0791a0 con 0x7f518c103d70 2026-03-09T00:10:20.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.488+0000 7f5191aec700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f518c1a1d60 con 0x7f518c1033c0 2026-03-09T00:10:20.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.489+0000 7f5183fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f518c103d70 0x7f518c078ac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:20.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.489+0000 7f518b7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f518c1033c0 0x7f518c078560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:20.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.489+0000 7f518b7fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f518c1033c0 0x7f518c078560 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:55664/0 (socket says 192.168.123.103:55664) 2026-03-09T00:10:20.489 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.489+0000 7f518b7fe700 1 -- 192.168.123.103:0/150700374 learned_addr learned my addr 192.168.123.103:0/150700374 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:10:20.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.489+0000 7f518b7fe700 1 -- 192.168.123.103:0/150700374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f518c103d70 msgr2=0x7f518c078ac0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:20.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.489+0000 7f518b7fe700 1 --2- 192.168.123.103:0/150700374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f518c103d70 0x7f518c078ac0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.490+0000 7f518b7fe700 1 -- 192.168.123.103:0/150700374 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f517c0097e0 con 0x7f518c1033c0 
2026-03-09T00:10:20.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.490+0000 7f5183fff700 1 --2- 192.168.123.103:0/150700374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f518c103d70 0x7f518c078ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T00:10:20.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.490+0000 7f518b7fe700 1 --2- 192.168.123.103:0/150700374 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f518c1033c0 0x7f518c078560 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f517c009b20 tx=0x7f517c0056f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:10:20.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.490+0000 7f51897fa700 1 -- 192.168.123.103:0/150700374 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f517c01d070 con 0x7f518c1033c0 2026-03-09T00:10:20.490 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.490+0000 7f5191aec700 1 -- 192.168.123.103:0/150700374 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f518c1a1f60 con 0x7f518c1033c0 2026-03-09T00:10:20.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.490+0000 7f5191aec700 1 -- 192.168.123.103:0/150700374 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f518c1a2450 con 0x7f518c1033c0 2026-03-09T00:10:20.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.491+0000 7f51897fa700 1 -- 192.168.123.103:0/150700374 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f517c00bcf0 con 0x7f518c1033c0 2026-03-09T00:10:20.491 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.491+0000 7f51897fa700 1 -- 192.168.123.103:0/150700374 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f517c021750 con 0x7f518c1033c0 2026-03-09T00:10:20.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.492+0000 7f51897fa700 1 -- 192.168.123.103:0/150700374 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f517c02b430 con 0x7f518c1033c0 2026-03-09T00:10:20.492 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.492+0000 7f5191aec700 1 -- 192.168.123.103:0/150700374 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f518c10b760 con 0x7f518c1033c0 2026-03-09T00:10:20.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.492+0000 7f51897fa700 1 --2- 192.168.123.103:0/150700374 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5178077870 0x7f5178079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:20.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.493+0000 7f5183fff700 1 --2- 192.168.123.103:0/150700374 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5178077870 0x7f5178079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:20.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.493+0000 7f51897fa700 1 -- 192.168.123.103:0/150700374 <== mon.1 v2:192.168.123.106:3300/0 5 
==== osd_map(96..96 src has 1..96) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f517c09af10 con 0x7f518c1033c0
2026-03-09T00:10:20.493 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.493+0000 7f5183fff700 1 --2- 192.168.123.103:0/150700374 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5178077870 0x7f5178079d30 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f518c19e9a0 tx=0x7f5174009380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:10:20.495 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.495+0000 7f51897fa700 1 -- 192.168.123.103:0/150700374 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f517c0647a0 con 0x7f518c1033c0
2026-03-09T00:10:20.644 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.644+0000 7f5191aec700 1 -- 192.168.123.103:0/150700374 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f518c04ea90 con 0x7f518c1033c0
2026-03-09T00:10:20.645 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.645+0000 7f51897fa700 1 -- 192.168.123.103:0/150700374 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 11 v11) v1 ==== 76+0+1919 (secure 0 0 0) 0x7f517c026070 con 0x7f518c1033c0
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:e11
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:btime 1970-01-01T00:00:00:000000+0000
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1)
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:epoch 11
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T00:01:42.952984+0000
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T00:01:51.424075+0000
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:root 0
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {}
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 39
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:in 0
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14480}
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:failed
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:damaged
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:stopped
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3]
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members:
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.sejksk{0:14480} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.103:6826/3708505754,v1:192.168.123.103:6827/3708505754] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:10:20.646 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:10:20.647 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:10:20.647 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons:
2026-03-09T00:10:20.647 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:10:20.647 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.ralade{-1:14492} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:10:20.647 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.ixduim{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:10:20.647 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.vlrwtl{-1:24297} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6824/1090058295,v1:192.168.123.106:6825/1090058295] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:10:20.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.650+0000 7f5191aec700 1 -- 192.168.123.103:0/150700374 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5178077870 msgr2=0x7f5178079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:20.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.650+0000 7f5191aec700 1 --2- 192.168.123.103:0/150700374 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5178077870 0x7f5178079d30 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f518c19e9a0 tx=0x7f5174009380 comp rx=0 tx=0).stop
2026-03-09T00:10:20.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.650+0000 7f5191aec700 1 -- 192.168.123.103:0/150700374 >>
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f518c1033c0 msgr2=0x7f518c078560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:20.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.650+0000 7f5191aec700 1 --2- 192.168.123.103:0/150700374 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f518c1033c0 0x7f518c078560 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f517c009b20 tx=0x7f517c0056f0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.650 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.650+0000 7f5191aec700 1 -- 192.168.123.103:0/150700374 shutdown_connections 2026-03-09T00:10:20.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.650+0000 7f5191aec700 1 --2- 192.168.123.103:0/150700374 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5178077870 0x7f5178079d30 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.650+0000 7f5191aec700 1 --2- 192.168.123.103:0/150700374 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f518c1033c0 0x7f518c078560 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.651+0000 7f5191aec700 1 --2- 192.168.123.103:0/150700374 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f518c103d70 0x7f518c078ac0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.651+0000 7f5191aec700 1 -- 192.168.123.103:0/150700374 >> 192.168.123.103:0/150700374 conn(0x7f518c0fec30 msgr2=0x7f518c101030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:20.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.651+0000 7f5191aec700 1 -- 192.168.123.103:0/150700374 shutdown_connections 2026-03-09T00:10:20.651 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.651+0000 7f5191aec700 1 -- 192.168.123.103:0/150700374 wait complete. 
2026-03-09T00:10:20.652 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 11 2026-03-09T00:10:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.727+0000 7f05dc521700 1 -- 192.168.123.103:0/715423494 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f05d4103cf0 msgr2=0x7f05d4107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.727+0000 7f05dc521700 1 --2- 192.168.123.103:0/715423494 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f05d4103cf0 0x7f05d4107d40 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f05d0009b50 tx=0x7f05d0009e60 comp rx=0 tx=0).stop 2026-03-09T00:10:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.727+0000 7f05dc521700 1 -- 192.168.123.103:0/715423494 shutdown_connections 2026-03-09T00:10:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.727+0000 7f05dc521700 1 --2- 192.168.123.103:0/715423494 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f05d4103cf0 0x7f05d4107d40 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.727+0000 7f05dc521700 1 --2- 192.168.123.103:0/715423494 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f05d4103340 0x7f05d4103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.727+0000 7f05dc521700 1 -- 192.168.123.103:0/715423494 >> 192.168.123.103:0/715423494 conn(0x7f05d40feb90 msgr2=0x7f05d4100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.728+0000 7f05dc521700 1 -- 192.168.123.103:0/715423494 shutdown_connections 2026-03-09T00:10:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.728+0000 7f05dc521700 1 -- 192.168.123.103:0/715423494 wait complete. 
2026-03-09T00:10:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.728+0000 7f05dc521700 1 Processor -- start 2026-03-09T00:10:20.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.728+0000 7f05dc521700 1 -- start start 2026-03-09T00:10:20.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.729+0000 7f05dc521700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f05d4103340 0x7f05d4198f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:20.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.729+0000 7f05dc521700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f05d4103cf0 0x7f05d41994a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:20.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.729+0000 7f05dc521700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f05d4199b80 con 0x7f05d4103340 2026-03-09T00:10:20.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.729+0000 7f05dc521700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f05d419d910 con 0x7f05d4103cf0 2026-03-09T00:10:20.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.729+0000 7f05d9abc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f05d4103cf0 0x7f05d41994a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:20.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.729+0000 7f05da2bd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f05d4103340 0x7f05d4198f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:20.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.729+0000 7f05d9abc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f05d4103cf0 0x7f05d41994a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:55680/0 (socket says 192.168.123.103:55680) 2026-03-09T00:10:20.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.729+0000 7f05d9abc700 1 -- 192.168.123.103:0/2434630566 learned_addr learned my addr 192.168.123.103:0/2434630566 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:10:20.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.729+0000 7f05da2bd700 1 -- 192.168.123.103:0/2434630566 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f05d4103cf0 msgr2=0x7f05d41994a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:20.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.729+0000 7f05da2bd700 1 --2- 192.168.123.103:0/2434630566 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f05d4103cf0 0x7f05d41994a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.729+0000 7f05da2bd700 1 -- 192.168.123.103:0/2434630566 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f05d00097e0 con 0x7f05d4103340 
2026-03-09T00:10:20.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.730+0000 7f05da2bd700 1 --2- 192.168.123.103:0/2434630566 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f05d4103340 0x7f05d4198f60 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f05c400b6d0 tx=0x7f05c400b9e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:10:20.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.730+0000 7f05cb7fe700 1 -- 192.168.123.103:0/2434630566 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f05c4011630 con 0x7f05d4103340 2026-03-09T00:10:20.731 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.730+0000 7f05dc521700 1 -- 192.168.123.103:0/2434630566 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f05d419dbf0 con 0x7f05d4103340 2026-03-09T00:10:20.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.731+0000 7f05dc521700 1 -- 192.168.123.103:0/2434630566 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f05d419e140 con 0x7f05d4103340 2026-03-09T00:10:20.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.732+0000 7f05cb7fe700 1 -- 192.168.123.103:0/2434630566 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f05c4011c70 con 0x7f05d4103340 2026-03-09T00:10:20.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.732+0000 7f05cb7fe700 1 -- 192.168.123.103:0/2434630566 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f05c4010380 con 0x7f05d4103340 2026-03-09T00:10:20.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.732+0000 7f05cb7fe700 1 -- 192.168.123.103:0/2434630566 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f05c40105a0 con 0x7f05d4103340 2026-03-09T00:10:20.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.732+0000 7f05cb7fe700 1 --2- 192.168.123.103:0/2434630566 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f05c00778c0 0x7f05c0079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:20.733 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.732+0000 7f05d9abc700 1 --2- 192.168.123.103:0/2434630566 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f05c00778c0 0x7f05c0079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:20.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.734+0000 7f05d9abc700 1 --2- 192.168.123.103:0/2434630566 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f05c00778c0 0x7f05c0079d80 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f05d419a580 tx=0x7f05d0004e80 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:10:20.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.734+0000 7f05cb7fe700 1 -- 192.168.123.103:0/2434630566 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(96..96 src has 1..96) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f05c401b860 con 0x7f05d4103340 2026-03-09T00:10:20.734 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.734+0000 
7f05dc521700 1 -- 192.168.123.103:0/2434630566 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f05b8005320 con 0x7f05d4103340
2026-03-09T00:10:20.737 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.737+0000 7f05cb7fe700 1 -- 192.168.123.103:0/2434630566 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f05c4062160 con 0x7f05d4103340
2026-03-09T00:10:20.875 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.874+0000 7f05dc521700 1 -- 192.168.123.103:0/2434630566 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f05b8000bf0 con 0x7f05c00778c0
2026-03-09T00:10:20.876 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.876+0000 7f05cb7fe700 1 -- 192.168.123.103:0/2434630566 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f05b8000bf0 con 0x7f05c00778c0
2026-03-09T00:10:20.876 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:10:20.876 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T00:10:20.876 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true,
2026-03-09T00:10:20.876 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-09T00:10:20.876 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [
2026-03-09T00:10:20.876 INFO:teuthology.orchestra.run.vm03.stdout: "crash",
2026-03-09T00:10:20.876 INFO:teuthology.orchestra.run.vm03.stdout: "mon",
2026-03-09T00:10:20.876 INFO:teuthology.orchestra.run.vm03.stdout: "mgr"
2026-03-09T00:10:20.876 INFO:teuthology.orchestra.run.vm03.stdout: ],
2026-03-09T00:10:20.876 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "11/23 daemons upgraded",
2026-03-09T00:10:20.876 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading osd daemons",
2026-03-09T00:10:20.876 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false
2026-03-09T00:10:20.876 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:10:20.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.879+0000 7f05dc521700 1 -- 192.168.123.103:0/2434630566 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f05c00778c0 msgr2=0x7f05c0079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:20.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.879+0000 7f05dc521700 1 --2- 192.168.123.103:0/2434630566 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f05c00778c0 0x7f05c0079d80 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f05d419a580 tx=0x7f05d0004e80 comp rx=0 tx=0).stop
2026-03-09T00:10:20.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.880+0000 7f05dc521700 1 -- 192.168.123.103:0/2434630566 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f05d4103340 msgr2=0x7f05d4198f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:20.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.880+0000 7f05dc521700 1 --2- 192.168.123.103:0/2434630566 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0]
conn(0x7f05d4103340 0x7f05d4198f60 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f05c400b6d0 tx=0x7f05c400b9e0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.880+0000 7f05dc521700 1 -- 192.168.123.103:0/2434630566 shutdown_connections 2026-03-09T00:10:20.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.880+0000 7f05dc521700 1 --2- 192.168.123.103:0/2434630566 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f05c00778c0 0x7f05c0079d80 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.880+0000 7f05dc521700 1 --2- 192.168.123.103:0/2434630566 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f05d4103340 0x7f05d4198f60 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.880+0000 7f05dc521700 1 --2- 192.168.123.103:0/2434630566 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f05d4103cf0 0x7f05d41994a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.880 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.880+0000 7f05dc521700 1 -- 192.168.123.103:0/2434630566 >> 192.168.123.103:0/2434630566 conn(0x7f05d40feb90 msgr2=0x7f05d4100f90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:20.881 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.880+0000 7f05dc521700 1 -- 192.168.123.103:0/2434630566 shutdown_connections 2026-03-09T00:10:20.881 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.880+0000 7f05dc521700 1 -- 192.168.123.103:0/2434630566 wait complete. 
2026-03-09T00:10:20.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.958+0000 7f9b568f6700 1 -- 192.168.123.103:0/3648979299 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b50101130 msgr2=0x7f9b501047b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:20.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.958+0000 7f9b568f6700 1 --2- 192.168.123.103:0/3648979299 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b50101130 0x7f9b501047b0 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f9b4c009b00 tx=0x7f9b4c009e10 comp rx=0 tx=0).stop 2026-03-09T00:10:20.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.959+0000 7f9b568f6700 1 -- 192.168.123.103:0/3648979299 shutdown_connections 2026-03-09T00:10:20.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.959+0000 7f9b568f6700 1 --2- 192.168.123.103:0/3648979299 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b50101130 0x7f9b501047b0 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.959+0000 7f9b568f6700 1 --2- 192.168.123.103:0/3648979299 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9b50100780 0x7f9b50100b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.959+0000 7f9b568f6700 1 -- 192.168.123.103:0/3648979299 >> 192.168.123.103:0/3648979299 conn(0x7f9b50074df0 msgr2=0x7f9b50075200 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:20.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.959+0000 7f9b568f6700 1 -- 192.168.123.103:0/3648979299 shutdown_connections 2026-03-09T00:10:20.959 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.959+0000 7f9b568f6700 1 -- 192.168.123.103:0/3648979299 wait complete. 
2026-03-09T00:10:20.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.960+0000 7f9b568f6700 1 Processor -- start 2026-03-09T00:10:20.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.960+0000 7f9b568f6700 1 -- start start 2026-03-09T00:10:20.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.960+0000 7f9b568f6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b50100780 0x7f9b5019a590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:20.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.960+0000 7f9b568f6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9b50101130 0x7f9b5019aad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:20.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.960+0000 7f9b568f6700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9b5019b160 con 0x7f9b50100780 2026-03-09T00:10:20.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.960+0000 7f9b568f6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9b50194610 con 0x7f9b50101130 2026-03-09T00:10:20.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.960+0000 7f9b558f4700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b50100780 0x7f9b5019a590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:20.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.960+0000 7f9b550f3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9b50101130 0x7f9b5019aad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:20.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.961+0000 7f9b550f3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9b50101130 0x7f9b5019aad0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:55698/0 (socket says 192.168.123.103:55698) 2026-03-09T00:10:20.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.961+0000 7f9b550f3700 1 -- 192.168.123.103:0/3467465042 learned_addr learned my addr 192.168.123.103:0/3467465042 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:10:20.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.961+0000 7f9b558f4700 1 -- 192.168.123.103:0/3467465042 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9b50101130 msgr2=0x7f9b5019aad0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:20.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.961+0000 7f9b558f4700 1 --2- 192.168.123.103:0/3467465042 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9b50101130 0x7f9b5019aad0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:20.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.961+0000 7f9b558f4700 1 -- 192.168.123.103:0/3467465042 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9b4c0097e0 con 0x7f9b50100780 
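The reconnect above is the standard monclient bootstrap: the client dials every mon in the monmap (here vm03 and vm06), learns its own address from the hello exchange, keeps a single session, and subscribes to the monmap and config before sending any command. The python-rados binding performs the same handshake inside connect(); a sketch, assuming a readable /etc/ceph/ceph.conf and client.admin keyring:

    import rados

    # connect() dials the mons, learns our address, and subscribes to
    # monmap/config -- the same sequence logged above
    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")
    cluster.connect()
    try:
        print("fsid:", cluster.get_fsid())
        # mon_command takes a JSON command string; returns (ret, outbuf, outs)
        ret, outbuf, outs = cluster.mon_command(
            '{"prefix": "health", "detail": "detail"}', b"")
        print(outbuf.decode())
    finally:
        cluster.shutdown()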
2026-03-09T00:10:20.962 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.961+0000 7f9b558f4700 1 --2- 192.168.123.103:0/3467465042 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b50100780 0x7f9b5019a590 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f9b4400b840 tx=0x7f9b4400bb50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:10:20.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.961+0000 7f9b42ffd700 1 -- 192.168.123.103:0/3467465042 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9b4400d610 con 0x7f9b50100780 2026-03-09T00:10:20.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.962+0000 7f9b42ffd700 1 -- 192.168.123.103:0/3467465042 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9b4400dc50 con 0x7f9b50100780 2026-03-09T00:10:20.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.962+0000 7f9b42ffd700 1 -- 192.168.123.103:0/3467465042 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9b44017400 con 0x7f9b50100780 2026-03-09T00:10:20.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.962+0000 7f9b568f6700 1 -- 192.168.123.103:0/3467465042 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9b501948f0 con 0x7f9b50100780 2026-03-09T00:10:20.963 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.962+0000 7f9b568f6700 1 -- 192.168.123.103:0/3467465042 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9b50194e40 con 0x7f9b50100780 2026-03-09T00:10:20.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.963+0000 7f9b42ffd700 1 -- 192.168.123.103:0/3467465042 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9b4400d770 con 0x7f9b50100780 2026-03-09T00:10:20.964 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.963+0000 7f9b568f6700 1 -- 192.168.123.103:0/3467465042 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9b50108120 con 0x7f9b50100780 2026-03-09T00:10:20.966 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.966+0000 7f9b42ffd700 1 --2- 192.168.123.103:0/3467465042 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f9b3c07bcd0 0x7f9b3c07e190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:20.966 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.966+0000 7f9b42ffd700 1 -- 192.168.123.103:0/3467465042 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(96..96 src has 1..96) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f9b44099290 con 0x7f9b50100780 2026-03-09T00:10:20.967 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.966+0000 7f9b550f3700 1 --2- 192.168.123.103:0/3467465042 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f9b3c07bcd0 0x7f9b3c07e190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:20.967 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.967+0000 7f9b42ffd700 1 -- 192.168.123.103:0/3467465042 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9b44061ac0 con 0x7f9b50100780 2026-03-09T00:10:20.967 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:20.967+0000 7f9b550f3700 1 --2- 192.168.123.103:0/3467465042 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f9b3c07bcd0 0x7f9b3c07e190 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f9b50106750 tx=0x7f9b4c005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:10:21.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:21.139+0000 7f9b568f6700 1 -- 192.168.123.103:0/3467465042 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f9b5004f2e0 con 0x7f9b50100780 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:21.140+0000 7f9b42ffd700 1 -- 192.168.123.103:0/3467465042 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1244 (secure 0 0 0) 0x7f9b44061210 con 0x7f9b50100780 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data; 1 osds down; Reduced data availability: 1 pg inactive, 2 pgs peering; Degraded data redundancy: 25/231 objects degraded (10.823%), 9 pgs degraded 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled. 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] OSD_DOWN: 1 osds down 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout: osd.4 (root=default,host=vm06) is down 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_AVAILABILITY: Reduced data availability: 1 pg inactive, 2 pgs peering 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.f is stuck peering for 3m, current state peering, last acting [0,5] 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout: pg 3.3 is stuck peering for 3m, current state peering, last acting [0,3] 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 25/231 objects degraded (10.823%), 9 pgs degraded 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.4 is active+undersized+degraded, acting [1,0] 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.5 is active+undersized+degraded, acting [3,0] 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.6 is active+undersized+degraded, acting [1,3] 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.7 is active+undersized+degraded, acting [3,2] 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.9 is active+undersized+degraded, acting [1,0] 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.a is active+undersized+degraded, acting [1,3] 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.b is active+undersized+degraded, acting [3,5] 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.14 is active+undersized+degraded, acting [3,5] 2026-03-09T00:10:21.141 INFO:teuthology.orchestra.run.vm03.stdout: pg 2.1a is active+undersized+degraded, acting [3,5] 
2026-03-09T00:10:21.143 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:21.143+0000 7f9b568f6700 1 -- 192.168.123.103:0/3467465042 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f9b3c07bcd0 msgr2=0x7f9b3c07e190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:21.144 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:21.143+0000 7f9b568f6700 1 --2- 192.168.123.103:0/3467465042 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f9b3c07bcd0 0x7f9b3c07e190 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f9b50106750 tx=0x7f9b4c005fb0 comp rx=0 tx=0).stop 2026-03-09T00:10:21.144 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:21.143+0000 7f9b568f6700 1 -- 192.168.123.103:0/3467465042 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b50100780 msgr2=0x7f9b5019a590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:21.144 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:21.143+0000 7f9b568f6700 1 --2- 192.168.123.103:0/3467465042 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b50100780 0x7f9b5019a590 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f9b4400b840 tx=0x7f9b4400bb50 comp rx=0 tx=0).stop 2026-03-09T00:10:21.144 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:21.143+0000 7f9b568f6700 1 -- 192.168.123.103:0/3467465042 shutdown_connections 2026-03-09T00:10:21.144 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:21.143+0000 7f9b568f6700 1 --2- 192.168.123.103:0/3467465042 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f9b3c07bcd0 0x7f9b3c07e190 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:21.144 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:21.143+0000 7f9b568f6700 1 --2- 192.168.123.103:0/3467465042 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f9b50100780 0x7f9b5019a590 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:21.144 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:21.143+0000 7f9b568f6700 1 --2- 192.168.123.103:0/3467465042 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f9b50101130 0x7f9b5019aad0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:21.144 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:21.143+0000 7f9b568f6700 1 -- 192.168.123.103:0/3467465042 >> 192.168.123.103:0/3467465042 conn(0x7f9b50074df0 msgr2=0x7f9b500feca0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:21.144 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:21.143+0000 7f9b568f6700 1 -- 192.168.123.103:0/3467465042 shutdown_connections 2026-03-09T00:10:21.144 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:21.143+0000 7f9b568f6700 1 -- 192.168.123.103:0/3467465042 wait complete. 
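That teardown also closes the client's second connection, the one to mgr.34104 at v2:192.168.123.103:6800: orchestrator commands such as the `orch ps` and `orch upgrade status` dispatches logged below are serviced by the active mgr (target "mon-mgr"), not the mons themselves. A sketch of polling the upgrade from the outside, assuming the JSON output of the orch commands:

    import json
    import subprocess

    def orch(*args):
        # `ceph orch ...` is forwarded by the mon to the active mgr
        return json.loads(subprocess.check_output(
            ["ceph", "orch", *args, "--format", "json"], text=True))

    for d in orch("ps"):
        print(d["daemon_type"], d["daemon_id"], d.get("status_desc"))
    print("upgrade in progress:",
          orch("upgrade", "status").get("in_progress"))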
2026-03-09T00:10:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:21 vm06.local ceph-mon[106218]: from='client.44259 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:10:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:21 vm06.local ceph-mon[106218]: Health check failed: Degraded data redundancy: 25/231 objects degraded (10.823%), 9 pgs degraded (PG_DEGRADED) 2026-03-09T00:10:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:21 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/2909918332' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:21 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/150700374' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T00:10:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:21 vm06.local ceph-mon[106218]: from='client.34368 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:10:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:10:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:10:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:10:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:21 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T00:10:21.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:21 vm06.local ceph-mon[106218]: from='client.? 
192.168.123.103:0/3467465042' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T00:10:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:21 vm03.local ceph-mon[129670]: from='client.44259 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:10:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:21 vm03.local ceph-mon[129670]: Health check failed: Degraded data redundancy: 25/231 objects degraded (10.823%), 9 pgs degraded (PG_DEGRADED) 2026-03-09T00:10:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:21 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2909918332' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:21 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/150700374' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T00:10:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:21 vm03.local ceph-mon[129670]: from='client.34368 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:10:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:10:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:10:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:10:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:21 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T00:10:21.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:21 vm03.local ceph-mon[129670]: from='client.? 
192.168.123.103:0/3467465042' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T00:10:22.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:22 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T00:10:22.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:22 vm06.local ceph-mon[106218]: Upgrade: unsafe to stop osd(s) at this time (6 PGs are or would become offline) 2026-03-09T00:10:22.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:22 vm06.local ceph-mon[106218]: pgmap v194: 65 pgs: 20 active+undersized, 14 active+undersized+degraded, 31 active+clean; 209 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 38/231 objects degraded (16.450%) 2026-03-09T00:10:22.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:22 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T00:10:22.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:22 vm03.local ceph-mon[129670]: Upgrade: unsafe to stop osd(s) at this time (6 PGs are or would become offline) 2026-03-09T00:10:22.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:22 vm03.local ceph-mon[129670]: pgmap v194: 65 pgs: 20 active+undersized, 14 active+undersized+degraded, 31 active+clean; 209 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 38/231 objects degraded (16.450%) 2026-03-09T00:10:23.313 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:22 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4[118634]: 2026-03-09T00:10:22.978+0000 7fb8acca7740 -1 osd.4 0 read_superblock omap replica is missing. 2026-03-09T00:10:23.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:23 vm06.local ceph-mon[106218]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 1 pg inactive, 2 pgs peering) 2026-03-09T00:10:23.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:23 vm06.local ceph-mon[106218]: from='osd.4 [v2:192.168.123.106:6808/2923148899,v1:192.168.123.106:6809/2923148899]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T00:10:23.671 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:23 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4[118634]: 2026-03-09T00:10:23.312+0000 7fb8acca7740 -1 osd.4 94 log_to_monitors true 2026-03-09T00:10:23.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:23 vm03.local ceph-mon[129670]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 1 pg inactive, 2 pgs peering) 2026-03-09T00:10:23.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:23 vm03.local ceph-mon[129670]: from='osd.4 [v2:192.168.123.106:6808/2923148899,v1:192.168.123.106:6809/2923148899]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-09T00:10:24.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:24 vm03.local ceph-mon[129670]: from='osd.4 [v2:192.168.123.106:6808/2923148899,v1:192.168.123.106:6809/2923148899]' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-09T00:10:24.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:24 vm03.local ceph-mon[129670]: osdmap e97: 6 total, 5 up, 6 in 2026-03-09T00:10:24.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:24 vm03.local ceph-mon[129670]: from='osd.4 
[v2:192.168.123.106:6808/2923148899,v1:192.168.123.106:6809/2923148899]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:10:24.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:24 vm03.local ceph-mon[129670]: pgmap v196: 65 pgs: 20 active+undersized, 14 active+undersized+degraded, 31 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 38/231 objects degraded (16.450%) 2026-03-09T00:10:24.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:24 vm06.local ceph-mon[106218]: from='osd.4 [v2:192.168.123.106:6808/2923148899,v1:192.168.123.106:6809/2923148899]' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-09T00:10:24.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:24 vm06.local ceph-mon[106218]: osdmap e97: 6 total, 5 up, 6 in 2026-03-09T00:10:24.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:24 vm06.local ceph-mon[106218]: from='osd.4 [v2:192.168.123.106:6808/2923148899,v1:192.168.123.106:6809/2923148899]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:10:24.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:24 vm06.local ceph-mon[106218]: pgmap v196: 65 pgs: 20 active+undersized, 14 active+undersized+degraded, 31 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 38/231 objects degraded (16.450%) 2026-03-09T00:10:24.921 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:10:24 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4[118634]: 2026-03-09T00:10:24.468+0000 7fb8a4240640 -1 osd.4 94 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T00:10:25.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:25 vm03.local ceph-mon[129670]: from='osd.4 [v2:192.168.123.106:6808/2923148899,v1:192.168.123.106:6809/2923148899]' entity='osd.4' 2026-03-09T00:10:25.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:25 vm06.local ceph-mon[106218]: from='osd.4 [v2:192.168.123.106:6808/2923148899,v1:192.168.123.106:6809/2923148899]' entity='osd.4' 2026-03-09T00:10:26.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:26 vm03.local ceph-mon[129670]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T00:10:26.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:26 vm03.local ceph-mon[129670]: osd.4 [v2:192.168.123.106:6808/2923148899,v1:192.168.123.106:6809/2923148899] boot 2026-03-09T00:10:26.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:26 vm03.local ceph-mon[129670]: osdmap e98: 6 total, 6 up, 6 in 2026-03-09T00:10:26.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:26 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:10:26.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:26 vm03.local ceph-mon[129670]: pgmap v198: 65 pgs: 20 active+undersized, 14 active+undersized+degraded, 31 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 38/231 objects degraded (16.450%) 2026-03-09T00:10:26.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:26 vm03.local ceph-mon[129670]: osdmap e99: 6 total, 6 up, 6 in 2026-03-09T00:10:26.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:26 vm06.local ceph-mon[106218]: 
Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T00:10:26.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:26 vm06.local ceph-mon[106218]: osd.4 [v2:192.168.123.106:6808/2923148899,v1:192.168.123.106:6809/2923148899] boot 2026-03-09T00:10:26.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:26 vm06.local ceph-mon[106218]: osdmap e98: 6 total, 6 up, 6 in 2026-03-09T00:10:26.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:26 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-09T00:10:26.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:26 vm06.local ceph-mon[106218]: pgmap v198: 65 pgs: 20 active+undersized, 14 active+undersized+degraded, 31 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 38/231 objects degraded (16.450%) 2026-03-09T00:10:26.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:26 vm06.local ceph-mon[106218]: osdmap e99: 6 total, 6 up, 6 in 2026-03-09T00:10:27.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:27 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 38/231 objects degraded (16.450%), 14 pgs degraded (PG_DEGRADED) 2026-03-09T00:10:27.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:27 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 38/231 objects degraded (16.450%), 14 pgs degraded (PG_DEGRADED) 2026-03-09T00:10:28.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:28 vm03.local ceph-mon[129670]: pgmap v200: 65 pgs: 6 peering, 16 active+undersized, 12 active+undersized+degraded, 31 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 32/231 objects degraded (13.853%) 2026-03-09T00:10:28.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:28 vm06.local ceph-mon[106218]: pgmap v200: 65 pgs: 6 peering, 16 active+undersized, 12 active+undersized+degraded, 31 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 32/231 objects degraded (13.853%) 2026-03-09T00:10:30.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:29 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:30.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:29 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:10:30.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:29 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:30.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:29 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:10:31.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:30 vm03.local ceph-mon[129670]: pgmap v201: 65 pgs: 6 peering, 3 active+undersized, 1 active+undersized+degraded, 55 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 4/231 objects degraded (1.732%) 2026-03-09T00:10:31.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:30 vm06.local ceph-mon[106218]: pgmap v201: 65 pgs: 6 peering, 3 active+undersized, 1 active+undersized+degraded, 55 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 4/231 objects degraded (1.732%) 
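With the PGs back to active+clean, the mgr's upgrade loop retries the gate it failed at 00:10:22 ("unsafe to stop osd(s) at this time (6 PGs are or would become offline)"): each OSD is restarted only once `osd ok-to-stop` succeeds, and the next probe for osd.5 follows below. A sketch of that retry loop, assuming the command's exit status signals safety (the attempt count and delay here are invented, not cephadm's own values):

    import subprocess
    import time

    def wait_ok_to_stop(osd_id, attempts=30, delay=10):
        for _ in range(attempts):
            r = subprocess.run(["ceph", "osd", "ok-to-stop", str(osd_id)],
                               capture_output=True, text=True)
            if r.returncode == 0:
                return True          # mon: safe to restart
            time.sleep(delay)        # mon: PGs would go offline, retry
        return False

    print(wait_ok_to_stop(5))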
2026-03-09T00:10:32.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:31 vm03.local ceph-mon[129670]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 4/231 objects degraded (1.732%), 1 pg degraded) 2026-03-09T00:10:32.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:31 vm06.local ceph-mon[106218]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 4/231 objects degraded (1.732%), 1 pg degraded) 2026-03-09T00:10:33.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:32 vm03.local ceph-mon[129670]: pgmap v202: 65 pgs: 65 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail 2026-03-09T00:10:33.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:32 vm06.local ceph-mon[106218]: pgmap v202: 65 pgs: 65 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail 2026-03-09T00:10:35.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:34 vm03.local ceph-mon[129670]: pgmap v203: 65 pgs: 65 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail 2026-03-09T00:10:35.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:34 vm06.local ceph-mon[106218]: pgmap v203: 65 pgs: 65 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail 2026-03-09T00:10:36.843 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:36 vm06.local ceph-mon[106218]: pgmap v204: 65 pgs: 65 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail 2026-03-09T00:10:36.843 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:36 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T00:10:36.843 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:36 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:36.843 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:36 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T00:10:36.843 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:36 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:10:37.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:36 vm03.local ceph-mon[129670]: pgmap v204: 65 pgs: 65 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail 2026-03-09T00:10:37.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:36 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T00:10:37.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:36 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:37.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:36 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-09T00:10:37.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:36 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:10:37.671 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 
00:10:37 vm06.local systemd[1]: Stopping Ceph osd.5 for ae8f0172-1b4a-11f1-916a-712b2ac006b7...
2026-03-09T00:10:37.671 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:37 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[76918]: 2026-03-09T00:10:37.302+0000 7f8fac679700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0
2026-03-09T00:10:37.671 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:37 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[76918]: 2026-03-09T00:10:37.302+0000 7f8fac679700 -1 osd.5 99 *** Got signal Terminated ***
2026-03-09T00:10:37.671 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:37 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[76918]: 2026-03-09T00:10:37.302+0000 7f8fac679700 -1 osd.5 99 *** Immediate shutdown (osd_fast_shutdown=true) ***
2026-03-09T00:10:38.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:37 vm03.local ceph-mon[129670]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch
2026-03-09T00:10:38.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:37 vm03.local ceph-mon[129670]: Upgrade: osd.5 is safe to restart
2026-03-09T00:10:38.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:37 vm03.local ceph-mon[129670]: Upgrade: Updating osd.5
2026-03-09T00:10:38.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:37 vm03.local ceph-mon[129670]: Deploying daemon osd.5 on vm06
2026-03-09T00:10:38.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:37 vm03.local ceph-mon[129670]: osd.5 marked itself down and dead
2026-03-09T00:10:38.106 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:37 vm06.local ceph-mon[106218]: from='mon.0 -' entity='mon.'
cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-09T00:10:38.106 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:37 vm06.local ceph-mon[106218]: Upgrade: osd.5 is safe to restart 2026-03-09T00:10:38.106 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:37 vm06.local ceph-mon[106218]: Upgrade: Updating osd.5 2026-03-09T00:10:38.106 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:37 vm06.local ceph-mon[106218]: Deploying daemon osd.5 on vm06 2026-03-09T00:10:38.106 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:37 vm06.local ceph-mon[106218]: osd.5 marked itself down and dead 2026-03-09T00:10:38.106 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:37 vm06.local podman[122576]: 2026-03-09 00:10:37.919573309 +0000 UTC m=+0.631412500 container died f51e8cd94301fb2ccf2d0c9d7ffd99d1dbccce8a111e3ddb7e78a0c9df90abde (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5, CEPH_POINT_RELEASE=-18.2.1, ceph=True, GIT_CLEAN=True, maintainer=Guillaume Abrioux , org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, io.buildah.version=1.29.1, org.label-schema.build-date=20240222, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS) 2026-03-09T00:10:38.106 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:37 vm06.local podman[122576]: 2026-03-09 00:10:37.948227759 +0000 UTC m=+0.660066940 container remove f51e8cd94301fb2ccf2d0c9d7ffd99d1dbccce8a111e3ddb7e78a0c9df90abde (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5, GIT_CLEAN=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, RELEASE=HEAD, ceph=True, io.buildah.version=1.29.1, org.label-schema.build-date=20240222, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.vendor=CentOS, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.1) 2026-03-09T00:10:38.106 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:37 vm06.local bash[122576]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5 2026-03-09T00:10:38.413 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local podman[122643]: 2026-03-09 00:10:38.10658361 +0000 UTC m=+0.019299472 container create a31f2d615a95c5d03ed3af145964b6113ea9f6fb19ace66324cad982dfbc1193 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-deactivate, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T00:10:38.413 
INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local podman[122643]: 2026-03-09 00:10:38.154309143 +0000 UTC m=+0.067025005 container init a31f2d615a95c5d03ed3af145964b6113ea9f6fb19ace66324cad982dfbc1193 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2) 2026-03-09T00:10:38.413 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local podman[122643]: 2026-03-09 00:10:38.158278382 +0000 UTC m=+0.070994244 container start a31f2d615a95c5d03ed3af145964b6113ea9f6fb19ace66324cad982dfbc1193 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, OSD_FLAVOR=default) 2026-03-09T00:10:38.413 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local podman[122643]: 2026-03-09 00:10:38.161371481 +0000 UTC m=+0.074087353 container attach a31f2d615a95c5d03ed3af145964b6113ea9f6fb19ace66324cad982dfbc1193 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True) 2026-03-09T00:10:38.413 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local podman[122643]: 2026-03-09 00:10:38.099128987 +0000 UTC m=+0.011844849 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T00:10:38.413 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local podman[122643]: 
2026-03-09 00:10:38.292066161 +0000 UTC m=+0.204782023 container died a31f2d615a95c5d03ed3af145964b6113ea9f6fb19ace66324cad982dfbc1193 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20260223, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T00:10:38.413 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local podman[122643]: 2026-03-09 00:10:38.319247033 +0000 UTC m=+0.231962895 container remove a31f2d615a95c5d03ed3af145964b6113ea9f6fb19ace66324cad982dfbc1193 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-deactivate, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-09T00:10:38.413 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.5.service: Deactivated successfully. 2026-03-09T00:10:38.413 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.5.service: Unit process 122654 (conmon) remains running after unit stopped. 2026-03-09T00:10:38.413 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local systemd[1]: Stopped Ceph osd.5 for ae8f0172-1b4a-11f1-916a-712b2ac006b7. 2026-03-09T00:10:38.413 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local systemd[1]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.5.service: Consumed 42.228s CPU time, 1.1G memory peak. 2026-03-09T00:10:38.804 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local systemd[1]: Starting Ceph osd.5 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 
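The redeploy around this point is cephadm's standard container swap: systemd stops the ceph-<fsid>@osd.5 unit, a one-shot "-deactivate" container runs under the old reef image, then an "-activate" container under the new squid image re-primes /var/lib/ceph/osd/ceph-5 via ceph-volume before the unit starts again. A sketch for checking the result on the host, assuming the JSON output of `cephadm ls` (run as root on vm06):

    import json
    import subprocess

    # each entry is one systemd-wrapped podman container managed by cephadm
    daemons = json.loads(subprocess.check_output(["cephadm", "ls"], text=True))
    for d in daemons:
        if d.get("name") == "osd.5":
            # after the swap this should report the squid image and version
            print(d["name"], d.get("container_image_name"), d.get("version"))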
2026-03-09T00:10:38.804 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local podman[122752]: 2026-03-09 00:10:38.65671457 +0000 UTC m=+0.023000097 container create d452349c4ea71dbe8a543e11aed45177cf8c5bfa2eafd245408766ab57dee065 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T00:10:38.804 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local podman[122752]: 2026-03-09 00:10:38.715604039 +0000 UTC m=+0.081889566 container init d452349c4ea71dbe8a543e11aed45177cf8c5bfa2eafd245408766ab57dee065 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True) 2026-03-09T00:10:38.804 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local podman[122752]: 2026-03-09 00:10:38.718749847 +0000 UTC m=+0.085035374 container start d452349c4ea71dbe8a543e11aed45177cf8c5bfa2eafd245408766ab57dee065 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T00:10:38.804 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local podman[122752]: 2026-03-09 00:10:38.71971345 +0000 UTC m=+0.085998977 container attach d452349c4ea71dbe8a543e11aed45177cf8c5bfa2eafd245408766ab57dee065 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate, 
org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True) 2026-03-09T00:10:38.804 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local podman[122752]: 2026-03-09 00:10:38.643722154 +0000 UTC m=+0.010007692 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T00:10:38.804 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate[122767]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:10:38.804 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local bash[122752]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:10:39.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:38 vm03.local ceph-mon[129670]: pgmap v205: 65 pgs: 65 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail 2026-03-09T00:10:39.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:38 vm03.local ceph-mon[129670]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T00:10:39.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:38 vm03.local ceph-mon[129670]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED) 2026-03-09T00:10:39.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:38 vm03.local ceph-mon[129670]: osdmap e100: 6 total, 5 up, 6 in 2026-03-09T00:10:39.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:38 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:39.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:38 vm06.local ceph-mon[106218]: pgmap v205: 65 pgs: 65 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail 2026-03-09T00:10:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:38 vm06.local ceph-mon[106218]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-09T00:10:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:38 vm06.local ceph-mon[106218]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED) 2026-03-09T00:10:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:38 vm06.local ceph-mon[106218]: osdmap e100: 6 total, 5 up, 6 in 2026-03-09T00:10:39.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:38 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:39.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local bash[122752]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:10:39.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:38 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate[122767]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:10:39.471 
INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate[122767]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T00:10:39.471 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local bash[122752]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-09T00:10:39.471 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local bash[122752]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:10:39.471 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate[122767]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:10:39.471 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate[122767]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:10:39.471 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local bash[122752]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-09T00:10:39.471 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate[122767]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-09T00:10:39.471 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local bash[122752]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-09T00:10:39.762 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate[122767]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-88f2fed4-b080-4e2b-ae39-4787786de0a4/osd-block-15462a2c-77f6-4f87-a9bf-e5fe4de71f8f --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-09T00:10:39.762 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local bash[122752]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-88f2fed4-b080-4e2b-ae39-4787786de0a4/osd-block-15462a2c-77f6-4f87-a9bf-e5fe4de71f8f --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-09T00:10:40.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:39 vm03.local ceph-mon[129670]: osdmap e101: 6 total, 5 up, 6 in 2026-03-09T00:10:40.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:39 vm06.local ceph-mon[106218]: osdmap e101: 6 total, 5 up, 6 in 2026-03-09T00:10:40.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate[122767]: Running command: /usr/bin/ln -snf /dev/ceph-88f2fed4-b080-4e2b-ae39-4787786de0a4/osd-block-15462a2c-77f6-4f87-a9bf-e5fe4de71f8f /var/lib/ceph/osd/ceph-5/block 2026-03-09T00:10:40.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local bash[122752]: Running command: /usr/bin/ln -snf /dev/ceph-88f2fed4-b080-4e2b-ae39-4787786de0a4/osd-block-15462a2c-77f6-4f87-a9bf-e5fe4de71f8f /var/lib/ceph/osd/ceph-5/block 2026-03-09T00:10:40.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate[122767]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-09T00:10:40.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local bash[122752]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-09T00:10:40.171 
INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate[122767]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-09T00:10:40.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local bash[122752]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-09T00:10:40.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate[122767]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-09T00:10:40.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local bash[122752]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-09T00:10:40.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate[122767]: --> ceph-volume lvm activate successful for osd ID: 5 2026-03-09T00:10:40.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local bash[122752]: --> ceph-volume lvm activate successful for osd ID: 5 2026-03-09T00:10:40.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local conmon[122767]: conmon d452349c4ea71dbe8a54 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d452349c4ea71dbe8a543e11aed45177cf8c5bfa2eafd245408766ab57dee065.scope/container/memory.events 2026-03-09T00:10:40.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local podman[122752]: 2026-03-09 00:10:39.797705462 +0000 UTC m=+1.163990989 container died d452349c4ea71dbe8a543e11aed45177cf8c5bfa2eafd245408766ab57dee065 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid) 2026-03-09T00:10:40.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local podman[122752]: 2026-03-09 00:10:39.828487063 +0000 UTC m=+1.194772580 container remove d452349c4ea71dbe8a543e11aed45177cf8c5bfa2eafd245408766ab57dee065 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T00:10:40.171 
INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local podman[123015]: 2026-03-09 00:10:39.957332862 +0000 UTC m=+0.018363550 container create fbc950d55a6775f3eb36b391c3ac9eda1786929d1923f237cdf756fa97a4512d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, io.buildah.version=1.41.3, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T00:10:40.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local podman[123015]: 2026-03-09 00:10:39.990916207 +0000 UTC m=+0.051946885 container init fbc950d55a6775f3eb36b391c3ac9eda1786929d1923f237cdf756fa97a4512d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T00:10:40.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local podman[123015]: 2026-03-09 00:10:39.996751779 +0000 UTC m=+0.057782468 container start fbc950d55a6775f3eb36b391c3ac9eda1786929d1923f237cdf756fa97a4512d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T00:10:40.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local bash[123015]: fbc950d55a6775f3eb36b391c3ac9eda1786929d1923f237cdf756fa97a4512d 2026-03-09T00:10:40.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:39 vm06.local podman[123015]: 2026-03-09 00:10:39.950795947 +0000 UTC m=+0.011826635 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c 
quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T00:10:40.171 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:40 vm06.local systemd[1]: Started Ceph osd.5 for ae8f0172-1b4a-11f1-916a-712b2ac006b7. 2026-03-09T00:10:40.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:40 vm06.local ceph-mon[106218]: pgmap v208: 65 pgs: 3 active+undersized, 10 peering, 5 stale+active+clean, 2 active+undersized+degraded, 45 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 5/231 objects degraded (2.165%) 2026-03-09T00:10:40.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:40 vm06.local ceph-mon[106218]: Health check failed: Reduced data availability: 5 pgs peering (PG_AVAILABILITY) 2026-03-09T00:10:40.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:40 vm06.local ceph-mon[106218]: Health check failed: Degraded data redundancy: 5/231 objects degraded (2.165%), 2 pgs degraded (PG_DEGRADED) 2026-03-09T00:10:40.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:40.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:40.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:10:40.921 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:40 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[123026]: 2026-03-09T00:10:40.844+0000 7f2e9c274740 -1 Falling back to public interface 2026-03-09T00:10:41.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:40 vm03.local ceph-mon[129670]: pgmap v208: 65 pgs: 3 active+undersized, 10 peering, 5 stale+active+clean, 2 active+undersized+degraded, 45 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 5/231 objects degraded (2.165%) 2026-03-09T00:10:41.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:40 vm03.local ceph-mon[129670]: Health check failed: Reduced data availability: 5 pgs peering (PG_AVAILABILITY) 2026-03-09T00:10:41.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:40 vm03.local ceph-mon[129670]: Health check failed: Degraded data redundancy: 5/231 objects degraded (2.165%), 2 pgs degraded (PG_DEGRADED) 2026-03-09T00:10:41.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:41.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:41.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:10:42.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:42 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:42.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:42 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:42.338 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:42 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:42.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:42 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:42.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:42 vm03.local ceph-mon[129670]: pgmap v209: 65 pgs: 6 active+undersized, 10 peering, 3 stale+active+clean, 2 active+undersized+degraded, 44 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 5/231 objects degraded (2.165%) 2026-03-09T00:10:42.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:42 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:42.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:42 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:42.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:42 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:42.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:42 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:42.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:42 vm06.local ceph-mon[106218]: pgmap v209: 65 pgs: 6 active+undersized, 10 peering, 3 stale+active+clean, 2 active+undersized+degraded, 44 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 5/231 objects degraded (2.165%) 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 
192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: Upgrade: Setting container_image for all osd 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]: dispatch 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]': finished 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", 
"who": "osd.5"}]: dispatch 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': finished 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: Upgrade: Setting require_osd_release to 19 squid 2026-03-09T00:10:43.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:43 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: Upgrade: Setting container_image for all osd 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]: dispatch 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 
vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]': finished 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]: dispatch 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': finished 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: Upgrade: Setting require_osd_release to 19 squid 2026-03-09T00:10:43.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch 2026-03-09T00:10:44.646 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[123026]: 2026-03-09T00:10:44.379+0000 7f2e9c274740 -1 osd.5 0 read_superblock omap replica is missing. 
2026-03-09T00:10:44.646 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[123026]: 2026-03-09T00:10:44.645+0000 7f2e9c274740 -1 osd.5 99 log_to_monitors true 2026-03-09T00:10:44.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-mon[106218]: pgmap v210: 65 pgs: 16 active+undersized, 12 active+undersized+degraded, 37 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 33/231 objects degraded (14.286%) 2026-03-09T00:10:44.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-mon[106218]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but require_osd_release < squid) 2026-03-09T00:10:44.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished 2026-03-09T00:10:44.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-mon[106218]: osdmap e102: 6 total, 5 up, 6 in 2026-03-09T00:10:44.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:44.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:10:44.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:44.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.sejksk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T00:10:44.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:10:44.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-mon[106218]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T00:10:44.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-mon[106218]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T00:10:44.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-mon[106218]: osdmap e103: 6 total, 5 up, 6 in 2026-03-09T00:10:44.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-mon[106218]: Standby daemon mds.cephfs.vm03.ralade assigned to filesystem cephfs as rank 0 2026-03-09T00:10:44.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-mon[106218]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T00:10:44.646 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-mon[106218]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T00:10:44.647 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:44 vm06.local ceph-mon[106218]: fsmap cephfs:1/1 {0=cephfs.vm03.ralade=up:replay} 2 up:standby 2026-03-09T00:10:44.839 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:44 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03[129666]: 2026-03-09T00:10:44.532+0000 7f802c841640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T00:10:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:44 vm03.local ceph-mon[129670]: pgmap v210: 65 pgs: 16 active+undersized, 12 active+undersized+degraded, 37 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 33/231 objects degraded (14.286%) 2026-03-09T00:10:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:44 vm03.local ceph-mon[129670]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but require_osd_release < squid) 2026-03-09T00:10:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished 2026-03-09T00:10:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:44 vm03.local ceph-mon[129670]: osdmap e102: 6 total, 5 up, 6 in 2026-03-09T00:10:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:10:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:10:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.sejksk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T00:10:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:10:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:44 vm03.local ceph-mon[129670]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T00:10:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:44 vm03.local ceph-mon[129670]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T00:10:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:44 vm03.local ceph-mon[129670]: osdmap e103: 6 total, 5 up, 6 in 2026-03-09T00:10:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:44 vm03.local ceph-mon[129670]: Standby daemon mds.cephfs.vm03.ralade assigned to filesystem cephfs as rank 0 2026-03-09T00:10:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:44 vm03.local ceph-mon[129670]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T00:10:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:44 vm03.local ceph-mon[129670]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T00:10:44.839 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:44 vm03.local ceph-mon[129670]: fsmap cephfs:1/1 {0=cephfs.vm03.ralade=up:replay} 2 
up:standby 2026-03-09T00:10:45.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:45 vm06.local ceph-mon[106218]: Upgrade: Updating mds.cephfs.vm03.sejksk 2026-03-09T00:10:45.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:45 vm06.local ceph-mon[106218]: Deploying daemon mds.cephfs.vm03.sejksk on vm03 2026-03-09T00:10:45.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:45 vm06.local ceph-mon[106218]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 5 pgs peering) 2026-03-09T00:10:45.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:45 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.sejksk"}]: dispatch 2026-03-09T00:10:45.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:45 vm06.local ceph-mon[106218]: from='osd.5 [v2:192.168.123.106:6816/1339628286,v1:192.168.123.106:6817/1339628286]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T00:10:45.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:45 vm06.local ceph-mon[106218]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T00:10:46.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:45 vm03.local ceph-mon[129670]: Upgrade: Updating mds.cephfs.vm03.sejksk 2026-03-09T00:10:46.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:45 vm03.local ceph-mon[129670]: Deploying daemon mds.cephfs.vm03.sejksk on vm03 2026-03-09T00:10:46.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:45 vm03.local ceph-mon[129670]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 5 pgs peering) 2026-03-09T00:10:46.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:45 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.sejksk"}]: dispatch 2026-03-09T00:10:46.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:45 vm03.local ceph-mon[129670]: from='osd.5 [v2:192.168.123.106:6816/1339628286,v1:192.168.123.106:6817/1339628286]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T00:10:46.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:45 vm03.local ceph-mon[129670]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-09T00:10:46.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:46 vm06.local ceph-mon[106218]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-09T00:10:46.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:46 vm06.local ceph-mon[106218]: osdmap e104: 6 total, 5 up, 6 in 2026-03-09T00:10:46.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:46 vm06.local ceph-mon[106218]: from='osd.5 [v2:192.168.123.106:6816/1339628286,v1:192.168.123.106:6817/1339628286]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:10:46.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:46 vm06.local ceph-mon[106218]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 
2026-03-09T00:10:46.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:46 vm06.local ceph-mon[106218]: pgmap v214: 65 pgs: 16 active+undersized, 12 active+undersized+degraded, 37 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 1.3 MiB/s rd, 0 op/s; 33/231 objects degraded (14.286%) 2026-03-09T00:10:46.921 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:10:46 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[123026]: 2026-03-09T00:10:46.692+0000 7f2e9380d640 -1 osd.5 99 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-09T00:10:47.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:46 vm03.local ceph-mon[129670]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-09T00:10:47.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:46 vm03.local ceph-mon[129670]: osdmap e104: 6 total, 5 up, 6 in 2026-03-09T00:10:47.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:46 vm03.local ceph-mon[129670]: from='osd.5 [v2:192.168.123.106:6816/1339628286,v1:192.168.123.106:6817/1339628286]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:10:47.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:46 vm03.local ceph-mon[129670]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm06", "root=default"]}]: dispatch 2026-03-09T00:10:47.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:46 vm03.local ceph-mon[129670]: pgmap v214: 65 pgs: 16 active+undersized, 12 active+undersized+degraded, 37 active+clean; 209 MiB data, 1.2 GiB used, 119 GiB / 120 GiB avail; 1.3 MiB/s rd, 0 op/s; 33/231 objects degraded (14.286%) 2026-03-09T00:10:48.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:47 vm03.local ceph-mon[129670]: from='osd.5 ' entity='osd.5' 2026-03-09T00:10:48.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:47 vm03.local ceph-mon[129670]: Health check update: Degraded data redundancy: 33/231 objects degraded (14.286%), 12 pgs degraded (PG_DEGRADED) 2026-03-09T00:10:48.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:47 vm06.local ceph-mon[106218]: from='osd.5 ' entity='osd.5' 2026-03-09T00:10:48.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:47 vm06.local ceph-mon[106218]: Health check update: Degraded data redundancy: 33/231 objects degraded (14.286%), 12 pgs degraded (PG_DEGRADED) 2026-03-09T00:10:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:48 vm06.local ceph-mon[106218]: pgmap v215: 65 pgs: 16 active+undersized, 12 active+undersized+degraded, 37 active+clean; 209 MiB data, 883 MiB used, 119 GiB / 120 GiB avail; 2.0 MiB/s rd, 1 op/s; 33/231 objects degraded (14.286%) 2026-03-09T00:10:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:48 vm06.local ceph-mon[106218]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T00:10:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:48 vm06.local ceph-mon[106218]: osd.5 [v2:192.168.123.106:6816/1339628286,v1:192.168.123.106:6817/1339628286] boot 2026-03-09T00:10:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:48 vm06.local ceph-mon[106218]: osdmap e105: 6 total, 6 up, 6 in 2026-03-09T00:10:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:48 vm06.local ceph-mon[106218]: from='mgr.34104 
192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:10:49.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:48 vm03.local ceph-mon[129670]: pgmap v215: 65 pgs: 16 active+undersized, 12 active+undersized+degraded, 37 active+clean; 209 MiB data, 883 MiB used, 119 GiB / 120 GiB avail; 2.0 MiB/s rd, 1 op/s; 33/231 objects degraded (14.286%) 2026-03-09T00:10:49.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:48 vm03.local ceph-mon[129670]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-09T00:10:49.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:48 vm03.local ceph-mon[129670]: osd.5 [v2:192.168.123.106:6816/1339628286,v1:192.168.123.106:6817/1339628286] boot 2026-03-09T00:10:49.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:48 vm03.local ceph-mon[129670]: osdmap e105: 6 total, 6 up, 6 in 2026-03-09T00:10:49.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:48 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-09T00:10:50.047 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:50 vm03.local ceph-mon[129670]: osdmap e106: 6 total, 6 up, 6 in 2026-03-09T00:10:50.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:50 vm06.local ceph-mon[106218]: osdmap e106: 6 total, 6 up, 6 in 2026-03-09T00:10:51.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.219+0000 7ff843905700 1 -- 192.168.123.103:0/2270283637 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c069200 msgr2=0x7ff83c0695e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:51.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.219+0000 7ff843905700 1 --2- 192.168.123.103:0/2270283637 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c069200 0x7ff83c0695e0 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7ff82c009b50 tx=0x7ff82c009e60 comp rx=0 tx=0).stop 2026-03-09T00:10:51.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.221+0000 7ff843905700 1 -- 192.168.123.103:0/2270283637 shutdown_connections 2026-03-09T00:10:51.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.221+0000 7ff843905700 1 --2- 192.168.123.103:0/2270283637 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff83c069b20 0x7ff83c10d640 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:51.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.221+0000 7ff843905700 1 --2- 192.168.123.103:0/2270283637 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c069200 0x7ff83c0695e0 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:51.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.221+0000 7ff843905700 1 -- 192.168.123.103:0/2270283637 >> 192.168.123.103:0/2270283637 conn(0x7ff83c076b30 msgr2=0x7ff83c076f40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:51.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.221+0000 7ff843905700 1 -- 192.168.123.103:0/2270283637 shutdown_connections 2026-03-09T00:10:51.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.221+0000 7ff843905700 1 -- 192.168.123.103:0/2270283637 wait complete. 
2026-03-09T00:10:51.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.221+0000 7ff843905700 1 Processor -- start 2026-03-09T00:10:51.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.222+0000 7ff843905700 1 -- start start 2026-03-09T00:10:51.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.222+0000 7ff843905700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff83c069b20 0x7ff83c1991f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:51.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.222+0000 7ff843905700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c199730 0x7ff83c19dba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:51.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.222+0000 7ff843905700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff83c199d50 con 0x7ff83c199730 2026-03-09T00:10:51.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.222+0000 7ff843905700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff83c199ec0 con 0x7ff83c069b20 2026-03-09T00:10:51.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.222+0000 7ff840ea0700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c199730 0x7ff83c19dba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:51.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.222+0000 7ff840ea0700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c199730 0x7ff83c19dba0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60466/0 (socket says 192.168.123.103:60466) 2026-03-09T00:10:51.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.222+0000 7ff840ea0700 1 -- 192.168.123.103:0/1546745726 learned_addr learned my addr 192.168.123.103:0/1546745726 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:10:51.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.223+0000 7ff840ea0700 1 -- 192.168.123.103:0/1546745726 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff83c069b20 msgr2=0x7ff83c1991f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:51.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.223+0000 7ff8416a1700 1 --2- 192.168.123.103:0/1546745726 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff83c069b20 0x7ff83c1991f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:51.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.223+0000 7ff840ea0700 1 --2- 192.168.123.103:0/1546745726 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff83c069b20 0x7ff83c1991f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:51.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.223+0000 7ff840ea0700 1 -- 192.168.123.103:0/1546745726 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7ff82c0097e0 con 0x7ff83c199730 2026-03-09T00:10:51.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.223+0000 7ff8416a1700 1 --2- 192.168.123.103:0/1546745726 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff83c069b20 0x7ff83c1991f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:10:51.224 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.223+0000 7ff840ea0700 1 --2- 192.168.123.103:0/1546745726 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c199730 0x7ff83c19dba0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7ff83800b700 tx=0x7ff83800bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:10:51.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.223+0000 7ff8327fc700 1 -- 192.168.123.103:0/1546745726 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff838010840 con 0x7ff83c199730 2026-03-09T00:10:51.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.223+0000 7ff8327fc700 1 -- 192.168.123.103:0/1546745726 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff838010e80 con 0x7ff83c199730 2026-03-09T00:10:51.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.223+0000 7ff8327fc700 1 -- 192.168.123.103:0/1546745726 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff83800d590 con 0x7ff83c199730 2026-03-09T00:10:51.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.224+0000 7ff843905700 1 -- 192.168.123.103:0/1546745726 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff83c19e1a0 con 0x7ff83c199730 2026-03-09T00:10:51.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.224+0000 7ff843905700 1 -- 192.168.123.103:0/1546745726 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff83c19e670 con 0x7ff83c199730 2026-03-09T00:10:51.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.225+0000 7ff843905700 1 -- 192.168.123.103:0/1546745726 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff83c10adb0 con 0x7ff83c199730 2026-03-09T00:10:51.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.229+0000 7ff8327fc700 1 -- 192.168.123.103:0/1546745726 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff83800f3e0 con 0x7ff83c199730 2026-03-09T00:10:51.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.229+0000 7ff8327fc700 1 --2- 192.168.123.103:0/1546745726 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff828077990 0x7ff828079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:10:51.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.229+0000 7ff8327fc700 1 -- 192.168.123.103:0/1546745726 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6222+0+0 (secure 0 0 0) 0x7ff838099440 con 0x7ff83c199730 2026-03-09T00:10:51.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.229+0000 7ff8416a1700 1 --2- 192.168.123.103:0/1546745726 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff828077990 
0x7ff828079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:10:51.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.230+0000 7ff8327fc700 1 -- 192.168.123.103:0/1546745726 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff838061c20 con 0x7ff83c199730 2026-03-09T00:10:51.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.230+0000 7ff8416a1700 1 --2- 192.168.123.103:0/1546745726 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff828077990 0x7ff828079e50 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7ff82c006010 tx=0x7ff82c0058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:10:51.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:51 vm03.local ceph-mon[129670]: pgmap v218: 65 pgs: 12 peering, 10 active+undersized, 4 active+undersized+degraded, 39 active+clean; 209 MiB data, 883 MiB used, 119 GiB / 120 GiB avail; 29 MiB/s rd, 9 op/s; 11/231 objects degraded (4.762%) 2026-03-09T00:10:51.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:51 vm03.local ceph-mon[129670]: mds.? [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] up:reconnect 2026-03-09T00:10:51.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:51 vm03.local ceph-mon[129670]: fsmap cephfs:1/1 {0=cephfs.vm03.ralade=up:reconnect} 2 up:standby 2026-03-09T00:10:51.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:51 vm03.local ceph-mon[129670]: reconnect by client.24313 v1:192.168.144.1:0/748981855 after 0.00700001 2026-03-09T00:10:51.340 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:51 vm03.local ceph-mon[129670]: reconnect by client.24309 v1:192.168.123.103:0/929198358 after 0.00700001 2026-03-09T00:10:51.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.355+0000 7ff843905700 1 -- 192.168.123.103:0/1546745726 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff83c19e950 con 0x7ff828077990 2026-03-09T00:10:51.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.357+0000 7ff8327fc700 1 -- 192.168.123.103:0/1546745726 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7ff83c19e950 con 0x7ff828077990 2026-03-09T00:10:51.359 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.359+0000 7ff843905700 1 -- 192.168.123.103:0/1546745726 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff828077990 msgr2=0x7ff828079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:51.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.359+0000 7ff843905700 1 --2- 192.168.123.103:0/1546745726 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff828077990 0x7ff828079e50 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7ff82c006010 tx=0x7ff82c0058e0 comp rx=0 tx=0).stop 2026-03-09T00:10:51.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.359+0000 7ff843905700 1 -- 192.168.123.103:0/1546745726 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c199730 msgr2=0x7ff83c19dba0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:51.367 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.359+0000 7ff843905700 1 --2- 192.168.123.103:0/1546745726 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c199730 0x7ff83c19dba0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7ff83800b700 tx=0x7ff83800bac0 comp rx=0 tx=0).stop 2026-03-09T00:10:51.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.359+0000 7ff843905700 1 -- 192.168.123.103:0/1546745726 shutdown_connections 2026-03-09T00:10:51.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.359+0000 7ff843905700 1 --2- 192.168.123.103:0/1546745726 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff828077990 0x7ff828079e50 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:51.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.359+0000 7ff843905700 1 --2- 192.168.123.103:0/1546745726 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff83c069b20 0x7ff83c1991f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:51.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.359+0000 7ff843905700 1 --2- 192.168.123.103:0/1546745726 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff83c199730 0x7ff83c19dba0 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:51.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.359+0000 7ff843905700 1 -- 192.168.123.103:0/1546745726 >> 192.168.123.103:0/1546745726 conn(0x7ff83c076b30 msgr2=0x7ff83c0fec30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:51.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.360+0000 7ff843905700 1 -- 192.168.123.103:0/1546745726 shutdown_connections 2026-03-09T00:10:51.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.360+0000 7ff843905700 1 -- 192.168.123.103:0/1546745726 wait complete. 2026-03-09T00:10:51.368 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T00:10:51.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:51 vm06.local ceph-mon[106218]: pgmap v218: 65 pgs: 12 peering, 10 active+undersized, 4 active+undersized+degraded, 39 active+clean; 209 MiB data, 883 MiB used, 119 GiB / 120 GiB avail; 29 MiB/s rd, 9 op/s; 11/231 objects degraded (4.762%) 2026-03-09T00:10:51.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:51 vm06.local ceph-mon[106218]: mds.? 
[v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] up:reconnect 2026-03-09T00:10:51.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:51 vm06.local ceph-mon[106218]: fsmap cephfs:1/1 {0=cephfs.vm03.ralade=up:reconnect} 2 up:standby 2026-03-09T00:10:51.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:51 vm06.local ceph-mon[106218]: reconnect by client.24313 v1:192.168.144.1:0/748981855 after 0.00700001 2026-03-09T00:10:51.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:51 vm06.local ceph-mon[106218]: reconnect by client.24309 v1:192.168.123.103:0/929198358 after 0.00700001 2026-03-09T00:10:51.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.440+0000 7fd84762d700 1 -- 192.168.123.103:0/1456474928 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd840103340 msgr2=0x7fd840103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:51.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.440+0000 7fd84762d700 1 --2- 192.168.123.103:0/1456474928 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd840103340 0x7fd840103720 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7fd834009b00 tx=0x7fd834009e10 comp rx=0 tx=0).stop 2026-03-09T00:10:51.440 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.440+0000 7fd84762d700 1 -- 192.168.123.103:0/1456474928 shutdown_connections 2026-03-09T00:10:51.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.440+0000 7fd84762d700 1 --2- 192.168.123.103:0/1456474928 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd840103cf0 0x7fd840107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:51.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.440+0000 7fd84762d700 1 --2- 192.168.123.103:0/1456474928 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd840103340 0x7fd840103720 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:51.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.440+0000 7fd84762d700 1 -- 192.168.123.103:0/1456474928 >> 192.168.123.103:0/1456474928 conn(0x7fd8400feb90 msgr2=0x7fd840100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:51.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.440+0000 7fd84762d700 1 -- 192.168.123.103:0/1456474928 shutdown_connections 2026-03-09T00:10:51.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.440+0000 7fd84762d700 1 -- 192.168.123.103:0/1456474928 wait complete. 
2026-03-09T00:10:51.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.441+0000 7fd84762d700 1 Processor -- start
2026-03-09T00:10:51.441 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.441+0000 7fd84762d700 1 -- start start
2026-03-09T00:10:51.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.441+0000 7fd84762d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd840103340 0x7fd840198e60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:51.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.442+0000 7fd84762d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd840103cf0 0x7fd8401993a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:51.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.442+0000 7fd84762d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd8401999f0 con 0x7fd840103cf0
2026-03-09T00:10:51.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.442+0000 7fd84762d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd840199b30 con 0x7fd840103340
2026-03-09T00:10:51.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.442+0000 7fd844bc8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd840103cf0 0x7fd8401993a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:10:51.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.442+0000 7fd844bc8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd840103cf0 0x7fd8401993a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60484/0 (socket says 192.168.123.103:60484)
2026-03-09T00:10:51.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.442+0000 7fd844bc8700 1 -- 192.168.123.103:0/1471270260 learned_addr learned my addr 192.168.123.103:0/1471270260 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:10:51.442 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.442+0000 7fd8453c9700 1 --2- 192.168.123.103:0/1471270260 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd840103340 0x7fd840198e60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:10:51.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.443+0000 7fd8453c9700 1 -- 192.168.123.103:0/1471270260 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd840103cf0 msgr2=0x7fd8401993a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:51.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.443+0000 7fd8453c9700 1 --2- 192.168.123.103:0/1471270260 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd840103cf0 0x7fd8401993a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:51.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.443+0000 7fd8453c9700 1 -- 192.168.123.103:0/1471270260 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd8340097e0 con 0x7fd840103340
2026-03-09T00:10:51.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.443+0000 7fd844bc8700 1 --2- 192.168.123.103:0/1471270260 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd840103cf0 0x7fd8401993a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed!
2026-03-09T00:10:51.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.443+0000 7fd8453c9700 1 --2- 192.168.123.103:0/1471270260 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd840103340 0x7fd840198e60 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fd834006010 tx=0x7fd834004c30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:10:51.443 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.443+0000 7fd8327fc700 1 -- 192.168.123.103:0/1471270260 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd83401d070 con 0x7fd840103340
2026-03-09T00:10:51.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.443+0000 7fd84762d700 1 -- 192.168.123.103:0/1471270260 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd84019d920 con 0x7fd840103340
2026-03-09T00:10:51.444 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.444+0000 7fd84762d700 1 -- 192.168.123.103:0/1471270260 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd84019de10 con 0x7fd840103340
2026-03-09T00:10:51.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.444+0000 7fd8327fc700 1 -- 192.168.123.103:0/1471270260 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fd83400bc50 con 0x7fd840103340
2026-03-09T00:10:51.445 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.444+0000 7fd8327fc700 1 -- 192.168.123.103:0/1471270260 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd834021620 con 0x7fd840103340
2026-03-09T00:10:51.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.445+0000 7fd8327fc700 1 -- 192.168.123.103:0/1471270260 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd83402b430 con 0x7fd840103340
2026-03-09T00:10:51.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.445+0000 7fd84762d700 1 -- 192.168.123.103:0/1471270260 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd84010b690 con 0x7fd840103340
2026-03-09T00:10:51.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.446+0000 7fd8327fc700 1 --2- 192.168.123.103:0/1471270260 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd82c0778c0 0x7fd82c079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:51.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.446+0000 7fd8327fc700 1 -- 192.168.123.103:0/1471270260 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fd83409ad70 con 0x7fd840103340
2026-03-09T00:10:51.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.446+0000 7fd844bc8700 1 --2- 192.168.123.103:0/1471270260 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd82c0778c0 0x7fd82c079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:10:51.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.446+0000 7fd844bc8700 1 --2- 192.168.123.103:0/1471270260 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd82c0778c0 0x7fd82c079d80 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7fd84019a480 tx=0x7fd83c00a400 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:10:51.448 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.448+0000 7fd8327fc700 1 -- 192.168.123.103:0/1471270260 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd834063cb0 con 0x7fd840103340
2026-03-09T00:10:51.580 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.580+0000 7fd84762d700 1 -- 192.168.123.103:0/1471270260 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fd84019a1c0 con 0x7fd82c0778c0
2026-03-09T00:10:51.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.581+0000 7fd8327fc700 1 -- 192.168.123.103:0/1471270260 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7fd84019a1c0 con 0x7fd82c0778c0
2026-03-09T00:10:51.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.583+0000 7fd84762d700 1 -- 192.168.123.103:0/1471270260 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd82c0778c0 msgr2=0x7fd82c079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:51.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.583+0000 7fd84762d700 1 --2- 192.168.123.103:0/1471270260 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd82c0778c0 0x7fd82c079d80 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7fd84019a480 tx=0x7fd83c00a400 comp rx=0 tx=0).stop
2026-03-09T00:10:51.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.584+0000 7fd84762d700 1 -- 192.168.123.103:0/1471270260 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd840103340 msgr2=0x7fd840198e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:51.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.584+0000 7fd84762d700 1 --2- 192.168.123.103:0/1471270260 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd840103340 0x7fd840198e60 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fd834006010 tx=0x7fd834004c30 comp rx=0 tx=0).stop
2026-03-09T00:10:51.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.584+0000 7fd84762d700 1 -- 192.168.123.103:0/1471270260 shutdown_connections
2026-03-09T00:10:51.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.584+0000 7fd84762d700 1 --2- 192.168.123.103:0/1471270260 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd82c0778c0 0x7fd82c079d80 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:51.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.584+0000 7fd84762d700 1 --2- 192.168.123.103:0/1471270260 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd840103340 0x7fd840198e60 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:51.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.584+0000 7fd84762d700 1 --2- 192.168.123.103:0/1471270260 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd840103cf0 0x7fd8401993a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:51.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.584+0000 7fd84762d700 1 -- 192.168.123.103:0/1471270260 >> 192.168.123.103:0/1471270260 conn(0x7fd8400feb90 msgr2=0x7fd840100fb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:10:51.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.584+0000 7fd84762d700 1 -- 192.168.123.103:0/1471270260 shutdown_connections
2026-03-09T00:10:51.584 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.584+0000 7fd84762d700 1 -- 192.168.123.103:0/1471270260 wait complete.
2026-03-09T00:10:51.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.667+0000 7f56d395f700 1 -- 192.168.123.103:0/2506785710 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56cc103cf0 msgr2=0x7f56cc107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:51.667 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.667+0000 7f56d395f700 1 --2- 192.168.123.103:0/2506785710 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56cc103cf0 0x7f56cc107d40 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f56c8009b00 tx=0x7f56c8009e10 comp rx=0 tx=0).stop
2026-03-09T00:10:51.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.669+0000 7f56d395f700 1 -- 192.168.123.103:0/2506785710 shutdown_connections
2026-03-09T00:10:51.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.669+0000 7f56d395f700 1 --2- 192.168.123.103:0/2506785710 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56cc103cf0 0x7f56cc107d40 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:51.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.669+0000 7f56d395f700 1 --2- 192.168.123.103:0/2506785710 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56cc103340 0x7f56cc103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:51.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.669+0000 7f56d395f700 1 -- 192.168.123.103:0/2506785710 >> 192.168.123.103:0/2506785710 conn(0x7f56cc0feb90 msgr2=0x7f56cc100fb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:10:51.670 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.670+0000 7f56d395f700 1 -- 192.168.123.103:0/2506785710 shutdown_connections
2026-03-09T00:10:51.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.670+0000 7f56d395f700 1 -- 192.168.123.103:0/2506785710 wait complete.
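Each of these short-lived messengers is a single CLI invocation; the one torn down above carried the orch upgrade status call to the active mgr. As a sketch in the same shell-plus-jq style the suite uses, a staggered upgrade can be watched by polling that command until the module reports itself idle; the .in_progress field name matches current cephadm JSON output but should be treated as an assumption here:

    # Poll `ceph orch upgrade status` until cephadm reports no upgrade in flight.
    # The JSON field name (.in_progress) is assumed, not taken from this log.
    until [ "$(ceph orch upgrade status --format json | jq -r '.in_progress')" = "false" ]; do
        sleep 30
    done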
2026-03-09T00:10:51.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.671+0000 7f56d395f700 1 Processor -- start
2026-03-09T00:10:51.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.671+0000 7f56d395f700 1 -- start start
2026-03-09T00:10:51.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.671+0000 7f56d395f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56cc103340 0x7f56cc19d8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:51.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.671+0000 7f56d395f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56cc103cf0 0x7f56cc078320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:51.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.671+0000 7f56d395f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f56cc19dfb0 con 0x7f56cc103340
2026-03-09T00:10:51.671 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.671+0000 7f56d395f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f56cc19e120 con 0x7f56cc103cf0
2026-03-09T00:10:51.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.671+0000 7f56d16fb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56cc103340 0x7f56cc19d8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:10:51.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.671+0000 7f56d16fb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56cc103340 0x7f56cc19d8b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60496/0 (socket says 192.168.123.103:60496)
2026-03-09T00:10:51.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.671+0000 7f56d16fb700 1 -- 192.168.123.103:0/3241615026 learned_addr learned my addr 192.168.123.103:0/3241615026 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:10:51.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.672+0000 7f56d16fb700 1 -- 192.168.123.103:0/3241615026 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56cc103cf0 msgr2=0x7f56cc078320 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:51.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.672+0000 7f56d16fb700 1 --2- 192.168.123.103:0/3241615026 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56cc103cf0 0x7f56cc078320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:51.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.672+0000 7f56d16fb700 1 -- 192.168.123.103:0/3241615026 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f56bc009710 con 0x7f56cc103340
2026-03-09T00:10:51.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.672+0000 7f56d16fb700 1 --2- 192.168.123.103:0/3241615026 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56cc103340 0x7f56cc19d8b0 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f56bc00ec80 tx=0x7f56bc00c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:10:51.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.672+0000 7f56c27fc700 1 -- 192.168.123.103:0/3241615026 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f56bc00cd50 con 0x7f56cc103340
2026-03-09T00:10:51.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.672+0000 7f56d395f700 1 -- 192.168.123.103:0/3241615026 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f56c80097e0 con 0x7f56cc103340
2026-03-09T00:10:51.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.672+0000 7f56d395f700 1 -- 192.168.123.103:0/3241615026 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f56cc078d70 con 0x7f56cc103340
2026-03-09T00:10:51.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.673+0000 7f56c27fc700 1 -- 192.168.123.103:0/3241615026 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f56bc00ceb0 con 0x7f56cc103340
2026-03-09T00:10:51.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.673+0000 7f56c27fc700 1 -- 192.168.123.103:0/3241615026 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f56bc005320 con 0x7f56cc103340
2026-03-09T00:10:51.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.674+0000 7f56c27fc700 1 -- 192.168.123.103:0/3241615026 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f56bc005610 con 0x7f56cc103340
2026-03-09T00:10:51.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.674+0000 7f56d395f700 1 -- 192.168.123.103:0/3241615026 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f56b0005320 con 0x7f56cc103340
2026-03-09T00:10:51.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.679+0000 7f56c27fc700 1 --2- 192.168.123.103:0/3241615026 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f56b80778c0 0x7f56b8079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:51.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.680+0000 7f56c27fc700 1 -- 192.168.123.103:0/3241615026 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f56bc014070 con 0x7f56cc103340
2026-03-09T00:10:51.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.680+0000 7f56d0efa700 1 --2- 192.168.123.103:0/3241615026 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f56b80778c0 0x7f56b8079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:10:51.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.680+0000 7f56d0efa700 1 --2- 192.168.123.103:0/3241615026 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f56b80778c0 0x7f56b8079d80 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f56c800b5c0 tx=0x7f56c8005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:10:51.680 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.680+0000 7f56c27fc700 1 -- 192.168.123.103:0/3241615026 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f56bc09e380 con 0x7f56cc103340
2026-03-09T00:10:51.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.823+0000 7f56d395f700 1 -- 192.168.123.103:0/3241615026 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f56b0000bf0 con 0x7f56b80778c0
2026-03-09T00:10:51.831 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.831+0000 7f56c27fc700 1 -- 192.168.123.103:0/3241615026 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f56b0000bf0 con 0x7f56b80778c0
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (6m) 76s ago 11m 24.4M - 0.25.0 c8568f914cd2 6bc39b415ac6
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (11m) 76s ago 11m 9499k - 18.2.1 5be31c24972a b93f8a220f71
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (10m) 10s ago 10m 9369k - 18.2.1 5be31c24972a d06aea65065e
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (4m) 76s ago 11m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e a76700a1f6bf
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (4m) 10s ago 10m 7860k - 19.2.3-678-ge911bdeb 654f31e6858e ab35f6a843c1
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (5m) 76s ago 11m 85.2M - 10.4.0 c8b91775d855 00a3394cdec9
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (9m) 76s ago 9m 19.5M - 18.2.1 5be31c24972a 404501ca3f76
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (9m) 76s ago 9m 188M - 18.2.1 5be31c24972a b71cb8823eff
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (9m) 10s ago 9m 21.8M - 18.2.1 5be31c24972a 868f24dd3b07
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 running (9m) 10s ago 9m 18.2M - 18.2.1 5be31c24972a 84dbd6c37a69
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:8443,9283,8765 running (7m) 76s ago 11m 625M - 19.2.3-678-ge911bdeb 654f31e6858e 5c5f89207f88
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (6m) 10s ago 10m 491M - 19.2.3-678-ge911bdeb 654f31e6858e e3d70135f3ac
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (5m) 76s ago 11m 61.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cafe87ec117d
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (5m) 10s ago 10m 53.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 33df752aa193
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (6m) 76s ago 11m 9643k - 1.7.0 72c9c2088986 0cdd6e671b4f
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (6m) 10s ago 10m 9663k - 1.7.0 72c9c2088986 848c5c72973d
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (4m) 76s ago 10m 178M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7112eceae9ce
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (100s) 76s ago 10m 142M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e70d2f37c6d1
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (77s) 76s ago 10m 12.3M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e7841e7307ae
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (55s) 10s ago 9m 175M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 8e61be617139
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (33s) 10s ago 9m 143M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 21cf4dc58899
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (11s) 10s ago 9m 13.2M 4096M 19.2.3-678-ge911bdeb 654f31e6858e fbc950d55a67
2026-03-09T00:10:51.833 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (6m) 76s ago 10m 58.0M - 2.51.0 1d3b7f56885b 16d6071e49fb
2026-03-09T00:10:51.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.835+0000 7f56b7fff700 1 -- 192.168.123.103:0/3241615026 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f56b80778c0 msgr2=0x7f56b8079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:51.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.835+0000 7f56b7fff700 1 --2- 192.168.123.103:0/3241615026 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f56b80778c0 0x7f56b8079d80 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f56c800b5c0 tx=0x7f56c8005fb0 comp rx=0 tx=0).stop
2026-03-09T00:10:51.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.835+0000 7f56b7fff700 1 -- 192.168.123.103:0/3241615026 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56cc103340 msgr2=0x7f56cc19d8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:51.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.835+0000 7f56b7fff700 1 --2- 192.168.123.103:0/3241615026 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56cc103340 0x7f56cc19d8b0 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f56bc00ec80 tx=0x7f56bc00c5b0 comp rx=0 tx=0).stop
2026-03-09T00:10:51.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.836+0000 7f56b7fff700 1 -- 192.168.123.103:0/3241615026 shutdown_connections
2026-03-09T00:10:51.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.836+0000 7f56b7fff700 1 --2- 192.168.123.103:0/3241615026 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f56b80778c0 0x7f56b8079d80 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:51.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.836+0000 7f56b7fff700 1 --2- 192.168.123.103:0/3241615026 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f56cc103340 0x7f56cc19d8b0 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:51.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.836+0000 7f56b7fff700 1 --2- 192.168.123.103:0/3241615026 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f56cc103cf0 0x7f56cc078320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:51.836 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.836+0000 7f56b7fff700 1 -- 192.168.123.103:0/3241615026 >> 192.168.123.103:0/3241615026 conn(0x7f56cc0feb90 msgr2=0x7f56cc100130 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:10:51.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.836+0000 7f56b7fff700 1 -- 192.168.123.103:0/3241615026 shutdown_connections
2026-03-09T00:10:51.837 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.836+0000 7f56b7fff700 1 -- 192.168.123.103:0/3241615026 wait complete.
2026-03-09T00:10:51.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.922+0000 7f0b68daa700 1 -- 192.168.123.103:0/1036756657 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b64107d90 msgr2=0x7f0b641081f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:51.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.922+0000 7f0b68daa700 1 --2- 192.168.123.103:0/1036756657 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b64107d90 0x7f0b641081f0 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f0b5c00b3a0 tx=0x7f0b5c00b6b0 comp rx=0 tx=0).stop
2026-03-09T00:10:51.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.927+0000 7f0b68daa700 1 -- 192.168.123.103:0/1036756657 shutdown_connections
2026-03-09T00:10:51.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.927+0000 7f0b68daa700 1 --2- 192.168.123.103:0/1036756657 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b64107d90 0x7f0b641081f0 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:51.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.927+0000 7f0b68daa700 1 --2- 192.168.123.103:0/1036756657 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b6410d310 0x7f0b6410d6f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:51.927 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.927+0000 7f0b68daa700 1 -- 192.168.123.103:0/1036756657 >> 192.168.123.103:0/1036756657 conn(0x7f0b6406ce20 msgr2=0x7f0b6406d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:10:51.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.927+0000 7f0b68daa700 1 -- 192.168.123.103:0/1036756657 shutdown_connections
2026-03-09T00:10:51.928 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.927+0000 7f0b68daa700 1 -- 192.168.123.103:0/1036756657 wait complete.
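The ceph orch ps listing above is the interesting part of this poll: every mon, mgr, osd, and crash daemon already runs 19.2.3-678-ge911bdeb while the four mds daemons still run 18.2.1, exactly the shape a staggered upgrade should have before its MDS step. Rather than eyeballing the table, the same check can be made against the JSON form; .version is the usual DaemonDescription field in ceph orch ps --format json output, assumed here rather than shown in this log:

    # Summarise running daemons by version to see how far the upgrade has moved.
    # Field names are assumptions based on the DaemonDescription JSON schema.
    ceph orch ps --format json \
        | jq -r 'group_by(.version) | .[] | "\(.[0].version // "unknown"): \(length) daemon(s)"'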
2026-03-09T00:10:51.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.927+0000 7f0b68daa700 1 Processor -- start
2026-03-09T00:10:51.929 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.927+0000 7f0b68daa700 1 -- start start
2026-03-09T00:10:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.928+0000 7f0b68daa700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b6410d310 0x7f0b6407ce70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.928+0000 7f0b68daa700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b6407d3b0 0x7f0b6407d830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.928+0000 7f0b68daa700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0b640819f0 con 0x7f0b6410d310
2026-03-09T00:10:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.928+0000 7f0b68daa700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0b64081b60 con 0x7f0b6407d3b0
2026-03-09T00:10:51.932 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.928+0000 7f0b62ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b6407d3b0 0x7f0b6407d830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:10:51.933 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.928+0000 7f0b637fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b6410d310 0x7f0b6407ce70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:10:51.933 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.928+0000 7f0b637fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b6410d310 0x7f0b6407ce70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60524/0 (socket says 192.168.123.103:60524)
2026-03-09T00:10:51.933 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.928+0000 7f0b637fe700 1 -- 192.168.123.103:0/107122183 learned_addr learned my addr 192.168.123.103:0/107122183 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:10:51.933 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.929+0000 7f0b62ffd700 1 -- 192.168.123.103:0/107122183 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b6410d310 msgr2=0x7f0b6407ce70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:51.933 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.929+0000 7f0b62ffd700 1 --2- 192.168.123.103:0/107122183 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b6410d310 0x7f0b6407ce70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:51.934 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.929+0000 7f0b62ffd700 1 -- 192.168.123.103:0/107122183 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0b5c00b050 con 0x7f0b6407d3b0
2026-03-09T00:10:51.934 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.932+0000 7f0b62ffd700 1 --2- 192.168.123.103:0/107122183 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b6407d3b0 0x7f0b6407d830 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f0b5c000f80 tx=0x7f0b5c007b60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:10:51.934 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.933+0000 7f0b60ff9700 1 -- 192.168.123.103:0/107122183 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0b5c00e050 con 0x7f0b6407d3b0
2026-03-09T00:10:51.934 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.933+0000 7f0b60ff9700 1 -- 192.168.123.103:0/107122183 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f0b5c003cd0 con 0x7f0b6407d3b0
2026-03-09T00:10:51.934 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.933+0000 7f0b60ff9700 1 -- 192.168.123.103:0/107122183 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0b5c01b9b0 con 0x7f0b6407d3b0
2026-03-09T00:10:51.934 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.933+0000 7f0b68daa700 1 -- 192.168.123.103:0/107122183 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0b64081de0 con 0x7f0b6407d3b0
2026-03-09T00:10:51.935 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.933+0000 7f0b68daa700 1 -- 192.168.123.103:0/107122183 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0b640822d0 con 0x7f0b6407d3b0
2026-03-09T00:10:51.935 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.935+0000 7f0b60ff9700 1 -- 192.168.123.103:0/107122183 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0b5c019070 con 0x7f0b6407d3b0
2026-03-09T00:10:51.935 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.935+0000 7f0b60ff9700 1 --2- 192.168.123.103:0/107122183 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0b4c077910 0x7f0b4c079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:51.936 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.935+0000 7f0b60ff9700 1 -- 192.168.123.103:0/107122183 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f0b5c099c60 con 0x7f0b6407d3b0
2026-03-09T00:10:51.936 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.936+0000 7f0b637fe700 1 --2- 192.168.123.103:0/107122183 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0b4c077910 0x7f0b4c079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:10:51.936 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.935+0000 7f0b68daa700 1 -- 192.168.123.103:0/107122183 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0b6404f2e0 con 0x7f0b6407d3b0
2026-03-09T00:10:51.937 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.936+0000 7f0b637fe700 1 --2- 192.168.123.103:0/107122183 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0b4c077910 0x7f0b4c079dd0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f0b54005950 tx=0x7f0b54016040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:10:51.944 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:51.941+0000 7f0b60ff9700 1 -- 192.168.123.103:0/107122183 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0b5c0624c0 con 0x7f0b6407d3b0
2026-03-09T00:10:52.154 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.153+0000 7f0b68daa700 1 -- 192.168.123.103:0/107122183 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f0b6407e440 con 0x7f0b6407d3b0
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.157+0000 7f0b60ff9700 1 -- 192.168.123.103:0/107122183 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+709 (secure 0 0 0) 0x7f0b5c017070 con 0x7f0b6407d3b0
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:    "mon": {
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:    },
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:    "mgr": {
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:    },
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:    "osd": {
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:    },
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:    "mds": {
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:        "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 3
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:    },
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:    "overall": {
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:        "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 3,
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 10
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:    }
2026-03-09T00:10:52.159 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:10:52.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.161+0000 7f0b4a7fc700 1 -- 192.168.123.103:0/107122183 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0b4c077910 msgr2=0x7f0b4c079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:52.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.161+0000 7f0b4a7fc700 1 --2- 192.168.123.103:0/107122183 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0b4c077910 0x7f0b4c079dd0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f0b54005950 tx=0x7f0b54016040 comp rx=0 tx=0).stop
2026-03-09T00:10:52.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.161+0000 7f0b4a7fc700 1 -- 192.168.123.103:0/107122183 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b6407d3b0 msgr2=0x7f0b6407d830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:52.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.161+0000 7f0b4a7fc700 1 --2- 192.168.123.103:0/107122183 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b6407d3b0 0x7f0b6407d830 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f0b5c000f80 tx=0x7f0b5c007b60 comp rx=0 tx=0).stop
2026-03-09T00:10:52.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.161+0000 7f0b4a7fc700 1 -- 192.168.123.103:0/107122183 shutdown_connections
2026-03-09T00:10:52.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.161+0000 7f0b4a7fc700 1 --2- 192.168.123.103:0/107122183 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0b4c077910 0x7f0b4c079dd0 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.161+0000 7f0b4a7fc700 1 --2- 192.168.123.103:0/107122183 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0b6410d310 0x7f0b6407ce70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.161+0000 7f0b4a7fc700 1 --2- 192.168.123.103:0/107122183 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0b6407d3b0 0x7f0b6407d830 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.161+0000 7f0b4a7fc700 1 -- 192.168.123.103:0/107122183 >> 192.168.123.103:0/107122183 conn(0x7f0b6406ce20 msgr2=0x7f0b64071130 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:10:52.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.161+0000 7f0b4a7fc700 1 -- 192.168.123.103:0/107122183 shutdown_connections
2026-03-09T00:10:52.165 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.165+0000 7f0b4a7fc700 1 -- 192.168.123.103:0/107122183 wait complete.
2026-03-09T00:10:52.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:52 vm03.local ceph-mon[129670]: mds.? [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] up:rejoin
2026-03-09T00:10:52.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:52 vm03.local ceph-mon[129670]: fsmap cephfs:1/1 {0=cephfs.vm03.ralade=up:rejoin} 2 up:standby
2026-03-09T00:10:52.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:52 vm03.local ceph-mon[129670]: daemon mds.cephfs.vm03.ralade is now active in filesystem cephfs as rank 0
2026-03-09T00:10:52.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:52 vm03.local ceph-mon[129670]: from='client.34382 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:10:52.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:52 vm03.local ceph-mon[129670]: from='client.44275 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:10:52.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:52 vm03.local ceph-mon[129670]: pgmap v219: 65 pgs: 12 peering, 53 active+clean; 209 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 24 MiB/s rd, 7 op/s
2026-03-09T00:10:52.234 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:52 vm03.local ceph-mon[129670]: from='client.34390 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:10:52.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.232+0000 7fa944751700 1 -- 192.168.123.103:0/3011498391 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa93c103340 msgr2=0x7fa93c103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:52.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.232+0000 7fa944751700 1 --2- 192.168.123.103:0/3011498391 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa93c103340 0x7fa93c103720 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7fa938009b50 tx=0x7fa938009e60 comp rx=0 tx=0).stop
2026-03-09T00:10:52.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.234+0000 7fa944751700 1 -- 192.168.123.103:0/3011498391 shutdown_connections
2026-03-09T00:10:52.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.234+0000 7fa944751700 1 --2- 192.168.123.103:0/3011498391 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa93c103cf0 0x7fa93c107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.234+0000 7fa944751700 1 --2- 192.168.123.103:0/3011498391 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa93c103340 0x7fa93c103720 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.234+0000 7fa944751700 1 -- 192.168.123.103:0/3011498391 >> 192.168.123.103:0/3011498391 conn(0x7fa93c0feb90 msgr2=0x7fa93c100fb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:10:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.234+0000 7fa944751700 1 -- 192.168.123.103:0/3011498391 shutdown_connections
2026-03-09T00:10:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.234+0000 7fa944751700 1 -- 192.168.123.103:0/3011498391 wait complete.
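A moment earlier the ceph versions breakdown put 3 mds daemons on reef 18.2.1 against 10 squid daemons overall, and the mon log above now shows mds.cephfs.vm03.ralade finishing up:rejoin and becoming active as rank 0. Once the MDS phase of the upgrade has run, the same map gives a natural final check; a sketch, relying only on ceph versions printing its per-service version map as JSON by default:

    # Fail if any MDS still reports a reef (18.2.x) build after the upgrade.
    # Keys of .mds are full "ceph version ..." strings, as seen in the log above.
    if ceph versions | jq -e '.mds | keys[] | select(test("ceph version 18\\.2\\."))' >/dev/null; then
        echo "mds daemons still on reef" >&2
        exit 1
    fi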
2026-03-09T00:10:52.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.235+0000 7fa944751700 1 Processor -- start
2026-03-09T00:10:52.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.235+0000 7fa944751700 1 -- start start
2026-03-09T00:10:52.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.236+0000 7fa944751700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa93c103340 0x7fa93c198e60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:52.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.236+0000 7fa944751700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa93c103cf0 0x7fa93c1993a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:52.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.236+0000 7fa944751700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa93c199a80 con 0x7fa93c103340
2026-03-09T00:10:52.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.236+0000 7fa944751700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa93c19d810 con 0x7fa93c103cf0
2026-03-09T00:10:52.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.236+0000 7fa941cec700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa93c103cf0 0x7fa93c1993a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:10:52.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.236+0000 7fa941cec700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa93c103cf0 0x7fa93c1993a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:34384/0 (socket says 192.168.123.103:34384)
2026-03-09T00:10:52.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.236+0000 7fa941cec700 1 -- 192.168.123.103:0/999279839 learned_addr learned my addr 192.168.123.103:0/999279839 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:10:52.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.236+0000 7fa9424ed700 1 --2- 192.168.123.103:0/999279839 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa93c103340 0x7fa93c198e60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:10:52.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.237+0000 7fa941cec700 1 -- 192.168.123.103:0/999279839 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa93c103340 msgr2=0x7fa93c198e60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:52.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.237+0000 7fa941cec700 1 --2- 192.168.123.103:0/999279839 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa93c103340 0x7fa93c198e60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.237+0000 7fa941cec700 1 -- 192.168.123.103:0/999279839 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa9380097e0 con 0x7fa93c103cf0
2026-03-09T00:10:52.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.237+0000 7fa9424ed700 1 --2- 192.168.123.103:0/999279839 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa93c103340 0x7fa93c198e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-09T00:10:52.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.237+0000 7fa941cec700 1 --2- 192.168.123.103:0/999279839 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa93c103cf0 0x7fa93c1993a0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fa92c00d900 tx=0x7fa92c00dcc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:10:52.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.237+0000 7fa9337fe700 1 -- 192.168.123.103:0/999279839 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa92c0098e0 con 0x7fa93c103cf0
2026-03-09T00:10:52.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.237+0000 7fa944751700 1 -- 192.168.123.103:0/999279839 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa93c19daf0 con 0x7fa93c103cf0
2026-03-09T00:10:52.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.237+0000 7fa944751700 1 -- 192.168.123.103:0/999279839 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa93c19e040 con 0x7fa93c103cf0
2026-03-09T00:10:52.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.237+0000 7fa9337fe700 1 -- 192.168.123.103:0/999279839 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa92c010460 con 0x7fa93c103cf0
2026-03-09T00:10:52.238 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.237+0000 7fa9337fe700 1 -- 192.168.123.103:0/999279839 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa92c00f5d0 con 0x7fa93c103cf0
2026-03-09T00:10:52.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.239+0000 7fa9337fe700 1 -- 192.168.123.103:0/999279839 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa92c009a40 con 0x7fa93c103cf0
2026-03-09T00:10:52.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.239+0000 7fa9337fe700 1 --2- 192.168.123.103:0/999279839 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa928077920 0x7fa928079de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:52.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.239+0000 7fa9337fe700 1 -- 192.168.123.103:0/999279839 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fa92c0994b0 con 0x7fa93c103cf0
2026-03-09T00:10:52.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.239+0000 7fa944751700 1 -- 192.168.123.103:0/999279839 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa920005320 con 0x7fa93c103cf0
2026-03-09T00:10:52.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.239+0000 7fa9424ed700 1 --2- 192.168.123.103:0/999279839 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa928077920 0x7fa928079de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:10:52.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.240+0000 7fa9424ed700 1 --2- 192.168.123.103:0/999279839 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa928077920 0x7fa928079de0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fa93800b5c0 tx=0x7fa938005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:10:52.243 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.243+0000 7fa9337fe700 1 -- 192.168.123.103:0/999279839 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa92c0615c0 con 0x7fa93c103cf0
2026-03-09T00:10:52.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.382+0000 7fa944751700 1 -- 192.168.123.103:0/999279839 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fa920006200 con 0x7fa93c103cf0
2026-03-09T00:10:52.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.383+0000 7fa9337fe700 1 -- 192.168.123.103:0/999279839 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 16 v16) v1 ==== 76+0+1754 (secure 0 0 0) 0x7fa92c020070 con 0x7fa93c103cf0
2026-03-09T00:10:52.384 INFO:teuthology.orchestra.run.vm03.stdout:e16
2026-03-09T00:10:52.384 INFO:teuthology.orchestra.run.vm03.stdout:btime 2026-03-09T00:10:52.056595+0000
2026-03-09T00:10:52.384 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-09T00:10:52.384 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:10:52.384 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1
2026-03-09T00:10:52.384 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:10:52.384 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1)
2026-03-09T00:10:52.384 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs
2026-03-09T00:10:52.384 INFO:teuthology.orchestra.run.vm03.stdout:epoch 16
2026-03-09T00:10:52.384 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-09T00:10:52.384 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T00:01:42.952984+0000
2026-03-09T00:10:52.384 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T00:10:52.056592+0000
2026-03-09T00:10:52.384 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0
2026-03-09T00:10:52.384 INFO:teuthology.orchestra.run.vm03.stdout:root 0
2026-03-09T00:10:52.384 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {}
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 103
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:in 0
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:up {0=14492}
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:failed
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:damaged
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:stopped
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3]
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 14492 members: 14492
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.ralade{0:14492} state up:active seq 139 join_fscid=1 addr [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons:
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.ixduim{-1:24287} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:10:52.385 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.vlrwtl{-1:24297} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6824/1090058295,v1:192.168.123.106:6825/1090058295] compat {c=[1],r=[1],i=[7ff]}]
2026-03-09T00:10:52.386 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.386+0000 7fa944751700 1 -- 192.168.123.103:0/999279839 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa928077920 msgr2=0x7fa928079de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:52.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.386+0000 7fa944751700 1 --2- 192.168.123.103:0/999279839 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa928077920 0x7fa928079de0 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fa93800b5c0 tx=0x7fa938005fb0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.386+0000 7fa944751700 1 -- 192.168.123.103:0/999279839 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa93c103cf0 msgr2=0x7fa93c1993a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:52.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.386+0000 7fa944751700 1 --2-
192.168.123.103:0/999279839 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa93c103cf0 0x7fa93c1993a0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fa92c00d900 tx=0x7fa92c00dcc0 comp rx=0 tx=0).stop 2026-03-09T00:10:52.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.387+0000 7fa944751700 1 -- 192.168.123.103:0/999279839 shutdown_connections 2026-03-09T00:10:52.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.387+0000 7fa944751700 1 --2- 192.168.123.103:0/999279839 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa928077920 0x7fa928079de0 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:52.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.387+0000 7fa944751700 1 --2- 192.168.123.103:0/999279839 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa93c103340 0x7fa93c198e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:52.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.387+0000 7fa944751700 1 --2- 192.168.123.103:0/999279839 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa93c103cf0 0x7fa93c1993a0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:52.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.387+0000 7fa944751700 1 -- 192.168.123.103:0/999279839 >> 192.168.123.103:0/999279839 conn(0x7fa93c0feb90 msgr2=0x7fa93c100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:52.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.387+0000 7fa944751700 1 -- 192.168.123.103:0/999279839 shutdown_connections 2026-03-09T00:10:52.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.387+0000 7fa944751700 1 -- 192.168.123.103:0/999279839 wait complete. 2026-03-09T00:10:52.407 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 16 2026-03-09T00:10:52.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:52 vm06.local ceph-mon[106218]: mds.? 
[v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] up:rejoin 2026-03-09T00:10:52.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:52 vm06.local ceph-mon[106218]: fsmap cephfs:1/1 {0=cephfs.vm03.ralade=up:rejoin} 2 up:standby 2026-03-09T00:10:52.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:52 vm06.local ceph-mon[106218]: daemon mds.cephfs.vm03.ralade is now active in filesystem cephfs as rank 0 2026-03-09T00:10:52.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:52 vm06.local ceph-mon[106218]: from='client.34382 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:10:52.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:52 vm06.local ceph-mon[106218]: from='client.44275 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:10:52.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:52 vm06.local ceph-mon[106218]: pgmap v219: 65 pgs: 12 peering, 53 active+clean; 209 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 24 MiB/s rd, 7 op/s 2026-03-09T00:10:52.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:52 vm06.local ceph-mon[106218]: from='client.34390 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:10:52.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.458+0000 7fe4573e0700 1 -- 192.168.123.103:0/1545732211 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe448096100 msgr2=0x7fe4480964e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:10:52.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.458+0000 7fe4573e0700 1 --2- 192.168.123.103:0/1545732211 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe448096100 0x7fe4480964e0 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7fe44c009b00 tx=0x7fe44c009e10 comp rx=0 tx=0).stop 2026-03-09T00:10:52.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.461+0000 7fe4573e0700 1 -- 192.168.123.103:0/1545732211 shutdown_connections 2026-03-09T00:10:52.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.461+0000 7fe4573e0700 1 --2- 192.168.123.103:0/1545732211 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe448096ab0 0x7fe44809a9c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:52.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.461+0000 7fe4573e0700 1 --2- 192.168.123.103:0/1545732211 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe448096100 0x7fe4480964e0 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:10:52.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.461+0000 7fe4573e0700 1 -- 192.168.123.103:0/1545732211 >> 192.168.123.103:0/1545732211 conn(0x7fe44800b920 msgr2=0x7fe44800bd30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:10:52.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.464+0000 7fe4573e0700 1 -- 192.168.123.103:0/1545732211 shutdown_connections 2026-03-09T00:10:52.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.464+0000 7fe4573e0700 1 -- 192.168.123.103:0/1545732211 wait complete. 
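The plain-text dump above is what the task reads back; a scripted check can get the same fields from the JSON form of the command. A minimal sketch, not part of the run, assuming the usual fs dump JSON layout in which each element of filesystems[] carries an mdsmap with max_mds, in, and inline_data (exact field names can vary by release):

    # Assert what the dump above shows: one active rank wanted, one rank in, inline_data on.
    ceph --format=json fs dump | jq -e '
      .filesystems[0].mdsmap
      | (.max_mds == 1) and ((.in | length) == 1) and (.inline_data == true)'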
2026-03-09T00:10:52.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.466+0000 7fe4573e0700 1 Processor -- start
2026-03-09T00:10:52.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.466+0000 7fe4573e0700 1 -- start start
2026-03-09T00:10:52.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.467+0000 7fe4573e0700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe448096100 0x7fe44812f660 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:52.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.467+0000 7fe4573e0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe448096ab0 0x7fe44812fba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:52.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.467+0000 7fe4573e0700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe448130230 con 0x7fe448096100
2026-03-09T00:10:52.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.467+0000 7fe4573e0700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe4481296e0 con 0x7fe448096ab0
2026-03-09T00:10:52.481 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.467+0000 7fe4563de700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe448096100 0x7fe44812f660 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:10:52.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.467+0000 7fe4563de700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe448096100 0x7fe44812f660 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60568/0 (socket says 192.168.123.103:60568)
2026-03-09T00:10:52.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.467+0000 7fe4563de700 1 -- 192.168.123.103:0/3739674353 learned_addr learned my addr 192.168.123.103:0/3739674353 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:10:52.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.468+0000 7fe4563de700 1 -- 192.168.123.103:0/3739674353 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe448096ab0 msgr2=0x7fe44812fba0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down
2026-03-09T00:10:52.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.468+0000 7fe4563de700 1 --2- 192.168.123.103:0/3739674353 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe448096ab0 0x7fe44812fba0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.468+0000 7fe4563de700 1 -- 192.168.123.103:0/3739674353 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe44c0097e0 con 0x7fe448096100
2026-03-09T00:10:52.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.468+0000 7fe4563de700 1 --2- 192.168.123.103:0/3739674353 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe448096100 0x7fe44812f660 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7fe44c009ad0 tx=0x7fe44c00bab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:10:52.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.468+0000 7fe4477fe700 1 -- 192.168.123.103:0/3739674353 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe44c01d070 con 0x7fe448096100
2026-03-09T00:10:52.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.468+0000 7fe4477fe700 1 -- 192.168.123.103:0/3739674353 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe44c00f460 con 0x7fe448096100
2026-03-09T00:10:52.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.468+0000 7fe4573e0700 1 -- 192.168.123.103:0/3739674353 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe448129960 con 0x7fe448096100
2026-03-09T00:10:52.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.468+0000 7fe4573e0700 1 -- 192.168.123.103:0/3739674353 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe448129e50 con 0x7fe448096100
2026-03-09T00:10:52.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.469+0000 7fe4477fe700 1 -- 192.168.123.103:0/3739674353 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe44c021620 con 0x7fe448096100
2026-03-09T00:10:52.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.470+0000 7fe4477fe700 1 -- 192.168.123.103:0/3739674353 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe44c02b430 con 0x7fe448096100
2026-03-09T00:10:52.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.470+0000 7fe4477fe700 1 --2- 192.168.123.103:0/3739674353 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe43c07bdf0 0x7fe43c07e2b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:52.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.470+0000 7fe4477fe700 1 -- 192.168.123.103:0/3739674353 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fe44c09bd80 con 0x7fe448096100
2026-03-09T00:10:52.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.477+0000 7fe455bdd700 1 --2- 192.168.123.103:0/3739674353 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe43c07bdf0 0x7fe43c07e2b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:10:52.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.477+0000 7fe455bdd700 1 --2- 192.168.123.103:0/3739674353 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe43c07bdf0 0x7fe43c07e2b0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7fe44812ab50 tx=0x7fe440009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:10:52.482 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.477+0000 7fe4573e0700 1 -- 192.168.123.103:0/3739674353 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe434005320 con 0x7fe448096100
2026-03-09T00:10:52.483 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.480+0000 7fe4477fe700 1 -- 192.168.123.103:0/3739674353 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe44c0645e0 con 0x7fe448096100
2026-03-09T00:10:52.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.607+0000 7fe4573e0700 1 -- 192.168.123.103:0/3739674353 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fe434000bf0 con 0x7fe43c07bdf0
2026-03-09T00:10:52.609 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.608+0000 7fe4477fe700 1 -- 192.168.123.103:0/3739674353 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7fe434000bf0 con 0x7fe43c07bdf0
2026-03-09T00:10:52.609 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:10:52.609 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T00:10:52.609 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true,
2026-03-09T00:10:52.609 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-09T00:10:52.609 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [
2026-03-09T00:10:52.609 INFO:teuthology.orchestra.run.vm03.stdout: "crash",
2026-03-09T00:10:52.609 INFO:teuthology.orchestra.run.vm03.stdout: "mon",
2026-03-09T00:10:52.609 INFO:teuthology.orchestra.run.vm03.stdout: "osd",
2026-03-09T00:10:52.609 INFO:teuthology.orchestra.run.vm03.stdout: "mgr"
2026-03-09T00:10:52.609 INFO:teuthology.orchestra.run.vm03.stdout: ],
2026-03-09T00:10:52.609 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "12/23 daemons upgraded",
2026-03-09T00:10:52.609 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading mds daemons",
2026-03-09T00:10:52.609 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false
2026-03-09T00:10:52.609 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:10:52.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.612+0000 7fe4573e0700 1 -- 192.168.123.103:0/3739674353 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe43c07bdf0 msgr2=0x7fe43c07e2b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:52.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.612+0000 7fe4573e0700 1 --2- 192.168.123.103:0/3739674353 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe43c07bdf0 0x7fe43c07e2b0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7fe44812ab50 tx=0x7fe440009450 comp rx=0 tx=0).stop
2026-03-09T00:10:52.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.612+0000 7fe4573e0700 1 -- 192.168.123.103:0/3739674353 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe448096100 msgr2=0x7fe44812f660 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:52.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.612+0000 7fe4573e0700 1 --2- 192.168.123.103:0/3739674353 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe448096100 0x7fe44812f660 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7fe44c009ad0 tx=0x7fe44c00bab0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.612+0000 7fe4573e0700 1 -- 192.168.123.103:0/3739674353 shutdown_connections
2026-03-09T00:10:52.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.612+0000 7fe4573e0700 1 --2- 192.168.123.103:0/3739674353 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe43c07bdf0 0x7fe43c07e2b0 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.612+0000 7fe4573e0700 1 --2- 192.168.123.103:0/3739674353 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe448096100 0x7fe44812f660 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.612+0000 7fe4573e0700 1 --2- 192.168.123.103:0/3739674353 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe448096ab0 0x7fe44812fba0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.612+0000 7fe4573e0700 1 -- 192.168.123.103:0/3739674353 >> 192.168.123.103:0/3739674353 conn(0x7fe44800b920 msgr2=0x7fe448094a10 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:10:52.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.612+0000 7fe4573e0700 1 -- 192.168.123.103:0/3739674353 shutdown_connections
2026-03-09T00:10:52.613 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.612+0000 7fe4573e0700 1 -- 192.168.123.103:0/3739674353 wait complete.
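The status JSON above carries in_progress, services_complete, progress, and message; a caller that wants to block until the staggered upgrade finishes can poll exactly those fields. A minimal sketch, not part of the run, relying only on the fields shown above:

    # Poll until the orchestrator reports the upgrade is no longer in progress.
    while ceph orch upgrade status | jq -e '.in_progress' >/dev/null; do
        ceph orch upgrade status | jq -r '"\(.progress) - \(.message)"'
        sleep 30
    done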
2026-03-09T00:10:52.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.705+0000 7f1d7f9da700 1 -- 192.168.123.103:0/464595133 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d78100990 msgr2=0x7f1d781049e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:52.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.705+0000 7f1d7f9da700 1 --2- 192.168.123.103:0/464595133 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d78100990 0x7f1d781049e0 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f1d6c009b00 tx=0x7f1d6c009e10 comp rx=0 tx=0).stop
2026-03-09T00:10:52.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.706+0000 7f1d7f9da700 1 -- 192.168.123.103:0/464595133 shutdown_connections
2026-03-09T00:10:52.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.706+0000 7f1d7f9da700 1 --2- 192.168.123.103:0/464595133 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d78100990 0x7f1d781049e0 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.706+0000 7f1d7f9da700 1 --2- 192.168.123.103:0/464595133 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1d780fffe0 0x7f1d781003c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.706+0000 7f1d7f9da700 1 -- 192.168.123.103:0/464595133 >> 192.168.123.103:0/464595133 conn(0x7f1d780fb830 msgr2=0x7f1d780fdc50 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:10:52.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.706+0000 7f1d7f9da700 1 -- 192.168.123.103:0/464595133 shutdown_connections
2026-03-09T00:10:52.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.706+0000 7f1d7f9da700 1 -- 192.168.123.103:0/464595133 wait complete.
2026-03-09T00:10:52.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.706+0000 7f1d7f9da700 1 Processor -- start
2026-03-09T00:10:52.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.706+0000 7f1d7f9da700 1 -- start start
2026-03-09T00:10:52.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.706+0000 7f1d7f9da700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1d780fffe0 0x7f1d78072960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:52.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.706+0000 7f1d7f9da700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d78100990 0x7f1d7806d960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:52.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.706+0000 7f1d7f9da700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1d7806dea0 con 0x7f1d78100990
2026-03-09T00:10:52.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.706+0000 7f1d7f9da700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1d7806e010 con 0x7f1d780fffe0
2026-03-09T00:10:52.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.707+0000 7f1d7cf75700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d78100990 0x7f1d7806d960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:10:52.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.707+0000 7f1d7cf75700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d78100990 0x7f1d7806d960 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:60596/0 (socket says 192.168.123.103:60596)
2026-03-09T00:10:52.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.707+0000 7f1d7cf75700 1 -- 192.168.123.103:0/2127688366 learned_addr learned my addr 192.168.123.103:0/2127688366 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:10:52.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.707+0000 7f1d7cf75700 1 -- 192.168.123.103:0/2127688366 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1d780fffe0 msgr2=0x7f1d78072960 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-09T00:10:52.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.707+0000 7f1d7cf75700 1 --2- 192.168.123.103:0/2127688366 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1d780fffe0 0x7f1d78072960 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.707+0000 7f1d7cf75700 1 -- 192.168.123.103:0/2127688366 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1d6c0097e0 con 0x7f1d78100990
2026-03-09T00:10:52.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.707+0000 7f1d7cf75700 1 --2- 192.168.123.103:0/2127688366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d78100990 0x7f1d7806d960 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f1d6c00b5c0 tx=0x7f1d6c004a00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:10:52.708 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.708+0000 7f1d6a7fc700 1 -- 192.168.123.103:0/2127688366 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1d6c01d070 con 0x7f1d78100990
2026-03-09T00:10:52.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.708+0000 7f1d7f9da700 1 -- 192.168.123.103:0/2127688366 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1d7806e290 con 0x7f1d78100990
2026-03-09T00:10:52.709 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.708+0000 7f1d7f9da700 1 -- 192.168.123.103:0/2127688366 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1d7806e780 con 0x7f1d78100990
2026-03-09T00:10:52.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.708+0000 7f1d6a7fc700 1 -- 192.168.123.103:0/2127688366 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f1d6c00bc50 con 0x7f1d78100990
2026-03-09T00:10:52.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.708+0000 7f1d6a7fc700 1 -- 192.168.123.103:0/2127688366 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1d6c017610 con 0x7f1d78100990
2026-03-09T00:10:52.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.709+0000 7f1d6a7fc700 1 -- 192.168.123.103:0/2127688366 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1d6c017900 con 0x7f1d78100990
2026-03-09T00:10:52.710 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.710+0000 7f1d6a7fc700 1 --2- 192.168.123.103:0/2127688366 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1d64077910 0x7f1d64079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:10:52.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.710+0000 7f1d6a7fc700 1 -- 192.168.123.103:0/2127688366 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(106..106 src has 1..106) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f1d6c09c270 con 0x7f1d78100990
2026-03-09T00:10:52.711 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.711+0000 7f1d7d776700 1 --2- 192.168.123.103:0/2127688366 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1d64077910 0x7f1d64079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:10:52.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.711+0000 7f1d7f9da700 1 -- 192.168.123.103:0/2127688366 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1d7804ea90 con 0x7f1d78100990
2026-03-09T00:10:52.713 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.713+0000 7f1d7d776700 1 --2- 192.168.123.103:0/2127688366 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1d64077910 0x7f1d64079dd0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f1d7400a9b0 tx=0x7f1d74005c90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:10:52.714 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.714+0000 7f1d6a7fc700 1 -- 192.168.123.103:0/2127688366 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1d6c064a50 con 0x7f1d78100990
2026-03-09T00:10:52.884 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.883+0000 7f1d7f9da700 1 -- 192.168.123.103:0/2127688366 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f1d7806ee90 con 0x7f1d78100990
2026-03-09T00:10:52.885 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.885+0000 7f1d6a7fc700 1 -- 192.168.123.103:0/2127688366 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7f1d6c0641a0 con 0x7f1d78100990
2026-03-09T00:10:52.886 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data
2026-03-09T00:10:52.886 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data
2026-03-09T00:10:52.886 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled.
2026-03-09T00:10:52.889 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.889+0000 7f1d7f9da700 1 -- 192.168.123.103:0/2127688366 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1d64077910 msgr2=0x7f1d64079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:52.889 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.889+0000 7f1d7f9da700 1 --2- 192.168.123.103:0/2127688366 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1d64077910 0x7f1d64079dd0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f1d7400a9b0 tx=0x7f1d74005c90 comp rx=0 tx=0).stop
2026-03-09T00:10:52.889 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.889+0000 7f1d7f9da700 1 -- 192.168.123.103:0/2127688366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d78100990 msgr2=0x7f1d7806d960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:10:52.889 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.889+0000 7f1d7f9da700 1 --2- 192.168.123.103:0/2127688366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d78100990 0x7f1d7806d960 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f1d6c00b5c0 tx=0x7f1d6c004a00 comp rx=0 tx=0).stop
2026-03-09T00:10:52.891 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.890+0000 7f1d7f9da700 1 -- 192.168.123.103:0/2127688366 shutdown_connections
2026-03-09T00:10:52.891 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.890+0000 7f1d7f9da700 1 --2- 192.168.123.103:0/2127688366 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1d64077910 0x7f1d64079dd0 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.891 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.890+0000 7f1d7f9da700 1 --2- 192.168.123.103:0/2127688366 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1d780fffe0 0x7f1d78072960 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.891 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.890+0000 7f1d7f9da700 1 --2- 192.168.123.103:0/2127688366 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d78100990 0x7f1d7806d960 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:10:52.891 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.890+0000 7f1d7f9da700 1 -- 192.168.123.103:0/2127688366 >> 192.168.123.103:0/2127688366 conn(0x7f1d780fb830 msgr2=0x7f1d780fce20 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:10:52.891 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.891+0000 7f1d7f9da700 1 -- 192.168.123.103:0/2127688366 shutdown_connections
2026-03-09T00:10:52.891 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:10:52.891+0000 7f1d7f9da700 1 -- 192.168.123.103:0/2127688366 wait complete.
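At this point the only outstanding warning is FS_INLINE_DATA_DEPRECATED, which this suite deliberately ignorelists. A minimal sketch of the equivalent scripted assertion, not part of the run, assuming the JSON health output keeps its checks under a top-level "checks" map:

    # Fail if any health check other than the ignorelisted inline_data warning is raised.
    ceph health detail --format=json \
        | jq -e '(.checks | keys) - ["FS_INLINE_DATA_DEPRECATED"] | length == 0'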
2026-03-09T00:10:53.214 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:53 vm03.local ceph-mon[129670]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 11/231 objects degraded (4.762%), 4 pgs degraded)
2026-03-09T00:10:53.214 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:53 vm03.local ceph-mon[129670]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded)
2026-03-09T00:10:53.214 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:53 vm03.local ceph-mon[129670]: mds.? [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] up:active
2026-03-09T00:10:53.214 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:53 vm03.local ceph-mon[129670]: fsmap cephfs:1 {0=cephfs.vm03.ralade=up:active} 2 up:standby
2026-03-09T00:10:53.214 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:53 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/107122183' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:10:53.214 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:53 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/999279839' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T00:10:53.214 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:53 vm03.local ceph-mon[129670]: from='client.34400 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:10:53.214 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:53 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:53.214 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:53 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:53.214 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:53 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:10:53.214 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:53 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2127688366' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T00:10:53.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:53 vm06.local ceph-mon[106218]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 11/231 objects degraded (4.762%), 4 pgs degraded)
2026-03-09T00:10:53.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:53 vm06.local ceph-mon[106218]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded)
2026-03-09T00:10:53.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:53 vm06.local ceph-mon[106218]: mds.? [v2:192.168.123.103:6828/3870847623,v1:192.168.123.103:6829/3870847623] up:active
2026-03-09T00:10:53.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:53 vm06.local ceph-mon[106218]: fsmap cephfs:1 {0=cephfs.vm03.ralade=up:active} 2 up:standby
2026-03-09T00:10:53.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:53 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/107122183' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:10:53.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:53 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/999279839' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T00:10:53.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:53 vm06.local ceph-mon[106218]: from='client.34400 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:10:53.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:53 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:53.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:53 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:53.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:53 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:10:53.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:53 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/2127688366' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T00:10:54.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:54 vm06.local ceph-mon[106218]: mds.? [v2:192.168.123.103:6826/784666836,v1:192.168.123.103:6827/784666836] up:boot
2026-03-09T00:10:54.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:54 vm06.local ceph-mon[106218]: fsmap cephfs:1 {0=cephfs.vm03.ralade=up:active} 3 up:standby
2026-03-09T00:10:54.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:54 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.sejksk"}]: dispatch
2026-03-09T00:10:54.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:54 vm06.local ceph-mon[106218]: pgmap v220: 65 pgs: 65 active+clean; 209 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 19 MiB/s rd, 383 B/s wr, 8 op/s
2026-03-09T00:10:54.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:54 vm03.local ceph-mon[129670]: mds.? [v2:192.168.123.103:6826/784666836,v1:192.168.123.103:6827/784666836] up:boot
2026-03-09T00:10:54.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:54 vm03.local ceph-mon[129670]: fsmap cephfs:1 {0=cephfs.vm03.ralade=up:active} 3 up:standby
2026-03-09T00:10:54.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:54 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.sejksk"}]: dispatch
2026-03-09T00:10:54.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:54 vm03.local ceph-mon[129670]: pgmap v220: 65 pgs: 65 active+clean; 209 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 19 MiB/s rd, 383 B/s wr, 8 op/s
2026-03-09T00:10:55.981 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:55 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:55.981 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:55 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:55.981 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:55 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:55.981 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:55 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:56.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:55 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:56.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:55 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:56.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:55 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:56.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:55 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:56.725 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:56 vm03.local ceph-mon[129670]: pgmap v221: 65 pgs: 65 active+clean; 209 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 511 B/s wr, 9 op/s
2026-03-09T00:10:56.725 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:56 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:56.725 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:56 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:56.725 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:56 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:10:56.725 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:56 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:10:56.725 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:56 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:56.725 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:56 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:10:56.725 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:56 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:10:56.725 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:56 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:10:56.725 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:56 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:10:56.725 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:56 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:10:57.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:56 vm06.local ceph-mon[106218]: pgmap v221: 65 pgs: 65 active+clean; 209 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 511 B/s wr, 9 op/s
2026-03-09T00:10:57.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:56 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:57.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:56 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:57.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:56 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:10:57.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:56 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:10:57.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:56 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:57.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:56 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:10:57.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:56 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:10:57.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:56 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:10:57.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:56 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:10:57.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:56 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
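The repeated "versions" dispatches above are the orchestrator comparing daemon versions as it works through the MDS daemons. One way to watch the same progression from outside, a sketch only, assuming the orch ps JSON entries carry daemon_id and version fields:

    # List each MDS daemon with the version it is currently running.
    ceph orch ps --daemon-type mds --format json \
        | jq -r '.[] | "\(.daemon_id) \(.version)"'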
2026-03-09T00:10:58.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:58 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03[129666]: 2026-03-09T00:10:58.286+0000 7f802c841640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
2026-03-09T00:10:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:58 vm03.local ceph-mon[129670]: Upgrade: Updating mds.cephfs.vm03.ralade
2026-03-09T00:10:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:58 vm03.local ceph-mon[129670]: pgmap v222: 65 pgs: 65 active+clean; 209 MiB data, 883 MiB used, 119 GiB / 120 GiB avail; 15 MiB/s rd, 515 B/s wr, 8 op/s
2026-03-09T00:10:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.ralade", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T00:10:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:10:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:58 vm03.local ceph-mon[129670]: Deploying daemon mds.cephfs.vm03.ralade on vm03
2026-03-09T00:10:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:58 vm03.local ceph-mon[129670]: Health check failed: 1 filesystem is degraded (FS_DEGRADED)
2026-03-09T00:10:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:58 vm03.local ceph-mon[129670]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
2026-03-09T00:10:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:58 vm03.local ceph-mon[129670]: osdmap e107: 6 total, 6 up, 6 in
2026-03-09T00:10:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:58 vm03.local ceph-mon[129670]: Standby daemon mds.cephfs.vm06.ixduim assigned to filesystem cephfs as rank 0
2026-03-09T00:10:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:58 vm03.local ceph-mon[129670]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
2026-03-09T00:10:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:58 vm03.local ceph-mon[129670]: fsmap cephfs:0/1 3 up:standby, 1 failed
2026-03-09T00:10:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:58 vm03.local ceph-mon[129670]: fsmap cephfs:1/1 {0=cephfs.vm06.ixduim=up:replay} 2 up:standby
2026-03-09T00:10:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:10:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:59.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:58 vm06.local ceph-mon[106218]: Upgrade: Updating mds.cephfs.vm03.ralade
2026-03-09T00:10:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:58 vm06.local ceph-mon[106218]: pgmap v222: 65 pgs: 65 active+clean; 209 MiB data, 883 MiB used, 119 GiB / 120 GiB avail; 15 MiB/s rd, 515 B/s wr, 8 op/s
2026-03-09T00:10:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm03.ralade", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-09T00:10:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:10:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:58 vm06.local ceph-mon[106218]: Deploying daemon mds.cephfs.vm03.ralade on vm03
2026-03-09T00:10:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:58 vm06.local ceph-mon[106218]: Health check failed: 1 filesystem is degraded (FS_DEGRADED)
2026-03-09T00:10:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:58 vm06.local ceph-mon[106218]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
2026-03-09T00:10:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:58 vm06.local ceph-mon[106218]: osdmap e107: 6 total, 6 up, 6 in
2026-03-09T00:10:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:58 vm06.local ceph-mon[106218]: Standby daemon mds.cephfs.vm06.ixduim assigned to filesystem cephfs as rank 0
2026-03-09T00:10:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:58 vm06.local ceph-mon[106218]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
2026-03-09T00:10:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:58 vm06.local ceph-mon[106218]: fsmap cephfs:0/1 3 up:standby, 1 failed
2026-03-09T00:10:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:58 vm06.local ceph-mon[106218]: fsmap cephfs:1/1 {0=cephfs.vm06.ixduim=up:replay} 2 up:standby
2026-03-09T00:10:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:10:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:10:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:00.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:10:59 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.ralade"}]: dispatch
2026-03-09T00:11:00.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:10:59 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.ralade"}]: dispatch
2026-03-09T00:11:01.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:00 vm03.local ceph-mon[129670]: pgmap v224: 65 pgs: 65 active+clean; 209 MiB data, 883 MiB used, 119 GiB / 120 GiB avail; 3.0 MiB/s rd, 511 B/s wr, 5 op/s
2026-03-09T00:11:01.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:00 vm06.local ceph-mon[106218]: pgmap v224: 65 pgs: 65 active+clean; 209 MiB data, 883 MiB used, 119 GiB / 120 GiB avail; 3.0 MiB/s rd, 511 B/s wr, 5 op/s
2026-03-09T00:11:02.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:02 vm03.local ceph-mon[129670]: pgmap v225: 65 pgs: 65 active+clean; 209 MiB data, 883 MiB used, 119 GiB / 120 GiB avail; 5.8 MiB/s rd, 5.8 KiB/s wr, 8 op/s
2026-03-09T00:11:02.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:02 vm06.local ceph-mon[106218]: pgmap v225: 65 pgs: 65 active+clean; 209 MiB data, 883 MiB used, 119 GiB / 120 GiB avail; 5.8 MiB/s rd, 5.8 KiB/s wr, 8 op/s
2026-03-09T00:11:04.739 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:04 vm03.local ceph-mon[129670]: mds.? [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] up:reconnect
2026-03-09T00:11:04.739 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:04 vm03.local ceph-mon[129670]: reconnect by client.24309 v1:192.168.123.103:0/929198358 after 0.002
2026-03-09T00:11:04.740 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:04 vm03.local ceph-mon[129670]: fsmap cephfs:1/1 {0=cephfs.vm06.ixduim=up:reconnect} 2 up:standby
2026-03-09T00:11:04.740 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:04 vm03.local ceph-mon[129670]: reconnect by client.24313 v1:192.168.144.1:0/748981855 after 0.003
2026-03-09T00:11:04.740 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:04 vm03.local ceph-mon[129670]: pgmap v226: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 17 MiB/s rd, 5.5 KiB/s wr, 10 op/s
2026-03-09T00:11:04.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:04 vm06.local ceph-mon[106218]: mds.? [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] up:reconnect
2026-03-09T00:11:04.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:04 vm06.local ceph-mon[106218]: reconnect by client.24309 v1:192.168.123.103:0/929198358 after 0.002
2026-03-09T00:11:04.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:04 vm06.local ceph-mon[106218]: fsmap cephfs:1/1 {0=cephfs.vm06.ixduim=up:reconnect} 2 up:standby
2026-03-09T00:11:04.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:04 vm06.local ceph-mon[106218]: reconnect by client.24313 v1:192.168.144.1:0/748981855 after 0.003
2026-03-09T00:11:04.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:04 vm06.local ceph-mon[106218]: pgmap v226: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 17 MiB/s rd, 5.5 KiB/s wr, 10 op/s
2026-03-09T00:11:05.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:05 vm03.local ceph-mon[129670]: mds.? [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] up:rejoin
2026-03-09T00:11:05.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:05 vm03.local ceph-mon[129670]: fsmap cephfs:1/1 {0=cephfs.vm06.ixduim=up:rejoin} 2 up:standby
2026-03-09T00:11:05.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:05 vm03.local ceph-mon[129670]: daemon mds.cephfs.vm06.ixduim is now active in filesystem cephfs as rank 0
2026-03-09T00:11:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:05 vm06.local ceph-mon[106218]: mds.? [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] up:rejoin
2026-03-09T00:11:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:05 vm06.local ceph-mon[106218]: fsmap cephfs:1/1 {0=cephfs.vm06.ixduim=up:rejoin} 2 up:standby
2026-03-09T00:11:05.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:05 vm06.local ceph-mon[106218]: daemon mds.cephfs.vm06.ixduim is now active in filesystem cephfs as rank 0
2026-03-09T00:11:06.822 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:06 vm06.local ceph-mon[106218]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded)
2026-03-09T00:11:06.822 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:06 vm06.local ceph-mon[106218]: mds.? [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] up:active
2026-03-09T00:11:06.822 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:06 vm06.local ceph-mon[106218]: fsmap cephfs:1 {0=cephfs.vm06.ixduim=up:active} 2 up:standby
2026-03-09T00:11:06.822 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:06 vm06.local ceph-mon[106218]: pgmap v227: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 19 MiB/s rd, 5.4 KiB/s wr, 9 op/s
2026-03-09T00:11:06.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:06 vm03.local ceph-mon[129670]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded)
2026-03-09T00:11:06.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:06 vm03.local ceph-mon[129670]: mds.? [v2:192.168.123.106:6826/1001012017,v1:192.168.123.106:6827/1001012017] up:active
2026-03-09T00:11:06.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:06 vm03.local ceph-mon[129670]: fsmap cephfs:1 {0=cephfs.vm06.ixduim=up:active} 2 up:standby
2026-03-09T00:11:06.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:06 vm03.local ceph-mon[129670]: pgmap v227: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 19 MiB/s rd, 5.4 KiB/s wr, 9 op/s
2026-03-09T00:11:09.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:08 vm03.local ceph-mon[129670]: pgmap v228: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 5.4 KiB/s wr, 9 op/s
2026-03-09T00:11:09.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:08 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:09.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:08 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:09.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:08 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
[v2:192.168.123.103:6828/1027317762,v1:192.168.123.103:6829/1027317762] up:boot 2026-03-09T00:11:09.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:08 vm03.local ceph-mon[129670]: fsmap cephfs:1 {0=cephfs.vm06.ixduim=up:active} 3 up:standby 2026-03-09T00:11:09.260 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:08 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.ralade"}]: dispatch 2026-03-09T00:11:09.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:09 vm06.local ceph-mon[106218]: pgmap v228: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 5.4 KiB/s wr, 9 op/s 2026-03-09T00:11:09.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:09 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:09.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:09 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:09.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:09 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:11:09.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:09 vm06.local ceph-mon[106218]: mds.? [v2:192.168.123.103:6828/1027317762,v1:192.168.123.103:6829/1027317762] up:boot 2026-03-09T00:11:09.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:09 vm06.local ceph-mon[106218]: fsmap cephfs:1 {0=cephfs.vm06.ixduim=up:active} 3 up:standby 2026-03-09T00:11:09.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:09 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm03.ralade"}]: dispatch 2026-03-09T00:11:11.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:11 vm03.local ceph-mon[129670]: pgmap v229: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 4.9 KiB/s wr, 11 op/s 2026-03-09T00:11:12.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:11 vm06.local ceph-mon[106218]: pgmap v229: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 18 MiB/s rd, 4.9 KiB/s wr, 11 op/s 2026-03-09T00:11:13.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:12 vm03.local ceph-mon[129670]: pgmap v230: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 15 MiB/s rd, 8.8 KiB/s wr, 11 op/s 2026-03-09T00:11:13.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:12 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:13.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:12 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:13.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:12 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:13.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:12 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:13.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:12 vm06.local ceph-mon[106218]: pgmap v230: 65 pgs: 65 active+clean; 
209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 15 MiB/s rd, 8.8 KiB/s wr, 11 op/s 2026-03-09T00:11:13.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:12 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:13.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:12 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:13.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:12 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:13.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:12 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: Upgrade: Updating mds.cephfs.vm06.vlrwtl 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": 
"mds.cephfs.vm06.vlrwtl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: Deploying daemon mds.cephfs.vm06.vlrwtl on vm06 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: pgmap v231: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 13 MiB/s rd, 4.4 KiB/s wr, 8 op/s 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:14.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:13 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:11:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:11:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:11:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:11:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: Upgrade: Updating mds.cephfs.vm06.vlrwtl 2026-03-09T00:11:14.338 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.vlrwtl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T00:11:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:11:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: Deploying daemon mds.cephfs.vm06.vlrwtl on vm06 2026-03-09T00:11:14.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: pgmap v231: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 13 MiB/s rd, 4.4 KiB/s wr, 8 op/s 2026-03-09T00:11:14.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:14.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:13 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:11:15.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:14 vm03.local ceph-mon[129670]: osdmap e108: 6 total, 6 up, 6 in 2026-03-09T00:11:15.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:14 vm03.local ceph-mon[129670]: fsmap cephfs:1 {0=cephfs.vm06.ixduim=up:active} 2 up:standby 2026-03-09T00:11:15.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:14 vm06.local ceph-mon[106218]: osdmap e108: 6 total, 6 up, 6 in 2026-03-09T00:11:15.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:14 vm06.local ceph-mon[106218]: fsmap cephfs:1 {0=cephfs.vm06.ixduim=up:active} 2 up:standby 2026-03-09T00:11:16.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:16 vm03.local ceph-mon[129670]: pgmap v233: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 1.0 MiB/s rd, 5.3 KiB/s wr, 6 op/s 2026-03-09T00:11:16.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:16 vm06.local ceph-mon[106218]: pgmap v233: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 1.0 MiB/s rd, 5.3 KiB/s wr, 6 op/s 2026-03-09T00:11:17.918 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:17 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:17.918 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:17 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:17.918 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:17 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:11:17.918 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:17 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:17.918 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:17 
vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:18.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:17 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:18.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:17 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:18.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:17 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:11:18.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:17 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:18.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:17 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:18 vm06.local ceph-mon[106218]: pgmap v234: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 37 KiB/s rd, 5.2 KiB/s wr, 5 op/s 2026-03-09T00:11:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:18 vm06.local ceph-mon[106218]: mds.? [v2:192.168.123.106:6824/1987865018,v1:192.168.123.106:6825/1987865018] up:boot 2026-03-09T00:11:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:18 vm06.local ceph-mon[106218]: fsmap cephfs:1 {0=cephfs.vm06.ixduim=up:active} 3 up:standby 2026-03-09T00:11:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:18 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vlrwtl"}]: dispatch 2026-03-09T00:11:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:18 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:19.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:18 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:18 vm03.local ceph-mon[129670]: pgmap v234: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 37 KiB/s rd, 5.2 KiB/s wr, 5 op/s 2026-03-09T00:11:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:18 vm03.local ceph-mon[129670]: mds.? 
[v2:192.168.123.106:6824/1987865018,v1:192.168.123.106:6825/1987865018] up:boot 2026-03-09T00:11:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:18 vm03.local ceph-mon[129670]: fsmap cephfs:1 {0=cephfs.vm06.ixduim=up:active} 3 up:standby 2026-03-09T00:11:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:18 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.vlrwtl"}]: dispatch 2026-03-09T00:11:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:18 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:18 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:20.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:20 vm06.local ceph-mon[106218]: pgmap v235: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 5.1 KiB/s wr, 2 op/s 2026-03-09T00:11:20.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:20.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:20.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:11:20.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:11:20.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:20.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:11:20.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:20.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:20.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:20.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:20.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:20.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:20 vm06.local ceph-mon[106218]: 
from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.ixduim", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T00:11:20.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:20 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:11:21.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:20 vm03.local ceph-mon[129670]: pgmap v235: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 5.1 KiB/s wr, 2 op/s 2026-03-09T00:11:21.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:21.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:21.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:11:21.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:11:21.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:21.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:11:21.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:21.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:21.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:21.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:21.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:21.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm06.ixduim", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-09T00:11:21.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:20 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' 
entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:11:21.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:20 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03[129666]: 2026-03-09T00:11:20.924+0000 7f802c841640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T00:11:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:21 vm03.local ceph-mon[129670]: Upgrade: Updating mds.cephfs.vm06.ixduim 2026-03-09T00:11:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:21 vm03.local ceph-mon[129670]: Deploying daemon mds.cephfs.vm06.ixduim on vm06 2026-03-09T00:11:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:21 vm03.local ceph-mon[129670]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T00:11:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:21 vm03.local ceph-mon[129670]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T00:11:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:21 vm03.local ceph-mon[129670]: osdmap e109: 6 total, 6 up, 6 in 2026-03-09T00:11:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:21 vm03.local ceph-mon[129670]: Standby daemon mds.cephfs.vm03.sejksk assigned to filesystem cephfs as rank 0 2026-03-09T00:11:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:21 vm03.local ceph-mon[129670]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T00:11:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:21 vm03.local ceph-mon[129670]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T00:11:22.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:21 vm03.local ceph-mon[129670]: fsmap cephfs:1/1 {0=cephfs.vm03.sejksk=up:replay} 2 up:standby 2026-03-09T00:11:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:21 vm06.local ceph-mon[106218]: Upgrade: Updating mds.cephfs.vm06.ixduim 2026-03-09T00:11:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:21 vm06.local ceph-mon[106218]: Deploying daemon mds.cephfs.vm06.ixduim on vm06 2026-03-09T00:11:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:21 vm06.local ceph-mon[106218]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-09T00:11:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:21 vm06.local ceph-mon[106218]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-09T00:11:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:21 vm06.local ceph-mon[106218]: osdmap e109: 6 total, 6 up, 6 in 2026-03-09T00:11:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:21 vm06.local ceph-mon[106218]: Standby daemon mds.cephfs.vm03.sejksk assigned to filesystem cephfs as rank 0 2026-03-09T00:11:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:21 vm06.local ceph-mon[106218]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-09T00:11:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:21 vm06.local ceph-mon[106218]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-09T00:11:22.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:21 vm06.local ceph-mon[106218]: fsmap cephfs:1/1 {0=cephfs.vm03.sejksk=up:replay} 2 up:standby 2026-03-09T00:11:22.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.970+0000 7fbe8e5f8700 1 -- 192.168.123.103:0/1277371399 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7fbe88107ff0 msgr2=0x7fbe881083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:22.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.970+0000 7fbe8e5f8700 1 --2- 192.168.123.103:0/1277371399 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe88107ff0 0x7fbe881083d0 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7fbe78007780 tx=0x7fbe78007a90 comp rx=0 tx=0).stop 2026-03-09T00:11:22.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.970+0000 7fbe8e5f8700 1 -- 192.168.123.103:0/1277371399 shutdown_connections 2026-03-09T00:11:22.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.970+0000 7fbe8e5f8700 1 --2- 192.168.123.103:0/1277371399 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbe881089a0 0x7fbe8810be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:22.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.970+0000 7fbe8e5f8700 1 --2- 192.168.123.103:0/1277371399 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe88107ff0 0x7fbe881083d0 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:22.971 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.970+0000 7fbe8e5f8700 1 -- 192.168.123.103:0/1277371399 >> 192.168.123.103:0/1277371399 conn(0x7fbe8806ce20 msgr2=0x7fbe8806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:11:22.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.970+0000 7fbe8e5f8700 1 -- 192.168.123.103:0/1277371399 shutdown_connections 2026-03-09T00:11:22.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.970+0000 7fbe8e5f8700 1 -- 192.168.123.103:0/1277371399 wait complete. 
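The journalctl stream above is the failover pattern this suite drives for every MDS it touches: cephadm redeploys one daemon at a time, the mons transiently flag FS_DEGRADED and MDS_ALL_DOWN, a standby is assigned rank 0, and the rank walks up:replay, up:reconnect, up:rejoin, up:active before the orchestrator moves on to the next daemon. A minimal sketch of how a harness might wait for that convergence, assuming a `ceph` binary on PATH and the usual `fs dump` JSON layout; the helper name wait_rank0_active is illustrative, not a teuthology API:

    import json
    import subprocess
    import time

    def ceph(*args):
        # Shell out to the ceph CLI and decode its JSON output.
        out = subprocess.check_output(("ceph", *args, "--format=json"))
        return json.loads(out)

    def wait_rank0_active(fs_name="cephfs", timeout=300, interval=5):
        # Poll the FSMap until rank 0 of fs_name reports up:active,
        # mirroring the replay -> reconnect -> rejoin -> active walk above.
        deadline = time.time() + timeout
        while time.time() < deadline:
            for fs in ceph("fs", "dump").get("filesystems", []):
                mdsmap = fs["mdsmap"]
                if mdsmap["fs_name"] != fs_name:
                    continue
                for info in mdsmap.get("info", {}).values():
                    if info["rank"] == 0 and info["state"] == "up:active":
                        return info["name"]
            time.sleep(interval)
        raise TimeoutError(f"rank 0 of {fs_name} never reached up:active")

Polling the FSMap rather than the health checks avoids racing the brief MDS_ALL_DOWN window visible above, which is raised and cleared within the same second.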
2026-03-09T00:11:22.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.971+0000 7fbe8e5f8700 1 Processor -- start
2026-03-09T00:11:22.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.971+0000 7fbe8e5f8700 1 -- start start
2026-03-09T00:11:22.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.971+0000 7fbe8e5f8700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe881089a0 0x7fbe8807cd20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:11:22.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.971+0000 7fbe8e5f8700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbe8807d260 0x7fbe8807d6e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:11:22.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.971+0000 7fbe8e5f8700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbe88083d90 con 0x7fbe881089a0
2026-03-09T00:11:22.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.971+0000 7fbe8e5f8700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbe880818a0 con 0x7fbe8807d260
2026-03-09T00:11:22.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.972+0000 7fbe877fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbe8807d260 0x7fbe8807d6e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:11:22.972 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.972+0000 7fbe877fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbe8807d260 0x7fbe8807d6e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:33576/0 (socket says 192.168.123.103:33576)
2026-03-09T00:11:22.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.972+0000 7fbe877fe700 1 -- 192.168.123.103:0/413297842 learned_addr learned my addr 192.168.123.103:0/413297842 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:11:22.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.972+0000 7fbe87fff700 1 --2- 192.168.123.103:0/413297842 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe881089a0 0x7fbe8807cd20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:11:22.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.972+0000 7fbe877fe700 1 -- 192.168.123.103:0/413297842 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe881089a0 msgr2=0x7fbe8807cd20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:11:22.973 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.972+0000 7fbe877fe700 1 --2- 192.168.123.103:0/413297842 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe881089a0 0x7fbe8807cd20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:22.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.972+0000 7fbe877fe700 1 -- 192.168.123.103:0/413297842 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbe78007430 con 0x7fbe8807d260
2026-03-09T00:11:22.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.973+0000 7fbe877fe700 1 --2- 192.168.123.103:0/413297842 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbe8807d260 0x7fbe8807d6e0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7fbe8000c370 tx=0x7fbe8000c680 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:11:22.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.973+0000 7fbe857fa700 1 -- 192.168.123.103:0/413297842 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbe8000e050 con 0x7fbe8807d260
2026-03-09T00:11:22.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.973+0000 7fbe8e5f8700 1 -- 192.168.123.103:0/413297842 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbe88081b20 con 0x7fbe8807d260
2026-03-09T00:11:22.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.973+0000 7fbe8e5f8700 1 -- 192.168.123.103:0/413297842 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbe88082070 con 0x7fbe8807d260
2026-03-09T00:11:22.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.973+0000 7fbe857fa700 1 -- 192.168.123.103:0/413297842 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fbe8000f040 con 0x7fbe8807d260
2026-03-09T00:11:22.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.973+0000 7fbe857fa700 1 -- 192.168.123.103:0/413297842 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbe80013400 con 0x7fbe8807d260
2026-03-09T00:11:22.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.974+0000 7fbe8e5f8700 1 -- 192.168.123.103:0/413297842 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbe8804f2e0 con 0x7fbe8807d260
2026-03-09T00:11:22.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.975+0000 7fbe857fa700 1 -- 192.168.123.103:0/413297842 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fbe80004ad0 con 0x7fbe8807d260
2026-03-09T00:11:22.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.975+0000 7fbe857fa700 1 --2- 192.168.123.103:0/413297842 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fbe70077910 0x7fbe70079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:11:22.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.975+0000 7fbe857fa700 1 -- 192.168.123.103:0/413297842 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fbe80099e30 con 0x7fbe8807d260
2026-03-09T00:11:22.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.975+0000 7fbe87fff700 1 --2- 192.168.123.103:0/413297842 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fbe70077910 0x7fbe70079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:11:22.976 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.976+0000 7fbe87fff700 1 --2- 192.168.123.103:0/413297842 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fbe70077910 0x7fbe70079dd0 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7fbe78007750 tx=0x7fbe78007de0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:11:22.977 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:22.977+0000 7fbe857fa700 1 -- 192.168.123.103:0/413297842 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbe80061f10 con 0x7fbe8807d260
2026-03-09T00:11:23.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.104+0000 7fbe8e5f8700 1 -- 192.168.123.103:0/413297842 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fbe8807e390 con 0x7fbe70077910
2026-03-09T00:11:23.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.105+0000 7fbe857fa700 1 -- 192.168.123.103:0/413297842 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7fbe8807e390 con 0x7fbe70077910
2026-03-09T00:11:23.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.108+0000 7fbe6effd700 1 -- 192.168.123.103:0/413297842 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fbe70077910 msgr2=0x7fbe70079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:11:23.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.108+0000 7fbe6effd700 1 --2- 192.168.123.103:0/413297842 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fbe70077910 0x7fbe70079dd0 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7fbe78007750 tx=0x7fbe78007de0 comp rx=0 tx=0).stop
2026-03-09T00:11:23.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.108+0000 7fbe6effd700 1 -- 192.168.123.103:0/413297842 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbe8807d260 msgr2=0x7fbe8807d6e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:11:23.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.108+0000 7fbe6effd700 1 --2- 192.168.123.103:0/413297842 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbe8807d260 0x7fbe8807d6e0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7fbe8000c370 tx=0x7fbe8000c680 comp rx=0 tx=0).stop
2026-03-09T00:11:23.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.108+0000 7fbe6effd700 1 -- 192.168.123.103:0/413297842 shutdown_connections
2026-03-09T00:11:23.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.108+0000 7fbe6effd700 1 --2- 192.168.123.103:0/413297842 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fbe70077910 0x7fbe70079dd0 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:23.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.108+0000 7fbe6effd700 1 --2- 192.168.123.103:0/413297842 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbe881089a0 0x7fbe8807cd20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:23.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.108+0000 7fbe6effd700 1 --2- 192.168.123.103:0/413297842 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbe8807d260 0x7fbe8807d6e0 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:23.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.108+0000 7fbe6effd700 1 -- 192.168.123.103:0/413297842 >> 192.168.123.103:0/413297842 conn(0x7fbe8806ce20 msgr2=0x7fbe88071590 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:11:23.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.108+0000 7fbe6effd700 1 -- 192.168.123.103:0/413297842 shutdown_connections
2026-03-09T00:11:23.109 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.108+0000 7fbe6effd700 1 -- 192.168.123.103:0/413297842 wait complete.
2026-03-09T00:11:23.118 INFO:teuthology.orchestra.run.vm03.stdout:true
2026-03-09T00:11:23.177 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.176+0000 7f80d7803700 1 -- 192.168.123.103:0/3823964978 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f80d0107ff0 msgr2=0x7f80d01083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:11:23.177 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.176+0000 7f80d7803700 1 --2- 192.168.123.103:0/3823964978 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f80d0107ff0 0x7f80d01083d0 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7f80cc009b00 tx=0x7f80cc009e10 comp rx=0 tx=0).stop
2026-03-09T00:11:23.177 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.176+0000 7f80d7803700 1 -- 192.168.123.103:0/3823964978 shutdown_connections
2026-03-09T00:11:23.177 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.176+0000 7f80d7803700 1 --2- 192.168.123.103:0/3823964978 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f80d01089a0 0x7f80d010be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:23.177 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.176+0000 7f80d7803700 1 --2- 192.168.123.103:0/3823964978 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f80d0107ff0 0x7f80d01083d0 unknown :-1 s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:23.177 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.176+0000 7f80d7803700 1 -- 192.168.123.103:0/3823964978 >> 192.168.123.103:0/3823964978 conn(0x7f80d006ce20 msgr2=0x7f80d006d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:11:23.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.177+0000 7f80d7803700 1 -- 192.168.123.103:0/3823964978 shutdown_connections
2026-03-09T00:11:23.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.177+0000 7f80d7803700 1 -- 192.168.123.103:0/3823964978 wait complete.
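Each stderr block above is one short-lived `ceph` CLI invocation: it bootstraps a messenger, connects to a mon, subscribes to the monmap, config, mgrmap and osdmap, fetches get_command_descriptions, forwards a single mgr_command ({"prefix": "orch upgrade status", ...}) to the active mgr, then marks every connection down and exits; the lone stdout `true` between runs is the test's predicate passing. A sketch of the equivalent polling loop, assuming `ceph orch upgrade status` emits JSON with an in_progress flag (true of recent cephadm releases, but worth verifying against the deployed version):

    import json
    import subprocess
    import time

    def upgrade_in_progress():
        # `ceph orch upgrade status` prints a small JSON document; in_progress
        # drops to false once every daemon has been moved to the target image.
        out = subprocess.check_output(("ceph", "orch", "upgrade", "status"))
        return json.loads(out).get("in_progress", False)

    # Cheap to poll: as in the log above, each probe is a throwaway client session.
    while upgrade_in_progress():
        time.sleep(30)
    print("upgrade complete")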
2026-03-09T00:11:23.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.177+0000 7f80d7803700 1 Processor -- start
2026-03-09T00:11:23.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.177+0000 7f80d7803700 1 -- start start
2026-03-09T00:11:23.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.177+0000 7f80d7803700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f80d01089a0 0x7f80d007cf10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:11:23.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.177+0000 7f80d7803700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f80d007d450 0x7f80d007d8d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:11:23.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.177+0000 7f80d7803700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80d00819f0 con 0x7f80d01089a0
2026-03-09T00:11:23.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.177+0000 7f80d7803700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f80d0081b30 con 0x7f80d007d450
2026-03-09T00:11:23.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.177+0000 7f80d559f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f80d01089a0 0x7f80d007cf10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:11:23.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.177+0000 7f80d4d9e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f80d007d450 0x7f80d007d8d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:11:23.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.177+0000 7f80d4d9e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f80d007d450 0x7f80d007d8d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:33592/0 (socket says 192.168.123.103:33592)
2026-03-09T00:11:23.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.177+0000 7f80d4d9e700 1 -- 192.168.123.103:0/4241228432 learned_addr learned my addr 192.168.123.103:0/4241228432 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:11:23.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.178+0000 7f80d4d9e700 1 -- 192.168.123.103:0/4241228432 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f80d01089a0 msgr2=0x7f80d007cf10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:11:23.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.178+0000 7f80d4d9e700 1 --2- 192.168.123.103:0/4241228432 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f80d01089a0 0x7f80d007cf10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:23.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.178+0000 7f80d4d9e700 1 -- 192.168.123.103:0/4241228432 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f80cc0097e0 con 0x7f80d007d450
2026-03-09T00:11:23.178 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.178+0000 7f80d4d9e700 1 --2- 192.168.123.103:0/4241228432 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f80d007d450 0x7f80d007d8d0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f80c800ee40 tx=0x7f80c800c620 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:11:23.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.178+0000 7f80c67fc700 1 -- 192.168.123.103:0/4241228432 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80c800e050 con 0x7f80d007d450
2026-03-09T00:11:23.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.178+0000 7f80d7803700 1 -- 192.168.123.103:0/4241228432 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f80d0081dc0 con 0x7f80d007d450
2026-03-09T00:11:23.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.179+0000 7f80d7803700 1 -- 192.168.123.103:0/4241228432 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f80d0082310 con 0x7f80d007d450
2026-03-09T00:11:23.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.179+0000 7f80c67fc700 1 -- 192.168.123.103:0/4241228432 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f80c800f040 con 0x7f80d007d450
2026-03-09T00:11:23.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.179+0000 7f80d7803700 1 -- 192.168.123.103:0/4241228432 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f80d004f2e0 con 0x7f80d007d450
2026-03-09T00:11:23.179 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.179+0000 7f80c67fc700 1 -- 192.168.123.103:0/4241228432 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f80c8007a50 con 0x7f80d007d450
2026-03-09T00:11:23.180 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.180+0000 7f80c67fc700 1 -- 192.168.123.103:0/4241228432 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f80c801d070 con 0x7f80d007d450
2026-03-09T00:11:23.181 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.181+0000 7f80c67fc700 1 --2- 192.168.123.103:0/4241228432 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f80bc077910 0x7f80bc079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:11:23.181 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.181+0000 7f80d559f700 1 --2- 192.168.123.103:0/4241228432 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f80bc077910 0x7f80bc079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:11:23.181 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.181+0000 7f80c67fc700 1 -- 192.168.123.103:0/4241228432 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f80c80a33b0 con 0x7f80d007d450
2026-03-09T00:11:23.182 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.182+0000 7f80d559f700 1 --2- 192.168.123.103:0/4241228432 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f80bc077910 0x7f80bc079dd0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f80cc00b5c0 tx=0x7f80cc01d040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:11:23.183 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.182+0000 7f80c67fc700 1 -- 192.168.123.103:0/4241228432 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f80c806b490 con 0x7f80d007d450
2026-03-09T00:11:23.308 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:22 vm03.local ceph-mon[129670]: pgmap v237: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail
2026-03-09T00:11:23.308 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.306+0000 7f80d7803700 1 -- 192.168.123.103:0/4241228432 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f80d007e510 con 0x7f80bc077910
2026-03-09T00:11:23.310 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.310+0000 7f80c67fc700 1 -- 192.168.123.103:0/4241228432 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f80cc0053d0 con 0x7f80bc077910
2026-03-09T00:11:23.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.313+0000 7f80bbfff700 1 -- 192.168.123.103:0/4241228432 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f80bc077910 msgr2=0x7f80bc079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:11:23.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.313+0000 7f80bbfff700 1 --2- 192.168.123.103:0/4241228432 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f80bc077910 0x7f80bc079dd0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f80cc00b5c0 tx=0x7f80cc01d040 comp rx=0 tx=0).stop
2026-03-09T00:11:23.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.313+0000 7f80bbfff700 1 -- 192.168.123.103:0/4241228432 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f80d007d450 msgr2=0x7f80d007d8d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:11:23.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.313+0000 7f80bbfff700 1 --2- 192.168.123.103:0/4241228432 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f80d007d450 0x7f80d007d8d0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f80c800ee40 tx=0x7f80c800c620 comp rx=0 tx=0).stop
2026-03-09T00:11:23.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.313+0000 7f80bbfff700 1 -- 192.168.123.103:0/4241228432 shutdown_connections
2026-03-09T00:11:23.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.313+0000 7f80bbfff700 1 --2- 192.168.123.103:0/4241228432 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f80bc077910 0x7f80bc079dd0 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:23.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.313+0000 7f80bbfff700 1 --2- 192.168.123.103:0/4241228432 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f80d01089a0 0x7f80d007cf10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:23.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.313+0000 7f80bbfff700 1 --2- 192.168.123.103:0/4241228432 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f80d007d450 0x7f80d007d8d0 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:23.313 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.313+0000 7f80bbfff700 1 -- 192.168.123.103:0/4241228432 >> 192.168.123.103:0/4241228432 conn(0x7f80d006ce20 msgr2=0x7f80d0070550 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:11:23.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.313+0000 7f80bbfff700 1 -- 192.168.123.103:0/4241228432 shutdown_connections
2026-03-09T00:11:23.314 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.313+0000 7f80bbfff700 1 -- 192.168.123.103:0/4241228432 wait complete.
2026-03-09T00:11:23.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.380+0000 7f7e4509c700 1 -- 192.168.123.103:0/1478432159 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e40107ff0 msgr2=0x7f7e401083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:11:23.380 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.380+0000 7f7e4509c700 1 --2- 192.168.123.103:0/1478432159 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e40107ff0 0x7f7e401083d0 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f7e30009b00 tx=0x7f7e30009e10 comp rx=0 tx=0).stop
2026-03-09T00:11:23.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.380+0000 7f7e4509c700 1 -- 192.168.123.103:0/1478432159 shutdown_connections
2026-03-09T00:11:23.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.380+0000 7f7e4509c700 1 --2- 192.168.123.103:0/1478432159 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7e401089a0 0x7f7e4010be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:23.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.380+0000 7f7e4509c700 1 --2- 192.168.123.103:0/1478432159 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e40107ff0 0x7f7e401083d0 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:23.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.380+0000 7f7e4509c700 1 -- 192.168.123.103:0/1478432159 >> 192.168.123.103:0/1478432159 conn(0x7f7e4006ce20 msgr2=0x7f7e4006d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:11:23.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.380+0000 7f7e4509c700 1 -- 192.168.123.103:0/1478432159 shutdown_connections
2026-03-09T00:11:23.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.380+0000 7f7e4509c700 1 -- 192.168.123.103:0/1478432159 wait complete.
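Every probe in these traces pays the full session tax (mon_getmap, authentication, map subscriptions, get_command_descriptions, mgrmap fetch) and then tears it all down, which is exactly the repeated mark_down/stop/shutdown_connections/wait-complete churn above. A long-running checker can amortize that with one persistent librados session. A minimal sketch, assuming the python-rados bindings (typically packaged as python3-rados) and a readable /etc/ceph/ceph.conf plus client keyring on the host:

    import json

    import rados  # python-rados bindings; an assumption about the host environment

    # One persistent client session instead of a fresh CLI process per probe.
    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")
    cluster.connect()
    try:
        # mon_command takes the same JSON the mons log as cmd=[...] above.
        ret, outbuf, outs = cluster.mon_command(
            json.dumps({"prefix": "fs dump", "format": "json"}), b"")
        if ret == 0:
            print("fsmap epoch", json.loads(outbuf)["epoch"])
    finally:
        cluster.shutdown()

The trade-off is that the per-invocation CLI style seen in the log needs no cleanup on test failure, which is likely why the harness tolerates the reconnect overhead.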
2026-03-09T00:11:23.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.381+0000 7f7e4509c700 1 Processor -- start 2026-03-09T00:11:23.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.381+0000 7f7e4509c700 1 -- start start 2026-03-09T00:11:23.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.381+0000 7f7e4509c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7e401089a0 0x7f7e4007cf10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:23.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.381+0000 7f7e4509c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e4007d450 0x7f7e4007d8d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:23.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.381+0000 7f7e4509c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7e400819f0 con 0x7f7e4007d450 2026-03-09T00:11:23.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.381+0000 7f7e4509c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7e40081b30 con 0x7f7e401089a0 2026-03-09T00:11:23.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.381+0000 7f7e3e59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e4007d450 0x7f7e4007d8d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:23.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.381+0000 7f7e3e59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e4007d450 0x7f7e4007d8d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42014/0 (socket says 192.168.123.103:42014) 2026-03-09T00:11:23.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.381+0000 7f7e3e59c700 1 -- 192.168.123.103:0/410812136 learned_addr learned my addr 192.168.123.103:0/410812136 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:11:23.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.381+0000 7f7e3ed9d700 1 --2- 192.168.123.103:0/410812136 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7e401089a0 0x7f7e4007cf10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:23.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.382+0000 7f7e3e59c700 1 -- 192.168.123.103:0/410812136 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7e401089a0 msgr2=0x7f7e4007cf10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:23.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.382+0000 7f7e3e59c700 1 --2- 192.168.123.103:0/410812136 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7e401089a0 0x7f7e4007cf10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:23.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.382+0000 7f7e3e59c700 1 -- 192.168.123.103:0/410812136 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7e300097e0 con 
0x7f7e4007d450 2026-03-09T00:11:23.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.382+0000 7f7e3e59c700 1 --2- 192.168.123.103:0/410812136 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e4007d450 0x7f7e4007d8d0 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f7e3800dc30 tx=0x7f7e3800dd10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:11:23.382 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.382+0000 7f7e27fff700 1 -- 192.168.123.103:0/410812136 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7e38004d60 con 0x7f7e4007d450 2026-03-09T00:11:23.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.382+0000 7f7e4509c700 1 -- 192.168.123.103:0/410812136 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7e40081dc0 con 0x7f7e4007d450 2026-03-09T00:11:23.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.382+0000 7f7e4509c700 1 -- 192.168.123.103:0/410812136 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7e40082310 con 0x7f7e4007d450 2026-03-09T00:11:23.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.383+0000 7f7e27fff700 1 -- 192.168.123.103:0/410812136 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f7e3800ce80 con 0x7f7e4007d450 2026-03-09T00:11:23.383 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.383+0000 7f7e27fff700 1 -- 192.168.123.103:0/410812136 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7e38019390 con 0x7f7e4007d450 2026-03-09T00:11:23.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.383+0000 7f7e4509c700 1 -- 192.168.123.103:0/410812136 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7e2c005320 con 0x7f7e4007d450 2026-03-09T00:11:23.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.384+0000 7f7e27fff700 1 -- 192.168.123.103:0/410812136 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7e3800c9a0 con 0x7f7e4007d450 2026-03-09T00:11:23.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.385+0000 7f7e27fff700 1 --2- 192.168.123.103:0/410812136 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f7e280779e0 0x7f7e28079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:23.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.385+0000 7f7e27fff700 1 -- 192.168.123.103:0/410812136 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f7e38028080 con 0x7f7e4007d450 2026-03-09T00:11:23.385 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.385+0000 7f7e3ed9d700 1 --2- 192.168.123.103:0/410812136 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f7e280779e0 0x7f7e28079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:23.386 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.386+0000 7f7e3ed9d700 1 --2- 192.168.123.103:0/410812136 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] 
conn(0x7f7e280779e0 0x7f7e28079ea0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f7e3000b5c0 tx=0x7f7e30011040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:11:23.388 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.387+0000 7f7e27fff700 1 -- 192.168.123.103:0/410812136 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7e38066a20 con 0x7f7e4007d450 2026-03-09T00:11:23.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:22 vm06.local ceph-mon[106218]: pgmap v237: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:11:23.502 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.502+0000 7f7e4509c700 1 -- 192.168.123.103:0/410812136 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f7e2c000bf0 con 0x7f7e280779e0 2026-03-09T00:11:23.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.507+0000 7f7e27fff700 1 -- 192.168.123.103:0/410812136 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f7e2c000bf0 con 0x7f7e280779e0 2026-03-09T00:11:23.507 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T00:11:23.507 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (6m) 11s ago 11m 24.4M - 0.25.0 c8568f914cd2 6bc39b415ac6 2026-03-09T00:11:23.507 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (11m) 11s ago 11m 9873k - 18.2.1 5be31c24972a b93f8a220f71 2026-03-09T00:11:23.507 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (11m) 5s ago 11m 9763k - 18.2.1 5be31c24972a d06aea65065e 2026-03-09T00:11:23.507 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (5m) 11s ago 11m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e a76700a1f6bf 2026-03-09T00:11:23.507 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (5m) 5s ago 11m 7860k - 19.2.3-678-ge911bdeb 654f31e6858e ab35f6a843c1 2026-03-09T00:11:23.507 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (6m) 11s ago 11m 87.9M - 10.4.0 c8b91775d855 00a3394cdec9 2026-03-09T00:11:23.507 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (15s) 11s ago 9m 14.3M - 19.2.3-678-ge911bdeb 654f31e6858e 0f2e03d0bb71 2026-03-09T00:11:23.507 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (31s) 11s ago 9m 18.5M - 19.2.3-678-ge911bdeb 654f31e6858e 35aa1832dc40 2026-03-09T00:11:23.507 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (9m) 5s ago 9m 94.9M - 18.2.1 5be31c24972a 868f24dd3b07 2026-03-09T00:11:23.507 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 running (6s) 5s ago 9m 16.0M - 19.2.3-678-ge911bdeb 654f31e6858e c123d86a9659 2026-03-09T00:11:23.507 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:8443,9283,8765 running (7m) 11s ago 12m 630M - 19.2.3-678-ge911bdeb 654f31e6858e 5c5f89207f88 2026-03-09T00:11:23.507 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (7m) 5s ago 11m 491M - 19.2.3-678-ge911bdeb 654f31e6858e e3d70135f3ac 2026-03-09T00:11:23.507 
INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (5m) 11s ago 12m 65.7M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cafe87ec117d 2026-03-09T00:11:23.507 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (5m) 5s ago 11m 56.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 33df752aa193 2026-03-09T00:11:23.507 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (7m) 11s ago 11m 9.78M - 1.7.0 72c9c2088986 0cdd6e671b4f 2026-03-09T00:11:23.508 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (6m) 5s ago 11m 9680k - 1.7.0 72c9c2088986 848c5c72973d 2026-03-09T00:11:23.508 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (5m) 11s ago 10m 184M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7112eceae9ce 2026-03-09T00:11:23.508 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (2m) 11s ago 10m 156M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e70d2f37c6d1 2026-03-09T00:11:23.508 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (109s) 11s ago 10m 111M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e7841e7307ae 2026-03-09T00:11:23.508 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (86s) 5s ago 10m 187M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 8e61be617139 2026-03-09T00:11:23.508 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (65s) 5s ago 10m 147M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 21cf4dc58899 2026-03-09T00:11:23.508 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (43s) 5s ago 10m 139M 4096M 19.2.3-678-ge911bdeb 654f31e6858e fbc950d55a67 2026-03-09T00:11:23.508 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (6m) 11s ago 11m 58.8M - 2.51.0 1d3b7f56885b 16d6071e49fb 2026-03-09T00:11:23.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.510+0000 7f7e25ffb700 1 -- 192.168.123.103:0/410812136 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f7e280779e0 msgr2=0x7f7e28079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:23.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.510+0000 7f7e25ffb700 1 --2- 192.168.123.103:0/410812136 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f7e280779e0 0x7f7e28079ea0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f7e3000b5c0 tx=0x7f7e30011040 comp rx=0 tx=0).stop 2026-03-09T00:11:23.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.510+0000 7f7e25ffb700 1 -- 192.168.123.103:0/410812136 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e4007d450 msgr2=0x7f7e4007d8d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:23.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.510+0000 7f7e25ffb700 1 --2- 192.168.123.103:0/410812136 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e4007d450 0x7f7e4007d8d0 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f7e3800dc30 tx=0x7f7e3800dd10 comp rx=0 tx=0).stop 2026-03-09T00:11:23.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.510+0000 7f7e25ffb700 1 -- 192.168.123.103:0/410812136 shutdown_connections 2026-03-09T00:11:23.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.510+0000 7f7e25ffb700 1 --2- 192.168.123.103:0/410812136 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f7e280779e0 0x7f7e28079ea0 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T00:11:23.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.510+0000 7f7e25ffb700 1 --2- 192.168.123.103:0/410812136 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f7e401089a0 0x7f7e4007cf10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:23.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.510+0000 7f7e25ffb700 1 --2- 192.168.123.103:0/410812136 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f7e4007d450 0x7f7e4007d8d0 unknown :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:23.510 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.510+0000 7f7e25ffb700 1 -- 192.168.123.103:0/410812136 >> 192.168.123.103:0/410812136 conn(0x7f7e4006ce20 msgr2=0x7f7e40070550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:11:23.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.510+0000 7f7e25ffb700 1 -- 192.168.123.103:0/410812136 shutdown_connections 2026-03-09T00:11:23.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.510+0000 7f7e25ffb700 1 -- 192.168.123.103:0/410812136 wait complete. 2026-03-09T00:11:23.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.598+0000 7f5d29c2b700 1 -- 192.168.123.103:0/1459593821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d240ffa10 msgr2=0x7f5d240ffdf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:23.598 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.598+0000 7f5d29c2b700 1 --2- 192.168.123.103:0/1459593821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d240ffa10 0x7f5d240ffdf0 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7f5d14009b00 tx=0x7f5d14009e10 comp rx=0 tx=0).stop 2026-03-09T00:11:23.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.598+0000 7f5d29c2b700 1 -- 192.168.123.103:0/1459593821 shutdown_connections 2026-03-09T00:11:23.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.598+0000 7f5d29c2b700 1 --2- 192.168.123.103:0/1459593821 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d24100330 0x7f5d2410d080 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:23.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.598+0000 7f5d29c2b700 1 --2- 192.168.123.103:0/1459593821 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d240ffa10 0x7f5d240ffdf0 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:23.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.598+0000 7f5d29c2b700 1 -- 192.168.123.103:0/1459593821 >> 192.168.123.103:0/1459593821 conn(0x7f5d240fb650 msgr2=0x7f5d240fda70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:11:23.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.598+0000 7f5d29c2b700 1 -- 192.168.123.103:0/1459593821 shutdown_connections 2026-03-09T00:11:23.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.599+0000 7f5d29c2b700 1 -- 192.168.123.103:0/1459593821 wait complete. 
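The `ceph orch ps` listing above is mid-upgrade: most daemons already report 19.2.3-678-ge911bdeb while mds.cephfs.vm06.ixduim is still on 18.2.1. A minimal sketch of how such a listing could be checked mechanically rather than by eye, assuming `jq` on the host and that `orch ps --format json` exposes `daemon_type`, `daemon_id`, `hostname` and `version` fields (the field names are an assumption for illustration, not taken from this run):

    # Hypothetical target build string, for illustration only.
    TARGET="19.2.3-678-ge911bdeb"
    # Print every daemon whose reported version is not the target.
    ceph orch ps --format json \
      | jq -r --arg v "$TARGET" \
          '.[] | select(.version != $v)
               | "\(.daemon_type).\(.daemon_id) on \(.hostname): \(.version)"'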
2026-03-09T00:11:23.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.599+0000 7f5d29c2b700 1 Processor -- start 2026-03-09T00:11:23.599 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.599+0000 7f5d29c2b700 1 -- start start 2026-03-09T00:11:23.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.600+0000 7f5d29c2b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d240ffa10 0x7f5d2419e6f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:23.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.600+0000 7f5d237fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d240ffa10 0x7f5d2419e6f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:23.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.600+0000 7f5d237fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d240ffa10 0x7f5d2419e6f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42034/0 (socket says 192.168.123.103:42034) 2026-03-09T00:11:23.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.600+0000 7f5d237fe700 1 -- 192.168.123.103:0/1743993257 learned_addr learned my addr 192.168.123.103:0/1743993257 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:11:23.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.600+0000 7f5d29c2b700 1 --2- 192.168.123.103:0/1743993257 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d24100330 0x7f5d2419ec30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:23.600 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.600+0000 7f5d29c2b700 1 -- 192.168.123.103:0/1743993257 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d2419f2c0 con 0x7f5d240ffa10 2026-03-09T00:11:23.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.600+0000 7f5d22ffd700 1 --2- 192.168.123.103:0/1743993257 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d24100330 0x7f5d2419ec30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:23.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.600+0000 7f5d29c2b700 1 -- 192.168.123.103:0/1743993257 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d24198980 con 0x7f5d24100330 2026-03-09T00:11:23.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.601+0000 7f5d237fe700 1 -- 192.168.123.103:0/1743993257 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d24100330 msgr2=0x7f5d2419ec30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:23.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.601+0000 7f5d237fe700 1 --2- 192.168.123.103:0/1743993257 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d24100330 0x7f5d2419ec30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:23.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.601+0000 7f5d237fe700 1 -- 192.168.123.103:0/1743993257 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5d140097e0 con 0x7f5d240ffa10 2026-03-09T00:11:23.601 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.601+0000 7f5d237fe700 1 --2- 192.168.123.103:0/1743993257 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d240ffa10 0x7f5d2419e6f0 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f5d14006010 tx=0x7f5d14004c30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:11:23.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.602+0000 7f5d20ff9700 1 -- 192.168.123.103:0/1743993257 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d1401d070 con 0x7f5d240ffa10 2026-03-09T00:11:23.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.602+0000 7f5d20ff9700 1 -- 192.168.123.103:0/1743993257 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5d1400bc50 con 0x7f5d240ffa10 2026-03-09T00:11:23.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.602+0000 7f5d20ff9700 1 -- 192.168.123.103:0/1743993257 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5d14022860 con 0x7f5d240ffa10 2026-03-09T00:11:23.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.602+0000 7f5d29c2b700 1 -- 192.168.123.103:0/1743993257 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5d24198c00 con 0x7f5d240ffa10 2026-03-09T00:11:23.602 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.602+0000 7f5d29c2b700 1 -- 192.168.123.103:0/1743993257 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5d241990f0 con 0x7f5d240ffa10 2026-03-09T00:11:23.603 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.602+0000 7f5d29c2b700 1 -- 192.168.123.103:0/1743993257 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5d2410a7f0 con 0x7f5d240ffa10 2026-03-09T00:11:23.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.604+0000 7f5d20ff9700 1 -- 192.168.123.103:0/1743993257 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5d1400fbf0 con 0x7f5d240ffa10 2026-03-09T00:11:23.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.604+0000 7f5d20ff9700 1 --2- 192.168.123.103:0/1743993257 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5d0c0778c0 0x7f5d0c079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:23.604 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.604+0000 7f5d20ff9700 1 -- 192.168.123.103:0/1743993257 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f5d1409b8f0 con 0x7f5d240ffa10 2026-03-09T00:11:23.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.604+0000 7f5d22ffd700 1 --2- 192.168.123.103:0/1743993257 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5d0c0778c0 0x7f5d0c079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:23.605 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.605+0000 
7f5d22ffd700 1 --2- 192.168.123.103:0/1743993257 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5d0c0778c0 0x7f5d0c079d80 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f5d24199df0 tx=0x7f5d1800b410 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:11:23.608 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.607+0000 7f5d20ff9700 1 -- 192.168.123.103:0/1743993257 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5d140640a0 con 0x7f5d240ffa10 2026-03-09T00:11:23.768 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.768+0000 7f5d29c2b700 1 -- 192.168.123.103:0/1743993257 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f5d2404f2e0 con 0x7f5d240ffa10 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.772+0000 7f5d20ff9700 1 -- 192.168.123.103:0/1743993257 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f5d140637f0 con 0x7f5d240ffa10 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 3 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 13 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-09T00:11:23.772 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T00:11:23.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.778+0000 7f5d0a7fc700 1 -- 192.168.123.103:0/1743993257 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5d0c0778c0 msgr2=0x7f5d0c079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:23.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.778+0000 7f5d0a7fc700 1 --2- 192.168.123.103:0/1743993257 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5d0c0778c0 0x7f5d0c079d80 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto 
rx=0x7f5d24199df0 tx=0x7f5d1800b410 comp rx=0 tx=0).stop 2026-03-09T00:11:23.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.778+0000 7f5d0a7fc700 1 -- 192.168.123.103:0/1743993257 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d240ffa10 msgr2=0x7f5d2419e6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:23.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.778+0000 7f5d0a7fc700 1 --2- 192.168.123.103:0/1743993257 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d240ffa10 0x7f5d2419e6f0 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7f5d14006010 tx=0x7f5d14004c30 comp rx=0 tx=0).stop 2026-03-09T00:11:23.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.778+0000 7f5d0a7fc700 1 -- 192.168.123.103:0/1743993257 shutdown_connections 2026-03-09T00:11:23.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.778+0000 7f5d0a7fc700 1 --2- 192.168.123.103:0/1743993257 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5d0c0778c0 0x7f5d0c079d80 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:23.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.778+0000 7f5d0a7fc700 1 --2- 192.168.123.103:0/1743993257 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5d240ffa10 0x7f5d2419e6f0 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:23.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.778+0000 7f5d0a7fc700 1 --2- 192.168.123.103:0/1743993257 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5d24100330 0x7f5d2419ec30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:23.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.778+0000 7f5d0a7fc700 1 -- 192.168.123.103:0/1743993257 >> 192.168.123.103:0/1743993257 conn(0x7f5d240fb650 msgr2=0x7f5d240fcbf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:11:23.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.778+0000 7f5d0a7fc700 1 -- 192.168.123.103:0/1743993257 shutdown_connections 2026-03-09T00:11:23.779 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.779+0000 7f5d0a7fc700 1 -- 192.168.123.103:0/1743993257 wait complete. 
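The `ceph versions` output above is the convergence signal the suite keys on: every daemon class, 13 daemons overall, reports the same 19.2.3-678-ge911bdeb build. A sketch of the kind of wait loop that expresses that condition, in the same `jq -e` style the test's shell steps use (the interval is illustrative, not from this run):

    # Block until every running daemon reports a single ceph version.
    until ceph versions | jq -e '.overall | length == 1' >/dev/null; do
        sleep 10
    done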
2026-03-09T00:11:23.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.850+0000 7f0e25096700 1 -- 192.168.123.103:0/3452484045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e20107d90 msgr2=0x7f0e20108210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:23.851 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.850+0000 7f0e25096700 1 --2- 192.168.123.103:0/3452484045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e20107d90 0x7f0e20108210 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f0e1801c320 tx=0x7f0e1801c630 comp rx=0 tx=0).stop 2026-03-09T00:11:23.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.850+0000 7f0e25096700 1 -- 192.168.123.103:0/3452484045 shutdown_connections 2026-03-09T00:11:23.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.850+0000 7f0e25096700 1 --2- 192.168.123.103:0/3452484045 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e20107d90 0x7f0e20108210 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:23.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.850+0000 7f0e25096700 1 --2- 192.168.123.103:0/3452484045 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0e2010f420 0x7f0e2010f800 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:23.853 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.850+0000 7f0e25096700 1 -- 192.168.123.103:0/3452484045 >> 192.168.123.103:0/3452484045 conn(0x7f0e2006ce20 msgr2=0x7f0e2006d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:11:23.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.853+0000 7f0e25096700 1 -- 192.168.123.103:0/3452484045 shutdown_connections 2026-03-09T00:11:23.854 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.853+0000 7f0e25096700 1 -- 192.168.123.103:0/3452484045 wait complete. 
2026-03-09T00:11:23.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.854+0000 7f0e25096700 1 Processor -- start 2026-03-09T00:11:23.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.854+0000 7f0e25096700 1 -- start start 2026-03-09T00:11:23.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.854+0000 7f0e25096700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e2010f420 0x7f0e2019cf30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:23.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.854+0000 7f0e25096700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0e2019d470 0x7f0e201a18e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:23.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.854+0000 7f0e25096700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0e2019da90 con 0x7f0e2010f420 2026-03-09T00:11:23.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.854+0000 7f0e25096700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0e2019dc00 con 0x7f0e2019d470 2026-03-09T00:11:23.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.854+0000 7f0e1e59c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0e2019d470 0x7f0e201a18e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:23.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.854+0000 7f0e1ed9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e2010f420 0x7f0e2019cf30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:23.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.854+0000 7f0e1ed9d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e2010f420 0x7f0e2019cf30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42038/0 (socket says 192.168.123.103:42038) 2026-03-09T00:11:23.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.854+0000 7f0e1ed9d700 1 -- 192.168.123.103:0/275248848 learned_addr learned my addr 192.168.123.103:0/275248848 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:11:23.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.855+0000 7f0e1e59c700 1 -- 192.168.123.103:0/275248848 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e2010f420 msgr2=0x7f0e2019cf30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:23.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.855+0000 7f0e1e59c700 1 --2- 192.168.123.103:0/275248848 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e2010f420 0x7f0e2019cf30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:23.855 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.855+0000 7f0e1e59c700 1 -- 192.168.123.103:0/275248848 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0e1801c060 con 0x7f0e2019d470 
2026-03-09T00:11:23.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.855+0000 7f0e1e59c700 1 --2- 192.168.123.103:0/275248848 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0e2019d470 0x7f0e201a18e0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f0e18007fd0 tx=0x7f0e18009300 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:11:23.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.855+0000 7f0e07fff700 1 -- 192.168.123.103:0/275248848 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0e18003c10 con 0x7f0e2019d470 2026-03-09T00:11:23.856 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.856+0000 7f0e25096700 1 -- 192.168.123.103:0/275248848 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0e201a1e80 con 0x7f0e2019d470 2026-03-09T00:11:23.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.856+0000 7f0e25096700 1 -- 192.168.123.103:0/275248848 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0e201a23d0 con 0x7f0e2019d470 2026-03-09T00:11:23.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.856+0000 7f0e07fff700 1 -- 192.168.123.103:0/275248848 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f0e1801f040 con 0x7f0e2019d470 2026-03-09T00:11:23.857 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.857+0000 7f0e07fff700 1 -- 192.168.123.103:0/275248848 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0e1802c670 con 0x7f0e2019d470 2026-03-09T00:11:23.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.858+0000 7f0e07fff700 1 -- 192.168.123.103:0/275248848 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0e1802a070 con 0x7f0e2019d470 2026-03-09T00:11:23.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.858+0000 7f0e07fff700 1 --2- 192.168.123.103:0/275248848 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0e08077910 0x7f0e08079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:23.858 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.858+0000 7f0e07fff700 1 -- 192.168.123.103:0/275248848 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f0e180ab450 con 0x7f0e2019d470 2026-03-09T00:11:23.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.858+0000 7f0e1ed9d700 1 --2- 192.168.123.103:0/275248848 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0e08077910 0x7f0e08079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:23.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.859+0000 7f0e25096700 1 -- 192.168.123.103:0/275248848 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0e0c005320 con 0x7f0e2019d470 2026-03-09T00:11:23.859 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.859+0000 7f0e1ed9d700 1 --2- 192.168.123.103:0/275248848 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0e08077910 
0x7f0e08079dd0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f0e10009cc0 tx=0x7f0e10009400 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:11:23.865 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:23.865+0000 7f0e07fff700 1 -- 192.168.123.103:0/275248848 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0e18073b80 con 0x7f0e2019d470 2026-03-09T00:11:24.007 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.007+0000 7f0e25096700 1 -- 192.168.123.103:0/275248848 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f0e0c006200 con 0x7f0e2019d470 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.013+0000 7f0e07fff700 1 -- 192.168.123.103:0/275248848 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 27 v27) v1 ==== 76+0+1788 (secure 0 0 0) 0x7f0e18028020 con 0x7f0e2019d470 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:e27 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:btime 2026-03-09T00:11:20:931090+0000 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:epoch 27 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T00:01:42.952984+0000 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T00:11:20.931086+0000 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:root 0 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {} 2026-03-09T00:11:24.013 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 109 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout 
v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:in 0 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:up {0=34404} 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:failed 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:damaged 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:stopped 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3] 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:balancer 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 0 members: 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.sejksk{0:34404} state up:replay seq 1 join_fscid=1 addr [v2:192.168.123.103:6826/784666836,v1:192.168.123.103:6827/784666836] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons: 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.ralade{-1:34412} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.103:6828/1027317762,v1:192.168.123.103:6829/1027317762] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T00:11:24.014 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.vlrwtl{-1:44297} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6824/1987865018,v1:192.168.123.106:6825/1987865018] compat {c=[1],r=[1],i=[1fff]}] 2026-03-09T00:11:24.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.015+0000 7f0e25096700 1 -- 192.168.123.103:0/275248848 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0e08077910 msgr2=0x7f0e08079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:24.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.015+0000 7f0e25096700 1 --2- 192.168.123.103:0/275248848 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0e08077910 0x7f0e08079dd0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f0e10009cc0 tx=0x7f0e10009400 comp rx=0 tx=0).stop 2026-03-09T00:11:24.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.015+0000 7f0e25096700 1 -- 192.168.123.103:0/275248848 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0e2019d470 msgr2=0x7f0e201a18e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:24.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.015+0000 7f0e25096700 1 --2- 192.168.123.103:0/275248848 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0e2019d470 0x7f0e201a18e0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f0e18007fd0 tx=0x7f0e18009300 comp rx=0 tx=0).stop 2026-03-09T00:11:24.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.017+0000 7f0e25096700 1 -- 
192.168.123.103:0/275248848 shutdown_connections 2026-03-09T00:11:24.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.017+0000 7f0e25096700 1 --2- 192.168.123.103:0/275248848 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0e08077910 0x7f0e08079dd0 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:24.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.017+0000 7f0e25096700 1 --2- 192.168.123.103:0/275248848 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0e2010f420 0x7f0e2019cf30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:24.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.017+0000 7f0e25096700 1 --2- 192.168.123.103:0/275248848 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0e2019d470 0x7f0e201a18e0 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:24.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.017+0000 7f0e25096700 1 -- 192.168.123.103:0/275248848 >> 192.168.123.103:0/275248848 conn(0x7f0e2006ce20 msgr2=0x7f0e2006f580 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:11:24.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.017+0000 7f0e25096700 1 -- 192.168.123.103:0/275248848 shutdown_connections 2026-03-09T00:11:24.017 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.017+0000 7f0e25096700 1 -- 192.168.123.103:0/275248848 wait complete. 2026-03-09T00:11:24.018 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 27 2026-03-09T00:11:24.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.104+0000 7ff65359e700 1 -- 192.168.123.103:0/707543655 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff65410f380 msgr2=0x7ff65410f760 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:24.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.104+0000 7ff65359e700 1 --2- 192.168.123.103:0/707543655 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff65410f380 0x7ff65410f760 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7ff644009b00 tx=0x7ff644009e10 comp rx=0 tx=0).stop 2026-03-09T00:11:24.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.104+0000 7ff65359e700 1 -- 192.168.123.103:0/707543655 shutdown_connections 2026-03-09T00:11:24.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.104+0000 7ff65359e700 1 --2- 192.168.123.103:0/707543655 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff65410cf00 0x7ff65410d380 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:24.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.104+0000 7ff65359e700 1 --2- 192.168.123.103:0/707543655 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff65410f380 0x7ff65410f760 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:24.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.104+0000 7ff65359e700 1 -- 192.168.123.103:0/707543655 >> 192.168.123.103:0/707543655 conn(0x7ff65406ce10 msgr2=0x7ff65406d220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:11:24.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.104+0000 7ff65359e700 1 -- 192.168.123.103:0/707543655 shutdown_connections 2026-03-09T00:11:24.105 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.104+0000 7ff65359e700 1 -- 192.168.123.103:0/707543655 wait complete. 2026-03-09T00:11:24.105 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.105+0000 7ff65359e700 1 Processor -- start 2026-03-09T00:11:24.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.105+0000 7ff65359e700 1 -- start start 2026-03-09T00:11:24.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.105+0000 7ff65359e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff65410cf00 0x7ff65419d070 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:24.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.105+0000 7ff65359e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff65410f380 0x7ff65419d5b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:24.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.105+0000 7ff65359e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff65419dc90 con 0x7ff65410cf00 2026-03-09T00:11:24.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.105+0000 7ff65359e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff6541a19d0 con 0x7ff65410f380 2026-03-09T00:11:24.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.105+0000 7ff651d9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff65410f380 0x7ff65419d5b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:24.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.105+0000 7ff651d9b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff65410f380 0x7ff65419d5b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:33662/0 (socket says 192.168.123.103:33662) 2026-03-09T00:11:24.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.105+0000 7ff651d9b700 1 -- 192.168.123.103:0/2633363704 learned_addr learned my addr 192.168.123.103:0/2633363704 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:11:24.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.106+0000 7ff651d9b700 1 -- 192.168.123.103:0/2633363704 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff65410cf00 msgr2=0x7ff65419d070 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:24.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.106+0000 7ff651d9b700 1 --2- 192.168.123.103:0/2633363704 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff65410cf00 0x7ff65419d070 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:24.106 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.106+0000 7ff651d9b700 1 -- 192.168.123.103:0/2633363704 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff6440097e0 con 0x7ff65410f380 2026-03-09T00:11:24.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.106+0000 7ff651d9b700 1 --2- 192.168.123.103:0/2633363704 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] 
conn(0x7ff65410f380 0x7ff65419d5b0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7ff64c01d3b0 tx=0x7ff64c01d770 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:11:24.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.107+0000 7ff6437fe700 1 -- 192.168.123.103:0/2633363704 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff64c01f030 con 0x7ff65410f380 2026-03-09T00:11:24.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.107+0000 7ff6437fe700 1 -- 192.168.123.103:0/2633363704 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff64c020040 con 0x7ff65410f380 2026-03-09T00:11:24.107 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.107+0000 7ff6437fe700 1 -- 192.168.123.103:0/2633363704 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff64c025770 con 0x7ff65410f380 2026-03-09T00:11:24.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.107+0000 7ff65359e700 1 -- 192.168.123.103:0/2633363704 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff6541a1cb0 con 0x7ff65410f380 2026-03-09T00:11:24.108 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.107+0000 7ff65359e700 1 -- 192.168.123.103:0/2633363704 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff6541a2200 con 0x7ff65410f380 2026-03-09T00:11:24.110 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.107+0000 7ff65359e700 1 -- 192.168.123.103:0/2633363704 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff65410c550 con 0x7ff65410f380 2026-03-09T00:11:24.111 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.111+0000 7ff6437fe700 1 -- 192.168.123.103:0/2633363704 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff64c004ad0 con 0x7ff65410f380 2026-03-09T00:11:24.112 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.111+0000 7ff6437fe700 1 --2- 192.168.123.103:0/2633363704 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff63c0778c0 0x7ff63c079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:24.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.111+0000 7ff6437fe700 1 -- 192.168.123.103:0/2633363704 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7ff64c0aacf0 con 0x7ff65410f380 2026-03-09T00:11:24.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.113+0000 7ff65259c700 1 --2- 192.168.123.103:0/2633363704 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff63c0778c0 0x7ff63c079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:24.113 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.113+0000 7ff65259c700 1 --2- 192.168.123.103:0/2633363704 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff63c0778c0 0x7ff63c079d80 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7ff644005f50 tx=0x7ff644005e40 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
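The fsmap dumped above at epoch 27 shows rank 0 held by mds.cephfs.vm03.sejksk in up:replay, with two upgraded standbys and inline_data still enabled, i.e. failover onto the new binaries is underway. A sketch of querying that rank state as JSON instead of scraping the plain dump, assuming `jq` and the usual `fs dump` JSON layout (`.filesystems[].mdsmap.info`, an assumption here):

    # Report name, rank and state for each MDS holding a rank in 'cephfs'.
    ceph fs dump --format json \
      | jq -r '.filesystems[]
               | select(.mdsmap.fs_name == "cephfs")
               | .mdsmap.info[]
               | "\(.name) rank=\(.rank) \(.state)"'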
2026-03-09T00:11:24.114 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.113+0000 7ff6437fe700 1 -- 192.168.123.103:0/2633363704 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff64c0734a0 con 0x7ff65410f380
2026-03-09T00:11:24.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.240+0000 7ff65359e700 1 -- 192.168.123.103:0/2633363704 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff6541a24e0 con 0x7ff63c0778c0
2026-03-09T00:11:24.245 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:11:24.245 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc",
2026-03-09T00:11:24.245 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": true,
2026-03-09T00:11:24.245 INFO:teuthology.orchestra.run.vm03.stdout: "which": "Upgrading all daemon types on all hosts",
2026-03-09T00:11:24.245 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [
2026-03-09T00:11:24.245 INFO:teuthology.orchestra.run.vm03.stdout: "crash",
2026-03-09T00:11:24.245 INFO:teuthology.orchestra.run.vm03.stdout: "mon",
2026-03-09T00:11:24.245 INFO:teuthology.orchestra.run.vm03.stdout: "osd",
2026-03-09T00:11:24.245 INFO:teuthology.orchestra.run.vm03.stdout: "mgr"
2026-03-09T00:11:24.245 INFO:teuthology.orchestra.run.vm03.stdout: ],
2026-03-09T00:11:24.245 INFO:teuthology.orchestra.run.vm03.stdout: "progress": "15/23 daemons upgraded",
2026-03-09T00:11:24.245 INFO:teuthology.orchestra.run.vm03.stdout: "message": "Currently upgrading mds daemons",
2026-03-09T00:11:24.245 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false
2026-03-09T00:11:24.245 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:11:24.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.244+0000 7ff6437fe700 1 -- 192.168.123.103:0/2633363704 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7ff6541a24e0 con 0x7ff63c0778c0
2026-03-09T00:11:24.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.250+0000 7ff6417fa700 1 -- 192.168.123.103:0/2633363704 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff63c0778c0 msgr2=0x7ff63c079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:11:24.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.250+0000 7ff6417fa700 1 --2- 192.168.123.103:0/2633363704 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff63c0778c0 0x7ff63c079d80 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7ff644005f50 tx=0x7ff644005e40 comp rx=0 tx=0).stop
2026-03-09T00:11:24.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.250+0000 7ff6417fa700 1 -- 192.168.123.103:0/2633363704 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff65410f380 msgr2=0x7ff65419d5b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:11:24.250 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.250+0000 7ff6417fa700 1 --2- 192.168.123.103:0/2633363704 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff65410f380 0x7ff65419d5b0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7ff64c01d3b0 tx=0x7ff64c01d770 comp rx=0 tx=0).stop
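
The JSON block above is the stdout of `ceph orch upgrade status` captured mid-run: crash, mon, osd, and mgr are complete, 15 of 23 daemons are upgraded, and the orchestrator is working through the mds batch. A minimal shell-plus-jq polling sketch; the field names (in_progress, progress, message) are exactly those printed above, while the sleep interval is an arbitrary choice:

    # Poll the cephadm upgrade until it reports completion.
    while ceph orch upgrade status | jq -e '.in_progress' >/dev/null; do
        ceph orch upgrade status | jq -r '"\(.progress) - \(.message)"'
        sleep 30
    done
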
2026-03-09T00:11:24.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.250+0000 7ff6417fa700 1 -- 192.168.123.103:0/2633363704 shutdown_connections
2026-03-09T00:11:24.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.250+0000 7ff6417fa700 1 --2- 192.168.123.103:0/2633363704 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff63c0778c0 0x7ff63c079d80 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:24.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.250+0000 7ff6417fa700 1 --2- 192.168.123.103:0/2633363704 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff65410cf00 0x7ff65419d070 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:24.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.250+0000 7ff6417fa700 1 --2- 192.168.123.103:0/2633363704 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff65410f380 0x7ff65419d5b0 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:24.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.250+0000 7ff6417fa700 1 -- 192.168.123.103:0/2633363704 >> 192.168.123.103:0/2633363704 conn(0x7ff65406ce10 msgr2=0x7ff6540704c0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:11:24.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.250+0000 7ff6417fa700 1 -- 192.168.123.103:0/2633363704 shutdown_connections
2026-03-09T00:11:24.251 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.250+0000 7ff6417fa700 1 -- 192.168.123.103:0/2633363704 wait complete.
2026-03-09T00:11:24.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.327+0000 7fc5a40c7700 1 -- 192.168.123.103:0/4146579716 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc59c1089a0 msgr2=0x7fc59c10be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:11:24.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.327+0000 7fc5a40c7700 1 --2- 192.168.123.103:0/4146579716 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc59c1089a0 0x7fc59c10be70 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7fc59400b600 tx=0x7fc59400b910 comp rx=0 tx=0).stop
2026-03-09T00:11:24.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.328+0000 7fc5a40c7700 1 -- 192.168.123.103:0/4146579716 shutdown_connections
2026-03-09T00:11:24.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.328+0000 7fc5a40c7700 1 --2- 192.168.123.103:0/4146579716 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc59c1089a0 0x7fc59c10be70 unknown :-1 s=CLOSED pgs=155 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:24.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.328+0000 7fc5a40c7700 1 --2- 192.168.123.103:0/4146579716 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc59c107ff0 0x7fc59c1083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:24.328 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.328+0000 7fc5a40c7700 1 -- 192.168.123.103:0/4146579716 >> 192.168.123.103:0/4146579716 conn(0x7fc59c06ce20 msgr2=0x7fc59c06d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:11:24.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.328+0000 7fc5a40c7700 1 -- 192.168.123.103:0/4146579716 shutdown_connections
2026-03-09T00:11:24.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.328+0000 7fc5a40c7700 1 -- 192.168.123.103:0/4146579716 wait complete.
2026-03-09T00:11:24.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.328+0000 7fc5a40c7700 1 Processor -- start
2026-03-09T00:11:24.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.328+0000 7fc5a40c7700 1 -- start start
2026-03-09T00:11:24.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.328+0000 7fc5a40c7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc59c107ff0 0x7fc59c07d250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:11:24.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.328+0000 7fc5a40c7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc59c07d790 0x7fc59c081b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:11:24.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.328+0000 7fc5a40c7700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc59c07dc10 con 0x7fc59c107ff0
2026-03-09T00:11:24.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.328+0000 7fc5a40c7700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc59c07dd50 con 0x7fc59c07d790
2026-03-09T00:11:24.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.328+0000 7fc5a1e63700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc59c107ff0 0x7fc59c07d250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:11:24.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.328+0000 7fc5a1e63700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc59c107ff0 0x7fc59c07d250 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:42070/0 (socket says 192.168.123.103:42070)
2026-03-09T00:11:24.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.328+0000 7fc5a1e63700 1 -- 192.168.123.103:0/2777014373 learned_addr learned my addr 192.168.123.103:0/2777014373 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:11:24.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.329+0000 7fc5a1662700 1 --2- 192.168.123.103:0/2777014373 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc59c07d790 0x7fc59c081b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:11:24.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.329+0000 7fc5a1e63700 1 -- 192.168.123.103:0/2777014373 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc59c07d790 msgr2=0x7fc59c081b60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:11:24.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.329+0000 7fc5a1e63700 1 --2- 192.168.123.103:0/2777014373 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc59c07d790 0x7fc59c081b60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:24.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.329+0000 7fc5a1e63700 1 -- 192.168.123.103:0/2777014373 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc59400b050 con 0x7fc59c107ff0
2026-03-09T00:11:24.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.329+0000 7fc5a1e63700 1 --2- 192.168.123.103:0/2777014373 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc59c107ff0 0x7fc59c07d250 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7fc59800ba70 tx=0x7fc59800be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:11:24.329 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.329+0000 7fc592ffd700 1 -- 192.168.123.103:0/2777014373 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc59800c760 con 0x7fc59c107ff0
2026-03-09T00:11:24.331 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.329+0000 7fc5a40c7700 1 -- 192.168.123.103:0/2777014373 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc59c0820a0 con 0x7fc59c107ff0
2026-03-09T00:11:24.331 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.329+0000 7fc5a40c7700 1 -- 192.168.123.103:0/2777014373 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc59c082590 con 0x7fc59c107ff0
2026-03-09T00:11:24.331 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.331+0000 7fc592ffd700 1 -- 192.168.123.103:0/2777014373 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fc59800cda0 con 0x7fc59c107ff0
2026-03-09T00:11:24.331 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.331+0000 7fc592ffd700 1 -- 192.168.123.103:0/2777014373 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc598012550 con 0x7fc59c107ff0
2026-03-09T00:11:24.331 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.331+0000 7fc592ffd700 1 -- 192.168.123.103:0/2777014373 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc598012770 con 0x7fc59c107ff0
2026-03-09T00:11:24.332 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.332+0000 7fc592ffd700 1 --2- 192.168.123.103:0/2777014373 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc588079cd0 0x7fc58807c190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:11:24.332 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.332+0000 7fc5a1662700 1 --2- 192.168.123.103:0/2777014373 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc588079cd0 0x7fc58807c190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:11:24.332 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.332+0000 7fc5a1662700 1 --2- 192.168.123.103:0/2777014373 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc588079cd0 0x7fc58807c190 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fc594007fd0 tx=0x7fc594007ca0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:11:24.332 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.332+0000 7fc592ffd700 1 -- 192.168.123.103:0/2777014373 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fc598099e50 con 0x7fc59c107ff0
2026-03-09T00:11:24.332 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.332+0000 7fc5a40c7700 1 -- 192.168.123.103:0/2777014373 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc580005320 con 0x7fc59c107ff0
2026-03-09T00:11:24.335 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.335+0000 7fc592ffd700 1 -- 192.168.123.103:0/2777014373 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc598062710 con 0x7fc59c107ff0
2026-03-09T00:11:24.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:24 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/1743993257' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:24.494 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.494+0000 7fc5a40c7700 1 -- 192.168.123.103:0/2777014373 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fc580005190 con 0x7fc59c107ff0
2026-03-09T00:11:24.495 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:24 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/1743993257' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:24.497 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.497+0000 7fc592ffd700 1 -- 192.168.123.103:0/2777014373 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+297 (secure 0 0 0) 0x7fc598019090 con 0x7fc59c107ff0
2026-03-09T00:11:24.497 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem is degraded; 1 filesystem with deprecated feature inline_data
2026-03-09T00:11:24.497 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_DEGRADED: 1 filesystem is degraded
2026-03-09T00:11:24.497 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs is degraded
2026-03-09T00:11:24.497 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data
2026-03-09T00:11:24.498 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled.
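
The HEALTH_WARN detail above is expected at this point in the run: FS_DEGRADED because the mds rank is mid-failover during the upgrade, and FS_INLINE_DATA_DEPRECATED because this job deliberately enabled inline_data earlier (the detail line itself says the feature is enabled on cephfs). A sketch for blocking until a named check clears, assuming the JSON form of the health report keeps its "checks" map keyed by check name, as in current releases:

    # Wait for the FS_DEGRADED check to disappear from the health report.
    until ceph health detail --format=json | jq -e '.checks | has("FS_DEGRADED") | not' >/dev/null; do
        sleep 5
    done
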
2026-03-09T00:11:24.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.500+0000 7fc590ff9700 1 -- 192.168.123.103:0/2777014373 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc588079cd0 msgr2=0x7fc58807c190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:11:24.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.500+0000 7fc590ff9700 1 --2- 192.168.123.103:0/2777014373 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc588079cd0 0x7fc58807c190 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fc594007fd0 tx=0x7fc594007ca0 comp rx=0 tx=0).stop
2026-03-09T00:11:24.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.500+0000 7fc590ff9700 1 -- 192.168.123.103:0/2777014373 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc59c107ff0 msgr2=0x7fc59c07d250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:11:24.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.500+0000 7fc590ff9700 1 --2- 192.168.123.103:0/2777014373 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc59c107ff0 0x7fc59c07d250 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7fc59800ba70 tx=0x7fc59800be30 comp rx=0 tx=0).stop
2026-03-09T00:11:24.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.500+0000 7fc590ff9700 1 -- 192.168.123.103:0/2777014373 shutdown_connections
2026-03-09T00:11:24.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.500+0000 7fc590ff9700 1 --2- 192.168.123.103:0/2777014373 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc588079cd0 0x7fc58807c190 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:24.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.500+0000 7fc590ff9700 1 --2- 192.168.123.103:0/2777014373 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc59c107ff0 0x7fc59c07d250 unknown :-1 s=CLOSED pgs=156 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:24.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.500+0000 7fc590ff9700 1 --2- 192.168.123.103:0/2777014373 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc59c07d790 0x7fc59c081b60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:11:24.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.500+0000 7fc590ff9700 1 -- 192.168.123.103:0/2777014373 >> 192.168.123.103:0/2777014373 conn(0x7fc59c06ce20 msgr2=0x7fc59c10ae50 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:11:24.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.500+0000 7fc590ff9700 1 -- 192.168.123.103:0/2777014373 shutdown_connections
2026-03-09T00:11:24.500 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:24.500+0000 7fc590ff9700 1 -- 192.168.123.103:0/2777014373 wait complete.
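
The connect / mark_down / shutdown_connections / "wait complete." churn above repeats for every one-shot ceph CLI invocation in this run: each call spins up a fresh client messenger (the 192.168.123.103:0/<nonce> instances), fetches command descriptions, issues its single command, and tears the messenger down. A rough way to gauge that churn in a saved log, using the nonce pattern visible above (the log file name is an assumption):

    # Count distinct short-lived client messenger instances by their nonce.
    grep -oE '192\.168\.123\.103:0/[0-9]+' teuthology.log | sort -u | wc -l
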
2026-03-09T00:11:25.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:25 vm03.local ceph-mon[129670]: from='client.44301 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:11:25.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:25 vm03.local ceph-mon[129670]: from='client.44305 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:11:25.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:25 vm03.local ceph-mon[129670]: from='client.34426 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:11:25.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:25 vm03.local ceph-mon[129670]: pgmap v238: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 11 MiB/s rd, 3 op/s
2026-03-09T00:11:25.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:25 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/275248848' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T00:11:25.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:25 vm03.local ceph-mon[129670]: from='client.44319 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:11:25.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:25 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2777014373' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T00:11:25.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:25 vm06.local ceph-mon[106218]: from='client.44301 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:11:25.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:25 vm06.local ceph-mon[106218]: from='client.44305 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:11:25.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:25 vm06.local ceph-mon[106218]: from='client.34426 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:11:25.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:25 vm06.local ceph-mon[106218]: pgmap v238: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 11 MiB/s rd, 3 op/s
2026-03-09T00:11:25.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:25 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/275248848' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-09T00:11:25.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:25 vm06.local ceph-mon[106218]: from='client.44319 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-09T00:11:25.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:25 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/2777014373' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-09T00:11:26.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:26 vm03.local ceph-mon[129670]: mds.? [v2:192.168.123.103:6826/784666836,v1:192.168.123.103:6827/784666836] up:reconnect
2026-03-09T00:11:26.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:26 vm03.local ceph-mon[129670]: fsmap cephfs:1/1 {0=cephfs.vm03.sejksk=up:reconnect} 2 up:standby
2026-03-09T00:11:26.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:26 vm03.local ceph-mon[129670]: reconnect by client.24309 v1:192.168.123.103:0/929198358 after 0
2026-03-09T00:11:26.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:26 vm03.local ceph-mon[129670]: reconnect by client.24313 v1:192.168.144.1:0/748981855 after 0.001
2026-03-09T00:11:26.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:26 vm03.local ceph-mon[129670]: pgmap v239: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 15 MiB/s rd, 4 op/s
2026-03-09T00:11:26.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:26 vm06.local ceph-mon[106218]: mds.? [v2:192.168.123.103:6826/784666836,v1:192.168.123.103:6827/784666836] up:reconnect
2026-03-09T00:11:26.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:26 vm06.local ceph-mon[106218]: fsmap cephfs:1/1 {0=cephfs.vm03.sejksk=up:reconnect} 2 up:standby
2026-03-09T00:11:26.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:26 vm06.local ceph-mon[106218]: reconnect by client.24309 v1:192.168.123.103:0/929198358 after 0
2026-03-09T00:11:26.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:26 vm06.local ceph-mon[106218]: reconnect by client.24313 v1:192.168.144.1:0/748981855 after 0.001
2026-03-09T00:11:26.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:26 vm06.local ceph-mon[106218]: pgmap v239: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 15 MiB/s rd, 4 op/s
2026-03-09T00:11:27.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:27 vm03.local ceph-mon[129670]: mds.? [v2:192.168.123.103:6826/784666836,v1:192.168.123.103:6827/784666836] up:rejoin
2026-03-09T00:11:27.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:27 vm03.local ceph-mon[129670]: fsmap cephfs:1/1 {0=cephfs.vm03.sejksk=up:rejoin} 2 up:standby
2026-03-09T00:11:27.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:27 vm03.local ceph-mon[129670]: daemon mds.cephfs.vm03.sejksk is now active in filesystem cephfs as rank 0
2026-03-09T00:11:27.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:27 vm06.local ceph-mon[106218]: mds.? [v2:192.168.123.103:6826/784666836,v1:192.168.123.103:6827/784666836] up:rejoin
2026-03-09T00:11:27.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:27 vm06.local ceph-mon[106218]: fsmap cephfs:1/1 {0=cephfs.vm03.sejksk=up:rejoin} 2 up:standby
2026-03-09T00:11:27.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:27 vm06.local ceph-mon[106218]: daemon mds.cephfs.vm03.sejksk is now active in filesystem cephfs as rank 0
2026-03-09T00:11:28.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:28 vm03.local ceph-mon[129670]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded)
2026-03-09T00:11:28.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:28 vm03.local ceph-mon[129670]: mds.? [v2:192.168.123.103:6826/784666836,v1:192.168.123.103:6827/784666836] up:active
2026-03-09T00:11:28.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:28 vm03.local ceph-mon[129670]: fsmap cephfs:1 {0=cephfs.vm03.sejksk=up:active} 2 up:standby
2026-03-09T00:11:28.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:28 vm03.local ceph-mon[129670]: pgmap v240: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 19 MiB/s rd, 6 op/s
2026-03-09T00:11:28.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:28 vm06.local ceph-mon[106218]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded)
2026-03-09T00:11:28.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:28 vm06.local ceph-mon[106218]: mds.? [v2:192.168.123.103:6826/784666836,v1:192.168.123.103:6827/784666836] up:active
2026-03-09T00:11:28.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:28 vm06.local ceph-mon[106218]: fsmap cephfs:1 {0=cephfs.vm03.sejksk=up:active} 2 up:standby
2026-03-09T00:11:28.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:28 vm06.local ceph-mon[106218]: pgmap v240: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 19 MiB/s rd, 6 op/s
2026-03-09T00:11:29.587 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:29 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:11:29.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:29 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:11:30.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:30 vm03.local ceph-mon[129670]: pgmap v241: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 102 B/s wr, 9 op/s
2026-03-09T00:11:30.656 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:30 vm06.local ceph-mon[106218]: pgmap v241: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 102 B/s wr, 9 op/s
2026-03-09T00:11:33.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:32 vm03.local ceph-mon[129670]: pgmap v242: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 19 MiB/s rd, 5.0 KiB/s wr, 11 op/s
2026-03-09T00:11:33.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:32 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:33.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:32 vm03.local ceph-mon[129670]: mds.? [v2:192.168.123.106:6826/839008662,v1:192.168.123.106:6827/839008662] up:boot
2026-03-09T00:11:33.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:32 vm03.local ceph-mon[129670]: fsmap cephfs:1 {0=cephfs.vm03.sejksk=up:active} 3 up:standby
2026-03-09T00:11:33.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:32 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.ixduim"}]: dispatch
2026-03-09T00:11:33.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:32 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:33.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:32 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:11:33.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:32 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:33.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:32 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:33.165 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:32 vm06.local ceph-mon[106218]: pgmap v242: 65 pgs: 65 active+clean; 209 MiB data, 884 MiB used, 119 GiB / 120 GiB avail; 19 MiB/s rd, 5.0 KiB/s wr, 11 op/s
2026-03-09T00:11:33.165 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:32 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:33.165 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:32 vm06.local ceph-mon[106218]: mds.? [v2:192.168.123.106:6826/839008662,v1:192.168.123.106:6827/839008662] up:boot
2026-03-09T00:11:33.165 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:32 vm06.local ceph-mon[106218]: fsmap cephfs:1 {0=cephfs.vm03.sejksk=up:active} 3 up:standby
2026-03-09T00:11:33.166 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:32 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm06.ixduim"}]: dispatch
2026-03-09T00:11:33.166 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:32 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:33.166 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:32 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:11:33.166 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:32 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:33.166 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:32 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: pgmap v243: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail; 17 MiB/s rd, 4.4 KiB/s wr, 10 op/s
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: Upgrade: Setting container_image for all mds
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.ralade"}]: dispatch
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.ralade"}]': finished
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.sejksk"}]: dispatch
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.sejksk"}]': finished
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.ixduim"}]: dispatch
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.ixduim"}]': finished
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.vlrwtl"}]: dispatch
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.vlrwtl"}]': finished
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: Upgrade: Setting filesystem cephfs Joinable
2026-03-09T00:11:34.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:33 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]: dispatch
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: pgmap v243: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail; 17 MiB/s rd, 4.4 KiB/s wr, 10 op/s
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: Upgrade: Setting container_image for all mds
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.ralade"}]: dispatch
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.ralade"}]': finished
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.sejksk"}]: dispatch
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm03.sejksk"}]': finished
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.ixduim"}]: dispatch
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.ixduim"}]': finished
2026-03-09T00:11:34.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.vlrwtl"}]: dispatch
2026-03-09T00:11:34.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm06.vlrwtl"}]': finished
2026-03-09T00:11:34.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: Upgrade: Setting filesystem cephfs Joinable
2026-03-09T00:11:34.339 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:33 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]: dispatch
2026-03-09T00:11:35.921 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:35 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]': finished
2026-03-09T00:11:35.922 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:35 vm03.local ceph-mon[129670]: fsmap cephfs:1 {0=cephfs.vm03.sejksk=up:active} 3 up:standby
2026-03-09T00:11:36.271 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:35 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]': finished
2026-03-09T00:11:36.271 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:35 vm06.local ceph-mon[106218]: fsmap cephfs:1 {0=cephfs.vm03.sejksk=up:active} 3 up:standby
2026-03-09T00:11:36.271 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:35 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:36.271 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:35 vm06.local ceph-mon[106218]: Upgrade: Setting container_image for all rgw
2026-03-09T00:11:36.271 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:35 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:36.271 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:35 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:36.271 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:35 vm06.local ceph-mon[106218]: Upgrade: Setting container_image for all rbd-mirror
2026-03-09T00:11:36.271 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:35 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:36.271 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:35 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:36.271 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:35 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:36.271 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:35 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T00:11:36.271 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:35 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:11:36.272 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:35 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:36.272 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:35 vm03.local ceph-mon[129670]: Upgrade: Setting container_image for all rgw
2026-03-09T00:11:36.272 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:35 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:36.272 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:35 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:36.272 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:35 vm03.local ceph-mon[129670]: Upgrade: Setting container_image for all rbd-mirror
2026-03-09T00:11:36.272 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:35 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:36.272 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:35 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:36.272 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:35 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:36.272 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:35 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm03", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T00:11:36.272 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:35 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:11:37.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:36 vm06.local ceph-mon[106218]: Upgrade: Updating ceph-exporter.vm03 (1/2)
2026-03-09T00:11:37.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:36 vm06.local ceph-mon[106218]: Deploying daemon ceph-exporter.vm03 on vm03
2026-03-09T00:11:37.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:36 vm06.local ceph-mon[106218]: pgmap v244: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail; 8.4 MiB/s rd, 4.4 KiB/s wr, 7 op/s
2026-03-09T00:11:37.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:36 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:37.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:36 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:37.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:36 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:37.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:36 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T00:11:37.157 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:36 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:11:37.283 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:36 vm03.local ceph-mon[129670]: Upgrade: Updating ceph-exporter.vm03 (1/2)
2026-03-09T00:11:37.283 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:36 vm03.local ceph-mon[129670]: Deploying daemon ceph-exporter.vm03 on vm03
2026-03-09T00:11:37.283 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:36 vm03.local ceph-mon[129670]: pgmap v244: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail; 8.4 MiB/s rd, 4.4 KiB/s wr, 7 op/s
2026-03-09T00:11:37.283 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:36 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:37.283 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:36 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:37.283 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:36 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
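
By this point the mon log shows the whole fail_fs tail for the mds batch: the per-daemon container_image pins are removed again, "Upgrade: Setting filesystem cephfs Joinable" is followed by a finished `fs set cephfs joinable true`, the restarted rank goes reconnect -> rejoin -> active, and the orchestrator moves on to rgw, rbd-mirror, and the two ceph-exporter daemons. A minimal post-check sketch; the exact output shapes are assumptions that can vary by release:

    # Confirm the filesystem accepts MDS joins again and the mds versions converged.
    ceph fs dump | grep -i joinable      # flags line should include "joinable"
    ceph versions | jq '.mds'            # expect a single version entry
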
2026-03-09T00:11:37.283 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:36 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm06", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-09T00:11:37.283 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:36 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:11:37.930 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:37 vm03.local ceph-mon[129670]: Upgrade: Updating ceph-exporter.vm06 (2/2)
2026-03-09T00:11:37.930 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:37 vm03.local ceph-mon[129670]: Deploying daemon ceph-exporter.vm06 on vm06
2026-03-09T00:11:37.930 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:37 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:37.930 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:37 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:37.930 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:37 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:11:38.039 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:37 vm06.local ceph-mon[106218]: Upgrade: Updating ceph-exporter.vm06 (2/2)
2026-03-09T00:11:38.039 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:37 vm06.local ceph-mon[106218]: Deploying daemon ceph-exporter.vm06 on vm06
2026-03-09T00:11:38.039 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:37 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:38.039 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:37 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:38.039 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:37 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:11:38.929 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:38 vm03.local ceph-mon[129670]: pgmap v245: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail; 4.7 MiB/s rd, 4.4 KiB/s wr, 6 op/s
2026-03-09T00:11:38.929 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:38 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:38.929 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:38 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:38.929 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:38 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:38.929 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:38 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:38.929 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:38 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:38.929 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:38 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:39.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:38 vm06.local ceph-mon[106218]: pgmap v245: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail; 4.7 MiB/s rd, 4.4 KiB/s wr, 6 op/s
2026-03-09T00:11:39.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:38 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:39.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:38 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:39.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:38 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:39.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:38 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:39.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:38 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:39.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:38 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:40.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:40.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:40.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:40 vm03.local ceph-mon[129670]: pgmap v246: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail; 1.0 MiB/s rd, 4.4 KiB/s wr, 5 op/s
2026-03-09T00:11:40.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:40.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:40.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:40.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:40.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:11:40.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:11:40.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:40.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-09T00:11:40.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:40.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:40.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:40.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:40.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:11:40.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:40 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]: dispatch
2026-03-09T00:11:40.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:40.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:40.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:40 vm06.local ceph-mon[106218]: pgmap v246: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail; 1.0 MiB/s rd, 4.4 KiB/s wr, 5 op/s
2026-03-09T00:11:40.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:40.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:40.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:40.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons'
2026-03-09T00:11:40.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-09T00:11:40.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-09T00:11:40.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:40 vm06.local
ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:40.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:11:40.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:40.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:40.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:40.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:40.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:40.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:40 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]: dispatch 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: Upgrade: Setting filesystem cephfs Joinable 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]': finished 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: fsmap cephfs:1 {0=cephfs.vm03.sejksk=up:active} 3 up:standby 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' 
entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm03"}]: dispatch 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm03"}]': finished 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm06"}]: dispatch 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm06"}]': finished 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.589 
INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]': finished 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds"}]': finished 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local 
ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: 
from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:11:41.589 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: Upgrade: Setting filesystem cephfs Joinable 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]': finished 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: fsmap cephfs:1 {0=cephfs.vm03.sejksk=up:active} 3 up:standby 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' 
entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm03"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm03"}]': finished 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm06"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm06"}]': finished 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 
09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]': finished 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds"}]': finished 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": 
"client.rgw"}]': finished 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": 
"mon"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-09T00:11:41.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:11:41.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:11:41.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:11:41.672 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:42 vm03.local ceph-mon[129670]: Upgrade: Setting container_image for all ceph-exporter 2026-03-09T00:11:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:42 vm03.local ceph-mon[129670]: Upgrade: Setting container_image for all iscsi 2026-03-09T00:11:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:42 vm03.local ceph-mon[129670]: Upgrade: Setting container_image for all nfs 2026-03-09T00:11:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:42 vm03.local ceph-mon[129670]: Upgrade: Setting container_image for all nvmeof 2026-03-09T00:11:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:42 vm03.local ceph-mon[129670]: Upgrade: Finalizing container_image settings 2026-03-09T00:11:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:42 vm03.local ceph-mon[129670]: Upgrade: Complete! 
2026-03-09T00:11:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:42 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:11:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:42 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:11:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:42 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:11:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:42 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:42.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:42 vm03.local ceph-mon[129670]: pgmap v247: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail; 1023 B/s rd, 4.3 KiB/s wr, 2 op/s 2026-03-09T00:11:42.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:42 vm06.local ceph-mon[106218]: Upgrade: Setting container_image for all ceph-exporter 2026-03-09T00:11:42.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:42 vm06.local ceph-mon[106218]: Upgrade: Setting container_image for all iscsi 2026-03-09T00:11:42.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:42 vm06.local ceph-mon[106218]: Upgrade: Setting container_image for all nfs 2026-03-09T00:11:42.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:42 vm06.local ceph-mon[106218]: Upgrade: Setting container_image for all nvmeof 2026-03-09T00:11:42.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:42 vm06.local ceph-mon[106218]: Upgrade: Finalizing container_image settings 2026-03-09T00:11:42.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:42 vm06.local ceph-mon[106218]: Upgrade: Complete! 
2026-03-09T00:11:42.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:42 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:11:42.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:42 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:11:42.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:42 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:11:42.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:42 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:42.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:42 vm06.local ceph-mon[106218]: pgmap v247: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail; 1023 B/s rd, 4.3 KiB/s wr, 2 op/s 2026-03-09T00:11:45.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:44 vm03.local ceph-mon[129670]: pgmap v248: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:11:45.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:11:45.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:44 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:45.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:44 vm06.local ceph-mon[106218]: pgmap v248: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:11:45.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:44 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:11:45.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:44 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:11:47.085 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:46 vm06.local ceph-mon[106218]: pgmap v249: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:11:47.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:46 vm03.local ceph-mon[129670]: pgmap v249: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:11:49.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:48 vm03.local ceph-mon[129670]: pgmap v250: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:11:49.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:48 vm06.local ceph-mon[106218]: pgmap v250: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:11:51.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:50 vm03.local ceph-mon[129670]: pgmap v251: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:11:51.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:50 vm06.local ceph-mon[106218]: pgmap v251: 65 pgs: 65 active+clean; 
209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:11:52.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:52 vm06.local ceph-mon[106218]: pgmap v252: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:11:53.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:52 vm03.local ceph-mon[129670]: pgmap v252: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:11:54.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.570+0000 7f1d2ea14700 1 -- 192.168.123.103:0/4273154937 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d28074d00 msgr2=0x7f1d28072fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:54.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.570+0000 7f1d2ea14700 1 --2- 192.168.123.103:0/4273154937 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d28074d00 0x7f1d28072fc0 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7f1d1c009b50 tx=0x7f1d1c009e60 comp rx=0 tx=0).stop 2026-03-09T00:11:54.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.570+0000 7f1d2ea14700 1 -- 192.168.123.103:0/4273154937 shutdown_connections 2026-03-09T00:11:54.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.570+0000 7f1d2ea14700 1 --2- 192.168.123.103:0/4273154937 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1d28073500 0x7f1d28073980 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:54.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.570+0000 7f1d2ea14700 1 --2- 192.168.123.103:0/4273154937 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d28074d00 0x7f1d28072fc0 unknown :-1 s=CLOSED pgs=158 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:54.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.571+0000 7f1d2ea14700 1 -- 192.168.123.103:0/4273154937 >> 192.168.123.103:0/4273154937 conn(0x7f1d28078ed0 msgr2=0x7f1d280792e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:11:54.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.571+0000 7f1d2ea14700 1 -- 192.168.123.103:0/4273154937 shutdown_connections 2026-03-09T00:11:54.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.571+0000 7f1d2ea14700 1 -- 192.168.123.103:0/4273154937 wait complete. 
2026-03-09T00:11:54.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.572+0000 7f1d2ea14700 1 Processor -- start 2026-03-09T00:11:54.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.572+0000 7f1d2ea14700 1 -- start start 2026-03-09T00:11:54.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.572+0000 7f1d2ea14700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1d28073500 0x7f1d2819d180 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:54.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.572+0000 7f1d2ea14700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d28074d00 0x7f1d2819d6c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:54.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.572+0000 7f1d2ea14700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1d2819dda0 con 0x7f1d28074d00 2026-03-09T00:11:54.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.573+0000 7f1d2ea14700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1d281a1b30 con 0x7f1d28073500 2026-03-09T00:11:54.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.573+0000 7f1d277fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d28074d00 0x7f1d2819d6c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:54.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.573+0000 7f1d277fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d28074d00 0x7f1d2819d6c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:58186/0 (socket says 192.168.123.103:58186) 2026-03-09T00:11:54.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.573+0000 7f1d277fe700 1 -- 192.168.123.103:0/3638623170 learned_addr learned my addr 192.168.123.103:0/3638623170 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:11:54.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.573+0000 7f1d277fe700 1 -- 192.168.123.103:0/3638623170 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1d28073500 msgr2=0x7f1d2819d180 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:11:54.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.573+0000 7f1d277fe700 1 --2- 192.168.123.103:0/3638623170 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1d28073500 0x7f1d2819d180 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:54.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.574+0000 7f1d277fe700 1 -- 192.168.123.103:0/3638623170 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1d1c0097e0 con 0x7f1d28074d00 2026-03-09T00:11:54.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.574+0000 7f1d277fe700 1 --2- 192.168.123.103:0/3638623170 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d28074d00 0x7f1d2819d6c0 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7f1d1800eb10 tx=0x7f1d1800eed0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:11:54.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.574+0000 7f1d257fa700 1 -- 192.168.123.103:0/3638623170 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1d1800cca0 con 0x7f1d28074d00 2026-03-09T00:11:54.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.574+0000 7f1d257fa700 1 -- 192.168.123.103:0/3638623170 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f1d1800ce00 con 0x7f1d28074d00 2026-03-09T00:11:54.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.574+0000 7f1d257fa700 1 -- 192.168.123.103:0/3638623170 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1d180189c0 con 0x7f1d28074d00 2026-03-09T00:11:54.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.574+0000 7f1d2ea14700 1 -- 192.168.123.103:0/3638623170 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1d281a1e10 con 0x7f1d28074d00 2026-03-09T00:11:54.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.574+0000 7f1d2ea14700 1 -- 192.168.123.103:0/3638623170 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1d281a2360 con 0x7f1d28074d00 2026-03-09T00:11:54.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.577+0000 7f1d257fa700 1 -- 192.168.123.103:0/3638623170 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1d18018b20 con 0x7f1d28074d00 2026-03-09T00:11:54.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.577+0000 7f1d2ea14700 1 -- 192.168.123.103:0/3638623170 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1d2804ea90 con 0x7f1d28074d00 2026-03-09T00:11:54.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.577+0000 7f1d257fa700 1 --2- 192.168.123.103:0/3638623170 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1d10077870 0x7f1d10079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:54.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.577+0000 7f1d257fa700 1 -- 192.168.123.103:0/3638623170 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f1d18014070 con 0x7f1d28074d00 2026-03-09T00:11:54.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.581+0000 7f1d27fff700 1 --2- 192.168.123.103:0/3638623170 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1d10077870 0x7f1d10079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:54.581 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.581+0000 7f1d257fa700 1 -- 192.168.123.103:0/3638623170 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1d18062cc0 con 0x7f1d28074d00 2026-03-09T00:11:54.582 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.582+0000 7f1d27fff700 1 --2- 192.168.123.103:0/3638623170 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1d10077870 0x7f1d10079d30 secure :-1 
s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f1d1c000c00 tx=0x7f1d1c005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:11:54.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.715+0000 7f1d2ea14700 1 -- 192.168.123.103:0/3638623170 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f1d281a2640 con 0x7f1d10077870 2026-03-09T00:11:54.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.717+0000 7f1d257fa700 1 -- 192.168.123.103:0/3638623170 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7f1d281a2640 con 0x7f1d10077870 2026-03-09T00:11:54.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.721+0000 7f1d2ea14700 1 -- 192.168.123.103:0/3638623170 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1d10077870 msgr2=0x7f1d10079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:54.721 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.721+0000 7f1d2ea14700 1 --2- 192.168.123.103:0/3638623170 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1d10077870 0x7f1d10079d30 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f1d1c000c00 tx=0x7f1d1c005fb0 comp rx=0 tx=0).stop 2026-03-09T00:11:54.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.721+0000 7f1d2ea14700 1 -- 192.168.123.103:0/3638623170 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d28074d00 msgr2=0x7f1d2819d6c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:54.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.722+0000 7f1d2ea14700 1 --2- 192.168.123.103:0/3638623170 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d28074d00 0x7f1d2819d6c0 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7f1d1800eb10 tx=0x7f1d1800eed0 comp rx=0 tx=0).stop 2026-03-09T00:11:54.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.722+0000 7f1d2ea14700 1 -- 192.168.123.103:0/3638623170 shutdown_connections 2026-03-09T00:11:54.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.722+0000 7f1d2ea14700 1 --2- 192.168.123.103:0/3638623170 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1d10077870 0x7f1d10079d30 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:54.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.722+0000 7f1d2ea14700 1 --2- 192.168.123.103:0/3638623170 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1d28073500 0x7f1d2819d180 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:54.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.722+0000 7f1d2ea14700 1 --2- 192.168.123.103:0/3638623170 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1d28074d00 0x7f1d2819d6c0 unknown :-1 s=CLOSED pgs=159 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:54.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.722+0000 7f1d2ea14700 1 -- 192.168.123.103:0/3638623170 >> 192.168.123.103:0/3638623170 conn(0x7f1d28078ed0 msgr2=0x7f1d2810fa90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:11:54.723 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.723+0000 7f1d2ea14700 1 -- 192.168.123.103:0/3638623170 shutdown_connections 2026-03-09T00:11:54.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:54.723+0000 7f1d2ea14700 1 -- 192.168.123.103:0/3638623170 wait complete. 2026-03-09T00:11:54.788 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-09T00:11:54.837 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:54 vm03.local ceph-mon[129670]: pgmap v253: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:11:54.978 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:11:55.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:54 vm06.local ceph-mon[106218]: pgmap v253: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:11:55.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.238+0000 7f4b8dbe4700 1 -- 192.168.123.103:0/3833927015 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4b88101c30 msgr2=0x7f4b88105bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:55.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.238+0000 7f4b8dbe4700 1 --2- 192.168.123.103:0/3833927015 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4b88101c30 0x7f4b88105bd0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f4b78009b00 tx=0x7f4b78009e10 comp rx=0 tx=0).stop 2026-03-09T00:11:55.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.239+0000 7f4b8dbe4700 1 -- 192.168.123.103:0/3833927015 shutdown_connections 2026-03-09T00:11:55.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.239+0000 7f4b8dbe4700 1 --2- 192.168.123.103:0/3833927015 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4b88101c30 0x7f4b88105bd0 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:55.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.239+0000 7f4b8dbe4700 1 --2- 192.168.123.103:0/3833927015 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4b88101280 0x7f4b88101660 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:55.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.239+0000 7f4b8dbe4700 1 -- 192.168.123.103:0/3833927015 >> 192.168.123.103:0/3833927015 conn(0x7f4b88078ed0 msgr2=0x7f4b880792e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:11:55.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.239+0000 7f4b8dbe4700 1 -- 192.168.123.103:0/3833927015 shutdown_connections 2026-03-09T00:11:55.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.239+0000 7f4b8dbe4700 1 -- 192.168.123.103:0/3833927015 wait complete. 
2026-03-09T00:11:55.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.240+0000 7f4b8dbe4700 1 Processor -- start 2026-03-09T00:11:55.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.240+0000 7f4b8dbe4700 1 -- start start 2026-03-09T00:11:55.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.240+0000 7f4b8dbe4700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4b88101280 0x7f4b88198e80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:55.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.240+0000 7f4b8dbe4700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4b88101c30 0x7f4b881993c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:55.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.240+0000 7f4b8dbe4700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4b88199aa0 con 0x7f4b88101280 2026-03-09T00:11:55.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.240+0000 7f4b8dbe4700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4b8819d830 con 0x7f4b88101c30 2026-03-09T00:11:55.240 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.240+0000 7f4b86ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4b88101c30 0x7f4b881993c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:55.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.240+0000 7f4b86ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4b88101c30 0x7f4b881993c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:48790/0 (socket says 192.168.123.103:48790) 2026-03-09T00:11:55.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.240+0000 7f4b86ffd700 1 -- 192.168.123.103:0/278439398 learned_addr learned my addr 192.168.123.103:0/278439398 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:11:55.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.240+0000 7f4b877fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4b88101280 0x7f4b88198e80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:55.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.241+0000 7f4b86ffd700 1 -- 192.168.123.103:0/278439398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4b88101280 msgr2=0x7f4b88198e80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:55.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.241+0000 7f4b86ffd700 1 --2- 192.168.123.103:0/278439398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4b88101280 0x7f4b88198e80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:55.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.241+0000 7f4b86ffd700 1 -- 192.168.123.103:0/278439398 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4b70009710 con 0x7f4b88101c30 
2026-03-09T00:11:55.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.241+0000 7f4b877fe700 1 --2- 192.168.123.103:0/278439398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4b88101280 0x7f4b88198e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T00:11:55.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.241+0000 7f4b86ffd700 1 --2- 192.168.123.103:0/278439398 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4b88101c30 0x7f4b881993c0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f4b7800ba30 tx=0x7f4b7800ba60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:11:55.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.241+0000 7f4b84ff9700 1 -- 192.168.123.103:0/278439398 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4b7801d070 con 0x7f4b88101c30 2026-03-09T00:11:55.241 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.241+0000 7f4b8dbe4700 1 -- 192.168.123.103:0/278439398 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4b780097e0 con 0x7f4b88101c30 2026-03-09T00:11:55.242 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.241+0000 7f4b84ff9700 1 -- 192.168.123.103:0/278439398 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f4b7800f460 con 0x7f4b88101c30 2026-03-09T00:11:55.242 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.241+0000 7f4b8dbe4700 1 -- 192.168.123.103:0/278439398 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4b8819de10 con 0x7f4b88101c30 2026-03-09T00:11:55.242 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.242+0000 7f4b8dbe4700 1 -- 192.168.123.103:0/278439398 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4b88105a60 con 0x7f4b88101c30 2026-03-09T00:11:55.242 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.242+0000 7f4b84ff9700 1 -- 192.168.123.103:0/278439398 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4b78003a40 con 0x7f4b88101c30 2026-03-09T00:11:55.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.243+0000 7f4b84ff9700 1 -- 192.168.123.103:0/278439398 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4b7800f5d0 con 0x7f4b88101c30 2026-03-09T00:11:55.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.243+0000 7f4b84ff9700 1 --2- 192.168.123.103:0/278439398 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4b740778c0 0x7f4b74079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:55.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.244+0000 7f4b877fe700 1 --2- 192.168.123.103:0/278439398 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4b740778c0 0x7f4b74079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:55.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.244+0000 7f4b84ff9700 1 -- 192.168.123.103:0/278439398 <== mon.1 v2:192.168.123.106:3300/0 5 
==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f4b7809af60 con 0x7f4b88101c30 2026-03-09T00:11:55.244 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.244+0000 7f4b877fe700 1 --2- 192.168.123.103:0/278439398 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4b740778c0 0x7f4b74079d80 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f4b7000f790 tx=0x7f4b70009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:11:55.245 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.245+0000 7f4b84ff9700 1 -- 192.168.123.103:0/278439398 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4b780cb9f0 con 0x7f4b88101c30 2026-03-09T00:11:55.368 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.367+0000 7f4b8dbe4700 1 -- 192.168.123.103:0/278439398 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f4b8819a2f0 con 0x7f4b740778c0 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.372+0000 7f4b84ff9700 1 -- 192.168.123.103:0/278439398 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f4b8819a2f0 con 0x7f4b740778c0 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (7m) 16s ago 12m 24.4M - 0.25.0 c8568f914cd2 6bc39b415ac6 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (19s) 16s ago 12m 9893k - 19.2.3-678-ge911bdeb 654f31e6858e 32be78e0d748 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (18s) 17s ago 11m 9625k - 19.2.3-678-ge911bdeb 654f31e6858e d24bb3c627fc 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (5m) 16s ago 12m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e a76700a1f6bf 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (5m) 17s ago 11m 7860k - 19.2.3-678-ge911bdeb 654f31e6858e ab35f6a843c1 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (6m) 16s ago 12m 90.2M - 10.4.0 c8b91775d855 00a3394cdec9 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (47s) 16s ago 10m 15.3M - 19.2.3-678-ge911bdeb 654f31e6858e 0f2e03d0bb71 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (63s) 16s ago 10m 101M - 19.2.3-678-ge911bdeb 654f31e6858e 35aa1832dc40 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (24s) 17s ago 10m 14.4M - 19.2.3-678-ge911bdeb 654f31e6858e 7f828154106b 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 running (38s) 17s ago 10m 18.3M - 19.2.3-678-ge911bdeb 654f31e6858e c123d86a9659 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:8443,9283,8765 running (8m) 16s ago 12m 632M - 19.2.3-678-ge911bdeb 654f31e6858e 5c5f89207f88 2026-03-09T00:11:55.373 
INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (7m) 17s ago 11m 491M - 19.2.3-678-ge911bdeb 654f31e6858e e3d70135f3ac 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (6m) 16s ago 12m 69.3M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cafe87ec117d 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (6m) 17s ago 11m 58.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 33df752aa193 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (7m) 16s ago 12m 9.82M - 1.7.0 72c9c2088986 0cdd6e671b4f 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (7m) 17s ago 11m 9684k - 1.7.0 72c9c2088986 848c5c72973d 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (5m) 16s ago 11m 184M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7112eceae9ce 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (2m) 16s ago 11m 157M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e70d2f37c6d1 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (2m) 16s ago 11m 112M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e7841e7307ae 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (118s) 17s ago 10m 190M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 8e61be617139 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (97s) 17s ago 10m 150M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 21cf4dc58899 2026-03-09T00:11:55.373 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (75s) 17s ago 10m 140M 4096M 19.2.3-678-ge911bdeb 654f31e6858e fbc950d55a67 2026-03-09T00:11:55.374 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (7m) 16s ago 12m 59.3M - 2.51.0 1d3b7f56885b 16d6071e49fb 2026-03-09T00:11:55.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.376+0000 7f4b8dbe4700 1 -- 192.168.123.103:0/278439398 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4b740778c0 msgr2=0x7f4b74079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:55.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.376+0000 7f4b8dbe4700 1 --2- 192.168.123.103:0/278439398 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4b740778c0 0x7f4b74079d80 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f4b7000f790 tx=0x7f4b70009450 comp rx=0 tx=0).stop 2026-03-09T00:11:55.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.376+0000 7f4b8dbe4700 1 -- 192.168.123.103:0/278439398 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4b88101c30 msgr2=0x7f4b881993c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:55.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.376+0000 7f4b8dbe4700 1 --2- 192.168.123.103:0/278439398 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4b88101c30 0x7f4b881993c0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f4b7800ba30 tx=0x7f4b7800ba60 comp rx=0 tx=0).stop 2026-03-09T00:11:55.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.376+0000 7f4b8dbe4700 1 -- 192.168.123.103:0/278439398 shutdown_connections 2026-03-09T00:11:55.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.376+0000 7f4b8dbe4700 1 --2- 192.168.123.103:0/278439398 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4b740778c0 0x7f4b74079d80 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:55.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.376+0000 7f4b8dbe4700 1 --2- 192.168.123.103:0/278439398 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4b88101280 0x7f4b88198e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:55.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.376+0000 7f4b8dbe4700 1 --2- 192.168.123.103:0/278439398 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4b88101c30 0x7f4b881993c0 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:55.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.376+0000 7f4b8dbe4700 1 -- 192.168.123.103:0/278439398 >> 192.168.123.103:0/278439398 conn(0x7f4b88078ed0 msgr2=0x7f4b88102ba0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:11:55.376 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.376+0000 7f4b8dbe4700 1 -- 192.168.123.103:0/278439398 shutdown_connections 2026-03-09T00:11:55.377 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.376+0000 7f4b8dbe4700 1 -- 192.168.123.103:0/278439398 wait complete. 2026-03-09T00:11:55.443 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade status' 2026-03-09T00:11:55.638 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:11:55.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.896+0000 7fa115ecc700 1 -- 192.168.123.103:0/204210332 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa110103340 msgr2=0x7fa110103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:55.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.896+0000 7fa115ecc700 1 --2- 192.168.123.103:0/204210332 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa110103340 0x7fa110103720 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7fa0f8009b00 tx=0x7fa0f8009e10 comp rx=0 tx=0).stop 2026-03-09T00:11:55.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.897+0000 7fa115ecc700 1 -- 192.168.123.103:0/204210332 shutdown_connections 2026-03-09T00:11:55.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.897+0000 7fa115ecc700 1 --2- 192.168.123.103:0/204210332 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa110103cf0 0x7fa110107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:55.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.897+0000 7fa115ecc700 1 --2- 192.168.123.103:0/204210332 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa110103340 0x7fa110103720 unknown :-1 s=CLOSED pgs=160 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:55.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.897+0000 7fa115ecc700 1 -- 192.168.123.103:0/204210332 >> 192.168.123.103:0/204210332 conn(0x7fa1100febd0 msgr2=0x7fa110100ff0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-09T00:11:55.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.897+0000 7fa115ecc700 1 -- 192.168.123.103:0/204210332 shutdown_connections 2026-03-09T00:11:55.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.897+0000 7fa115ecc700 1 -- 192.168.123.103:0/204210332 wait complete. 2026-03-09T00:11:55.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.897+0000 7fa115ecc700 1 Processor -- start 2026-03-09T00:11:55.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.898+0000 7fa115ecc700 1 -- start start 2026-03-09T00:11:55.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.898+0000 7fa115ecc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa110103340 0x7fa110198e30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:55.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.898+0000 7fa115ecc700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa110103cf0 0x7fa110199370 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:55.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.898+0000 7fa115ecc700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa1101999c0 con 0x7fa110103340 2026-03-09T00:11:55.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.898+0000 7fa115ecc700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa110199b00 con 0x7fa110103cf0 2026-03-09T00:11:55.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.898+0000 7fa10f7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa110103340 0x7fa110198e30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:55.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.898+0000 7fa10effd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa110103cf0 0x7fa110199370 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:55.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.898+0000 7fa10f7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa110103340 0x7fa110198e30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57282/0 (socket says 192.168.123.103:57282) 2026-03-09T00:11:55.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.898+0000 7fa10f7fe700 1 -- 192.168.123.103:0/2470382486 learned_addr learned my addr 192.168.123.103:0/2470382486 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:11:55.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.898+0000 7fa10effd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa110103cf0 0x7fa110199370 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:48818/0 (socket says 192.168.123.103:48818) 2026-03-09T00:11:55.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.899+0000 7fa10effd700 1 -- 192.168.123.103:0/2470382486 >> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa110103340 msgr2=0x7fa110198e30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:55.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.899+0000 7fa10effd700 1 --2- 192.168.123.103:0/2470382486 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa110103340 0x7fa110198e30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:55.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.899+0000 7fa10effd700 1 -- 192.168.123.103:0/2470382486 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa0f80097e0 con 0x7fa110103cf0 2026-03-09T00:11:55.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.899+0000 7fa10effd700 1 --2- 192.168.123.103:0/2470382486 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa110103cf0 0x7fa110199370 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fa10000c8c0 tx=0x7fa10000cbd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:11:55.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.899+0000 7fa10cff9700 1 -- 192.168.123.103:0/2470382486 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa1000043f0 con 0x7fa110103cf0 2026-03-09T00:11:55.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.899+0000 7fa10cff9700 1 -- 192.168.123.103:0/2470382486 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa100004550 con 0x7fa110103cf0 2026-03-09T00:11:55.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.899+0000 7fa10cff9700 1 -- 192.168.123.103:0/2470382486 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa100007d00 con 0x7fa110103cf0 2026-03-09T00:11:55.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.899+0000 7fa115ecc700 1 -- 192.168.123.103:0/2470382486 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa11019d950 con 0x7fa110103cf0 2026-03-09T00:11:55.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.899+0000 7fa115ecc700 1 -- 192.168.123.103:0/2470382486 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa11019dea0 con 0x7fa110103cf0 2026-03-09T00:11:55.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.900+0000 7fa115ecc700 1 -- 192.168.123.103:0/2470382486 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa11010b690 con 0x7fa110103cf0 2026-03-09T00:11:55.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.901+0000 7fa10cff9700 1 -- 192.168.123.103:0/2470382486 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa1000074c0 con 0x7fa110103cf0 2026-03-09T00:11:55.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.901+0000 7fa10cff9700 1 --2- 192.168.123.103:0/2470382486 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa0fc077910 0x7fa0fc079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:55.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.901+0000 7fa10f7fe700 1 --2- 192.168.123.103:0/2470382486 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa0fc077910 0x7fa0fc079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:55.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.902+0000 7fa10cff9700 1 -- 192.168.123.103:0/2470382486 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fa100099da0 con 0x7fa110103cf0 2026-03-09T00:11:55.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.902+0000 7fa10f7fe700 1 --2- 192.168.123.103:0/2470382486 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa0fc077910 0x7fa0fc079dd0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fa0f800b5c0 tx=0x7fa0f8005fd0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:11:55.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:55.903+0000 7fa10cff9700 1 -- 192.168.123.103:0/2470382486 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa100062550 con 0x7fa110103cf0 2026-03-09T00:11:56.038 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.038+0000 7fa115ecc700 1 -- 192.168.123.103:0/2470382486 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fa11019a2a0 con 0x7fa0fc077910 2026-03-09T00:11:56.039 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:55 vm03.local ceph-mon[129670]: from='client.34450 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:11:56.041 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.041+0000 7fa10cff9700 1 -- 192.168.123.103:0/2470382486 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7fa11019a2a0 con 0x7fa0fc077910 2026-03-09T00:11:56.041 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T00:11:56.041 INFO:teuthology.orchestra.run.vm03.stdout: "target_image": null, 2026-03-09T00:11:56.041 INFO:teuthology.orchestra.run.vm03.stdout: "in_progress": false, 2026-03-09T00:11:56.041 INFO:teuthology.orchestra.run.vm03.stdout: "which": "", 2026-03-09T00:11:56.042 INFO:teuthology.orchestra.run.vm03.stdout: "services_complete": [], 2026-03-09T00:11:56.042 INFO:teuthology.orchestra.run.vm03.stdout: "progress": null, 2026-03-09T00:11:56.042 INFO:teuthology.orchestra.run.vm03.stdout: "message": "", 2026-03-09T00:11:56.042 INFO:teuthology.orchestra.run.vm03.stdout: "is_paused": false 2026-03-09T00:11:56.042 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T00:11:56.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.043+0000 7fa115ecc700 1 -- 192.168.123.103:0/2470382486 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa0fc077910 msgr2=0x7fa0fc079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:56.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.043+0000 7fa115ecc700 1 --2- 192.168.123.103:0/2470382486 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa0fc077910 0x7fa0fc079dd0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fa0f800b5c0 tx=0x7fa0f8005fd0 comp rx=0 
tx=0).stop 2026-03-09T00:11:56.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.044+0000 7fa115ecc700 1 -- 192.168.123.103:0/2470382486 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa110103cf0 msgr2=0x7fa110199370 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:56.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.044+0000 7fa115ecc700 1 --2- 192.168.123.103:0/2470382486 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa110103cf0 0x7fa110199370 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7fa10000c8c0 tx=0x7fa10000cbd0 comp rx=0 tx=0).stop 2026-03-09T00:11:56.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.044+0000 7fa115ecc700 1 -- 192.168.123.103:0/2470382486 shutdown_connections 2026-03-09T00:11:56.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.044+0000 7fa115ecc700 1 --2- 192.168.123.103:0/2470382486 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa0fc077910 0x7fa0fc079dd0 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:56.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.044+0000 7fa115ecc700 1 --2- 192.168.123.103:0/2470382486 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa110103340 0x7fa110198e30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:56.044 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.044+0000 7fa115ecc700 1 --2- 192.168.123.103:0/2470382486 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa110103cf0 0x7fa110199370 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:56.045 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.044+0000 7fa115ecc700 1 -- 192.168.123.103:0/2470382486 >> 192.168.123.103:0/2470382486 conn(0x7fa1100febd0 msgr2=0x7fa110100fc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:11:56.045 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.044+0000 7fa115ecc700 1 -- 192.168.123.103:0/2470382486 shutdown_connections 2026-03-09T00:11:56.045 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.044+0000 7fa115ecc700 1 -- 192.168.123.103:0/2470382486 wait complete. 
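[editor's note] `ceph orch upgrade status` prints JSON by default, and here it shows no upgrade running yet (target_image null, in_progress false). A sketch of how a harness might poll that JSON until an upgrade finishes; wait_for_upgrade_done and the run_ceph callable (for example, the cephadm_shell() helper sketched earlier) are hypothetical names.

    import json
    import time

    def wait_for_upgrade_done(run_ceph, poll_secs=30, timeout=3600):
        """Poll `ceph orch upgrade status` until in_progress goes false.

        run_ceph(cmd) must return the command's stdout as a string; the
        fields parsed here are the ones visible in the log output above."""
        deadline = time.time() + timeout
        while time.time() < deadline:
            status = json.loads(run_ceph("ceph orch upgrade status"))
            if not status["in_progress"]:
                return status
            time.sleep(poll_secs)
        raise TimeoutError("cephadm upgrade still in progress at timeout")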
2026-03-09T00:11:56.112 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph health detail' 2026-03-09T00:11:56.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:55 vm06.local ceph-mon[106218]: from='client.34450 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:11:56.268 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:11:56.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.535+0000 7f79b3513700 1 -- 192.168.123.103:0/1381681273 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac1033c0 msgr2=0x7f79ac1037a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:56.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.535+0000 7f79b3513700 1 --2- 192.168.123.103:0/1381681273 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac1033c0 0x7f79ac1037a0 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7f799c009ab0 tx=0x7f799c009dc0 comp rx=0 tx=0).stop 2026-03-09T00:11:56.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.536+0000 7f79b3513700 1 -- 192.168.123.103:0/1381681273 shutdown_connections 2026-03-09T00:11:56.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.536+0000 7f79b3513700 1 --2- 192.168.123.103:0/1381681273 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f79ac103d70 0x7f79ac107dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:56.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.536+0000 7f79b3513700 1 --2- 192.168.123.103:0/1381681273 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac1033c0 0x7f79ac1037a0 unknown :-1 s=CLOSED pgs=161 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:56.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.536+0000 7f79b3513700 1 -- 192.168.123.103:0/1381681273 >> 192.168.123.103:0/1381681273 conn(0x7f79ac0fec30 msgr2=0x7f79ac101050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:11:56.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.536+0000 7f79b3513700 1 -- 192.168.123.103:0/1381681273 shutdown_connections 2026-03-09T00:11:56.536 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.536+0000 7f79b3513700 1 -- 192.168.123.103:0/1381681273 wait complete. 
2026-03-09T00:11:56.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.537+0000 7f79b3513700 1 Processor -- start 2026-03-09T00:11:56.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.537+0000 7f79b3513700 1 -- start start 2026-03-09T00:11:56.537 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.537+0000 7f79b3513700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac103d70 0x7f79ac1990f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:56.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.537+0000 7f79b3513700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f79ac199630 0x7f79ac19daa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:56.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.537+0000 7f79b3513700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f79ac199c50 con 0x7f79ac103d70 2026-03-09T00:11:56.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.537+0000 7f79b3513700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f79ac199dc0 con 0x7f79ac199630 2026-03-09T00:11:56.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.537+0000 7f79b0aae700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f79ac199630 0x7f79ac19daa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:56.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.537+0000 7f79b0aae700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f79ac199630 0x7f79ac19daa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:48842/0 (socket says 192.168.123.103:48842) 2026-03-09T00:11:56.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.537+0000 7f79b0aae700 1 -- 192.168.123.103:0/1455955380 learned_addr learned my addr 192.168.123.103:0/1455955380 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:11:56.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.538+0000 7f79b0aae700 1 -- 192.168.123.103:0/1455955380 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac103d70 msgr2=0x7f79ac1990f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:11:56.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.538+0000 7f79b12af700 1 --2- 192.168.123.103:0/1455955380 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac103d70 0x7f79ac1990f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:56.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.538+0000 7f79b0aae700 1 --2- 192.168.123.103:0/1455955380 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac103d70 0x7f79ac1990f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:56.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.538+0000 7f79b0aae700 1 -- 192.168.123.103:0/1455955380 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f799c009710 con 
0x7f79ac199630 2026-03-09T00:11:56.538 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.538+0000 7f79b12af700 1 --2- 192.168.123.103:0/1455955380 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac103d70 0x7f79ac1990f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:11:56.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.538+0000 7f79b0aae700 1 --2- 192.168.123.103:0/1455955380 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f79ac199630 0x7f79ac19daa0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f79a800eab0 tx=0x7f79a800ee70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:11:56.539 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.539+0000 7f79a27fc700 1 -- 192.168.123.103:0/1455955380 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f79a800cc40 con 0x7f79ac199630 2026-03-09T00:11:56.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.539+0000 7f79a27fc700 1 -- 192.168.123.103:0/1455955380 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f79a800cda0 con 0x7f79ac199630 2026-03-09T00:11:56.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.539+0000 7f79a27fc700 1 -- 192.168.123.103:0/1455955380 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f79a80187d0 con 0x7f79ac199630 2026-03-09T00:11:56.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.539+0000 7f79b3513700 1 -- 192.168.123.103:0/1455955380 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f79ac19e0a0 con 0x7f79ac199630 2026-03-09T00:11:56.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.539+0000 7f79b3513700 1 -- 192.168.123.103:0/1455955380 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f79ac19e5f0 con 0x7f79ac199630 2026-03-09T00:11:56.540 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.540+0000 7f79b3513700 1 -- 192.168.123.103:0/1455955380 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f79ac10b760 con 0x7f79ac199630 2026-03-09T00:11:56.541 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.541+0000 7f79a27fc700 1 -- 192.168.123.103:0/1455955380 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f79a8010ba0 con 0x7f79ac199630 2026-03-09T00:11:56.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.541+0000 7f79a27fc700 1 --2- 192.168.123.103:0/1455955380 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f7998077870 0x7f7998079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:56.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.542+0000 7f79b12af700 1 --2- 192.168.123.103:0/1455955380 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f7998077870 0x7f7998079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:56.542 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.542+0000 7f79a27fc700 1 -- 192.168.123.103:0/1455955380 <== mon.1 
v2:192.168.123.106:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f79a8014070 con 0x7f79ac199630 2026-03-09T00:11:56.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.542+0000 7f79b12af700 1 --2- 192.168.123.103:0/1455955380 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f7998077870 0x7f7998079d30 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f799c00b5c0 tx=0x7f799c005d80 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:11:56.543 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.543+0000 7f79a27fc700 1 -- 192.168.123.103:0/1455955380 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f79a8063100 con 0x7f79ac199630 2026-03-09T00:11:56.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.722+0000 7f79b3513700 1 -- 192.168.123.103:0/1455955380 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f79ac19e8d0 con 0x7f79ac199630 2026-03-09T00:11:56.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.723+0000 7f79a27fc700 1 -- 192.168.123.103:0/1455955380 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7f79a8062850 con 0x7f79ac199630 2026-03-09T00:11:56.724 INFO:teuthology.orchestra.run.vm03.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-09T00:11:56.724 INFO:teuthology.orchestra.run.vm03.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-09T00:11:56.724 INFO:teuthology.orchestra.run.vm03.stdout: fs cephfs has deprecated feature inline_data enabled. 
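[editor's note] The only health complaint is FS_INLINE_DATA_DEPRECATED, which this job provokes on purpose (it runs `ceph fs set cephfs inline_data true`) and carries on its log-ignorelist. A sketch of filtering expected checks out of health output; it assumes the JSON form of the command (`ceph health detail --format json`) with its usual top-level "checks" map, although only the plain-text form appears in this log.

    import json

    # Checks this suite deliberately tolerates; FS_INLINE_DATA_DEPRECATED is
    # on the job's log-ignorelist because the test enables inline_data itself.
    EXPECTED_CHECKS = {"FS_INLINE_DATA_DEPRECATED"}

    def unexpected_health_checks(health_json_text):
        """Return the health checks not covered by the expected list.

        health_json_text: stdout of `ceph health detail --format json`,
        assumed to carry the usual top-level "status" / "checks" keys."""
        health = json.loads(health_json_text)
        return {name: check
                for name, check in health.get("checks", {}).items()
                if name not in EXPECTED_CHECKS}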
2026-03-09T00:11:56.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.726+0000 7f79b3513700 1 -- 192.168.123.103:0/1455955380 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f7998077870 msgr2=0x7f7998079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:56.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.727+0000 7f79b3513700 1 --2- 192.168.123.103:0/1455955380 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f7998077870 0x7f7998079d30 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f799c00b5c0 tx=0x7f799c005d80 comp rx=0 tx=0).stop 2026-03-09T00:11:56.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.727+0000 7f79b3513700 1 -- 192.168.123.103:0/1455955380 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f79ac199630 msgr2=0x7f79ac19daa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:56.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.727+0000 7f79b3513700 1 --2- 192.168.123.103:0/1455955380 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f79ac199630 0x7f79ac19daa0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f79a800eab0 tx=0x7f79a800ee70 comp rx=0 tx=0).stop 2026-03-09T00:11:56.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.727+0000 7f79b3513700 1 -- 192.168.123.103:0/1455955380 shutdown_connections 2026-03-09T00:11:56.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.727+0000 7f79b3513700 1 --2- 192.168.123.103:0/1455955380 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f7998077870 0x7f7998079d30 secure :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f799c00b5c0 tx=0x7f799c005d80 comp rx=0 tx=0).stop 2026-03-09T00:11:56.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.727+0000 7f79b3513700 1 --2- 192.168.123.103:0/1455955380 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f79ac103d70 0x7f79ac1990f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:56.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.727+0000 7f79b3513700 1 --2- 192.168.123.103:0/1455955380 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f79ac199630 0x7f79ac19daa0 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:56.728 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.727+0000 7f79b3513700 1 -- 192.168.123.103:0/1455955380 >> 192.168.123.103:0/1455955380 conn(0x7f79ac0fec30 msgr2=0x7f79ac100340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:11:56.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.728+0000 7f79b3513700 1 -- 192.168.123.103:0/1455955380 shutdown_connections 2026-03-09T00:11:56.729 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:56.728+0000 7f79b3513700 1 -- 192.168.123.103:0/1455955380 wait complete. 
2026-03-09T00:11:56.797 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions' 2026-03-09T00:11:56.960 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:11:56.989 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:56 vm03.local ceph-mon[129670]: from='client.44333 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:11:56.989 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:56 vm03.local ceph-mon[129670]: pgmap v254: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:11:57.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:56 vm06.local ceph-mon[106218]: from='client.44333 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:11:57.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:56 vm06.local ceph-mon[106218]: pgmap v254: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:11:57.259 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.258+0000 7f93a498c700 1 -- 192.168.123.103:0/1153689676 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f939c107d90 msgr2=0x7f939c108210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:57.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.258+0000 7f93a498c700 1 --2- 192.168.123.103:0/1153689676 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f939c107d90 0x7f939c108210 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7f939800b600 tx=0x7f939800b910 comp rx=0 tx=0).stop 2026-03-09T00:11:57.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.262+0000 7f93a498c700 1 -- 192.168.123.103:0/1153689676 shutdown_connections 2026-03-09T00:11:57.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.262+0000 7f93a498c700 1 --2- 192.168.123.103:0/1153689676 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f939c107d90 0x7f939c108210 unknown :-1 s=CLOSED pgs=162 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:57.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.262+0000 7f93a498c700 1 --2- 192.168.123.103:0/1153689676 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f939c10d4e0 0x7f939c10d8c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:57.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.262+0000 7f93a498c700 1 -- 192.168.123.103:0/1153689676 >> 192.168.123.103:0/1153689676 conn(0x7f939c06d0f0 msgr2=0x7f939c06d500 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:11:57.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.262+0000 7f93a498c700 1 -- 192.168.123.103:0/1153689676 shutdown_connections 2026-03-09T00:11:57.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.262+0000 7f93a498c700 1 -- 192.168.123.103:0/1153689676 wait complete. 
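[editor's note] Between CLI probes, both mons keep echoing the same pgmap summary through journalctl ("65 pgs: 65 active+clean"), which is the simplest signal that the cluster stayed fully healthy across this stretch. A quick grep-style sketch over such lines, with the regex written against the records above:

    import re

    # Matches the periodic mon summaries above, e.g.
    # "pgmap v254: 65 pgs: 65 active+clean; 209 MiB data, ..."
    PGMAP_RE = re.compile(r"pgmap v\d+: (\d+) pgs: (\d+) active\+clean")

    def all_pgs_clean(log_lines):
        """True if every pgmap record seen shows all PGs active+clean."""
        for line in log_lines:
            m = PGMAP_RE.search(line)
            if m and m.group(1) != m.group(2):
                return False
        return True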
2026-03-09T00:11:57.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.262+0000 7f93a498c700 1 Processor -- start 2026-03-09T00:11:57.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.262+0000 7f93a498c700 1 -- start start 2026-03-09T00:11:57.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.263+0000 7f93a498c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f939c10d4e0 0x7f939c07d220 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:57.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.263+0000 7f93a498c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f939c07d760 0x7f939c081bd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:57.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.263+0000 7f93a498c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f939c07dc70 con 0x7f939c07d760 2026-03-09T00:11:57.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.263+0000 7f93a498c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f939c07ddb0 con 0x7f939c10d4e0 2026-03-09T00:11:57.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.263+0000 7f93a1f27700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f939c07d760 0x7f939c081bd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:57.264 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.263+0000 7f93a1f27700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f939c07d760 0x7f939c081bd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:57316/0 (socket says 192.168.123.103:57316) 2026-03-09T00:11:57.265 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.263+0000 7f93a1f27700 1 -- 192.168.123.103:0/2546460805 learned_addr learned my addr 192.168.123.103:0/2546460805 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:11:57.265 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.263+0000 7f93a2728700 1 --2- 192.168.123.103:0/2546460805 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f939c10d4e0 0x7f939c07d220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:57.265 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.263+0000 7f93a1f27700 1 -- 192.168.123.103:0/2546460805 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f939c10d4e0 msgr2=0x7f939c07d220 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:57.265 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.263+0000 7f93a1f27700 1 --2- 192.168.123.103:0/2546460805 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f939c10d4e0 0x7f939c07d220 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:57.265 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.263+0000 7f93a1f27700 1 -- 192.168.123.103:0/2546460805 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f939800b050 con 0x7f939c07d760 2026-03-09T00:11:57.265 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.264+0000 7f93a1f27700 1 --2- 192.168.123.103:0/2546460805 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f939c07d760 0x7f939c081bd0 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f93980095d0 tx=0x7f9398007c50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:11:57.265 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.265+0000 7f938f7fe700 1 -- 192.168.123.103:0/2546460805 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f939800e030 con 0x7f939c07d760 2026-03-09T00:11:57.265 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.265+0000 7f93a498c700 1 -- 192.168.123.103:0/2546460805 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f939c082110 con 0x7f939c07d760 2026-03-09T00:11:57.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.265+0000 7f93a498c700 1 -- 192.168.123.103:0/2546460805 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f939c082630 con 0x7f939c07d760 2026-03-09T00:11:57.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.265+0000 7f938f7fe700 1 -- 192.168.123.103:0/2546460805 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9398025d30 con 0x7f939c07d760 2026-03-09T00:11:57.266 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.265+0000 7f938f7fe700 1 -- 192.168.123.103:0/2546460805 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f939801b670 con 0x7f939c07d760 2026-03-09T00:11:57.267 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.267+0000 7f938f7fe700 1 -- 192.168.123.103:0/2546460805 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9398019070 con 0x7f939c07d760 2026-03-09T00:11:57.267 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.267+0000 7f938f7fe700 1 --2- 192.168.123.103:0/2546460805 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f93880779e0 0x7f9388079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:11:57.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.267+0000 7f93a2728700 1 --2- 192.168.123.103:0/2546460805 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f93880779e0 0x7f9388079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:11:57.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.268+0000 7f938f7fe700 1 -- 192.168.123.103:0/2546460805 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f93980a30a0 con 0x7f939c07d760 2026-03-09T00:11:57.268 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.268+0000 7f93a498c700 1 -- 192.168.123.103:0/2546460805 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9380005320 con 0x7f939c07d760 2026-03-09T00:11:57.269 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.268+0000 7f93a2728700 1 --2- 192.168.123.103:0/2546460805 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f93880779e0 0x7f9388079ea0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f9390009960 tx=0x7f9390008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:11:57.274 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.274+0000 7f938f7fe700 1 -- 192.168.123.103:0/2546460805 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f939806b850 con 0x7f939c07d760 2026-03-09T00:11:57.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.449+0000 7f93a498c700 1 -- 192.168.123.103:0/2546460805 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f9380006200 con 0x7f939c07d760 2026-03-09T00:11:57.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.450+0000 7f938f7fe700 1 -- 192.168.123.103:0/2546460805 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f939806afa0 con 0x7f939c07d760 2026-03-09T00:11:57.451 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-09T00:11:57.451 INFO:teuthology.orchestra.run.vm03.stdout: "mon": { 2026-03-09T00:11:57.451 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T00:11:57.451 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T00:11:57.451 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": { 2026-03-09T00:11:57.451 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-09T00:11:57.451 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T00:11:57.451 INFO:teuthology.orchestra.run.vm03.stdout: "osd": { 2026-03-09T00:11:57.451 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-09T00:11:57.451 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T00:11:57.451 INFO:teuthology.orchestra.run.vm03.stdout: "mds": { 2026-03-09T00:11:57.451 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-09T00:11:57.451 INFO:teuthology.orchestra.run.vm03.stdout: }, 2026-03-09T00:11:57.451 INFO:teuthology.orchestra.run.vm03.stdout: "overall": { 2026-03-09T00:11:57.451 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-09T00:11:57.451 INFO:teuthology.orchestra.run.vm03.stdout: } 2026-03-09T00:11:57.451 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-09T00:11:57.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.455+0000 7f938d7fa700 1 -- 192.168.123.103:0/2546460805 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f93880779e0 msgr2=0x7f9388079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:57.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.455+0000 7f938d7fa700 1 --2- 192.168.123.103:0/2546460805 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f93880779e0 0x7f9388079ea0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f9390009960 tx=0x7f9390008040 comp rx=0 tx=0).stop 
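[editor's note] `ceph versions` confirms every daemon (2 mon, 2 mgr, 6 osd, 4 mds; 14 overall) is already reporting 19.2.3 squid before the staggered-upgrade work continues. A convergence check over exactly the JSON shape shown above; assert_versions_converged is a hypothetical name.

    import json

    def assert_versions_converged(versions_json_text):
        """Fail unless every daemon reports one and the same ceph version.

        versions_json_text: stdout of `ceph versions`, whose "overall"
        section (visible in the log above) tallies daemons per version."""
        overall = json.loads(versions_json_text)["overall"]
        assert len(overall) == 1, f"mixed versions: {overall}"
        return next(iter(overall))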
2026-03-09T00:11:57.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.455+0000 7f938d7fa700 1 -- 192.168.123.103:0/2546460805 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f939c07d760 msgr2=0x7f939c081bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:11:57.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.455+0000 7f938d7fa700 1 --2- 192.168.123.103:0/2546460805 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f939c07d760 0x7f939c081bd0 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f93980095d0 tx=0x7f9398007c50 comp rx=0 tx=0).stop 2026-03-09T00:11:57.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.455+0000 7f938d7fa700 1 -- 192.168.123.103:0/2546460805 shutdown_connections 2026-03-09T00:11:57.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.455+0000 7f938d7fa700 1 --2- 192.168.123.103:0/2546460805 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f93880779e0 0x7f9388079ea0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:57.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.455+0000 7f938d7fa700 1 --2- 192.168.123.103:0/2546460805 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f939c10d4e0 0x7f939c07d220 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:57.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.455+0000 7f938d7fa700 1 --2- 192.168.123.103:0/2546460805 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f939c07d760 0x7f939c081bd0 unknown :-1 s=CLOSED pgs=163 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:11:57.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.455+0000 7f938d7fa700 1 -- 192.168.123.103:0/2546460805 >> 192.168.123.103:0/2546460805 conn(0x7f939c06d0f0 msgr2=0x7f939c070590 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:11:57.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.455+0000 7f938d7fa700 1 -- 192.168.123.103:0/2546460805 shutdown_connections 2026-03-09T00:11:57.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:11:57.455+0000 7f938d7fa700 1 -- 192.168.123.103:0/2546460805 wait complete. 2026-03-09T00:11:57.513 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'echo "wait for servicemap items w/ changing names to refresh"' 2026-03-09T00:11:57.690 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:11:57.769 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:57 vm03.local ceph-mon[129670]: from='client.44335 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:11:57.769 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:57 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/1455955380' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T00:11:57.769 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:57 vm03.local ceph-mon[129670]: from='client.? 
192.168.123.103:0/2546460805' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:57.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:57 vm06.local ceph-mon[106218]: from='client.44335 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:11:57.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:57 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/1455955380' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-09T00:11:57.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:57 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/2546460805' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:11:57.933 INFO:teuthology.orchestra.run.vm03.stdout:wait for servicemap items w/ changing names to refresh 2026-03-09T00:11:57.971 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'sleep 60' 2026-03-09T00:11:58.134 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:11:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:58 vm03.local ceph-mon[129670]: pgmap v255: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:11:59.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:11:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:11:59.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:58 vm06.local ceph-mon[106218]: pgmap v255: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:11:59.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:11:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:12:01.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:00 vm03.local ceph-mon[129670]: pgmap v256: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:01.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:00 vm06.local ceph-mon[106218]: pgmap v256: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:02.920 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:02 vm06.local ceph-mon[106218]: pgmap v257: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:03.087 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:02 vm03.local ceph-mon[129670]: pgmap v257: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:05.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:04 vm03.local ceph-mon[129670]: pgmap v258: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:05.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:04 vm06.local ceph-mon[106218]: pgmap v258: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:07.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:06 vm03.local 
ceph-mon[129670]: pgmap v259: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:07.133 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:06 vm06.local ceph-mon[106218]: pgmap v259: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:09.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:08 vm06.local ceph-mon[106218]: pgmap v260: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:09.337 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:08 vm03.local ceph-mon[129670]: pgmap v260: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:11.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:10 vm03.local ceph-mon[129670]: pgmap v261: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:11.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:11 vm06.local ceph-mon[106218]: pgmap v261: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:12.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:12 vm03.local ceph-mon[129670]: pgmap v262: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:12.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:12 vm06.local ceph-mon[106218]: pgmap v262: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:15.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:14 vm03.local ceph-mon[129670]: pgmap v263: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:15.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:14 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:12:15.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:14 vm06.local ceph-mon[106218]: pgmap v263: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:15.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:14 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:12:17.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:16 vm03.local ceph-mon[129670]: pgmap v264: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:17.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:16 vm06.local ceph-mon[106218]: pgmap v264: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:19.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:18 vm03.local ceph-mon[129670]: pgmap v265: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:19.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:19 vm06.local ceph-mon[106218]: pgmap v265: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:20.838 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:20 vm03.local ceph-mon[129670]: pgmap v266: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:20.921 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:20 vm06.local ceph-mon[106218]: pgmap v266: 65 pgs: 65 active+clean; 209 
MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:23.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:22 vm06.local ceph-mon[106218]: pgmap v267: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:23.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:22 vm03.local ceph-mon[129670]: pgmap v267: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:24.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:23 vm03.local ceph-mon[129670]: pgmap v268: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:24.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:23 vm06.local ceph-mon[106218]: pgmap v268: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:27.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:26 vm03.local ceph-mon[129670]: pgmap v269: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:27.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:26 vm06.local ceph-mon[106218]: pgmap v269: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:29.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:28 vm03.local ceph-mon[129670]: pgmap v270: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:29.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:28 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:12:29.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:28 vm06.local ceph-mon[106218]: pgmap v270: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:29.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:28 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:12:31.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:30 vm03.local ceph-mon[129670]: pgmap v271: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:31.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:30 vm06.local ceph-mon[106218]: pgmap v271: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:33.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:32 vm03.local ceph-mon[129670]: pgmap v272: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:33.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:32 vm06.local ceph-mon[106218]: pgmap v272: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:35.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:34 vm03.local ceph-mon[129670]: pgmap v273: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:35.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:34 vm06.local ceph-mon[106218]: pgmap v273: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:37.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:36 vm03.local ceph-mon[129670]: pgmap v274: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 
2026-03-09T00:12:37.223 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:36 vm06.local ceph-mon[106218]: pgmap v274: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:39.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:38 vm03.local ceph-mon[129670]: pgmap v275: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:39.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:38 vm06.local ceph-mon[106218]: pgmap v275: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:41.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:40 vm03.local ceph-mon[129670]: pgmap v276: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:41.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:40 vm06.local ceph-mon[106218]: pgmap v276: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:42.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:12:42.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:12:42.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:12:42.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:41 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:12:42.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-09T00:12:42.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-09T00:12:42.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-09T00:12:42.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:41 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' 2026-03-09T00:12:43.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:42 vm03.local ceph-mon[129670]: pgmap v277: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:43.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:42 vm06.local ceph-mon[106218]: pgmap v277: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:43 vm03.local ceph-mon[129670]: pgmap v278: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:44.338 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:43 vm03.local ceph-mon[129670]: from='mgr.34104 
192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:12:44.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:43 vm06.local ceph-mon[106218]: pgmap v278: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:44.421 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:43 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:12:47.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:46 vm03.local ceph-mon[129670]: pgmap v279: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:47.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:46 vm06.local ceph-mon[106218]: pgmap v279: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:49.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:48 vm03.local ceph-mon[129670]: pgmap v280: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:49.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:48 vm06.local ceph-mon[106218]: pgmap v280: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:51.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:50 vm03.local ceph-mon[129670]: pgmap v281: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:51.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:50 vm06.local ceph-mon[106218]: pgmap v281: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:53.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:52 vm03.local ceph-mon[129670]: pgmap v282: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:53.130 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:52 vm06.local ceph-mon[106218]: pgmap v282: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:55.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:54 vm03.local ceph-mon[129670]: pgmap v283: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:55.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:54 vm06.local ceph-mon[106218]: pgmap v283: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:57.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:56 vm03.local ceph-mon[129670]: pgmap v284: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:57.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:56 vm06.local ceph-mon[106218]: pgmap v284: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:58.374 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-09T00:12:58.517 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:12:58.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.770+0000 7f3a3e76c700 1 -- 
192.168.123.103:0/787510937 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a380690e0 msgr2=0x7f3a38105b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:12:58.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.770+0000 7f3a3e76c700 1 --2- 192.168.123.103:0/787510937 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a380690e0 0x7f3a38105b50 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f3a28009b00 tx=0x7f3a28009e10 comp rx=0 tx=0).stop 2026-03-09T00:12:58.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.771+0000 7f3a3e76c700 1 -- 192.168.123.103:0/787510937 shutdown_connections 2026-03-09T00:12:58.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.771+0000 7f3a3e76c700 1 --2- 192.168.123.103:0/787510937 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a380690e0 0x7f3a38105b50 unknown :-1 s=CLOSED pgs=164 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:12:58.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.771+0000 7f3a3e76c700 1 --2- 192.168.123.103:0/787510937 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3a38068730 0x7f3a38068b10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:12:58.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.771+0000 7f3a3e76c700 1 -- 192.168.123.103:0/787510937 >> 192.168.123.103:0/787510937 conn(0x7f3a38075960 msgr2=0x7f3a38075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:12:58.771 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.771+0000 7f3a3e76c700 1 -- 192.168.123.103:0/787510937 shutdown_connections 2026-03-09T00:12:58.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.771+0000 7f3a3e76c700 1 -- 192.168.123.103:0/787510937 wait complete. 
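Every check in this block runs inside a cephadm shell container: the DEBUG lines show teuthology invoking /home/ubuntu/cephtest/cephadm with the original bootstrap image quay.io/ceph/ceph:v18.2.1 plus the cluster's conf, keyring and fsid, even though the daemons themselves already report 19.2.3. The monclient mark_down/connect churn printed around each invocation is ordinary client bootstrap and teardown at messenger debug level 1, not an error. The wrapper pattern, reflowed from the 'ceph orch ps' DEBUG line above for readability only:

    sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell \
        -c /etc/ceph/ceph.conf \
        -k /etc/ceph/ceph.client.admin.keyring \
        --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 \
        -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df \
        -- bash -c 'ceph orch ps'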
2026-03-09T00:12:58.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.771+0000 7f3a3e76c700 1 Processor -- start 2026-03-09T00:12:58.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.771+0000 7f3a3e76c700 1 -- start start 2026-03-09T00:12:58.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.772+0000 7f3a3e76c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a38068730 0x7f3a380fff40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:12:58.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.772+0000 7f3a3e76c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3a380690e0 0x7f3a38100480 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:12:58.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.772+0000 7f3a3e76c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3a381009c0 con 0x7f3a38068730 2026-03-09T00:12:58.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.772+0000 7f3a3e76c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3a38100b30 con 0x7f3a380690e0 2026-03-09T00:12:58.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.772+0000 7f3a377fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3a380690e0 0x7f3a38100480 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:12:58.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.772+0000 7f3a377fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3a380690e0 0x7f3a38100480 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:57796/0 (socket says 192.168.123.103:57796) 2026-03-09T00:12:58.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.772+0000 7f3a377fe700 1 -- 192.168.123.103:0/3137749191 learned_addr learned my addr 192.168.123.103:0/3137749191 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:12:58.772 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.772+0000 7f3a377fe700 1 -- 192.168.123.103:0/3137749191 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a38068730 msgr2=0x7f3a380fff40 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:12:58.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.772+0000 7f3a37fff700 1 --2- 192.168.123.103:0/3137749191 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a38068730 0x7f3a380fff40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:12:58.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.773+0000 7f3a377fe700 1 --2- 192.168.123.103:0/3137749191 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a38068730 0x7f3a380fff40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:12:58.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.773+0000 7f3a377fe700 1 -- 192.168.123.103:0/3137749191 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3a280097e0 con 
0x7f3a380690e0 2026-03-09T00:12:58.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.773+0000 7f3a37fff700 1 --2- 192.168.123.103:0/3137749191 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a38068730 0x7f3a380fff40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:12:58.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.773+0000 7f3a377fe700 1 --2- 192.168.123.103:0/3137749191 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3a380690e0 0x7f3a38100480 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f3a28006010 tx=0x7f3a28005de0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:12:58.773 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.773+0000 7f3a357fa700 1 -- 192.168.123.103:0/3137749191 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3a2801d070 con 0x7f3a380690e0 2026-03-09T00:12:58.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.773+0000 7f3a3e76c700 1 -- 192.168.123.103:0/3137749191 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3a38100d60 con 0x7f3a380690e0 2026-03-09T00:12:58.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.773+0000 7f3a3e76c700 1 -- 192.168.123.103:0/3137749191 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3a38071c10 con 0x7f3a380690e0 2026-03-09T00:12:58.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.774+0000 7f3a357fa700 1 -- 192.168.123.103:0/3137749191 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3a2800bc50 con 0x7f3a380690e0 2026-03-09T00:12:58.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.774+0000 7f3a357fa700 1 -- 192.168.123.103:0/3137749191 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3a280216b0 con 0x7f3a380690e0 2026-03-09T00:12:58.774 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.774+0000 7f3a3e76c700 1 -- 192.168.123.103:0/3137749191 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3a3804ea90 con 0x7f3a380690e0 2026-03-09T00:12:58.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.775+0000 7f3a357fa700 1 -- 192.168.123.103:0/3137749191 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3a2800fad0 con 0x7f3a380690e0 2026-03-09T00:12:58.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.775+0000 7f3a357fa700 1 --2- 192.168.123.103:0/3137749191 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3a24077910 0x7f3a24079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:12:58.775 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.775+0000 7f3a357fa700 1 -- 192.168.123.103:0/3137749191 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f3a2809ae90 con 0x7f3a380690e0 2026-03-09T00:12:58.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.776+0000 7f3a37fff700 1 --2- 192.168.123.103:0/3137749191 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3a24077910 0x7f3a24079dd0 unknown 
:-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:12:58.776 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.776+0000 7f3a37fff700 1 --2- 192.168.123.103:0/3137749191 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3a24077910 0x7f3a24079dd0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f3a20007950 tx=0x7f3a20008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:12:58.778 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.778+0000 7f3a357fa700 1 -- 192.168.123.103:0/3137749191 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3a280636f0 con 0x7f3a380690e0
2026-03-09T00:12:58.897 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.896+0000 7f3a3e76c700 1 -- 192.168.123.103:0/3137749191 --> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f3a381014c0 con 0x7f3a24077910
2026-03-09T00:12:58.897 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:58 vm03.local ceph-mon[129670]: pgmap v285: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail
2026-03-09T00:12:58.897 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:58 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.902+0000 7f3a357fa700 1 -- 192.168.123.103:0/3137749191 <== mgr.34104 v2:192.168.123.103:6800/1313678299 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f3a381014c0 con 0x7f3a24077910
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:alertmanager.vm03 vm03 *:9093,9094 running (8m) 80s ago 13m 24.4M - 0.25.0 c8568f914cd2 6bc39b415ac6
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm03 vm03 running (82s) 80s ago 13m 9893k - 19.2.3-678-ge911bdeb 654f31e6858e 32be78e0d748
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:ceph-exporter.vm06 vm06 running (81s) 80s ago 12m 9625k - 19.2.3-678-ge911bdeb 654f31e6858e d24bb3c627fc
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm03 vm03 running (6m) 80s ago 13m 7838k - 19.2.3-678-ge911bdeb 654f31e6858e a76700a1f6bf
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:crash.vm06 vm06 running (6m) 80s ago 12m 7860k - 19.2.3-678-ge911bdeb 654f31e6858e ab35f6a843c1
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:grafana.vm03 vm03 *:3000 running (8m) 80s ago 13m 90.2M - 10.4.0 c8b91775d855 00a3394cdec9
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.ralade vm03 running (111s) 80s ago 11m 15.3M - 19.2.3-678-ge911bdeb 654f31e6858e 0f2e03d0bb71
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm03.sejksk vm03 running (2m) 80s ago 11m 101M - 19.2.3-678-ge911bdeb 654f31e6858e 35aa1832dc40
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.ixduim vm06 running (87s) 80s ago 11m 14.4M - 19.2.3-678-ge911bdeb 654f31e6858e 7f828154106b
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:mds.cephfs.vm06.vlrwtl vm06 running (102s) 80s ago 11m 18.3M - 19.2.3-678-ge911bdeb 654f31e6858e c123d86a9659
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm03.yvcons vm03 *:8443,9283,8765 running (9m) 80s ago 13m 632M - 19.2.3-678-ge911bdeb 654f31e6858e 5c5f89207f88
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:mgr.vm06.rzcvhn vm06 *:8443,9283,8765 running (8m) 80s ago 12m 491M - 19.2.3-678-ge911bdeb 654f31e6858e e3d70135f3ac
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm03 vm03 running (7m) 80s ago 14m 69.3M 2048M 19.2.3-678-ge911bdeb 654f31e6858e cafe87ec117d
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:mon.vm06 vm06 running (7m) 80s ago 12m 58.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 33df752aa193
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm03 vm03 *:9100 running (8m) 80s ago 13m 9.82M - 1.7.0 72c9c2088986 0cdd6e671b4f
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:node-exporter.vm06 vm06 *:9100 running (8m) 80s ago 12m 9684k - 1.7.0 72c9c2088986 848c5c72973d
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:osd.0 vm03 running (6m) 80s ago 12m 184M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 7112eceae9ce
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:osd.1 vm03 running (3m) 80s ago 12m 157M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e70d2f37c6d1
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:osd.2 vm03 running (3m) 80s ago 12m 112M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e7841e7307ae
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:osd.3 vm06 running (3m) 80s ago 11m 190M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 8e61be617139
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:osd.4 vm06 running (2m) 80s ago 11m 150M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 21cf4dc58899
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:osd.5 vm06 running (2m) 80s ago 11m 140M 4096M 19.2.3-678-ge911bdeb 654f31e6858e fbc950d55a67
2026-03-09T00:12:58.903 INFO:teuthology.orchestra.run.vm03.stdout:prometheus.vm03 vm03 *:9095 running (8m) 80s ago 13m 59.3M - 2.51.0 1d3b7f56885b 16d6071e49fb
2026-03-09T00:12:58.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.905+0000 7f3a3e76c700 1 -- 192.168.123.103:0/3137749191 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3a24077910 msgr2=0x7f3a24079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:12:58.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.905+0000 7f3a3e76c700 1 --2- 192.168.123.103:0/3137749191 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3a24077910 0x7f3a24079dd0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f3a20007950 tx=0x7f3a20008040 comp rx=0 tx=0).stop
2026-03-09T00:12:58.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.905+0000 7f3a3e76c700 1 -- 192.168.123.103:0/3137749191 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3a380690e0 msgr2=0x7f3a38100480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:12:58.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.905+0000 7f3a3e76c700 1 --2- 192.168.123.103:0/3137749191 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3a380690e0 0x7f3a38100480 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f3a28006010 tx=0x7f3a28005de0 comp rx=0 tx=0).stop
2026-03-09T00:12:58.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.905+0000 7f3a3e76c700 1 -- 192.168.123.103:0/3137749191 shutdown_connections
2026-03-09T00:12:58.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.905+0000 7f3a3e76c700 1 --2- 192.168.123.103:0/3137749191 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f3a24077910 0x7f3a24079dd0 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:12:58.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.905+0000 7f3a3e76c700 1 --2- 192.168.123.103:0/3137749191 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f3a38068730 0x7f3a380fff40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:12:58.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.905+0000 7f3a3e76c700 1 --2- 192.168.123.103:0/3137749191 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f3a380690e0 0x7f3a38100480 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:12:58.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.905+0000 7f3a3e76c700 1 -- 192.168.123.103:0/3137749191 >> 192.168.123.103:0/3137749191 conn(0x7f3a38075960 msgr2=0x7f3a380ff7d0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:12:58.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.905+0000 7f3a3e76c700 1 -- 192.168.123.103:0/3137749191 shutdown_connections
2026-03-09T00:12:58.905 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:58.905+0000 7f3a3e76c700 1 -- 192.168.123.103:0/3137749191 wait complete.
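In the orch ps table above, every Ceph daemon (mon, mgr, mds, osd, crash, ceph-exporter) reports 19.2.3-678-ge911bdeb from the same image id 654f31e6858e, while the monitoring stack keeps its own upstream versions (alertmanager 0.25.0, grafana 10.4.0, node-exporter 1.7.0, prometheus 2.51.0). That is presumably why the gate that follows keys on ceph versions, which counts only Ceph daemons, rather than on container tags. The same per-daemon view is scriptable through the orchestrator's JSON output; the jq field names below are the ones recent cephadm releases emit and should be read as an assumption, not as verified against this build:

    # per-daemon name/version pairs (field names assumed, not taken from this log)
    ceph orch ps --format json | jq -r '.[] | [.daemon_name, .version] | @tsv'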
2026-03-09T00:12:58.966 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions' 2026-03-09T00:12:59.114 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:12:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:58 vm06.local ceph-mon[106218]: pgmap v285: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:12:59.171 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:58 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:12:59.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.353+0000 7efd40852700 1 -- 192.168.123.103:0/1145227692 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd380731c0 msgr2=0x7efd380735a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:12:59.354 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.353+0000 7efd40852700 1 --2- 192.168.123.103:0/1145227692 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd380731c0 0x7efd380735a0 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7efd24009b50 tx=0x7efd24009e60 comp rx=0 tx=0).stop 2026-03-09T00:12:59.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.354+0000 7efd40852700 1 -- 192.168.123.103:0/1145227692 shutdown_connections 2026-03-09T00:12:59.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.354+0000 7efd40852700 1 --2- 192.168.123.103:0/1145227692 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efd38073ae0 0x7efd3810d1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:12:59.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.354+0000 7efd40852700 1 --2- 192.168.123.103:0/1145227692 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd380731c0 0x7efd380735a0 unknown :-1 s=CLOSED pgs=165 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:12:59.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.354+0000 7efd40852700 1 -- 192.168.123.103:0/1145227692 >> 192.168.123.103:0/1145227692 conn(0x7efd380fc9b0 msgr2=0x7efd380fedd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:12:59.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.354+0000 7efd40852700 1 -- 192.168.123.103:0/1145227692 shutdown_connections 2026-03-09T00:12:59.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.354+0000 7efd40852700 1 -- 192.168.123.103:0/1145227692 wait complete. 
2026-03-09T00:12:59.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.355+0000 7efd40852700 1 Processor -- start 2026-03-09T00:12:59.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.355+0000 7efd40852700 1 -- start start 2026-03-09T00:12:59.355 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.355+0000 7efd40852700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efd380731c0 0x7efd38198e10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:12:59.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.355+0000 7efd40852700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd38073ae0 0x7efd38199350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:12:59.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.356+0000 7efd3dded700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd38073ae0 0x7efd38199350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:12:59.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.356+0000 7efd3dded700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd38073ae0 0x7efd38199350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59504/0 (socket says 192.168.123.103:59504) 2026-03-09T00:12:59.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.356+0000 7efd3e5ee700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efd380731c0 0x7efd38198e10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:12:59.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.356+0000 7efd3e5ee700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efd380731c0 0x7efd38198e10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:57812/0 (socket says 192.168.123.103:57812) 2026-03-09T00:12:59.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.356+0000 7efd40852700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efd38199a30 con 0x7efd38073ae0 2026-03-09T00:12:59.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.356+0000 7efd3dded700 1 -- 192.168.123.103:0/3627365381 learned_addr learned my addr 192.168.123.103:0/3627365381 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:12:59.356 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.356+0000 7efd40852700 1 -- 192.168.123.103:0/3627365381 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efd3819d7c0 con 0x7efd380731c0 2026-03-09T00:12:59.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.356+0000 7efd3dded700 1 -- 192.168.123.103:0/3627365381 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efd380731c0 msgr2=0x7efd38198e10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:12:59.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.356+0000 7efd3dded700 1 --2- 192.168.123.103:0/3627365381 >> 
[v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efd380731c0 0x7efd38198e10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:12:59.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.356+0000 7efd3dded700 1 -- 192.168.123.103:0/3627365381 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efd240097e0 con 0x7efd38073ae0 2026-03-09T00:12:59.357 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.356+0000 7efd3dded700 1 --2- 192.168.123.103:0/3627365381 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd38073ae0 0x7efd38199350 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7efd2c00d900 tx=0x7efd2c00dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:12:59.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.357+0000 7efd337fe700 1 -- 192.168.123.103:0/3627365381 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efd2c0041d0 con 0x7efd38073ae0 2026-03-09T00:12:59.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.357+0000 7efd337fe700 1 -- 192.168.123.103:0/3627365381 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7efd2c004d10 con 0x7efd38073ae0 2026-03-09T00:12:59.358 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.357+0000 7efd337fe700 1 -- 192.168.123.103:0/3627365381 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efd2c00b750 con 0x7efd38073ae0 2026-03-09T00:12:59.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.357+0000 7efd40852700 1 -- 192.168.123.103:0/3627365381 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efd3819daa0 con 0x7efd38073ae0 2026-03-09T00:12:59.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.357+0000 7efd40852700 1 -- 192.168.123.103:0/3627365381 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efd3819dff0 con 0x7efd38073ae0 2026-03-09T00:12:59.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.358+0000 7efd40852700 1 -- 192.168.123.103:0/3627365381 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efd3810a8c0 con 0x7efd38073ae0 2026-03-09T00:12:59.361 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.359+0000 7efd337fe700 1 -- 192.168.123.103:0/3627365381 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7efd2c004e80 con 0x7efd38073ae0 2026-03-09T00:12:59.362 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.359+0000 7efd337fe700 1 --2- 192.168.123.103:0/3627365381 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7efd28077990 0x7efd28079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:12:59.362 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.359+0000 7efd337fe700 1 -- 192.168.123.103:0/3627365381 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7efd2c01f030 con 0x7efd38073ae0 2026-03-09T00:12:59.362 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.361+0000 7efd337fe700 1 -- 192.168.123.103:0/3627365381 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7efd2c062890 con 0x7efd38073ae0
2026-03-09T00:12:59.362 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.361+0000 7efd3e5ee700 1 --2- 192.168.123.103:0/3627365381 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7efd28077990 0x7efd28079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:12:59.363 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.362+0000 7efd3e5ee700 1 --2- 192.168.123.103:0/3627365381 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7efd28077990 0x7efd28079e50 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7efd2400b5c0 tx=0x7efd240058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:12:59.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.519+0000 7efd40852700 1 -- 192.168.123.103:0/3627365381 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7efd3819a210 con 0x7efd38073ae0
2026-03-09T00:12:59.520 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.520+0000 7efd337fe700 1 -- 192.168.123.103:0/3627365381 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7efd2c061fe0 con 0x7efd38073ae0
2026-03-09T00:12:59.522 INFO:teuthology.orchestra.run.vm03.stdout:{
2026-03-09T00:12:59.522 INFO:teuthology.orchestra.run.vm03.stdout: "mon": {
2026-03-09T00:12:59.522 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:12:59.522 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:12:59.522 INFO:teuthology.orchestra.run.vm03.stdout: "mgr": {
2026-03-09T00:12:59.522 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-09T00:12:59.522 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:12:59.522 INFO:teuthology.orchestra.run.vm03.stdout: "osd": {
2026-03-09T00:12:59.522 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6
2026-03-09T00:12:59.522 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:12:59.522 INFO:teuthology.orchestra.run.vm03.stdout: "mds": {
2026-03-09T00:12:59.522 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4
2026-03-09T00:12:59.522 INFO:teuthology.orchestra.run.vm03.stdout: },
2026-03-09T00:12:59.522 INFO:teuthology.orchestra.run.vm03.stdout: "overall": {
2026-03-09T00:12:59.522 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14
2026-03-09T00:12:59.522 INFO:teuthology.orchestra.run.vm03.stdout: }
2026-03-09T00:12:59.522 INFO:teuthology.orchestra.run.vm03.stdout:}
2026-03-09T00:12:59.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.524+0000 7efd40852700 1 -- 192.168.123.103:0/3627365381 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7efd28077990 msgr2=0x7efd28079e50 secure :-1
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:12:59.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.524+0000 7efd40852700 1 --2- 192.168.123.103:0/3627365381 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7efd28077990 0x7efd28079e50 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7efd2400b5c0 tx=0x7efd240058e0 comp rx=0 tx=0).stop 2026-03-09T00:12:59.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.524+0000 7efd40852700 1 -- 192.168.123.103:0/3627365381 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd38073ae0 msgr2=0x7efd38199350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:12:59.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.524+0000 7efd40852700 1 --2- 192.168.123.103:0/3627365381 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd38073ae0 0x7efd38199350 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7efd2c00d900 tx=0x7efd2c00dcc0 comp rx=0 tx=0).stop 2026-03-09T00:12:59.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.524+0000 7efd40852700 1 -- 192.168.123.103:0/3627365381 shutdown_connections 2026-03-09T00:12:59.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.524+0000 7efd40852700 1 --2- 192.168.123.103:0/3627365381 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7efd28077990 0x7efd28079e50 secure :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7efd2400b5c0 tx=0x7efd240058e0 comp rx=0 tx=0).stop 2026-03-09T00:12:59.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.524+0000 7efd40852700 1 --2- 192.168.123.103:0/3627365381 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7efd380731c0 0x7efd38198e10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:12:59.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.524+0000 7efd40852700 1 --2- 192.168.123.103:0/3627365381 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7efd38073ae0 0x7efd38199350 secure :-1 s=CLOSED pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7efd2c00d900 tx=0x7efd2c00dcc0 comp rx=0 tx=0).stop 2026-03-09T00:12:59.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.524+0000 7efd40852700 1 -- 192.168.123.103:0/3627365381 >> 192.168.123.103:0/3627365381 conn(0x7efd380fc9b0 msgr2=0x7efd38107a00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:12:59.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.525+0000 7efd40852700 1 -- 192.168.123.103:0/3627365381 shutdown_connections 2026-03-09T00:12:59.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.525+0000 7efd40852700 1 -- 192.168.123.103:0/3627365381 wait complete. 
2026-03-09T00:12:59.590 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | length == 1'"'"'' 2026-03-09T00:12:59.735 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:12:59.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.984+0000 7fa2fd55f700 1 -- 192.168.123.103:0/2460536067 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa2f8103cf0 msgr2=0x7fa2f8107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:12:59.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.984+0000 7fa2fd55f700 1 --2- 192.168.123.103:0/2460536067 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa2f8103cf0 0x7fa2f8107d40 secure :-1 s=READY pgs=167 cs=0 l=1 rev1=1 crypto rx=0x7fa2e8009b50 tx=0x7fa2e8009e60 comp rx=0 tx=0).stop 2026-03-09T00:12:59.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.985+0000 7fa2fd55f700 1 -- 192.168.123.103:0/2460536067 shutdown_connections 2026-03-09T00:12:59.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.985+0000 7fa2fd55f700 1 --2- 192.168.123.103:0/2460536067 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa2f8103cf0 0x7fa2f8107d40 secure :-1 s=CLOSED pgs=167 cs=0 l=1 rev1=1 crypto rx=0x7fa2e8009b50 tx=0x7fa2e8009e60 comp rx=0 tx=0).stop 2026-03-09T00:12:59.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.985+0000 7fa2fd55f700 1 --2- 192.168.123.103:0/2460536067 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa2f8103340 0x7fa2f8103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:12:59.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.985+0000 7fa2fd55f700 1 -- 192.168.123.103:0/2460536067 >> 192.168.123.103:0/2460536067 conn(0x7fa2f80feb90 msgr2=0x7fa2f8100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:12:59.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.986+0000 7fa2fd55f700 1 -- 192.168.123.103:0/2460536067 shutdown_connections 2026-03-09T00:12:59.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.986+0000 7fa2fd55f700 1 -- 192.168.123.103:0/2460536067 wait complete. 
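The nested '"'"' sequences in the DEBUG line above are the usual bash idiom for embedding a single quote inside an already single-quoted string; what actually executes in the shell container is ceph versions | jq -e '.overall | length == 1'. jq's -e flag maps the filter result onto the exit status (0 when the last output is neither false nor null), so the pipeline fails until every daemon reports one and the same version. A sketch of how a gate like this is typically polled while an upgrade converges; the loop and interval are illustrative, not taken from this run:

    # retry the convergence gate until it passes (illustrative wrapper)
    until ceph versions | jq -e '.overall | length == 1' >/dev/null; do
        sleep 10
    done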
2026-03-09T00:12:59.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.986+0000 7fa2fd55f700 1 Processor -- start 2026-03-09T00:12:59.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.986+0000 7fa2fd55f700 1 -- start start 2026-03-09T00:12:59.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.986+0000 7fa2fd55f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa2f8103340 0x7fa2f8199040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:12:59.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.986+0000 7fa2fd55f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa2f8199580 0x7fa2f819d9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:12:59.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.986+0000 7fa2fd55f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa2f8199ba0 con 0x7fa2f8103340 2026-03-09T00:12:59.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.986+0000 7fa2fd55f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa2f8199d10 con 0x7fa2f8199580 2026-03-09T00:12:59.987 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.987+0000 7fa2f6ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa2f8103340 0x7fa2f8199040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:12:59.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.987+0000 7fa2f6ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa2f8103340 0x7fa2f8199040 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59526/0 (socket says 192.168.123.103:59526) 2026-03-09T00:12:59.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.987+0000 7fa2f6ffd700 1 -- 192.168.123.103:0/834936507 learned_addr learned my addr 192.168.123.103:0/834936507 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:12:59.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.987+0000 7fa2f67fc700 1 --2- 192.168.123.103:0/834936507 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa2f8199580 0x7fa2f819d9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:12:59.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.987+0000 7fa2f6ffd700 1 -- 192.168.123.103:0/834936507 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa2f8199580 msgr2=0x7fa2f819d9a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:12:59.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.987+0000 7fa2f6ffd700 1 --2- 192.168.123.103:0/834936507 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa2f8199580 0x7fa2f819d9a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:12:59.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.987+0000 7fa2f6ffd700 1 -- 192.168.123.103:0/834936507 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa2e80097e0 con 
0x7fa2f8103340 2026-03-09T00:12:59.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.987+0000 7fa2f6ffd700 1 --2- 192.168.123.103:0/834936507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa2f8103340 0x7fa2f8199040 secure :-1 s=READY pgs=168 cs=0 l=1 rev1=1 crypto rx=0x7fa2e000eb10 tx=0x7fa2e000ee20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:12:59.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.987+0000 7fa2effff700 1 -- 192.168.123.103:0/834936507 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa2e000cc40 con 0x7fa2f8103340 2026-03-09T00:12:59.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.987+0000 7fa2fd55f700 1 -- 192.168.123.103:0/834936507 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa2f819dfa0 con 0x7fa2f8103340 2026-03-09T00:12:59.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.987+0000 7fa2fd55f700 1 -- 192.168.123.103:0/834936507 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa2f819e4f0 con 0x7fa2f8103340 2026-03-09T00:12:59.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.988+0000 7fa2effff700 1 -- 192.168.123.103:0/834936507 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa2e000cda0 con 0x7fa2f8103340 2026-03-09T00:12:59.989 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.988+0000 7fa2effff700 1 -- 192.168.123.103:0/834936507 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa2e0018810 con 0x7fa2f8103340 2026-03-09T00:12:59.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.989+0000 7fa2effff700 1 -- 192.168.123.103:0/834936507 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa2e0018ab0 con 0x7fa2f8103340 2026-03-09T00:12:59.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.989+0000 7fa2effff700 1 --2- 192.168.123.103:0/834936507 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa2e40779e0 0x7fa2e4079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:12:59.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.989+0000 7fa2f67fc700 1 --2- 192.168.123.103:0/834936507 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa2e40779e0 0x7fa2e4079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:12:59.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.990+0000 7fa2f67fc700 1 --2- 192.168.123.103:0/834936507 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa2e40779e0 0x7fa2e4079ea0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fa2e8000c00 tx=0x7fa2e8024040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:12:59.990 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.990+0000 7fa2effff700 1 -- 192.168.123.103:0/834936507 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fa2e0014070 con 0x7fa2f8103340 2026-03-09T00:12:59.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.990+0000 
7fa2fd55f700 1 -- 192.168.123.103:0/834936507 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa2d8005320 con 0x7fa2f8103340 2026-03-09T00:12:59.993 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:12:59.993+0000 7fa2effff700 1 -- 192.168.123.103:0/834936507 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa2e0063c60 con 0x7fa2f8103340 2026-03-09T00:13:00.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:59 vm03.local ceph-mon[129670]: from='client.44343 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:13:00.088 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:12:59 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/3627365381' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:13:00.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:00.158+0000 7fa2fd55f700 1 -- 192.168.123.103:0/834936507 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fa2d8006200 con 0x7fa2f8103340 2026-03-09T00:13:00.159 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:00.158+0000 7fa2effff700 1 -- 192.168.123.103:0/834936507 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7fa2e00633b0 con 0x7fa2f8103340 2026-03-09T00:13:00.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:00.161+0000 7fa2fd55f700 1 -- 192.168.123.103:0/834936507 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa2e40779e0 msgr2=0x7fa2e4079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:00.161 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:00.161+0000 7fa2fd55f700 1 --2- 192.168.123.103:0/834936507 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa2e40779e0 0x7fa2e4079ea0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7fa2e8000c00 tx=0x7fa2e8024040 comp rx=0 tx=0).stop 2026-03-09T00:13:00.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:00.161+0000 7fa2fd55f700 1 -- 192.168.123.103:0/834936507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa2f8103340 msgr2=0x7fa2f8199040 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:00.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:00.161+0000 7fa2fd55f700 1 --2- 192.168.123.103:0/834936507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa2f8103340 0x7fa2f8199040 secure :-1 s=READY pgs=168 cs=0 l=1 rev1=1 crypto rx=0x7fa2e000eb10 tx=0x7fa2e000ee20 comp rx=0 tx=0).stop 2026-03-09T00:13:00.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:00.161+0000 7fa2fd55f700 1 -- 192.168.123.103:0/834936507 shutdown_connections 2026-03-09T00:13:00.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:00.161+0000 7fa2fd55f700 1 --2- 192.168.123.103:0/834936507 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa2e40779e0 0x7fa2e4079ea0 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:00.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:00.161+0000 7fa2fd55f700 1 --2- 192.168.123.103:0/834936507 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] 
conn(0x7fa2f8103340 0x7fa2f8199040 unknown :-1 s=CLOSED pgs=168 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:00.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:00.161+0000 7fa2fd55f700 1 --2- 192.168.123.103:0/834936507 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa2f8199580 0x7fa2f819d9a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:00.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:00.161+0000 7fa2fd55f700 1 -- 192.168.123.103:0/834936507 >> 192.168.123.103:0/834936507 conn(0x7fa2f80feb90 msgr2=0x7fa2f8100300 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:00.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:00.162+0000 7fa2fd55f700 1 -- 192.168.123.103:0/834936507 shutdown_connections 2026-03-09T00:13:00.162 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:00.162+0000 7fa2fd55f700 1 -- 192.168.123.103:0/834936507 wait complete. 2026-03-09T00:13:00.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:59 vm06.local ceph-mon[106218]: from='client.44343 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-09T00:13:00.170 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:12:59 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/3627365381' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:13:00.171 INFO:teuthology.orchestra.run.vm03.stdout:true 2026-03-09T00:13:00.350 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | keys'"'"' | grep $sha1' 2026-03-09T00:13:00.531 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:00.931 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:00 vm03.local ceph-mon[129670]: pgmap v286: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:13:00.931 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:00 vm03.local ceph-mon[129670]: from='client.? 
192.168.123.103:0/834936507' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:13:01.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.229+0000 7f96ea0b6700 1 -- 192.168.123.103:0/3204494602 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f96e4103340 msgr2=0x7f96e4103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:01.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.229+0000 7f96ea0b6700 1 --2- 192.168.123.103:0/3204494602 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f96e4103340 0x7f96e4103720 secure :-1 s=READY pgs=169 cs=0 l=1 rev1=1 crypto rx=0x7f96d8009a60 tx=0x7f96d8009d70 comp rx=0 tx=0).stop 2026-03-09T00:13:01.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.230+0000 7f96ea0b6700 1 -- 192.168.123.103:0/3204494602 shutdown_connections 2026-03-09T00:13:01.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.230+0000 7f96ea0b6700 1 --2- 192.168.123.103:0/3204494602 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e4103cf0 0x7f96e4107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:01.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.230+0000 7f96ea0b6700 1 --2- 192.168.123.103:0/3204494602 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f96e4103340 0x7f96e4103720 unknown :-1 s=CLOSED pgs=169 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:01.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.230+0000 7f96ea0b6700 1 -- 192.168.123.103:0/3204494602 >> 192.168.123.103:0/3204494602 conn(0x7f96e40feb90 msgr2=0x7f96e4100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:01.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.231+0000 7f96ea0b6700 1 -- 192.168.123.103:0/3204494602 shutdown_connections 2026-03-09T00:13:01.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.231+0000 7f96ea0b6700 1 -- 192.168.123.103:0/3204494602 wait complete. 
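The second assertion, `ceph versions | jq -e '.overall | keys' | grep $sha1`, tightens the first: not only must a single version remain, its key must embed the exact build sha1 exported via `-e sha1=...` (the stdout a few entries below confirms the match). A hedged Python equivalent, again with an illustrative helper name:

    import json
    import subprocess

    def overall_matches(sha1: str) -> bool:
        # each "overall" key has the form
        # "ceph version <ver>-<n>-g<shortsha> (<full sha1>) <codename> (stable)",
        # so a substring test against the target sha1 suffices
        overall = json.loads(subprocess.check_output(["ceph", "versions"]))["overall"]
        return any(sha1 in version for version in overall)
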
2026-03-09T00:13:01.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.231+0000 7f96ea0b6700 1 Processor -- start 2026-03-09T00:13:01.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.231+0000 7f96ea0b6700 1 -- start start 2026-03-09T00:13:01.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.232+0000 7f96ea0b6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f96e4103340 0x7f96e4198dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:01.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.232+0000 7f96ea0b6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e4103cf0 0x7f96e4199310 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:01.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.232+0000 7f96ea0b6700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f96e41999f0 con 0x7f96e4103340 2026-03-09T00:13:01.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.232+0000 7f96ea0b6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f96e419d780 con 0x7f96e4103cf0 2026-03-09T00:13:01.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.232+0000 7f96e37fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f96e4103340 0x7f96e4198dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:01.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.232+0000 7f96e2ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e4103cf0 0x7f96e4199310 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:01.232 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.232+0000 7f96e2ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e4103cf0 0x7f96e4199310 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:57846/0 (socket says 192.168.123.103:57846) 2026-03-09T00:13:01.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.232+0000 7f96e2ffd700 1 -- 192.168.123.103:0/1894885750 learned_addr learned my addr 192.168.123.103:0/1894885750 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:01.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.233+0000 7f96e2ffd700 1 -- 192.168.123.103:0/1894885750 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f96e4103340 msgr2=0x7f96e4198dd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:01.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.233+0000 7f96e2ffd700 1 --2- 192.168.123.103:0/1894885750 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f96e4103340 0x7f96e4198dd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:01.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.233+0000 7f96e2ffd700 1 -- 192.168.123.103:0/1894885750 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f96d8009710 con 0x7f96e4103cf0 
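Most of the stderr volume here is msgr2 client chatter: each shell invocation walks its monitor connections through NONE, BANNER_CONNECTING, HELLO_CONNECTING, AUTH_CONNECTING and READY, then tears them down via mark_down/stop into CLOSED. When skimming such dumps it can help to collapse the entries per connection pointer; a small sketch whose regex is inferred from the line format in this log:

    import re
    from collections import defaultdict

    # matches "conn(<ptr> ... s=<STATE>" as printed in the entries above
    STATE_RE = re.compile(r"conn\((0x[0-9a-f]+) .*? s=([A-Z_]+)")

    def connection_states(lines):
        transitions = defaultdict(list)
        for line in lines:
            m = STATE_RE.search(line)
            if m and (not transitions[m.group(1)]
                      or transitions[m.group(1)][-1] != m.group(2)):
                transitions[m.group(1)].append(m.group(2))
        # e.g. {"0x7f96e4103340": ["NONE", "BANNER_CONNECTING", ..., "CLOSED"]}
        return transitions
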
2026-03-09T00:13:01.233 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.233+0000 7f96e37fe700 1 --2- 192.168.123.103:0/1894885750 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f96e4103340 0x7f96e4198dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T00:13:01.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.233+0000 7f96e2ffd700 1 --2- 192.168.123.103:0/1894885750 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e4103cf0 0x7f96e4199310 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f96d400eab0 tx=0x7f96d400edc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:01.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.233+0000 7f96e0ff9700 1 -- 192.168.123.103:0/1894885750 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f96d400cb80 con 0x7f96e4103cf0 2026-03-09T00:13:01.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.234+0000 7f96ea0b6700 1 -- 192.168.123.103:0/1894885750 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f96e419da60 con 0x7f96e4103cf0 2026-03-09T00:13:01.234 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.234+0000 7f96ea0b6700 1 -- 192.168.123.103:0/1894885750 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f96e419dfb0 con 0x7f96e4103cf0 2026-03-09T00:13:01.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.234+0000 7f96e0ff9700 1 -- 192.168.123.103:0/1894885750 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f96d400cce0 con 0x7f96e4103cf0 2026-03-09T00:13:01.235 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.234+0000 7f96e0ff9700 1 -- 192.168.123.103:0/1894885750 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f96d40187a0 con 0x7f96e4103cf0 2026-03-09T00:13:01.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.235+0000 7f96e0ff9700 1 -- 192.168.123.103:0/1894885750 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f96d40189d0 con 0x7f96e4103cf0 2026-03-09T00:13:01.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.235+0000 7f96ea0b6700 1 -- 192.168.123.103:0/1894885750 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f96e410b6e0 con 0x7f96e4103cf0 2026-03-09T00:13:01.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.236+0000 7f96e0ff9700 1 --2- 192.168.123.103:0/1894885750 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f96cc077870 0x7f96cc079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:01.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.236+0000 7f96e0ff9700 1 -- 192.168.123.103:0/1894885750 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f96d4014070 con 0x7f96e4103cf0 2026-03-09T00:13:01.236 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.236+0000 7f96e37fe700 1 --2- 192.168.123.103:0/1894885750 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f96cc077870 0x7f96cc079d30 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:01.237 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.237+0000 7f96e37fe700 1 --2- 192.168.123.103:0/1894885750 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f96cc077870 0x7f96cc079d30 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f96d8009a30 tx=0x7f96d800b580 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:01.239 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.238+0000 7f96e0ff9700 1 -- 192.168.123.103:0/1894885750 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f96d4063b90 con 0x7f96e4103cf0 2026-03-09T00:13:01.403 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.402+0000 7f96ea0b6700 1 -- 192.168.123.103:0/1894885750 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f96e419dbf0 con 0x7f96e4103cf0 2026-03-09T00:13:01.403 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.403+0000 7f96e0ff9700 1 -- 192.168.123.103:0/1894885750 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f96e419dbf0 con 0x7f96e4103cf0 2026-03-09T00:13:01.405 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.405+0000 7f96ea0b6700 1 -- 192.168.123.103:0/1894885750 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f96cc077870 msgr2=0x7f96cc079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:01.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.405+0000 7f96ea0b6700 1 --2- 192.168.123.103:0/1894885750 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f96cc077870 0x7f96cc079d30 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f96d8009a30 tx=0x7f96d800b580 comp rx=0 tx=0).stop 2026-03-09T00:13:01.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.405+0000 7f96ea0b6700 1 -- 192.168.123.103:0/1894885750 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e4103cf0 msgr2=0x7f96e4199310 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:01.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.405+0000 7f96ea0b6700 1 --2- 192.168.123.103:0/1894885750 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e4103cf0 0x7f96e4199310 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f96d400eab0 tx=0x7f96d400edc0 comp rx=0 tx=0).stop 2026-03-09T00:13:01.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.405+0000 7f96ea0b6700 1 -- 192.168.123.103:0/1894885750 shutdown_connections 2026-03-09T00:13:01.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.405+0000 7f96ea0b6700 1 --2- 192.168.123.103:0/1894885750 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f96cc077870 0x7f96cc079d30 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:01.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.405+0000 7f96ea0b6700 1 --2- 192.168.123.103:0/1894885750 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f96e4103340 0x7f96e4198dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:01.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.405+0000 7f96ea0b6700 1 --2- 192.168.123.103:0/1894885750 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f96e4103cf0 0x7f96e4199310 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:01.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.405+0000 7f96ea0b6700 1 -- 192.168.123.103:0/1894885750 >> 192.168.123.103:0/1894885750 conn(0x7f96e40feb90 msgr2=0x7f96e4100f30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:01.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.406+0000 7f96ea0b6700 1 -- 192.168.123.103:0/1894885750 shutdown_connections 2026-03-09T00:13:01.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.406+0000 7f96ea0b6700 1 -- 192.168.123.103:0/1894885750 wait complete. 2026-03-09T00:13:01.415 INFO:teuthology.orchestra.run.vm03.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)" 2026-03-09T00:13:01.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:00 vm06.local ceph-mon[106218]: pgmap v286: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:13:01.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:00 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/834936507' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:13:01.525 DEBUG:teuthology.parallel:result is None 2026-03-09T00:13:01.525 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-09T00:13:01.528 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm03.local 2026-03-09T00:13:01.528 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- bash -c 'ceph fs dump' 2026-03-09T00:13:01.673 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:01.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.907+0000 7f95d8274700 1 -- 192.168.123.103:0/2739417707 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95d01001a0 msgr2=0x7f95d0100580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:01.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.907+0000 7f95d8274700 1 --2- 192.168.123.103:0/2739417707 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95d01001a0 0x7f95d0100580 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f95c4009a60 tx=0x7f95c4009d70 comp rx=0 tx=0).stop 2026-03-09T00:13:01.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.908+0000 7f95d8274700 1 -- 192.168.123.103:0/2739417707 shutdown_connections 2026-03-09T00:13:01.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.908+0000 7f95d8274700 1 --2- 192.168.123.103:0/2739417707 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f95d0100b50 0x7f95d0104a40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:01.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.908+0000 7f95d8274700 1 --2- 192.168.123.103:0/2739417707 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95d01001a0 0x7f95d0100580 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:01.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.908+0000 7f95d8274700 1 -- 192.168.123.103:0/2739417707 >> 192.168.123.103:0/2739417707 conn(0x7f95d0075960 msgr2=0x7f95d0075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:01.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.908+0000 7f95d8274700 1 -- 192.168.123.103:0/2739417707 shutdown_connections 2026-03-09T00:13:01.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.909+0000 7f95d8274700 1 -- 192.168.123.103:0/2739417707 wait complete. 2026-03-09T00:13:01.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.909+0000 7f95d8274700 1 Processor -- start 2026-03-09T00:13:01.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.909+0000 7f95d8274700 1 -- start start 2026-03-09T00:13:01.909 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.909+0000 7f95d8274700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f95d01001a0 0x7f95d0198e60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:01.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.909+0000 7f95d6010700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f95d01001a0 0x7f95d0198e60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:01.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.910+0000 7f95d6010700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f95d01001a0 0x7f95d0198e60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59562/0 (socket says 192.168.123.103:59562) 2026-03-09T00:13:01.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.910+0000 7f95d8274700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95d0100b50 0x7f95d01993a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:01.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.910+0000 7f95d6010700 1 -- 192.168.123.103:0/2463160653 learned_addr learned my addr 192.168.123.103:0/2463160653 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:01.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.910+0000 7f95d8274700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f95d0199a80 con 0x7f95d01001a0 2026-03-09T00:13:01.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.910+0000 7f95d8274700 1 -- 192.168.123.103:0/2463160653 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f95d019d810 con 0x7f95d0100b50 2026-03-09T00:13:01.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.910+0000 7f95d580f700 1 --2- 192.168.123.103:0/2463160653 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95d0100b50 0x7f95d01993a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:01.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.910+0000 7f95d6010700 1 -- 192.168.123.103:0/2463160653 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95d0100b50 
msgr2=0x7f95d01993a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:01.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.910+0000 7f95d6010700 1 --2- 192.168.123.103:0/2463160653 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95d0100b50 0x7f95d01993a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:01.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.910+0000 7f95d6010700 1 -- 192.168.123.103:0/2463160653 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f95cc0097e0 con 0x7f95d01001a0 2026-03-09T00:13:01.910 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.910+0000 7f95d580f700 1 --2- 192.168.123.103:0/2463160653 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95d0100b50 0x7f95d01993a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T00:13:01.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.910+0000 7f95d6010700 1 --2- 192.168.123.103:0/2463160653 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f95d01001a0 0x7f95d0198e60 secure :-1 s=READY pgs=170 cs=0 l=1 rev1=1 crypto rx=0x7f95c400b5c0 tx=0x7f95c400f690 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:01.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.911+0000 7f95c37fe700 1 -- 192.168.123.103:0/2463160653 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f95c401d070 con 0x7f95d01001a0 2026-03-09T00:13:01.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.911+0000 7f95c37fe700 1 -- 192.168.123.103:0/2463160653 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f95c400fc70 con 0x7f95d01001a0 2026-03-09T00:13:01.911 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.911+0000 7f95d8274700 1 -- 192.168.123.103:0/2463160653 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f95c4009710 con 0x7f95d01001a0 2026-03-09T00:13:01.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.911+0000 7f95c37fe700 1 -- 192.168.123.103:0/2463160653 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f95c4017710 con 0x7f95d01001a0 2026-03-09T00:13:01.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.911+0000 7f95d8274700 1 -- 192.168.123.103:0/2463160653 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f95d019ddf0 con 0x7f95d01001a0 2026-03-09T00:13:01.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.912+0000 7f95c37fe700 1 -- 192.168.123.103:0/2463160653 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f95c4017870 con 0x7f95d01001a0 2026-03-09T00:13:01.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.912+0000 7f95d8274700 1 -- 192.168.123.103:0/2463160653 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f95d0108390 con 0x7f95d01001a0 2026-03-09T00:13:01.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.915+0000 7f95c37fe700 1 --2- 192.168.123.103:0/2463160653 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f95bc077990 0x7f95bc079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:01.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.915+0000 7f95c37fe700 1 -- 192.168.123.103:0/2463160653 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f95c409bdb0 con 0x7f95d01001a0
2026-03-09T00:13:01.915 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.915+0000 7f95c37fe700 1 -- 192.168.123.103:0/2463160653 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f95c400fde0 con 0x7f95d01001a0
2026-03-09T00:13:01.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.915+0000 7f95d580f700 1 --2- 192.168.123.103:0/2463160653 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f95bc077990 0x7f95bc079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:01.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:01.916+0000 7f95d580f700 1 --2- 192.168.123.103:0/2463160653 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f95bc077990 0x7f95bc079e50 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f95d019a480 tx=0x7f95cc009500 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:13:02.061 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:01 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/1894885750' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-09T00:13:02.061 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.060+0000 7f95d8274700 1 -- 192.168.123.103:0/2463160653 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f95d004ea90 con 0x7f95d01001a0
2026-03-09T00:13:02.063 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.062+0000 7f95c37fe700 1 -- 192.168.123.103:0/2463160653 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 33 v33) v1 ==== 76+0+1974 (secure 0 0 0) 0x7f95c4064560 con 0x7f95d01001a0
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:e33
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:btime 2026-03-09T00:11:41:190503+0000
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:legacy client fscid: 1
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:Filesystem 'cephfs' (1)
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:fs_name cephfs
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:epoch 33
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:flags 12 joinable allow_snaps allow_multimds_snaps
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-09T00:01:42.952984+0000
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:modified 2026-03-09T00:11:40.266801+0000
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:tableserver 0
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:root 0
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:session_timeout 60
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:session_autoclose 300
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:max_file_size 1099511627776
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:max_xattr_size 65536
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:required_client_features {}
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:last_failure 0
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:last_failure_osd_epoch 109
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:max_mds 1
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:in 0
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:up {0=34404}
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:failed
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:damaged
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:stopped
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:data_pools [3]
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:metadata_pool 2
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:inline_data enabled
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:balancer
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:bal_rank_mask -1
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:standby_count_wanted 1
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:qdb_cluster leader: 34404 members: 34404
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.sejksk{0:34404} state up:active seq 12 join_fscid=1 addr [v2:192.168.123.103:6826/784666836,v1:192.168.123.103:6827/784666836] compat {c=[1],r=[1],i=[1fff]}]
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:Standby daemons:
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm03.ralade{-1:34412} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.103:6828/1027317762,v1:192.168.123.103:6829/1027317762] compat {c=[1],r=[1],i=[1fff]}]
2026-03-09T00:13:02.064 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.vlrwtl{-1:44297} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6824/1987865018,v1:192.168.123.106:6825/1987865018] compat {c=[1],r=[1],i=[1fff]}]
2026-03-09T00:13:02.065 INFO:teuthology.orchestra.run.vm03.stdout:[mds.cephfs.vm06.ixduim{-1:44325} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.106:6826/839008662,v1:192.168.123.106:6827/839008662] compat {c=[1],r=[1],i=[1fff]}]
2026-03-09T00:13:02.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.066+0000 7f95d8274700 1 -- 192.168.123.103:0/2463160653 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f95bc077990 msgr2=0x7f95bc079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:02.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.066+0000 7f95d8274700 1 --2- 192.168.123.103:0/2463160653 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f95bc077990 0x7f95bc079e50 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f95d019a480 tx=0x7f95cc009500 comp rx=0 tx=0).stop
2026-03-09T00:13:02.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.066+0000 7f95d8274700 1 -- 192.168.123.103:0/2463160653 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f95d01001a0 msgr2=0x7f95d0198e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:02.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.066+0000 7f95d8274700 1 --2- 192.168.123.103:0/2463160653 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f95d01001a0 0x7f95d0198e60 secure :-1 s=READY pgs=170 cs=0 l=1 rev1=1 crypto rx=0x7f95c400b5c0 tx=0x7f95c400f690 comp rx=0 tx=0).stop
2026-03-09T00:13:02.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.067+0000 7f95d8274700 1 -- 192.168.123.103:0/2463160653 shutdown_connections
2026-03-09T00:13:02.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.067+0000 7f95d8274700 1 --2- 192.168.123.103:0/2463160653 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f95bc077990 0x7f95bc079e50 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:02.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.067+0000 7f95d8274700 1 --2- 192.168.123.103:0/2463160653 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f95d01001a0 0x7f95d0198e60 unknown :-1 s=CLOSED pgs=170 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:02.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.067+0000 7f95d8274700 1 --2- 192.168.123.103:0/2463160653 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f95d0100b50 0x7f95d01993a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:02.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.067+0000 7f95d8274700 1 -- 192.168.123.103:0/2463160653 >> 192.168.123.103:0/2463160653 conn(0x7f95d0075960 msgr2=0x7f95d00ff830 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:13:02.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.067+0000 7f95d8274700 1 -- 192.168.123.103:0/2463160653 shutdown_connections
2026-03-09T00:13:02.067 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.067+0000 7f95d8274700 1 -- 192.168.123.103:0/2463160653 wait complete.
2026-03-09T00:13:02.068 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 33
2026-03-09T00:13:02.153 INFO:teuthology.run_tasks:Running task fs.post_upgrade_checks...
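fs.post_upgrade_checks re-dumps the fsmap as JSON and compares it against the state recorded before the upgrade (the `DEBUG:tasks.fs:checking fs fscid=1,name=cephfs state = {'epoch': 9, 'max_mds': 1, 'flags': 18}` entry further down shows that baseline). A minimal sketch of that kind of verification, assuming the JSON shape printed below; it is not the actual task implementation:

    import json
    import subprocess

    def check_fs_after_upgrade(expected):
        dump = json.loads(subprocess.check_output(
            ["ceph", "fs", "dump", "--format=json"]))
        for fs in dump["filesystems"]:
            mdsmap = fs["mdsmap"]
            # settings must survive the upgrade ...
            assert mdsmap["max_mds"] == expected["max_mds"]
            assert mdsmap["flags"] == expected["flags"]
            # ... every "in" rank must be back to up:active ...
            active = [i for i in mdsmap["info"].values()
                      if i["state"] == "up:active"]
            assert len(active) == len(mdsmap["in"])
            # ... and the map epoch only moves forward
            assert mdsmap["epoch"] >= expected["epoch"]
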
2026-03-09T00:13:02.156 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 2026-03-09T00:13:02.297 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:02.420 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:01 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/1894885750' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-09T00:13:02.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.549+0000 7f4ded3a9700 1 -- 192.168.123.103:0/1660705909 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4de80684d0 msgr2=0x7f4de8068950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:02.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.549+0000 7f4ded3a9700 1 --2- 192.168.123.103:0/1660705909 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4de80684d0 0x7f4de8068950 secure :-1 s=READY pgs=171 cs=0 l=1 rev1=1 crypto rx=0x7f4dd8009b00 tx=0x7f4dd8009e10 comp rx=0 tx=0).stop 2026-03-09T00:13:02.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.550+0000 7f4ded3a9700 1 -- 192.168.123.103:0/1660705909 shutdown_connections 2026-03-09T00:13:02.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.550+0000 7f4ded3a9700 1 --2- 192.168.123.103:0/1660705909 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4de80684d0 0x7f4de8068950 unknown :-1 s=CLOSED pgs=171 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:02.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.550+0000 7f4ded3a9700 1 --2- 192.168.123.103:0/1660705909 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4de8105be0 0x7f4de8105fc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:02.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.550+0000 7f4ded3a9700 1 -- 192.168.123.103:0/1660705909 >> 192.168.123.103:0/1660705909 conn(0x7f4de80756b0 msgr2=0x7f4de8075ac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:02.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.550+0000 7f4ded3a9700 1 -- 192.168.123.103:0/1660705909 shutdown_connections 2026-03-09T00:13:02.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.551+0000 7f4ded3a9700 1 -- 192.168.123.103:0/1660705909 wait complete. 
2026-03-09T00:13:02.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.551+0000 7f4ded3a9700 1 Processor -- start 2026-03-09T00:13:02.551 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.551+0000 7f4ded3a9700 1 -- start start 2026-03-09T00:13:02.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.551+0000 7f4ded3a9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4de80684d0 0x7f4de8193730 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:02.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.551+0000 7f4ded3a9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4de8105be0 0x7f4de8193c70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:02.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.551+0000 7f4ded3a9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4de81978c0 con 0x7f4de8105be0 2026-03-09T00:13:02.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.552+0000 7f4de77fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4de8105be0 0x7f4de8193c70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:02.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.552+0000 7f4de77fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4de8105be0 0x7f4de8193c70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59566/0 (socket says 192.168.123.103:59566) 2026-03-09T00:13:02.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.552+0000 7f4de77fe700 1 -- 192.168.123.103:0/1997009235 learned_addr learned my addr 192.168.123.103:0/1997009235 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:02.552 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.552+0000 7f4ded3a9700 1 -- 192.168.123.103:0/1997009235 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4de81941b0 con 0x7f4de80684d0 2026-03-09T00:13:02.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.552+0000 7f4de77fe700 1 -- 192.168.123.103:0/1997009235 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4de80684d0 msgr2=0x7f4de8193730 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:13:02.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.552+0000 7f4de77fe700 1 --2- 192.168.123.103:0/1997009235 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4de80684d0 0x7f4de8193730 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:02.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.552+0000 7f4de77fe700 1 -- 192.168.123.103:0/1997009235 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4dd80097e0 con 0x7f4de8105be0 2026-03-09T00:13:02.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.552+0000 7f4de77fe700 1 --2- 192.168.123.103:0/1997009235 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4de8105be0 0x7f4de8193c70 secure :-1 s=READY pgs=172 cs=0 l=1 rev1=1 crypto rx=0x7f4dd80094d0 tx=0x7f4dd80049e0 comp rx=0 
tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:02.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.552+0000 7f4de57fa700 1 -- 192.168.123.103:0/1997009235 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4dd801d070 con 0x7f4de8105be0 2026-03-09T00:13:02.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.552+0000 7f4de57fa700 1 -- 192.168.123.103:0/1997009235 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f4dd800bc50 con 0x7f4de8105be0 2026-03-09T00:13:02.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.553+0000 7f4ded3a9700 1 -- 192.168.123.103:0/1997009235 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4de8194430 con 0x7f4de8105be0 2026-03-09T00:13:02.553 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.553+0000 7f4ded3a9700 1 -- 192.168.123.103:0/1997009235 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4de81a72d0 con 0x7f4de8105be0 2026-03-09T00:13:02.554 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.553+0000 7f4de57fa700 1 -- 192.168.123.103:0/1997009235 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4dd8022620 con 0x7f4de8105be0 2026-03-09T00:13:02.557 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.554+0000 7f4ded3a9700 1 -- 192.168.123.103:0/1997009235 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4de804f2e0 con 0x7f4de8105be0 2026-03-09T00:13:02.557 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.554+0000 7f4de57fa700 1 -- 192.168.123.103:0/1997009235 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4dd800f660 con 0x7f4de8105be0 2026-03-09T00:13:02.557 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.555+0000 7f4de57fa700 1 --2- 192.168.123.103:0/1997009235 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4dd40778c0 0x7f4dd4079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:02.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.555+0000 7f4de57fa700 1 -- 192.168.123.103:0/1997009235 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f4dd809b4e0 con 0x7f4de8105be0 2026-03-09T00:13:02.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.556+0000 7f4de7fff700 1 --2- 192.168.123.103:0/1997009235 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4dd40778c0 0x7f4dd4079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:02.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.556+0000 7f4de7fff700 1 --2- 192.168.123.103:0/1997009235 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4dd40778c0 0x7f4dd4079d80 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f4dd0009780 tx=0x7f4dd0006cb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:02.558 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.557+0000 7f4de57fa700 1 -- 
192.168.123.103:0/1997009235 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4dd8063b90 con 0x7f4de8105be0 2026-03-09T00:13:02.695 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.694+0000 7f4ded3a9700 1 -- 192.168.123.103:0/1997009235 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f4de804ea90 con 0x7f4de8105be0 2026-03-09T00:13:02.695 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.695+0000 7f4de57fa700 1 -- 192.168.123.103:0/1997009235 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 33 v33) v1 ==== 94+0+5260 (secure 0 0 0) 0x7f4dd80632e0 con 0x7f4de8105be0 2026-03-09T00:13:02.696 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:02.696 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":33,"btime":"2026-03-09T00:11:41:190503+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34412,"name":"cephfs.vm03.ralade","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1027317762","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1027317762},{"type":"v1","addr":"192.168.123.103:6829","nonce":1027317762}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":23},{"gid":44297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1987865018","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1987865018},{"type":"v1","addr":"192.168.123.106:6825","nonce":1987865018}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":25},{"gid":44325,"name":"cephfs.vm06.ixduim","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/839008662","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":839008662},{"type":"v1","addr":"192.168.123.106:6827","nonce":839008662}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":31}],"filesystems":[{"mdsmap":{"epoch":33,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:11:40.266801+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":109,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34404},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34404":{"gid":34404,"name":"cephfs.vm03.sejksk","rank":0,"incarnation":27,"state":"up:active","state_seq":12,"addr":"192.168.123.103:6827/784666836","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":784666836},{"type":"v1","addr":"192.168.123.103:6827","nonce":784666836}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34404,"qdb_cluster":[34404]},"id":1}]} 2026-03-09T00:13:02.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.697+0000 7f4ded3a9700 1 -- 192.168.123.103:0/1997009235 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4dd40778c0 msgr2=0x7f4dd4079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:02.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.697+0000 7f4ded3a9700 1 --2- 192.168.123.103:0/1997009235 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4dd40778c0 0x7f4dd4079d80 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f4dd0009780 tx=0x7f4dd0006cb0 comp rx=0 tx=0).stop 2026-03-09T00:13:02.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.697+0000 7f4ded3a9700 1 -- 192.168.123.103:0/1997009235 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4de8105be0 msgr2=0x7f4de8193c70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:02.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.698+0000 7f4ded3a9700 1 --2- 192.168.123.103:0/1997009235 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4de8105be0 0x7f4de8193c70 secure :-1 s=READY pgs=172 cs=0 l=1 rev1=1 crypto rx=0x7f4dd80094d0 tx=0x7f4dd80049e0 comp rx=0 tx=0).stop 2026-03-09T00:13:02.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.698+0000 7f4ded3a9700 1 -- 192.168.123.103:0/1997009235 shutdown_connections 2026-03-09T00:13:02.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.698+0000 7f4ded3a9700 1 --2- 192.168.123.103:0/1997009235 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f4dd40778c0 0x7f4dd4079d80 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:02.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.698+0000 7f4ded3a9700 1 --2- 192.168.123.103:0/1997009235 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4de80684d0 0x7f4de8193730 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:02.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.698+0000 7f4ded3a9700 1 --2- 192.168.123.103:0/1997009235 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4de8105be0 0x7f4de8193c70 unknown :-1 s=CLOSED pgs=172 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:02.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.698+0000 7f4ded3a9700 1 -- 192.168.123.103:0/1997009235 >> 192.168.123.103:0/1997009235 conn(0x7f4de80756b0 msgr2=0x7f4de80fdaf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:02.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.698+0000 7f4ded3a9700 1 -- 192.168.123.103:0/1997009235 shutdown_connections 2026-03-09T00:13:02.698 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:02.698+0000 7f4ded3a9700 1 -- 192.168.123.103:0/1997009235 wait complete. 2026-03-09T00:13:02.699 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 33 2026-03-09T00:13:02.744 DEBUG:tasks.fs:checking fs fscid=1,name=cephfs state = {'epoch': 9, 'max_mds': 1, 'flags': 18} 2026-03-09T00:13:02.744 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 10 2026-03-09T00:13:02.883 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:02.949 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:02 vm03.local ceph-mon[129670]: pgmap v287: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:13:02.949 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:02 vm03.local ceph-mon[129670]: from='client.? 
192.168.123.103:0/2463160653' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T00:13:02.949 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:02 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/1997009235' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T00:13:03.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.132+0000 7f2ee0ad7700 1 -- 192.168.123.103:0/3104598251 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2edc103340 msgr2=0x7f2edc103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:03.133 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.132+0000 7f2ee0ad7700 1 --2- 192.168.123.103:0/3104598251 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2edc103340 0x7f2edc103720 secure :-1 s=READY pgs=173 cs=0 l=1 rev1=1 crypto rx=0x7f2ecc009b00 tx=0x7f2ecc009e10 comp rx=0 tx=0).stop 2026-03-09T00:13:03.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.133+0000 7f2ee0ad7700 1 -- 192.168.123.103:0/3104598251 shutdown_connections 2026-03-09T00:13:03.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.133+0000 7f2ee0ad7700 1 --2- 192.168.123.103:0/3104598251 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2edc103cf0 0x7f2edc107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:03.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.133+0000 7f2ee0ad7700 1 --2- 192.168.123.103:0/3104598251 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2edc103340 0x7f2edc103720 unknown :-1 s=CLOSED pgs=173 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:03.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.133+0000 7f2ee0ad7700 1 -- 192.168.123.103:0/3104598251 >> 192.168.123.103:0/3104598251 conn(0x7f2edc0feb90 msgr2=0x7f2edc100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:03.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.133+0000 7f2ee0ad7700 1 -- 192.168.123.103:0/3104598251 shutdown_connections 2026-03-09T00:13:03.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.133+0000 7f2ee0ad7700 1 -- 192.168.123.103:0/3104598251 wait complete. 
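
The dump above is produced by re-invoking the bundled cephadm binary in shell mode against the cluster fsid, exactly as the DEBUG command line shows ("sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 10"). A minimal sketch of that wrapper, assuming only what the log shows (the cephadm path, image, and fsid are copied from this run; the fs_dump name and the subprocess plumbing are ours):

    import json
    import subprocess

    CEPHADM = "/home/ubuntu/cephtest/cephadm"
    IMAGE = "quay.io/ceph/ceph:v18.2.1"
    FSID = "ae8f0172-1b4a-11f1-916a-712b2ac006b7"

    def fs_dump(epoch=None):
        """Run `ceph fs dump --format=json [epoch]` inside a cephadm shell.

        The JSON map lands on stdout; the "dumped fsmap epoch N" banner and
        the messenger chatter seen above go to stderr, so check_output()
        captures only the map itself.
        """
        cmd = ["sudo", CEPHADM, "--image", IMAGE, "shell", "--fsid", FSID,
               "--", "ceph", "fs", "dump", "--format=json"]
        if epoch is not None:
            cmd.append(str(epoch))
        return json.loads(subprocess.check_output(cmd, text=True))

The "Inferring config /var/lib/ceph/ae8f0172-.../mon.vm03/config" stderr line is cephadm shell locating the local mon's config before running the CLI inside the container.
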
2026-03-09T00:13:03.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.134+0000 7f2ee0ad7700 1 Processor -- start 2026-03-09T00:13:03.134 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.134+0000 7f2ee0ad7700 1 -- start start 2026-03-09T00:13:03.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.134+0000 7f2ee0ad7700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2edc103340 0x7f2edc198ed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:03.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.134+0000 7f2ee0ad7700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2edc103cf0 0x7f2edc199410 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:03.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.134+0000 7f2ee0ad7700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2edc199af0 con 0x7f2edc103cf0 2026-03-09T00:13:03.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.134+0000 7f2ee0ad7700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2edc19d880 con 0x7f2edc103340 2026-03-09T00:13:03.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.135+0000 7f2ed3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2edc103cf0 0x7f2edc199410 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:03.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.135+0000 7f2ed3fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2edc103cf0 0x7f2edc199410 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59594/0 (socket says 192.168.123.103:59594) 2026-03-09T00:13:03.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.135+0000 7f2ed3fff700 1 -- 192.168.123.103:0/1429639505 learned_addr learned my addr 192.168.123.103:0/1429639505 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:03.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.135+0000 7f2ed3fff700 1 -- 192.168.123.103:0/1429639505 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2edc103340 msgr2=0x7f2edc198ed0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:13:03.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.135+0000 7f2eda59c700 1 --2- 192.168.123.103:0/1429639505 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2edc103340 0x7f2edc198ed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:03.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.135+0000 7f2ed3fff700 1 --2- 192.168.123.103:0/1429639505 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2edc103340 0x7f2edc198ed0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:03.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.135+0000 7f2ed3fff700 1 -- 192.168.123.103:0/1429639505 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2ecc0097e0 con 
0x7f2edc103cf0 2026-03-09T00:13:03.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.135+0000 7f2ed3fff700 1 --2- 192.168.123.103:0/1429639505 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2edc103cf0 0x7f2edc199410 secure :-1 s=READY pgs=174 cs=0 l=1 rev1=1 crypto rx=0x7f2ec400ba70 tx=0x7f2ec400be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:03.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.136+0000 7f2eda59c700 1 --2- 192.168.123.103:0/1429639505 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2edc103340 0x7f2edc198ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:13:03.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.136+0000 7f2ed37fe700 1 -- 192.168.123.103:0/1429639505 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2ec400c760 con 0x7f2edc103cf0 2026-03-09T00:13:03.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.136+0000 7f2ed37fe700 1 -- 192.168.123.103:0/1429639505 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2ec400cda0 con 0x7f2edc103cf0 2026-03-09T00:13:03.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.136+0000 7f2ee0ad7700 1 -- 192.168.123.103:0/1429639505 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2edc19db60 con 0x7f2edc103cf0 2026-03-09T00:13:03.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.136+0000 7f2ee0ad7700 1 -- 192.168.123.103:0/1429639505 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2edc19e0b0 con 0x7f2edc103cf0 2026-03-09T00:13:03.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.136+0000 7f2ed37fe700 1 -- 192.168.123.103:0/1429639505 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2ec4012550 con 0x7f2edc103cf0 2026-03-09T00:13:03.140 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.137+0000 7f2ee0ad7700 1 -- 192.168.123.103:0/1429639505 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2edc10b670 con 0x7f2edc103cf0 2026-03-09T00:13:03.141 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.140+0000 7f2ed37fe700 1 -- 192.168.123.103:0/1429639505 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2ec4014440 con 0x7f2edc103cf0 2026-03-09T00:13:03.141 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.141+0000 7f2ed37fe700 1 --2- 192.168.123.103:0/1429639505 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f2ebc077910 0x7f2ebc079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:03.141 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.141+0000 7f2ed37fe700 1 -- 192.168.123.103:0/1429639505 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f2ec4099590 con 0x7f2edc103cf0 2026-03-09T00:13:03.141 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.141+0000 7f2eda59c700 1 --2- 192.168.123.103:0/1429639505 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f2ebc077910 0x7f2ebc079dd0 unknown 
:-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:03.141 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.141+0000 7f2ed37fe700 1 -- 192.168.123.103:0/1429639505 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2ec40999a0 con 0x7f2edc103cf0 2026-03-09T00:13:03.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.141+0000 7f2eda59c700 1 --2- 192.168.123.103:0/1429639505 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f2ebc077910 0x7f2ebc079dd0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f2ecc000c00 tx=0x7f2ecc005c00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:03.212 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:02 vm06.local ceph-mon[106218]: pgmap v287: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:13:03.212 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:02 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/2463160653' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-09T00:13:03.212 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:02 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/1997009235' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-09T00:13:03.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.285+0000 7f2ee0ad7700 1 -- 192.168.123.103:0/1429639505 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 10, "format": "json"} v 0) v1 -- 0x7f2edc19a2d0 con 0x7f2edc103cf0 2026-03-09T00:13:03.286 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.286+0000 7f2ed37fe700 1 -- 192.168.123.103:0/1429639505 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 10, "format": "json"}]=0 dumped fsmap epoch 10 v33) v1 ==== 107+0+4913 (secure 0 0 0) 0x7f2ec4061e50 con 0x7f2edc103cf0 2026-03-09T00:13:03.288 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:03.288 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":10,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14492,"name":"cephfs.vm03.ralade","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.103:6829/3870847623","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3870847623},{"type":"v1","addr":"192.168.123.103:6829","nonce":3870847623}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor 
table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24287,"name":"cephfs.vm06.ixduim","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/1001012017","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":1001012017},{"type":"v1","addr":"192.168.123.106:6827","nonce":1001012017}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":7},{"gid":24297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1090058295","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1090058295},{"type":"v1","addr":"192.168.123.106:6825","nonce":1090058295}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":9}],"filesystems":[{"mdsmap":{"epoch":10,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:01:50.421315+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":39,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14480},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14480":{"gid":14480,"name":"cephfs.vm03.sejksk","rank":0,"incarnation":8,"state":"up:rejoin","state_seq":4,"addr":"192.168.123.103:6827/3708505754","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":3708505754},{"type":"v1","addr":"192.168.123.103:6827","nonce":3708505754}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T00:13:03.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.290+0000 7f2ee0ad7700 1 -- 192.168.123.103:0/1429639505 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f2ebc077910 msgr2=0x7f2ebc079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:03.290 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.290+0000 7f2ee0ad7700 1 --2- 192.168.123.103:0/1429639505 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f2ebc077910 0x7f2ebc079dd0 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f2ecc000c00 tx=0x7f2ecc005c00 comp rx=0 tx=0).stop 2026-03-09T00:13:03.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.290+0000 7f2ee0ad7700 1 -- 192.168.123.103:0/1429639505 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2edc103cf0 msgr2=0x7f2edc199410 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:03.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.290+0000 7f2ee0ad7700 1 --2- 192.168.123.103:0/1429639505 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2edc103cf0 0x7f2edc199410 secure :-1 s=READY pgs=174 cs=0 l=1 rev1=1 crypto rx=0x7f2ec400ba70 tx=0x7f2ec400be30 comp rx=0 tx=0).stop 2026-03-09T00:13:03.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.290+0000 7f2ee0ad7700 1 -- 192.168.123.103:0/1429639505 shutdown_connections 2026-03-09T00:13:03.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.290+0000 7f2ee0ad7700 1 --2- 192.168.123.103:0/1429639505 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f2ebc077910 0x7f2ebc079dd0 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:03.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.290+0000 7f2ee0ad7700 1 --2- 192.168.123.103:0/1429639505 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f2edc103340 0x7f2edc198ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:03.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.290+0000 7f2ee0ad7700 1 --2- 192.168.123.103:0/1429639505 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2edc103cf0 0x7f2edc199410 unknown :-1 s=CLOSED pgs=174 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:03.291 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.290+0000 7f2ee0ad7700 1 -- 192.168.123.103:0/1429639505 >> 192.168.123.103:0/1429639505 conn(0x7f2edc0feb90 msgr2=0x7f2edc100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:03.292 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.291+0000 7f2ee0ad7700 1 -- 192.168.123.103:0/1429639505 shutdown_connections 2026-03-09T00:13:03.292 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.291+0000 7f2ee0ad7700 1 -- 192.168.123.103:0/1429639505 wait complete. 
2026-03-09T00:13:03.292 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 10 2026-03-09T00:13:03.335 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 11 2026-03-09T00:13:03.476 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:03.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.714+0000 7fbd8fa4a700 1 -- 192.168.123.103:0/2686990272 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd88102960 msgr2=0x7fbd8810ae50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:03.715 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.714+0000 7fbd8fa4a700 1 --2- 192.168.123.103:0/2686990272 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd88102960 0x7fbd8810ae50 secure :-1 s=READY pgs=175 cs=0 l=1 rev1=1 crypto rx=0x7fbd84009b00 tx=0x7fbd84009e10 comp rx=0 tx=0).stop 2026-03-09T00:13:03.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.715+0000 7fbd8fa4a700 1 -- 192.168.123.103:0/2686990272 shutdown_connections 2026-03-09T00:13:03.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.715+0000 7fbd8fa4a700 1 --2- 192.168.123.103:0/2686990272 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd88102960 0x7fbd8810ae50 unknown :-1 s=CLOSED pgs=175 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:03.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.715+0000 7fbd8fa4a700 1 --2- 192.168.123.103:0/2686990272 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd88102040 0x7fbd88102420 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:03.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.715+0000 7fbd8fa4a700 1 -- 192.168.123.103:0/2686990272 >> 192.168.123.103:0/2686990272 conn(0x7fbd880fb830 msgr2=0x7fbd880fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:03.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.715+0000 7fbd8fa4a700 1 -- 192.168.123.103:0/2686990272 shutdown_connections 2026-03-09T00:13:03.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.716+0000 7fbd8fa4a700 1 -- 192.168.123.103:0/2686990272 wait complete. 
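
Note how the trailing integer on each shell invocation ("ceph fs dump --format=json 11") is forwarded as the "epoch" field of the mon_command ({"prefix": "fs dump", "epoch": 11, "format": "json"}), and the mon acks with "dumped fsmap epoch 11" even though the head map is already at v33. In other words, the task is replaying historical fsmap epochs one at a time to observe each transition, not re-reading the current map. A sketch of that replay, reusing the two hypothetical helpers above:

    def replay_epochs(first, last):
        """Yield (epoch, {name: state}) for each historical fsmap epoch."""
        for epoch in range(first, last + 1):
            fsmap = fs_dump(epoch)            # wrapper sketched earlier
            yield fsmap["epoch"], mds_states(fsmap)

For this run, replaying epochs 10 through 12 surfaces up:rejoin, then up:active, then (in the epoch-12 dump below) an empty "up" map with rank 0 listed in "failed".
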
2026-03-09T00:13:03.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.716+0000 7fbd8fa4a700 1 Processor -- start 2026-03-09T00:13:03.716 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.716+0000 7fbd8fa4a700 1 -- start start 2026-03-09T00:13:03.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.716+0000 7fbd8fa4a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd88102040 0x7fbd8819c990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:03.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.717+0000 7fbd8fa4a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd88102960 0x7fbd8819ced0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:03.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.717+0000 7fbd8fa4a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbd8819d4d0 con 0x7fbd88102040 2026-03-09T00:13:03.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.717+0000 7fbd8d7e6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd88102040 0x7fbd8819c990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:03.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.717+0000 7fbd8d7e6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd88102040 0x7fbd8819c990 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:59608/0 (socket says 192.168.123.103:59608) 2026-03-09T00:13:03.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.717+0000 7fbd8d7e6700 1 -- 192.168.123.103:0/4123689827 learned_addr learned my addr 192.168.123.103:0/4123689827 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:03.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.717+0000 7fbd8cfe5700 1 --2- 192.168.123.103:0/4123689827 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd88102960 0x7fbd8819ced0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:03.717 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.717+0000 7fbd8fa4a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbd88196a10 con 0x7fbd88102960 2026-03-09T00:13:03.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.717+0000 7fbd8d7e6700 1 -- 192.168.123.103:0/4123689827 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd88102960 msgr2=0x7fbd8819ced0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:03.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.717+0000 7fbd8d7e6700 1 --2- 192.168.123.103:0/4123689827 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd88102960 0x7fbd8819ced0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:03.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.717+0000 7fbd8d7e6700 1 -- 192.168.123.103:0/4123689827 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fbd840097e0 con 0x7fbd88102040 2026-03-09T00:13:03.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.717+0000 7fbd8cfe5700 1 --2- 192.168.123.103:0/4123689827 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd88102960 0x7fbd8819ced0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-09T00:13:03.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.718+0000 7fbd8d7e6700 1 --2- 192.168.123.103:0/4123689827 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd88102040 0x7fbd8819c990 secure :-1 s=READY pgs=176 cs=0 l=1 rev1=1 crypto rx=0x7fbd7c00c930 tx=0x7fbd7c00cc40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:03.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.718+0000 7fbd7a7fc700 1 -- 192.168.123.103:0/4123689827 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbd7c007ab0 con 0x7fbd88102040 2026-03-09T00:13:03.718 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.718+0000 7fbd7a7fc700 1 -- 192.168.123.103:0/4123689827 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fbd7c00ce80 con 0x7fbd88102040 2026-03-09T00:13:03.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.718+0000 7fbd8fa4a700 1 -- 192.168.123.103:0/4123689827 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbd88196cf0 con 0x7fbd88102040 2026-03-09T00:13:03.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.718+0000 7fbd8fa4a700 1 -- 192.168.123.103:0/4123689827 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbd88197240 con 0x7fbd88102040 2026-03-09T00:13:03.719 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.719+0000 7fbd7a7fc700 1 -- 192.168.123.103:0/4123689827 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbd7c007ab0 con 0x7fbd88102040 2026-03-09T00:13:03.720 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.720+0000 7fbd8fa4a700 1 -- 192.168.123.103:0/4123689827 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbd88108550 con 0x7fbd88102040 2026-03-09T00:13:03.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.720+0000 7fbd7a7fc700 1 -- 192.168.123.103:0/4123689827 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fbd7c00f450 con 0x7fbd88102040 2026-03-09T00:13:03.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.720+0000 7fbd7a7fc700 1 --2- 192.168.123.103:0/4123689827 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fbd740778d0 0x7fbd74079d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:03.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.720+0000 7fbd7a7fc700 1 -- 192.168.123.103:0/4123689827 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fbd7c0990e0 con 0x7fbd88102040 2026-03-09T00:13:03.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.721+0000 7fbd8cfe5700 1 --2- 192.168.123.103:0/4123689827 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fbd740778d0 
0x7fbd74079d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:03.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.721+0000 7fbd8cfe5700 1 --2- 192.168.123.103:0/4123689827 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fbd740778d0 0x7fbd74079d90 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7fbd880fcfc0 tx=0x7fbd8400b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:03.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.723+0000 7fbd7a7fc700 1 -- 192.168.123.103:0/4123689827 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fbd7c061810 con 0x7fbd88102040 2026-03-09T00:13:03.863 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.862+0000 7fbd8fa4a700 1 -- 192.168.123.103:0/4123689827 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 11, "format": "json"} v 0) v1 -- 0x7fbd8804ea90 con 0x7fbd88102040 2026-03-09T00:13:03.864 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.864+0000 7fbd7a7fc700 1 -- 192.168.123.103:0/4123689827 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 11, "format": "json"}]=0 dumped fsmap epoch 11 v33) v1 ==== 107+0+4913 (secure 0 0 0) 0x7fbd7c061810 con 0x7fbd88102040 2026-03-09T00:13:03.864 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:03.865 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":11,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14492,"name":"cephfs.vm03.ralade","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.103:6829/3870847623","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3870847623},{"type":"v1","addr":"192.168.123.103:6829","nonce":3870847623}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24287,"name":"cephfs.vm06.ixduim","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.106:6827/1001012017","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":1001012017},{"type":"v1","addr":"192.168.123.106:6827","nonce":1001012017}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in 
separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1090058295","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1090058295},{"type":"v1","addr":"192.168.123.106:6825","nonce":1090058295}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":9}],"filesystems":[{"mdsmap":{"epoch":11,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:01:51.424075+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":39,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14480},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14480":{"gid":14480,"name":"cephfs.vm03.sejksk","rank":0,"incarnation":8,"state":"up:active","state_seq":5,"addr":"192.168.123.103:6827/3708505754","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":3708505754},{"type":"v1","addr":"192.168.123.103:6827","nonce":3708505754}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T00:13:03.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.867+0000 7fbd8fa4a700 1 -- 192.168.123.103:0/4123689827 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fbd740778d0 msgr2=0x7fbd74079d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:03.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.867+0000 7fbd8fa4a700 1 --2- 192.168.123.103:0/4123689827 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fbd740778d0 0x7fbd74079d90 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7fbd880fcfc0 tx=0x7fbd8400b540 comp rx=0 tx=0).stop 2026-03-09T00:13:03.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.867+0000 7fbd8fa4a700 1 -- 192.168.123.103:0/4123689827 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd88102040 msgr2=0x7fbd8819c990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:03.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.867+0000 7fbd8fa4a700 1 --2- 192.168.123.103:0/4123689827 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd88102040 0x7fbd8819c990 secure :-1 s=READY pgs=176 cs=0 l=1 rev1=1 crypto rx=0x7fbd7c00c930 tx=0x7fbd7c00cc40 comp rx=0 tx=0).stop 2026-03-09T00:13:03.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.867+0000 7fbd8fa4a700 1 -- 192.168.123.103:0/4123689827 shutdown_connections 2026-03-09T00:13:03.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.867+0000 7fbd8fa4a700 1 --2- 192.168.123.103:0/4123689827 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fbd740778d0 0x7fbd74079d90 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:03.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.867+0000 7fbd8fa4a700 1 --2- 192.168.123.103:0/4123689827 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fbd88102040 0x7fbd8819c990 unknown :-1 s=CLOSED pgs=176 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:03.867 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.867+0000 7fbd8fa4a700 1 --2- 192.168.123.103:0/4123689827 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fbd88102960 0x7fbd8819ced0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:03.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.867+0000 7fbd8fa4a700 1 -- 192.168.123.103:0/4123689827 >> 192.168.123.103:0/4123689827 conn(0x7fbd880fb830 msgr2=0x7fbd88105690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:03.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.868+0000 7fbd8fa4a700 1 -- 192.168.123.103:0/4123689827 shutdown_connections 2026-03-09T00:13:03.868 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:03.868+0000 7fbd8fa4a700 1 -- 192.168.123.103:0/4123689827 wait complete. 2026-03-09T00:13:03.869 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 11 2026-03-09T00:13:04.375 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 12 2026-03-09T00:13:04.531 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:04.601 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:04 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/1429639505' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 10, "format": "json"}]: dispatch 2026-03-09T00:13:04.601 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:04 vm03.local ceph-mon[129670]: from='client.? 
192.168.123.103:0/4123689827' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 11, "format": "json"}]: dispatch 2026-03-09T00:13:04.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:04 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/1429639505' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 10, "format": "json"}]: dispatch 2026-03-09T00:13:04.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:04 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/4123689827' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 11, "format": "json"}]: dispatch 2026-03-09T00:13:04.805 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.804+0000 7fe77f744700 1 -- 192.168.123.103:0/1992004175 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe778103cf0 msgr2=0x7fe778107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:04.806 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.804+0000 7fe77f744700 1 --2- 192.168.123.103:0/1992004175 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe778103cf0 0x7fe778107d40 secure :-1 s=READY pgs=177 cs=0 l=1 rev1=1 crypto rx=0x7fe774009b00 tx=0x7fe774009e10 comp rx=0 tx=0).stop 2026-03-09T00:13:04.806 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.805+0000 7fe77f744700 1 -- 192.168.123.103:0/1992004175 shutdown_connections 2026-03-09T00:13:04.806 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.805+0000 7fe77f744700 1 --2- 192.168.123.103:0/1992004175 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe778103cf0 0x7fe778107d40 unknown :-1 s=CLOSED pgs=177 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:04.806 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.805+0000 7fe77f744700 1 --2- 192.168.123.103:0/1992004175 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe778103340 0x7fe778103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:04.806 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.805+0000 7fe77f744700 1 -- 192.168.123.103:0/1992004175 >> 192.168.123.103:0/1992004175 conn(0x7fe7780feb90 msgr2=0x7fe778100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:04.806 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.806+0000 7fe77f744700 1 -- 192.168.123.103:0/1992004175 shutdown_connections 2026-03-09T00:13:04.806 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.806+0000 7fe77f744700 1 -- 192.168.123.103:0/1992004175 wait complete. 
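
Each of these one-shot dumps pays the full client bootstrap visible in the surrounding stderr: Processor -- start, the msgr2 banner/hello/auth handshake, mon_subscribe for config and monmap, then the mgrmap and osdmap pushes, a ~195 KB get_command_descriptions reply, and finally mark_down / shutdown_connections / "wait complete.". When polling like this from Python, a long-lived librados handle can issue the same mon_command without rebuilding a messenger per call. A sketch of that alternative, not taken from this run (the conffile path and client name are assumptions):

    import json
    import rados

    # Assumed paths: a reachable ceph.conf and matching admin keyring.
    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf", name="client.admin")
    cluster.connect()
    try:
        # The same command the CLI sends above, as one mon_command round trip.
        ret, outbuf, outs = cluster.mon_command(
            json.dumps({"prefix": "fs dump", "format": "json"}), b"")
        assert ret == 0, outs
        fsmap = json.loads(outbuf)
        print(fsmap["epoch"], [s["name"] for s in fsmap["standbys"]])
    finally:
        cluster.shutdown()
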
2026-03-09T00:13:04.806 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.806+0000 7fe77f744700 1 Processor -- start 2026-03-09T00:13:04.807 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.806+0000 7fe77f744700 1 -- start start 2026-03-09T00:13:04.807 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.807+0000 7fe77f744700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe778103340 0x7fe778198de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:04.807 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.807+0000 7fe77d4e0700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe778103340 0x7fe778198de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:04.807 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.807+0000 7fe77d4e0700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe778103340 0x7fe778198de0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55810/0 (socket says 192.168.123.103:55810) 2026-03-09T00:13:04.807 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.807+0000 7fe77d4e0700 1 -- 192.168.123.103:0/4098844339 learned_addr learned my addr 192.168.123.103:0/4098844339 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:04.807 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.807+0000 7fe77f744700 1 --2- 192.168.123.103:0/4098844339 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe778103cf0 0x7fe778199320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:04.807 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.807+0000 7fe77f744700 1 -- 192.168.123.103:0/4098844339 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe778199a00 con 0x7fe778103340 2026-03-09T00:13:04.808 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.807+0000 7fe77f744700 1 -- 192.168.123.103:0/4098844339 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe77819d790 con 0x7fe778103cf0 2026-03-09T00:13:04.808 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.807+0000 7fe77ccdf700 1 --2- 192.168.123.103:0/4098844339 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe778103cf0 0x7fe778199320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:04.808 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.807+0000 7fe77d4e0700 1 -- 192.168.123.103:0/4098844339 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe778103cf0 msgr2=0x7fe778199320 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:04.808 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.807+0000 7fe77d4e0700 1 --2- 192.168.123.103:0/4098844339 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe778103cf0 0x7fe778199320 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:04.808 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.807+0000 7fe77d4e0700 1 -- 192.168.123.103:0/4098844339 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe7740097e0 con 0x7fe778103340 2026-03-09T00:13:04.808 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.808+0000 7fe77d4e0700 1 --2- 192.168.123.103:0/4098844339 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe778103340 0x7fe778198de0 secure :-1 s=READY pgs=178 cs=0 l=1 rev1=1 crypto rx=0x7fe76800d8d0 tx=0x7fe76800dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:04.808 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.808+0000 7fe76e7fc700 1 -- 192.168.123.103:0/4098844339 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe768009880 con 0x7fe778103340 2026-03-09T00:13:04.808 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.808+0000 7fe76e7fc700 1 -- 192.168.123.103:0/4098844339 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe768010460 con 0x7fe778103340 2026-03-09T00:13:04.808 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.808+0000 7fe77f744700 1 -- 192.168.123.103:0/4098844339 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe77819da70 con 0x7fe778103340 2026-03-09T00:13:04.809 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.808+0000 7fe76e7fc700 1 -- 192.168.123.103:0/4098844339 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe76800f5d0 con 0x7fe778103340 2026-03-09T00:13:04.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.809+0000 7fe77f744700 1 -- 192.168.123.103:0/4098844339 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe77819df90 con 0x7fe778103340 2026-03-09T00:13:04.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.810+0000 7fe76e7fc700 1 -- 192.168.123.103:0/4098844339 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe76800f800 con 0x7fe778103340 2026-03-09T00:13:04.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.810+0000 7fe77f744700 1 -- 192.168.123.103:0/4098844339 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe77810b740 con 0x7fe778103340 2026-03-09T00:13:04.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.810+0000 7fe76e7fc700 1 --2- 192.168.123.103:0/4098844339 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe7640778c0 0x7fe764079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:04.811 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.811+0000 7fe77ccdf700 1 --2- 192.168.123.103:0/4098844339 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe7640778c0 0x7fe764079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:04.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.811+0000 7fe77ccdf700 1 --2- 192.168.123.103:0/4098844339 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe7640778c0 0x7fe764079d80 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7fe77400b5c0 tx=0x7fe774005fb0 comp rx=0 tx=0).ready entity=mgr.34104 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:04.812 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.811+0000 7fe76e7fc700 1 -- 192.168.123.103:0/4098844339 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fe768066be0 con 0x7fe778103340 2026-03-09T00:13:04.814 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.814+0000 7fe76e7fc700 1 -- 192.168.123.103:0/4098844339 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe76809e050 con 0x7fe778103340 2026-03-09T00:13:04.955 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.955+0000 7fe77f744700 1 -- 192.168.123.103:0/4098844339 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 12, "format": "json"} v 0) v1 -- 0x7fe77819a210 con 0x7fe778103340 2026-03-09T00:13:04.956 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.956+0000 7fe76e7fc700 1 -- 192.168.123.103:0/4098844339 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 12, "format": "json"}]=0 dumped fsmap epoch 12 v33) v1 ==== 107+0+4121 (secure 0 0 0) 0x7fe768016070 con 0x7fe778103340 2026-03-09T00:13:04.957 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:04.957 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":12,"btime":"2026-03-09T00:10:44:533512+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14492,"name":"cephfs.vm03.ralade","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.103:6829/3870847623","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3870847623},{"type":"v1","addr":"192.168.123.103:6829","nonce":3870847623}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24287,"name":"cephfs.vm06.ixduim","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.106:6827/1001012017","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":1001012017},{"type":"v1","addr":"192.168.123.106:6827","nonce":1001012017}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11},{"gid":24297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1090058295","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1090058295},{"type":"v1","addr":"192.168.123.106:6825","nonce":1090058295}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":9}],"filesystems":[{"mdsmap":{"epoch":12,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:10:44.533289+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":103,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T00:13:04.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.959+0000 7fe77f744700 1 -- 192.168.123.103:0/4098844339 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe7640778c0 msgr2=0x7fe764079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:04.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.959+0000 7fe77f744700 1 --2- 192.168.123.103:0/4098844339 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe7640778c0 0x7fe764079d80 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7fe77400b5c0 tx=0x7fe774005fb0 comp rx=0 tx=0).stop 2026-03-09T00:13:04.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.960+0000 7fe77f744700 1 -- 192.168.123.103:0/4098844339 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe778103340 msgr2=0x7fe778198de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:04.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.960+0000 7fe77f744700 1 --2- 192.168.123.103:0/4098844339 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe778103340 0x7fe778198de0 secure :-1 s=READY pgs=178 cs=0 l=1 rev1=1 crypto rx=0x7fe76800d8d0 tx=0x7fe76800dbe0 comp rx=0 tx=0).stop 2026-03-09T00:13:04.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.960+0000 7fe77f744700 1 -- 192.168.123.103:0/4098844339 shutdown_connections 2026-03-09T00:13:04.960 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.960+0000 7fe77f744700 1 --2- 192.168.123.103:0/4098844339 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe7640778c0 0x7fe764079d80 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:04.960 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.960+0000 7fe77f744700 1 --2- 192.168.123.103:0/4098844339 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe778103340 0x7fe778198de0 secure :-1 s=CLOSED pgs=178 cs=0 l=1 rev1=1 crypto rx=0x7fe76800d8d0 tx=0x7fe76800dbe0 comp rx=0 tx=0).stop 2026-03-09T00:13:04.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.960+0000 7fe77f744700 1 --2- 192.168.123.103:0/4098844339 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe778103cf0 0x7fe778199320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:04.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.960+0000 7fe77f744700 1 -- 192.168.123.103:0/4098844339 >> 192.168.123.103:0/4098844339 conn(0x7fe7780feb90 msgr2=0x7fe778100f40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:04.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.960+0000 7fe77f744700 1 -- 192.168.123.103:0/4098844339 shutdown_connections 2026-03-09T00:13:04.961 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:04.960+0000 7fe77f744700 1 -- 192.168.123.103:0/4098844339 wait complete. 2026-03-09T00:13:04.961 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 12 2026-03-09T00:13:05.007 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 13 2026-03-09T00:13:05.158 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:05.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.405+0000 7f257841e700 1 -- 192.168.123.103:0/3047251772 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2570100600 msgr2=0x7f257010d2c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:05.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.405+0000 7f257841e700 1 --2- 192.168.123.103:0/3047251772 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2570100600 0x7f257010d2c0 secure :-1 s=READY pgs=179 cs=0 l=1 rev1=1 crypto rx=0x7f256c009b00 tx=0x7f256c009e10 comp rx=0 tx=0).stop 2026-03-09T00:13:05.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.406+0000 7f257841e700 1 -- 192.168.123.103:0/3047251772 shutdown_connections 2026-03-09T00:13:05.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.406+0000 7f257841e700 1 --2- 192.168.123.103:0/3047251772 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2570100600 0x7f257010d2c0 unknown :-1 s=CLOSED pgs=179 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:05.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.406+0000 7f257841e700 1 --2- 192.168.123.103:0/3047251772 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25700ffc50 0x7f2570100030 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:05.406 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.406+0000 7f257841e700 1 -- 
192.168.123.103:0/3047251772 >> 192.168.123.103:0/3047251772 conn(0x7f25700fb830 msgr2=0x7f25700fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:05.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.406+0000 7f257841e700 1 -- 192.168.123.103:0/3047251772 shutdown_connections 2026-03-09T00:13:05.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.406+0000 7f257841e700 1 -- 192.168.123.103:0/3047251772 wait complete. 2026-03-09T00:13:05.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.407+0000 7f257841e700 1 Processor -- start 2026-03-09T00:13:05.407 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.407+0000 7f257841e700 1 -- start start 2026-03-09T00:13:05.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.407+0000 7f257841e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25700ffc50 0x7f2570198dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:05.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.407+0000 7f257841e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2570100600 0x7f2570199300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:05.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.407+0000 7f257841e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f25701999e0 con 0x7f2570100600 2026-03-09T00:13:05.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.407+0000 7f257841e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f257019d770 con 0x7f25700ffc50 2026-03-09T00:13:05.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.408+0000 7f25759b9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2570100600 0x7f2570199300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:05.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.408+0000 7f25759b9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2570100600 0x7f2570199300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55834/0 (socket says 192.168.123.103:55834) 2026-03-09T00:13:05.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.408+0000 7f25759b9700 1 -- 192.168.123.103:0/1556131014 learned_addr learned my addr 192.168.123.103:0/1556131014 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:05.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.408+0000 7f25759b9700 1 -- 192.168.123.103:0/1556131014 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25700ffc50 msgr2=0x7f2570198dc0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:05.408 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.408+0000 7f25761ba700 1 --2- 192.168.123.103:0/1556131014 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25700ffc50 0x7f2570198dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:05.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.408+0000 7f25759b9700 1 --2- 
192.168.123.103:0/1556131014 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25700ffc50 0x7f2570198dc0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:05.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.408+0000 7f25759b9700 1 -- 192.168.123.103:0/1556131014 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f256c0097e0 con 0x7f2570100600 2026-03-09T00:13:05.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.408+0000 7f25759b9700 1 --2- 192.168.123.103:0/1556131014 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2570100600 0x7f2570199300 secure :-1 s=READY pgs=180 cs=0 l=1 rev1=1 crypto rx=0x7f256c006010 tx=0x7f256c00bb10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:05.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.409+0000 7f25761ba700 1 --2- 192.168.123.103:0/1556131014 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25700ffc50 0x7f2570198dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:13:05.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.409+0000 7f25637fe700 1 -- 192.168.123.103:0/1556131014 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f256c01d070 con 0x7f2570100600 2026-03-09T00:13:05.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.409+0000 7f257841e700 1 -- 192.168.123.103:0/1556131014 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f257019d9f0 con 0x7f2570100600 2026-03-09T00:13:05.409 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.409+0000 7f257841e700 1 -- 192.168.123.103:0/1556131014 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f257019df40 con 0x7f2570100600 2026-03-09T00:13:05.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.409+0000 7f25637fe700 1 -- 192.168.123.103:0/1556131014 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f256c00f460 con 0x7f2570100600 2026-03-09T00:13:05.411 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.409+0000 7f25637fe700 1 -- 192.168.123.103:0/1556131014 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f256c021620 con 0x7f2570100600 2026-03-09T00:13:05.412 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.411+0000 7f257841e700 1 -- 192.168.123.103:0/1556131014 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2554005320 con 0x7f2570100600 2026-03-09T00:13:05.414 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.412+0000 7f25637fe700 1 -- 192.168.123.103:0/1556131014 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f256c003ac0 con 0x7f2570100600 2026-03-09T00:13:05.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.415+0000 7f25637fe700 1 --2- 192.168.123.103:0/1556131014 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f255c0778c0 0x7f255c079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:05.415 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.415+0000 7f25761ba700 1 --2- 192.168.123.103:0/1556131014 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f255c0778c0 0x7f255c079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:05.415 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.415+0000 7f25637fe700 1 -- 192.168.123.103:0/1556131014 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f256c09a770 con 0x7f2570100600 2026-03-09T00:13:05.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.415+0000 7f25637fe700 1 -- 192.168.123.103:0/1556131014 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f256c09cdd0 con 0x7f2570100600 2026-03-09T00:13:05.416 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.416+0000 7f25761ba700 1 --2- 192.168.123.103:0/1556131014 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f255c0778c0 0x7f255c079d80 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f2564005fd0 tx=0x7f2564005ee0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:05.565 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:05 vm03.local ceph-mon[129670]: pgmap v288: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:13:05.565 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:05 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/4098844339' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 12, "format": "json"}]: dispatch 2026-03-09T00:13:05.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.570+0000 7f257841e700 1 -- 192.168.123.103:0/1556131014 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 13, "format": "json"} v 0) v1 -- 0x7f2554005190 con 0x7f2570100600 2026-03-09T00:13:05.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.571+0000 7f25637fe700 1 -- 192.168.123.103:0/1556131014 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 13, "format": "json"}]=0 dumped fsmap epoch 13 v33) v1 ==== 107+0+4132 (secure 0 0 0) 0x7f256c062fe0 con 0x7f2570100600 2026-03-09T00:13:05.572 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:05.572 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":13,"btime":"2026-03-09T00:10:44:543101+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24287,"name":"cephfs.vm06.ixduim","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.106:6827/1001012017","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":1001012017},{"type":"v1","addr":"192.168.123.106:6827","nonce":1001012017}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1090058295","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1090058295},{"type":"v1","addr":"192.168.123.106:6825","nonce":1090058295}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":9}],"filesystems":[{"mdsmap":{"epoch":13,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:10:44.543097+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":103,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14492},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14492":{"gid":14492,"name":"cephfs.vm03.ralade","rank":0,"incarnation":13,"state":"up:replay","state_seq":2,"addr":"192.168.123.103:6829/3870847623","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3870847623},{"type":"v1","addr":"192.168.123.103:6829","nonce":3870847623}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T00:13:05.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.575+0000 7f257841e700 1 -- 192.168.123.103:0/1556131014 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f255c0778c0 msgr2=0x7f255c079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:05.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.575+0000 7f257841e700 1 --2- 192.168.123.103:0/1556131014 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f255c0778c0 0x7f255c079d80 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f2564005fd0 tx=0x7f2564005ee0 comp rx=0 tx=0).stop 2026-03-09T00:13:05.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.575+0000 7f257841e700 1 -- 192.168.123.103:0/1556131014 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2570100600 msgr2=0x7f2570199300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:05.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.575+0000 7f257841e700 1 --2- 192.168.123.103:0/1556131014 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2570100600 0x7f2570199300 secure :-1 s=READY pgs=180 cs=0 l=1 rev1=1 crypto rx=0x7f256c006010 tx=0x7f256c00bb10 comp rx=0 tx=0).stop 2026-03-09T00:13:05.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.575+0000 7f257841e700 1 -- 192.168.123.103:0/1556131014 shutdown_connections 2026-03-09T00:13:05.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.575+0000 7f257841e700 1 --2- 192.168.123.103:0/1556131014 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f255c0778c0 0x7f255c079d80 secure :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f2564005fd0 tx=0x7f2564005ee0 comp rx=0 tx=0).stop 2026-03-09T00:13:05.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.575+0000 7f257841e700 1 --2- 192.168.123.103:0/1556131014 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f25700ffc50 0x7f2570198dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:05.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.575+0000 7f257841e700 1 --2- 192.168.123.103:0/1556131014 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f2570100600 0x7f2570199300 unknown :-1 s=CLOSED pgs=180 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:05.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.575+0000 7f257841e700 1 -- 192.168.123.103:0/1556131014 >> 192.168.123.103:0/1556131014 conn(0x7f25700fb830 msgr2=0x7f25700fcdd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:05.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.575+0000 7f257841e700 1 -- 192.168.123.103:0/1556131014 shutdown_connections 2026-03-09T00:13:05.576 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:05.575+0000 7f257841e700 1 -- 192.168.123.103:0/1556131014 wait complete. 
2026-03-09T00:13:05.577 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 13 2026-03-09T00:13:05.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:05 vm06.local ceph-mon[106218]: pgmap v288: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:13:05.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:05 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/4098844339' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 12, "format": "json"}]: dispatch 2026-03-09T00:13:05.674 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 14 2026-03-09T00:13:05.821 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:06.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.069+0000 7f1a72f0b700 1 -- 192.168.123.103:0/4142445341 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1a6c103a50 msgr2=0x7f1a6c107aa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:06.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.069+0000 7f1a72f0b700 1 --2- 192.168.123.103:0/4142445341 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1a6c103a50 0x7f1a6c107aa0 secure :-1 s=READY pgs=181 cs=0 l=1 rev1=1 crypto rx=0x7f1a68009b00 tx=0x7f1a68009e10 comp rx=0 tx=0).stop 2026-03-09T00:13:06.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.069+0000 7f1a72f0b700 1 -- 192.168.123.103:0/4142445341 shutdown_connections 2026-03-09T00:13:06.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.069+0000 7f1a72f0b700 1 --2- 192.168.123.103:0/4142445341 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1a6c103a50 0x7f1a6c107aa0 secure :-1 s=CLOSED pgs=181 cs=0 l=1 rev1=1 crypto rx=0x7f1a68009b00 tx=0x7f1a68009e10 comp rx=0 tx=0).stop 2026-03-09T00:13:06.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.069+0000 7f1a72f0b700 1 --2- 192.168.123.103:0/4142445341 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1a6c1030a0 0x7f1a6c103480 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:06.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.069+0000 7f1a72f0b700 1 -- 192.168.123.103:0/4142445341 >> 192.168.123.103:0/4142445341 conn(0x7f1a6c0fe930 msgr2=0x7f1a6c100d50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:06.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.070+0000 7f1a72f0b700 1 -- 192.168.123.103:0/4142445341 shutdown_connections 2026-03-09T00:13:06.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.070+0000 7f1a72f0b700 1 -- 192.168.123.103:0/4142445341 wait complete. 
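
[Editor's note] What this stretch of the log is doing: the task repeatedly shells into the cluster (sudo .../cephadm --image quay.io/ceph/ceph:v18.2.1 shell ... -- ceph fs dump --format=json N) with an incrementing epoch argument, and each call is acknowledged with "dumped fsmap epoch N". Polling one epoch at a time lets the task observe every fsmap transition while the single MDS recovers. A minimal sketch of such a poll loop follows; it assumes a hypothetical shell(cmd) helper that runs a command via cephadm shell and returns stdout, and that "ceph fs dump <epoch>" answers with a map whose epoch is at least the one requested, as the mon_command_ack lines here suggest. This is an illustration, not the suite's actual code.

    import json

    def wait_for_mds_state(shell, fs_name, target_state, start_epoch):
        # Illustrative only: poll the fsmap one epoch at a time, mirroring
        # the "ceph fs dump --format=json N" calls in the log above.
        epoch = start_epoch
        while True:
            out = shell("ceph fs dump --format=json %d" % epoch)
            fsmap = json.loads(out)
            for fs in fsmap["filesystems"]:
                mdsmap = fs["mdsmap"]
                if mdsmap["fs_name"] != fs_name:
                    continue
                # The JSON shape matches the dumps printed in this log:
                # mdsmap["info"] maps "gid_<N>" -> per-daemon records.
                states = [d["state"] for d in mdsmap["info"].values()]
                if target_state in states:
                    return fsmap["epoch"]
            epoch = fsmap["epoch"] + 1  # request the next epoch on the next pass
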
2026-03-09T00:13:06.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.070+0000 7f1a72f0b700 1 Processor -- start 2026-03-09T00:13:06.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.071+0000 7f1a72f0b700 1 -- start start 2026-03-09T00:13:06.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.071+0000 7f1a72f0b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1a6c1030a0 0x7f1a6c198d50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:06.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.071+0000 7f1a72f0b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1a6c199290 0x7f1a6c19d700 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:06.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.071+0000 7f1a72f0b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1a6c199820 con 0x7f1a6c1030a0 2026-03-09T00:13:06.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.071+0000 7f1a72f0b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1a6c199990 con 0x7f1a6c199290 2026-03-09T00:13:06.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.071+0000 7f1a71708700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1a6c199290 0x7f1a6c19d700 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:06.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.071+0000 7f1a71708700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1a6c199290 0x7f1a6c19d700 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:37672/0 (socket says 192.168.123.103:37672) 2026-03-09T00:13:06.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.071+0000 7f1a71708700 1 -- 192.168.123.103:0/3257253188 learned_addr learned my addr 192.168.123.103:0/3257253188 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:06.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.071+0000 7f1a71f09700 1 --2- 192.168.123.103:0/3257253188 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1a6c1030a0 0x7f1a6c198d50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:06.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.071+0000 7f1a71708700 1 -- 192.168.123.103:0/3257253188 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1a6c1030a0 msgr2=0x7f1a6c198d50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:06.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.072+0000 7f1a71708700 1 --2- 192.168.123.103:0/3257253188 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1a6c1030a0 0x7f1a6c198d50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:06.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.072+0000 7f1a71708700 1 -- 192.168.123.103:0/3257253188 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f1a680097e0 con 0x7f1a6c199290 2026-03-09T00:13:06.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.072+0000 7f1a71f09700 1 --2- 192.168.123.103:0/3257253188 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1a6c1030a0 0x7f1a6c198d50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-09T00:13:06.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.072+0000 7f1a71708700 1 --2- 192.168.123.103:0/3257253188 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1a6c199290 0x7f1a6c19d700 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f1a68009b00 tx=0x7f1a68004900 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:06.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.072+0000 7f1a62ffd700 1 -- 192.168.123.103:0/3257253188 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1a6801d070 con 0x7f1a6c199290 2026-03-09T00:13:06.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.072+0000 7f1a72f0b700 1 -- 192.168.123.103:0/3257253188 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1a6c19dca0 con 0x7f1a6c199290 2026-03-09T00:13:06.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.072+0000 7f1a72f0b700 1 -- 192.168.123.103:0/3257253188 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1a6c19e1f0 con 0x7f1a6c199290 2026-03-09T00:13:06.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.073+0000 7f1a62ffd700 1 -- 192.168.123.103:0/3257253188 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f1a68022470 con 0x7f1a6c199290 2026-03-09T00:13:06.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.073+0000 7f1a62ffd700 1 -- 192.168.123.103:0/3257253188 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1a6800f650 con 0x7f1a6c199290 2026-03-09T00:13:06.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.073+0000 7f1a72f0b700 1 -- 192.168.123.103:0/3257253188 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1a6c10b440 con 0x7f1a6c199290 2026-03-09T00:13:06.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.074+0000 7f1a62ffd700 1 -- 192.168.123.103:0/3257253188 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1a68022aa0 con 0x7f1a6c199290 2026-03-09T00:13:06.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.074+0000 7f1a62ffd700 1 --2- 192.168.123.103:0/3257253188 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1a580778c0 0x7f1a58079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:06.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.074+0000 7f1a71f09700 1 --2- 192.168.123.103:0/3257253188 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1a580778c0 0x7f1a58079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:06.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.075+0000 7f1a62ffd700 1 -- 
192.168.123.103:0/3257253188 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f1a6809b050 con 0x7f1a6c199290 2026-03-09T00:13:06.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.075+0000 7f1a71f09700 1 --2- 192.168.123.103:0/3257253188 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1a580778c0 0x7f1a58079d80 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f1a5c005fd0 tx=0x7f1a5c005e30 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:06.077 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.077+0000 7f1a62ffd700 1 -- 192.168.123.103:0/3257253188 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1a68063830 con 0x7f1a6c199290 2026-03-09T00:13:06.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.213+0000 7f1a72f0b700 1 -- 192.168.123.103:0/3257253188 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 14, "format": "json"} v 0) v1 -- 0x7f1a6c19e520 con 0x7f1a6c199290 2026-03-09T00:13:06.214 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.214+0000 7f1a62ffd700 1 -- 192.168.123.103:0/3257253188 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 14, "format": "json"}]=0 dumped fsmap epoch 14 v33) v1 ==== 107+0+4137 (secure 0 0 0) 0x7f1a68062f80 con 0x7f1a6c199290 2026-03-09T00:13:06.216 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:06.216 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":14,"btime":"2026-03-09T00:10:49:988535+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24287,"name":"cephfs.vm06.ixduim","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.106:6827/1001012017","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":1001012017},{"type":"v1","addr":"192.168.123.106:6827","nonce":1001012017}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1090058295","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1090058295},{"type":"v1","addr":"192.168.123.106:6825","nonce":1090058295}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on 
dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":9}],"filesystems":[{"mdsmap":{"epoch":14,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:10:49.253053+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":103,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14492},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14492":{"gid":14492,"name":"cephfs.vm03.ralade","rank":0,"incarnation":13,"state":"up:reconnect","state_seq":137,"addr":"192.168.123.103:6829/3870847623","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3870847623},{"type":"v1","addr":"192.168.123.103:6829","nonce":3870847623}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T00:13:06.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.218+0000 7f1a72f0b700 1 -- 192.168.123.103:0/3257253188 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1a580778c0 msgr2=0x7f1a58079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:06.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.218+0000 7f1a72f0b700 1 --2- 192.168.123.103:0/3257253188 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1a580778c0 0x7f1a58079d80 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f1a5c005fd0 tx=0x7f1a5c005e30 comp rx=0 tx=0).stop 2026-03-09T00:13:06.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.218+0000 7f1a72f0b700 1 -- 192.168.123.103:0/3257253188 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1a6c199290 msgr2=0x7f1a6c19d700 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:06.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.218+0000 7f1a72f0b700 1 --2- 192.168.123.103:0/3257253188 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1a6c199290 0x7f1a6c19d700 secure :-1 s=READY pgs=65 cs=0 
l=1 rev1=1 crypto rx=0x7f1a68009b00 tx=0x7f1a68004900 comp rx=0 tx=0).stop 2026-03-09T00:13:06.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.218+0000 7f1a72f0b700 1 -- 192.168.123.103:0/3257253188 shutdown_connections 2026-03-09T00:13:06.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.219+0000 7f1a72f0b700 1 --2- 192.168.123.103:0/3257253188 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f1a580778c0 0x7f1a58079d80 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:06.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.219+0000 7f1a72f0b700 1 --2- 192.168.123.103:0/3257253188 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f1a6c1030a0 0x7f1a6c198d50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:06.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.219+0000 7f1a72f0b700 1 --2- 192.168.123.103:0/3257253188 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f1a6c199290 0x7f1a6c19d700 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:06.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.219+0000 7f1a72f0b700 1 -- 192.168.123.103:0/3257253188 >> 192.168.123.103:0/3257253188 conn(0x7f1a6c0fe930 msgr2=0x7f1a6c0fff50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:06.219 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.219+0000 7f1a72f0b700 1 -- 192.168.123.103:0/3257253188 shutdown_connections 2026-03-09T00:13:06.220 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.219+0000 7f1a72f0b700 1 -- 192.168.123.103:0/3257253188 wait complete. 2026-03-09T00:13:06.220 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 14 2026-03-09T00:13:06.284 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 15 2026-03-09T00:13:06.423 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:06.464 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:06 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/1556131014' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-09T00:13:06.464 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:06 vm03.local ceph-mon[129670]: pgmap v289: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:13:06.464 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:06 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/3257253188' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-09T00:13:06.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:06 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/1556131014' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-09T00:13:06.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:06 vm06.local ceph-mon[106218]: pgmap v289: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:13:06.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:06 vm06.local ceph-mon[106218]: from='client.? 
192.168.123.103:0/3257253188' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-09T00:13:06.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.685+0000 7f5aae1d3700 1 -- 192.168.123.103:0/1162518780 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5aa8103cf0 msgr2=0x7f5aa8107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:06.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.685+0000 7f5aae1d3700 1 --2- 192.168.123.103:0/1162518780 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5aa8103cf0 0x7f5aa8107d40 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f5a98009b00 tx=0x7f5a98009e10 comp rx=0 tx=0).stop 2026-03-09T00:13:06.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.685+0000 7f5aae1d3700 1 -- 192.168.123.103:0/1162518780 shutdown_connections 2026-03-09T00:13:06.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.685+0000 7f5aae1d3700 1 --2- 192.168.123.103:0/1162518780 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5aa8103cf0 0x7f5aa8107d40 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:06.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.685+0000 7f5aae1d3700 1 --2- 192.168.123.103:0/1162518780 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5aa8103340 0x7f5aa8103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:06.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.685+0000 7f5aae1d3700 1 -- 192.168.123.103:0/1162518780 >> 192.168.123.103:0/1162518780 conn(0x7f5aa80feb90 msgr2=0x7f5aa8100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:06.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.686+0000 7f5aae1d3700 1 -- 192.168.123.103:0/1162518780 shutdown_connections 2026-03-09T00:13:06.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.686+0000 7f5aae1d3700 1 -- 192.168.123.103:0/1162518780 wait complete. 
2026-03-09T00:13:06.686 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.686+0000 7f5aae1d3700 1 Processor -- start 2026-03-09T00:13:06.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.686+0000 7f5aae1d3700 1 -- start start 2026-03-09T00:13:06.687 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.687+0000 7f5aae1d3700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5aa8103340 0x7f5aa8198dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:06.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.687+0000 7f5aae1d3700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5aa8103cf0 0x7f5aa8199300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:06.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.687+0000 7f5aae1d3700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5aa81999e0 con 0x7f5aa8103cf0 2026-03-09T00:13:06.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.687+0000 7f5aae1d3700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5aa819d770 con 0x7f5aa8103340 2026-03-09T00:13:06.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.687+0000 7f5aa6ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5aa8103cf0 0x7f5aa8199300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:06.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.687+0000 7f5aa6ffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5aa8103cf0 0x7f5aa8199300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55862/0 (socket says 192.168.123.103:55862) 2026-03-09T00:13:06.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.687+0000 7f5aa6ffd700 1 -- 192.168.123.103:0/113518109 learned_addr learned my addr 192.168.123.103:0/113518109 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:06.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.687+0000 7f5aa6ffd700 1 -- 192.168.123.103:0/113518109 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5aa8103340 msgr2=0x7f5aa8198dc0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T00:13:06.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.687+0000 7f5aa6ffd700 1 --2- 192.168.123.103:0/113518109 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5aa8103340 0x7f5aa8198dc0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:06.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.687+0000 7f5aa6ffd700 1 -- 192.168.123.103:0/113518109 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5a90009710 con 0x7f5aa8103cf0 2026-03-09T00:13:06.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.687+0000 7f5aa6ffd700 1 --2- 192.168.123.103:0/113518109 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5aa8103cf0 0x7f5aa8199300 secure :-1 s=READY pgs=182 cs=0 l=1 rev1=1 crypto rx=0x7f5a9800ba30 tx=0x7f5a9800bb10 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:06.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.687+0000 7f5aa4ff9700 1 -- 192.168.123.103:0/113518109 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5a9801d070 con 0x7f5aa8103cf0 2026-03-09T00:13:06.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.687+0000 7f5aa4ff9700 1 -- 192.168.123.103:0/113518109 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5a9800f460 con 0x7f5aa8103cf0 2026-03-09T00:13:06.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.688+0000 7f5aa4ff9700 1 -- 192.168.123.103:0/113518109 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5a98003bf0 con 0x7f5aa8103cf0 2026-03-09T00:13:06.688 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.688+0000 7f5aae1d3700 1 -- 192.168.123.103:0/113518109 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5a980097e0 con 0x7f5aa8103cf0 2026-03-09T00:13:06.690 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.689+0000 7f5aae1d3700 1 -- 192.168.123.103:0/113518109 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5aa819dcd0 con 0x7f5aa8103cf0 2026-03-09T00:13:06.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.689+0000 7f5aa4ff9700 1 -- 192.168.123.103:0/113518109 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5a98021480 con 0x7f5aa8103cf0 2026-03-09T00:13:06.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.689+0000 7f5aae1d3700 1 -- 192.168.123.103:0/113518109 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5aa804ea90 con 0x7f5aa8103cf0 2026-03-09T00:13:06.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.690+0000 7f5aa4ff9700 1 --2- 192.168.123.103:0/113518109 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5a94077870 0x7f5a94079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:06.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.690+0000 7f5aa77fe700 1 --2- 192.168.123.103:0/113518109 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5a94077870 0x7f5a94079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:06.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.690+0000 7f5aa4ff9700 1 -- 192.168.123.103:0/113518109 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f5a98003d50 con 0x7f5aa8103cf0 2026-03-09T00:13:06.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.691+0000 7f5aa77fe700 1 --2- 192.168.123.103:0/113518109 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5a94077870 0x7f5a94079d30 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f5a90009e90 tx=0x7f5a90009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:06.693 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.693+0000 7f5aa4ff9700 1 -- 192.168.123.103:0/113518109 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5a98062bc0 con 0x7f5aa8103cf0 2026-03-09T00:13:06.829 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.827+0000 7f5aae1d3700 1 -- 192.168.123.103:0/113518109 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 15, "format": "json"} v 0) v1 -- 0x7f5aa8066e80 con 0x7f5aa8103cf0 2026-03-09T00:13:06.830 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.830+0000 7f5aa4ff9700 1 -- 192.168.123.103:0/113518109 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 15, "format": "json"}]=0 dumped fsmap epoch 15 v33) v1 ==== 107+0+4134 (secure 0 0 0) 0x7f5a98062310 con 0x7f5aa8103cf0 2026-03-09T00:13:06.831 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:06.831 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":15,"btime":"2026-03-09T00:10:51:046301+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24287,"name":"cephfs.vm06.ixduim","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.106:6827/1001012017","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":1001012017},{"type":"v1","addr":"192.168.123.106:6827","nonce":1001012017}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1090058295","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1090058295},{"type":"v1","addr":"192.168.123.106:6825","nonce":1090058295}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":9}],"filesystems":[{"mdsmap":{"epoch":15,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:10:50.055010+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":103,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14492},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14492":{"gid":14492,"name":"cephfs.vm03.ralade","rank":0,"incarnation":13,"state":"up:rejoin","state_seq":138,"addr":"192.168.123.103:6829/3870847623","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3870847623},{"type":"v1","addr":"192.168.123.103:6829","nonce":3870847623}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T00:13:06.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.833+0000 7f5aae1d3700 1 -- 192.168.123.103:0/113518109 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5a94077870 msgr2=0x7f5a94079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:06.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.833+0000 7f5aae1d3700 1 --2- 192.168.123.103:0/113518109 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5a94077870 0x7f5a94079d30 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f5a90009e90 tx=0x7f5a90009450 comp rx=0 tx=0).stop 2026-03-09T00:13:06.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.834+0000 7f5aae1d3700 1 -- 192.168.123.103:0/113518109 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5aa8103cf0 msgr2=0x7f5aa8199300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:06.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.834+0000 7f5aae1d3700 1 --2- 192.168.123.103:0/113518109 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5aa8103cf0 0x7f5aa8199300 secure :-1 s=READY pgs=182 cs=0 l=1 rev1=1 crypto rx=0x7f5a9800ba30 tx=0x7f5a9800bb10 comp rx=0 tx=0).stop 2026-03-09T00:13:06.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.834+0000 7f5aae1d3700 1 -- 192.168.123.103:0/113518109 shutdown_connections 
2026-03-09T00:13:06.834 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.834+0000 7f5aae1d3700 1 --2- 192.168.123.103:0/113518109 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5a94077870 0x7f5a94079d30 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:06.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.834+0000 7f5aae1d3700 1 --2- 192.168.123.103:0/113518109 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5aa8103340 0x7f5aa8198dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:06.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.834+0000 7f5aae1d3700 1 --2- 192.168.123.103:0/113518109 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5aa8103cf0 0x7f5aa8199300 unknown :-1 s=CLOSED pgs=182 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:06.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.835+0000 7f5aae1d3700 1 -- 192.168.123.103:0/113518109 >> 192.168.123.103:0/113518109 conn(0x7f5aa80feb90 msgr2=0x7f5aa81075b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:06.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.835+0000 7f5aae1d3700 1 -- 192.168.123.103:0/113518109 shutdown_connections 2026-03-09T00:13:06.835 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:06.835+0000 7f5aae1d3700 1 -- 192.168.123.103:0/113518109 wait complete. 2026-03-09T00:13:06.836 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 15 2026-03-09T00:13:06.876 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 16 2026-03-09T00:13:07.022 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:07.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.295+0000 7f00c9c11700 1 -- 192.168.123.103:0/651517977 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00c4103cf0 msgr2=0x7f00c4107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:07.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.295+0000 7f00c9c11700 1 --2- 192.168.123.103:0/651517977 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00c4103cf0 0x7f00c4107d40 secure :-1 s=READY pgs=183 cs=0 l=1 rev1=1 crypto rx=0x7f00ac009b00 tx=0x7f00ac009e10 comp rx=0 tx=0).stop 2026-03-09T00:13:07.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.295+0000 7f00c9c11700 1 -- 192.168.123.103:0/651517977 shutdown_connections 2026-03-09T00:13:07.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.295+0000 7f00c9c11700 1 --2- 192.168.123.103:0/651517977 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00c4103cf0 0x7f00c4107d40 unknown :-1 s=CLOSED pgs=183 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:07.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.295+0000 7f00c9c11700 1 --2- 192.168.123.103:0/651517977 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f00c4103340 0x7f00c4103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:07.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.295+0000 7f00c9c11700 1 -- 
192.168.123.103:0/651517977 >> 192.168.123.103:0/651517977 conn(0x7f00c40feb90 msgr2=0x7f00c4100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:07.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.296+0000 7f00c9c11700 1 -- 192.168.123.103:0/651517977 shutdown_connections 2026-03-09T00:13:07.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.296+0000 7f00c9c11700 1 -- 192.168.123.103:0/651517977 wait complete. 2026-03-09T00:13:07.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.297+0000 7f00c9c11700 1 Processor -- start 2026-03-09T00:13:07.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.297+0000 7f00c9c11700 1 -- start start 2026-03-09T00:13:07.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.297+0000 7f00c9c11700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00c4103340 0x7f00c4072600 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:07.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.297+0000 7f00c9c11700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f00c4103cf0 0x7f00c4078300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:07.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.297+0000 7f00c9c11700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f00c4072bd0 con 0x7f00c4103340 2026-03-09T00:13:07.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.297+0000 7f00c9c11700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f00c4072d40 con 0x7f00c4103cf0 2026-03-09T00:13:07.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.297+0000 7f00c2ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f00c4103cf0 0x7f00c4078300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:07.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.297+0000 7f00c2ffd700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f00c4103cf0 0x7f00c4078300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:37712/0 (socket says 192.168.123.103:37712) 2026-03-09T00:13:07.297 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.297+0000 7f00c2ffd700 1 -- 192.168.123.103:0/2515424748 learned_addr learned my addr 192.168.123.103:0/2515424748 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:07.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.297+0000 7f00c2ffd700 1 -- 192.168.123.103:0/2515424748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00c4103340 msgr2=0x7f00c4072600 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:07.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.297+0000 7f00c37fe700 1 --2- 192.168.123.103:0/2515424748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00c4103340 0x7f00c4072600 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:07.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.298+0000 7f00c2ffd700 1 --2- 
192.168.123.103:0/2515424748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00c4103340 0x7f00c4072600 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:07.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.298+0000 7f00c2ffd700 1 -- 192.168.123.103:0/2515424748 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f00ac0097e0 con 0x7f00c4103cf0 2026-03-09T00:13:07.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.298+0000 7f00c2ffd700 1 --2- 192.168.123.103:0/2515424748 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f00c4103cf0 0x7f00c4078300 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f00ac005fd0 tx=0x7f00ac0049e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:07.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.298+0000 7f00c37fe700 1 --2- 192.168.123.103:0/2515424748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00c4103340 0x7f00c4072600 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:13:07.298 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.298+0000 7f00c0ff9700 1 -- 192.168.123.103:0/2515424748 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f00ac01d070 con 0x7f00c4103cf0 2026-03-09T00:13:07.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.298+0000 7f00c9c11700 1 -- 192.168.123.103:0/2515424748 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f00c4078840 con 0x7f00c4103cf0 2026-03-09T00:13:07.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.298+0000 7f00c9c11700 1 -- 192.168.123.103:0/2515424748 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f00c4078d30 con 0x7f00c4103cf0 2026-03-09T00:13:07.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.299+0000 7f00c0ff9700 1 -- 192.168.123.103:0/2515424748 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f00ac00bc50 con 0x7f00c4103cf0 2026-03-09T00:13:07.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.299+0000 7f00c0ff9700 1 -- 192.168.123.103:0/2515424748 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f00ac00f650 con 0x7f00c4103cf0 2026-03-09T00:13:07.299 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.299+0000 7f00c9c11700 1 -- 192.168.123.103:0/2515424748 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f00a4005320 con 0x7f00c4103cf0 2026-03-09T00:13:07.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.300+0000 7f00c0ff9700 1 -- 192.168.123.103:0/2515424748 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f00ac022470 con 0x7f00c4103cf0 2026-03-09T00:13:07.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.300+0000 7f00c0ff9700 1 --2- 192.168.123.103:0/2515424748 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f00b00776b0 0x7f00b0079b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:07.301 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.301+0000 7f00c37fe700 1 --2- 192.168.123.103:0/2515424748 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f00b00776b0 0x7f00b0079b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:07.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.301+0000 7f00c0ff9700 1 -- 192.168.123.103:0/2515424748 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f00ac09b550 con 0x7f00c4103cf0 2026-03-09T00:13:07.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.301+0000 7f00c37fe700 1 --2- 192.168.123.103:0/2515424748 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f00b00776b0 0x7f00b0079b70 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f00b4005fd0 tx=0x7f00b4005de0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:07.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.302+0000 7f00c0ff9700 1 -- 192.168.123.103:0/2515424748 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f00ac063db0 con 0x7f00c4103cf0 2026-03-09T00:13:07.446 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.445+0000 7f00c9c11700 1 -- 192.168.123.103:0/2515424748 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 16, "format": "json"} v 0) v1 -- 0x7f00a4005cc0 con 0x7f00c4103cf0 2026-03-09T00:13:07.447 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.447+0000 7f00c0ff9700 1 -- 192.168.123.103:0/2515424748 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 16, "format": "json"}]=0 dumped fsmap epoch 16 v33) v1 ==== 107+0+4143 (secure 0 0 0) 0x7f00ac063500 con 0x7f00c4103cf0 2026-03-09T00:13:07.448 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:07.448 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":16,"btime":"2026-03-09T00:10:52:056595+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24287,"name":"cephfs.vm06.ixduim","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.106:6827/1001012017","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":1001012017},{"type":"v1","addr":"192.168.123.106:6827","nonce":1001012017}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11},{"gid":24297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1090058295","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1090058295},{"type":"v1","addr":"192.168.123.106:6825","nonce":1090058295}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":9}],"filesystems":[{"mdsmap":{"epoch":16,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:10:52.056592+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":103,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14492},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14492":{"gid":14492,"name":"cephfs.vm03.ralade","rank":0,"incarnation":13,"state":"up:active","state_seq":139,"addr":"192.168.123.103:6829/3870847623","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3870847623},{"type":"v1","addr":"192.168.123.103:6829","nonce":3870847623}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":14492,"qdb_cluster":[14492]},"id":1}]} 2026-03-09T00:13:07.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.449+0000 7f00c9c11700 1 -- 192.168.123.103:0/2515424748 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f00b00776b0 msgr2=0x7f00b0079b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:07.449 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.449+0000 7f00c9c11700 1 --2- 192.168.123.103:0/2515424748 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f00b00776b0 0x7f00b0079b70 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f00b4005fd0 tx=0x7f00b4005de0 comp rx=0 tx=0).stop 2026-03-09T00:13:07.450 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.449+0000 7f00c9c11700 1 -- 192.168.123.103:0/2515424748 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f00c4103cf0 msgr2=0x7f00c4078300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:07.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.449+0000 7f00c9c11700 1 --2- 192.168.123.103:0/2515424748 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f00c4103cf0 0x7f00c4078300 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f00ac005fd0 tx=0x7f00ac0049e0 comp rx=0 tx=0).stop 2026-03-09T00:13:07.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.449+0000 7f00c9c11700 1 -- 192.168.123.103:0/2515424748 shutdown_connections 2026-03-09T00:13:07.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.449+0000 7f00c9c11700 1 --2- 192.168.123.103:0/2515424748 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f00b00776b0 0x7f00b0079b70 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:07.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.449+0000 7f00c9c11700 1 --2- 192.168.123.103:0/2515424748 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00c4103340 0x7f00c4072600 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:07.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.449+0000 7f00c9c11700 1 --2- 192.168.123.103:0/2515424748 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f00c4103cf0 0x7f00c4078300 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:07.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.450+0000 7f00c9c11700 1 -- 192.168.123.103:0/2515424748 >> 192.168.123.103:0/2515424748 conn(0x7f00c40feb90 msgr2=0x7f00c4100ec0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:07.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.450+0000 7f00c9c11700 1 -- 192.168.123.103:0/2515424748 shutdown_connections 2026-03-09T00:13:07.450 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.450+0000 7f00c9c11700 1 -- 192.168.123.103:0/2515424748 wait complete. 2026-03-09T00:13:07.451 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 16 2026-03-09T00:13:07.492 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 17 2026-03-09T00:13:07.646 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:07.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:07 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/113518109' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-09T00:13:07.671 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:07 vm03.local ceph-mon[129670]: from='client.? 
192.168.123.103:0/113518109' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-09T00:13:07.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.897+0000 7f5ae5f9a700 1 -- 192.168.123.103:0/746898601 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae0103cb0 msgr2=0x7f5ae0107d00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:07.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.897+0000 7f5ae5f9a700 1 --2- 192.168.123.103:0/746898601 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae0103cb0 0x7f5ae0107d00 secure :-1 s=READY pgs=184 cs=0 l=1 rev1=1 crypto rx=0x7f5ac8009b50 tx=0x7f5ac8009e60 comp rx=0 tx=0).stop 2026-03-09T00:13:07.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.898+0000 7f5ae5f9a700 1 -- 192.168.123.103:0/746898601 shutdown_connections 2026-03-09T00:13:07.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.898+0000 7f5ae5f9a700 1 --2- 192.168.123.103:0/746898601 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae0103cb0 0x7f5ae0107d00 unknown :-1 s=CLOSED pgs=184 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:07.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.898+0000 7f5ae5f9a700 1 --2- 192.168.123.103:0/746898601 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ae0103300 0x7f5ae01036e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:07.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.898+0000 7f5ae5f9a700 1 -- 192.168.123.103:0/746898601 >> 192.168.123.103:0/746898601 conn(0x7f5ae00feb90 msgr2=0x7f5ae0100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:07.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.898+0000 7f5ae5f9a700 1 -- 192.168.123.103:0/746898601 shutdown_connections 2026-03-09T00:13:07.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.898+0000 7f5ae5f9a700 1 -- 192.168.123.103:0/746898601 wait complete. 
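Each `ceph fs dump --format=json N` the harness issues names the next expected epoch, and the ack in stderr ("dumped fsmap epoch N") only arrives once that epoch is available, so each invocation appears to double as a blocking wait for the next fsmap change. A sketch of that polling pattern under the same assumptions (the wrapper name and loop body are illustrative; the real task logic is not visible in this log):

    import json
    import subprocess

    FSID = "ae8f0172-1b4a-11f1-916a-712b2ac006b7"
    IMAGE = "quay.io/ceph/ceph:v18.2.1"

    def wait_for_fsmap_epoch(epoch):
        # Runs the same cephadm shell invocation seen in the DEBUG
        # lines above and returns the decoded fsmap JSON from stdout.
        out = subprocess.check_output([
            "sudo", "/home/ubuntu/cephtest/cephadm", "--image", IMAGE,
            "shell", "--fsid", FSID, "--",
            "ceph", "fs", "dump", "--format=json", str(epoch),
        ])
        return json.loads(out)

    epoch = 15
    while True:
        dump = wait_for_fsmap_epoch(epoch)
        # ...inspect `dump` and break once the awaited MDS state appears...
        epoch = dump["epoch"] + 1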
2026-03-09T00:13:07.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.898+0000 7f5ae5f9a700 1 Processor -- start 2026-03-09T00:13:07.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.899+0000 7f5ae5f9a700 1 -- start start 2026-03-09T00:13:07.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.899+0000 7f5ae5f9a700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae0103300 0x7f5ae0198e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:07.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.899+0000 7f5ae5f9a700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ae0103cb0 0x7f5ae0199380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:07.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.899+0000 7f5ae5f9a700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ae0199a60 con 0x7f5ae0103300 2026-03-09T00:13:07.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.899+0000 7f5ae5f9a700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ae019d7f0 con 0x7f5ae0103cb0 2026-03-09T00:13:07.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.899+0000 7f5adf7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae0103300 0x7f5ae0198e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:07.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.899+0000 7f5adf7fe700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae0103300 0x7f5ae0198e40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55892/0 (socket says 192.168.123.103:55892) 2026-03-09T00:13:07.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.899+0000 7f5adf7fe700 1 -- 192.168.123.103:0/1120970653 learned_addr learned my addr 192.168.123.103:0/1120970653 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:07.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.900+0000 7f5ad6dff700 1 --2- 192.168.123.103:0/1120970653 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ae0103cb0 0x7f5ae0199380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:07.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.900+0000 7f5adf7fe700 1 -- 192.168.123.103:0/1120970653 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ae0103cb0 msgr2=0x7f5ae0199380 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:07.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.900+0000 7f5adf7fe700 1 --2- 192.168.123.103:0/1120970653 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ae0103cb0 0x7f5ae0199380 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:07.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.900+0000 7f5adf7fe700 1 -- 192.168.123.103:0/1120970653 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f5ac80097e0 con 0x7f5ae0103300 2026-03-09T00:13:07.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.900+0000 7f5ad6dff700 1 --2- 192.168.123.103:0/1120970653 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ae0103cb0 0x7f5ae0199380 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:13:07.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.900+0000 7f5adf7fe700 1 --2- 192.168.123.103:0/1120970653 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae0103300 0x7f5ae0198e40 secure :-1 s=READY pgs=185 cs=0 l=1 rev1=1 crypto rx=0x7f5ad000eb10 tx=0x7f5ad000eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:07.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.901+0000 7f5add7fa700 1 -- 192.168.123.103:0/1120970653 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ad000cca0 con 0x7f5ae0103300 2026-03-09T00:13:07.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.901+0000 7f5add7fa700 1 -- 192.168.123.103:0/1120970653 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5ad000ce00 con 0x7f5ae0103300 2026-03-09T00:13:07.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.901+0000 7f5add7fa700 1 -- 192.168.123.103:0/1120970653 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ad0018910 con 0x7f5ae0103300 2026-03-09T00:13:07.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.901+0000 7f5ae5f9a700 1 -- 192.168.123.103:0/1120970653 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5ae019dad0 con 0x7f5ae0103300 2026-03-09T00:13:07.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.901+0000 7f5ae5f9a700 1 -- 192.168.123.103:0/1120970653 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5ae019df40 con 0x7f5ae0103300 2026-03-09T00:13:07.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.902+0000 7f5add7fa700 1 -- 192.168.123.103:0/1120970653 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5ad0018a70 con 0x7f5ae0103300 2026-03-09T00:13:07.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.902+0000 7f5add7fa700 1 --2- 192.168.123.103:0/1120970653 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5acc0779e0 0x7f5acc079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:07.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.903+0000 7f5add7fa700 1 -- 192.168.123.103:0/1120970653 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f5ad0014070 con 0x7f5ae0103300 2026-03-09T00:13:07.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.903+0000 7f5ad6dff700 1 --2- 192.168.123.103:0/1120970653 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5acc0779e0 0x7f5acc079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:07.904 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.904+0000 7f5ae5f9a700 1 -- 192.168.123.103:0/1120970653 --> 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5ae004ea90 con 0x7f5ae0103300 2026-03-09T00:13:07.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.908+0000 7f5ad6dff700 1 --2- 192.168.123.103:0/1120970653 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5acc0779e0 0x7f5acc079ea0 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f5ac800b5c0 tx=0x7f5ac8005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:07.908 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:07.908+0000 7f5add7fa700 1 -- 192.168.123.103:0/1120970653 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5ad0063d10 con 0x7f5ae0103300 2026-03-09T00:13:08.061 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.059+0000 7f5ae5f9a700 1 -- 192.168.123.103:0/1120970653 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 17, "format": "json"} v 0) v1 -- 0x7f5ae019a1a0 con 0x7f5ae0103300 2026-03-09T00:13:08.061 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.060+0000 7f5add7fa700 1 -- 192.168.123.103:0/1120970653 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 17, "format": "json"}]=0 dumped fsmap epoch 17 v33) v1 ==== 107+0+4991 (secure 0 0 0) 0x7f5ad0063d10 con 0x7f5ae0103300 2026-03-09T00:13:08.062 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:08.062 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":17,"btime":"2026-03-09T00:10:53:067738+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24287,"name":"cephfs.vm06.ixduim","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.106:6827/1001012017","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":1001012017},{"type":"v1","addr":"192.168.123.106:6827","nonce":1001012017}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":24297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1090058295","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1090058295},{"type":"v1","addr":"192.168.123.106:6825","nonce":1090058295}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in 
separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":9},{"gid":34404,"name":"cephfs.vm03.sejksk","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/784666836","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":784666836},{"type":"v1","addr":"192.168.123.103:6827","nonce":784666836}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":17}],"filesystems":[{"mdsmap":{"epoch":16,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:10:52.056592+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":103,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14492},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14492":{"gid":14492,"name":"cephfs.vm03.ralade","rank":0,"incarnation":13,"state":"up:active","state_seq":139,"addr":"192.168.123.103:6829/3870847623","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":3870847623},{"type":"v1","addr":"192.168.123.103:6829","nonce":3870847623}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":14492,"qdb_cluster":[14492]},"id":1}]} 2026-03-09T00:13:08.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.064+0000 7f5ae5f9a700 1 -- 192.168.123.103:0/1120970653 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5acc0779e0 msgr2=0x7f5acc079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:08.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.064+0000 
7f5ae5f9a700 1 --2- 192.168.123.103:0/1120970653 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5acc0779e0 0x7f5acc079ea0 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f5ac800b5c0 tx=0x7f5ac8005fb0 comp rx=0 tx=0).stop 2026-03-09T00:13:08.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.064+0000 7f5ae5f9a700 1 -- 192.168.123.103:0/1120970653 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae0103300 msgr2=0x7f5ae0198e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:08.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.064+0000 7f5ae5f9a700 1 --2- 192.168.123.103:0/1120970653 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae0103300 0x7f5ae0198e40 secure :-1 s=READY pgs=185 cs=0 l=1 rev1=1 crypto rx=0x7f5ad000eb10 tx=0x7f5ad000eed0 comp rx=0 tx=0).stop 2026-03-09T00:13:08.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.065+0000 7f5ae5f9a700 1 -- 192.168.123.103:0/1120970653 shutdown_connections 2026-03-09T00:13:08.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.065+0000 7f5ae5f9a700 1 --2- 192.168.123.103:0/1120970653 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f5acc0779e0 0x7f5acc079ea0 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:08.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.065+0000 7f5ae5f9a700 1 --2- 192.168.123.103:0/1120970653 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f5ae0103300 0x7f5ae0198e40 unknown :-1 s=CLOSED pgs=185 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:08.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.065+0000 7f5ae5f9a700 1 --2- 192.168.123.103:0/1120970653 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f5ae0103cb0 0x7f5ae0199380 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:08.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.065+0000 7f5ae5f9a700 1 -- 192.168.123.103:0/1120970653 >> 192.168.123.103:0/1120970653 conn(0x7f5ae00feb90 msgr2=0x7f5ae0100190 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:08.065 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.065+0000 7f5ae5f9a700 1 -- 192.168.123.103:0/1120970653 shutdown_connections 2026-03-09T00:13:08.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.065+0000 7f5ae5f9a700 1 -- 192.168.123.103:0/1120970653 wait complete. 
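In the epoch-17 dump above, a third standby (cephfs.vm03.sejksk) has checked in whose incompat set carries feature_11 "minor log segments" and feature_12 "quiesce subvolumes", which the two reef-built vm06 standbys do not advertise: the first MDS restarted on the new image joining during the staggered upgrade. A small sketch, assuming the same dump shape (the function name is mine), for spotting which standbys carry incompat features the others lack:

    import json

    def extra_incompat(fs_dump_json):
        # Map each standby to the incompat features not shared
        # by all standbys (empty entries are dropped).
        dump = json.loads(fs_dump_json)
        feats = {sb["name"]: set(sb["compat"]["incompat"])
                 for sb in dump["standbys"]}
        common = set.intersection(*feats.values()) if feats else set()
        return {name: sorted(f - common)
                for name, f in feats.items() if f - common}

Applied to the epoch-17 blob this returns {'cephfs.vm03.sejksk': ['feature_11', 'feature_12']}, i.e. only the upgraded daemon differs.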
2026-03-09T00:13:08.066 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 17 2026-03-09T00:13:08.134 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 18 2026-03-09T00:13:08.270 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:08.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.511+0000 7f82b7956700 1 -- 192.168.123.103:0/1695240732 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0103340 msgr2=0x7f82b0103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:08.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.511+0000 7f82b7956700 1 --2- 192.168.123.103:0/1695240732 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0103340 0x7f82b0103720 secure :-1 s=READY pgs=186 cs=0 l=1 rev1=1 crypto rx=0x7f82a0009b50 tx=0x7f82a0009e60 comp rx=0 tx=0).stop 2026-03-09T00:13:08.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.511+0000 7f82b7956700 1 -- 192.168.123.103:0/1695240732 shutdown_connections 2026-03-09T00:13:08.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.511+0000 7f82b7956700 1 --2- 192.168.123.103:0/1695240732 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82b0103cf0 0x7f82b0107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:08.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.511+0000 7f82b7956700 1 --2- 192.168.123.103:0/1695240732 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0103340 0x7f82b0103720 unknown :-1 s=CLOSED pgs=186 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:08.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.511+0000 7f82b7956700 1 -- 192.168.123.103:0/1695240732 >> 192.168.123.103:0/1695240732 conn(0x7f82b00feb90 msgr2=0x7f82b0100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:08.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.512+0000 7f82b7956700 1 -- 192.168.123.103:0/1695240732 shutdown_connections 2026-03-09T00:13:08.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.512+0000 7f82b7956700 1 -- 192.168.123.103:0/1695240732 wait complete. 
2026-03-09T00:13:08.512 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.512+0000 7f82b7956700 1 Processor -- start 2026-03-09T00:13:08.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.512+0000 7f82b7956700 1 -- start start 2026-03-09T00:13:08.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.513+0000 7f82b7956700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82b0103340 0x7f82b0198eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:08.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.513+0000 7f82b7956700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0103cf0 0x7f82b01993f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:08.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.513+0000 7f82b7956700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f82b0199ad0 con 0x7f82b0103cf0 2026-03-09T00:13:08.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.513+0000 7f82b7956700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f82b019d860 con 0x7f82b0103340 2026-03-09T00:13:08.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.513+0000 7f82b4ef1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0103cf0 0x7f82b01993f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:08.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.513+0000 7f82b4ef1700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0103cf0 0x7f82b01993f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55910/0 (socket says 192.168.123.103:55910) 2026-03-09T00:13:08.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.513+0000 7f82b4ef1700 1 -- 192.168.123.103:0/2029266299 learned_addr learned my addr 192.168.123.103:0/2029266299 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:08.513 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.513+0000 7f82b4ef1700 1 -- 192.168.123.103:0/2029266299 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82b0103340 msgr2=0x7f82b0198eb0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:13:08.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.513+0000 7f82b56f2700 1 --2- 192.168.123.103:0/2029266299 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82b0103340 0x7f82b0198eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:08.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.513+0000 7f82b4ef1700 1 --2- 192.168.123.103:0/2029266299 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82b0103340 0x7f82b0198eb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:08.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.513+0000 7f82b4ef1700 1 -- 192.168.123.103:0/2029266299 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f82a00097e0 con 
0x7f82b0103cf0 2026-03-09T00:13:08.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.514+0000 7f82b4ef1700 1 --2- 192.168.123.103:0/2029266299 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0103cf0 0x7f82b01993f0 secure :-1 s=READY pgs=187 cs=0 l=1 rev1=1 crypto rx=0x7f82ac009fd0 tx=0x7f82ac00c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:08.515 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.514+0000 7f82a67fc700 1 -- 192.168.123.103:0/2029266299 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f82ac00cd70 con 0x7f82b0103cf0 2026-03-09T00:13:08.515 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.514+0000 7f82a67fc700 1 -- 192.168.123.103:0/2029266299 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f82ac004d10 con 0x7f82b0103cf0 2026-03-09T00:13:08.515 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.514+0000 7f82a67fc700 1 -- 192.168.123.103:0/2029266299 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f82ac010640 con 0x7f82b0103cf0 2026-03-09T00:13:08.515 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.514+0000 7f82b7956700 1 -- 192.168.123.103:0/2029266299 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f82b019db40 con 0x7f82b0103cf0 2026-03-09T00:13:08.515 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.514+0000 7f82b7956700 1 -- 192.168.123.103:0/2029266299 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f82b019dfb0 con 0x7f82b0103cf0 2026-03-09T00:13:08.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.515+0000 7f82a67fc700 1 -- 192.168.123.103:0/2029266299 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f82ac0107a0 con 0x7f82b0103cf0 2026-03-09T00:13:08.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.515+0000 7f82b7956700 1 -- 192.168.123.103:0/2029266299 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f82b010b6e0 con 0x7f82b0103cf0 2026-03-09T00:13:08.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.517+0000 7f82a67fc700 1 --2- 192.168.123.103:0/2029266299 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f829c077870 0x7f829c079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:08.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.517+0000 7f82a67fc700 1 -- 192.168.123.103:0/2029266299 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f82ac014070 con 0x7f82b0103cf0 2026-03-09T00:13:08.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.517+0000 7f82b56f2700 1 --2- 192.168.123.103:0/2029266299 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f829c077870 0x7f829c079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:08.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.517+0000 7f82b56f2700 1 --2- 192.168.123.103:0/2029266299 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f829c077870 0x7f829c079d30 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f82a000b5c0 tx=0x7f82a00058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:08.519 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.519+0000 7f82a67fc700 1 -- 192.168.123.103:0/2029266299 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f82ac062630 con 0x7f82b0103cf0 2026-03-09T00:13:08.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:08 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2515424748' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-09T00:13:08.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:08 vm03.local ceph-mon[129670]: pgmap v290: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:13:08.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:08 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/1120970653' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-09T00:13:08.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.656+0000 7f82b7956700 1 -- 192.168.123.103:0/2029266299 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 18, "format": "json"} v 0) v1 -- 0x7f82b004ea90 con 0x7f82b0103cf0 2026-03-09T00:13:08.658 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.657+0000 7f82a67fc700 1 -- 192.168.123.103:0/2029266299 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 18, "format": "json"}]=0 dumped fsmap epoch 18 v33) v1 ==== 107+0+4186 (secure 0 0 0) 0x7f82ac061d80 con 0x7f82b0103cf0 2026-03-09T00:13:08.658 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:08.658 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":18,"btime":"2026-03-09T00:10:58:287982+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24287,"name":"cephfs.vm06.ixduim","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.106:6827/1001012017","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":1001012017},{"type":"v1","addr":"192.168.123.106:6827","nonce":1001012017}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11},{"gid":24297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1090058295","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1090058295},{"type":"v1","addr":"192.168.123.106:6825","nonce":1090058295}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":9},{"gid":34404,"name":"cephfs.vm03.sejksk","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/784666836","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":784666836},{"type":"v1","addr":"192.168.123.103:6827","nonce":784666836}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":17}],"filesystems":[{"mdsmap":{"epoch":18,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:10:58.287979+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":107,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T00:13:08.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.660+0000 7f82b7956700 1 -- 192.168.123.103:0/2029266299 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f829c077870 msgr2=0x7f829c079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:08.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.660+0000 7f82b7956700 1 --2- 192.168.123.103:0/2029266299 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f829c077870 0x7f829c079d30 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f82a000b5c0 tx=0x7f82a00058e0 
comp rx=0 tx=0).stop 2026-03-09T00:13:08.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.660+0000 7f82b7956700 1 -- 192.168.123.103:0/2029266299 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0103cf0 msgr2=0x7f82b01993f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:08.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.660+0000 7f82b7956700 1 --2- 192.168.123.103:0/2029266299 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0103cf0 0x7f82b01993f0 secure :-1 s=READY pgs=187 cs=0 l=1 rev1=1 crypto rx=0x7f82ac009fd0 tx=0x7f82ac00c5b0 comp rx=0 tx=0).stop 2026-03-09T00:13:08.660 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.660+0000 7f82b7956700 1 -- 192.168.123.103:0/2029266299 shutdown_connections 2026-03-09T00:13:08.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.660+0000 7f82b7956700 1 --2- 192.168.123.103:0/2029266299 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f829c077870 0x7f829c079d30 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:08.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.660+0000 7f82b7956700 1 --2- 192.168.123.103:0/2029266299 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f82b0103340 0x7f82b0198eb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:08.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.660+0000 7f82b7956700 1 --2- 192.168.123.103:0/2029266299 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f82b0103cf0 0x7f82b01993f0 unknown :-1 s=CLOSED pgs=187 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:08.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.660+0000 7f82b7956700 1 -- 192.168.123.103:0/2029266299 >> 192.168.123.103:0/2029266299 conn(0x7f82b00feb90 msgr2=0x7f82b0100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:08.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.660+0000 7f82b7956700 1 -- 192.168.123.103:0/2029266299 shutdown_connections 2026-03-09T00:13:08.661 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:08.661+0000 7f82b7956700 1 -- 192.168.123.103:0/2029266299 wait complete. 2026-03-09T00:13:08.661 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 18 2026-03-09T00:13:08.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:08 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/2515424748' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-09T00:13:08.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:08 vm06.local ceph-mon[106218]: pgmap v290: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:13:08.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:08 vm06.local ceph-mon[106218]: from='client.? 
192.168.123.103:0/1120970653' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-09T00:13:08.730 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 19 2026-03-09T00:13:08.878 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:09.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.224+0000 7fb788005700 1 -- 192.168.123.103:0/2886383534 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb780102040 msgr2=0x7fb780102420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:09.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.224+0000 7fb788005700 1 --2- 192.168.123.103:0/2886383534 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb780102040 0x7fb780102420 secure :-1 s=READY pgs=188 cs=0 l=1 rev1=1 crypto rx=0x7fb770009b50 tx=0x7fb770009e60 comp rx=0 tx=0).stop 2026-03-09T00:13:09.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.224+0000 7fb788005700 1 -- 192.168.123.103:0/2886383534 shutdown_connections 2026-03-09T00:13:09.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.224+0000 7fb788005700 1 --2- 192.168.123.103:0/2886383534 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb780102960 0x7fb78010ae50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:09.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.224+0000 7fb788005700 1 --2- 192.168.123.103:0/2886383534 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb780102040 0x7fb780102420 unknown :-1 s=CLOSED pgs=188 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:09.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.224+0000 7fb788005700 1 -- 192.168.123.103:0/2886383534 >> 192.168.123.103:0/2886383534 conn(0x7fb7800fb830 msgr2=0x7fb7800fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:09.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.225+0000 7fb788005700 1 -- 192.168.123.103:0/2886383534 shutdown_connections 2026-03-09T00:13:09.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.225+0000 7fb788005700 1 -- 192.168.123.103:0/2886383534 wait complete. 
2026-03-09T00:13:09.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.225+0000 7fb788005700 1 Processor -- start
2026-03-09T00:13:09.225 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.225+0000 7fb788005700 1 -- start start
2026-03-09T00:13:09.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.225+0000 7fb788005700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb780102040 0x7fb780198e70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:09.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.225+0000 7fb788005700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb780102960 0x7fb7801993b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:09.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.225+0000 7fb788005700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb780199a90 con 0x7fb780102040
2026-03-09T00:13:09.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.225+0000 7fb788005700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb78019d820 con 0x7fb780102960
2026-03-09T00:13:09.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.226+0000 7fb7855a0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb780102960 0x7fb7801993b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:09.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.226+0000 7fb7855a0700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb780102960 0x7fb7801993b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:37752/0 (socket says 192.168.123.103:37752)
2026-03-09T00:13:09.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.226+0000 7fb7855a0700 1 -- 192.168.123.103:0/4122508635 learned_addr learned my addr 192.168.123.103:0/4122508635 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:13:09.226 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.226+0000 7fb785da1700 1 --2- 192.168.123.103:0/4122508635 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb780102040 0x7fb780198e70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:09.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.226+0000 7fb7855a0700 1 -- 192.168.123.103:0/4122508635 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb780102040 msgr2=0x7fb780198e70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:09.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.226+0000 7fb7855a0700 1 --2- 192.168.123.103:0/4122508635 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb780102040 0x7fb780198e70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:09.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.226+0000 7fb7855a0700 1 -- 192.168.123.103:0/4122508635 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb7700097e0 con 0x7fb780102960
2026-03-09T00:13:09.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.226+0000 7fb785da1700 1 --2- 192.168.123.103:0/4122508635 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb780102040 0x7fb780198e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed!
2026-03-09T00:13:09.227 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.226+0000 7fb7855a0700 1 --2- 192.168.123.103:0/4122508635 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb780102960 0x7fb7801993b0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fb77c00ea30 tx=0x7fb77c00ed40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:13:09.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.227+0000 7fb776ffd700 1 -- 192.168.123.103:0/4122508635 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb77c0098d0 con 0x7fb780102960
2026-03-09T00:13:09.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.227+0000 7fb776ffd700 1 -- 192.168.123.103:0/4122508635 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fb77c004d10 con 0x7fb780102960
2026-03-09T00:13:09.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.227+0000 7fb776ffd700 1 -- 192.168.123.103:0/4122508635 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb77c010430 con 0x7fb780102960
2026-03-09T00:13:09.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.227+0000 7fb788005700 1 -- 192.168.123.103:0/4122508635 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb78019db00 con 0x7fb780102960
2026-03-09T00:13:09.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.227+0000 7fb788005700 1 -- 192.168.123.103:0/4122508635 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb78019e050 con 0x7fb780102960
2026-03-09T00:13:09.228 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.228+0000 7fb788005700 1 -- 192.168.123.103:0/4122508635 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb780104590 con 0x7fb780102960
2026-03-09T00:13:09.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.229+0000 7fb776ffd700 1 -- 192.168.123.103:0/4122508635 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb77c004e80 con 0x7fb780102960
2026-03-09T00:13:09.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.229+0000 7fb776ffd700 1 --2- 192.168.123.103:0/4122508635 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb76c077990 0x7fb76c079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:09.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.229+0000 7fb785da1700 1 --2- 192.168.123.103:0/4122508635 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb76c077990 0x7fb76c079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:09.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.230+0000 7fb776ffd700 1 -- 192.168.123.103:0/4122508635 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fb77c026080 con 0x7fb780102960
2026-03-09T00:13:09.230 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.230+0000 7fb785da1700 1 --2- 192.168.123.103:0/4122508635 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb76c077990 0x7fb76c079e50 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7fb770006010 tx=0x7fb77000b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:13:09.231 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.231+0000 7fb776ffd700 1 -- 192.168.123.103:0/4122508635 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb77c062a30 con 0x7fb780102960
2026-03-09T00:13:09.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.380+0000 7fb788005700 1 -- 192.168.123.103:0/4122508635 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 19, "format": "json"} v 0) v1 -- 0x7fb780068a10 con 0x7fb780102960
2026-03-09T00:13:09.381 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.381+0000 7fb776ffd700 1 -- 192.168.123.103:0/4122508635 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 19, "format": "json"}]=0 dumped fsmap epoch 19 v33) v1 ==== 107+0+4197 (secure 0 0 0) 0x7fb77c062180 con 0x7fb780102960
2026-03-09T00:13:09.381 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:13:09.382 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":19,"btime":"2026-03-09T00:10:58:294908+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1090058295","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1090058295},{"type":"v1","addr":"192.168.123.106:6825","nonce":1090058295}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":9},{"gid":34404,"name":"cephfs.vm03.sejksk","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/784666836","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":784666836},{"type":"v1","addr":"192.168.123.103:6827","nonce":784666836}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":17}],"filesystems":[{"mdsmap":{"epoch":19,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:10:58.294902+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":107,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24287},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24287":{"gid":24287,"name":"cephfs.vm06.ixduim","rank":0,"incarnation":19,"state":"up:replay","state_seq":2,"addr":"192.168.123.106:6827/1001012017","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":1001012017},{"type":"v1","addr":"192.168.123.106:6827","nonce":1001012017}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]}
2026-03-09T00:13:09.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.383+0000 7fb788005700 1 -- 192.168.123.103:0/4122508635 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb76c077990 msgr2=0x7fb76c079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:09.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.383+0000 7fb788005700 1 --2- 192.168.123.103:0/4122508635 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb76c077990 0x7fb76c079e50 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7fb770006010 tx=0x7fb77000b540 comp rx=0 tx=0).stop
2026-03-09T00:13:09.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.384+0000 7fb788005700 1 -- 192.168.123.103:0/4122508635 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb780102960 msgr2=0x7fb7801993b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:09.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.384+0000 7fb788005700 1 --2- 192.168.123.103:0/4122508635 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb780102960 0x7fb7801993b0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7fb77c00ea30 tx=0x7fb77c00ed40 comp rx=0 tx=0).stop
2026-03-09T00:13:09.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.384+0000 7fb788005700 1 -- 192.168.123.103:0/4122508635 shutdown_connections
2026-03-09T00:13:09.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.384+0000 7fb788005700 1 --2- 192.168.123.103:0/4122508635 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb76c077990 0x7fb76c079e50 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:09.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.384+0000 7fb788005700 1 --2- 192.168.123.103:0/4122508635 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb780102040 0x7fb780198e70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:09.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.384+0000 7fb788005700 1 --2- 192.168.123.103:0/4122508635 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb780102960 0x7fb7801993b0 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:09.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.384+0000 7fb788005700 1 -- 192.168.123.103:0/4122508635 >> 192.168.123.103:0/4122508635 conn(0x7fb7800fb830 msgr2=0x7fb7800fd500 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:13:09.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.384+0000 7fb788005700 1 -- 192.168.123.103:0/4122508635 shutdown_connections
2026-03-09T00:13:09.384 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.384+0000 7fb788005700 1 -- 192.168.123.103:0/4122508635 wait complete.
2026-03-09T00:13:09.385 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 19
2026-03-09T00:13:09.445 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 20
2026-03-09T00:13:09.593 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:13:09.632 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:09 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2029266299' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch
2026-03-09T00:13:09.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:09 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/2029266299' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch
2026-03-09T00:13:09.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.839+0000 7f008f01f700 1 -- 192.168.123.103:0/4139176050 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0088073ae0 msgr2=0x7f008810d1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:09.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.839+0000 7f008f01f700 1 --2- 192.168.123.103:0/4139176050 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0088073ae0 0x7f008810d1c0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f007c009a90 tx=0x7f007c009da0 comp rx=0 tx=0).stop
2026-03-09T00:13:09.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.840+0000 7f008f01f700 1 -- 192.168.123.103:0/4139176050 shutdown_connections
2026-03-09T00:13:09.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.840+0000 7f008f01f700 1 --2- 192.168.123.103:0/4139176050 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f0088073ae0 0x7f008810d1c0 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:09.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.840+0000 7f008f01f700 1 --2- 192.168.123.103:0/4139176050 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f00880731c0 0x7f00880735a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:09.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.840+0000 7f008f01f700 1 -- 192.168.123.103:0/4139176050 >> 192.168.123.103:0/4139176050 conn(0x7f00880fc9b0 msgr2=0x7f00880fedd0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:13:09.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.840+0000 7f008f01f700 1 -- 192.168.123.103:0/4139176050 shutdown_connections
2026-03-09T00:13:09.840 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.840+0000 7f008f01f700 1 -- 192.168.123.103:0/4139176050 wait complete.
2026-03-09T00:13:09.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.841+0000 7f008f01f700 1 Processor -- start
2026-03-09T00:13:09.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.841+0000 7f008f01f700 1 -- start start
2026-03-09T00:13:09.841 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.841+0000 7f008f01f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f00880731c0 0x7f0088198dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:09.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.841+0000 7f008f01f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0088073ae0 0x7f0088199310 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:09.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.841+0000 7f008f01f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f00881999f0 con 0x7f0088073ae0
2026-03-09T00:13:09.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.841+0000 7f008f01f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f008819d780 con 0x7f00880731c0
2026-03-09T00:13:09.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.841+0000 7f0087fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0088073ae0 0x7f0088199310 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:09.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.841+0000 7f0087fff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0088073ae0 0x7f0088199310 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55938/0 (socket says 192.168.123.103:55938)
2026-03-09T00:13:09.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.841+0000 7f0087fff700 1 -- 192.168.123.103:0/3993876517 learned_addr learned my addr 192.168.123.103:0/3993876517 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:13:09.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.842+0000 7f0087fff700 1 -- 192.168.123.103:0/3993876517 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f00880731c0 msgr2=0x7f0088198dd0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down
2026-03-09T00:13:09.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.842+0000 7f0087fff700 1 --2- 192.168.123.103:0/3993876517 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f00880731c0 0x7f0088198dd0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:09.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.842+0000 7f0087fff700 1 -- 192.168.123.103:0/3993876517 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f00780097e0 con 0x7f0088073ae0
2026-03-09T00:13:09.842 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.842+0000 7f0087fff700 1 --2- 192.168.123.103:0/3993876517 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0088073ae0 0x7f0088199310 secure :-1 s=READY pgs=189 cs=0 l=1 rev1=1 crypto rx=0x7f007c00f690 tx=0x7f007c00f770 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:13:09.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.842+0000 7f0085ffb700 1 -- 192.168.123.103:0/3993876517 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f007c01d070 con 0x7f0088073ae0
2026-03-09T00:13:09.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.842+0000 7f0085ffb700 1 -- 192.168.123.103:0/3993876517 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f007c00fe40 con 0x7f0088073ae0
2026-03-09T00:13:09.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.842+0000 7f0085ffb700 1 -- 192.168.123.103:0/3993876517 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f007c017910 con 0x7f0088073ae0
2026-03-09T00:13:09.843 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.842+0000 7f008f01f700 1 -- 192.168.123.103:0/3993876517 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f007c009710 con 0x7f0088073ae0
2026-03-09T00:13:09.844 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.843+0000 7f008f01f700 1 -- 192.168.123.103:0/3993876517 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f008819dd30 con 0x7f0088073ae0
2026-03-09T00:13:09.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.844+0000 7f0085ffb700 1 -- 192.168.123.103:0/3993876517 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f007c017a70 con 0x7f0088073ae0
2026-03-09T00:13:09.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.844+0000 7f0085ffb700 1 --2- 192.168.123.103:0/3993876517 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0070077870 0x7f0070079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:09.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.845+0000 7f0085ffb700 1 -- 192.168.123.103:0/3993876517 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f007c067af0 con 0x7f0088073ae0
2026-03-09T00:13:09.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.845+0000 7f008cdbb700 1 --2- 192.168.123.103:0/3993876517 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0070077870 0x7f0070079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:09.845 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.845+0000 7f008cdbb700 1 --2- 192.168.123.103:0/3993876517 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0070077870 0x7f0070079d30 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f0078005fd0 tx=0x7f0078009500 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:13:09.846 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.845+0000 7f008f01f700 1 -- 192.168.123.103:0/3993876517 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f008810a8c0 con 0x7f0088073ae0
2026-03-09T00:13:09.848 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.848+0000 7f0085ffb700 1 -- 192.168.123.103:0/3993876517 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f007c0a0050 con 0x7f0088073ae0
2026-03-09T00:13:09.986 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.985+0000 7f008f01f700 1 -- 192.168.123.103:0/3993876517 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 20, "format": "json"} v 0) v1 -- 0x7f008819a130 con 0x7f0088073ae0
2026-03-09T00:13:09.988 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.988+0000 7f0085ffb700 1 -- 192.168.123.103:0/3993876517 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 20, "format": "json"}]=0 dumped fsmap epoch 20 v33) v1 ==== 107+0+4202 (secure 0 0 0) 0x7f007c027090 con 0x7f0088073ae0
2026-03-09T00:13:09.988 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:13:09.989 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":20,"btime":"2026-03-09T00:11:03:454550+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1090058295","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1090058295},{"type":"v1","addr":"192.168.123.106:6825","nonce":1090058295}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":9},{"gid":34404,"name":"cephfs.vm03.sejksk","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/784666836","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":784666836},{"type":"v1","addr":"192.168.123.103:6827","nonce":784666836}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":17}],"filesystems":[{"mdsmap":{"epoch":20,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:11:02.496196+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":107,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24287},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24287":{"gid":24287,"name":"cephfs.vm06.ixduim","rank":0,"incarnation":19,"state":"up:reconnect","state_seq":140,"addr":"192.168.123.106:6827/1001012017","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":1001012017},{"type":"v1","addr":"192.168.123.106:6827","nonce":1001012017}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]}
2026-03-09T00:13:09.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.991+0000 7f008f01f700 1 -- 192.168.123.103:0/3993876517 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0070077870 msgr2=0x7f0070079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:09.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.991+0000 7f008f01f700 1 --2- 192.168.123.103:0/3993876517 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0070077870 0x7f0070079d30 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f0078005fd0 tx=0x7f0078009500 comp rx=0 tx=0).stop
2026-03-09T00:13:09.991 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.991+0000 7f008f01f700 1 -- 192.168.123.103:0/3993876517 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0088073ae0 msgr2=0x7f0088199310 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:09.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.991+0000 7f008f01f700 1 --2- 192.168.123.103:0/3993876517 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0088073ae0 0x7f0088199310 secure :-1 s=READY pgs=189 cs=0 l=1 rev1=1 crypto rx=0x7f007c00f690 tx=0x7f007c00f770 comp rx=0 tx=0).stop
2026-03-09T00:13:09.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.991+0000 7f008f01f700 1 -- 192.168.123.103:0/3993876517 shutdown_connections
2026-03-09T00:13:09.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.991+0000 7f008f01f700 1 --2- 192.168.123.103:0/3993876517 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f0070077870 0x7f0070079d30 secure :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f0078005fd0 tx=0x7f0078009500 comp rx=0 tx=0).stop
2026-03-09T00:13:09.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.991+0000 7f008f01f700 1 --2- 192.168.123.103:0/3993876517 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f00880731c0 0x7f0088198dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:09.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.991+0000 7f008f01f700 1 --2- 192.168.123.103:0/3993876517 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f0088073ae0 0x7f0088199310 unknown :-1 s=CLOSED pgs=189 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:09.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.991+0000 7f008f01f700 1 -- 192.168.123.103:0/3993876517 >> 192.168.123.103:0/3993876517 conn(0x7f00880fc9b0 msgr2=0x7f0088107a00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:13:09.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.991+0000 7f008f01f700 1 -- 192.168.123.103:0/3993876517 shutdown_connections
2026-03-09T00:13:09.992 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:09.992+0000 7f008f01f700 1 -- 192.168.123.103:0/3993876517 wait complete.
2026-03-09T00:13:09.993 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 20
2026-03-09T00:13:10.052 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 21
2026-03-09T00:13:10.199 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:13:10.462 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.462+0000 7fad66f2d700 1 -- 192.168.123.103:0/4151682219 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad600690e0 msgr2=0x7fad60105b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:10.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.462+0000 7fad66f2d700 1 --2- 192.168.123.103:0/4151682219 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad600690e0 0x7fad60105b50 secure :-1 s=READY pgs=190 cs=0 l=1 rev1=1 crypto rx=0x7fad54009b00 tx=0x7fad54009e10 comp rx=0 tx=0).stop
2026-03-09T00:13:10.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.462+0000 7fad66f2d700 1 -- 192.168.123.103:0/4151682219 shutdown_connections
2026-03-09T00:13:10.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.462+0000 7fad66f2d700 1 --2- 192.168.123.103:0/4151682219 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad600690e0 0x7fad60105b50 unknown :-1 s=CLOSED pgs=190 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:10.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.462+0000 7fad66f2d700 1 --2- 192.168.123.103:0/4151682219 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fad60068730 0x7fad60068b10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:10.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.462+0000 7fad66f2d700 1 -- 192.168.123.103:0/4151682219 >> 192.168.123.103:0/4151682219 conn(0x7fad60075960 msgr2=0x7fad60075d70 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:13:10.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.463+0000 7fad66f2d700 1 -- 192.168.123.103:0/4151682219 shutdown_connections
2026-03-09T00:13:10.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.463+0000 7fad66f2d700 1 -- 192.168.123.103:0/4151682219 wait complete.
2026-03-09T00:13:10.463 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.463+0000 7fad66f2d700 1 Processor -- start
2026-03-09T00:13:10.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.463+0000 7fad66f2d700 1 -- start start
2026-03-09T00:13:10.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.464+0000 7fad66f2d700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad60068730 0x7fad60198e60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:10.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.464+0000 7fad66f2d700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fad600690e0 0x7fad601993a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:10.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.464+0000 7fad66f2d700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fad60199a80 con 0x7fad60068730
2026-03-09T00:13:10.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.464+0000 7fad66f2d700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fad6019d810 con 0x7fad600690e0
2026-03-09T00:13:10.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.464+0000 7fad5ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fad600690e0 0x7fad601993a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:10.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.464+0000 7fad5ffff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fad600690e0 0x7fad601993a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:37786/0 (socket says 192.168.123.103:37786)
2026-03-09T00:13:10.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.464+0000 7fad5ffff700 1 -- 192.168.123.103:0/3113095901 learned_addr learned my addr 192.168.123.103:0/3113095901 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:13:10.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.464+0000 7fad5ffff700 1 -- 192.168.123.103:0/3113095901 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad60068730 msgr2=0x7fad60198e60 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-09T00:13:10.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.464+0000 7fad64cc9700 1 --2- 192.168.123.103:0/3113095901 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad60068730 0x7fad60198e60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:10.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.465+0000 7fad5ffff700 1 --2- 192.168.123.103:0/3113095901 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad60068730 0x7fad60198e60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:10.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.465+0000 7fad5ffff700 1 -- 192.168.123.103:0/3113095901 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fad540097e0 con 0x7fad600690e0
2026-03-09T00:13:10.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.465+0000 7fad64cc9700 1 --2- 192.168.123.103:0/3113095901 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad60068730 0x7fad60198e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-09T00:13:10.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.465+0000 7fad5ffff700 1 --2- 192.168.123.103:0/3113095901 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fad600690e0 0x7fad601993a0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fad54009fd0 tx=0x7fad54004970 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:13:10.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.465+0000 7fad5dffb700 1 -- 192.168.123.103:0/3113095901 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fad5401d070 con 0x7fad600690e0
2026-03-09T00:13:10.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.465+0000 7fad66f2d700 1 -- 192.168.123.103:0/3113095901 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fad6019da90 con 0x7fad600690e0
2026-03-09T00:13:10.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.465+0000 7fad5dffb700 1 -- 192.168.123.103:0/3113095901 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fad54004b90 con 0x7fad600690e0
2026-03-09T00:13:10.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.465+0000 7fad5dffb700 1 -- 192.168.123.103:0/3113095901 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fad5400f700 con 0x7fad600690e0
2026-03-09T00:13:10.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.465+0000 7fad66f2d700 1 -- 192.168.123.103:0/3113095901 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fad6019df80 con 0x7fad600690e0
2026-03-09T00:13:10.466 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.466+0000 7fad477fe700 1 -- 192.168.123.103:0/3113095901 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fad400052f0 con 0x7fad600690e0
2026-03-09T00:13:10.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.467+0000 7fad5dffb700 1 -- 192.168.123.103:0/3113095901 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fad5400bc50 con 0x7fad600690e0
2026-03-09T00:13:10.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.468+0000 7fad5dffb700 1 --2- 192.168.123.103:0/3113095901 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fad480778c0 0x7fad48079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:10.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.468+0000 7fad64cc9700 1 --2- 192.168.123.103:0/3113095901 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fad480778c0 0x7fad48079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:10.468 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.468+0000 7fad5dffb700 1 -- 192.168.123.103:0/3113095901 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fad5409b9e0 con 0x7fad600690e0
2026-03-09T00:13:10.469 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.468+0000 7fad64cc9700 1 --2- 192.168.123.103:0/3113095901 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fad480778c0 0x7fad48079d80 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7fad50005d90 tx=0x7fad50005d00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:13:10.470 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.470+0000 7fad5dffb700 1 -- 192.168.123.103:0/3113095901 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fad54064190 con 0x7fad600690e0
2026-03-09T00:13:10.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:10 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/4122508635' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch
2026-03-09T00:13:10.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:10 vm03.local ceph-mon[129670]: pgmap v291: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail
2026-03-09T00:13:10.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:10 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/3993876517' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch
2026-03-09T00:13:10.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.611+0000 7fad477fe700 1 -- 192.168.123.103:0/3113095901 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 21, "format": "json"} v 0) v1 -- 0x7fad40005160 con 0x7fad600690e0
2026-03-09T00:13:10.612 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.612+0000 7fad5dffb700 1 -- 192.168.123.103:0/3113095901 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 21, "format": "json"}]=0 dumped fsmap epoch 21 v33) v1 ==== 107+0+4199 (secure 0 0 0) 0x7fad540638e0 con 0x7fad600690e0
2026-03-09T00:13:10.612 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:13:10.613 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":21,"btime":"2026-03-09T00:11:04:457203+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1090058295","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1090058295},{"type":"v1","addr":"192.168.123.106:6825","nonce":1090058295}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":9},{"gid":34404,"name":"cephfs.vm03.sejksk","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/784666836","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":784666836},{"type":"v1","addr":"192.168.123.103:6827","nonce":784666836}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":17}],"filesystems":[{"mdsmap":{"epoch":21,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:11:03.460045+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":107,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24287},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24287":{"gid":24287,"name":"cephfs.vm06.ixduim","rank":0,"incarnation":19,"state":"up:rejoin","state_seq":141,"addr":"192.168.123.106:6827/1001012017","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":1001012017},{"type":"v1","addr":"192.168.123.106:6827","nonce":1001012017}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]}
2026-03-09T00:13:10.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.615+0000 7fad477fe700 1 -- 192.168.123.103:0/3113095901 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fad480778c0 msgr2=0x7fad48079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:10.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.615+0000 7fad477fe700 1 --2- 192.168.123.103:0/3113095901 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fad480778c0 0x7fad48079d80 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7fad50005d90 tx=0x7fad50005d00 comp rx=0 tx=0).stop
2026-03-09T00:13:10.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.615+0000 7fad477fe700 1 -- 192.168.123.103:0/3113095901 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fad600690e0 msgr2=0x7fad601993a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:10.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.615+0000 7fad477fe700 1 --2- 192.168.123.103:0/3113095901 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fad600690e0 0x7fad601993a0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7fad54009fd0 tx=0x7fad54004970 comp rx=0 tx=0).stop
2026-03-09T00:13:10.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.615+0000 7fad477fe700 1 -- 192.168.123.103:0/3113095901 shutdown_connections
2026-03-09T00:13:10.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.615+0000 7fad477fe700 1 --2- 192.168.123.103:0/3113095901 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fad480778c0 0x7fad48079d80 secure :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7fad50005d90 tx=0x7fad50005d00 comp rx=0 tx=0).stop
2026-03-09T00:13:10.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.615+0000 7fad477fe700 1 --2- 192.168.123.103:0/3113095901 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fad60068730 0x7fad60198e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:10.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.615+0000 7fad477fe700 1 --2- 192.168.123.103:0/3113095901 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fad600690e0 0x7fad601993a0 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:10.615 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.615+0000 7fad477fe700 1 -- 192.168.123.103:0/3113095901 >> 192.168.123.103:0/3113095901 conn(0x7fad60075960 msgr2=0x7fad600feb00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:13:10.616 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.615+0000 7fad477fe700 1 -- 192.168.123.103:0/3113095901 shutdown_connections
2026-03-09T00:13:10.616 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:10.616+0000 7fad477fe700 1 -- 192.168.123.103:0/3113095901 wait complete.
2026-03-09T00:13:10.617 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 21
2026-03-09T00:13:10.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:10 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/4122508635' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch
2026-03-09T00:13:10.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:10 vm06.local ceph-mon[106218]: pgmap v291: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail
2026-03-09T00:13:10.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:10 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/3993876517' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch
2026-03-09T00:13:10.676 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 22
2026-03-09T00:13:10.821 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:13:11.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.069+0000 7fd8f596b700 1 -- 192.168.123.103:0/2690997560 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8f0103d70 msgr2=0x7fd8f0107dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:11.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.069+0000 7fd8f596b700 1 --2- 192.168.123.103:0/2690997560 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8f0103d70 0x7fd8f0107dc0 secure :-1 s=READY pgs=191 cs=0 l=1 rev1=1 crypto rx=0x7fd8d8009b00 tx=0x7fd8d8009e10 comp rx=0 tx=0).stop
2026-03-09T00:13:11.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.070+0000 7fd8f596b700 1 -- 192.168.123.103:0/2690997560 shutdown_connections
2026-03-09T00:13:11.070 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.070+0000 7fd8f596b700 1 --2- 192.168.123.103:0/2690997560 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8f0103d70 0x7fd8f0107dc0 unknown :-1 s=CLOSED pgs=191 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:11.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.070+0000 7fd8f596b700 1 --2- 192.168.123.103:0/2690997560 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd8f01033c0 0x7fd8f01037a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:11.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.070+0000 7fd8f596b700 1 -- 192.168.123.103:0/2690997560 >> 192.168.123.103:0/2690997560 conn(0x7fd8f00fec30 msgr2=0x7fd8f0101050 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:13:11.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.070+0000 7fd8f596b700 1 -- 192.168.123.103:0/2690997560 shutdown_connections
2026-03-09T00:13:11.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.071+0000 7fd8f596b700 1 -- 192.168.123.103:0/2690997560 wait complete.
2026-03-09T00:13:11.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.071+0000 7fd8f596b700 1 Processor -- start
2026-03-09T00:13:11.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.071+0000 7fd8f596b700 1 -- start start
2026-03-09T00:13:11.071 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.071+0000 7fd8f596b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8f01033c0 0x7fd8f0198e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:11.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.071+0000 7fd8f596b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd8f0103d70 0x7fd8f0199360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:11.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.071+0000 7fd8f596b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd8f0199a40 con 0x7fd8f01033c0
2026-03-09T00:13:11.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.071+0000 7fd8f596b700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd8f019d7d0 con 0x7fd8f0103d70
2026-03-09T00:13:11.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.071+0000 7fd8eeffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8f01033c0 0x7fd8f0198e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:11.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.071+0000 7fd8eeffd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8f01033c0 0x7fd8f0198e20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:55968/0 (socket says 192.168.123.103:55968)
2026-03-09T00:13:11.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.071+0000 7fd8eeffd700 1 -- 192.168.123.103:0/3803996159 learned_addr learned my addr 192.168.123.103:0/3803996159 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:13:11.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.072+0000 7fd8eeffd700 1 -- 192.168.123.103:0/3803996159 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd8f0103d70 msgr2=0x7fd8f0199360 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:11.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.072+0000 7fd8e65ff700 1 --2- 192.168.123.103:0/3803996159 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd8f0103d70 0x7fd8f0199360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:11.072 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.072+0000 7fd8eeffd700 1 --2- 192.168.123.103:0/3803996159 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd8f0103d70 0x7fd8f0199360 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:11.073 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.072+0000 7fd8eeffd700 1 -- 192.168.123.103:0/3803996159 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd8d80097e0 con 0x7fd8f01033c0
2026-03-09T00:13:11.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.072+0000 7fd8eeffd700 1 --2- 192.168.123.103:0/3803996159 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8f01033c0 0x7fd8f0198e20 secure :-1 s=READY pgs=192 cs=0 l=1 rev1=1 crypto rx=0x7fd8e000da40 tx=0x7fd8e000de00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:13:11.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.072+0000 7fd8f4969700 1 -- 192.168.123.103:0/3803996159 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd8e00049e0 con 0x7fd8f01033c0
2026-03-09T00:13:11.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.072+0000 7fd8f4969700 1 -- 192.168.123.103:0/3803996159 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fd8e0005500 con 0x7fd8f01033c0
2026-03-09T00:13:11.074 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.072+0000 7fd8f596b700 1 -- 192.168.123.103:0/3803996159 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd8f019dab0 con 0x7fd8f01033c0
2026-03-09T00:13:11.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.072+0000 7fd8f596b700 1 -- 192.168.123.103:0/3803996159 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd8f019e000 con 0x7fd8f01033c0
2026-03-09T00:13:11.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.074+0000 7fd8f4969700 1 -- 192.168.123.103:0/3803996159 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd8e0009d70 con 0x7fd8f01033c0
2026-03-09T00:13:11.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.074+0000 7fd8e65ff700 1 --2- 192.168.123.103:0/3803996159 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd8f0103d70 0x7fd8f0199360 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-09T00:13:11.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.074+0000 7fd8f4969700 1 -- 192.168.123.103:0/3803996159 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd8e0010460 con 0x7fd8f01033c0
2026-03-09T00:13:11.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.075+0000 7fd8f4969700 1 --2- 192.168.123.103:0/3803996159 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd8dc0778c0 0x7fd8dc079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:11.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.075+0000 7fd8f4969700 1 -- 192.168.123.103:0/3803996159 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fd8e0099e90 con 0x7fd8f01033c0
2026-03-09T00:13:11.075 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.075+0000 7fd8e65ff700 1 --2- 192.168.123.103:0/3803996159 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd8dc0778c0 0x7fd8dc079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:11.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.076+0000 7fd8e65ff700 1 --2- 192.168.123.103:0/3803996159 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd8dc0778c0 0x7fd8dc079d80 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7fd8d8006010 tx=0x7fd8d8005c00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:13:11.076 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.076+0000 7fd8f596b700 1 -- 192.168.123.103:0/3803996159 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd8d0005320 con 0x7fd8f01033c0
2026-03-09T00:13:11.079 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.079+0000 7fd8f4969700 1 -- 192.168.123.103:0/3803996159 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd8e00625c0 con 0x7fd8f01033c0
2026-03-09T00:13:11.217 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.216+0000 7fd8f596b700 1 -- 192.168.123.103:0/3803996159 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 22, "format": "json"} v 0) v1 -- 0x7fd8d0005190 con 0x7fd8f01033c0
2026-03-09T00:13:11.218 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.218+0000 7fd8f4969700 1 -- 192.168.123.103:0/3803996159 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 22, "format": "json"}]=0 dumped fsmap epoch 22 v33) v1 ==== 107+0+4208 (secure 0 0 0) 0x7fd8e0017020 con 0x7fd8f01033c0
2026-03-09T00:13:11.219 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:13:11.219 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":22,"btime":"2026-03-09T00:11:05:464111+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1090058295","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1090058295},{"type":"v1","addr":"192.168.123.106:6825","nonce":1090058295}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":9},{"gid":34404,"name":"cephfs.vm03.sejksk","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/784666836","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":784666836},{"type":"v1","addr":"192.168.123.103:6827","nonce":784666836}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":17}],"filesystems":[{"mdsmap":{"epoch":22,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:11:05.464110+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":107,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24287},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24287":{"gid":24287,"name":"cephfs.vm06.ixduim","rank":0,"incarnation":19,"state":"up:active","state_seq":142,"addr":"192.168.123.106:6827/1001012017","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":1001012017},{"type":"v1","addr":"192.168.123.106:6827","nonce":1001012017}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24287,"qdb_cluster":[24287]},"id":1}]}
2026-03-09T00:13:11.221 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.221+0000 7fd8f596b700 1 -- 192.168.123.103:0/3803996159 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd8dc0778c0 msgr2=0x7fd8dc079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:11.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.221+0000 7fd8f596b700 1 --2- 192.168.123.103:0/3803996159 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd8dc0778c0 0x7fd8dc079d80 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7fd8d8006010 tx=0x7fd8d8005c00 comp rx=0 tx=0).stop
2026-03-09T00:13:11.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.221+0000 7fd8f596b700 1 -- 192.168.123.103:0/3803996159 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8f01033c0 msgr2=0x7fd8f0198e20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:11.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.222+0000 7fd8f596b700 1 --2- 192.168.123.103:0/3803996159 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8f01033c0 0x7fd8f0198e20 secure :-1 s=READY pgs=192 cs=0 l=1 rev1=1 crypto rx=0x7fd8e000da40 tx=0x7fd8e000de00 comp rx=0 tx=0).stop
2026-03-09T00:13:11.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.222+0000 7fd8f596b700 1 -- 192.168.123.103:0/3803996159 shutdown_connections
2026-03-09T00:13:11.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.222+0000 7fd8f596b700 1 --2- 192.168.123.103:0/3803996159 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fd8dc0778c0 0x7fd8dc079d80 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:11.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.222+0000 7fd8f596b700 1 --2- 192.168.123.103:0/3803996159 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fd8f01033c0 0x7fd8f0198e20 unknown :-1 s=CLOSED pgs=192 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:11.222 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.222+0000 7fd8f596b700 1 --2- 192.168.123.103:0/3803996159 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fd8f0103d70 0x7fd8f0199360 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:11.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.222+0000 7fd8f596b700 1 -- 192.168.123.103:0/3803996159 >> 192.168.123.103:0/3803996159 conn(0x7fd8f00fec30 msgr2=0x7fd8f0107630 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:13:11.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.223+0000 7fd8f596b700 1 -- 192.168.123.103:0/3803996159 shutdown_connections
2026-03-09T00:13:11.223 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.223+0000 7fd8f596b700 1 -- 192.168.123.103:0/3803996159 wait complete.
2026-03-09T00:13:11.224 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 22
2026-03-09T00:13:11.283 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 23
2026-03-09T00:13:11.432 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:13:11.477 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:11 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/3113095901' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch
2026-03-09T00:13:11.477 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:11 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/3803996159' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch
2026-03-09T00:13:11.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:11 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/3113095901' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch
2026-03-09T00:13:11.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:11 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/3803996159' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch
2026-03-09T00:13:11.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.671+0000 7fe152e67700 1 -- 192.168.123.103:0/759701605 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe14c068730 msgr2=0x7fe14c068b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:11.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.671+0000 7fe152e67700 1 --2- 192.168.123.103:0/759701605 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe14c068730 0x7fe14c068b10 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7fe13c009b00 tx=0x7fe13c009e10 comp rx=0 tx=0).stop
2026-03-09T00:13:11.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.672+0000 7fe152e67700 1 -- 192.168.123.103:0/759701605 shutdown_connections
2026-03-09T00:13:11.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.672+0000 7fe152e67700 1 --2- 192.168.123.103:0/759701605 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe14c0690e0 0x7fe14c105b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:11.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.672+0000 7fe152e67700 1 --2- 192.168.123.103:0/759701605 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe14c068730 0x7fe14c068b10 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:11.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.672+0000 7fe152e67700 1 -- 192.168.123.103:0/759701605 >> 192.168.123.103:0/759701605 conn(0x7fe14c075960 msgr2=0x7fe14c075d70 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:13:11.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.672+0000 7fe152e67700 1 -- 192.168.123.103:0/759701605 shutdown_connections
2026-03-09T00:13:11.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.672+0000 7fe152e67700 1 -- 192.168.123.103:0/759701605 wait complete.
2026-03-09T00:13:11.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.672+0000 7fe152e67700 1 Processor -- start
2026-03-09T00:13:11.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.672+0000 7fe152e67700 1 -- start start
2026-03-09T00:13:11.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.673+0000 7fe152e67700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe14c068730 0x7fe14c1939f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:11.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.673+0000 7fe152e67700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe14c0690e0 0x7fe14c193f30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:11.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.673+0000 7fe152e67700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe14c197b80 con 0x7fe14c0690e0
2026-03-09T00:13:11.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.673+0000 7fe152e67700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe14c194470 con 0x7fe14c068730
2026-03-09T00:13:11.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.673+0000 7fe150c03700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe14c068730 0x7fe14c1939f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:11.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.673+0000 7fe150c03700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe14c068730 0x7fe14c1939f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:37830/0 (socket says 192.168.123.103:37830)
2026-03-09T00:13:11.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.673+0000 7fe150c03700 1 -- 192.168.123.103:0/467423368 learned_addr learned my addr 192.168.123.103:0/467423368 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:13:11.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.673+0000 7fe14bfff700 1 --2- 192.168.123.103:0/467423368 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe14c0690e0 0x7fe14c193f30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:11.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.673+0000 7fe150c03700 1 -- 192.168.123.103:0/467423368 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe14c0690e0 msgr2=0x7fe14c193f30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:11.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.673+0000 7fe150c03700 1 --2- 192.168.123.103:0/467423368 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe14c0690e0 0x7fe14c193f30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:11.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.673+0000 7fe150c03700 1 -- 192.168.123.103:0/467423368 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe140009710 con 0x7fe14c068730
2026-03-09T00:13:11.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.673+0000 7fe14bfff700 1 --2- 192.168.123.103:0/467423368 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe14c0690e0 0x7fe14c193f30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-09T00:13:11.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.674+0000 7fe150c03700 1 --2- 192.168.123.103:0/467423368 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe14c068730 0x7fe14c1939f0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7fe13c00c010 tx=0x7fe13c00bab0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:13:11.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.674+0000 7fe149ffb700 1 -- 192.168.123.103:0/467423368 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe13c01d070 con 0x7fe14c068730
2026-03-09T00:13:11.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.674+0000 7fe152e67700 1 -- 192.168.123.103:0/467423368 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe13c0097e0 con 0x7fe14c068730
2026-03-09T00:13:11.674 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.674+0000 7fe152e67700 1 -- 192.168.123.103:0/467423368 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe14c1a7530 con 0x7fe14c068730
2026-03-09T00:13:11.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.674+0000 7fe149ffb700 1 -- 192.168.123.103:0/467423368 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe13c00f460 con 0x7fe14c068730
2026-03-09T00:13:11.675 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.675+0000 7fe149ffb700 1 -- 192.168.123.103:0/467423368 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe13c021620 con 0x7fe14c068730
2026-03-09T00:13:11.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.675+0000 7fe149ffb700 1 -- 192.168.123.103:0/467423368 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe13c021800 con 0x7fe14c068730
2026-03-09T00:13:11.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.675+0000 7fe152e67700 1 -- 192.168.123.103:0/467423368 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe14c04ea90 con 0x7fe14c068730
2026-03-09T00:13:11.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.676+0000 7fe149ffb700 1 --2- 192.168.123.103:0/467423368 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe134077660 0x7fe134079b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:11.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.676+0000 7fe149ffb700 1 -- 192.168.123.103:0/467423368 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fe13c09ac40 con 0x7fe14c068730
2026-03-09T00:13:11.676 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.676+0000 7fe14bfff700 1 --2- 192.168.123.103:0/467423368 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe134077660 0x7fe134079b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:11.677 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.676+0000 7fe14bfff700 1 --2- 192.168.123.103:0/467423368 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe134077660 0x7fe134079b20 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fe14c1951e0 tx=0x7fe140009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:13:11.679 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.678+0000 7fe149ffb700 1 -- 192.168.123.103:0/467423368 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe13c063420 con 0x7fe14c068730
2026-03-09T00:13:11.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.819+0000 7fe152e67700 1 -- 192.168.123.103:0/467423368 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 23, "format": "json"} v 0) v1 -- 0x7fe14c194f70 con 0x7fe14c068730
2026-03-09T00:13:11.822 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:13:11.823 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":23,"btime":"2026-03-09T00:11:07:708427+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1090058295","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1090058295},{"type":"v1","addr":"192.168.123.106:6825","nonce":1090058295}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":9},{"gid":34404,"name":"cephfs.vm03.sejksk","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/784666836","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":784666836},{"type":"v1","addr":"192.168.123.103:6827","nonce":784666836}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":17},{"gid":34412,"name":"cephfs.vm03.ralade","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1027317762","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1027317762},{"type":"v1","addr":"192.168.123.103:6829","nonce":1027317762}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":23}],"filesystems":[{"mdsmap":{"epoch":22,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:11:05.464110+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":107,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24287},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24287":{"gid":24287,"name":"cephfs.vm06.ixduim","rank":0,"incarnation":19,"state":"up:active","state_seq":142,"addr":"192.168.123.106:6827/1001012017","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":1001012017},{"type":"v1","addr":"192.168.123.106:6827","nonce":1001012017}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24287,"qdb_cluster":[24287]},"id":1}]}
2026-03-09T00:13:11.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.821+0000 7fe149ffb700 1 -- 192.168.123.103:0/467423368 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 23, "format": "json"}]=0 dumped fsmap epoch 23 v33) v1 ==== 107+0+5059 (secure 0 0 0) 0x7fe13c062b70 con 0x7fe14c068730
2026-03-09T00:13:11.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.824+0000 7fe152e67700 1 -- 192.168.123.103:0/467423368 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe134077660 msgr2=0x7fe134079b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:11.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.824+0000 7fe152e67700 1 --2- 192.168.123.103:0/467423368 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe134077660 0x7fe134079b20 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fe14c1951e0 tx=0x7fe140009450 comp rx=0 tx=0).stop
2026-03-09T00:13:11.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.824+0000 7fe152e67700 1 -- 192.168.123.103:0/467423368 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe14c068730 msgr2=0x7fe14c1939f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:11.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.824+0000 7fe152e67700 1 --2- 192.168.123.103:0/467423368 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe14c068730 0x7fe14c1939f0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7fe13c00c010 tx=0x7fe13c00bab0 comp rx=0 tx=0).stop
2026-03-09T00:13:11.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.825+0000 7fe152e67700 1 -- 192.168.123.103:0/467423368 shutdown_connections
2026-03-09T00:13:11.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.825+0000 7fe152e67700 1 --2- 192.168.123.103:0/467423368 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe134077660 0x7fe134079b20 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:11.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.825+0000 7fe152e67700 1 --2- 192.168.123.103:0/467423368 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe14c068730 0x7fe14c1939f0 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:11.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.825+0000 7fe152e67700 1 --2- 192.168.123.103:0/467423368 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe14c0690e0 0x7fe14c193f30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:11.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.825+0000 7fe152e67700 1 -- 192.168.123.103:0/467423368 >> 192.168.123.103:0/467423368 conn(0x7fe14c075960 msgr2=0x7fe14c1040f0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:13:11.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.825+0000 7fe152e67700 1 -- 192.168.123.103:0/467423368 shutdown_connections
2026-03-09T00:13:11.825 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:11.825+0000 7fe152e67700 1 -- 192.168.123.103:0/467423368 wait complete.
2026-03-09T00:13:11.826 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 23
2026-03-09T00:13:11.889 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 24
2026-03-09T00:13:12.033 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:13:12.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.299+0000 7f8600024700 1 -- 192.168.123.103:0/1935072390 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f8068df0 msgr2=0x7f85f810d5b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:12.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.299+0000 7f8600024700 1 --2- 192.168.123.103:0/1935072390 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f8068df0 0x7f85f810d5b0 secure :-1 s=READY pgs=193 cs=0 l=1 rev1=1 crypto rx=0x7f85e8009b30 tx=0x7f85e8009e40 comp rx=0 tx=0).stop
2026-03-09T00:13:12.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.300+0000 7f8600024700 1 -- 192.168.123.103:0/1935072390 shutdown_connections
2026-03-09T00:13:12.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.300+0000 7f8600024700 1 --2- 192.168.123.103:0/1935072390 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f8068df0 0x7f85f810d5b0 unknown :-1 s=CLOSED pgs=193 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:12.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.300+0000 7f8600024700 1 --2- 192.168.123.103:0/1935072390 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85f80684d0 0x7f85f80688b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:12.300 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.300+0000 7f8600024700 1 -- 192.168.123.103:0/1935072390 >> 192.168.123.103:0/1935072390 conn(0x7f85f8075960 msgr2=0x7f85f8075d70 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:13:12.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.300+0000 7f8600024700 1 -- 192.168.123.103:0/1935072390 shutdown_connections
2026-03-09T00:13:12.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.300+0000 7f8600024700 1 -- 192.168.123.103:0/1935072390 wait complete.
2026-03-09T00:13:12.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.301+0000 7f8600024700 1 Processor -- start
2026-03-09T00:13:12.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.301+0000 7f8600024700 1 -- start start
2026-03-09T00:13:12.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.301+0000 7f8600024700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f80684d0 0x7f85f8198de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:12.301 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.301+0000 7f8600024700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85f8068df0 0x7f85f8199320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:12.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.301+0000 7f8600024700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85f8199a00 con 0x7f85f80684d0
2026-03-09T00:13:12.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.301+0000 7f8600024700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85f819d790 con 0x7f85f8068df0
2026-03-09T00:13:12.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.301+0000 7f85fd5bf700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85f8068df0 0x7f85f8199320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:12.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.301+0000 7f85fd5bf700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85f8068df0 0x7f85f8199320 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:37846/0 (socket says 192.168.123.103:37846)
2026-03-09T00:13:12.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.301+0000 7f85fd5bf700 1 -- 192.168.123.103:0/2492390515 learned_addr learned my addr 192.168.123.103:0/2492390515 (peer_addr_for_me v2:192.168.123.103:0/0)
2026-03-09T00:13:12.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.301+0000 7f85fddc0700 1 --2- 192.168.123.103:0/2492390515 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f80684d0 0x7f85f8198de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:12.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.301+0000 7f85fd5bf700 1 -- 192.168.123.103:0/2492390515 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f80684d0 msgr2=0x7f85f8198de0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:12.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.302+0000 7f85fd5bf700 1 --2- 192.168.123.103:0/2492390515 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f80684d0 0x7f85f8198de0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:12.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.302+0000 7f85fd5bf700 1 -- 192.168.123.103:0/2492390515 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f85e80097e0 con 0x7f85f8068df0
2026-03-09T00:13:12.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.302+0000 7f85fddc0700 1 --2- 192.168.123.103:0/2492390515 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f80684d0 0x7f85f8198de0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed!
2026-03-09T00:13:12.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.302+0000 7f85fd5bf700 1 --2- 192.168.123.103:0/2492390515 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85f8068df0 0x7f85f8199320 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f85e8006010 tx=0x7f85e8004af0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:13:12.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.302+0000 7f85eeffd700 1 -- 192.168.123.103:0/2492390515 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f85e801d070 con 0x7f85f8068df0
2026-03-09T00:13:12.302 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.302+0000 7f8600024700 1 -- 192.168.123.103:0/2492390515 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f85f819da70 con 0x7f85f8068df0
2026-03-09T00:13:12.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.302+0000 7f8600024700 1 -- 192.168.123.103:0/2492390515 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f85f819dfc0 con 0x7f85f8068df0
2026-03-09T00:13:12.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.303+0000 7f85eeffd700 1 -- 192.168.123.103:0/2492390515 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f85e800bc10 con 0x7f85f8068df0
2026-03-09T00:13:12.303 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.303+0000 7f85eeffd700 1 -- 192.168.123.103:0/2492390515 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f85e800f670 con 0x7f85f8068df0
2026-03-09T00:13:12.304 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.303+0000 7f85ecff9700 1 -- 192.168.123.103:0/2492390515 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f85f804ea90 con 0x7f85f8068df0
2026-03-09T00:13:12.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.304+0000 7f85eeffd700 1 -- 192.168.123.103:0/2492390515 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f85e8022470 con 0x7f85f8068df0
2026-03-09T00:13:12.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.305+0000 7f85eeffd700 1 --2- 192.168.123.103:0/2492390515 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f85e40778e0 0x7f85e4079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-09T00:13:12.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.305+0000 7f85eeffd700 1 -- 192.168.123.103:0/2492390515 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f85e809b370 con 0x7f85f8068df0
2026-03-09T00:13:12.305 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.305+0000 7f85fddc0700 1 --2- 192.168.123.103:0/2492390515 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f85e40778e0 0x7f85e4079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-09T00:13:12.306 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.306+0000 7f85fddc0700 1 --2- 192.168.123.103:0/2492390515 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f85e40778e0 0x7f85e4079da0 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f85f4007900 tx=0x7f85f4008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-09T00:13:12.307 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.307+0000 7f85eeffd700 1 -- 192.168.123.103:0/2492390515 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f85e8063bd0 con 0x7f85f8068df0
2026-03-09T00:13:12.451 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.450+0000 7f85ecff9700 1 -- 192.168.123.103:0/2492390515 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 24, "format": "json"} v 0) v1 -- 0x7f85f8066e80 con 0x7f85f8068df0
2026-03-09T00:13:12.452 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.451+0000 7f85eeffd700 1 -- 192.168.123.103:0/2492390515 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 24, "format": "json"}]=0 dumped fsmap epoch 24 v33) v1 ==== 107+0+4277 (secure 0 0 0) 0x7f85e8005c00 con 0x7f85f8068df0
2026-03-09T00:13:12.452 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:13:12.452 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":24,"btime":"2026-03-09T00:11:14:709555+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34404,"name":"cephfs.vm03.sejksk","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/784666836","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":784666836},{"type":"v1","addr":"192.168.123.103:6827","nonce":784666836}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":17},{"gid":34412,"name":"cephfs.vm03.ralade","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1027317762","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1027317762},{"type":"v1","addr":"192.168.123.103:6829","nonce":1027317762}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":23}],"filesystems":[{"mdsmap":{"epoch":22,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:11:05.464110+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":107,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24287},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24287":{"gid":24287,"name":"cephfs.vm06.ixduim","rank":0,"incarnation":19,"state":"up:active","state_seq":142,"addr":"192.168.123.106:6827/1001012017","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":1001012017},{"type":"v1","addr":"192.168.123.106:6827","nonce":1001012017}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24287,"qdb_cluster":[24287]},"id":1}]}
2026-03-09T00:13:12.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.454+0000 7f85ecff9700 1 -- 192.168.123.103:0/2492390515 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f85e40778e0 msgr2=0x7f85e4079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:12.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.454+0000 7f85ecff9700 1 --2- 192.168.123.103:0/2492390515 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f85e40778e0 0x7f85e4079da0 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f85f4007900 tx=0x7f85f4008040 comp rx=0 tx=0).stop
2026-03-09T00:13:12.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.454+0000 7f85ecff9700 1 -- 192.168.123.103:0/2492390515 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85f8068df0 msgr2=0x7f85f8199320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:12.454 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.454+0000 7f85ecff9700 1 --2- 192.168.123.103:0/2492390515 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85f8068df0 0x7f85f8199320 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f85e8006010 tx=0x7f85e8004af0 comp rx=0 tx=0).stop
2026-03-09T00:13:12.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.454+0000 7f85ecff9700 1 -- 192.168.123.103:0/2492390515 shutdown_connections
2026-03-09T00:13:12.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.454+0000 7f85ecff9700 1 --2- 192.168.123.103:0/2492390515 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f85e40778e0 0x7f85e4079da0 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:12.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.455+0000 7f85ecff9700 1 --2- 192.168.123.103:0/2492390515 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f85f80684d0 0x7f85f8198de0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:12.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.455+0000 7f85ecff9700 1 --2- 192.168.123.103:0/2492390515 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f85f8068df0 0x7f85f8199320 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:12.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.455+0000 7f85ecff9700 1 -- 192.168.123.103:0/2492390515 >> 192.168.123.103:0/2492390515 conn(0x7f85f8075960 msgr2=0x7f85f80fe970 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:13:12.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.455+0000 7f85ecff9700 1 -- 192.168.123.103:0/2492390515 shutdown_connections
2026-03-09T00:13:12.455 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.455+0000 7f85ecff9700 1 -- 192.168.123.103:0/2492390515 wait complete.
2026-03-09T00:13:12.456 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 24
2026-03-09T00:13:12.502 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 25
2026-03-09T00:13:12.648 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:13:12.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:12 vm06.local ceph-mon[106218]: pgmap v292: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail
2026-03-09T00:13:12.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:12 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/467423368' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch
2026-03-09T00:13:12.672 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:12 vm03.local ceph-mon[129670]: pgmap v292: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail
2026-03-09T00:13:12.672 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:12 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/467423368' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch
2026-03-09T00:13:12.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.896+0000 7fc4f12f9700 1 -- 192.168.123.103:0/737205898 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4ec0739d0 msgr2=0x7fc4ec10d1f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:12.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.896+0000 7fc4f12f9700 1 --2- 192.168.123.103:0/737205898 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4ec0739d0 0x7fc4ec10d1f0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7fc4dc009b00 tx=0x7fc4dc009e10 comp rx=0 tx=0).stop
2026-03-09T00:13:12.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.897+0000 7fc4f12f9700 1 -- 192.168.123.103:0/737205898 shutdown_connections
2026-03-09T00:13:12.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.897+0000 7fc4f12f9700 1 --2- 192.168.123.103:0/737205898 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4ec0739d0 0x7fc4ec10d1f0 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:12.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.897+0000 7fc4f12f9700 1 --2- 192.168.123.103:0/737205898 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc4ec0730b0 0x7fc4ec073490 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:12.898 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.897+0000 7fc4f12f9700 1 -- 192.168.123.103:0/737205898 >> 192.168.123.103:0/737205898 conn(0x7fc4ec0fc920 msgr2=0x7fc4ec0fed40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:13:12.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.898+0000 7fc4f12f9700 1 -- 192.168.123.103:0/737205898 shutdown_connections
2026-03-09T00:13:12.899 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.898+0000 7fc4f12f9700 1 -- 192.168.123.103:0/737205898 wait complete.
2026-03-09T00:13:12.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.899+0000 7fc4f12f9700 1 Processor -- start 2026-03-09T00:13:12.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.900+0000 7fc4f12f9700 1 -- start start 2026-03-09T00:13:12.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.900+0000 7fc4f12f9700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4ec0730b0 0x7fc4ec100e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:12.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.900+0000 7fc4f12f9700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc4ec0739d0 0x7fc4ec101390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:12.900 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.900+0000 7fc4ea7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc4ec0739d0 0x7fc4ec101390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:12.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.900+0000 7fc4ea7fc700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc4ec0739d0 0x7fc4ec101390 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56018/0 (socket says 192.168.123.103:56018) 2026-03-09T00:13:12.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.900+0000 7fc4ea7fc700 1 -- 192.168.123.103:0/2625676908 learned_addr learned my addr 192.168.123.103:0/2625676908 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:12.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.900+0000 7fc4f12f9700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc4ec104f90 con 0x7fc4ec0739d0 2026-03-09T00:13:12.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.900+0000 7fc4f12f9700 1 -- 192.168.123.103:0/2625676908 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc4ec1018d0 con 0x7fc4ec0730b0 2026-03-09T00:13:12.901 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.901+0000 7fc4ea7fc700 1 -- 192.168.123.103:0/2625676908 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4ec0730b0 msgr2=0x7fc4ec100e50 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:13:12.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.901+0000 7fc4ea7fc700 1 --2- 192.168.123.103:0/2625676908 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4ec0730b0 0x7fc4ec100e50 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:12.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.901+0000 7fc4ea7fc700 1 -- 192.168.123.103:0/2625676908 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc4d4009710 con 0x7fc4ec0739d0 2026-03-09T00:13:12.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.901+0000 7fc4ea7fc700 1 --2- 192.168.123.103:0/2625676908 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc4ec0739d0 0x7fc4ec101390 secure :-1 s=READY pgs=194 cs=0 l=1 rev1=1 crypto rx=0x7fc4dc00b5c0 tx=0x7fc4dc00bab0 comp rx=0 
tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:12.902 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.901+0000 7fc4e3fff700 1 -- 192.168.123.103:0/2625676908 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc4dc01d070 con 0x7fc4ec0739d0 2026-03-09T00:13:12.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.902+0000 7fc4f12f9700 1 -- 192.168.123.103:0/2625676908 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc4dc0097e0 con 0x7fc4ec0739d0 2026-03-09T00:13:12.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.902+0000 7fc4f12f9700 1 -- 192.168.123.103:0/2625676908 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc4ec1a7540 con 0x7fc4ec0739d0 2026-03-09T00:13:12.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.902+0000 7fc4e3fff700 1 -- 192.168.123.103:0/2625676908 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fc4dc00f460 con 0x7fc4ec0739d0 2026-03-09T00:13:12.903 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.902+0000 7fc4e3fff700 1 -- 192.168.123.103:0/2625676908 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc4dc021600 con 0x7fc4ec0739d0 2026-03-09T00:13:12.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.903+0000 7fc4f12f9700 1 -- 192.168.123.103:0/2625676908 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc4ec10a8f0 con 0x7fc4ec0739d0 2026-03-09T00:13:12.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.903+0000 7fc4e3fff700 1 -- 192.168.123.103:0/2625676908 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc4dc00fab0 con 0x7fc4ec0739d0 2026-03-09T00:13:12.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.904+0000 7fc4e3fff700 1 --2- 192.168.123.103:0/2625676908 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc4d80778c0 0x7fc4d8079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:12.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.904+0000 7fc4e3fff700 1 -- 192.168.123.103:0/2625676908 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fc4dc09ae80 con 0x7fc4ec0739d0 2026-03-09T00:13:12.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.905+0000 7fc4eaffd700 1 --2- 192.168.123.103:0/2625676908 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc4d80778c0 0x7fc4d8079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:12.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.905+0000 7fc4eaffd700 1 --2- 192.168.123.103:0/2625676908 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc4d80778c0 0x7fc4d8079d80 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7fc4d4009ee0 tx=0x7fc4d4009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:12.907 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:12.906+0000 7fc4e3fff700 1 -- 
192.168.123.103:0/2625676908 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc4dc0635b0 con 0x7fc4ec0739d0 2026-03-09T00:13:13.051 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.050+0000 7fc4f12f9700 1 -- 192.168.123.103:0/2625676908 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 25, "format": "json"} v 0) v1 -- 0x7fc4ec04ea90 con 0x7fc4ec0739d0 2026-03-09T00:13:13.052 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.051+0000 7fc4e3fff700 1 -- 192.168.123.103:0/2625676908 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 25, "format": "json"}]=0 dumped fsmap epoch 25 v33) v1 ==== 107+0+5128 (secure 0 0 0) 0x7fc4dc062d00 con 0x7fc4ec0739d0 2026-03-09T00:13:13.053 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:13.053 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":25,"btime":"2026-03-09T00:11:17:914480+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34404,"name":"cephfs.vm03.sejksk","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/784666836","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":784666836},{"type":"v1","addr":"192.168.123.103:6827","nonce":784666836}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":17},{"gid":34412,"name":"cephfs.vm03.ralade","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1027317762","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1027317762},{"type":"v1","addr":"192.168.123.103:6829","nonce":1027317762}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":23},{"gid":44297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1987865018","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1987865018},{"type":"v1","addr":"192.168.123.106:6825","nonce":1987865018}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":22,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:11:05.464110+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":107,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24287},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24287":{"gid":24287,"name":"cephfs.vm06.ixduim","rank":0,"incarnation":19,"state":"up:active","state_seq":142,"addr":"192.168.123.106:6827/1001012017","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":1001012017},{"type":"v1","addr":"192.168.123.106:6827","nonce":1001012017}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24287,"qdb_cluster":[24287]},"id":1}]} 2026-03-09T00:13:13.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.055+0000 7fc4f12f9700 1 -- 192.168.123.103:0/2625676908 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc4d80778c0 msgr2=0x7fc4d8079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:13.055 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.055+0000 7fc4f12f9700 1 --2- 192.168.123.103:0/2625676908 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc4d80778c0 0x7fc4d8079d80 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto 
rx=0x7fc4d4009ee0 tx=0x7fc4d4009450 comp rx=0 tx=0).stop 2026-03-09T00:13:13.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.056+0000 7fc4f12f9700 1 -- 192.168.123.103:0/2625676908 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc4ec0739d0 msgr2=0x7fc4ec101390 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:13.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.056+0000 7fc4f12f9700 1 --2- 192.168.123.103:0/2625676908 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc4ec0739d0 0x7fc4ec101390 secure :-1 s=READY pgs=194 cs=0 l=1 rev1=1 crypto rx=0x7fc4dc00b5c0 tx=0x7fc4dc00bab0 comp rx=0 tx=0).stop 2026-03-09T00:13:13.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.056+0000 7fc4f12f9700 1 -- 192.168.123.103:0/2625676908 shutdown_connections 2026-03-09T00:13:13.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.056+0000 7fc4f12f9700 1 --2- 192.168.123.103:0/2625676908 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fc4d80778c0 0x7fc4d8079d80 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:13.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.056+0000 7fc4f12f9700 1 --2- 192.168.123.103:0/2625676908 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fc4ec0730b0 0x7fc4ec100e50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:13.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.056+0000 7fc4f12f9700 1 --2- 192.168.123.103:0/2625676908 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fc4ec0739d0 0x7fc4ec101390 unknown :-1 s=CLOSED pgs=194 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:13.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.056+0000 7fc4f12f9700 1 -- 192.168.123.103:0/2625676908 >> 192.168.123.103:0/2625676908 conn(0x7fc4ec0fc920 msgr2=0x7fc4ec107a30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:13.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.056+0000 7fc4f12f9700 1 -- 192.168.123.103:0/2625676908 shutdown_connections 2026-03-09T00:13:13.056 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.056+0000 7fc4f12f9700 1 -- 192.168.123.103:0/2625676908 wait complete. 
2026-03-09T00:13:13.057 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 25 2026-03-09T00:13:13.123 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 26 2026-03-09T00:13:13.280 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:13.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.521+0000 7fe0a0b36700 1 -- 192.168.123.103:0/1540472266 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe09c0684d0 msgr2=0x7fe09c0688b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:13.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.521+0000 7fe09959a700 1 -- 192.168.123.103:0/1540472266 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe08400ba00 con 0x7fe09c0684d0 2026-03-09T00:13:13.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.521+0000 7fe0a0b36700 1 --2- 192.168.123.103:0/1540472266 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe09c0684d0 0x7fe09c0688b0 secure :-1 s=READY pgs=195 cs=0 l=1 rev1=1 crypto rx=0x7fe084009b30 tx=0x7fe084009e40 comp rx=0 tx=0).stop 2026-03-09T00:13:13.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.523+0000 7fe0a0b36700 1 -- 192.168.123.103:0/1540472266 shutdown_connections 2026-03-09T00:13:13.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.523+0000 7fe0a0b36700 1 --2- 192.168.123.103:0/1540472266 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe09c068df0 0x7fe09c10d5b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:13.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.523+0000 7fe0a0b36700 1 --2- 192.168.123.103:0/1540472266 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe09c0684d0 0x7fe09c0688b0 unknown :-1 s=CLOSED pgs=195 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:13.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.523+0000 7fe0a0b36700 1 -- 192.168.123.103:0/1540472266 >> 192.168.123.103:0/1540472266 conn(0x7fe09c075960 msgr2=0x7fe09c075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:13.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.523+0000 7fe0a0b36700 1 -- 192.168.123.103:0/1540472266 shutdown_connections 2026-03-09T00:13:13.523 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.523+0000 7fe0a0b36700 1 -- 192.168.123.103:0/1540472266 wait complete. 
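Each cycle above is the task re-running `sudo cephadm shell --fsid ... -- ceph fs dump --format=json <epoch>` and reading the compact JSON fsmap back from stdout until the monitor reports the next epoch. A minimal sketch of that polling loop, assuming a hypothetical `fs_dump()` wrapper around `cephadm shell` (the real suite drives the command remotely through teuthology.orchestra, not subprocess):

```python
import json
import subprocess
import time

def fs_dump(fsid: str, epoch: int) -> dict:
    # Hypothetical stand-in for the "cephadm shell" invocation seen in
    # this log; returns the parsed fsmap JSON from stdout.
    out = subprocess.check_output([
        "sudo", "cephadm", "shell", "--fsid", fsid, "--",
        "ceph", "fs", "dump", "--format=json", str(epoch),
    ])
    return json.loads(out)

def wait_for_next_epoch(fsid: str, last_epoch: int, timeout: float = 300.0) -> dict:
    # Poll until the reported fsmap epoch advances past last_epoch,
    # mirroring the repeated "fs dump" commands in this log.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        dump = fs_dump(fsid, last_epoch + 1)
        if dump["epoch"] > last_epoch:
            return dump
        time.sleep(1)
    raise TimeoutError(f"fsmap did not advance past epoch {last_epoch}")
```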
2026-03-09T00:13:13.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.523+0000 7fe0a0b36700 1 Processor -- start 2026-03-09T00:13:13.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.524+0000 7fe0a0b36700 1 -- start start 2026-03-09T00:13:13.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.524+0000 7fe0a0b36700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe09c0684d0 0x7fe09c198d90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:13.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.524+0000 7fe0a0b36700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe09c068df0 0x7fe09c1992d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:13.524 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.524+0000 7fe0a0b36700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe09c1999b0 con 0x7fe09c0684d0 2026-03-09T00:13:13.525 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.524+0000 7fe0a0b36700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe09c19d740 con 0x7fe09c068df0 2026-03-09T00:13:13.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.524+0000 7fe09a59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe09c0684d0 0x7fe09c198d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:13.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.524+0000 7fe09a59c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe09c0684d0 0x7fe09c198d90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56048/0 (socket says 192.168.123.103:56048) 2026-03-09T00:13:13.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.524+0000 7fe09a59c700 1 -- 192.168.123.103:0/2277564767 learned_addr learned my addr 192.168.123.103:0/2277564767 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:13.526 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.524+0000 7fe09a59c700 1 -- 192.168.123.103:0/2277564767 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe09c068df0 msgr2=0x7fe09c1992d0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T00:13:13.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.524+0000 7fe09a59c700 1 --2- 192.168.123.103:0/2277564767 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe09c068df0 0x7fe09c1992d0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:13.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.524+0000 7fe09a59c700 1 -- 192.168.123.103:0/2277564767 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe0840097e0 con 0x7fe09c0684d0 2026-03-09T00:13:13.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.524+0000 7fe09a59c700 1 --2- 192.168.123.103:0/2277564767 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe09c0684d0 0x7fe09c198d90 secure :-1 s=READY pgs=196 cs=0 l=1 rev1=1 crypto rx=0x7fe084005b40 tx=0x7fe0840049e0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:13.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.525+0000 7fe0937fe700 1 -- 192.168.123.103:0/2277564767 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe08401d070 con 0x7fe09c0684d0 2026-03-09T00:13:13.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.525+0000 7fe0937fe700 1 -- 192.168.123.103:0/2277564767 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe08400bc10 con 0x7fe09c0684d0 2026-03-09T00:13:13.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.525+0000 7fe0937fe700 1 -- 192.168.123.103:0/2277564767 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe08400f650 con 0x7fe09c0684d0 2026-03-09T00:13:13.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.525+0000 7fe0a0b36700 1 -- 192.168.123.103:0/2277564767 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe09c19da20 con 0x7fe09c0684d0 2026-03-09T00:13:13.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.525+0000 7fe0a0b36700 1 -- 192.168.123.103:0/2277564767 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe09c19df70 con 0x7fe09c0684d0 2026-03-09T00:13:13.528 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.526+0000 7fe0937fe700 1 -- 192.168.123.103:0/2277564767 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe08400f7b0 con 0x7fe09c0684d0 2026-03-09T00:13:13.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.528+0000 7fe0a0b36700 1 -- 192.168.123.103:0/2277564767 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe09c10ad20 con 0x7fe09c0684d0 2026-03-09T00:13:13.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.528+0000 7fe0937fe700 1 --2- 192.168.123.103:0/2277564767 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe0880779e0 0x7fe088079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:13.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.529+0000 7fe099d9b700 1 --2- 192.168.123.103:0/2277564767 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe0880779e0 0x7fe088079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:13.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.529+0000 7fe0937fe700 1 -- 192.168.123.103:0/2277564767 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fe08409c1c0 con 0x7fe09c0684d0 2026-03-09T00:13:13.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.529+0000 7fe099d9b700 1 --2- 192.168.123.103:0/2277564767 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe0880779e0 0x7fe088079ea0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fe09c19a3b0 tx=0x7fe08c006cb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:13.531 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.531+0000 7fe0937fe700 1 -- 192.168.123.103:0/2277564767 <== mon.0 
v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe084064a40 con 0x7fe09c0684d0 2026-03-09T00:13:13.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:13 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2492390515' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-09T00:13:13.588 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:13 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2625676908' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-09T00:13:13.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.668+0000 7fe0a0b36700 1 -- 192.168.123.103:0/2277564767 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 26, "format": "json"} v 0) v1 -- 0x7fe09c19a0f0 con 0x7fe09c0684d0 2026-03-09T00:13:13.669 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.669+0000 7fe0937fe700 1 -- 192.168.123.103:0/2277564767 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 26, "format": "json"}]=0 dumped fsmap epoch 26 v33) v1 ==== 107+0+4323 (secure 0 0 0) 0x7fe084064190 con 0x7fe09c0684d0 2026-03-09T00:13:13.669 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:13.670 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":26,"btime":"2026-03-09T00:11:20:925117+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34404,"name":"cephfs.vm03.sejksk","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6827/784666836","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":784666836},{"type":"v1","addr":"192.168.123.103:6827","nonce":784666836}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":17},{"gid":34412,"name":"cephfs.vm03.ralade","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1027317762","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1027317762},{"type":"v1","addr":"192.168.123.103:6829","nonce":1027317762}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor 
table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":23},{"gid":44297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1987865018","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1987865018},{"type":"v1","addr":"192.168.123.106:6825","nonce":1987865018}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":26,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:11:20.925114+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":109,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T00:13:13.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:13 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/2492390515' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-09T00:13:13.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:13 vm06.local ceph-mon[106218]: from='client.? 
192.168.123.103:0/2625676908' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-09T00:13:13.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.672+0000 7fe0a0b36700 1 -- 192.168.123.103:0/2277564767 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe0880779e0 msgr2=0x7fe088079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:13.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.672+0000 7fe0a0b36700 1 --2- 192.168.123.103:0/2277564767 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe0880779e0 0x7fe088079ea0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fe09c19a3b0 tx=0x7fe08c006cb0 comp rx=0 tx=0).stop 2026-03-09T00:13:13.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.672+0000 7fe0a0b36700 1 -- 192.168.123.103:0/2277564767 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe09c0684d0 msgr2=0x7fe09c198d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:13.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.672+0000 7fe0a0b36700 1 --2- 192.168.123.103:0/2277564767 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe09c0684d0 0x7fe09c198d90 secure :-1 s=READY pgs=196 cs=0 l=1 rev1=1 crypto rx=0x7fe084005b40 tx=0x7fe0840049e0 comp rx=0 tx=0).stop 2026-03-09T00:13:13.672 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.672+0000 7fe0a0b36700 1 -- 192.168.123.103:0/2277564767 shutdown_connections 2026-03-09T00:13:13.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.672+0000 7fe0a0b36700 1 --2- 192.168.123.103:0/2277564767 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fe0880779e0 0x7fe088079ea0 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:13.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.672+0000 7fe0a0b36700 1 --2- 192.168.123.103:0/2277564767 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fe09c0684d0 0x7fe09c198d90 unknown :-1 s=CLOSED pgs=196 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:13.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.672+0000 7fe0a0b36700 1 --2- 192.168.123.103:0/2277564767 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fe09c068df0 0x7fe09c1992d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:13.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.672+0000 7fe0a0b36700 1 -- 192.168.123.103:0/2277564767 >> 192.168.123.103:0/2277564767 conn(0x7fe09c075960 msgr2=0x7fe09c0fe6a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:13.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.673+0000 7fe0a0b36700 1 -- 192.168.123.103:0/2277564767 shutdown_connections 2026-03-09T00:13:13.673 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:13.673+0000 7fe0a0b36700 1 -- 192.168.123.103:0/2277564767 wait complete. 
2026-03-09T00:13:13.674 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 26 2026-03-09T00:13:13.718 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 27 2026-03-09T00:13:13.867 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:14.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.133+0000 7f480c24f700 1 -- 192.168.123.103:0/3099893704 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4804068df0 msgr2=0x7f480410d5b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:14.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.133+0000 7f480c24f700 1 --2- 192.168.123.103:0/3099893704 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4804068df0 0x7f480410d5b0 secure :-1 s=READY pgs=197 cs=0 l=1 rev1=1 crypto rx=0x7f4800009b80 tx=0x7f4800009e90 comp rx=0 tx=0).stop 2026-03-09T00:13:14.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.134+0000 7f480c24f700 1 -- 192.168.123.103:0/3099893704 shutdown_connections 2026-03-09T00:13:14.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.134+0000 7f480c24f700 1 --2- 192.168.123.103:0/3099893704 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f4804068df0 0x7f480410d5b0 unknown :-1 s=CLOSED pgs=197 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:14.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.134+0000 7f480c24f700 1 --2- 192.168.123.103:0/3099893704 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f48040684d0 0x7f48040688b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:14.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.134+0000 7f480c24f700 1 -- 192.168.123.103:0/3099893704 >> 192.168.123.103:0/3099893704 conn(0x7f4804075960 msgr2=0x7f4804075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:14.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.135+0000 7f480c24f700 1 -- 192.168.123.103:0/3099893704 shutdown_connections 2026-03-09T00:13:14.135 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.135+0000 7f480c24f700 1 -- 192.168.123.103:0/3099893704 wait complete. 
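The epoch-26 dump above is the point of the fail_fs step: its mdsmap shows `"up":{}` and `"failed":[0]`, so rank 0 has been failed and no daemon holds it yet. A hedged sketch of checking a dump for that condition, using plain dict access over the JSON layout shown above (illustrative only; the suite's own checks go through jq):

```python
def failed_ranks(dump: dict, fs_name: str = "cephfs") -> list:
    # Return the failed-rank list for one filesystem in a "fs dump".
    for fs in dump["filesystems"]:
        if fs["mdsmap"]["fs_name"] == fs_name:
            return fs["mdsmap"]["failed"]
    raise KeyError(f"no filesystem named {fs_name!r}")

# For the epoch-26 dump logged above this returns [0], and "up" is
# empty, so the filesystem stays offline until a standby replays.
```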
2026-03-09T00:13:14.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.136+0000 7f480c24f700 1 Processor -- start 2026-03-09T00:13:14.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.136+0000 7f480c24f700 1 -- start start 2026-03-09T00:13:14.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.136+0000 7f480c24f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48040684d0 0x7f480410abb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:14.136 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.136+0000 7f480c24f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4804068df0 0x7f480410b110 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:14.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.136+0000 7f480c24f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f480410b7f0 con 0x7f48040684d0 2026-03-09T00:13:14.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.136+0000 7f480c24f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4804105d40 con 0x7f4804068df0 2026-03-09T00:13:14.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.136+0000 7f4809feb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48040684d0 0x7f480410abb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:14.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.136+0000 7f4809feb700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48040684d0 0x7f480410abb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:56064/0 (socket says 192.168.123.103:56064) 2026-03-09T00:13:14.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.136+0000 7f4809feb700 1 -- 192.168.123.103:0/3537324612 learned_addr learned my addr 192.168.123.103:0/3537324612 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:14.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.136+0000 7f48097ea700 1 --2- 192.168.123.103:0/3537324612 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4804068df0 0x7f480410b110 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:14.137 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.136+0000 7f4809feb700 1 -- 192.168.123.103:0/3537324612 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4804068df0 msgr2=0x7f480410b110 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:14.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.136+0000 7f4809feb700 1 --2- 192.168.123.103:0/3537324612 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4804068df0 0x7f480410b110 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:14.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.137+0000 7f4809feb700 1 -- 192.168.123.103:0/3537324612 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f48000097e0 con 0x7f48040684d0 2026-03-09T00:13:14.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.137+0000 7f4809feb700 1 --2- 192.168.123.103:0/3537324612 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48040684d0 0x7f480410abb0 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7f47f800ed70 tx=0x7f47f800c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:14.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.137+0000 7f47f6ffd700 1 -- 192.168.123.103:0/3537324612 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f47f800cd70 con 0x7f48040684d0 2026-03-09T00:13:14.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.137+0000 7f47f6ffd700 1 -- 192.168.123.103:0/3537324612 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f47f800eec0 con 0x7f48040684d0 2026-03-09T00:13:14.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.137+0000 7f47f6ffd700 1 -- 192.168.123.103:0/3537324612 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f47f80188b0 con 0x7f48040684d0 2026-03-09T00:13:14.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.138+0000 7f480c24f700 1 -- 192.168.123.103:0/3537324612 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4804106020 con 0x7f48040684d0 2026-03-09T00:13:14.138 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.138+0000 7f480c24f700 1 -- 192.168.123.103:0/3537324612 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4804106540 con 0x7f48040684d0 2026-03-09T00:13:14.139 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.138+0000 7f480c24f700 1 -- 192.168.123.103:0/3537324612 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f480404ea90 con 0x7f48040684d0 2026-03-09T00:13:14.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.142+0000 7f47f6ffd700 1 -- 192.168.123.103:0/3537324612 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f47f8018a10 con 0x7f48040684d0 2026-03-09T00:13:14.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.142+0000 7f47f6ffd700 1 --2- 192.168.123.103:0/3537324612 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f47f00779e0 0x7f47f0079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:14.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.142+0000 7f47f6ffd700 1 -- 192.168.123.103:0/3537324612 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f47f8014070 con 0x7f48040684d0 2026-03-09T00:13:14.142 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.142+0000 7f47f6ffd700 1 -- 192.168.123.103:0/3537324612 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f47f809a900 con 0x7f48040684d0 2026-03-09T00:13:14.143 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.142+0000 7f48097ea700 1 --2- 192.168.123.103:0/3537324612 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f47f00779e0 0x7f47f0079ea0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:14.143 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.143+0000 7f48097ea700 1 --2- 192.168.123.103:0/3537324612 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f47f00779e0 0x7f47f0079ea0 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f4804107530 tx=0x7f4800005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:14.283 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.282+0000 7f480c24f700 1 -- 192.168.123.103:0/3537324612 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 27, "format": "json"} v 0) v1 -- 0x7f4804066e80 con 0x7f48040684d0 2026-03-09T00:13:14.284 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.284+0000 7f47f6ffd700 1 -- 192.168.123.103:0/3537324612 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 27, "format": "json"}]=0 dumped fsmap epoch 27 v33) v1 ==== 107+0+4402 (secure 0 0 0) 0x7f47f8062df0 con 0x7f48040684d0 2026-03-09T00:13:14.285 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:14.285 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":27,"btime":"2026-03-09T00:11:20:931090+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34412,"name":"cephfs.vm03.ralade","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1027317762","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1027317762},{"type":"v1","addr":"192.168.123.103:6829","nonce":1027317762}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":23},{"gid":44297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1987865018","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1987865018},{"type":"v1","addr":"192.168.123.106:6825","nonce":1987865018}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":27,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:11:20.931086+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":109,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34404},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34404":{"gid":34404,"name":"cephfs.vm03.sejksk","rank":0,"incarnation":27,"state":"up:replay","state_seq":1,"addr":"192.168.123.103:6827/784666836","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":784666836},{"type":"v1","addr":"192.168.123.103:6827","nonce":784666836}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T00:13:14.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.287+0000 7f480c24f700 1 -- 192.168.123.103:0/3537324612 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f47f00779e0 msgr2=0x7f47f0079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:14.287 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.287+0000 7f480c24f700 1 --2- 192.168.123.103:0/3537324612 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f47f00779e0 0x7f47f0079ea0 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f4804107530 tx=0x7f4800005fb0 comp rx=0 tx=0).stop 2026-03-09T00:13:14.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.287+0000 7f480c24f700 1 -- 192.168.123.103:0/3537324612 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48040684d0 msgr2=0x7f480410abb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:14.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.287+0000 7f480c24f700 1 --2- 192.168.123.103:0/3537324612 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48040684d0 0x7f480410abb0 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7f47f800ed70 tx=0x7f47f800c5b0 comp rx=0 tx=0).stop 2026-03-09T00:13:14.288 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.287+0000 7f480c24f700 1 -- 192.168.123.103:0/3537324612 shutdown_connections 2026-03-09T00:13:14.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.287+0000 7f480c24f700 1 --2- 192.168.123.103:0/3537324612 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f47f00779e0 0x7f47f0079ea0 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:14.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.287+0000 7f480c24f700 1 --2- 192.168.123.103:0/3537324612 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f48040684d0 0x7f480410abb0 unknown :-1 s=CLOSED pgs=198 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:14.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.287+0000 7f480c24f700 1 --2- 192.168.123.103:0/3537324612 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f4804068df0 0x7f480410b110 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:14.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.287+0000 7f480c24f700 1 -- 192.168.123.103:0/3537324612 >> 192.168.123.103:0/3537324612 conn(0x7f4804075960 msgr2=0x7f48040fe9c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:14.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.287+0000 7f480c24f700 1 -- 192.168.123.103:0/3537324612 shutdown_connections 2026-03-09T00:13:14.288 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.288+0000 7f480c24f700 1 -- 192.168.123.103:0/3537324612 wait complete. 2026-03-09T00:13:14.289 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 27 2026-03-09T00:13:14.350 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 28 2026-03-09T00:13:14.494 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:14.539 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:14 vm03.local ceph-mon[129670]: pgmap v293: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:13:14.539 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:14 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2277564767' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-09T00:13:14.539 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:14 vm03.local ceph-mon[129670]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:13:14.539 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:14 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/3537324612' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-09T00:13:14.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:14 vm06.local ceph-mon[106218]: pgmap v293: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:13:14.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:14 vm06.local ceph-mon[106218]: from='client.? 
192.168.123.103:0/2277564767' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-09T00:13:14.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:14 vm06.local ceph-mon[106218]: from='mgr.34104 192.168.123.103:0/3528421146' entity='mgr.vm03.yvcons' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-09T00:13:14.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:14 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/3537324612' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-09T00:13:14.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.721+0000 7ff34657f700 1 -- 192.168.123.103:0/4017322167 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3401001a0 msgr2=0x7ff340100580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:14.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.721+0000 7ff34657f700 1 --2- 192.168.123.103:0/4017322167 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3401001a0 0x7ff340100580 secure :-1 s=READY pgs=199 cs=0 l=1 rev1=1 crypto rx=0x7ff328009b00 tx=0x7ff328009e10 comp rx=0 tx=0).stop 2026-03-09T00:13:14.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.722+0000 7ff34657f700 1 -- 192.168.123.103:0/4017322167 shutdown_connections 2026-03-09T00:13:14.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.722+0000 7ff34657f700 1 --2- 192.168.123.103:0/4017322167 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff340100b50 0x7ff340104a40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:14.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.722+0000 7ff34657f700 1 --2- 192.168.123.103:0/4017322167 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3401001a0 0x7ff340100580 unknown :-1 s=CLOSED pgs=199 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:14.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.722+0000 7ff34657f700 1 -- 192.168.123.103:0/4017322167 >> 192.168.123.103:0/4017322167 conn(0x7ff340075960 msgr2=0x7ff340075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:14.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.722+0000 7ff34657f700 1 -- 192.168.123.103:0/4017322167 shutdown_connections 2026-03-09T00:13:14.722 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.722+0000 7ff34657f700 1 -- 192.168.123.103:0/4017322167 wait complete. 
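By epoch 27, standby `cephfs.vm03.sejksk` (gid 34404) holds rank 0 in `up:replay`, and the task keeps dumping (epoch 28 is requested above), presumably until the rank is active again. A small illustrative helper for reading that state out of a dump, following the `up`/`info` layout visible in the epoch-27 JSON:

```python
def rank0_state(dump: dict):
    # State string of the daemon holding rank 0, or None if unheld.
    mdsmap = dump["filesystems"][0]["mdsmap"]
    gid = mdsmap["up"].get("mds_0")
    if gid is None:
        return None
    return mdsmap["info"][f"gid_{gid}"]["state"]

# Against the dumps above: epoch 26 gives None (rank 0 failed),
# epoch 27 gives "up:replay".
```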
2026-03-09T00:13:14.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.723+0000 7ff34657f700 1 Processor -- start 2026-03-09T00:13:14.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.723+0000 7ff34657f700 1 -- start start 2026-03-09T00:13:14.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.723+0000 7ff34657f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3401001a0 0x7ff34019a840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:14.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.723+0000 7ff34657f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff340100b50 0x7ff34019ad80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:14.723 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.723+0000 7ff34657f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff34019b410 con 0x7ff3401001a0 2026-03-09T00:13:14.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.723+0000 7ff34657f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff3401948c0 con 0x7ff340100b50 2026-03-09T00:13:14.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.723+0000 7ff33ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3401001a0 0x7ff34019a840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:14.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.723+0000 7ff33ffff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3401001a0 0x7ff34019a840 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:44360/0 (socket says 192.168.123.103:44360) 2026-03-09T00:13:14.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.723+0000 7ff33ffff700 1 -- 192.168.123.103:0/2818446668 learned_addr learned my addr 192.168.123.103:0/2818446668 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:14.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.723+0000 7ff33f7fe700 1 --2- 192.168.123.103:0/2818446668 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff340100b50 0x7ff34019ad80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:14.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.724+0000 7ff33ffff700 1 -- 192.168.123.103:0/2818446668 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff340100b50 msgr2=0x7ff34019ad80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:14.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.724+0000 7ff33ffff700 1 --2- 192.168.123.103:0/2818446668 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff340100b50 0x7ff34019ad80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:14.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.724+0000 7ff33ffff700 1 -- 192.168.123.103:0/2818446668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7ff3280097e0 con 0x7ff3401001a0 2026-03-09T00:13:14.724 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.724+0000 7ff33ffff700 1 --2- 192.168.123.103:0/2818446668 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3401001a0 0x7ff34019a840 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7ff328005950 tx=0x7ff32800dd20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:14.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.724+0000 7ff33d7fa700 1 -- 192.168.123.103:0/2818446668 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff328005070 con 0x7ff3401001a0 2026-03-09T00:13:14.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.724+0000 7ff33d7fa700 1 -- 192.168.123.103:0/2818446668 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff3280051d0 con 0x7ff3401001a0 2026-03-09T00:13:14.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.724+0000 7ff33d7fa700 1 -- 192.168.123.103:0/2818446668 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff32801ea90 con 0x7ff3401001a0 2026-03-09T00:13:14.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.724+0000 7ff34657f700 1 -- 192.168.123.103:0/2818446668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff340194b40 con 0x7ff3401001a0 2026-03-09T00:13:14.725 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.724+0000 7ff34657f700 1 -- 192.168.123.103:0/2818446668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff340194fb0 con 0x7ff3401001a0 2026-03-09T00:13:14.726 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.725+0000 7ff34657f700 1 -- 192.168.123.103:0/2818446668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff3401083e0 con 0x7ff3401001a0 2026-03-09T00:13:14.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.729+0000 7ff33d7fa700 1 -- 192.168.123.103:0/2818446668 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff32801ebf0 con 0x7ff3401001a0 2026-03-09T00:13:14.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.729+0000 7ff33d7fa700 1 --2- 192.168.123.103:0/2818446668 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff32c077990 0x7ff32c079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:14.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.729+0000 7ff33d7fa700 1 -- 192.168.123.103:0/2818446668 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7ff32809b8a0 con 0x7ff3401001a0 2026-03-09T00:13:14.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.729+0000 7ff33d7fa700 1 -- 192.168.123.103:0/2818446668 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff3280cba90 con 0x7ff3401001a0 2026-03-09T00:13:14.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.730+0000 7ff33f7fe700 1 --2- 192.168.123.103:0/2818446668 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff32c077990 0x7ff32c079e50 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:14.730 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.730+0000 7ff33f7fe700 1 --2- 192.168.123.103:0/2818446668 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff32c077990 0x7ff32c079e50 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7ff340106990 tx=0x7ff330009380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:14.869 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.869+0000 7ff34657f700 1 -- 192.168.123.103:0/2818446668 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 28, "format": "json"} v 0) v1 -- 0x7ff34004ea90 con 0x7ff3401001a0 2026-03-09T00:13:14.870 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.869+0000 7ff33d7fa700 1 -- 192.168.123.103:0/2818446668 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 28, "format": "json"}]=0 dumped fsmap epoch 28 v33) v1 ==== 107+0+4406 (secure 0 0 0) 0x7ff328064050 con 0x7ff3401001a0 2026-03-09T00:13:14.870 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:14.870 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":28,"btime":"2026-03-09T00:11:25:310293+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34412,"name":"cephfs.vm03.ralade","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1027317762","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1027317762},{"type":"v1","addr":"192.168.123.103:6829","nonce":1027317762}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":23},{"gid":44297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1987865018","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1987865018},{"type":"v1","addr":"192.168.123.106:6825","nonce":1987865018}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":28,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:11:25.260157+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":109,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34404},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34404":{"gid":34404,"name":"cephfs.vm03.sejksk","rank":0,"incarnation":27,"state":"up:reconnect","state_seq":10,"addr":"192.168.123.103:6827/784666836","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":784666836},{"type":"v1","addr":"192.168.123.103:6827","nonce":784666836}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T00:13:14.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.872+0000 7ff34657f700 1 -- 192.168.123.103:0/2818446668 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff32c077990 msgr2=0x7ff32c079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:14.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.872+0000 7ff34657f700 1 --2- 192.168.123.103:0/2818446668 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff32c077990 0x7ff32c079e50 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7ff340106990 tx=0x7ff330009380 comp rx=0 tx=0).stop 2026-03-09T00:13:14.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.872+0000 7ff34657f700 1 -- 192.168.123.103:0/2818446668 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3401001a0 msgr2=0x7ff34019a840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:14.872 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.872+0000 7ff34657f700 1 --2- 192.168.123.103:0/2818446668 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3401001a0 0x7ff34019a840 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7ff328005950 tx=0x7ff32800dd20 comp rx=0 tx=0).stop 2026-03-09T00:13:14.873 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.872+0000 7ff34657f700 1 -- 192.168.123.103:0/2818446668 shutdown_connections 2026-03-09T00:13:14.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.872+0000 7ff34657f700 1 --2- 192.168.123.103:0/2818446668 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7ff32c077990 0x7ff32c079e50 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:14.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.872+0000 7ff34657f700 1 --2- 192.168.123.103:0/2818446668 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7ff3401001a0 0x7ff34019a840 unknown :-1 s=CLOSED pgs=200 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:14.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.872+0000 7ff34657f700 1 --2- 192.168.123.103:0/2818446668 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7ff340100b50 0x7ff34019ad80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:14.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.872+0000 7ff34657f700 1 -- 192.168.123.103:0/2818446668 >> 192.168.123.103:0/2818446668 conn(0x7ff340075960 msgr2=0x7ff3400fea10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:14.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.872+0000 7ff34657f700 1 -- 192.168.123.103:0/2818446668 shutdown_connections 2026-03-09T00:13:14.873 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:14.872+0000 7ff34657f700 1 -- 192.168.123.103:0/2818446668 wait complete. 2026-03-09T00:13:14.873 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 28 2026-03-09T00:13:14.916 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 29 2026-03-09T00:13:15.054 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:15.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.314+0000 7fda3467b700 1 -- 192.168.123.103:0/4057966286 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda2c073ae0 msgr2=0x7fda2c10d1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:15.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.314+0000 7fda3467b700 1 --2- 192.168.123.103:0/4057966286 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda2c073ae0 0x7fda2c10d1c0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7fda28009b30 tx=0x7fda28009e40 comp rx=0 tx=0).stop 2026-03-09T00:13:15.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.315+0000 7fda3467b700 1 -- 192.168.123.103:0/4057966286 shutdown_connections 2026-03-09T00:13:15.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.315+0000 7fda3467b700 1 --2- 192.168.123.103:0/4057966286 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda2c073ae0 0x7fda2c10d1c0 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:15.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.315+0000 7fda3467b700 1 --2- 192.168.123.103:0/4057966286 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda2c0731c0 0x7fda2c0735a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-09T00:13:15.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.315+0000 7fda3467b700 1 -- 192.168.123.103:0/4057966286 >> 192.168.123.103:0/4057966286 conn(0x7fda2c0fc9b0 msgr2=0x7fda2c0fedd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:15.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.315+0000 7fda3467b700 1 -- 192.168.123.103:0/4057966286 shutdown_connections 2026-03-09T00:13:15.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.315+0000 7fda3467b700 1 -- 192.168.123.103:0/4057966286 wait complete. 2026-03-09T00:13:15.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.315+0000 7fda3467b700 1 Processor -- start 2026-03-09T00:13:15.315 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.315+0000 7fda3467b700 1 -- start start 2026-03-09T00:13:15.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.316+0000 7fda3467b700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda2c0731c0 0x7fda2c198dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:15.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.316+0000 7fda3467b700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda2c073ae0 0x7fda2c199310 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:15.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.316+0000 7fda31c16700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda2c073ae0 0x7fda2c199310 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:15.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.316+0000 7fda31c16700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda2c073ae0 0x7fda2c199310 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:44376/0 (socket says 192.168.123.103:44376) 2026-03-09T00:13:15.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.316+0000 7fda31c16700 1 -- 192.168.123.103:0/3699677077 learned_addr learned my addr 192.168.123.103:0/3699677077 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:15.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.316+0000 7fda3467b700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fda2c1999f0 con 0x7fda2c073ae0 2026-03-09T00:13:15.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.316+0000 7fda3467b700 1 -- 192.168.123.103:0/3699677077 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fda2c19d780 con 0x7fda2c0731c0 2026-03-09T00:13:15.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.316+0000 7fda31c16700 1 -- 192.168.123.103:0/3699677077 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda2c0731c0 msgr2=0x7fda2c198dd0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:13:15.316 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.316+0000 7fda31c16700 1 --2- 192.168.123.103:0/3699677077 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda2c0731c0 0x7fda2c198dd0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-09T00:13:15.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.316+0000 7fda31c16700 1 -- 192.168.123.103:0/3699677077 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fda1c009710 con 0x7fda2c073ae0 2026-03-09T00:13:15.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.317+0000 7fda31c16700 1 --2- 192.168.123.103:0/3699677077 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda2c073ae0 0x7fda2c199310 secure :-1 s=READY pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7fda2800bee0 tx=0x7fda2800bfc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:15.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.317+0000 7fda237fe700 1 -- 192.168.123.103:0/3699677077 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fda2801d070 con 0x7fda2c073ae0 2026-03-09T00:13:15.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.317+0000 7fda237fe700 1 -- 192.168.123.103:0/3699677077 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fda2800f460 con 0x7fda2c073ae0 2026-03-09T00:13:15.317 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.317+0000 7fda3467b700 1 -- 192.168.123.103:0/3699677077 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fda280097e0 con 0x7fda2c073ae0 2026-03-09T00:13:15.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.317+0000 7fda237fe700 1 -- 192.168.123.103:0/3699677077 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fda28021620 con 0x7fda2c073ae0 2026-03-09T00:13:15.318 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.318+0000 7fda3467b700 1 -- 192.168.123.103:0/3699677077 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fda2c19dd60 con 0x7fda2c073ae0 2026-03-09T00:13:15.321 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.319+0000 7fda237fe700 1 -- 192.168.123.103:0/3699677077 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fda2800f5d0 con 0x7fda2c073ae0 2026-03-09T00:13:15.321 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.319+0000 7fda3467b700 1 -- 192.168.123.103:0/3699677077 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fda2c10a8c0 con 0x7fda2c073ae0 2026-03-09T00:13:15.321 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.319+0000 7fda237fe700 1 --2- 192.168.123.103:0/3699677077 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fda18077870 0x7fda18079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:15.321 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.319+0000 7fda237fe700 1 -- 192.168.123.103:0/3699677077 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fda2809ad30 con 0x7fda2c073ae0 2026-03-09T00:13:15.322 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.321+0000 7fda237fe700 1 -- 192.168.123.103:0/3699677077 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fda28063460 con 0x7fda2c073ae0 2026-03-09T00:13:15.322 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.322+0000 7fda32417700 1 --2- 192.168.123.103:0/3699677077 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fda18077870 0x7fda18079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:15.323 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.322+0000 7fda32417700 1 --2- 192.168.123.103:0/3699677077 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fda18077870 0x7fda18079d30 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7fda1c009ee0 tx=0x7fda1c009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:15.406 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:15 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2818446668' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-09T00:13:15.461 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.461+0000 7fda3467b700 1 -- 192.168.123.103:0/3699677077 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 29, "format": "json"} v 0) v1 -- 0x7fda2c19a130 con 0x7fda2c073ae0 2026-03-09T00:13:15.462 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.461+0000 7fda237fe700 1 -- 192.168.123.103:0/3699677077 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 29, "format": "json"}]=0 dumped fsmap epoch 29 v33) v1 ==== 107+0+4403 (secure 0 0 0) 0x7fda28062bb0 con 0x7fda2c073ae0 2026-03-09T00:13:15.462 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:15.462 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":29,"btime":"2026-03-09T00:11:26:313782+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34412,"name":"cephfs.vm03.ralade","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1027317762","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1027317762},{"type":"v1","addr":"192.168.123.103:6829","nonce":1027317762}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":23},{"gid":44297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1987865018","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1987865018},{"type":"v1","addr":"192.168.123.106:6825","nonce":1987865018}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":29,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:11:25.315602+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":109,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34404},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34404":{"gid":34404,"name":"cephfs.vm03.sejksk","rank":0,"incarnation":27,"state":"up:rejoin","state_seq":11,"addr":"192.168.123.103:6827/784666836","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":784666836},{"type":"v1","addr":"192.168.123.103:6827","nonce":784666836}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-09T00:13:15.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.464+0000 7fda3467b700 1 -- 192.168.123.103:0/3699677077 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fda18077870 msgr2=0x7fda18079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:15.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.464+0000 7fda3467b700 1 --2- 192.168.123.103:0/3699677077 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fda18077870 0x7fda18079d30 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7fda1c009ee0 tx=0x7fda1c009450 comp rx=0 tx=0).stop 2026-03-09T00:13:15.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.464+0000 7fda3467b700 1 -- 192.168.123.103:0/3699677077 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda2c073ae0 msgr2=0x7fda2c199310 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:15.464 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.464+0000 7fda3467b700 1 --2- 192.168.123.103:0/3699677077 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda2c073ae0 0x7fda2c199310 secure :-1 s=READY pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7fda2800bee0 tx=0x7fda2800bfc0 comp rx=0 tx=0).stop 2026-03-09T00:13:15.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.464+0000 7fda3467b700 1 -- 192.168.123.103:0/3699677077 shutdown_connections 2026-03-09T00:13:15.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.464+0000 7fda3467b700 1 --2- 192.168.123.103:0/3699677077 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fda18077870 0x7fda18079d30 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:15.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.464+0000 7fda3467b700 1 --2- 192.168.123.103:0/3699677077 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fda2c0731c0 0x7fda2c198dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:15.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.464+0000 7fda3467b700 1 --2- 192.168.123.103:0/3699677077 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fda2c073ae0 0x7fda2c199310 unknown :-1 s=CLOSED pgs=201 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:15.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.464+0000 7fda3467b700 1 -- 192.168.123.103:0/3699677077 >> 192.168.123.103:0/3699677077 conn(0x7fda2c0fc9b0 msgr2=0x7fda2c107a00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:15.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.465+0000 7fda3467b700 1 -- 192.168.123.103:0/3699677077 shutdown_connections 2026-03-09T00:13:15.465 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.465+0000 7fda3467b700 1 -- 192.168.123.103:0/3699677077 wait complete. 2026-03-09T00:13:15.466 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 29 2026-03-09T00:13:15.523 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 30 2026-03-09T00:13:15.669 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:15.670 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:15 vm06.local ceph-mon[106218]: from='client.? 
192.168.123.103:0/2818446668' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-09T00:13:15.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.914+0000 7fce02c24700 1 -- 192.168.123.103:0/1549827132 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc1033c0 msgr2=0x7fcdfc1037a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:15.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.914+0000 7fce02c24700 1 --2- 192.168.123.103:0/1549827132 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc1033c0 0x7fcdfc1037a0 secure :-1 s=READY pgs=202 cs=0 l=1 rev1=1 crypto rx=0x7fcdec009b00 tx=0x7fcdec009e10 comp rx=0 tx=0).stop 2026-03-09T00:13:15.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.915+0000 7fce02c24700 1 -- 192.168.123.103:0/1549827132 shutdown_connections 2026-03-09T00:13:15.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.915+0000 7fce02c24700 1 --2- 192.168.123.103:0/1549827132 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcdfc103d70 0x7fcdfc107dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:15.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.915+0000 7fce02c24700 1 --2- 192.168.123.103:0/1549827132 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc1033c0 0x7fcdfc1037a0 unknown :-1 s=CLOSED pgs=202 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:15.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.915+0000 7fce02c24700 1 -- 192.168.123.103:0/1549827132 >> 192.168.123.103:0/1549827132 conn(0x7fcdfc0fec30 msgr2=0x7fcdfc101050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:15.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.916+0000 7fce02c24700 1 -- 192.168.123.103:0/1549827132 shutdown_connections 2026-03-09T00:13:15.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.916+0000 7fce02c24700 1 -- 192.168.123.103:0/1549827132 wait complete. 
2026-03-09T00:13:15.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.916+0000 7fce02c24700 1 Processor -- start 2026-03-09T00:13:15.916 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.916+0000 7fce02c24700 1 -- start start 2026-03-09T00:13:15.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.916+0000 7fce02c24700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcdfc1033c0 0x7fcdfc198e80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:15.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.916+0000 7fce02c24700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc103d70 0x7fcdfc1993c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:15.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.916+0000 7fce02c24700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcdfc199a10 con 0x7fcdfc103d70 2026-03-09T00:13:15.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.916+0000 7fce02c24700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcdfc199b50 con 0x7fcdfc1033c0 2026-03-09T00:13:15.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.917+0000 7fcdfbfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc103d70 0x7fcdfc1993c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:15.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.917+0000 7fcdfbfff700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc103d70 0x7fcdfc1993c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:44398/0 (socket says 192.168.123.103:44398) 2026-03-09T00:13:15.917 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.917+0000 7fcdfbfff700 1 -- 192.168.123.103:0/4102238471 learned_addr learned my addr 192.168.123.103:0/4102238471 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:15.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.917+0000 7fce009c0700 1 --2- 192.168.123.103:0/4102238471 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcdfc1033c0 0x7fcdfc198e80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:15.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.917+0000 7fcdfbfff700 1 -- 192.168.123.103:0/4102238471 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcdfc1033c0 msgr2=0x7fcdfc198e80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:15.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.917+0000 7fcdfbfff700 1 --2- 192.168.123.103:0/4102238471 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcdfc1033c0 0x7fcdfc198e80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:15.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.917+0000 7fcdfbfff700 1 -- 192.168.123.103:0/4102238471 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7fcdec0097e0 con 0x7fcdfc103d70 2026-03-09T00:13:15.918 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.918+0000 7fcdfbfff700 1 --2- 192.168.123.103:0/4102238471 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc103d70 0x7fcdfc1993c0 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7fcdf000cc60 tx=0x7fcdf000cf70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:15.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.918+0000 7fcdf9ffb700 1 -- 192.168.123.103:0/4102238471 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcdf00049e0 con 0x7fcdfc103d70 2026-03-09T00:13:15.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.918+0000 7fcdf9ffb700 1 -- 192.168.123.103:0/4102238471 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fcdf0007cf0 con 0x7fcdfc103d70 2026-03-09T00:13:15.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.918+0000 7fcdf9ffb700 1 -- 192.168.123.103:0/4102238471 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcdf000f450 con 0x7fcdfc103d70 2026-03-09T00:13:15.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.918+0000 7fce02c24700 1 -- 192.168.123.103:0/4102238471 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcdfc19d9a0 con 0x7fcdfc103d70 2026-03-09T00:13:15.919 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.918+0000 7fce02c24700 1 -- 192.168.123.103:0/4102238471 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcdfc19def0 con 0x7fcdfc103d70 2026-03-09T00:13:15.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.920+0000 7fcdf9ffb700 1 -- 192.168.123.103:0/4102238471 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcdf000f5b0 con 0x7fcdfc103d70 2026-03-09T00:13:15.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.920+0000 7fce02c24700 1 -- 192.168.123.103:0/4102238471 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcdfc04ea90 con 0x7fcdfc103d70 2026-03-09T00:13:15.923 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.923+0000 7fcdf9ffb700 1 --2- 192.168.123.103:0/4102238471 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fcde40778c0 0x7fcde4079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:15.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.923+0000 7fcdf9ffb700 1 -- 192.168.123.103:0/4102238471 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fcdf0099940 con 0x7fcdfc103d70 2026-03-09T00:13:15.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.923+0000 7fce009c0700 1 --2- 192.168.123.103:0/4102238471 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fcde40778c0 0x7fcde4079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:15.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.924+0000 7fce009c0700 1 --2- 192.168.123.103:0/4102238471 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fcde40778c0 0x7fcde4079d80 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7fcdec006010 tx=0x7fcdec00b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:15.924 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:15.924+0000 7fcdf9ffb700 1 -- 192.168.123.103:0/4102238471 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fcdf009d2a0 con 0x7fcdfc103d70 2026-03-09T00:13:16.063 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.062+0000 7fce02c24700 1 -- 192.168.123.103:0/4102238471 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 30, "format": "json"} v 0) v1 -- 0x7fcdfc066e80 con 0x7fcdfc103d70 2026-03-09T00:13:16.066 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.065+0000 7fcdf9ffb700 1 -- 192.168.123.103:0/4102238471 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 30, "format": "json"}]=0 dumped fsmap epoch 30 v33) v1 ==== 107+0+4412 (secure 0 0 0) 0x7fcdf00620f0 con 0x7fcdfc103d70 2026-03-09T00:13:16.066 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:16.066 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":30,"btime":"2026-03-09T00:11:27:321158+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34412,"name":"cephfs.vm03.ralade","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1027317762","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1027317762},{"type":"v1","addr":"192.168.123.103:6829","nonce":1027317762}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":23},{"gid":44297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1987865018","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1987865018},{"type":"v1","addr":"192.168.123.106:6825","nonce":1987865018}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":30,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:11:27.321157+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":109,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34404},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34404":{"gid":34404,"name":"cephfs.vm03.sejksk","rank":0,"incarnation":27,"state":"up:active","state_seq":12,"addr":"192.168.123.103:6827/784666836","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":784666836},{"type":"v1","addr":"192.168.123.103:6827","nonce":784666836}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34404,"qdb_cluster":[34404]},"id":1}]} 2026-03-09T00:13:16.068 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.068+0000 7fce02c24700 1 -- 192.168.123.103:0/4102238471 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fcde40778c0 msgr2=0x7fcde4079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:16.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.068+0000 7fce02c24700 1 --2- 192.168.123.103:0/4102238471 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fcde40778c0 0x7fcde4079d80 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7fcdec006010 tx=0x7fcdec00b540 comp rx=0 tx=0).stop 2026-03-09T00:13:16.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.068+0000 7fce02c24700 1 -- 192.168.123.103:0/4102238471 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc103d70 msgr2=0x7fcdfc1993c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:16.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.068+0000 7fce02c24700 1 --2- 192.168.123.103:0/4102238471 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc103d70 0x7fcdfc1993c0 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7fcdf000cc60 
tx=0x7fcdf000cf70 comp rx=0 tx=0).stop 2026-03-09T00:13:16.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.068+0000 7fce02c24700 1 -- 192.168.123.103:0/4102238471 shutdown_connections 2026-03-09T00:13:16.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.068+0000 7fce02c24700 1 --2- 192.168.123.103:0/4102238471 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fcde40778c0 0x7fcde4079d80 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:16.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.068+0000 7fce02c24700 1 --2- 192.168.123.103:0/4102238471 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fcdfc1033c0 0x7fcdfc198e80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:16.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.068+0000 7fce02c24700 1 --2- 192.168.123.103:0/4102238471 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fcdfc103d70 0x7fcdfc1993c0 unknown :-1 s=CLOSED pgs=203 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:16.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.068+0000 7fce02c24700 1 -- 192.168.123.103:0/4102238471 >> 192.168.123.103:0/4102238471 conn(0x7fcdfc0fec30 msgr2=0x7fcdfc100200 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:16.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.069+0000 7fce02c24700 1 -- 192.168.123.103:0/4102238471 shutdown_connections 2026-03-09T00:13:16.069 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.069+0000 7fce02c24700 1 -- 192.168.123.103:0/4102238471 wait complete. 2026-03-09T00:13:16.070 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 30 2026-03-09T00:13:16.110 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 31 2026-03-09T00:13:16.261 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:16.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.506+0000 7f709200f700 1 -- 192.168.123.103:0/773908241 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f708c0690e0 msgr2=0x7f708c105b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:16.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.506+0000 7f709200f700 1 --2- 192.168.123.103:0/773908241 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f708c0690e0 0x7f708c105b50 secure :-1 s=READY pgs=204 cs=0 l=1 rev1=1 crypto rx=0x7f707c009b00 tx=0x7f707c009e10 comp rx=0 tx=0).stop 2026-03-09T00:13:16.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.506+0000 7f709200f700 1 -- 192.168.123.103:0/773908241 shutdown_connections 2026-03-09T00:13:16.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.506+0000 7f709200f700 1 --2- 192.168.123.103:0/773908241 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f708c0690e0 0x7f708c105b50 unknown :-1 s=CLOSED pgs=204 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:16.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.506+0000 7f709200f700 1 --2- 192.168.123.103:0/773908241 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f708c068730 0x7f708c068b10 
unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:16.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.507+0000 7f709200f700 1 -- 192.168.123.103:0/773908241 >> 192.168.123.103:0/773908241 conn(0x7f708c075960 msgr2=0x7f708c075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:16.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.507+0000 7f709200f700 1 -- 192.168.123.103:0/773908241 shutdown_connections 2026-03-09T00:13:16.507 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.507+0000 7f709200f700 1 -- 192.168.123.103:0/773908241 wait complete. 2026-03-09T00:13:16.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.507+0000 7f709200f700 1 Processor -- start 2026-03-09T00:13:16.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.508+0000 7f709200f700 1 -- start start 2026-03-09T00:13:16.508 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.508+0000 7f709200f700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f708c068730 0x7f708c198e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:16.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.508+0000 7f709200f700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f708c0690e0 0x7f708c199380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:16.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.508+0000 7f709200f700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f708c199a60 con 0x7f708c0690e0 2026-03-09T00:13:16.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.508+0000 7f709200f700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f708c19d7f0 con 0x7f708c068730 2026-03-09T00:13:16.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.508+0000 7f708affd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f708c0690e0 0x7f708c199380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:16.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.508+0000 7f708affd700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f708c0690e0 0x7f708c199380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.103:3300/0 says I am v2:192.168.123.103:44416/0 (socket says 192.168.123.103:44416) 2026-03-09T00:13:16.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.508+0000 7f708affd700 1 -- 192.168.123.103:0/599117410 learned_addr learned my addr 192.168.123.103:0/599117410 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:16.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.508+0000 7f708affd700 1 -- 192.168.123.103:0/599117410 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f708c068730 msgr2=0x7f708c198e40 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-09T00:13:16.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.508+0000 7f708affd700 1 --2- 192.168.123.103:0/599117410 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f708c068730 0x7f708c198e40 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-09T00:13:16.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.508+0000 7f708affd700 1 -- 192.168.123.103:0/599117410 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f707c0097e0 con 0x7f708c0690e0 2026-03-09T00:13:16.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.508+0000 7f708affd700 1 --2- 192.168.123.103:0/599117410 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f708c0690e0 0x7f708c199380 secure :-1 s=READY pgs=205 cs=0 l=1 rev1=1 crypto rx=0x7f707c005850 tx=0x7f707c004970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:16.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.509+0000 7f7088ff9700 1 -- 192.168.123.103:0/599117410 <== mon.0 v2:192.168.123.103:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f707c01d070 con 0x7f708c0690e0 2026-03-09T00:13:16.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.509+0000 7f7088ff9700 1 -- 192.168.123.103:0/599117410 <== mon.0 v2:192.168.123.103:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f707c00bc50 con 0x7f708c0690e0 2026-03-09T00:13:16.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.509+0000 7f7088ff9700 1 -- 192.168.123.103:0/599117410 <== mon.0 v2:192.168.123.103:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f707c00f700 con 0x7f708c0690e0 2026-03-09T00:13:16.509 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.509+0000 7f709200f700 1 -- 192.168.123.103:0/599117410 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f708c19da70 con 0x7f708c0690e0 2026-03-09T00:13:16.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.510+0000 7f709200f700 1 -- 192.168.123.103:0/599117410 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f708c19df60 con 0x7f708c0690e0 2026-03-09T00:13:16.511 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.510+0000 7f7088ff9700 1 -- 192.168.123.103:0/599117410 <== mon.0 v2:192.168.123.103:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f707c022470 con 0x7f708c0690e0 2026-03-09T00:13:16.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.511+0000 7f709200f700 1 -- 192.168.123.103:0/599117410 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f708c04ea90 con 0x7f708c0690e0 2026-03-09T00:13:16.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.511+0000 7f7088ff9700 1 --2- 192.168.123.103:0/599117410 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f7078077870 0x7f7078079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:16.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.511+0000 7f7088ff9700 1 -- 192.168.123.103:0/599117410 <== mon.0 v2:192.168.123.103:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f707c067e20 con 0x7f708c0690e0 2026-03-09T00:13:16.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.514+0000 7f708b7fe700 1 --2- 192.168.123.103:0/599117410 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f7078077870 0x7f7078079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:16.514 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.514+0000 7f708b7fe700 1 --2- 192.168.123.103:0/599117410 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f7078077870 0x7f7078079d30 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7f7074009730 tx=0x7f7074006cb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:16.515 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.515+0000 7f7088ff9700 1 -- 192.168.123.103:0/599117410 <== mon.0 v2:192.168.123.103:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f707c0a0050 con 0x7f708c0690e0 2026-03-09T00:13:16.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:16 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/3699677077' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch 2026-03-09T00:13:16.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:16 vm03.local ceph-mon[129670]: pgmap v294: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:13:16.590 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:16 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/4102238471' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 30, "format": "json"}]: dispatch 2026-03-09T00:13:16.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.651+0000 7f709200f700 1 -- 192.168.123.103:0/599117410 --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 31, "format": "json"} v 0) v1 -- 0x7f708c066e80 con 0x7f708c0690e0 2026-03-09T00:13:16.652 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.652+0000 7f7088ff9700 1 -- 192.168.123.103:0/599117410 <== mon.0 v2:192.168.123.103:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 31, "format": "json"}]=0 dumped fsmap epoch 31 v33) v1 ==== 107+0+5260 (secure 0 0 0) 0x7f707c027020 con 0x7f708c0690e0 2026-03-09T00:13:16.653 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:16.653 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":31,"btime":"2026-03-09T00:11:31:671418+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34412,"name":"cephfs.vm03.ralade","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1027317762","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1027317762},{"type":"v1","addr":"192.168.123.103:6829","nonce":1027317762}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout 
v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":23},{"gid":44297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1987865018","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1987865018},{"type":"v1","addr":"192.168.123.106:6825","nonce":1987865018}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44325,"name":"cephfs.vm06.ixduim","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/839008662","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":839008662},{"type":"v1","addr":"192.168.123.106:6827","nonce":839008662}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":31}],"filesystems":[{"mdsmap":{"epoch":30,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:11:27.321157+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":109,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34404},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34404":{"gid":34404,"name":"cephfs.vm03.sejksk","rank":0,"incarnation":27,"state":"up:active","state_seq":12,"addr":"192.168.123.103:6827/784666836","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":784666836},{"type":"v1","addr":"192.168.123.103:6827","nonce":784666836}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34404,"qdb_cluster":[34404]},"id":1}]} 2026-03-09T00:13:16.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.655+0000 7f709200f700 1 -- 192.168.123.103:0/599117410 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f7078077870 msgr2=0x7f7078079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:16.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.655+0000 7f709200f700 1 --2- 192.168.123.103:0/599117410 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f7078077870 0x7f7078079d30 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7f7074009730 tx=0x7f7074006cb0 comp rx=0 tx=0).stop 2026-03-09T00:13:16.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.655+0000 7f709200f700 1 -- 192.168.123.103:0/599117410 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f708c0690e0 msgr2=0x7f708c199380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:16.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.655+0000 7f709200f700 1 --2- 192.168.123.103:0/599117410 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f708c0690e0 0x7f708c199380 secure :-1 s=READY pgs=205 cs=0 l=1 rev1=1 crypto rx=0x7f707c005850 tx=0x7f707c004970 comp rx=0 tx=0).stop 2026-03-09T00:13:16.655 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.655+0000 7f709200f700 1 -- 192.168.123.103:0/599117410 shutdown_connections 2026-03-09T00:13:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.655+0000 7f709200f700 1 --2- 192.168.123.103:0/599117410 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f7078077870 0x7f7078079d30 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.655+0000 7f709200f700 1 --2- 192.168.123.103:0/599117410 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f708c068730 0x7f708c198e40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.655+0000 7f709200f700 1 --2- 192.168.123.103:0/599117410 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f708c0690e0 0x7f708c199380 unknown :-1 s=CLOSED pgs=205 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.655+0000 7f709200f700 1 -- 192.168.123.103:0/599117410 >> 192.168.123.103:0/599117410 conn(0x7f708c075960 msgr2=0x7f708c102b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.656+0000 7f709200f700 1 -- 192.168.123.103:0/599117410 shutdown_connections 2026-03-09T00:13:16.656 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:16.656+0000 7f709200f700 1 -- 192.168.123.103:0/599117410 wait complete. 
2026-03-09T00:13:16.657 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 31
2026-03-09T00:13:16.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:16 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/3699677077' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch
2026-03-09T00:13:16.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:16 vm06.local ceph-mon[106218]: pgmap v294: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail
2026-03-09T00:13:16.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:16 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/4102238471' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 30, "format": "json"}]: dispatch
2026-03-09T00:13:16.699 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph fs dump --format=json 32
2026-03-09T00:13:16.846 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:13:17.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.096+0000 7fb3cdb4c700 1 -- 192.168.123.103:0/602462395 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb3c8103340 msgr2=0x7fb3c8103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:17.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.096+0000 7fb3cdb4c700 1 --2- 192.168.123.103:0/602462395 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb3c8103340 0x7fb3c8103720 secure :-1 s=READY pgs=206 cs=0 l=1 rev1=1 crypto rx=0x7fb3b0009b00 tx=0x7fb3b0009e10 comp rx=0 tx=0).stop
2026-03-09T00:13:17.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.097+0000 7fb3cdb4c700 1 -- 192.168.123.103:0/602462395 shutdown_connections
2026-03-09T00:13:17.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.097+0000 7fb3cdb4c700 1 --2- 192.168.123.103:0/602462395 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c8103cf0 0x7fb3c8107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:17.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.097+0000 7fb3cdb4c700 1 --2- 192.168.123.103:0/602462395 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb3c8103340 0x7fb3c8103720 unknown :-1 s=CLOSED pgs=206 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:17.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.097+0000 7fb3cdb4c700 1 -- 192.168.123.103:0/602462395 >> 192.168.123.103:0/602462395 conn(0x7fb3c80febd0 msgr2=0x7fb3c8100ff0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:13:17.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.097+0000 7fb3cdb4c700 1 -- 192.168.123.103:0/602462395 shutdown_connections
2026-03-09T00:13:17.097 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.097+0000 7fb3cdb4c700 1 -- 192.168.123.103:0/602462395 wait complete.
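[annotation] The epoch argument on the repeated "ceph fs dump --format=json 30 / 31 / 32" invocations above appears to make the monitor hold its reply until an fsmap at least that new exists, so the harness is effectively polling for fsmap churn. A sketch of that pattern under those assumptions, with the image, fsid, and cephadm path taken from this log (the real harness drives this through teuthology's remote API, not local subprocess):

    import json, subprocess

    CEPHADM = ["sudo", "/home/ubuntu/cephtest/cephadm", "--image",
               "quay.io/ceph/ceph:v18.2.1", "shell", "--fsid",
               "ae8f0172-1b4a-11f1-916a-712b2ac006b7", "--"]

    def fs_dump(epoch=None):
        cmd = CEPHADM + ["ceph", "fs", "dump", "--format=json"]
        if epoch is not None:
            # assumed: mon replies once an fsmap at epoch >= this exists
            cmd.append(str(epoch))
        return json.loads(subprocess.check_output(cmd))

    # wait for the next fsmap after the one we last saw:
    last = fs_dump()["epoch"]
    newer = fs_dump(epoch=last + 1)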
2026-03-09T00:13:17.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.098+0000 7fb3cdb4c700 1 Processor -- start 2026-03-09T00:13:17.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.098+0000 7fb3cdb4c700 1 -- start start 2026-03-09T00:13:17.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.098+0000 7fb3cdb4c700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c8103cf0 0x7fb3c8199000 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:17.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.098+0000 7fb3cdb4c700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb3c8199540 0x7fb3c819d9b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:17.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.098+0000 7fb3cdb4c700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb3c8199ad0 con 0x7fb3c8199540 2026-03-09T00:13:17.098 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.098+0000 7fb3cdb4c700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb3c8199c40 con 0x7fb3c8103cf0 2026-03-09T00:13:17.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.098+0000 7fb3c77fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c8103cf0 0x7fb3c8199000 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:17.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.098+0000 7fb3c77fe700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c8103cf0 0x7fb3c8199000 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:52818/0 (socket says 192.168.123.103:52818) 2026-03-09T00:13:17.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.098+0000 7fb3c77fe700 1 -- 192.168.123.103:0/609710289 learned_addr learned my addr 192.168.123.103:0/609710289 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:17.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.099+0000 7fb3c6ffd700 1 --2- 192.168.123.103:0/609710289 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb3c8199540 0x7fb3c819d9b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:17.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.099+0000 7fb3c77fe700 1 -- 192.168.123.103:0/609710289 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb3c8199540 msgr2=0x7fb3c819d9b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:17.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.099+0000 7fb3c77fe700 1 --2- 192.168.123.103:0/609710289 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb3c8199540 0x7fb3c819d9b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:17.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.099+0000 7fb3c77fe700 1 -- 192.168.123.103:0/609710289 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb3b00097e0 con 
0x7fb3c8103cf0 2026-03-09T00:13:17.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.099+0000 7fb3c6ffd700 1 --2- 192.168.123.103:0/609710289 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb3c8199540 0x7fb3c819d9b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:13:17.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.099+0000 7fb3c77fe700 1 --2- 192.168.123.103:0/609710289 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c8103cf0 0x7fb3c8199000 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7fb3b0009fd0 tx=0x7fb3b00049e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:17.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.099+0000 7fb3c4ff9700 1 -- 192.168.123.103:0/609710289 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb3b001d070 con 0x7fb3c8103cf0 2026-03-09T00:13:17.099 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.099+0000 7fb3cdb4c700 1 -- 192.168.123.103:0/609710289 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb3c819df50 con 0x7fb3c8103cf0 2026-03-09T00:13:17.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.099+0000 7fb3cdb4c700 1 -- 192.168.123.103:0/609710289 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb3c819e4a0 con 0x7fb3c8103cf0 2026-03-09T00:13:17.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.100+0000 7fb3c4ff9700 1 -- 192.168.123.103:0/609710289 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fb3b000bc50 con 0x7fb3c8103cf0 2026-03-09T00:13:17.100 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.100+0000 7fb3c4ff9700 1 -- 192.168.123.103:0/609710289 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb3b000f670 con 0x7fb3c8103cf0 2026-03-09T00:13:17.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.101+0000 7fb3c4ff9700 1 -- 192.168.123.103:0/609710289 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fb3b0022470 con 0x7fb3c8103cf0 2026-03-09T00:13:17.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.101+0000 7fb3cdb4c700 1 -- 192.168.123.103:0/609710289 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb3c810b6e0 con 0x7fb3c8103cf0 2026-03-09T00:13:17.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.101+0000 7fb3c4ff9700 1 --2- 192.168.123.103:0/609710289 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb3b4077870 0x7fb3b4079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:17.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.101+0000 7fb3c4ff9700 1 -- 192.168.123.103:0/609710289 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fb3b009b250 con 0x7fb3c8103cf0 2026-03-09T00:13:17.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.101+0000 7fb3c6ffd700 1 --2- 192.168.123.103:0/609710289 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb3b4077870 0x7fb3b4079d30 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:17.102 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.102+0000 7fb3c6ffd700 1 --2- 192.168.123.103:0/609710289 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb3b4077870 0x7fb3b4079d30 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7fb3b8005fd0 tx=0x7fb3b8005e20 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:17.104 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.104+0000 7fb3c4ff9700 1 -- 192.168.123.103:0/609710289 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb3b0063980 con 0x7fb3c8103cf0 2026-03-09T00:13:17.258 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.258+0000 7fb3cdb4c700 1 -- 192.168.123.103:0/609710289 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 32, "format": "json"} v 0) v1 -- 0x7fb3c804ea90 con 0x7fb3c8103cf0 2026-03-09T00:13:17.260 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.259+0000 7fb3c4ff9700 1 -- 192.168.123.103:0/609710289 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 32, "format": "json"}]=0 dumped fsmap epoch 32 v33) v1 ==== 107+0+5260 (secure 0 0 0) 0x7fb3b00630d0 con 0x7fb3c8103cf0 2026-03-09T00:13:17.260 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:13:17.260 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":32,"btime":"2026-03-09T00:11:34:915522+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34412,"name":"cephfs.vm03.ralade","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.103:6829/1027317762","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6828","nonce":1027317762},{"type":"v1","addr":"192.168.123.103:6829","nonce":1027317762}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":23},{"gid":44297,"name":"cephfs.vm06.vlrwtl","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6825/1987865018","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6824","nonce":1987865018},{"type":"v1","addr":"192.168.123.106:6825","nonce":1987865018}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts 
on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44325,"name":"cephfs.vm06.ixduim","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.106:6827/839008662","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.106:6826","nonce":839008662},{"type":"v1","addr":"192.168.123.106:6827","nonce":839008662}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":31}],"filesystems":[{"mdsmap":{"epoch":32,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-09T00:01:42.952984+0000","modified":"2026-03-09T00:11:33.917506+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":109,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34404},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34404":{"gid":34404,"name":"cephfs.vm03.sejksk","rank":0,"incarnation":27,"state":"up:active","state_seq":12,"addr":"192.168.123.103:6827/784666836","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6826","nonce":784666836},{"type":"v1","addr":"192.168.123.103:6827","nonce":784666836}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34404,"qdb_cluster":[34404]},"id":1}]} 2026-03-09T00:13:17.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.262+0000 7fb3cdb4c700 1 -- 192.168.123.103:0/609710289 >> 
[v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb3b4077870 msgr2=0x7fb3b4079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:17.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.262+0000 7fb3cdb4c700 1 --2- 192.168.123.103:0/609710289 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb3b4077870 0x7fb3b4079d30 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7fb3b8005fd0 tx=0x7fb3b8005e20 comp rx=0 tx=0).stop 2026-03-09T00:13:17.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.262+0000 7fb3cdb4c700 1 -- 192.168.123.103:0/609710289 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c8103cf0 msgr2=0x7fb3c8199000 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:17.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.262+0000 7fb3cdb4c700 1 --2- 192.168.123.103:0/609710289 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c8103cf0 0x7fb3c8199000 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7fb3b0009fd0 tx=0x7fb3b00049e0 comp rx=0 tx=0).stop 2026-03-09T00:13:17.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.262+0000 7fb3cdb4c700 1 -- 192.168.123.103:0/609710289 shutdown_connections 2026-03-09T00:13:17.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.262+0000 7fb3cdb4c700 1 --2- 192.168.123.103:0/609710289 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fb3b4077870 0x7fb3b4079d30 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:17.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.262+0000 7fb3cdb4c700 1 --2- 192.168.123.103:0/609710289 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fb3c8103cf0 0x7fb3c8199000 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:17.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.262+0000 7fb3cdb4c700 1 --2- 192.168.123.103:0/609710289 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fb3c8199540 0x7fb3c819d9b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:17.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.262+0000 7fb3cdb4c700 1 -- 192.168.123.103:0/609710289 >> 192.168.123.103:0/609710289 conn(0x7fb3c80febd0 msgr2=0x7fb3c8100250 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:17.262 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.262+0000 7fb3cdb4c700 1 -- 192.168.123.103:0/609710289 shutdown_connections 2026-03-09T00:13:17.263 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.262+0000 7fb3cdb4c700 1 -- 192.168.123.103:0/609710289 wait complete. 2026-03-09T00:13:17.264 INFO:teuthology.orchestra.run.vm03.stderr:dumped fsmap epoch 32 2026-03-09T00:13:17.305 DEBUG:teuthology.run_tasks:Unwinding manager kclient 2026-03-09T00:13:17.308 INFO:tasks.kclient:Unmounting kernel clients... 
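[annotation] Teardown of the kclient task starts here. Before unmounting, the harness dumps /proc/self/mounts on each node (the "dd if=/proc/self/mounts of=/dev/stdout" commands below) to find live kernel mounts. A minimal local sketch of that check, assuming the mountpoint prefix used in this run; the kernel CephFS mounts appear with fstype "ceph":

    def ceph_mounts(mountpoint_prefix="/home/ubuntu/cephtest/mnt."):
        # /proc/self/mounts: device mountpoint fstype options dump pass
        found = []
        with open("/proc/self/mounts") as f:
            for line in f:
                dev, mnt, fstype = line.split()[:3]
                if mnt.startswith(mountpoint_prefix) and fstype == "ceph":
                    found.append((dev, mnt))
        return found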
2026-03-09T00:13:17.308 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T00:13:17.308 DEBUG:teuthology.orchestra.run.vm03:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-09T00:13:17.329 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-09T00:13:17.329 DEBUG:teuthology.orchestra.run.vm03:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-09T00:13:17.388 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd blocklist ls 2026-03-09T00:13:17.566 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:17 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/599117410' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 31, "format": "json"}]: dispatch 2026-03-09T00:13:17.566 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:17 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/609710289' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 32, "format": "json"}]: dispatch 2026-03-09T00:13:17.574 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:17.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:17 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/599117410' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 31, "format": "json"}]: dispatch 2026-03-09T00:13:17.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:17 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/609710289' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 32, "format": "json"}]: dispatch 2026-03-09T00:13:17.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.817+0000 7fa56eca6700 1 -- 192.168.123.103:0/711919776 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa568101280 msgr2=0x7fa568101660 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:17.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.817+0000 7fa56eca6700 1 --2- 192.168.123.103:0/711919776 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa568101280 0x7fa568101660 secure :-1 s=READY pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7fa554009b00 tx=0x7fa554009e10 comp rx=0 tx=0).stop 2026-03-09T00:13:17.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.819+0000 7fa56eca6700 1 -- 192.168.123.103:0/711919776 shutdown_connections 2026-03-09T00:13:17.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.819+0000 7fa56eca6700 1 --2- 192.168.123.103:0/711919776 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa568101c30 0x7fa568105bd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:17.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.819+0000 7fa56eca6700 1 --2- 192.168.123.103:0/711919776 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa568101280 0x7fa568101660 unknown :-1 s=CLOSED pgs=207 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:17.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.819+0000 7fa56eca6700 1 -- 192.168.123.103:0/711919776 >> 192.168.123.103:0/711919776 conn(0x7fa568078ed0 msgr2=0x7fa5680792e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:17.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.819+0000 7fa56eca6700 1 -- 192.168.123.103:0/711919776 shutdown_connections 2026-03-09T00:13:17.820 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.819+0000 7fa56eca6700 1 -- 192.168.123.103:0/711919776 wait complete. 2026-03-09T00:13:17.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.820+0000 7fa56eca6700 1 Processor -- start 2026-03-09T00:13:17.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.820+0000 7fa56eca6700 1 -- start start 2026-03-09T00:13:17.820 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.820+0000 7fa56eca6700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa568101280 0x7fa56819a790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:17.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.820+0000 7fa56eca6700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa568101c30 0x7fa56819acd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:17.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.820+0000 7fa56eca6700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa56819b2f0 con 0x7fa568101280 2026-03-09T00:13:17.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.820+0000 7fa56eca6700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa568194880 con 0x7fa568101c30 2026-03-09T00:13:17.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.820+0000 7fa567fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa568101c30 0x7fa56819acd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:17.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.820+0000 7fa567fff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa568101c30 0x7fa56819acd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:52840/0 (socket says 192.168.123.103:52840) 2026-03-09T00:13:17.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.821+0000 7fa567fff700 1 -- 192.168.123.103:0/2454619717 learned_addr learned my addr 192.168.123.103:0/2454619717 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:17.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.821+0000 7fa56ca42700 1 --2- 192.168.123.103:0/2454619717 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa568101280 0x7fa56819a790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:17.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.821+0000 7fa567fff700 1 -- 192.168.123.103:0/2454619717 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa568101280 msgr2=0x7fa56819a790 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:17.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.821+0000 7fa567fff700 1 --2- 192.168.123.103:0/2454619717 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa568101280 0x7fa56819a790 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:17.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.821+0000 7fa567fff700 1 -- 
192.168.123.103:0/2454619717 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa5540097e0 con 0x7fa568101c30 2026-03-09T00:13:17.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.821+0000 7fa56ca42700 1 --2- 192.168.123.103:0/2454619717 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa568101280 0x7fa56819a790 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-09T00:13:17.821 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.821+0000 7fa567fff700 1 --2- 192.168.123.103:0/2454619717 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa568101c30 0x7fa56819acd0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7fa55c009fd0 tx=0x7fa55c00eea0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:17.822 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.821+0000 7fa565ffb700 1 -- 192.168.123.103:0/2454619717 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa55c00cca0 con 0x7fa568101c30 2026-03-09T00:13:17.822 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.821+0000 7fa56eca6700 1 -- 192.168.123.103:0/2454619717 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa568194b60 con 0x7fa568101c30 2026-03-09T00:13:17.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.821+0000 7fa565ffb700 1 -- 192.168.123.103:0/2454619717 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa55c00ce00 con 0x7fa568101c30 2026-03-09T00:13:17.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.821+0000 7fa565ffb700 1 -- 192.168.123.103:0/2454619717 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa55c017880 con 0x7fa568101c30 2026-03-09T00:13:17.823 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.822+0000 7fa56eca6700 1 -- 192.168.123.103:0/2454619717 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa5681950b0 con 0x7fa568101c30 2026-03-09T00:13:17.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.822+0000 7fa56eca6700 1 -- 192.168.123.103:0/2454619717 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa568109540 con 0x7fa568101c30 2026-03-09T00:13:17.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.823+0000 7fa565ffb700 1 -- 192.168.123.103:0/2454619717 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fa55c010bd0 con 0x7fa568101c30 2026-03-09T00:13:17.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.823+0000 7fa565ffb700 1 --2- 192.168.123.103:0/2454619717 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa55807bcd0 0x7fa55807e190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:17.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.823+0000 7fa56ca42700 1 --2- 192.168.123.103:0/2454619717 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa55807bcd0 0x7fa55807e190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-09T00:13:17.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.824+0000 7fa565ffb700 1 -- 192.168.123.103:0/2454619717 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7fa55c014070 con 0x7fa568101c30 2026-03-09T00:13:17.824 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.824+0000 7fa56ca42700 1 --2- 192.168.123.103:0/2454619717 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa55807bcd0 0x7fa55807e190 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7fa55400b5c0 tx=0x7fa554005fd0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:17.826 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.826+0000 7fa565ffb700 1 -- 192.168.123.103:0/2454619717 <== mon.1 v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fa55c061f50 con 0x7fa568101c30 2026-03-09T00:13:17.953 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.952+0000 7fa56eca6700 1 -- 192.168.123.103:0/2454619717 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7fa56804ea90 con 0x7fa568101c30 2026-03-09T00:13:17.954 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.953+0000 7fa565ffb700 1 -- 192.168.123.103:0/2454619717 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd blocklist ls"}]=0 listed 37 entries v109) v1 ==== 81+0+2273 (secure 0 0 0) 0x7fa55c0616a0 con 0x7fa568101c30 2026-03-09T00:13:17.955 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:6826/1001012017 2026-03-10T00:11:20.924906+0000 2026-03-09T00:13:17.955 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6829/3870847623 2026-03-10T00:10:58.287721+0000 2026-03-09T00:13:17.955 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6828/3870847623 2026-03-10T00:10:58.287721+0000 2026-03-09T00:13:17.955 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6827/3708505754 2026-03-10T00:10:44.533274+0000 2026-03-09T00:13:17.955 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6826/3708505754 2026-03-10T00:10:44.533274+0000 2026-03-09T00:13:17.955 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2768716536 2026-03-09T23:59:59.972086+0000 2026-03-09T00:13:17.955 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/933672084 2026-03-10T00:05:43.508227+0000 2026-03-09T00:13:17.955 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2723392878 2026-03-09T23:59:59.972086+0000 2026-03-09T00:13:17.955 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2931592692 2026-03-10T00:03:34.458837+0000 2026-03-09T00:13:17.955 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:6824/3799306593 2026-03-10T00:01:48.409357+0000 2026-03-09T00:13:17.955 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/78662117 2026-03-09T23:59:25.439954+0000 2026-03-09T00:13:17.955 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/2 2026-03-09T23:59:11.665534+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:6825/3799306593 2026-03-10T00:01:48.409357+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3475137602 2026-03-10T00:05:43.508227+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:6827/1001012017 
2026-03-10T00:11:20.924906+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3723784945 2026-03-10T00:03:34.458837+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:0/332492090 2026-03-10T00:03:55.861531+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/3123605642 2026-03-10T00:05:43.508227+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/2 2026-03-09T23:59:11.665534+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:0/718053161 2026-03-10T00:03:55.861531+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2143338907 2026-03-09T23:59:25.439954+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:0/875696685 2026-03-10T00:03:55.861531+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2345556975 2026-03-10T00:03:34.458837+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:6828/4100748704 2026-03-10T00:03:55.861531+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2153722008 2026-03-10T00:05:43.508227+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:6829/4100748704 2026-03-10T00:03:55.861531+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2535486397 2026-03-09T23:59:59.972086+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:0/462194877 2026-03-10T00:03:55.861531+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2800828829 2026-03-10T00:03:34.458837+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1899662013 2026-03-09T23:59:11.665534+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1691249097 2026-03-09T23:59:11.665534+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/3123605642 2026-03-10T00:05:43.508227+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1443304653 2026-03-09T23:59:11.665534+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2762219228 2026-03-09T23:59:25.439954+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/181593037 2026-03-10T00:05:43.508227+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:0/3624897156 2026-03-10T00:03:55.861531+0000 2026-03-09T00:13:17.956 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:0/2795473619 2026-03-10T00:03:55.861531+0000 2026-03-09T00:13:17.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.956+0000 7fa56eca6700 1 -- 192.168.123.103:0/2454619717 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa55807bcd0 msgr2=0x7fa55807e190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:17.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.956+0000 7fa56eca6700 1 --2- 192.168.123.103:0/2454619717 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa55807bcd0 0x7fa55807e190 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7fa55400b5c0 tx=0x7fa554005fd0 comp rx=0 tx=0).stop 2026-03-09T00:13:17.957 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.957+0000 7fa56eca6700 1 -- 192.168.123.103:0/2454619717 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa568101c30 msgr2=0x7fa56819acd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:17.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.957+0000 7fa56eca6700 1 --2- 192.168.123.103:0/2454619717 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa568101c30 0x7fa56819acd0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7fa55c009fd0 tx=0x7fa55c00eea0 comp rx=0 tx=0).stop 2026-03-09T00:13:17.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.957+0000 7fa56eca6700 1 -- 192.168.123.103:0/2454619717 shutdown_connections 2026-03-09T00:13:17.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.957+0000 7fa56eca6700 1 --2- 192.168.123.103:0/2454619717 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7fa55807bcd0 0x7fa55807e190 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:17.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.957+0000 7fa56eca6700 1 --2- 192.168.123.103:0/2454619717 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7fa568101280 0x7fa56819a790 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:17.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.957+0000 7fa56eca6700 1 --2- 192.168.123.103:0/2454619717 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7fa568101c30 0x7fa56819acd0 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:17.957 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.957+0000 7fa56eca6700 1 -- 192.168.123.103:0/2454619717 >> 192.168.123.103:0/2454619717 conn(0x7fa568078ed0 msgr2=0x7fa5681008d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:17.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.957+0000 7fa56eca6700 1 -- 192.168.123.103:0/2454619717 shutdown_connections 2026-03-09T00:13:17.958 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:17.957+0000 7fa56eca6700 1 -- 192.168.123.103:0/2454619717 wait complete. 2026-03-09T00:13:17.959 INFO:teuthology.orchestra.run.vm03.stderr:listed 37 entries 2026-03-09T00:13:18.005 DEBUG:tasks.cephfs.kernel_mount:Unmounting client client.0... 
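[annotation] The "ceph osd blocklist ls" output above ("listed 37 entries") is one entry per line, pairing a client addr/nonce with the expiry time of its blocklist entry. A sketch of parsing that plain-text listing; the strptime format matches the +0000 offsets printed in this log:

    from datetime import datetime

    def parse_blocklist(text):
        entries = []
        for line in text.splitlines():
            if not line.strip():
                continue
            addr, stamp = line.rsplit(None, 1)
            entries.append((addr, datetime.strptime(stamp, "%Y-%m-%dT%H:%M:%S.%f%z")))
        return entries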
2026-03-09T00:13:18.005 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T00:13:18.005 DEBUG:teuthology.orchestra.run.vm03:> sudo umount /home/ubuntu/cephtest/mnt.0
2026-03-09T00:13:18.035 INFO:tasks.cephfs.mount:Cleaning up mount ubuntu@vm03.local
2026-03-09T00:13:18.035 INFO:teuthology.orchestra.run:Running command with timeout 300
2026-03-09T00:13:18.035 DEBUG:teuthology.orchestra.run.vm03:> (cd /home/ubuntu/cephtest && exec rmdir -- /home/ubuntu/cephtest/mnt.0)
2026-03-09T00:13:18.105 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-09T00:13:18.105 DEBUG:teuthology.orchestra.run.vm06:> dd if=/proc/self/mounts of=/dev/stdout
2026-03-09T00:13:18.124 DEBUG:teuthology.orchestra.run.vm06:> set -ex
2026-03-09T00:13:18.124 DEBUG:teuthology.orchestra.run.vm06:> dd if=/proc/self/mounts of=/dev/stdout
2026-03-09T00:13:18.179 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph osd blocklist ls
2026-03-09T00:13:18.321 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config
2026-03-09T00:13:18.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.569+0000 7f2452d7e700 1 -- 192.168.123.103:0/310589351 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f244c103340 msgr2=0x7f244c103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-09T00:13:18.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.569+0000 7f2452d7e700 1 --2- 192.168.123.103:0/310589351 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f244c103340 0x7f244c103720 secure :-1 s=READY pgs=208 cs=0 l=1 rev1=1 crypto rx=0x7f243c009b50 tx=0x7f243c009e60 comp rx=0 tx=0).stop
2026-03-09T00:13:18.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.570+0000 7f2452d7e700 1 -- 192.168.123.103:0/310589351 shutdown_connections
2026-03-09T00:13:18.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.570+0000 7f2452d7e700 1 --2- 192.168.123.103:0/310589351 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f244c103cf0 0x7f244c107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:18.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.570+0000 7f2452d7e700 1 --2- 192.168.123.103:0/310589351 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f244c103340 0x7f244c103720 unknown :-1 s=CLOSED pgs=208 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-09T00:13:18.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.570+0000 7f2452d7e700 1 -- 192.168.123.103:0/310589351 >> 192.168.123.103:0/310589351 conn(0x7f244c0feb90 msgr2=0x7f244c100fb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-09T00:13:18.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.570+0000 7f2452d7e700 1 -- 192.168.123.103:0/310589351 shutdown_connections
2026-03-09T00:13:18.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.570+0000 7f2452d7e700 1 -- 192.168.123.103:0/310589351 wait complete.
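[annotation] This second "ceph osd blocklist ls", taken right after the umount, reads as a before/after snapshot around the unmount: if the kernel client's address showed up as a new blocklist entry here, it would suggest the client was evicted rather than releasing its session cleanly. An illustrative comparison, reusing parse_blocklist() from the earlier note (names here are illustrative, not the harness's own):

    def newly_blocklisted(before_text, after_text):
        # addresses present after the unmount but not before it
        before = {addr for addr, _ in parse_blocklist(before_text)}
        return [addr for addr, _ in parse_blocklist(after_text) if addr not in before]

    # An empty result after a clean unmount means the client was not evicted.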
2026-03-09T00:13:18.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.571+0000 7f2452d7e700 1 Processor -- start 2026-03-09T00:13:18.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.571+0000 7f2452d7e700 1 -- start start 2026-03-09T00:13:18.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.571+0000 7f2452d7e700 1 --2- >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f244c103340 0x7f244c0752a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:18.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.571+0000 7f2452d7e700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f244c103cf0 0x7f244c0757e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:18.571 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.571+0000 7f2452d7e700 1 -- --> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f244c079430 con 0x7f244c103340 2026-03-09T00:13:18.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.571+0000 7f2452d7e700 1 -- --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f244c075d20 con 0x7f244c103cf0 2026-03-09T00:13:18.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.571+0000 7f244bfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f244c103cf0 0x7f244c0757e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:18.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.571+0000 7f244bfff700 1 --2- >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f244c103cf0 0x7f244c0757e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.106:3300/0 says I am v2:192.168.123.103:52860/0 (socket says 192.168.123.103:52860) 2026-03-09T00:13:18.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.571+0000 7f244bfff700 1 -- 192.168.123.103:0/2521653017 learned_addr learned my addr 192.168.123.103:0/2521653017 (peer_addr_for_me v2:192.168.123.103:0/0) 2026-03-09T00:13:18.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.571+0000 7f244bfff700 1 -- 192.168.123.103:0/2521653017 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f244c103340 msgr2=0x7f244c0752a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-09T00:13:18.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.571+0000 7f244bfff700 1 --2- 192.168.123.103:0/2521653017 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f244c103340 0x7f244c0752a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:18.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.572+0000 7f244bfff700 1 -- 192.168.123.103:0/2521653017 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f243c0097e0 con 0x7f244c103cf0 2026-03-09T00:13:18.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.572+0000 7f244bfff700 1 --2- 192.168.123.103:0/2521653017 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f244c103cf0 0x7f244c0757e0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f244000d900 tx=0x7f244000dcc0 comp rx=0 tx=0).ready entity=mon.1 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:18.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.572+0000 7f2449ffb700 1 -- 192.168.123.103:0/2521653017 <== mon.1 v2:192.168.123.106:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f24400098e0 con 0x7f244c103cf0 2026-03-09T00:13:18.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.572+0000 7f2452d7e700 1 -- 192.168.123.103:0/2521653017 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f244c075f80 con 0x7f244c103cf0 2026-03-09T00:13:18.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.572+0000 7f2452d7e700 1 -- 192.168.123.103:0/2521653017 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f244c071b10 con 0x7f244c103cf0 2026-03-09T00:13:18.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.572+0000 7f2449ffb700 1 -- 192.168.123.103:0/2521653017 <== mon.1 v2:192.168.123.106:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2440010460 con 0x7f244c103cf0 2026-03-09T00:13:18.572 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.572+0000 7f2449ffb700 1 -- 192.168.123.103:0/2521653017 <== mon.1 v2:192.168.123.106:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f244000f5d0 con 0x7f244c103cf0 2026-03-09T00:13:18.573 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.573+0000 7f2452d7e700 1 -- 192.168.123.103:0/2521653017 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f244c04ea90 con 0x7f244c103cf0 2026-03-09T00:13:18.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.574+0000 7f2449ffb700 1 -- 192.168.123.103:0/2521653017 <== mon.1 v2:192.168.123.106:3300/0 4 ==== mgrmap(e 39) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f24400105d0 con 0x7f244c103cf0 2026-03-09T00:13:18.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.574+0000 7f2449ffb700 1 --2- 192.168.123.103:0/2521653017 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f24340778c0 0x7f2434079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-09T00:13:18.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.574+0000 7f2449ffb700 1 -- 192.168.123.103:0/2521653017 <== mon.1 v2:192.168.123.106:3300/0 5 ==== osd_map(109..109 src has 1..109) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f2440099280 con 0x7f244c103cf0 2026-03-09T00:13:18.574 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.574+0000 7f2450b1a700 1 --2- 192.168.123.103:0/2521653017 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f24340778c0 0x7f2434079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-09T00:13:18.575 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.575+0000 7f2450b1a700 1 --2- 192.168.123.103:0/2521653017 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f24340778c0 0x7f2434079d80 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f243c000c00 tx=0x7f243c005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-09T00:13:18.577 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.577+0000 7f2449ffb700 1 -- 192.168.123.103:0/2521653017 <== mon.1 
v2:192.168.123.106:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f24400619b0 con 0x7f244c103cf0 2026-03-09T00:13:18.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:18 vm06.local ceph-mon[106218]: pgmap v295: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:13:18.671 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:18 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/2454619717' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-09T00:13:18.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:18 vm03.local ceph-mon[129670]: pgmap v295: 65 pgs: 65 active+clean; 209 MiB data, 888 MiB used, 119 GiB / 120 GiB avail 2026-03-09T00:13:18.700 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:18 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2454619717' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-09T00:13:18.701 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.699+0000 7f2452d7e700 1 -- 192.168.123.103:0/2521653017 --> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] -- mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7f244c066e80 con 0x7f244c103cf0 2026-03-09T00:13:18.703 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.702+0000 7f2449ffb700 1 -- 192.168.123.103:0/2521653017 <== mon.1 v2:192.168.123.106:3300/0 7 ==== mon_command_ack([{"prefix": "osd blocklist ls"}]=0 listed 37 entries v109) v1 ==== 81+0+2273 (secure 0 0 0) 0x7f2440061100 con 0x7f244c103cf0 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:6826/1001012017 2026-03-10T00:11:20.924906+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6829/3870847623 2026-03-10T00:10:58.287721+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6828/3870847623 2026-03-10T00:10:58.287721+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6827/3708505754 2026-03-10T00:10:44.533274+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6826/3708505754 2026-03-10T00:10:44.533274+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2768716536 2026-03-09T23:59:59.972086+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/933672084 2026-03-10T00:05:43.508227+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2723392878 2026-03-09T23:59:59.972086+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2931592692 2026-03-10T00:03:34.458837+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:6824/3799306593 2026-03-10T00:01:48.409357+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/78662117 2026-03-09T23:59:25.439954+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/2 2026-03-09T23:59:11.665534+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:6825/3799306593 2026-03-10T00:01:48.409357+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3475137602 2026-03-10T00:05:43.508227+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:6827/1001012017 2026-03-10T00:11:20.924906+0000 
2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/3723784945 2026-03-10T00:03:34.458837+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:0/332492090 2026-03-10T00:03:55.861531+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6801/3123605642 2026-03-10T00:05:43.508227+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/2 2026-03-09T23:59:11.665534+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:0/718053161 2026-03-10T00:03:55.861531+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2143338907 2026-03-09T23:59:25.439954+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:0/875696685 2026-03-10T00:03:55.861531+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2345556975 2026-03-10T00:03:34.458837+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:6828/4100748704 2026-03-10T00:03:55.861531+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2153722008 2026-03-10T00:05:43.508227+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:6829/4100748704 2026-03-10T00:03:55.861531+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2535486397 2026-03-09T23:59:59.972086+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:0/462194877 2026-03-10T00:03:55.861531+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2800828829 2026-03-10T00:03:34.458837+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1899662013 2026-03-09T23:59:11.665534+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1691249097 2026-03-09T23:59:11.665534+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:6800/3123605642 2026-03-10T00:05:43.508227+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/1443304653 2026-03-09T23:59:11.665534+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/2762219228 2026-03-09T23:59:25.439954+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.103:0/181593037 2026-03-10T00:05:43.508227+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:0/3624897156 2026-03-10T00:03:55.861531+0000 2026-03-09T00:13:18.704 INFO:teuthology.orchestra.run.vm03.stdout:192.168.123.106:0/2795473619 2026-03-10T00:03:55.861531+0000 2026-03-09T00:13:18.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.706+0000 7f2452d7e700 1 -- 192.168.123.103:0/2521653017 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f24340778c0 msgr2=0x7f2434079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:18.706 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.706+0000 7f2452d7e700 1 --2- 192.168.123.103:0/2521653017 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f24340778c0 0x7f2434079d80 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f243c000c00 tx=0x7f243c005fb0 comp rx=0 tx=0).stop 2026-03-09T00:13:18.706 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.706+0000 7f2452d7e700 1 -- 192.168.123.103:0/2521653017 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f244c103cf0 msgr2=0x7f244c0757e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-09T00:13:18.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.706+0000 7f2452d7e700 1 --2- 192.168.123.103:0/2521653017 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f244c103cf0 0x7f244c0757e0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f244000d900 tx=0x7f244000dcc0 comp rx=0 tx=0).stop 2026-03-09T00:13:18.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.706+0000 7f2452d7e700 1 -- 192.168.123.103:0/2521653017 shutdown_connections 2026-03-09T00:13:18.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.706+0000 7f2452d7e700 1 --2- 192.168.123.103:0/2521653017 >> [v2:192.168.123.103:6800/1313678299,v1:192.168.123.103:6801/1313678299] conn(0x7f24340778c0 0x7f2434079d80 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:18.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.706+0000 7f2452d7e700 1 --2- 192.168.123.103:0/2521653017 >> [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] conn(0x7f244c103340 0x7f244c0752a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:18.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.706+0000 7f2452d7e700 1 --2- 192.168.123.103:0/2521653017 >> [v2:192.168.123.106:3300/0,v1:192.168.123.106:6789/0] conn(0x7f244c103cf0 0x7f244c0757e0 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-09T00:13:18.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.706+0000 7f2452d7e700 1 -- 192.168.123.103:0/2521653017 >> 192.168.123.103:0/2521653017 conn(0x7f244c0feb90 msgr2=0x7f244c100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-09T00:13:18.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.706+0000 7f2452d7e700 1 -- 192.168.123.103:0/2521653017 shutdown_connections 2026-03-09T00:13:18.707 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-09T00:13:18.706+0000 7f2452d7e700 1 -- 192.168.123.103:0/2521653017 wait complete. 2026-03-09T00:13:18.708 INFO:teuthology.orchestra.run.vm03.stderr:listed 37 entries 2026-03-09T00:13:18.746 DEBUG:tasks.cephfs.kernel_mount:Unmounting client client.1... 
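The 37 blocklist entries printed above are the OSD-map blocklist accumulated over the run, one addr/expiry pair per line; each torn-down kclient instance and each replaced daemon leaves one behind, and entries age out at the printed expiry time. The harness reads them through a containerized shell, so no ceph CLI is needed on the host. A sketch of the same check using the fsid and image from this run; the blocklist rm address is one of the entries listed above and is shown only as an example:

    fsid=ae8f0172-1b4a-11f1-916a-712b2ac006b7
    sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 \
        shell --fsid "$fsid" -- ceph osd blocklist ls
    # an entry can be cleared before its expiry if it blocks a remount:
    sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 \
        shell --fsid "$fsid" -- ceph osd blocklist rm 192.168.123.103:0/933672084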
2026-03-09T00:13:18.746 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:13:18.747 DEBUG:teuthology.orchestra.run.vm06:> sudo umount /home/ubuntu/cephtest/mnt.1 2026-03-09T00:13:18.774 INFO:tasks.cephfs.mount:Cleaning up mount ubuntu@vm06.local 2026-03-09T00:13:18.775 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:13:18.775 DEBUG:teuthology.orchestra.run.vm06:> (cd /home/ubuntu/cephtest && exec rmdir -- /home/ubuntu/cephtest/mnt.1) 2026-03-09T00:13:18.838 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:13:18.838 DEBUG:teuthology.orchestra.run.vm03:> ip netns list 2026-03-09T00:13:18.854 INFO:teuthology.orchestra.run.vm03.stdout:ceph-ns--home-ubuntu-cephtest-mnt.0 (id: 0) 2026-03-09T00:13:18.854 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:13:18.854 DEBUG:teuthology.orchestra.run.vm03:> sudo ip netns delete ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-09T00:13:18.920 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:13:18.920 DEBUG:teuthology.orchestra.run.vm03:> sudo ip link delete ceph-brx 2026-03-09T00:13:19.003 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:13:19.004 DEBUG:teuthology.orchestra.run.vm06:> ip netns list 2026-03-09T00:13:19.019 INFO:teuthology.orchestra.run.vm06.stdout:ceph-ns--home-ubuntu-cephtest-mnt.1 (id: 0) 2026-03-09T00:13:19.019 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:13:19.020 DEBUG:teuthology.orchestra.run.vm06:> sudo ip netns delete ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-09T00:13:19.085 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-09T00:13:19.085 DEBUG:teuthology.orchestra.run.vm06:> sudo ip link delete ceph-brx 2026-03-09T00:13:19.165 DEBUG:teuthology.run_tasks:Unwinding manager cephadm 2026-03-09T00:13:19.168 INFO:tasks.cephadm:Teardown begin 2026-03-09T00:13:19.168 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-09T00:13:19.196 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-09T00:13:19.230 INFO:tasks.cephadm:Disabling cephadm mgr module 2026-03-09T00:13:19.230 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 -- ceph mgr module disable cephadm 2026-03-09T00:13:19.376 INFO:teuthology.orchestra.run.vm03.stderr:Inferring config /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/mon.vm03/config 2026-03-09T00:13:19.506 INFO:teuthology.orchestra.run.vm03.stderr:Error: statfs /etc/ceph/ceph.client.admin.keyring: no such file or directory 2026-03-09T00:13:19.521 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:19 vm03.local ceph-mon[129670]: from='client.? 192.168.123.103:0/2521653017' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-09T00:13:19.522 DEBUG:teuthology.orchestra.run:got remote process result: 125 2026-03-09T00:13:19.522 INFO:tasks.cephadm:Cleaning up testdir ceph.* files... 
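The exit status 125 above is podman's own error code rather than a failure of the ceph command: teardown removed /etc/ceph/ceph.conf and the admin keyring first, and the subsequent cephadm shell invocation still passes -k /etc/ceph/ceph.client.admin.keyring, so podman cannot statfs the bind-mount source (the config itself was still inferred from the mon data directory). A sketch of the ordering that avoids the error, reusing the exact commands from this run with the module disabled while the keyring still exists:

    fsid=ae8f0172-1b4a-11f1-916a-712b2ac006b7
    sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 \
        shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring \
        --fsid "$fsid" -- ceph mgr module disable cephadm
    sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring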
2026-03-09T00:13:19.522 DEBUG:teuthology.orchestra.run.vm03:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub 2026-03-09T00:13:19.537 DEBUG:teuthology.orchestra.run.vm06:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub 2026-03-09T00:13:19.551 INFO:tasks.cephadm:Stopping all daemons... 2026-03-09T00:13:19.551 INFO:tasks.cephadm.mon.vm03:Stopping mon.vm03... 2026-03-09T00:13:19.551 DEBUG:teuthology.orchestra.run.vm03:> sudo systemctl stop ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mon.vm03 2026-03-09T00:13:19.776 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:19 vm03.local systemd[1]: Stopping Ceph mon.vm03 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 2026-03-09T00:13:19.776 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:19 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03[129666]: 2026-03-09T00:13:19.676+0000 7f803204c640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm03 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T00:13:19.776 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:19 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03[129666]: 2026-03-09T00:13:19.676+0000 7f803204c640 -1 mon.vm03@0(leader) e3 *** Got Signal Terminated *** 2026-03-09T00:13:19.776 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:19 vm03.local podman[170054]: 2026-03-09 00:13:19.76167904 +0000 UTC m=+0.098390370 container died cafe87ec117d4a06b04edbb0a533db1e7f053ff2bb1fe4fd3a0662b859594874 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid) 2026-03-09T00:13:19.776 INFO:journalctl@ceph.mon.vm03.vm03.stdout:Mar 09 00:13:19 vm03.local podman[170054]: 2026-03-09 00:13:19.776896571 +0000 UTC m=+0.113607901 container remove cafe87ec117d4a06b04edbb0a533db1e7f053ff2bb1fe4fd3a0662b859594874 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm03, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.license=GPLv2, 
ceph=True, org.label-schema.schema-version=1.0) 2026-03-09T00:13:19.839 DEBUG:teuthology.orchestra.run.vm03:> sudo pkill -f 'journalctl -f -n 0 -u ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mon.vm03.service' 2026-03-09T00:13:19.871 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T00:13:19.871 INFO:tasks.cephadm.mon.vm03:Stopped mon.vm03 2026-03-09T00:13:19.871 INFO:tasks.cephadm.mon.vm06:Stopping mon.vm06... 2026-03-09T00:13:19.871 DEBUG:teuthology.orchestra.run.vm06:> sudo systemctl stop ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mon.vm06 2026-03-09T00:13:19.892 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:19 vm06.local ceph-mon[106218]: from='client.? 192.168.123.103:0/2521653017' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-09T00:13:20.159 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:19 vm06.local systemd[1]: Stopping Ceph mon.vm06 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 2026-03-09T00:13:20.159 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:19 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm06[106214]: 2026-03-09T00:13:19.968+0000 7faf23e83640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm06 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T00:13:20.159 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:19 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm06[106214]: 2026-03-09T00:13:19.968+0000 7faf23e83640 -1 mon.vm06@1(peon) e3 *** Got Signal Terminated *** 2026-03-09T00:13:20.159 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:20 vm06.local podman[136940]: 2026-03-09 00:13:20.083716522 +0000 UTC m=+0.127943948 container died 33df752aa193f37075d1e20764e59635f39a4d5e5274aa1c1bde4c7a8d1d9e3d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm06, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T00:13:20.159 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:20 vm06.local podman[136940]: 2026-03-09 00:13:20.109783451 +0000 UTC m=+0.154010877 container remove 33df752aa193f37075d1e20764e59635f39a4d5e5274aa1c1bde4c7a8d1d9e3d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm06, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, 
org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default) 2026-03-09T00:13:20.159 INFO:journalctl@ceph.mon.vm06.vm06.stdout:Mar 09 00:13:20 vm06.local bash[136940]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-mon-vm06 2026-03-09T00:13:20.166 DEBUG:teuthology.orchestra.run.vm06:> sudo pkill -f 'journalctl -f -n 0 -u ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@mon.vm06.service' 2026-03-09T00:13:20.240 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T00:13:20.240 INFO:tasks.cephadm.mon.vm06:Stopped mon.vm06 2026-03-09T00:13:20.240 INFO:tasks.cephadm.osd.0:Stopping osd.0... 2026-03-09T00:13:20.240 DEBUG:teuthology.orchestra.run.vm03:> sudo systemctl stop ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.0 2026-03-09T00:13:20.588 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:13:20 vm03.local systemd[1]: Stopping Ceph osd.0 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 2026-03-09T00:13:20.588 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:13:20 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0[139173]: 2026-03-09T00:13:20.338+0000 7fc7ccc99640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T00:13:20.588 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:13:20 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0[139173]: 2026-03-09T00:13:20.338+0000 7fc7ccc99640 -1 osd.0 109 *** Got signal Terminated *** 2026-03-09T00:13:20.588 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:13:20 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0[139173]: 2026-03-09T00:13:20.338+0000 7fc7ccc99640 -1 osd.0 109 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T00:13:25.689 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:13:25 vm03.local podman[170169]: 2026-03-09 00:13:25.377442522 +0000 UTC m=+5.051893802 container died 7112eceae9ce23ba4a76bf3a63ec90108cba7b6f6f6affc826b6fe007a09b262 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default) 2026-03-09T00:13:25.690 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:13:25 vm03.local podman[170169]: 2026-03-09 00:13:25.404663539 +0000 UTC m=+5.079114810 container remove 7112eceae9ce23ba4a76bf3a63ec90108cba7b6f6f6affc826b6fe007a09b262 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0, org.label-schema.vendor=CentOS, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223) 2026-03-09T00:13:25.690 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:13:25 vm03.local bash[170169]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0 2026-03-09T00:13:25.690 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:13:25 vm03.local podman[170235]: 2026-03-09 00:13:25.527004886 +0000 UTC m=+0.015723289 container create ccf2d7890bcbfd531448cc67eacb6813bad9a8b299afd6d8832f0e505665377f (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) 2026-03-09T00:13:25.690 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:13:25 vm03.local podman[170235]: 2026-03-09 00:13:25.567061477 +0000 UTC m=+0.055779900 container init ccf2d7890bcbfd531448cc67eacb6813bad9a8b299afd6d8832f0e505665377f (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) 2026-03-09T00:13:25.690 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:13:25 vm03.local podman[170235]: 2026-03-09 00:13:25.569770537 +0000 UTC m=+0.058488940 container start ccf2d7890bcbfd531448cc67eacb6813bad9a8b299afd6d8832f0e505665377f (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-deactivate, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , 
org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2) 2026-03-09T00:13:25.690 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:13:25 vm03.local podman[170235]: 2026-03-09 00:13:25.57337873 +0000 UTC m=+0.062097153 container attach ccf2d7890bcbfd531448cc67eacb6813bad9a8b299afd6d8832f0e505665377f (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-0-deactivate, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) 2026-03-09T00:13:25.690 INFO:journalctl@ceph.osd.0.vm03.stdout:Mar 09 00:13:25 vm03.local podman[170235]: 2026-03-09 00:13:25.52008768 +0000 UTC m=+0.008806093 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T00:13:25.716 DEBUG:teuthology.orchestra.run.vm03:> sudo pkill -f 'journalctl -f -n 0 -u ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.0.service' 2026-03-09T00:13:25.747 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T00:13:25.747 INFO:tasks.cephadm.osd.0:Stopped osd.0 2026-03-09T00:13:25.747 INFO:tasks.cephadm.osd.1:Stopping osd.1... 2026-03-09T00:13:25.747 DEBUG:teuthology.orchestra.run.vm03:> sudo systemctl stop ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.1 2026-03-09T00:13:26.088 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:13:25 vm03.local systemd[1]: Stopping Ceph osd.1 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 
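Each daemon is stopped through its per-cluster systemd template unit, ceph-<fsid>@<daemon>; on SIGTERM the OSDs log Immediate shutdown (osd_fast_shutdown=true) and exit without a clean state flush, which is why roughly five seconds pass between the Terminated signal and the podman container-died event. The harness also keeps one journalctl follower per unit and pkills it once the daemon is down. A minimal reproduction of that pattern, assuming the fsid from this run:

    fsid=ae8f0172-1b4a-11f1-916a-712b2ac006b7
    unit="ceph-$fsid@osd.1"
    sudo journalctl -f -n 0 -u "$unit.service" &          # follow the unit's journal
    sudo systemctl stop "$unit"                           # SIGTERM -> fast shutdown
    sudo pkill -f "journalctl -f -n 0 -u $unit.service"   # stop the follower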
2026-03-09T00:13:26.088 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:13:25 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1[145089]: 2026-03-09T00:13:25.882+0000 7f87a4613640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T00:13:26.088 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:13:25 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1[145089]: 2026-03-09T00:13:25.882+0000 7f87a4613640 -1 osd.1 109 *** Got signal Terminated *** 2026-03-09T00:13:26.088 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:13:25 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1[145089]: 2026-03-09T00:13:25.882+0000 7f87a4613640 -1 osd.1 109 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T00:13:31.238 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:13:30 vm03.local podman[170330]: 2026-03-09 00:13:30.925553915 +0000 UTC m=+5.056631249 container died e70d2f37c6d1d0c84724d13a480b876202cc45fad647f054e72bd753d31225b0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2) 2026-03-09T00:13:31.238 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:13:30 vm03.local podman[170330]: 2026-03-09 00:13:30.944469448 +0000 UTC m=+5.075546782 container remove e70d2f37c6d1d0c84724d13a480b876202cc45fad647f054e72bd753d31225b0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default) 2026-03-09T00:13:31.238 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:13:30 vm03.local bash[170330]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1 2026-03-09T00:13:31.238 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:13:31 vm03.local podman[170400]: 2026-03-09 00:13:31.073621913 +0000 UTC m=+0.015727968 container create 73eea21376473494a988ef35c78c2bab3e7edf73e144d908b199e0855f41b33c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-deactivate, OSD_FLAVOR=default, 
org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) 2026-03-09T00:13:31.238 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:13:31 vm03.local podman[170400]: 2026-03-09 00:13:31.111019456 +0000 UTC m=+0.053125533 container init 73eea21376473494a988ef35c78c2bab3e7edf73e144d908b199e0855f41b33c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223) 2026-03-09T00:13:31.238 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:13:31 vm03.local podman[170400]: 2026-03-09 00:13:31.113581591 +0000 UTC m=+0.055687657 container start 73eea21376473494a988ef35c78c2bab3e7edf73e144d908b199e0855f41b33c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-deactivate, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid) 2026-03-09T00:13:31.238 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:13:31 vm03.local podman[170400]: 2026-03-09 00:13:31.116420926 +0000 UTC m=+0.058526992 container attach 73eea21376473494a988ef35c78c2bab3e7edf73e144d908b199e0855f41b33c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-1-deactivate, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default) 2026-03-09T00:13:31.238 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:13:31 vm03.local podman[170400]: 2026-03-09 00:13:31.067235048 +0000 UTC m=+0.009341105 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T00:13:31.238 INFO:journalctl@ceph.osd.1.vm03.stdout:Mar 09 00:13:31 vm03.local conmon[170410]: conmon 73eea21376473494a988 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-73eea21376473494a988ef35c78c2bab3e7edf73e144d908b199e0855f41b33c.scope/container/memory.events 2026-03-09T00:13:31.266 DEBUG:teuthology.orchestra.run.vm03:> sudo pkill -f 'journalctl -f -n 0 -u ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.1.service' 2026-03-09T00:13:31.296 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T00:13:31.296 INFO:tasks.cephadm.osd.1:Stopped osd.1 2026-03-09T00:13:31.296 INFO:tasks.cephadm.osd.2:Stopping osd.2... 2026-03-09T00:13:31.296 DEBUG:teuthology.orchestra.run.vm03:> sudo systemctl stop ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.2 2026-03-09T00:13:31.588 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:13:31 vm03.local systemd[1]: Stopping Ceph osd.2 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 2026-03-09T00:13:31.588 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:13:31 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2[149933]: 2026-03-09T00:13:31.430+0000 7fec685c0640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T00:13:31.588 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:13:31 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2[149933]: 2026-03-09T00:13:31.430+0000 7fec685c0640 -1 osd.2 109 *** Got signal Terminated *** 2026-03-09T00:13:31.588 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:13:31 vm03.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2[149933]: 2026-03-09T00:13:31.430+0000 7fec685c0640 -1 osd.2 109 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T00:13:36.762 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:13:36 vm03.local podman[170495]: 2026-03-09 00:13:36.454542479 +0000 UTC m=+5.035861829 container died e7841e7307ae35f7cca41e5e3e7f3843934706ec142a1bdc9863e0a94fc0a2db (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T00:13:36.762 
INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:13:36 vm03.local podman[170495]: 2026-03-09 00:13:36.479754054 +0000 UTC m=+5.061073415 container remove e7841e7307ae35f7cca41e5e3e7f3843934706ec142a1bdc9863e0a94fc0a2db (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-09T00:13:36.762 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:13:36 vm03.local bash[170495]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2 2026-03-09T00:13:36.762 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:13:36 vm03.local podman[170574]: 2026-03-09 00:13:36.604300759 +0000 UTC m=+0.014946155 container create cefcac571525b8dbc9fcc4509c15398f97288be7cb11f7534f88973edd9b6bad (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-deactivate, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-09T00:13:36.762 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:13:36 vm03.local podman[170574]: 2026-03-09 00:13:36.64551716 +0000 UTC m=+0.056162565 container init cefcac571525b8dbc9fcc4509c15398f97288be7cb11f7534f88973edd9b6bad (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-deactivate, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default) 2026-03-09T00:13:36.762 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:13:36 vm03.local podman[170574]: 2026-03-09 00:13:36.648225079 +0000 UTC m=+0.058870474 container start cefcac571525b8dbc9fcc4509c15398f97288be7cb11f7534f88973edd9b6bad 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True) 2026-03-09T00:13:36.762 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:13:36 vm03.local podman[170574]: 2026-03-09 00:13:36.653394013 +0000 UTC m=+0.064039418 container attach cefcac571525b8dbc9fcc4509c15398f97288be7cb11f7534f88973edd9b6bad (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-2-deactivate, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0) 2026-03-09T00:13:36.762 INFO:journalctl@ceph.osd.2.vm03.stdout:Mar 09 00:13:36 vm03.local podman[170574]: 2026-03-09 00:13:36.598409412 +0000 UTC m=+0.009054817 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T00:13:36.804 DEBUG:teuthology.orchestra.run.vm03:> sudo pkill -f 'journalctl -f -n 0 -u ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.2.service' 2026-03-09T00:13:36.834 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T00:13:36.835 INFO:tasks.cephadm.osd.2:Stopped osd.2 2026-03-09T00:13:36.835 INFO:tasks.cephadm.osd.3:Stopping osd.3... 2026-03-09T00:13:36.835 DEBUG:teuthology.orchestra.run.vm06:> sudo systemctl stop ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.3 2026-03-09T00:13:37.171 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:13:36 vm06.local systemd[1]: Stopping Ceph osd.3 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 
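The short-lived osd-N-deactivate containers created after each OSD container dies are part of the unit's stop path: they release the OSD's storage (presumably via ceph-volume deactivation; the exact command line is not shown in this log) so the devices are free for the next run. After teardown, leftovers are easy to spot because every container name carries the ceph-<fsid>- prefix seen above:

    fsid=ae8f0172-1b4a-11f1-916a-712b2ac006b7
    # any surviving cluster containers would show up here
    sudo podman ps -a --filter name=ceph-"$fsid" --format '{{.Names}} {{.Status}}'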
2026-03-09T00:13:37.171 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:13:36 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3[114205]: 2026-03-09T00:13:36.928+0000 7fecd5843640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T00:13:37.171 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:13:36 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3[114205]: 2026-03-09T00:13:36.928+0000 7fecd5843640 -1 osd.3 109 *** Got signal Terminated *** 2026-03-09T00:13:37.171 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:13:36 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3[114205]: 2026-03-09T00:13:36.928+0000 7fecd5843640 -1 osd.3 109 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T00:13:42.289 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:13:41 vm06.local podman[137043]: 2026-03-09 00:13:41.968788821 +0000 UTC m=+5.052570500 container died 8e61be6171398e1b405adf5b066946b3ea009a52ba01924d5d95b99f781bbec1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T00:13:42.289 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:13:41 vm06.local podman[137043]: 2026-03-09 00:13:41.996330151 +0000 UTC m=+5.080111830 container remove 8e61be6171398e1b405adf5b066946b3ea009a52ba01924d5d95b99f781bbec1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) 2026-03-09T00:13:42.289 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:13:41 vm06.local bash[137043]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3 2026-03-09T00:13:42.289 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:13:42 vm06.local podman[137120]: 2026-03-09 00:13:42.12423487 +0000 UTC m=+0.016767375 container create b768930d3ffd9b73e25c93179c3a954a411f3309a1add7ef70c5be96b140900c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-deactivate, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T00:13:42.289 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:13:42 vm06.local podman[137120]: 2026-03-09 00:13:42.166182043 +0000 UTC m=+0.058714548 container init b768930d3ffd9b73e25c93179c3a954a411f3309a1add7ef70c5be96b140900c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3) 2026-03-09T00:13:42.289 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:13:42 vm06.local podman[137120]: 2026-03-09 00:13:42.16888421 +0000 UTC m=+0.061416715 container start b768930d3ffd9b73e25c93179c3a954a411f3309a1add7ef70c5be96b140900c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid) 2026-03-09T00:13:42.289 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:13:42 vm06.local podman[137120]: 2026-03-09 00:13:42.169811796 +0000 UTC m=+0.062344301 container attach b768930d3ffd9b73e25c93179c3a954a411f3309a1add7ef70c5be96b140900c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, ceph=True, io.buildah.version=1.41.3, CEPH_REF=squid, 
org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-09T00:13:42.289 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:13:42 vm06.local podman[137120]: 2026-03-09 00:13:42.116340494 +0000 UTC m=+0.008872999 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-09T00:13:42.289 INFO:journalctl@ceph.osd.3.vm06.stdout:Mar 09 00:13:42 vm06.local podman[137120]: 2026-03-09 00:13:42.288381563 +0000 UTC m=+0.180914068 container died b768930d3ffd9b73e25c93179c3a954a411f3309a1add7ef70c5be96b140900c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-3-deactivate, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid) 2026-03-09T00:13:42.326 DEBUG:teuthology.orchestra.run.vm06:> sudo pkill -f 'journalctl -f -n 0 -u ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.3.service' 2026-03-09T00:13:42.359 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T00:13:42.359 INFO:tasks.cephadm.osd.3:Stopped osd.3 2026-03-09T00:13:42.359 INFO:tasks.cephadm.osd.4:Stopping osd.4... 2026-03-09T00:13:42.359 DEBUG:teuthology.orchestra.run.vm06:> sudo systemctl stop ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.4 2026-03-09T00:13:42.670 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:13:42 vm06.local systemd[1]: Stopping Ceph osd.4 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 
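Each OSD stop above follows the same sequence, and it repeats for osd.4 and osd.5 below: teuthology runs `systemctl stop ceph-<fsid>@osd.N`, /run/podman-init (PID 1 inside the container) forwards SIGTERM, and the OSD exits immediately because osd_fast_shutdown defaults to true; cephadm's unit then runs the short-lived `...-osd-N-deactivate` container seen after each container remove, and finally the journalctl follower is killed. A minimal sketch of that loop, reconstructed from the commands the log echoes (not teuthology's actual code):

```python
# Sketch of the per-daemon stop loop visible above; FSID is the cluster
# fsid from this run, everything else is inferred from the echoed commands.
import subprocess

FSID = "ae8f0172-1b4a-11f1-916a-712b2ac006b7"

def stop_daemon(daemon: str) -> None:
    unit = f"ceph-{FSID}@{daemon}"
    # systemd stops the podman container; SIGTERM arrives via podman-init
    # and the OSD logs "Immediate shutdown (osd_fast_shutdown=true)".
    subprocess.run(["sudo", "systemctl", "stop", unit], check=True)
    # Kill the `journalctl -f` follower that was streaming the unit's log;
    # check=False mirrors teuthology ignoring pkill's exit status.
    subprocess.run(
        ["sudo", "pkill", "-f", f"journalctl -f -n 0 -u {unit}.service"],
        check=False,
    )

for osd in ("osd.3", "osd.4", "osd.5"):
    stop_daemon(osd)
```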
2026-03-09T00:13:42.671 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:13:42 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4[118634]: 2026-03-09T00:13:42.487+0000 7fb8a9c3c640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T00:13:42.671 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:13:42 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4[118634]: 2026-03-09T00:13:42.487+0000 7fb8a9c3c640 -1 osd.4 109 *** Got signal Terminated *** 2026-03-09T00:13:42.671 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:13:42 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4[118634]: 2026-03-09T00:13:42.487+0000 7fb8a9c3c640 -1 osd.4 109 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T00:13:46.670 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:46 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[123026]: 2026-03-09T00:13:46.375+0000 7f2e95010640 -1 osd.5 109 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-09T00:13:23.660655+0000 front 2026-03-09T00:13:23.660572+0000 (oldest deadline 2026-03-09T00:13:45.960292+0000) 2026-03-09T00:13:47.671 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:13:47 vm06.local podman[137216]: 2026-03-09 00:13:47.516818882 +0000 UTC m=+5.041139625 container died 21cf4dc588992eb75797fe8a0a695f9a620964a430a336f42cda0c916573077e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2) 2026-03-09T00:13:47.671 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:13:47 vm06.local podman[137216]: 2026-03-09 00:13:47.551241101 +0000 UTC m=+5.075561844 container remove 21cf4dc588992eb75797fe8a0a695f9a620964a430a336f42cda0c916573077e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.build-date=20260223) 2026-03-09T00:13:47.671 INFO:journalctl@ceph.osd.4.vm06.stdout:Mar 09 00:13:47 vm06.local bash[137216]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-4 2026-03-09T00:13:47.671 
INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:47 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[123026]: 2026-03-09T00:13:47.398+0000 7f2e95010640 -1 osd.5 109 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-09T00:13:23.660655+0000 front 2026-03-09T00:13:23.660572+0000 (oldest deadline 2026-03-09T00:13:45.960292+0000) 2026-03-09T00:13:47.917 DEBUG:teuthology.orchestra.run.vm06:> sudo pkill -f 'journalctl -f -n 0 -u ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.4.service' 2026-03-09T00:13:47.958 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T00:13:47.958 INFO:tasks.cephadm.osd.4:Stopped osd.4 2026-03-09T00:13:47.958 INFO:tasks.cephadm.osd.5:Stopping osd.5... 2026-03-09T00:13:47.958 DEBUG:teuthology.orchestra.run.vm06:> sudo systemctl stop ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.5 2026-03-09T00:13:48.385 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:48 vm06.local systemd[1]: Stopping Ceph osd.5 for ae8f0172-1b4a-11f1-916a-712b2ac006b7... 2026-03-09T00:13:48.385 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:48 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[123026]: 2026-03-09T00:13:48.108+0000 7f2e99209640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-09T00:13:48.385 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:48 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[123026]: 2026-03-09T00:13:48.108+0000 7f2e99209640 -1 osd.5 109 *** Got signal Terminated *** 2026-03-09T00:13:48.385 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:48 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[123026]: 2026-03-09T00:13:48.108+0000 7f2e99209640 -1 osd.5 109 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-09T00:13:48.671 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:48 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[123026]: 2026-03-09T00:13:48.384+0000 7f2e95010640 -1 osd.5 109 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-09T00:13:23.660655+0000 front 2026-03-09T00:13:23.660572+0000 (oldest deadline 2026-03-09T00:13:45.960292+0000) 2026-03-09T00:13:49.671 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:49 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[123026]: 2026-03-09T00:13:49.411+0000 7f2e95010640 -1 osd.5 109 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-09T00:13:23.660655+0000 front 2026-03-09T00:13:23.660572+0000 (oldest deadline 2026-03-09T00:13:45.960292+0000) 2026-03-09T00:13:50.670 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:50 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[123026]: 2026-03-09T00:13:50.395+0000 7f2e95010640 -1 osd.5 109 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-09T00:13:23.660655+0000 front 2026-03-09T00:13:23.660572+0000 (oldest deadline 2026-03-09T00:13:45.960292+0000) 2026-03-09T00:13:51.920 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:51 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[123026]: 2026-03-09T00:13:51.439+0000 7f2e95010640 -1 osd.5 109 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-09T00:13:23.660655+0000 front 2026-03-09T00:13:23.660572+0000 (oldest deadline 2026-03-09T00:13:45.960292+0000) 2026-03-09T00:13:51.920 
INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:51 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[123026]: 2026-03-09T00:13:51.439+0000 7f2e95010640 -1 osd.5 109 heartbeat_check: no reply from 192.168.123.103:6814 osd.1 since back 2026-03-09T00:13:30.561015+0000 front 2026-03-09T00:13:30.561122+0000 (oldest deadline 2026-03-09T00:13:51.060932+0000) 2026-03-09T00:13:52.920 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:52 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[123026]: 2026-03-09T00:13:52.444+0000 7f2e95010640 -1 osd.5 109 heartbeat_check: no reply from 192.168.123.103:6806 osd.0 since back 2026-03-09T00:13:23.660655+0000 front 2026-03-09T00:13:23.660572+0000 (oldest deadline 2026-03-09T00:13:45.960292+0000) 2026-03-09T00:13:52.920 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:52 vm06.local ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5[123026]: 2026-03-09T00:13:52.444+0000 7f2e95010640 -1 osd.5 109 heartbeat_check: no reply from 192.168.123.103:6814 osd.1 since back 2026-03-09T00:13:30.561015+0000 front 2026-03-09T00:13:30.561122+0000 (oldest deadline 2026-03-09T00:13:51.060932+0000) 2026-03-09T00:13:53.398 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:53 vm06.local podman[137380]: 2026-03-09 00:13:53.145301134 +0000 UTC m=+5.052594112 container died fbc950d55a6775f3eb36b391c3ac9eda1786929d1923f237cdf756fa97a4512d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-09T00:13:53.398 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:53 vm06.local podman[137380]: 2026-03-09 00:13:53.166836563 +0000 UTC m=+5.074129531 container remove fbc950d55a6775f3eb36b391c3ac9eda1786929d1923f237cdf756fa97a4512d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.build-date=20260223) 2026-03-09T00:13:53.398 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:53 vm06.local bash[137380]: ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5 2026-03-09T00:13:53.398 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:53 vm06.local podman[137447]: 2026-03-09 00:13:53.308572821 +0000 UTC m=+0.018348966 container create 
1435b5c32be52d3f8c723e23cbe8fcc7d2d32330846cd0cf57d01c8b8cac8a94 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-deactivate, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid) 2026-03-09T00:13:53.398 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:53 vm06.local podman[137447]: 2026-03-09 00:13:53.354854085 +0000 UTC m=+0.064630239 container init 1435b5c32be52d3f8c723e23cbe8fcc7d2d32330846cd0cf57d01c8b8cac8a94 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-09T00:13:53.398 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:53 vm06.local podman[137447]: 2026-03-09 00:13:53.358445717 +0000 UTC m=+0.068221862 container start 1435b5c32be52d3f8c723e23cbe8fcc7d2d32330846cd0cf57d01c8b8cac8a94 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-09T00:13:53.398 INFO:journalctl@ceph.osd.5.vm06.stdout:Mar 09 00:13:53 vm06.local podman[137447]: 2026-03-09 00:13:53.362191208 +0000 UTC m=+0.071967353 container attach 1435b5c32be52d3f8c723e23cbe8fcc7d2d32330846cd0cf57d01c8b8cac8a94 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7-osd-5-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, 
io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223) 2026-03-09T00:13:53.515 DEBUG:teuthology.orchestra.run.vm06:> sudo pkill -f 'journalctl -f -n 0 -u ceph-ae8f0172-1b4a-11f1-916a-712b2ac006b7@osd.5.service' 2026-03-09T00:13:53.545 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-09T00:13:53.545 INFO:tasks.cephadm.osd.5:Stopped osd.5 2026-03-09T00:13:53.545 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 --force --keep-logs 2026-03-09T00:13:53.639 INFO:teuthology.orchestra.run.vm03.stdout:Deleting cluster with fsid: ae8f0172-1b4a-11f1-916a-712b2ac006b7 2026-03-09T00:14:04.253 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 --force --keep-logs 2026-03-09T00:14:04.348 INFO:teuthology.orchestra.run.vm06.stdout:Deleting cluster with fsid: ae8f0172-1b4a-11f1-916a-712b2ac006b7 2026-03-09T00:14:09.163 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-09T00:14:09.191 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-09T00:14:09.224 INFO:tasks.cephadm:Archiving crash dumps... 2026-03-09T00:14:09.224 DEBUG:teuthology.misc:Transferring archived files from vm03:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/crash to /archive/kyr-2026-03-08_22:22:45-orch:cephadm-squid-none-default-vps/308/remote/vm03/crash 2026-03-09T00:14:09.224 DEBUG:teuthology.orchestra.run.vm03:> sudo tar c -f - -C /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/crash -- . 2026-03-09T00:14:09.254 INFO:teuthology.orchestra.run.vm03.stderr:tar: /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/crash: Cannot open: No such file or directory 2026-03-09T00:14:09.254 INFO:teuthology.orchestra.run.vm03.stderr:tar: Error is not recoverable: exiting now 2026-03-09T00:14:09.255 DEBUG:teuthology.misc:Transferring archived files from vm06:/var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/crash to /archive/kyr-2026-03-08_22:22:45-orch:cephadm-squid-none-default-vps/308/remote/vm06/crash 2026-03-09T00:14:09.255 DEBUG:teuthology.orchestra.run.vm06:> sudo tar c -f - -C /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/crash -- . 2026-03-09T00:14:09.296 INFO:teuthology.orchestra.run.vm06.stderr:tar: /var/lib/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/crash: Cannot open: No such file or directory 2026-03-09T00:14:09.296 INFO:teuthology.orchestra.run.vm06.stderr:tar: Error is not recoverable: exiting now 2026-03-09T00:14:09.297 INFO:tasks.cephadm:Checking cluster log for badness... 
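The "badness" check that follows greps ceph.log for [ERR], [WRN] or [SEC] and appends one `egrep -v` per entry of the job's log-ignorelist, so a line survives only if it matches none of them. The line that survives here, `[WRN] osd.3 (root=default,host=vm06) is down`, slips through because the ignorelist covers the OSD_DOWN health code and the plural "osds down" but not the singular "osd.N ... is down" wording; that single leftover line is what makes the task report "Found errors (ERR|WRN|SEC) in cluster log". A simplified Python equivalent of the filter (assumed, not teuthology's code):

```python
# Simplified model of the egrep chain below: severity match, then one
# negative filter per ignorelist entry, then the equivalent of `head -n 1`.
import re

IGNORELIST = [
    r"\(OSD_DOWN\)", r"OSD_DOWN", r"osds down", r"overall HEALTH_",
    # ... the full list mirrors the job's log-ignorelist ...
]

def first_bad_line(log_lines):
    sev = re.compile(r"\[ERR\]|\[WRN\]|\[SEC\]")
    ignored = [re.compile(p) for p in IGNORELIST]
    for line in log_lines:
        if sev.search(line) and not any(p.search(line) for p in ignored):
            return line
    return None

# The line that leaked through this run: it matches neither OSD_DOWN nor
# the plural "osds down", so it is reported.
line = ("2026-03-09T00:10:00.000194+0000 mon.vm03 (mon.0) 518 : cluster "
        "[WRN] osd.3 (root=default,host=vm06) is down")
assert first_bad_line([line]) == line
```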
2026-03-09T00:14:09.297 DEBUG:teuthology.orchestra.run.vm03:> sudo egrep '\[ERR\]|\[WRN\]|\[SEC\]' /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.log | egrep -v '\(MDS_ALL_DOWN\)' | egrep -v '\(MDS_UP_LESS_THAN_MAX\)' | egrep -v FS_DEGRADED | egrep -v 'filesystem is degraded' | egrep -v FS_INLINE_DATA_DEPRECATED | egrep -v FS_WITH_FAILED_MDS | egrep -v MDS_ALL_DOWN | egrep -v 'filesystem is offline' | egrep -v 'is offline because no MDS' | egrep -v MDS_DAMAGE | egrep -v MDS_DEGRADED | egrep -v MDS_FAILED | egrep -v MDS_INSUFFICIENT_STANDBY | egrep -v MDS_UP_LESS_THAN_MAX | egrep -v 'online, but wants' | egrep -v 'filesystem is online with fewer MDS than max_mds' | egrep -v POOL_APP_NOT_ENABLED | egrep -v 'do not have an application enabled' | egrep -v 'overall HEALTH_' | egrep -v 'Replacing daemon' | egrep -v 'deprecated feature inline_data' | egrep -v MGR_MODULE_ERROR | egrep -v OSD_DOWN | egrep -v 'osds down' | egrep -v 'overall HEALTH_' | egrep -v '\(OSD_DOWN\)' | egrep -v '\(OSD_' | egrep -v 'but it is still running' | egrep -v 'is not responding' | egrep -v MON_DOWN | egrep -v PG_AVAILABILITY | egrep -v PG_DEGRADED | egrep -v 'Reduced data availability' | egrep -v 'Degraded data redundancy' | egrep -v 'pg .* is stuck inactive' | egrep -v 'pg .* is .*degraded' | egrep -v 'pg .* is stuck peering' | head -n 1 2026-03-09T00:14:09.356 INFO:teuthology.orchestra.run.vm03.stdout:2026-03-09T00:10:00.000194+0000 mon.vm03 (mon.0) 518 : cluster [WRN] osd.3 (root=default,host=vm06) is down 2026-03-09T00:14:09.356 WARNING:tasks.cephadm:Found errors (ERR|WRN|SEC) in cluster log 2026-03-09T00:14:09.356 DEBUG:teuthology.orchestra.run.vm03:> sudo egrep '\[SEC\]' /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.log | egrep -v '\(MDS_ALL_DOWN\)' | egrep -v '\(MDS_UP_LESS_THAN_MAX\)' | egrep -v FS_DEGRADED | egrep -v 'filesystem is degraded' | egrep -v FS_INLINE_DATA_DEPRECATED | egrep -v FS_WITH_FAILED_MDS | egrep -v MDS_ALL_DOWN | egrep -v 'filesystem is offline' | egrep -v 'is offline because no MDS' | egrep -v MDS_DAMAGE | egrep -v MDS_DEGRADED | egrep -v MDS_FAILED | egrep -v MDS_INSUFFICIENT_STANDBY | egrep -v MDS_UP_LESS_THAN_MAX | egrep -v 'online, but wants' | egrep -v 'filesystem is online with fewer MDS than max_mds' | egrep -v POOL_APP_NOT_ENABLED | egrep -v 'do not have an application enabled' | egrep -v 'overall HEALTH_' | egrep -v 'Replacing daemon' | egrep -v 'deprecated feature inline_data' | egrep -v MGR_MODULE_ERROR | egrep -v OSD_DOWN | egrep -v 'osds down' | egrep -v 'overall HEALTH_' | egrep -v '\(OSD_DOWN\)' | egrep -v '\(OSD_' | egrep -v 'but it is still running' | egrep -v 'is not responding' | egrep -v MON_DOWN | egrep -v PG_AVAILABILITY | egrep -v PG_DEGRADED | egrep -v 'Reduced data availability' | egrep -v 'Degraded data redundancy' | egrep -v 'pg .* is stuck inactive' | egrep -v 'pg .* is .*degraded' | egrep -v 'pg .* is stuck peering' | head -n 1 2026-03-09T00:14:09.409 DEBUG:teuthology.orchestra.run.vm03:> sudo egrep '\[ERR\]' /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.log | egrep -v '\(MDS_ALL_DOWN\)' | egrep -v '\(MDS_UP_LESS_THAN_MAX\)' | egrep -v FS_DEGRADED | egrep -v 'filesystem is degraded' | egrep -v FS_INLINE_DATA_DEPRECATED | egrep -v FS_WITH_FAILED_MDS | egrep -v MDS_ALL_DOWN | egrep -v 'filesystem is offline' | egrep -v 'is offline because no MDS' | egrep -v MDS_DAMAGE | egrep -v MDS_DEGRADED | egrep -v MDS_FAILED | egrep -v MDS_INSUFFICIENT_STANDBY | egrep -v MDS_UP_LESS_THAN_MAX | egrep -v 'online, but wants' | egrep -v 'filesystem 
is online with fewer MDS than max_mds' | egrep -v POOL_APP_NOT_ENABLED | egrep -v 'do not have an application enabled' | egrep -v 'overall HEALTH_' | egrep -v 'Replacing daemon' | egrep -v 'deprecated feature inline_data' | egrep -v MGR_MODULE_ERROR | egrep -v OSD_DOWN | egrep -v 'osds down' | egrep -v 'overall HEALTH_' | egrep -v '\(OSD_DOWN\)' | egrep -v '\(OSD_' | egrep -v 'but it is still running' | egrep -v 'is not responding' | egrep -v MON_DOWN | egrep -v PG_AVAILABILITY | egrep -v PG_DEGRADED | egrep -v 'Reduced data availability' | egrep -v 'Degraded data redundancy' | egrep -v 'pg .* is stuck inactive' | egrep -v 'pg .* is .*degraded' | egrep -v 'pg .* is stuck peering' | head -n 1 2026-03-09T00:14:09.474 DEBUG:teuthology.orchestra.run.vm03:> sudo egrep '\[WRN\]' /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.log | egrep -v '\(MDS_ALL_DOWN\)' | egrep -v '\(MDS_UP_LESS_THAN_MAX\)' | egrep -v FS_DEGRADED | egrep -v 'filesystem is degraded' | egrep -v FS_INLINE_DATA_DEPRECATED | egrep -v FS_WITH_FAILED_MDS | egrep -v MDS_ALL_DOWN | egrep -v 'filesystem is offline' | egrep -v 'is offline because no MDS' | egrep -v MDS_DAMAGE | egrep -v MDS_DEGRADED | egrep -v MDS_FAILED | egrep -v MDS_INSUFFICIENT_STANDBY | egrep -v MDS_UP_LESS_THAN_MAX | egrep -v 'online, but wants' | egrep -v 'filesystem is online with fewer MDS than max_mds' | egrep -v POOL_APP_NOT_ENABLED | egrep -v 'do not have an application enabled' | egrep -v 'overall HEALTH_' | egrep -v 'Replacing daemon' | egrep -v 'deprecated feature inline_data' | egrep -v MGR_MODULE_ERROR | egrep -v OSD_DOWN | egrep -v 'osds down' | egrep -v 'overall HEALTH_' | egrep -v '\(OSD_DOWN\)' | egrep -v '\(OSD_' | egrep -v 'but it is still running' | egrep -v 'is not responding' | egrep -v MON_DOWN | egrep -v PG_AVAILABILITY | egrep -v PG_DEGRADED | egrep -v 'Reduced data availability' | egrep -v 'Degraded data redundancy' | egrep -v 'pg .* is stuck inactive' | egrep -v 'pg .* is .*degraded' | egrep -v 'pg .* is stuck peering' | head -n 1 2026-03-09T00:14:09.564 INFO:teuthology.orchestra.run.vm03.stdout:2026-03-09T00:10:00.000194+0000 mon.vm03 (mon.0) 518 : cluster [WRN] osd.3 (root=default,host=vm06) is down 2026-03-09T00:14:09.565 INFO:tasks.cephadm:Compressing logs... 
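The compression step fans out one gzip per file via `xargs --max-procs=0` (unlimited parallel jobs), so the `--verbose` reports from concurrent gzip processes interleave mid-line in the stderr capture below; a percentage or "-- replaced with" fragment does not necessarily belong to the filename printed beside it. In effect the step amounts to the following (a sketch of the pipeline's behavior, not the actual implementation):

```python
# What `find ... -name '*.log' -print0 | xargs --max-procs=0 gzip -5`
# amounts to: compress every *.log under the log roots, one worker per
# file, fully in parallel, replacing each original with its .gz.
import gzip
import shutil
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

LOG_ROOTS = ["/var/log/ceph", "/var/log/rbd-target-api"]  # second may not exist

def compress(path: Path) -> None:
    out = path.with_suffix(path.suffix + ".gz")
    with path.open("rb") as src, gzip.open(out, "wb", compresslevel=5) as dst:
        shutil.copyfileobj(src, dst)
    path.unlink()  # gzip "replaces" the original file

logs = [p for root in LOG_ROOTS if Path(root).is_dir()
        for p in Path(root).rglob("*.log")]
with ThreadPoolExecutor() as pool:
    list(pool.map(compress, logs))
```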
2026-03-09T00:14:09.565 DEBUG:teuthology.orchestra.run.vm03:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-09T00:14:09.566 DEBUG:teuthology.orchestra.run.vm06:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-09T00:14:09.604 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/cephadm.log 2026-03-09T00:14:09.604 INFO:teuthology.orchestra.run.vm06.stderr:find: ‘/var/log/rbd-target-api’: No such file or directory 2026-03-09T00:14:09.604 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-volume.log 2026-03-09T00:14:09.605 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/cephadm.log 2026-03-09T00:14:09.605 INFO:teuthology.orchestra.run.vm03.stderr:find: ‘/var/log/rbd-target-api’: No such file or directory 2026-03-09T00:14:09.605 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/cephadm.log: gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-client.ceph-exporter.vm06.log 2026-03-09T00:14:09.605 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mgr.vm06.rzcvhn.log 2026-03-09T00:14:09.605 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mon.vm03.log 2026-03-09T00:14:09.606 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-client.ceph-exporter.vm06.log: 92.6% 93.8% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-client.ceph-exporter.vm06.log.gz 2026-03-09T00:14:09.606 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/cephadm.log: gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.log 2026-03-09T00:14:09.607 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mon.vm06.log 2026-03-09T00:14:09.607 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mgr.vm06.rzcvhn.log: gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.audit.log 2026-03-09T00:14:09.607 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mon.vm03.log: gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mgr.vm03.yvcons.log 2026-03-09T00:14:09.607 INFO:teuthology.orchestra.run.vm06.stderr: -- replaced with /var/log/ceph/cephadm.log.gz 2026-03-09T00:14:09.614 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.log: 87.3% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.log.gz 2026-03-09T00:14:09.616 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.audit.log 2026-03-09T00:14:09.622 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-volume.log: /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mon.vm06.log: gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.log 2026-03-09T00:14:09.626 
INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.audit.log: 91.4% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.audit.log.gz 2026-03-09T00:14:09.629 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mgr.vm03.yvcons.log: gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.cephadm.log 2026-03-09T00:14:09.630 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.cephadm.log 2026-03-09T00:14:09.633 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.log: 87.3% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.log.gz 2026-03-09T00:14:09.634 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.audit.log: 91.2% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.audit.log.gz 2026-03-09T00:14:09.634 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-volume.log 2026-03-09T00:14:09.635 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.cephadm.log: 85.3% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.cephadm.log.gz 2026-03-09T00:14:09.636 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.3.log 2026-03-09T00:14:09.638 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.cephadm.log: 85.3% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph.cephadm.log.gz 2026-03-09T00:14:09.638 INFO:teuthology.orchestra.run.vm06.stderr: 93.6% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-volume.log.gz 2026-03-09T00:14:09.643 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.4.log 2026-03-09T00:14:09.646 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.3.log: 89.1% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mgr.vm06.rzcvhn.log.gz 2026-03-09T00:14:09.647 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-client.ceph-exporter.vm03.log 2026-03-09T00:14:09.648 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.5.log 2026-03-09T00:14:09.653 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-volume.log: 90.9% -- replaced with /var/log/ceph/cephadm.log.gz 2026-03-09T00:14:09.655 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.4.log: gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mds.cephfs.vm06.vlrwtl.log 2026-03-09T00:14:09.655 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.0.log 2026-03-09T00:14:09.660 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-client.ceph-exporter.vm03.log: 93.8% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-client.ceph-exporter.vm03.log.gz 2026-03-09T00:14:09.663 
INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.5.log: gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mds.cephfs.vm06.ixduim.log 2026-03-09T00:14:09.663 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.1.log 2026-03-09T00:14:09.677 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.0.log: gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.2.log 2026-03-09T00:14:09.685 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.1.log: 93.5% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-volume.log.gz 2026-03-09T00:14:09.689 INFO:teuthology.orchestra.run.vm06.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mds.cephfs.vm06.vlrwtl.log: /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mds.cephfs.vm06.ixduim.log: 91.6% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mds.cephfs.vm06.vlrwtl.log.gz 2026-03-09T00:14:09.695 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mds.cephfs.vm03.sejksk.log 2026-03-09T00:14:09.707 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.2.log: gzip -5 --verbose -- /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mds.cephfs.vm03.ralade.log 2026-03-09T00:14:10.106 INFO:teuthology.orchestra.run.vm06.stderr: 92.2% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mon.vm06.log.gz 2026-03-09T00:14:10.442 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mds.cephfs.vm03.sejksk.log: /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mds.cephfs.vm03.ralade.log: 89.2% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mgr.vm03.yvcons.log.gz 2026-03-09T00:14:11.080 INFO:teuthology.orchestra.run.vm03.stderr: 90.6% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mon.vm03.log.gz 2026-03-09T00:14:15.816 INFO:teuthology.orchestra.run.vm06.stderr: 93.7% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.4.log.gz 2026-03-09T00:14:16.606 INFO:teuthology.orchestra.run.vm06.stderr: 95.0% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mds.cephfs.vm06.ixduim.log.gz 2026-03-09T00:14:16.865 INFO:teuthology.orchestra.run.vm06.stderr: 93.8% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.5.log.gz 2026-03-09T00:14:17.480 INFO:teuthology.orchestra.run.vm06.stderr: 93.8% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.3.log.gz 2026-03-09T00:14:17.481 INFO:teuthology.orchestra.run.vm06.stderr: 2026-03-09T00:14:17.481 INFO:teuthology.orchestra.run.vm06.stderr:real 0m7.892s 2026-03-09T00:14:17.481 INFO:teuthology.orchestra.run.vm06.stderr:user 0m14.215s 2026-03-09T00:14:17.481 INFO:teuthology.orchestra.run.vm06.stderr:sys 0m0.667s 2026-03-09T00:14:17.997 INFO:teuthology.orchestra.run.vm03.stderr: 93.8% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.2.log.gz 2026-03-09T00:14:18.458 INFO:teuthology.orchestra.run.vm03.stderr: 93.8% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.0.log.gz 2026-03-09T00:14:18.613 
INFO:teuthology.orchestra.run.vm03.stderr: 95.0% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mds.cephfs.vm03.ralade.log.gz 2026-03-09T00:14:19.051 INFO:teuthology.orchestra.run.vm03.stderr: 93.7% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-osd.1.log.gz 2026-03-09T00:15:20.075 INFO:teuthology.orchestra.run.vm03.stderr: 93.0% -- replaced with /var/log/ceph/ae8f0172-1b4a-11f1-916a-712b2ac006b7/ceph-mds.cephfs.vm03.sejksk.log.gz 2026-03-09T00:15:20.078 INFO:teuthology.orchestra.run.vm03.stderr: 2026-03-09T00:15:20.078 INFO:teuthology.orchestra.run.vm03.stderr:real 1m10.493s 2026-03-09T00:15:20.078 INFO:teuthology.orchestra.run.vm03.stderr:user 1m13.834s 2026-03-09T00:15:20.078 INFO:teuthology.orchestra.run.vm03.stderr:sys 0m4.590s 2026-03-09T00:15:20.078 INFO:tasks.cephadm:Archiving logs... 2026-03-09T00:15:20.078 DEBUG:teuthology.misc:Transferring archived files from vm03:/var/log/ceph to /archive/kyr-2026-03-08_22:22:45-orch:cephadm-squid-none-default-vps/308/remote/vm03/log 2026-03-09T00:15:20.079 DEBUG:teuthology.orchestra.run.vm03:> sudo tar c -f - -C /var/log/ceph -- . 2026-03-09T00:15:23.969 DEBUG:teuthology.misc:Transferring archived files from vm06:/var/log/ceph to /archive/kyr-2026-03-08_22:22:45-orch:cephadm-squid-none-default-vps/308/remote/vm06/log 2026-03-09T00:15:23.969 DEBUG:teuthology.orchestra.run.vm06:> sudo tar c -f - -C /var/log/ceph -- . 2026-03-09T00:15:24.730 INFO:tasks.cephadm:Removing cluster... 2026-03-09T00:15:24.730 DEBUG:teuthology.orchestra.run.vm03:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 --force 2026-03-09T00:15:24.894 INFO:teuthology.orchestra.run.vm03.stdout:Deleting cluster with fsid: ae8f0172-1b4a-11f1-916a-712b2ac006b7 2026-03-09T00:15:25.813 DEBUG:teuthology.orchestra.run.vm06:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid ae8f0172-1b4a-11f1-916a-712b2ac006b7 --force 2026-03-09T00:15:25.916 INFO:teuthology.orchestra.run.vm06.stdout:Deleting cluster with fsid: ae8f0172-1b4a-11f1-916a-712b2ac006b7 2026-03-09T00:15:26.164 INFO:tasks.cephadm:Removing cephadm ... 2026-03-09T00:15:26.164 DEBUG:teuthology.orchestra.run.vm03:> rm -rf /home/ubuntu/cephtest/cephadm 2026-03-09T00:15:26.179 DEBUG:teuthology.orchestra.run.vm06:> rm -rf /home/ubuntu/cephtest/cephadm 2026-03-09T00:15:26.194 INFO:tasks.cephadm:Teardown complete 2026-03-09T00:15:26.194 DEBUG:teuthology.run_tasks:Unwinding manager install 2026-03-09T00:15:26.197 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer... 2026-03-09T00:15:26.197 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer 2026-03-09T00:15:26.221 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer 2026-03-09T00:15:26.266 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system. 
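The install unwind removes packages one at a time, each guarded with `|| true`, exactly as the shell loop echoed below shows: a package already erased as a dependency of an earlier removal (ceph-mds, for example, goes out with the `ceph` metapackage) must not abort the teardown. The pattern, roughly (a sketch, not teuthology's code):

```python
# Sketch of the per-package removal loop echoed below; check=False
# mirrors `|| true`, so a missing package never fails the cleanup.
import subprocess

PACKAGES = [
    "ceph-radosgw", "ceph-test", "ceph", "ceph-base", "cephadm",
    # ... remainder of the package list logged above ...
]

for pkg in PACKAGES:
    subprocess.run(["sudo", "yum", "-y", "remove", pkg], check=False)
```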
2026-03-09T00:15:26.266 DEBUG:teuthology.orchestra.run.vm03:> 2026-03-09T00:15:26.266 DEBUG:teuthology.orchestra.run.vm03:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do 2026-03-09T00:15:26.266 DEBUG:teuthology.orchestra.run.vm03:> sudo yum -y remove $d || true 2026-03-09T00:15:26.266 DEBUG:teuthology.orchestra.run.vm03:> done 2026-03-09T00:15:26.271 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system. 2026-03-09T00:15:26.271 DEBUG:teuthology.orchestra.run.vm06:> 2026-03-09T00:15:26.271 DEBUG:teuthology.orchestra.run.vm06:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do 2026-03-09T00:15:26.271 DEBUG:teuthology.orchestra.run.vm06:> sudo yum -y remove $d || true 2026-03-09T00:15:26.271 DEBUG:teuthology.orchestra.run.vm06:> done 2026-03-09T00:15:26.553 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-09T00:15:26.554 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:26.554 INFO:teuthology.orchestra.run.vm06.stdout: Package Architecture Version Repository Size 2026-03-09T00:15:26.554 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:26.554 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-09T00:15:26.554 INFO:teuthology.orchestra.run.vm06.stdout: ceph-radosgw x86_64 2:18.2.1-0.el9 @ceph 31 M 2026-03-09T00:15:26.554 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies: 2026-03-09T00:15:26.554 INFO:teuthology.orchestra.run.vm06.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k 2026-03-09T00:15:26.554 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:26.554 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-09T00:15:26.554 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:26.554 INFO:teuthology.orchestra.run.vm06.stdout:Remove 2 Packages 2026-03-09T00:15:26.554 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:26.554 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 31 M 2026-03-09T00:15:26.554 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-09T00:15:26.559 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-09T00:15:26.559 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-09T00:15:26.575 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 
2026-03-09T00:15:26.575 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-09T00:15:26.609 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-09T00:15:26.620 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T00:15:26.620 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:26.620 INFO:teuthology.orchestra.run.vm03.stdout: Package Architecture Version Repository Size 2026-03-09T00:15:26.620 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:26.620 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-09T00:15:26.620 INFO:teuthology.orchestra.run.vm03.stdout: ceph-radosgw x86_64 2:18.2.1-0.el9 @ceph 31 M 2026-03-09T00:15:26.621 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies: 2026-03-09T00:15:26.621 INFO:teuthology.orchestra.run.vm03.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k 2026-03-09T00:15:26.621 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:26.621 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-09T00:15:26.621 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:26.621 INFO:teuthology.orchestra.run.vm03.stdout:Remove 2 Packages 2026-03-09T00:15:26.621 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:26.621 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 31 M 2026-03-09T00:15:26.621 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-09T00:15:26.625 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-09T00:15:26.625 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-09T00:15:26.632 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2 2026-03-09T00:15:26.632 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-09T00:15:26.632 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-09T00:15:26.632 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target". 2026-03-09T00:15:26.632 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target". 2026-03-09T00:15:26.632 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:26.634 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2 2026-03-09T00:15:26.641 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 2026-03-09T00:15:26.641 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-09T00:15:26.646 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2 2026-03-09T00:15:26.662 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T00:15:26.676 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-09T00:15:26.695 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2 2026-03-09T00:15:26.695 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-09T00:15:26.695 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-09T00:15:26.695 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target". 2026-03-09T00:15:26.695 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target". 2026-03-09T00:15:26.695 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:26.696 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2 2026-03-09T00:15:26.704 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2 2026-03-09T00:15:26.719 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T00:15:26.736 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T00:15:26.736 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2 2026-03-09T00:15:26.787 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T00:15:26.787 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2 2026-03-09T00:15:26.795 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T00:15:26.795 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:26.795 INFO:teuthology.orchestra.run.vm06.stdout:Removed: 2026-03-09T00:15:26.795 INFO:teuthology.orchestra.run.vm06.stdout: ceph-radosgw-2:18.2.1-0.el9.x86_64 mailcap-2.1.49-5.el9.noarch 2026-03-09T00:15:26.795 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:26.795 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T00:15:26.843 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-09T00:15:26.843 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:26.843 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-09T00:15:26.843 INFO:teuthology.orchestra.run.vm03.stdout: ceph-radosgw-2:18.2.1-0.el9.x86_64 mailcap-2.1.49-5.el9.noarch 2026-03-09T00:15:26.843 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:26.843 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T00:15:27.010 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 
2026-03-09T00:15:27.011 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:27.011 INFO:teuthology.orchestra.run.vm06.stdout: Package Architecture Version Repository Size 2026-03-09T00:15:27.011 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:27.011 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-09T00:15:27.011 INFO:teuthology.orchestra.run.vm06.stdout: ceph-test x86_64 2:18.2.1-0.el9 @ceph 164 M 2026-03-09T00:15:27.011 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies: 2026-03-09T00:15:27.011 INFO:teuthology.orchestra.run.vm06.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k 2026-03-09T00:15:27.011 INFO:teuthology.orchestra.run.vm06.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M 2026-03-09T00:15:27.011 INFO:teuthology.orchestra.run.vm06.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k 2026-03-09T00:15:27.011 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:27.011 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-09T00:15:27.011 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:27.011 INFO:teuthology.orchestra.run.vm06.stdout:Remove 4 Packages 2026-03-09T00:15:27.011 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:27.011 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 166 M 2026-03-09T00:15:27.011 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-09T00:15:27.014 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-09T00:15:27.014 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-09T00:15:27.040 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 2026-03-09T00:15:27.041 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-09T00:15:27.064 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 
2026-03-09T00:15:27.065 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:27.065 INFO:teuthology.orchestra.run.vm03.stdout: Package Architecture Version Repository Size 2026-03-09T00:15:27.065 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:27.065 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-09T00:15:27.065 INFO:teuthology.orchestra.run.vm03.stdout: ceph-test x86_64 2:18.2.1-0.el9 @ceph 164 M 2026-03-09T00:15:27.065 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies: 2026-03-09T00:15:27.065 INFO:teuthology.orchestra.run.vm03.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k 2026-03-09T00:15:27.065 INFO:teuthology.orchestra.run.vm03.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M 2026-03-09T00:15:27.065 INFO:teuthology.orchestra.run.vm03.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k 2026-03-09T00:15:27.065 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:27.065 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-09T00:15:27.065 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:27.065 INFO:teuthology.orchestra.run.vm03.stdout:Remove 4 Packages 2026-03-09T00:15:27.065 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:27.065 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 166 M 2026-03-09T00:15:27.065 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-09T00:15:27.068 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-09T00:15:27.068 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-09T00:15:27.092 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-09T00:15:27.094 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 
2026-03-09T00:15:27.095 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-09T00:15:27.097 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-test-2:18.2.1-0.el9.x86_64 1/4 2026-03-09T00:15:27.099 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4 2026-03-09T00:15:27.102 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4 2026-03-09T00:15:27.118 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-09T00:15:27.146 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-09T00:15:27.153 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-test-2:18.2.1-0.el9.x86_64 1/4 2026-03-09T00:15:27.155 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4 2026-03-09T00:15:27.158 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4 2026-03-09T00:15:27.174 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-09T00:15:27.182 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-09T00:15:27.182 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-test-2:18.2.1-0.el9.x86_64 1/4 2026-03-09T00:15:27.182 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4 2026-03-09T00:15:27.182 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4 2026-03-09T00:15:27.234 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4 2026-03-09T00:15:27.234 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:27.234 INFO:teuthology.orchestra.run.vm06.stdout:Removed: 2026-03-09T00:15:27.234 INFO:teuthology.orchestra.run.vm06.stdout: ceph-test-2:18.2.1-0.el9.x86_64 libxslt-1.1.34-12.el9.x86_64 2026-03-09T00:15:27.234 INFO:teuthology.orchestra.run.vm06.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64 2026-03-09T00:15:27.234 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:27.234 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T00:15:27.247 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4 2026-03-09T00:15:27.247 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-test-2:18.2.1-0.el9.x86_64 1/4 2026-03-09T00:15:27.247 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4 2026-03-09T00:15:27.247 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4 2026-03-09T00:15:27.298 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4 2026-03-09T00:15:27.299 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:27.299 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-09T00:15:27.299 INFO:teuthology.orchestra.run.vm03.stdout: ceph-test-2:18.2.1-0.el9.x86_64 libxslt-1.1.34-12.el9.x86_64 2026-03-09T00:15:27.299 INFO:teuthology.orchestra.run.vm03.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64 2026-03-09T00:15:27.299 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:27.299 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T00:15:27.441 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 
2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout: Package Arch Version Repository Size 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout: ceph x86_64 2:18.2.1-0.el9 @ceph 0 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies: 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mds x86_64 2:18.2.1-0.el9 @ceph 6.5 M 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mon x86_64 2:18.2.1-0.el9 @ceph 20 M 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout: ceph-osd x86_64 2:18.2.1-0.el9 @ceph 61 M 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout:Remove 8 Packages 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 89 M 2026-03-09T00:15:27.442 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-09T00:15:27.445 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-09T00:15:27.445 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-09T00:15:27.467 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 2026-03-09T00:15:27.467 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-09T00:15:27.504 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-09T00:15:27.505 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-2:18.2.1-0.el9.x86_64 1/8 2026-03-09T00:15:27.510 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout:Removing:
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout: ceph x86_64 2:18.2.1-0.el9 @ceph 0
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies:
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mds x86_64 2:18.2.1-0.el9 @ceph 6.5 M
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon x86_64 2:18.2.1-0.el9 @ceph 20 M
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout: ceph-osd x86_64 2:18.2.1-0.el9 @ceph 61 M
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout:Remove 8 Packages
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 89 M
2026-03-09T00:15:27.511 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check
2026-03-09T00:15:27.514 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded.
2026-03-09T00:15:27.514 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test
2026-03-09T00:15:27.526 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-09T00:15:27.526 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T00:15:27.526 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-09T00:15:27.526 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target".
2026-03-09T00:15:27.526 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target".
2026-03-09T00:15:27.526 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:15:27.529 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-09T00:15:27.537 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-09T00:15:27.538 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded.
2026-03-09T00:15:27.538 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction
2026-03-09T00:15:27.551 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-09T00:15:27.551 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service".
2026-03-09T00:15:27.551 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:15:27.552 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-09T00:15:27.570 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-09T00:15:27.573 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8
2026-03-09T00:15:27.575 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1
2026-03-09T00:15:27.575 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-09T00:15:27.577 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-2:18.2.1-0.el9.x86_64 1/8
2026-03-09T00:15:27.577 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-09T00:15:27.593 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-09T00:15:27.593 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T00:15:27.593 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-09T00:15:27.593 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target".
2026-03-09T00:15:27.593 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target".
2026-03-09T00:15:27.593 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:15:27.596 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-09T00:15:27.601 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-09T00:15:27.601 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T00:15:27.601 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-09T00:15:27.601 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target".
2026-03-09T00:15:27.601 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target".
2026-03-09T00:15:27.601 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:15:27.602 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-09T00:15:27.602 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-09T00:15:27.609 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-09T00:15:27.614 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-09T00:15:27.614 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service".
2026-03-09T00:15:27.614 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:15:27.615 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-09T00:15:27.630 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-09T00:15:27.630 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T00:15:27.630 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-09T00:15:27.630 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target".
2026-03-09T00:15:27.630 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target".
2026-03-09T00:15:27.630 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:15:27.631 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-09T00:15:27.634 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-09T00:15:27.638 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8
2026-03-09T00:15:27.640 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-09T00:15:27.641 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-09T00:15:27.661 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-09T00:15:27.661 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T00:15:27.661 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-09T00:15:27.661 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target".
2026-03-09T00:15:27.661 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target".
2026-03-09T00:15:27.661 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:15:27.661 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-09T00:15:27.669 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-09T00:15:27.689 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-09T00:15:27.689 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T00:15:27.689 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-09T00:15:27.689 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target".
2026-03-09T00:15:27.690 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target".
2026-03-09T00:15:27.690 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:15:27.690 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-09T00:15:27.715 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-09T00:15:27.715 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-2:18.2.1-0.el9.x86_64 1/8
2026-03-09T00:15:27.715 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mds-2:18.2.1-0.el9.x86_64 2/8
2026-03-09T00:15:27.715 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mon-2:18.2.1-0.el9.x86_64 3/8
2026-03-09T00:15:27.715 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-osd-2:18.2.1-0.el9.x86_64 4/8
2026-03-09T00:15:27.715 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-09T00:15:27.716 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-09T00:15:27.716 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 7/8
2026-03-09T00:15:27.771 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8
2026-03-09T00:15:27.772 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:15:27.772 INFO:teuthology.orchestra.run.vm06.stdout:Removed:
2026-03-09T00:15:27.772 INFO:teuthology.orchestra.run.vm06.stdout: ceph-2:18.2.1-0.el9.x86_64 ceph-mds-2:18.2.1-0.el9.x86_64
2026-03-09T00:15:27.772 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mon-2:18.2.1-0.el9.x86_64 ceph-osd-2:18.2.1-0.el9.x86_64
2026-03-09T00:15:27.772 INFO:teuthology.orchestra.run.vm06.stdout: ledmon-libs-1.1.0-3.el9.x86_64 libconfig-1.7.2-9.el9.x86_64
2026-03-09T00:15:27.772 INFO:teuthology.orchestra.run.vm06.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-09T00:15:27.772 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:15:27.772 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T00:15:27.786 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-09T00:15:27.786 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-2:18.2.1-0.el9.x86_64 1/8
2026-03-09T00:15:27.786 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mds-2:18.2.1-0.el9.x86_64 2/8
2026-03-09T00:15:27.786 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mon-2:18.2.1-0.el9.x86_64 3/8
2026-03-09T00:15:27.786 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-osd-2:18.2.1-0.el9.x86_64 4/8
2026-03-09T00:15:27.786 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-09T00:15:27.786 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-09T00:15:27.786 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 7/8
2026-03-09T00:15:27.837 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8
2026-03-09T00:15:27.837 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:15:27.837 INFO:teuthology.orchestra.run.vm03.stdout:Removed:
2026-03-09T00:15:27.837 INFO:teuthology.orchestra.run.vm03.stdout: ceph-2:18.2.1-0.el9.x86_64 ceph-mds-2:18.2.1-0.el9.x86_64
2026-03-09T00:15:27.837 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon-2:18.2.1-0.el9.x86_64 ceph-osd-2:18.2.1-0.el9.x86_64
2026-03-09T00:15:27.837 INFO:teuthology.orchestra.run.vm03.stdout: ledmon-libs-1.1.0-3.el9.x86_64 libconfig-1.7.2-9.el9.x86_64
2026-03-09T00:15:27.837 INFO:teuthology.orchestra.run.vm03.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-09T00:15:27.837 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:15:27.837 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-09T00:15:27.999 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: Package Arch Version Repository Size
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout:Removing:
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: ceph-base x86_64 2:18.2.1-0.el9 @ceph 22 M
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout:Removing dependent packages:
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: ceph-immutable-object-cache x86_64 2:18.2.1-0.el9 @ceph 395 k
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr x86_64 2:18.2.1-0.el9 @ceph 4.5 M
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-cephadm noarch 2:18.2.1-0.el9 @ceph-noarch 678 k
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-dashboard noarch 2:18.2.1-0.el9 @ceph-noarch 7.6 M
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.1-0.el9 @ceph-noarch 66 M
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-rook noarch 2:18.2.1-0.el9 @ceph-noarch 574 k
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: rbd-mirror x86_64 2:18.2.1-0.el9 @ceph 12 M
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies:
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: ceph-common x86_64 2:18.2.1-0.el9 @ceph 70 M
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: ceph-grafana-dashboards noarch 2:18.2.1-0.el9 @ceph-noarch 319 k
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core noarch 2:18.2.1-0.el9 @ceph-noarch 1.4 M
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: ceph-prometheus-alerts noarch 2:18.2.1-0.el9 @ceph-noarch 40 k
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: ceph-selinux x86_64 2:18.2.1-0.el9 @ceph 138 k
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k
2026-03-09T00:15:28.004 INFO:teuthology.orchestra.run.vm06.stdout: libcephsqlite x86_64 2:18.2.1-0.el9 @ceph 434 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1 x86_64 2:18.2.1-0.el9 @ceph 1.5 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-common x86_64 2:18.2.1-0.el9 @ceph 610 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt noarch 2.4.0-1.el9 @epel 103 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 @epel 5.4 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M
2026-03-09T00:15:28.005 INFO:teuthology.orchestra.run.vm06.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k
2026-03-09T00:15:28.006 INFO:teuthology.orchestra.run.vm06.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M
2026-03-09T00:15:28.006 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k
2026-03-09T00:15:28.006 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:15:28.006 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary
2026-03-09T00:15:28.006 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T00:15:28.006 INFO:teuthology.orchestra.run.vm06.stdout:Remove 84 Packages
2026-03-09T00:15:28.006 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:15:28.006 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 434 M
2026-03-09T00:15:28.006 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check
2026-03-09T00:15:28.030 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded.
2026-03-09T00:15:28.030 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test
2026-03-09T00:15:28.072 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-09T00:15:28.077 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-09T00:15:28.077 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size
2026-03-09T00:15:28.077 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-09T00:15:28.077 INFO:teuthology.orchestra.run.vm03.stdout:Removing:
2026-03-09T00:15:28.077 INFO:teuthology.orchestra.run.vm03.stdout: ceph-base x86_64 2:18.2.1-0.el9 @ceph 22 M
2026-03-09T00:15:28.077 INFO:teuthology.orchestra.run.vm03.stdout:Removing dependent packages:
2026-03-09T00:15:28.077 INFO:teuthology.orchestra.run.vm03.stdout: ceph-immutable-object-cache x86_64 2:18.2.1-0.el9 @ceph 395 k
2026-03-09T00:15:28.077 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr x86_64 2:18.2.1-0.el9 @ceph 4.5 M
2026-03-09T00:15:28.077 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-cephadm noarch 2:18.2.1-0.el9 @ceph-noarch 678 k
2026-03-09T00:15:28.077 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-dashboard noarch 2:18.2.1-0.el9 @ceph-noarch 7.6 M
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.1-0.el9 @ceph-noarch 66 M
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-rook noarch 2:18.2.1-0.el9 @ceph-noarch 574 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: rbd-mirror x86_64 2:18.2.1-0.el9 @ceph 12 M
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies:
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: ceph-common x86_64 2:18.2.1-0.el9 @ceph 70 M
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: ceph-grafana-dashboards noarch 2:18.2.1-0.el9 @ceph-noarch 319 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core noarch 2:18.2.1-0.el9 @ceph-noarch 1.4 M
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: ceph-prometheus-alerts noarch 2:18.2.1-0.el9 @ceph-noarch 40 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: ceph-selinux x86_64 2:18.2.1-0.el9 @ceph 138 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: libcephsqlite x86_64 2:18.2.1-0.el9 @ceph 434 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1 x86_64 2:18.2.1-0.el9 @ceph 1.5 M
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-common x86_64 2:18.2.1-0.el9 @ceph 610 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-09T00:15:28.078 INFO:teuthology.orchestra.run.vm03.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-jwt noarch 2.4.0-1.el9 @epel 103 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 @epel 5.4 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout:Remove 84 Packages
2026-03-09T00:15:28.079 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:15:28.080 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 434 M
2026-03-09T00:15:28.080 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check
2026-03-09T00:15:28.102 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded.
2026-03-09T00:15:28.102 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test
2026-03-09T00:15:28.139 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded.
2026-03-09T00:15:28.139 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction
2026-03-09T00:15:28.209 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded.
2026-03-09T00:15:28.209 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction
2026-03-09T00:15:28.277 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1
2026-03-09T00:15:28.277 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-rook-2:18.2.1-0.el9.noarch 1/84
2026-03-09T00:15:28.285 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.1-0.el9.noarch 1/84
2026-03-09T00:15:28.303 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-09T00:15:28.303 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T00:15:28.303 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-09T00:15:28.303 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target".
2026-03-09T00:15:28.303 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target".
2026-03-09T00:15:28.303 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:15:28.304 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-09T00:15:28.318 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-09T00:15:28.325 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 3/84
2026-03-09T00:15:28.325 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 4/84
2026-03-09T00:15:28.340 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1
2026-03-09T00:15:28.340 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-rook-2:18.2.1-0.el9.noarch 1/84
2026-03-09T00:15:28.348 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.1-0.el9.noarch 1/84
2026-03-09T00:15:28.367 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-09T00:15:28.367 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T00:15:28.367 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-09T00:15:28.367 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target".
2026-03-09T00:15:28.367 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target".
2026-03-09T00:15:28.367 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:15:28.367 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-09T00:15:28.381 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-09T00:15:28.382 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 4/84
2026-03-09T00:15:28.389 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 3/84
2026-03-09T00:15:28.389 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 4/84
2026-03-09T00:15:28.393 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/84
2026-03-09T00:15:28.397 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84
2026-03-09T00:15:28.397 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 7/84
2026-03-09T00:15:28.410 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 7/84
2026-03-09T00:15:28.417 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84
2026-03-09T00:15:28.420 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84
2026-03-09T00:15:28.422 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84
2026-03-09T00:15:28.428 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84
2026-03-09T00:15:28.433 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84
2026-03-09T00:15:28.442 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84
2026-03-09T00:15:28.449 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 4/84
2026-03-09T00:15:28.455 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84
2026-03-09T00:15:28.459 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/84
2026-03-09T00:15:28.462 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84
2026-03-09T00:15:28.463 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84
2026-03-09T00:15:28.463 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 7/84
2026-03-09T00:15:28.472 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84
2026-03-09T00:15:28.476 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 7/84
2026-03-09T00:15:28.479 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84
2026-03-09T00:15:28.483 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84
2026-03-09T00:15:28.486 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84
2026-03-09T00:15:28.489 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84
2026-03-09T00:15:28.494 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84
2026-03-09T00:15:28.500 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84
2026-03-09T00:15:28.508 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84
2026-03-09T00:15:28.510 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84
2026-03-09T00:15:28.515 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84
2026-03-09T00:15:28.517 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84
2026-03-09T00:15:28.523 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84
2026-03-09T00:15:28.526 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84
2026-03-09T00:15:28.531 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84
2026-03-09T00:15:28.534 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84
2026-03-09T00:15:28.534 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 23/84
2026-03-09T00:15:28.543 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 23/84
2026-03-09T00:15:28.543 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84
2026-03-09T00:15:28.550 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84
2026-03-09T00:15:28.582 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84
2026-03-09T00:15:28.591 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84
2026-03-09T00:15:28.593 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84
2026-03-09T00:15:28.604 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84
2026-03-09T00:15:28.612 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84
2026-03-09T00:15:28.612 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 23/84
2026-03-09T00:15:28.620 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 23/84
2026-03-09T00:15:28.637 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84
2026-03-09T00:15:28.665 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84
2026-03-09T00:15:28.671 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84
2026-03-09T00:15:28.677 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 27/84
2026-03-09T00:15:28.681 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 28/84
2026-03-09T00:15:28.684 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 29/84
2026-03-09T00:15:28.687 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 30/84
2026-03-09T00:15:28.689 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 31/84
2026-03-09T00:15:28.692 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 32/84
2026-03-09T00:15:28.694 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jwt-2.4.0-1.el9.noarch 33/84
2026-03-09T00:15:28.697 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jwt+crypto-2.4.0-1.el9.noarch 34/84
2026-03-09T00:15:28.711 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84
2026-03-09T00:15:28.717 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 36/84
2026-03-09T00:15:28.722 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 37/84
2026-03-09T00:15:28.724 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84
2026-03-09T00:15:28.752 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84
2026-03-09T00:15:28.758 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84
2026-03-09T00:15:28.765 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 27/84
2026-03-09T00:15:28.769 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 38/84
2026-03-09T00:15:28.770 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 28/84
2026-03-09T00:15:28.772 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 29/84
2026-03-09T00:15:28.775 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 30/84
2026-03-09T00:15:28.778 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 31/84
2026-03-09T00:15:28.780 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 39/84
2026-03-09T00:15:28.781 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 32/84
2026-03-09T00:15:28.783 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 40/84
2026-03-09T00:15:28.783 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jwt-2.4.0-1.el9.noarch 33/84
2026-03-09T00:15:28.786 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 41/84
2026-03-09T00:15:28.786 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jwt+crypto-2.4.0-1.el9.noarch 34/84
2026-03-09T00:15:28.788 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 42/84
2026-03-09T00:15:28.790 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 43/84
2026-03-09T00:15:28.801 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84
2026-03-09T00:15:28.809 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 36/84
2026-03-09T00:15:28.809 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 44/84
2026-03-09T00:15:28.809 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T00:15:28.809 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-09T00:15:28.809 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target".
2026-03-09T00:15:28.810 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target".
2026-03-09T00:15:28.810 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:15:28.810 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : rbd-mirror-2:18.2.1-0.el9.x86_64 44/84
2026-03-09T00:15:28.814 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 37/84
2026-03-09T00:15:28.819 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 44/84
2026-03-09T00:15:28.838 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84
2026-03-09T00:15:28.838 INFO:teuthology.orchestra.run.vm06.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T00:15:28.838 INFO:teuthology.orchestra.run.vm06.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-09T00:15:28.838 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:15:28.838 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84
2026-03-09T00:15:28.846 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84
2026-03-09T00:15:28.848 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 46/84
2026-03-09T00:15:28.850 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 47/84
2026-03-09T00:15:28.853 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-ply-3.11-14.el9.noarch 48/84
2026-03-09T00:15:28.856 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 49/84
2026-03-09T00:15:28.859 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 50/84
2026-03-09T00:15:28.862 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 51/84
2026-03-09T00:15:28.865 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 52/84
2026-03-09T00:15:28.866 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 38/84
2026-03-09T00:15:28.868 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 53/84
2026-03-09T00:15:28.876 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 54/84
2026-03-09T00:15:28.877 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 39/84
2026-03-09T00:15:28.879 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 55/84
2026-03-09T00:15:28.880 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 40/84
2026-03-09T00:15:28.881 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 56/84
2026-03-09T00:15:28.883 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 41/84
2026-03-09T00:15:28.883 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 57/84
2026-03-09T00:15:28.885 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 42/84
2026-03-09T00:15:28.886 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 58/84
2026-03-09T00:15:28.886 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 43/84
2026-03-09T00:15:28.893 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 59/84
2026-03-09T00:15:28.898 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 60/84
2026-03-09T00:15:28.903 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 61/84
2026-03-09T00:15:28.908 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 62/84
2026-03-09T00:15:28.908 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 44/84
2026-03-09T00:15:28.908 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T00:15:28.908 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-09T00:15:28.908 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target".
2026-03-09T00:15:28.908 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target".
2026-03-09T00:15:28.908 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:15:28.909 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : rbd-mirror-2:18.2.1-0.el9.x86_64 44/84
2026-03-09T00:15:28.914 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 63/84
2026-03-09T00:15:28.917 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 64/84
2026-03-09T00:15:28.917 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 44/84
2026-03-09T00:15:28.919 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 65/84
2026-03-09T00:15:28.923 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 66/84
2026-03-09T00:15:28.933 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 67/84
2026-03-09T00:15:28.939 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 68/84
2026-03-09T00:15:28.942 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 69/84
2026-03-09T00:15:28.943 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84
2026-03-09T00:15:28.943 INFO:teuthology.orchestra.run.vm03.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-09T00:15:28.943 INFO:teuthology.orchestra.run.vm03.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-09T00:15:28.943 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:15:28.943 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84
2026-03-09T00:15:28.944 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 70/84
2026-03-09T00:15:28.946 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 71/84
2026-03-09T00:15:28.952 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 72/84
2026-03-09T00:15:28.952 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84
2026-03-09T00:15:28.954 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 46/84
2026-03-09T00:15:28.955 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 73/84
2026-03-09T00:15:28.956 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 47/84
2026-03-09T00:15:28.959 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-ply-3.11-14.el9.noarch 48/84
2026-03-09T00:15:28.962 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 49/84
2026-03-09T00:15:28.964 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 50/84
2026-03-09T00:15:28.966 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 51/84
2026-03-09T00:15:28.969 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 52/84
2026-03-09T00:15:28.972 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 53/84
2026-03-09T00:15:28.975 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 74/84
2026-03-09T00:15:28.975 INFO:teuthology.orchestra.run.vm06.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service".
2026-03-09T00:15:28.975 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:28.981 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 54/84 2026-03-09T00:15:28.981 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-base-2:18.2.1-0.el9.x86_64 74/84 2026-03-09T00:15:28.985 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 55/84 2026-03-09T00:15:28.987 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 56/84 2026-03-09T00:15:28.990 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 57/84 2026-03-09T00:15:28.993 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 58/84 2026-03-09T00:15:28.999 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 59/84 2026-03-09T00:15:29.002 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 74/84 2026-03-09T00:15:29.002 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-selinux-2:18.2.1-0.el9.x86_64 75/84 2026-03-09T00:15:29.003 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 60/84 2026-03-09T00:15:29.008 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 61/84 2026-03-09T00:15:29.012 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 62/84 2026-03-09T00:15:29.018 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 63/84 2026-03-09T00:15:29.021 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 64/84 2026-03-09T00:15:29.024 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 65/84 2026-03-09T00:15:29.028 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 66/84 2026-03-09T00:15:29.039 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 67/84 2026-03-09T00:15:29.045 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 68/84 2026-03-09T00:15:29.048 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 69/84 2026-03-09T00:15:29.050 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 70/84 2026-03-09T00:15:29.052 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 71/84 2026-03-09T00:15:29.058 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 72/84 2026-03-09T00:15:29.061 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 73/84 2026-03-09T00:15:29.082 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 74/84 2026-03-09T00:15:29.082 INFO:teuthology.orchestra.run.vm03.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service". 
2026-03-09T00:15:29.082 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:29.088 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-base-2:18.2.1-0.el9.x86_64 74/84 2026-03-09T00:15:29.108 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 74/84 2026-03-09T00:15:29.108 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-selinux-2:18.2.1-0.el9.x86_64 75/84 2026-03-09T00:15:34.737 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-selinux-2:18.2.1-0.el9.x86_64 75/84 2026-03-09T00:15:34.737 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /sys 2026-03-09T00:15:34.737 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /proc 2026-03-09T00:15:34.737 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /mnt 2026-03-09T00:15:34.738 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /var/tmp 2026-03-09T00:15:34.738 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /home 2026-03-09T00:15:34.738 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /root 2026-03-09T00:15:34.738 INFO:teuthology.orchestra.run.vm03.stdout:skipping the directory /tmp 2026-03-09T00:15:34.738 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:34.746 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-common-2:18.2.1-0.el9.x86_64 76/84 2026-03-09T00:15:34.760 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-selinux-2:18.2.1-0.el9.x86_64 75/84 2026-03-09T00:15:34.760 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /sys 2026-03-09T00:15:34.760 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /proc 2026-03-09T00:15:34.761 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /mnt 2026-03-09T00:15:34.761 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /var/tmp 2026-03-09T00:15:34.761 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /home 2026-03-09T00:15:34.761 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /root 2026-03-09T00:15:34.761 INFO:teuthology.orchestra.run.vm06.stdout:skipping the directory /tmp 2026-03-09T00:15:34.761 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:34.761 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-common-2:18.2.1-0.el9.x86_64 76/84 2026-03-09T00:15:34.767 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-ceph-common-2:18.2.1-0.el9.x86_64 77/84 2026-03-09T00:15:34.768 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 78/84 2026-03-09T00:15:34.770 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84 2026-03-09T00:15:34.770 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libradosstriper1-2:18.2.1-0.el9.x86_64 80/84 2026-03-09T00:15:34.772 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-common-2:18.2.1-0.el9.x86_64 76/84 2026-03-09T00:15:34.785 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libradosstriper1-2:18.2.1-0.el9.x86_64 80/84 2026-03-09T00:15:34.787 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84 2026-03-09T00:15:34.788 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-common-2:18.2.1-0.el9.x86_64 76/84 2026-03-09T00:15:34.789 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84 2026-03-09T00:15:34.791 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : 
python3-ceph-common-2:18.2.1-0.el9.x86_64 77/84 2026-03-09T00:15:34.791 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84 2026-03-09T00:15:34.791 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libcephsqlite-2:18.2.1-0.el9.x86_64 84/84 2026-03-09T00:15:34.792 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 78/84 2026-03-09T00:15:34.794 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84 2026-03-09T00:15:34.794 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libradosstriper1-2:18.2.1-0.el9.x86_64 80/84 2026-03-09T00:15:34.808 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libradosstriper1-2:18.2.1-0.el9.x86_64 80/84 2026-03-09T00:15:34.810 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84 2026-03-09T00:15:34.812 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84 2026-03-09T00:15:34.814 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84 2026-03-09T00:15:34.814 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libcephsqlite-2:18.2.1-0.el9.x86_64 84/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libcephsqlite-2:18.2.1-0.el9.x86_64 84/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-base-2:18.2.1-0.el9.x86_64 1/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-common-2:18.2.1-0.el9.x86_64 2/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 3/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 4/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-2:18.2.1-0.el9.x86_64 5/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 6/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 7/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 8/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 9/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-mgr-rook-2:18.2.1-0.el9.noarch 10/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 11/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-selinux-2:18.2.1-0.el9.x86_64 12/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephsqlite-2:18.2.1-0.el9.x86_64 17/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : 
libgfortran-11.5.0-14.el9.x86_64 18/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 19/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 20/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libradosstriper1-2:18.2.1-0.el9.x86_64 21/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 22/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 23/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 24/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 25/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 26/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 27/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 28/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 29/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-ceph-common-2:18.2.1-0.el9.x86_64 30/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 31/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 32/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 33/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 34/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 35/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 36/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 37/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 38/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 39/84 2026-03-09T00:15:34.890 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 40/84 2026-03-09T00:15:34.891 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 41/84 2026-03-09T00:15:34.891 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 42/84 2026-03-09T00:15:34.891 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 43/84 2026-03-09T00:15:34.891 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 44/84 2026-03-09T00:15:34.891 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 45/84 2026-03-09T00:15:34.891 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 46/84 2026-03-09T00:15:34.891 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : 
python3-jsonpatch-1.21-16.el9.noarch 47/84 2026-03-09T00:15:34.891 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 48/84 2026-03-09T00:15:34.891 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 49/84 2026-03-09T00:15:34.891 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 50/84 2026-03-09T00:15:34.891 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84 2026-03-09T00:15:34.891 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84 2026-03-09T00:15:34.891 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 55/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 67/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 71/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 74/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 75/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : 
python3-tempora-5.0.0-2.el9.noarch 76/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 79/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84 2026-03-09T00:15:34.892 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84 2026-03-09T00:15:34.921 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libcephsqlite-2:18.2.1-0.el9.x86_64 84/84 2026-03-09T00:15:34.921 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-base-2:18.2.1-0.el9.x86_64 1/84 2026-03-09T00:15:34.921 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-common-2:18.2.1-0.el9.x86_64 2/84 2026-03-09T00:15:34.921 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 3/84 2026-03-09T00:15:34.921 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 4/84 2026-03-09T00:15:34.921 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-2:18.2.1-0.el9.x86_64 5/84 2026-03-09T00:15:34.921 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 6/84 2026-03-09T00:15:34.921 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 7/84 2026-03-09T00:15:34.921 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 8/84 2026-03-09T00:15:34.921 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 9/84 2026-03-09T00:15:34.921 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-mgr-rook-2:18.2.1-0.el9.noarch 10/84 2026-03-09T00:15:34.921 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 11/84 2026-03-09T00:15:34.921 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-selinux-2:18.2.1-0.el9.x86_64 12/84 2026-03-09T00:15:34.921 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84 2026-03-09T00:15:34.923 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84 2026-03-09T00:15:34.923 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84 2026-03-09T00:15:34.923 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84 2026-03-09T00:15:34.923 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephsqlite-2:18.2.1-0.el9.x86_64 17/84 2026-03-09T00:15:34.923 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 18/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 19/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 20/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : 
libradosstriper1-2:18.2.1-0.el9.x86_64 21/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 22/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 23/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 24/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 25/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 26/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 27/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 28/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 29/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ceph-common-2:18.2.1-0.el9.x86_64 30/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 31/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 32/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 33/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 34/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 35/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 36/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 37/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 38/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 39/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 40/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 41/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 42/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 43/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 44/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 45/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 46/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 47/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 48/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 49/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : 
python3-jwt+crypto-2.4.0-1.el9.noarch 50/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 55/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84 2026-03-09T00:15:34.924 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 67/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 71/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 74/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 75/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 76/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : 
python3-urllib3-1.26.5-7.el9.noarch 79/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84 2026-03-09T00:15:34.925 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : rbd-mirror-2:18.2.1-0.el9.x86_64 84/84 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: ceph-base-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: ceph-common-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarch 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-rook-2:18.2.1-0.el9.noarch 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: ceph-selinux-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-09T00:15:34.967 INFO:teuthology.orchestra.run.vm03.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: libcephsqlite-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-09T00:15:34.968 
INFO:teuthology.orchestra.run.vm03.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-common-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-chardet-4.0.0-5.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-idna-2.10-7.el9.1.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-jsonpatch-1.21-16.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-jsonpointer-2.0-4.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-oauthlib-3.1.1-5.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan-1.4.2-3.el9.noarch 
2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-ply-3.11-14.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-09T00:15:34.968 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-pysocks-1.7.1-12.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-pytz-2021.1-5.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: rbd-mirror-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:34.969 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 
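At this point the dnf erase transaction on vm03 has finished: all 84 Reef (18.2.1) RPMs were erased, verified, listed under "Removed:", and the run ends with "Complete!"; vm06's parallel purge completes moments behind. A minimal sketch of driving such a purge step and failing fast on a broken transaction, assuming a hypothetical purge_packages helper rather than teuthology's actual task code:

    import subprocess

    # Hypothetical sketch of the purge step seen above -- not teuthology's
    # actual task code. dnf resolves the full dependency closure itself,
    # which is why a single "remove" produced the 84-package transaction.
    def purge_packages(packages):
        cmd = ["sudo", "dnf", "-y", "remove"] + list(packages)
        # check=True raises CalledProcessError on a nonzero exit status,
        # the same way a harness would fail the job on a broken transaction.
        subprocess.run(cmd, check=True)

    purge_packages(["ceph-base", "ceph-common", "rbd-mirror"])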
2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-mirror-2:18.2.1-0.el9.x86_64 84/84 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout:Removed: 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: ceph-base-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: ceph-common-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: ceph-mgr-rook-2:18.2.1-0.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: ceph-selinux-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: libcephsqlite-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: libradosstriper1-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-common-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-09T00:15:35.007 
INFO:teuthology.orchestra.run.vm06.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-chardet-4.0.0-5.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-cryptography-36.0.1-5.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-idna-2.10-7.el9.1.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-09T00:15:35.007 INFO:teuthology.orchestra.run.vm06.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-jsonpatch-1.21-16.el9.noarch 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-jsonpointer-2.0-4.el9.noarch 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-oauthlib-3.1.1-5.el9.noarch 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-ply-3.11-14.el9.noarch 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-09T00:15:35.008 
INFO:teuthology.orchestra.run.vm06.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-pycparser-2.20-6.el9.noarch
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-pysocks-1.7.1-12.el9.noarch
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-pytz-2021.1-5.el9.noarch
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-repoze-lru-0.7-16.el9.noarch
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-2.25.1-10.el9.noarch
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-routes-2.5.1-5.el9.noarch
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-rsa-4.9-2.el9.noarch
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-scipy-1.9.3-2.el9.x86_64
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-tempora-5.0.0-2.el9.noarch
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-toml-0.10.2-6.el9.noarch
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-typing-extensions-4.15.0-1.el9.noarch
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-urllib3-1.26.5-7.el9.noarch
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-webob-1.8.8-2.el9.noarch
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-websocket-client-1.2.3-2.el9.noarch
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: python3-zc-lockfile-2.0-10.el9.noarch
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout: rbd-mirror-2:18.2.1-0.el9.x86_64
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:15:35.008 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T00:15:35.192 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-09T00:15:35.192 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-09T00:15:35.192 INFO:teuthology.orchestra.run.vm03.stdout: Package Architecture Version Repository Size
2026-03-09T00:15:35.192 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-09T00:15:35.192 INFO:teuthology.orchestra.run.vm03.stdout:Removing:
2026-03-09T00:15:35.192 INFO:teuthology.orchestra.run.vm03.stdout: cephadm noarch 2:18.2.1-0.el9 @ceph-noarch 213 k
2026-03-09T00:15:35.192 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:15:35.192 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary
2026-03-09T00:15:35.192 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================
2026-03-09T00:15:35.192 INFO:teuthology.orchestra.run.vm03.stdout:Remove 1 Package
2026-03-09T00:15:35.192 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:15:35.192 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 213 k
2026-03-09T00:15:35.192 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check
2026-03-09T00:15:35.194 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded.
2026-03-09T00:15:35.194 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test
2026-03-09T00:15:35.195 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded.
2026-03-09T00:15:35.196 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction
2026-03-09T00:15:35.212 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1
2026-03-09T00:15:35.212 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : cephadm-2:18.2.1-0.el9.noarch 1/1
2026-03-09T00:15:35.224 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T00:15:35.225 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T00:15:35.225 INFO:teuthology.orchestra.run.vm06.stdout: Package Architecture Version Repository Size
2026-03-09T00:15:35.225 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T00:15:35.225 INFO:teuthology.orchestra.run.vm06.stdout:Removing:
2026-03-09T00:15:35.225 INFO:teuthology.orchestra.run.vm06.stdout: cephadm noarch 2:18.2.1-0.el9 @ceph-noarch 213 k
2026-03-09T00:15:35.225 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:15:35.225 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary
2026-03-09T00:15:35.225 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================
2026-03-09T00:15:35.225 INFO:teuthology.orchestra.run.vm06.stdout:Remove 1 Package
2026-03-09T00:15:35.225 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:15:35.225 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 213 k
2026-03-09T00:15:35.225 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check
2026-03-09T00:15:35.227 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded.
2026-03-09T00:15:35.227 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test
2026-03-09T00:15:35.228 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded.
2026-03-09T00:15:35.228 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction
2026-03-09T00:15:35.245 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1
2026-03-09T00:15:35.245 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : cephadm-2:18.2.1-0.el9.noarch 1/1
2026-03-09T00:15:35.330 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: cephadm-2:18.2.1-0.el9.noarch 1/1
2026-03-09T00:15:35.354 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: cephadm-2:18.2.1-0.el9.noarch 1/1
2026-03-09T00:15:35.376 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : cephadm-2:18.2.1-0.el9.noarch 1/1
2026-03-09T00:15:35.376 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:15:35.376 INFO:teuthology.orchestra.run.vm03.stdout:Removed:
2026-03-09T00:15:35.376 INFO:teuthology.orchestra.run.vm03.stdout: cephadm-2:18.2.1-0.el9.noarch
2026-03-09T00:15:35.376 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-09T00:15:35.376 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
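The cephadm package itself is erased in a separate one-package transaction (213 k freed on each host). The dnf remove calls that follow target packages that were already erased as dependencies of the main transaction, so dnf prints "No match for argument" on stdout and "No packages marked for removal." on stderr, then exits cleanly with "Nothing to do." A sketch of making such a loop explicit by probing the RPM database first; remove_if_installed is a hypothetical helper, not part of teuthology:

    import subprocess

    # Hypothetical helper, not part of teuthology: probe the RPM database
    # before removing, so already-absent packages are skipped explicitly
    # instead of relying on dnf's "Nothing to do." no-op behavior.
    def remove_if_installed(package):
        probe = subprocess.run(["rpm", "-q", package],
                               stdout=subprocess.DEVNULL,
                               stderr=subprocess.DEVNULL)
        if probe.returncode != 0:
            return False  # rpm -q is nonzero when the package is not installed
        subprocess.run(["sudo", "dnf", "-y", "remove", package], check=True)
        return True

    for name in ["cephadm", "ceph-immutable-object-cache", "ceph-mgr"]:
        state = "removed" if remove_if_installed(name) else "not installed"
        print(f"{name}: {state}")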
2026-03-09T00:15:35.399 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : cephadm-2:18.2.1-0.el9.noarch 1/1
2026-03-09T00:15:35.400 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:15:35.400 INFO:teuthology.orchestra.run.vm06.stdout:Removed:
2026-03-09T00:15:35.400 INFO:teuthology.orchestra.run.vm06.stdout: cephadm-2:18.2.1-0.el9.noarch
2026-03-09T00:15:35.400 INFO:teuthology.orchestra.run.vm06.stdout:
2026-03-09T00:15:35.400 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T00:15:35.565 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-immutable-object-cache
2026-03-09T00:15:35.565 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal.
2026-03-09T00:15:35.568 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-09T00:15:35.569 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do.
2026-03-09T00:15:35.569 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-09T00:15:35.591 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-immutable-object-cache
2026-03-09T00:15:35.591 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T00:15:35.594 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T00:15:35.594 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T00:15:35.594 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T00:15:35.747 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-mgr
2026-03-09T00:15:35.747 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal.
2026-03-09T00:15:35.750 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-09T00:15:35.751 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do.
2026-03-09T00:15:35.751 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-09T00:15:35.765 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-mgr
2026-03-09T00:15:35.766 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T00:15:35.768 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T00:15:35.769 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T00:15:35.769 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T00:15:35.922 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-mgr-dashboard
2026-03-09T00:15:35.922 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal.
2026-03-09T00:15:35.925 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-09T00:15:35.925 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do.
2026-03-09T00:15:35.925 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-09T00:15:35.932 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-mgr-dashboard
2026-03-09T00:15:35.932 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T00:15:35.935 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T00:15:35.935 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T00:15:35.935 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T00:15:36.094 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-mgr-diskprediction-local
2026-03-09T00:15:36.095 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T00:15:36.095 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-mgr-diskprediction-local
2026-03-09T00:15:36.095 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal.
2026-03-09T00:15:36.097 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T00:15:36.098 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T00:15:36.098 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T00:15:36.098 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-09T00:15:36.098 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do.
2026-03-09T00:15:36.098 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-09T00:15:36.267 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-mgr-rook
2026-03-09T00:15:36.267 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal.
2026-03-09T00:15:36.268 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-mgr-rook
2026-03-09T00:15:36.268 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T00:15:36.270 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-09T00:15:36.270 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do.
2026-03-09T00:15:36.270 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-09T00:15:36.271 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T00:15:36.272 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T00:15:36.272 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T00:15:36.438 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: ceph-mgr-cephadm
2026-03-09T00:15:36.438 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal.
2026-03-09T00:15:36.440 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: ceph-mgr-cephadm
2026-03-09T00:15:36.440 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal.
2026-03-09T00:15:36.441 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T00:15:36.441 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do.
2026-03-09T00:15:36.441 INFO:teuthology.orchestra.run.vm06.stdout:Complete!
2026-03-09T00:15:36.443 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved.
2026-03-09T00:15:36.444 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do.
2026-03-09T00:15:36.444 INFO:teuthology.orchestra.run.vm03.stdout:Complete!
2026-03-09T00:15:36.620 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved.
2026-03-09T00:15:36.620 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:36.620 INFO:teuthology.orchestra.run.vm06.stdout: Package Architecture Version Repository Size 2026-03-09T00:15:36.620 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:36.620 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-09T00:15:36.620 INFO:teuthology.orchestra.run.vm06.stdout: ceph-fuse x86_64 2:18.2.1-0.el9 @ceph 2.5 M 2026-03-09T00:15:36.620 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:36.620 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-09T00:15:36.620 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:36.620 INFO:teuthology.orchestra.run.vm06.stdout:Remove 1 Package 2026-03-09T00:15:36.620 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:36.620 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 2.5 M 2026-03-09T00:15:36.620 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-09T00:15:36.621 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T00:15:36.621 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:36.621 INFO:teuthology.orchestra.run.vm03.stdout: Package Architecture Version Repository Size 2026-03-09T00:15:36.621 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:36.621 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-09T00:15:36.621 INFO:teuthology.orchestra.run.vm03.stdout: ceph-fuse x86_64 2:18.2.1-0.el9 @ceph 2.5 M 2026-03-09T00:15:36.621 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:36.621 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-09T00:15:36.621 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:36.621 INFO:teuthology.orchestra.run.vm03.stdout:Remove 1 Package 2026-03-09T00:15:36.621 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:36.621 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 2.5 M 2026-03-09T00:15:36.621 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-09T00:15:36.622 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-09T00:15:36.622 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-09T00:15:36.623 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-09T00:15:36.623 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-09T00:15:36.631 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 2026-03-09T00:15:36.631 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-09T00:15:36.632 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 
2026-03-09T00:15:36.633 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-09T00:15:36.656 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-09T00:15:36.657 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-09T00:15:36.670 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : ceph-fuse-2:18.2.1-0.el9.x86_64 1/1 2026-03-09T00:15:36.671 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : ceph-fuse-2:18.2.1-0.el9.x86_64 1/1 2026-03-09T00:15:36.727 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: ceph-fuse-2:18.2.1-0.el9.x86_64 1/1 2026-03-09T00:15:36.737 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: ceph-fuse-2:18.2.1-0.el9.x86_64 1/1 2026-03-09T00:15:36.773 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : ceph-fuse-2:18.2.1-0.el9.x86_64 1/1 2026-03-09T00:15:36.773 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:36.773 INFO:teuthology.orchestra.run.vm06.stdout:Removed: 2026-03-09T00:15:36.773 INFO:teuthology.orchestra.run.vm06.stdout: ceph-fuse-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:36.773 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:36.773 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T00:15:36.779 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : ceph-fuse-2:18.2.1-0.el9.x86_64 1/1 2026-03-09T00:15:36.779 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:36.779 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-09T00:15:36.779 INFO:teuthology.orchestra.run.vm03.stdout: ceph-fuse-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:36.779 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:36.779 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T00:15:36.961 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-09T00:15:36.961 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:36.961 INFO:teuthology.orchestra.run.vm06.stdout: Package Architecture Version Repository Size 2026-03-09T00:15:36.961 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:36.962 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-09T00:15:36.962 INFO:teuthology.orchestra.run.vm06.stdout: librados-devel x86_64 2:18.2.1-0.el9 @ceph 456 k 2026-03-09T00:15:36.962 INFO:teuthology.orchestra.run.vm06.stdout:Removing dependent packages: 2026-03-09T00:15:36.962 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs-devel x86_64 2:18.2.1-0.el9 @ceph 139 k 2026-03-09T00:15:36.962 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:36.962 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-09T00:15:36.962 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:36.962 INFO:teuthology.orchestra.run.vm06.stdout:Remove 2 Packages 2026-03-09T00:15:36.962 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:36.962 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 595 k 2026-03-09T00:15:36.962 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-09T00:15:36.964 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-09T00:15:36.964 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-09T00:15:36.970 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 
2026-03-09T00:15:36.971 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:36.971 INFO:teuthology.orchestra.run.vm03.stdout: Package Architecture Version Repository Size 2026-03-09T00:15:36.971 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:36.971 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-09T00:15:36.971 INFO:teuthology.orchestra.run.vm03.stdout: librados-devel x86_64 2:18.2.1-0.el9 @ceph 456 k 2026-03-09T00:15:36.971 INFO:teuthology.orchestra.run.vm03.stdout:Removing dependent packages: 2026-03-09T00:15:36.971 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs-devel x86_64 2:18.2.1-0.el9 @ceph 139 k 2026-03-09T00:15:36.971 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:36.971 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-09T00:15:36.971 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:36.971 INFO:teuthology.orchestra.run.vm03.stdout:Remove 2 Packages 2026-03-09T00:15:36.971 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:36.971 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 595 k 2026-03-09T00:15:36.971 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-09T00:15:36.973 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-09T00:15:36.973 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-09T00:15:36.973 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 2026-03-09T00:15:36.974 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-09T00:15:36.983 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 
2026-03-09T00:15:36.984 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-09T00:15:36.999 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-09T00:15:37.001 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libcephfs-devel-2:18.2.1-0.el9.x86_64 1/2 2026-03-09T00:15:37.009 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-09T00:15:37.011 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libcephfs-devel-2:18.2.1-0.el9.x86_64 1/2 2026-03-09T00:15:37.013 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : librados-devel-2:18.2.1-0.el9.x86_64 2/2 2026-03-09T00:15:37.024 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librados-devel-2:18.2.1-0.el9.x86_64 2/2 2026-03-09T00:15:37.076 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librados-devel-2:18.2.1-0.el9.x86_64 2/2 2026-03-09T00:15:37.076 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephfs-devel-2:18.2.1-0.el9.x86_64 1/2 2026-03-09T00:15:37.088 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librados-devel-2:18.2.1-0.el9.x86_64 2/2 2026-03-09T00:15:37.088 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephfs-devel-2:18.2.1-0.el9.x86_64 1/2 2026-03-09T00:15:37.121 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librados-devel-2:18.2.1-0.el9.x86_64 2/2 2026-03-09T00:15:37.121 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:37.121 INFO:teuthology.orchestra.run.vm06.stdout:Removed: 2026-03-09T00:15:37.121 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs-devel-2:18.2.1-0.el9.x86_64 librados-devel-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:37.121 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:37.121 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T00:15:37.132 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librados-devel-2:18.2.1-0.el9.x86_64 2/2 2026-03-09T00:15:37.132 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:37.133 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-09T00:15:37.133 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs-devel-2:18.2.1-0.el9.x86_64 librados-devel-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:37.133 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:37.133 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T00:15:37.305 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 
2026-03-09T00:15:37.305 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:37.305 INFO:teuthology.orchestra.run.vm06.stdout: Package Arch Version Repository Size 2026-03-09T00:15:37.305 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:37.305 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-09T00:15:37.305 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs2 x86_64 2:18.2.1-0.el9 @ceph 1.9 M 2026-03-09T00:15:37.305 INFO:teuthology.orchestra.run.vm06.stdout:Removing dependent packages: 2026-03-09T00:15:37.305 INFO:teuthology.orchestra.run.vm06.stdout: python3-cephfs x86_64 2:18.2.1-0.el9 @ceph 505 k 2026-03-09T00:15:37.305 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies: 2026-03-09T00:15:37.305 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-argparse x86_64 2:18.2.1-0.el9 @ceph 186 k 2026-03-09T00:15:37.305 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:37.306 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-09T00:15:37.306 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:37.306 INFO:teuthology.orchestra.run.vm06.stdout:Remove 3 Packages 2026-03-09T00:15:37.306 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:37.306 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 2.5 M 2026-03-09T00:15:37.306 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-09T00:15:37.307 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-09T00:15:37.308 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-09T00:15:37.319 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 2026-03-09T00:15:37.319 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-09T00:15:37.319 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 
2026-03-09T00:15:37.320 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:37.320 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size 2026-03-09T00:15:37.320 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:37.320 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-09T00:15:37.320 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs2 x86_64 2:18.2.1-0.el9 @ceph 1.9 M 2026-03-09T00:15:37.320 INFO:teuthology.orchestra.run.vm03.stdout:Removing dependent packages: 2026-03-09T00:15:37.320 INFO:teuthology.orchestra.run.vm03.stdout: python3-cephfs x86_64 2:18.2.1-0.el9 @ceph 505 k 2026-03-09T00:15:37.320 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies: 2026-03-09T00:15:37.320 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse x86_64 2:18.2.1-0.el9 @ceph 186 k 2026-03-09T00:15:37.320 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:37.320 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-09T00:15:37.320 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:37.320 INFO:teuthology.orchestra.run.vm03.stdout:Remove 3 Packages 2026-03-09T00:15:37.320 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:37.320 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 2.5 M 2026-03-09T00:15:37.320 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-09T00:15:37.322 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-09T00:15:37.322 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-09T00:15:37.334 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 
2026-03-09T00:15:37.334 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-09T00:15:37.345 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-09T00:15:37.348 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-cephfs-2:18.2.1-0.el9.x86_64 1/3 2026-03-09T00:15:37.349 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 2/3 2026-03-09T00:15:37.349 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libcephfs2-2:18.2.1-0.el9.x86_64 3/3 2026-03-09T00:15:37.360 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-09T00:15:37.362 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-cephfs-2:18.2.1-0.el9.x86_64 1/3 2026-03-09T00:15:37.363 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 2/3 2026-03-09T00:15:37.363 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libcephfs2-2:18.2.1-0.el9.x86_64 3/3 2026-03-09T00:15:37.409 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: libcephfs2-2:18.2.1-0.el9.x86_64 3/3 2026-03-09T00:15:37.409 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libcephfs2-2:18.2.1-0.el9.x86_64 1/3 2026-03-09T00:15:37.409 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 2/3 2026-03-09T00:15:37.426 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: libcephfs2-2:18.2.1-0.el9.x86_64 3/3 2026-03-09T00:15:37.426 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libcephfs2-2:18.2.1-0.el9.x86_64 1/3 2026-03-09T00:15:37.426 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 2/3 2026-03-09T00:15:37.448 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-cephfs-2:18.2.1-0.el9.x86_64 3/3 2026-03-09T00:15:37.448 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:37.448 INFO:teuthology.orchestra.run.vm06.stdout:Removed: 2026-03-09T00:15:37.448 INFO:teuthology.orchestra.run.vm06.stdout: libcephfs2-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:37.448 INFO:teuthology.orchestra.run.vm06.stdout: python3-ceph-argparse-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:37.448 INFO:teuthology.orchestra.run.vm06.stdout: python3-cephfs-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:37.448 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:37.448 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T00:15:37.465 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-cephfs-2:18.2.1-0.el9.x86_64 3/3 2026-03-09T00:15:37.465 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:37.465 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-09T00:15:37.465 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs2-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:37.465 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:37.465 INFO:teuthology.orchestra.run.vm03.stdout: python3-cephfs-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:37.465 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:37.465 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T00:15:37.618 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: libcephfs-devel 2026-03-09T00:15:37.618 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal. 2026-03-09T00:15:37.621 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-09T00:15:37.621 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do. 
2026-03-09T00:15:37.621 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T00:15:37.633 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: libcephfs-devel 2026-03-09T00:15:37.634 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T00:15:37.636 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T00:15:37.637 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T00:15:37.637 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T00:15:37.800 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: Package Arch Version Repository Size 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout:Removing: 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: librados2 x86_64 2:18.2.1-0.el9 @ceph 12 M 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout:Removing dependent packages: 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: python3-rados x86_64 2:18.2.1-0.el9 @ceph 1.1 M 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: python3-rbd x86_64 2:18.2.1-0.el9 @ceph 1.1 M 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: python3-rgw x86_64 2:18.2.1-0.el9 @ceph 269 k 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: rbd-fuse x86_64 2:18.2.1-0.el9 @ceph 226 k 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: rbd-nbd x86_64 2:18.2.1-0.el9 @ceph 494 k 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout:Removing unused dependencies: 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: librbd1 x86_64 2:18.2.1-0.el9 @ceph 12 M 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: librgw2 x86_64 2:18.2.1-0.el9 @ceph 15 M 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k 
2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout:Transaction Summary 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout:================================================================================ 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout:Remove 21 Packages 2026-03-09T00:15:37.802 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:37.803 INFO:teuthology.orchestra.run.vm06.stdout:Freed space: 74 M 2026-03-09T00:15:37.803 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction check 2026-03-09T00:15:37.806 INFO:teuthology.orchestra.run.vm06.stdout:Transaction check succeeded. 2026-03-09T00:15:37.806 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction test 2026-03-09T00:15:37.818 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: Package Arch Version Repository Size 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout:Removing: 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: librados2 x86_64 2:18.2.1-0.el9 @ceph 12 M 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout:Removing dependent packages: 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: python3-rados x86_64 2:18.2.1-0.el9 @ceph 1.1 M 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: python3-rbd x86_64 2:18.2.1-0.el9 @ceph 1.1 M 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: python3-rgw x86_64 2:18.2.1-0.el9 @ceph 269 k 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: rbd-fuse x86_64 2:18.2.1-0.el9 @ceph 226 k 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: rbd-nbd x86_64 2:18.2.1-0.el9 @ceph 494 k 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout:Removing unused dependencies: 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: librbd1 x86_64 2:18.2.1-0.el9 @ceph 12 M 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: librgw2 x86_64 2:18.2.1-0.el9 @ceph 15 M 2026-03-09T00:15:37.820 
INFO:teuthology.orchestra.run.vm03.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout:Transaction Summary 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout:================================================================================ 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout:Remove 21 Packages 2026-03-09T00:15:37.820 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:37.821 INFO:teuthology.orchestra.run.vm03.stdout:Freed space: 74 M 2026-03-09T00:15:37.821 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction check 2026-03-09T00:15:37.824 INFO:teuthology.orchestra.run.vm03.stdout:Transaction check succeeded. 2026-03-09T00:15:37.824 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction test 2026-03-09T00:15:37.829 INFO:teuthology.orchestra.run.vm06.stdout:Transaction test succeeded. 2026-03-09T00:15:37.829 INFO:teuthology.orchestra.run.vm06.stdout:Running transaction 2026-03-09T00:15:37.847 INFO:teuthology.orchestra.run.vm03.stdout:Transaction test succeeded. 2026-03-09T00:15:37.848 INFO:teuthology.orchestra.run.vm03.stdout:Running transaction 2026-03-09T00:15:37.870 INFO:teuthology.orchestra.run.vm06.stdout: Preparing : 1/1 2026-03-09T00:15:37.873 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : rbd-nbd-2:18.2.1-0.el9.x86_64 1/21 2026-03-09T00:15:37.875 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : rbd-fuse-2:18.2.1-0.el9.x86_64 2/21 2026-03-09T00:15:37.877 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-rgw-2:18.2.1-0.el9.x86_64 3/21 2026-03-09T00:15:37.877 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : librgw2-2:18.2.1-0.el9.x86_64 4/21 2026-03-09T00:15:37.888 INFO:teuthology.orchestra.run.vm03.stdout: Preparing : 1/1 2026-03-09T00:15:37.900 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librgw2-2:18.2.1-0.el9.x86_64 4/21 2026-03-09T00:15:37.902 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : rbd-nbd-2:18.2.1-0.el9.x86_64 1/21 2026-03-09T00:15:37.902 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/21 2026-03-09T00:15:37.904 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-rbd-2:18.2.1-0.el9.x86_64 6/21 2026-03-09T00:15:37.905 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : rbd-fuse-2:18.2.1-0.el9.x86_64 2/21 2026-03-09T00:15:37.906 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : python3-rados-2:18.2.1-0.el9.x86_64 7/21 2026-03-09T00:15:37.907 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-rgw-2:18.2.1-0.el9.x86_64 3/21 2026-03-09T00:15:37.907 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librgw2-2:18.2.1-0.el9.x86_64 4/21 2026-03-09T00:15:37.908 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/21 2026-03-09T00:15:37.908 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : librbd1-2:18.2.1-0.el9.x86_64 9/21 2026-03-09T00:15:37.920 INFO:teuthology.orchestra.run.vm03.stdout: Running 
scriptlet: librgw2-2:18.2.1-0.el9.x86_64 4/21 2026-03-09T00:15:37.921 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librbd1-2:18.2.1-0.el9.x86_64 9/21 2026-03-09T00:15:37.921 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : librados2-2:18.2.1-0.el9.x86_64 10/21 2026-03-09T00:15:37.921 INFO:teuthology.orchestra.run.vm06.stdout:warning: file /etc/ceph: remove failed: No such file or directory 2026-03-09T00:15:37.921 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:37.922 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/21 2026-03-09T00:15:37.923 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-rbd-2:18.2.1-0.el9.x86_64 6/21 2026-03-09T00:15:37.925 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : python3-rados-2:18.2.1-0.el9.x86_64 7/21 2026-03-09T00:15:37.928 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/21 2026-03-09T00:15:37.928 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librbd1-2:18.2.1-0.el9.x86_64 9/21 2026-03-09T00:15:37.933 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librados2-2:18.2.1-0.el9.x86_64 10/21 2026-03-09T00:15:37.936 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 11/21 2026-03-09T00:15:37.938 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 12/21 2026-03-09T00:15:37.940 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 13/21 2026-03-09T00:15:37.941 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librbd1-2:18.2.1-0.el9.x86_64 9/21 2026-03-09T00:15:37.941 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librados2-2:18.2.1-0.el9.x86_64 10/21 2026-03-09T00:15:37.941 INFO:teuthology.orchestra.run.vm03.stdout:warning: file /etc/ceph: remove failed: No such file or directory 2026-03-09T00:15:37.941 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:37.943 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 14/21 2026-03-09T00:15:37.946 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : re2-1:20211101-20.el9.x86_64 15/21 2026-03-09T00:15:37.950 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 16/21 2026-03-09T00:15:37.952 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 17/21 2026-03-09T00:15:37.954 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 18/21 2026-03-09T00:15:37.956 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 19/21 2026-03-09T00:15:37.957 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librados2-2:18.2.1-0.el9.x86_64 10/21 2026-03-09T00:15:37.958 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 20/21 2026-03-09T00:15:37.959 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 11/21 2026-03-09T00:15:37.962 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 12/21 2026-03-09T00:15:37.964 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 13/21 2026-03-09T00:15:37.965 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 14/21 2026-03-09T00:15:37.969 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : re2-1:20211101-20.el9.x86_64 15/21 2026-03-09T00:15:37.971 INFO:teuthology.orchestra.run.vm06.stdout: Erasing : 
librdkafka-1.6.1-102.el9.x86_64 21/21 2026-03-09T00:15:37.972 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 16/21 2026-03-09T00:15:37.975 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 17/21 2026-03-09T00:15:37.977 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 18/21 2026-03-09T00:15:37.978 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 19/21 2026-03-09T00:15:37.980 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 20/21 2026-03-09T00:15:37.996 INFO:teuthology.orchestra.run.vm03.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 21/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 21/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 2/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 3/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 4/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librados2-2:18.2.1-0.el9.x86_64 7/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librbd1-2:18.2.1-0.el9.x86_64 8/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : librgw2-2:18.2.1-0.el9.x86_64 10/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 11/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 12/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 13/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rados-2:18.2.1-0.el9.x86_64 14/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rbd-2:18.2.1-0.el9.x86_64 15/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : python3-rgw-2:18.2.1-0.el9.x86_64 16/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 17/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-fuse-2:18.2.1-0.el9.x86_64 18/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : rbd-nbd-2:18.2.1-0.el9.x86_64 19/21 2026-03-09T00:15:38.040 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : re2-1:20211101-20.el9.x86_64 20/21 2026-03-09T00:15:38.051 INFO:teuthology.orchestra.run.vm03.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 21/21 2026-03-09T00:15:38.051 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/21 2026-03-09T00:15:38.051 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : 
gperftools-libs-2.9.1-3.el9.x86_64 2/21 2026-03-09T00:15:38.051 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 3/21 2026-03-09T00:15:38.051 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 4/21 2026-03-09T00:15:38.051 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/21 2026-03-09T00:15:38.051 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/21 2026-03-09T00:15:38.051 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librados2-2:18.2.1-0.el9.x86_64 7/21 2026-03-09T00:15:38.051 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librbd1-2:18.2.1-0.el9.x86_64 8/21 2026-03-09T00:15:38.052 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/21 2026-03-09T00:15:38.052 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : librgw2-2:18.2.1-0.el9.x86_64 10/21 2026-03-09T00:15:38.052 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 11/21 2026-03-09T00:15:38.052 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 12/21 2026-03-09T00:15:38.052 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 13/21 2026-03-09T00:15:38.052 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rados-2:18.2.1-0.el9.x86_64 14/21 2026-03-09T00:15:38.052 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rbd-2:18.2.1-0.el9.x86_64 15/21 2026-03-09T00:15:38.052 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : python3-rgw-2:18.2.1-0.el9.x86_64 16/21 2026-03-09T00:15:38.052 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 17/21 2026-03-09T00:15:38.052 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : rbd-fuse-2:18.2.1-0.el9.x86_64 18/21 2026-03-09T00:15:38.052 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : rbd-nbd-2:18.2.1-0.el9.x86_64 19/21 2026-03-09T00:15:38.052 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : re2-1:20211101-20.el9.x86_64 20/21 2026-03-09T00:15:38.081 INFO:teuthology.orchestra.run.vm06.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 21/21 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout:Removed: 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: librados2-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: librbd1-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: librgw2-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: 
lttng-ust-2.12.0-6.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: python3-rados-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: python3-rbd-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: python3-rgw-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: rbd-fuse-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: rbd-nbd-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: re2-1:20211101-20.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout: 2026-03-09T00:15:38.082 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T00:15:38.095 INFO:teuthology.orchestra.run.vm03.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 21/21 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout:Removed: 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: boost-program-options-1.75.0-13.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: gperftools-libs-2.9.1-3.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-9.0.0-15.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: libarrow-doc-9.0.0-15.el9.noarch 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: libpmemobj-1.12.1-1.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: librabbitmq-0.11.0-7.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: librados2-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: librbd1-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: librgw2-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: python3-rados-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: python3-rbd-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: python3-rgw-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: rbd-fuse-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: rbd-nbd-2:18.2.1-0.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: re2-1:20211101-20.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-09T00:15:38.096 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-09T00:15:38.096 
INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T00:15:38.272 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: librbd1 2026-03-09T00:15:38.272 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal. 2026-03-09T00:15:38.275 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-09T00:15:38.276 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do. 2026-03-09T00:15:38.276 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T00:15:38.288 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: librbd1 2026-03-09T00:15:38.288 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T00:15:38.291 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T00:15:38.292 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T00:15:38.292 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T00:15:38.443 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: python3-rados 2026-03-09T00:15:38.443 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal. 2026-03-09T00:15:38.446 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-09T00:15:38.447 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do. 2026-03-09T00:15:38.447 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T00:15:38.471 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: python3-rados 2026-03-09T00:15:38.471 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T00:15:38.474 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T00:15:38.474 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T00:15:38.474 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T00:15:38.661 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: python3-rgw 2026-03-09T00:15:38.661 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal. 2026-03-09T00:15:38.664 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: python3-rgw 2026-03-09T00:15:38.664 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T00:15:38.664 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-09T00:15:38.665 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do. 2026-03-09T00:15:38.665 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T00:15:38.667 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T00:15:38.668 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T00:15:38.668 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T00:15:38.834 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: python3-cephfs 2026-03-09T00:15:38.835 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T00:15:38.838 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T00:15:38.838 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: python3-cephfs 2026-03-09T00:15:38.839 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T00:15:38.839 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T00:15:38.839 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal. 2026-03-09T00:15:38.841 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-09T00:15:38.842 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do. 
2026-03-09T00:15:38.842 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T00:15:39.010 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: python3-rbd 2026-03-09T00:15:39.010 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T00:15:39.013 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T00:15:39.013 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T00:15:39.013 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T00:15:39.019 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: python3-rbd 2026-03-09T00:15:39.019 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal. 2026-03-09T00:15:39.022 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-09T00:15:39.023 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do. 2026-03-09T00:15:39.023 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T00:15:39.188 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: rbd-fuse 2026-03-09T00:15:39.188 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T00:15:39.191 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T00:15:39.192 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T00:15:39.192 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T00:15:39.198 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: rbd-fuse 2026-03-09T00:15:39.198 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal. 2026-03-09T00:15:39.201 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-09T00:15:39.202 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do. 2026-03-09T00:15:39.202 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T00:15:39.370 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: rbd-mirror 2026-03-09T00:15:39.370 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T00:15:39.373 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T00:15:39.374 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T00:15:39.374 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T00:15:39.385 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: rbd-mirror 2026-03-09T00:15:39.385 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal. 2026-03-09T00:15:39.388 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-09T00:15:39.389 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do. 2026-03-09T00:15:39.389 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T00:15:39.551 INFO:teuthology.orchestra.run.vm03.stdout:No match for argument: rbd-nbd 2026-03-09T00:15:39.552 INFO:teuthology.orchestra.run.vm03.stderr:No packages marked for removal. 2026-03-09T00:15:39.555 INFO:teuthology.orchestra.run.vm03.stdout:Dependencies resolved. 2026-03-09T00:15:39.555 INFO:teuthology.orchestra.run.vm03.stdout:Nothing to do. 2026-03-09T00:15:39.555 INFO:teuthology.orchestra.run.vm03.stdout:Complete! 2026-03-09T00:15:39.562 INFO:teuthology.orchestra.run.vm06.stdout:No match for argument: rbd-nbd 2026-03-09T00:15:39.562 INFO:teuthology.orchestra.run.vm06.stderr:No packages marked for removal. 2026-03-09T00:15:39.566 INFO:teuthology.orchestra.run.vm06.stdout:Dependencies resolved. 2026-03-09T00:15:39.567 INFO:teuthology.orchestra.run.vm06.stdout:Nothing to do. 
2026-03-09T00:15:39.567 INFO:teuthology.orchestra.run.vm06.stdout:Complete! 2026-03-09T00:15:39.588 DEBUG:teuthology.orchestra.run.vm03:> sudo yum clean all 2026-03-09T00:15:39.595 DEBUG:teuthology.orchestra.run.vm06:> sudo yum clean all 2026-03-09T00:15:39.721 INFO:teuthology.orchestra.run.vm03.stdout:56 files removed 2026-03-09T00:15:39.735 INFO:teuthology.orchestra.run.vm06.stdout:56 files removed 2026-03-09T00:15:39.746 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f /etc/yum.repos.d/ceph.repo 2026-03-09T00:15:39.762 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f /etc/yum.repos.d/ceph.repo 2026-03-09T00:15:39.771 DEBUG:teuthology.orchestra.run.vm03:> sudo yum clean expire-cache 2026-03-09T00:15:39.789 DEBUG:teuthology.orchestra.run.vm06:> sudo yum clean expire-cache 2026-03-09T00:15:39.930 INFO:teuthology.orchestra.run.vm03.stdout:Cache was expired 2026-03-09T00:15:39.930 INFO:teuthology.orchestra.run.vm03.stdout:0 files removed 2026-03-09T00:15:39.950 INFO:teuthology.orchestra.run.vm06.stdout:Cache was expired 2026-03-09T00:15:39.951 INFO:teuthology.orchestra.run.vm06.stdout:0 files removed 2026-03-09T00:15:39.952 DEBUG:teuthology.parallel:result is None 2026-03-09T00:15:39.969 DEBUG:teuthology.parallel:result is None 2026-03-09T00:15:39.969 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm03.local 2026-03-09T00:15:39.969 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm06.local 2026-03-09T00:15:39.970 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f /etc/yum.repos.d/ceph.repo 2026-03-09T00:15:39.970 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f /etc/yum.repos.d/ceph.repo 2026-03-09T00:15:39.995 DEBUG:teuthology.orchestra.run.vm03:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf 2026-03-09T00:15:39.996 DEBUG:teuthology.orchestra.run.vm06:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf 2026-03-09T00:15:40.066 DEBUG:teuthology.parallel:result is None 2026-03-09T00:15:40.066 DEBUG:teuthology.parallel:result is None 2026-03-09T00:15:40.066 DEBUG:teuthology.run_tasks:Unwinding manager clock 2026-03-09T00:15:40.070 INFO:teuthology.task.clock:Checking final clock skew... 
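
The final clock-skew check that follows tries ntpq first and falls back to chronyc; these CentOS 9 nodes ship chrony rather than ntp, so the "ntpq: command not found" on stderr is expected and harmless, and the trailing `|| true` keeps a missing tool from ever failing the teardown. A sketch of the same fallback chain, assuming only that at most one of the two tools is present:

    import subprocess

    # Same fallback chain as the clock task's command below: prefer ntpq,
    # fall back to chronyc, and never fail even if both are missing.
    CMD = "PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true"

    def check_clock_skew():
        result = subprocess.run(["bash", "-c", CMD],
                                capture_output=True, text=True)
        # On a chrony-only host stderr carries "ntpq: command not found"
        # while stdout carries the chronyc source table.
        print(result.stdout, end="")
        print(result.stderr, end="")

    if __name__ == "__main__":
        check_clock_skew()
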
2026-03-09T00:15:40.070 DEBUG:teuthology.orchestra.run.vm03:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true 2026-03-09T00:15:40.108 DEBUG:teuthology.orchestra.run.vm06:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true 2026-03-09T00:15:40.121 INFO:teuthology.orchestra.run.vm03.stderr:bash: line 1: ntpq: command not found 2026-03-09T00:15:40.123 INFO:teuthology.orchestra.run.vm06.stderr:bash: line 1: ntpq: command not found 2026-03-09T00:15:40.201 INFO:teuthology.orchestra.run.vm06.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample 2026-03-09T00:15:40.201 INFO:teuthology.orchestra.run.vm06.stdout:=============================================================================== 2026-03-09T00:15:40.201 INFO:teuthology.orchestra.run.vm06.stdout:^+ tor.nocabal.de 2 7 377 121 -78us[ -82us] +/- 44ms 2026-03-09T00:15:40.201 INFO:teuthology.orchestra.run.vm06.stdout:^+ dominus.von-oppen.com 2 7 377 122 +679us[ +675us] +/- 46ms 2026-03-09T00:15:40.201 INFO:teuthology.orchestra.run.vm06.stdout:^+ bard-dmz.cbs.mpg.de 2 7 377 56 -46us[ -46us] +/- 50ms 2026-03-09T00:15:40.201 INFO:teuthology.orchestra.run.vm06.stdout:^* static.179.181.75.5.clie> 3 6 377 57 -602us[ -607us] +/- 27ms 2026-03-09T00:15:40.201 INFO:teuthology.orchestra.run.vm03.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample 2026-03-09T00:15:40.201 INFO:teuthology.orchestra.run.vm03.stdout:=============================================================================== 2026-03-09T00:15:40.201 INFO:teuthology.orchestra.run.vm03.stdout:^+ dominus.von-oppen.com 2 7 377 123 +625us[ +621us] +/- 46ms 2026-03-09T00:15:40.201 INFO:teuthology.orchestra.run.vm03.stdout:^+ bard-dmz.cbs.mpg.de 2 7 377 122 +36us[ +32us] +/- 49ms 2026-03-09T00:15:40.201 INFO:teuthology.orchestra.run.vm03.stdout:^* static.179.181.75.5.clie> 3 6 377 56 -593us[ -587us] +/- 27ms 2026-03-09T00:15:40.202 INFO:teuthology.orchestra.run.vm03.stdout:^+ tor.nocabal.de 2 7 377 123 -32us[ -36us] +/- 44ms 2026-03-09T00:15:40.204 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab 2026-03-09T00:15:40.207 INFO:teuthology.task.ansible:Skipping ansible cleanup... 2026-03-09T00:15:40.207 DEBUG:teuthology.run_tasks:Unwinding manager selinux 2026-03-09T00:15:40.210 DEBUG:teuthology.run_tasks:Unwinding manager pcp 2026-03-09T00:15:40.213 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer 2026-03-09T00:15:40.216 INFO:teuthology.task.internal:Duration was 1237.512774 seconds 2026-03-09T00:15:40.216 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog 2026-03-09T00:15:40.219 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring... 2026-03-09T00:15:40.219 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart 2026-03-09T00:15:40.247 DEBUG:teuthology.orchestra.run.vm06:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart 2026-03-09T00:15:40.287 INFO:teuthology.orchestra.run.vm03.stderr:Redirecting to /bin/systemctl restart rsyslog.service 2026-03-09T00:15:40.292 INFO:teuthology.orchestra.run.vm06.stderr:Redirecting to /bin/systemctl restart rsyslog.service 2026-03-09T00:15:40.446 INFO:teuthology.task.internal.syslog:Checking logs for errors... 
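
The log check that follows scans each node's captured kern.log for BUG/INFO/DEADLOCK markers, strips a long allowlist of known-benign messages with chained `grep -v` filters, and keeps only the first surviving line; empty output means the kernel log is treated as clean. A compact Python equivalent of that filter shape (the exclusion list here is a small illustrative subset of the real one):

    import re

    # Small illustrative subset of the patterns chained as `grep -v`
    # in the real pipeline below.
    EXCLUDES = [
        r"task .* blocked for more than .* seconds",
        r"lockdep is turned off",
        r"CRON",
        r"ceph-crash",
    ]

    def first_suspicious_line(path):
        match = re.compile(r"\bBUG\b|\bINFO\b|\bDEADLOCK\b")
        skip = [re.compile(p) for p in EXCLUDES]
        # errors="replace" mirrors grep's --binary-files=text behaviour.
        with open(path, errors="replace") as f:
            for line in f:
                if match.search(line) and not any(p.search(line) for p in skip):
                    return line  # `head -n 1`: the first offender is enough
        return None  # clean log
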
2026-03-09T00:15:40.446 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm03.local
2026-03-09T00:15:40.446 DEBUG:teuthology.orchestra.run.vm03:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-09T00:15:40.475 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm06.local
2026-03-09T00:15:40.475 DEBUG:teuthology.orchestra.run.vm06:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-09T00:15:40.501 INFO:teuthology.task.internal.syslog:Gathering journalctl...
2026-03-09T00:15:40.501 DEBUG:teuthology.orchestra.run.vm03:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T00:15:40.517 DEBUG:teuthology.orchestra.run.vm06:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T00:15:41.320 INFO:teuthology.task.internal.syslog:Compressing syslogs...
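
The compression step below gzips every *.log under each node's syslog archive, with xargs --max-procs=0 running one gzip per file in parallel; that parallelism is why the per-file progress lines from the two hosts arrive interleaved. A local sketch of the same pipeline (paths as in the log; sudo omitted):

    import subprocess

    def compress_syslogs(syslog_dir="/home/ubuntu/cephtest/archive/syslog"):
        # find every *.log and hand each one to its own gzip process;
        # --max-procs=0 means "as many in parallel as possible".
        find = subprocess.Popen(
            ["find", syslog_dir, "-name", "*.log", "-print0"],
            stdout=subprocess.PIPE)
        subprocess.run(
            ["xargs", "-0", "--max-args=1", "--max-procs=0", "--verbose",
             "--no-run-if-empty", "--", "gzip", "-5", "--verbose", "--"],
            stdin=find.stdout, check=True)
        find.stdout.close()
        find.wait()
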
2026-03-09T00:15:41.320 DEBUG:teuthology.orchestra.run.vm03:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-09T00:15:41.322 DEBUG:teuthology.orchestra.run.vm06:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-09T00:15:41.346 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-09T00:15:41.346 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-09T00:15:41.347 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-09T00:15:41.347 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T00:15:41.347 INFO:teuthology.orchestra.run.vm03.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-09T00:15:41.347 INFO:teuthology.orchestra.run.vm03.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-09T00:15:41.347 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-09T00:15:41.347 INFO:teuthology.orchestra.run.vm06.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-09T00:15:41.347 INFO:teuthology.orchestra.run.vm06.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-09T00:15:41.347 INFO:teuthology.orchestra.run.vm06.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-09T00:15:41.504 INFO:teuthology.orchestra.run.vm06.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 97.9% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-09T00:15:41.551 INFO:teuthology.orchestra.run.vm03.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 97.2% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
2026-03-09T00:15:41.553 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-09T00:15:41.556 INFO:teuthology.task.internal:Restoring /etc/sudoers...
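[editor's note] The compression step fans out one gzip process per file with unbounded parallelism (`xargs --max-procs=0`), which is why the gzip stderr lines were interleaved in the raw capture (deinterleaved above for readability). A rough Python equivalent of what the shell pipeline does; illustrative only, since the job itself shells out to find/xargs/gzip:

    import gzip
    import shutil
    from concurrent.futures import ThreadPoolExecutor
    from pathlib import Path

    def gzip_file(path: Path) -> None:
        # Compress at level 5 and remove the original, matching `gzip -5`.
        with path.open("rb") as src, gzip.open(f"{path}.gz", "wb", compresslevel=5) as dst:
            shutil.copyfileobj(src, dst)
        path.unlink()

    def compress_syslogs(root: str) -> None:
        # `find ... -name '*.log'` is recursive, hence rglob here.
        logs = list(Path(root).rglob("*.log"))
        with ThreadPoolExecutor() as pool:  # concurrent workers, loosely like --max-procs=0
            list(pool.map(gzip_file, logs))

    # compress_syslogs("/home/ubuntu/cephtest/archive/syslog")

The tiny kern.log and misc.log barely shrink (0.0%), while the full journalctl dump compresses by ~97%.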
2026-03-09T00:15:41.556 DEBUG:teuthology.orchestra.run.vm03:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-09T00:15:41.617 DEBUG:teuthology.orchestra.run.vm06:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-09T00:15:41.640 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-09T00:15:41.643 DEBUG:teuthology.orchestra.run.vm03:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-09T00:15:41.659 DEBUG:teuthology.orchestra.run.vm06:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-09T00:15:41.682 INFO:teuthology.orchestra.run.vm03.stdout:kernel.core_pattern = core
2026-03-09T00:15:41.706 INFO:teuthology.orchestra.run.vm06.stdout:kernel.core_pattern = core
2026-03-09T00:15:41.717 DEBUG:teuthology.orchestra.run.vm03:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-09T00:15:41.749 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T00:15:41.749 DEBUG:teuthology.orchestra.run.vm06:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-09T00:15:41.771 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-09T00:15:41.771 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-09T00:15:41.774 INFO:teuthology.task.internal:Transferring archived files...
2026-03-09T00:15:41.774 DEBUG:teuthology.misc:Transferring archived files from vm03:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-08_22:22:45-orch:cephadm-squid-none-default-vps/308/remote/vm03
2026-03-09T00:15:41.774 DEBUG:teuthology.orchestra.run.vm03:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-09T00:15:41.818 DEBUG:teuthology.misc:Transferring archived files from vm06:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-08_22:22:45-orch:cephadm-squid-none-default-vps/308/remote/vm06
2026-03-09T00:15:41.818 DEBUG:teuthology.orchestra.run.vm06:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-09T00:15:41.845 INFO:teuthology.task.internal:Removing archive directory...
2026-03-09T00:15:41.846 DEBUG:teuthology.orchestra.run.vm03:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-09T00:15:41.859 DEBUG:teuthology.orchestra.run.vm06:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-09T00:15:41.903 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-09T00:15:41.907 INFO:teuthology.task.internal:Not uploading archives.
2026-03-09T00:15:41.907 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-09T00:15:41.918 INFO:teuthology.task.internal:Tidying up after the test...
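[editor's note] The archive pull streams a tarball over the SSH channel rather than staging a file on the remote host: `tar c -f -` writes to stdout, and the local side unpacks the stream (the `test -e` result of 1 just confirms the empty coredump directory was removed, so nothing is missing). A hedged sketch of that streaming pattern; `pull_archive` is a hypothetical helper, not teuthology's API:

    import subprocess
    import tarfile

    def pull_archive(host: str, remote_dir: str, local_dir: str) -> None:
        # Run `sudo tar c -f - -C <remote_dir> -- .` remotely and read the
        # tar stream from ssh's stdout, never touching the remote disk.
        proc = subprocess.Popen(
            ["ssh", host, "sudo", "tar", "c", "-f", "-", "-C", remote_dir, "--", "."],
            stdout=subprocess.PIPE,
        )
        # mode="r|" reads a non-seekable stream; the contents come from our
        # own test nodes, so extractall() is acceptable here.
        with tarfile.open(fileobj=proc.stdout, mode="r|") as tar:
            tar.extractall(local_dir)
        proc.wait()

    # pull_archive("vm03.local", "/home/ubuntu/cephtest/archive",
    #              "/archive/kyr-2026-03-08_22:22:45-orch:cephadm-squid-none-default-vps/308/remote/vm03")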
2026-03-09T00:15:41.918 DEBUG:teuthology.orchestra.run.vm03:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-09T00:15:41.920 DEBUG:teuthology.orchestra.run.vm06:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-09T00:15:41.934 INFO:teuthology.orchestra.run.vm03.stdout: 8532145 0 drwxr-xr-x 2 ubuntu ubuntu 6 Mar 9 00:15 /home/ubuntu/cephtest
2026-03-09T00:15:41.961 INFO:teuthology.orchestra.run.vm06.stdout: 8532145 0 drwxr-xr-x 2 ubuntu ubuntu 6 Mar 9 00:15 /home/ubuntu/cephtest
2026-03-09T00:15:41.962 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-09T00:15:41.970 INFO:teuthology.run:Summary data:
description: orch:cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/yes kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{v18.2.1} 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/no 3-inline/yes 4-verify} 2-client/kclient 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}}
duration: 1237.5127742290497
failure_reason: '"2026-03-09T00:10:00.000194+0000 mon.vm03 (mon.0) 518 : cluster [WRN] osd.3 (root=default,host=vm06) is down" in cluster log'
flavor: default
owner: kyr
success: false
2026-03-09T00:15:41.971 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-09T00:15:41.995 INFO:teuthology.run:FAIL
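[editor's note] The failure_reason comes from teuthology's cluster-log scan: any warning or error line that no ignorelist entry matches is treated as fatal, and here the `osd.3 ... is down` warning found no matching pattern. A minimal sketch of that decision under an assumed, purely hypothetical ignorelist (the job's real list lives in its YAML overrides):

    import re

    # Hypothetical ignorelist entries for illustration only.
    ignorelist = [re.compile(p) for p in (r"\(MON_CLOCK_SKEW\)", r"slow request")]

    line = ('2026-03-09T00:10:00.000194+0000 mon.vm03 (mon.0) 518 : cluster '
            '[WRN] osd.3 (root=default,host=vm06) is down')

    def fails_run(entry: str) -> bool:
        is_bad = "[WRN]" in entry or "[ERR]" in entry
        return is_bad and not any(p.search(entry) for p in ignorelist)

    assert fails_run(line)  # unmatched [WRN] line => success: false, run FAILs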